Add cpu_pinning and isolate_emulator_thread to vm resource #105
base: master
Branch updated from 2d3d359 to c46ed10.
Good job, looks pretty good to me. One thing to consider: a feature that goes hand in hand with CPU pinning is isolating the QEMU threads: https://kubevirt.io/user-guide/compute/dedicated_cpu_resources/#requesting-dedicated-cpu-for-qemu-emulator

This is an additional flag that allocates one extra physical CPU just for running the emulator and (depending on the ioThreadsPolicy) the I/O threads. This can affect the latency of the remaining CPUs that are assigned to the VM. Perhaps this flag should also be made available, to fully support workloads that benefit from CPU pinning?
Thank you for the review! The spec:

```yaml
spec:
  domain:
    cpu:
      dedicatedCpuPlacement: true
      isolateEmulatorThread: true
    resources:
      limits:
        cpu: 2
```

In this case, the virt-launcher pod would actually acquire 3 CPUs. cc @bk201, do you think we should consider this as a follow-up to the CPU pinning implementation?
@brandboat We can create a new enhancement issue if it's not in the current implementation scope.
Filed harvester/harvester#6672.
LGTM, thanks for the PR. Tested the cases both with and without `dedicatedCpuPlacement` and `isolateEmulatorThread`; all work as expected.
Signed-off-by: Cooper Tseng <[email protected]>
Rebased PR to fix conflicts and refactor acc tests. |
Signed-off-by: Cooper Tseng <[email protected]>
Tests fail for me:
Running tests:
github.com/harvester/terraform-provider-harvester coverage: 0.0% of statements
github.com/harvester/terraform-provider-harvester/internal/provider/network coverage: 0.0% of statements
github.com/harvester/terraform-provider-harvester/internal/provider/storageclass coverage: 0.0% of statements
github.com/harvester/terraform-provider-harvester/internal/provider/keypair coverage: 0.0% of statements
github.com/harvester/terraform-provider-harvester/internal/provider coverage: 0.0% of statements
github.com/harvester/terraform-provider-harvester/internal/provider/clusternetwork coverage: 0.0% of statements
github.com/harvester/terraform-provider-harvester/internal/provider/image coverage: 0.0% of statements
github.com/harvester/terraform-provider-harvester/internal/provider/cloudinitsecret coverage: 0.0% of statements
github.com/harvester/terraform-provider-harvester/internal/provider/vlanconfig coverage: 0.0% of statements
github.com/harvester/terraform-provider-harvester/internal/provider/volume coverage: 0.0% of statements
=== RUN Test_checkKeyPairsInUserData
=== RUN Test_checkKeyPairsInUserData/correct_ssh_authorized_keys_in_root
=== RUN Test_checkKeyPairsInUserData/wrong_ssh_authorized_keys_in_root
=== RUN Test_checkKeyPairsInUserData/correct_ssh_authorized_keys_in_users
=== RUN Test_checkKeyPairsInUserData/wrong_ssh_authorized_keys_in_users
=== RUN Test_checkKeyPairsInUserData/no_ssh_authorized_keys
=== RUN Test_checkKeyPairsInUserData/empty_content
--- PASS: Test_checkKeyPairsInUserData (0.00s)
--- PASS: Test_checkKeyPairsInUserData/correct_ssh_authorized_keys_in_root (0.00s)
--- PASS: Test_checkKeyPairsInUserData/wrong_ssh_authorized_keys_in_root (0.00s)
--- PASS: Test_checkKeyPairsInUserData/correct_ssh_authorized_keys_in_users (0.00s)
--- PASS: Test_checkKeyPairsInUserData/wrong_ssh_authorized_keys_in_users (0.00s)
--- PASS: Test_checkKeyPairsInUserData/no_ssh_authorized_keys (0.00s)
--- PASS: Test_checkKeyPairsInUserData/empty_content (0.00s)
PASS
coverage: 2.8% of statements
ok github.com/harvester/terraform-provider-harvester/internal/provider/virtualmachine 0.031s coverage: 2.8% of statements
? github.com/harvester/terraform-provider-harvester/pkg/constants [no test files]
github.com/harvester/terraform-provider-harvester/pkg/helper coverage: 0.0% of statements
github.com/harvester/terraform-provider-harvester/pkg/importer coverage: 0.0% of statements
github.com/harvester/terraform-provider-harvester/internal/util coverage: 0.0% of statements
github.com/harvester/terraform-provider-harvester/pkg/client coverage: 0.0% of statements
=== RUN TestAccImage_basic
provider_test.go:48: [{0 Get "https://192.168.0.131/k8s/clusters/local/apis/harvesterhci.io/v1beta1/settings/server-version": dial tcp 192.168.0.131:443: connect: connection refused []}]
--- FAIL: TestAccImage_basic (0.00s)
=== RUN TestAccKeyPair_basic
resource_keypair_test.go:48: Step 2/3 error: Error running pre-apply plan: exit status 1
Error: Get "https://192.168.0.131/k8s/clusters/local/apis/harvesterhci.io/v1beta1/settings/server-version": dial tcp 192.168.0.131:443: connect: connection refused
on <empty> line 0:
(source code not available)
--- FAIL: TestAccKeyPair_basic (1.45s)
=== RUN TestAccNetwork_basic
resource_network_test.go:50: Step 2/2 error: Error running pre-apply plan: exit status 1
Error: Get "https://192.168.0.131/k8s/clusters/local/apis/harvesterhci.io/v1beta1/settings/server-version": dial tcp 192.168.0.131:443: connect: connection refused
on <empty> line 0:
(source code not available)
--- FAIL: TestAccNetwork_basic (0.25s)
=== RUN TestAccStorageClass_basic
resource_storageclass_test.go:45: Step 1/1 error: Error running pre-apply plan: exit status 1
Error: Get "https://192.168.0.131/k8s/clusters/local/apis/harvesterhci.io/v1beta1/settings/server-version": dial tcp 192.168.0.131:443: connect: connection refused
on <empty> line 0:
(source code not available)
--- FAIL: TestAccStorageClass_basic (0.19s)
=== RUN TestAccVirtualMachine_basic
resource_virtualmachine_test.go:157: Step 1/2 error: Error running pre-apply plan: exit status 1
Error: Get "https://192.168.0.131/k8s/clusters/local/apis/harvesterhci.io/v1beta1/settings/server-version": dial tcp 192.168.0.131:443: connect: connection refused
on <empty> line 0:
(source code not available)
--- FAIL: TestAccVirtualMachine_basic (0.19s)
=== RUN TestAccVirtualMachine_cpu_pinning
--- FAIL: TestAccVirtualMachine_cpu_pinning (0.00s)
panic: interface conversion: interface {} is nil, not *client.Client [recovered]
panic: interface conversion: interface {} is nil, not *client.Client
goroutine 834 [running]:
testing.tRunner.func1.2({0x25f1ec0, 0xc0010cef30})
/usr/lib64/go/1.22/src/testing/testing.go:1631 +0x24a
testing.tRunner.func1()
/usr/lib64/go/1.22/src/testing/testing.go:1634 +0x377
panic({0x25f1ec0?, 0xc0010cef30?})
/usr/lib64/go/1.22/src/runtime/panic.go:770 +0x132
github.com/harvester/terraform-provider-harvester/internal/tests.TestAccVirtualMachine_cpu_pinning(0xc001093ba0)
/go/src/github.com/harvester/terraform-provider-harvester/internal/tests/resource_virtualmachine_test.go:203 +0x14f7
testing.tRunner(0xc001093ba0, 0x2b9a500)
/usr/lib64/go/1.22/src/testing/testing.go:1689 +0xfb
created by testing.(*T).Run in goroutine 1
/usr/lib64/go/1.22/src/testing/testing.go:1742 +0x390
FAIL github.com/harvester/terraform-provider-harvester/internal/tests 2.119s
FAIL
FATA[0039] exit status 1
make: *** [Makefile:11: test] Error 1
I'm not exactly sure what's up with the connection refused errors, since I definitely have a Harvester cluster available at that address, but the panic needs to be fixed regardless.
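The stack trace shows an unguarded type assertion (`interface {} is nil, not *client.Client`) blowing up when provider configuration never completed. A minimal, hypothetical sketch of a defensive pattern (the `Client` type and `clientFrom` helper are illustrative, not the provider's actual code) uses the comma-ok assertion so the test can fail or skip cleanly instead of panicking:

```go
// Hypothetical sketch: guard the type assertion that panicked in
// resource_virtualmachine_test.go. With the comma-ok form, a nil meta
// (e.g. when provider configuration failed because the cluster was
// unreachable) yields an error instead of a runtime panic.
package main

import "fmt"

// Client stands in for the provider's *client.Client.
type Client struct{}

// clientFrom safely extracts the client from the opaque meta value.
func clientFrom(meta interface{}) (*Client, error) {
	c, ok := meta.(*Client)
	if !ok || c == nil {
		return nil, fmt.Errorf("provider not configured: meta is %T", meta)
	}
	return c, nil
}

func main() {
	// meta is nil here, mimicking the failing acceptance test.
	if _, err := clientFrom(nil); err != nil {
		fmt.Println("skip:", err) // a test could call t.Skip or t.Fatal here
	}
}
```

In an acceptance test, the error branch would typically call `t.Fatal` (or `t.Skip` when the cluster is known to be unreachable) rather than printing.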
Actually, I got the same error. I believe it's not related to this PR, but I'll look into this issue. Thanks for the comment, @m-ildefons.
Signed-off-by: Cooper Tseng <[email protected]>
After deb9126, the new `make test` output:
Related to harvester/harvester#6551. Includes acceptance (ACC) test cases for `cpu_pinning` and `isolate_emulator_thread`.
Test Plan

Prerequisite: prepare a Harvester cluster with a v1.4 dev version (e.g. v1.4.0-dev-20240918).

1. Create `kubeconfig_test.yaml` and move the config under `terraform-provider-harvester/`.
2. Run `./scripts/test`.
3. Run `make` to build the binaries.
4. Add `export TF_CLI_CONFIG_FILE=~/.terraformrc` to `.bashrc`.
5. Run `terraform init` and then `terraform apply`.
6. Check that the resulting VM spec contains `dedicatedCpuPlacement: true` and `isolateEmulatorThread: true`.
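For reference, a usage sketch of the new attributes in a VM resource. The attribute names follow the PR title; the exact schema nesting and required fields may differ in the merged provider, so treat this as illustrative:

```hcl
# Hypothetical usage sketch; attribute placement is an assumption.
resource "harvester_virtualmachine" "pinned" {
  name      = "vm-pinned"
  namespace = "default"

  cpu    = 2
  memory = "4Gi"

  # Maps to dedicatedCpuPlacement: true in the KubeVirt domain spec.
  cpu_pinning = true

  # Maps to isolateEmulatorThread: true, allocating one extra physical
  # CPU for the QEMU emulator thread, so the pod acquires 3 CPUs total.
  isolate_emulator_thread = true
}
```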