
add memory_dynamic_min, memory_dynamic_max #306

Closed
wants to merge 2 commits

Conversation


@aslak11 commented Mar 4, 2024

fixes #211

@ddelnano (Collaborator) commented Mar 5, 2024

Hi @aslak11, thanks for your interest in contributing to the project!

For all Terraform changes, we require acceptance tests to verify that the functionality works. Verifying that VM resource arguments/attributes are created and modified properly can be quite complex and sometimes requires rebooting the VM first. Given the complexity of the memory settings, test coverage is a must; I recommend checking out the existing VM memory tests and the contributing doc for more details.
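One reason the memory settings are tricky to test is that XAPI enforces an ordering constraint across the four memory values (static_min <= dynamic_min <= dynamic_max <= static_max). The helper below is a hypothetical sketch of that constraint in Go; the function name and signature are illustrative, not code from the provider:

```go
package main

import "fmt"

// validateMemoryOrdering is a hypothetical helper illustrating the ordering
// constraint XAPI enforces on VM memory settings:
// static_min <= dynamic_min <= dynamic_max <= static_max (values in bytes).
func validateMemoryOrdering(staticMin, dynamicMin, dynamicMax, staticMax int64) error {
	if staticMin > dynamicMin {
		return fmt.Errorf("memory_static_min (%d) must be <= memory_dynamic_min (%d)", staticMin, dynamicMin)
	}
	if dynamicMin > dynamicMax {
		return fmt.Errorf("memory_dynamic_min (%d) must be <= memory_dynamic_max (%d)", dynamicMin, dynamicMax)
	}
	if dynamicMax > staticMax {
		return fmt.Errorf("memory_dynamic_max (%d) must be <= memory_static_max (%d)", dynamicMax, staticMax)
	}
	return nil
}

func main() {
	gib := int64(1 << 30)
	// 1 GiB dynamic floor with a 2 GiB ceiling: valid ordering.
	fmt.Println(validateMemoryOrdering(gib, gib, 2*gib, 2*gib))
	// dynamic_min above dynamic_max: invalid, returns an error.
	fmt.Println(validateMemoryOrdering(gib, 2*gib, gib, 2*gib))
}
```

An acceptance test for these arguments would need to assert this ordering holds after both create and update, which is part of why reboots and re-reads of the VM come into play.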

Another thing to consider: the existing memory support is half baked, so we may need to perform a Terraform state upgrade. I haven't thought through the static/dynamic settings yet, but we want to minimize the upgrade pain once the dynamic settings are supported. An example of that can be found in #271.
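The second commit on this PR adds state migrations for the memoryMax to memoryStaticMax rename. As a rough sketch of what such a migration does, the function below moves the old attribute's value into the new key in a raw state map; it mirrors the shape of a Terraform SDK state upgrade but is illustrative only, not the provider's actual upgrade function:

```go
package main

import "fmt"

// upgradeVMStateV0toV1 sketches a state migration for the memory rename:
// it moves the old memory_max value into the new memory_static_max key in
// the raw state map. Attribute names mirror this PR's commits; the function
// itself is hypothetical.
func upgradeVMStateV0toV1(rawState map[string]interface{}) (map[string]interface{}, error) {
	if v, ok := rawState["memory_max"]; ok {
		rawState["memory_static_max"] = v
		delete(rawState, "memory_max")
	}
	return rawState, nil
}

func main() {
	state := map[string]interface{}{
		"name_label": "test-vm",
		"memory_max": int64(2147483648), // 2 GiB, pre-rename attribute
	}
	upgraded, _ := upgradeVMStateV0toV1(state)
	fmt.Println(upgraded["memory_static_max"]) // old value now under the new key
	fmt.Println(upgraded["memory_max"])        // <nil>, old key removed
}
```

Without a migration like this, existing state files would show the renamed attribute as removed and the new one as added, producing spurious diffs on upgrade.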

I'm happy to help you through the development of these tests and the state migration functionality. Please let me know if you have any questions or comments.

* change vm set/create params memoryMax to memoryStaticMax

* add state migrations
@ddelnano

@aslak11 thanks for giving this another pass. I hope to be able to review this over the next week. Unfortunately the Jenkins job is broken due to #313. As part of my review, I'll need to fix that issue so we can see the status of the test suite.

@ddelnano

@aslak11 I still haven't had a chance to review this, but we need to track down the following test failures. This output was collected from the Jenkins job that ran against #316.

The TestAccXenorchestraVm_addAndRemoveDisksToVm failure looks unrelated at first glance, but it occurred repeatedly (see the re-run count). While the test suite does have some transient failures, our build retries a test up to 5 times, and a truly transient issue usually resolves within those retries, so I think this failure is more likely related to your change.

=== FAIL: xoa TestAccXenorchestraVm_updatesThatRequireReboot (re-run 5) (180.93s)
[DEBUG] Comparing MACs old= new=
[DEBUG] Comparing MACs old= new=
[DEBUG] Comparing MACs old= new=
[DEBUG] Comparing MACs old= new=
    resource_xenorchestra_vm_test.go:1643: Step 3/5 error: Check failed: Check 6/6 error: xenorchestra_vm.bar: Attribute 'memory_dynamic_min' expected "1073741824", got "2147483648"

=== FAIL: xoa TestAccXenorchestraVm_addAndRemoveDisksToVm (re-run 5) (90.83s)
[DEBUG] Comparing MACs old= new=
[DEBUG] Comparing MACs old= new=
[DEBUG] Comparing MACs old= new=
[DEBUG] Comparing MACs old= new=
    resource_xenorchestra_vm_test.go:1268: Step 2/3 error: After applying this test step, the plan was not empty.
        stdout:
        
        
        Terraform used the selected providers to generate the following execution
        plan. Resource actions are indicated with the following symbols:
          ~ update in-place
        
        Terraform will perform the following actions:
        
          # xenorchestra_vm.bar will be updated in-place
          ~ resource "xenorchestra_vm" "bar" {
                id                                  = "b38d77a8-caa0-50e7-da84-ee3cd310eaaa"
                tags                                = []
                # (24 unchanged attributes hidden)
        
              ~ disk {
                  ~ attached   = false -> true
                    # (6 unchanged attributes hidden)
                }
        
                # (2 unchanged blocks hidden)
            }
        
        Plan: 0 to add, 1 to change, 0 to destroy.
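For context on the first failure: 1073741824 and 2147483648 are 1 GiB and 2 GiB, so the check saw memory_dynamic_min still at the 2 GiB value after the update step. A config exercising the arguments from this PR's title might look like the following sketch (required arguments elided; values chosen to match the failing check, not taken from the actual test):

```hcl
resource "xenorchestra_vm" "bar" {
  # ...required arguments (name_label, template, network, disk, ...) elided...

  memory_static_max  = 2147483648 # 2 GiB ceiling
  memory_dynamic_max = 2147483648 # 2 GiB
  memory_dynamic_min = 1073741824 # 1 GiB, the value the failing check expected
}
```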

@ddelnano

Since there hasn't been activity here, I'm going to close this pull request. I'm happy to assist if/when you are able to address the details and test failures mentioned above.

@ddelnano closed this May 18, 2024
Successfully merging this pull request may close these issues.

Dynamic memory control