
add dbus commands for dynamic creation of instances #596

Merged

Conversation

adriaan42 (Contributor)

Commit taken from #580

adriaan42 force-pushed the adriaan/dynamic-instance-creation branch from 9523660 to 9931566 on March 1, 2024, 12:15
adriaan42 (Contributor, Author)

@yarda @zacikpa to continue the discussion on dynamic creation (and deletion) of plugin instances (originally in #580):

I have updated the behavior of instance_destroy:

  • any instance can be destroyed (not only the ones created dynamically)
  • devices attached to the instance are released; the plugin's _add_device function is then called for each of them, and the existing logic attaches each device to a suitable instance
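
For illustration, the destroy-and-reattach flow described above could look roughly like the sketch below. Only _add_device is named in the description; the other helper names (_remove_device, _destroy_instance) and the devices attribute are assumptions for the sketch, not the actual patch.

```python
def destroy_instance_and_release_devices(plugin, instance):
    # Illustrative sketch: remember the devices the instance holds, release
    # them, drop the instance, then feed each device back through the
    # plugin's normal _add_device path so the existing matching logic
    # attaches it to a suitable remaining instance.
    released = list(instance.devices)
    for device in released:
        plugin._remove_device(device)      # assumed release helper
    plugin._destroy_instance(instance)     # assumed instance teardown
    for device in released:
        plugin._add_device(device)         # existing hotplug logic reassigns
```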

For instance_create it's a bit more complex:

  • I can extract devices and devices_udev_regex from the options and pass them to the instance creation, so when new devices are plugged in, the new instance is taken into consideration.
  • But what should happen to existing devices already attached to another instance?
    • Current implementation: nothing happens, and the user explicitly calls instance_acquire_devices to move devices, OR
    • Automatically move devices, i.e., iterate through all instances and check if any of their devices should be moved to the new instance. Would be possible with _check_matching_devices, while also considering instance priorities, but this would override any manual assignments made by previous calls to instance_acquire_devices... so not sure about the side effects.

Also, I noticed that the code (the existing instance_acquire_devices and also my new instance_create and instance_destroy) calls methods that are only available in Hotplug plugins, but without checking whether we're actually dealing with a Hotplug plugin. Is that a bug in the existing instance_acquire_devices? Should I just add an isinstance() check to all three functions?
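
For reference, such a guard could be as small as the sketch below, assuming the hotplug base class is tuned.plugins.hotplug.Plugin; the helper name and the chosen error type are illustrative.

```python
from tuned.plugins import hotplug

def _ensure_hotplug_plugin(plugin, operation):
    # Refuse device-related D-Bus operations for plugins that do not
    # inherit from the hotplug base class and therefore lack the
    # device (re)assignment methods these calls rely on.
    if not isinstance(plugin, hotplug.Plugin):
        raise TypeError("%s is only supported for hotplug-capable plugins"
                        % operation)
```

It would be called, e.g., as _ensure_hotplug_plugin(plugin, "instance_create") before touching any device-assignment methods.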

adriaan42 force-pushed the adriaan/dynamic-instance-creation branch from 9931566 to 584fcf9 on March 7, 2024, 13:40
adriaan42 (Contributor, Author)

  • Automatically move devices, i.e., iterate through all instances and check if any of their devices should be moved to the new instance. Would be possible with _check_matching_devices, while also considering instance priorities, but this would override any manual assignments made by previous calls to instance_acquire_devices... so not sure about the side effects.

After thinking about it some more, I believe that we do want to automatically move matching devices. So the latest update implements that, acquiring all matching devices from plugin instances with an equal or lower priority (i.e., if the priority value of the new instance is lower than or equal to that of the existing instance).
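
Conceptually, the reassignment rule amounts to the following sketch. FakeInstance and the matches() predicate are purely illustrative stand-ins; the real code uses _check_matching_devices and the stored instance priorities mentioned earlier.

```python
from dataclasses import dataclass, field

@dataclass
class FakeInstance:
    # Minimal stand-in for a plugin instance; only models what the rule needs.
    name: str
    priority: int
    devices: set = field(default_factory=set)

def reassign_matching_devices(new_instance, existing_instances, matches):
    """Move matching devices to the new instance, but only from instances
    whose priority value is greater than or equal to the new instance's."""
    for other in existing_instances:
        if new_instance.priority > other.priority:
            continue  # the existing instance outranks the new one; keep its devices
        for device in list(other.devices):
            if matches(new_instance, device):
                other.devices.discard(device)
                new_instance.devices.add(device)
```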

adriaan42 force-pushed the adriaan/dynamic-instance-creation branch 2 times, most recently from 0a49980 to 790df99 on March 11, 2024, 05:27
adriaan42 (Contributor, Author)

After some more testing and fixing, I'm now quite happy with the current implementation.

However, I'm seeing some potential inconsistencies in the current behavior of the instance.active property, where instances are only active when they have attached devices:

  • non-active instances are not listed by the get_instances dbus call, so it seems they don't exist, but instance_get_devices can be called successfully -> the view exposed via the dbus interface is not consistent.
  • for non-active instances, instance_apply_tuning (https://github.com/redhat-performance/tuned/blob/master/tuned/plugins/base.py#L250, and related functions) immediately returns, so in particular, the static tuning is not executed -> it is not possible to have static instance-specific tuning that is independent of devices.
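
For reference, the guard in question behaves roughly like the paraphrase below; the attribute and method names approximate the linked code, and the exact implementation may differ.

```python
def instance_apply_tuning(self, instance):
    # An inactive instance (one with no attached devices) is skipped
    # entirely, so even its static, device-independent tuning never runs;
    # that is the inconsistency described above.
    if not instance.active:
        return
    # ... static and dynamic tuning steps follow in the real code ...
    self._instance_apply_static(instance)
```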

zacikpa (Contributor) left a comment

Thanks for the update, @adriaan42.

Also, I noticed that the code (the existing instance_acquire_devices and also my new instance_create and instance_destroy) calls methods that are only available in Hotplug plugins, but without checking whether we're actually dealing with a Hotplug plugin. Is that a bug in the existing instance_acquire_devices? Should I just add an isinstance() check to all three functions?

I think this is indeed a bug. It hasn't caused any issues yet because there aren't many non-hotplug plugins with device support, and for plugins that do not support devices at all, the existing function returns earlier. I would add the type check.

non-active instances are not listed by the get_instances dbus call, so it seems they don't exist, but instance_get_devices can be called successfully -> the view exposed via the dbus interface is not consistent.

The only issue I see here is the name of the API function get_instances, which probably should have been named get_active_instances. Nonetheless, it's well-documented that it returns only active instances, so I wouldn't worry about it much.

for non-active instances, instance_apply_tuning (https://github.com/redhat-performance/tuned/blob/master/tuned/plugins/base.py#L250, and related functions) immediately returns, so in particular, the static tuning is not executed -> it is not possible to have static instance-specific tuning that is independent of devices.

Good catch - I also found a related issue, see one of my comments. I think resolving this would require some heavier refactoring where we would decouple the non-device tuning from the device tuning. I don't really think it's a blocker for this PR - I would open a separate issue for that if you're not up for it. @yarda, what do you think?

Functionality-wise, I tested this and it works quite well.

Review comments on tuned/daemon/controller.py (resolved)
Currently instance priorities are only used at startup to sort the
instance classes. To enable dynamic creation of instances, we need
to store the priorities, so new instances can be sorted accordingly.

Signed-off-by: Adriaan Schmidt <adriaan.schmidt@siemens.com>
This adds a check to instance_acquire_devices to ensure
that the plugin actually supports the requested operation.

Signed-off-by: Adriaan Schmidt <adriaan.schmidt@siemens.com>
Signed-off-by: Adriaan Schmidt <adriaan.schmidt@siemens.com>
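
The first commit message above is about keeping the per-instance priority around so that dynamically created instances can be slotted into the right position. A minimal illustration of that idea follows; the names are illustrative, not taken from the patch.

```python
import bisect

class InstanceOrder:
    """Keeps (priority, name) pairs sorted so a dynamically created
    instance lands at the position its stored priority dictates."""
    def __init__(self):
        self._entries = []

    def add(self, name, priority):
        bisect.insort(self._entries, (priority, name))

    def ordered_names(self):
        return [name for _, name in self._entries]
```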
adriaan42 force-pushed the adriaan/dynamic-instance-creation branch from 790df99 to cddcd23 on March 15, 2024, 11:14
adriaan42 (Contributor, Author) commented Mar 15, 2024

Thanks for the update, @adriaan42.

Also, I noticed that the code (the existing instance_acquire_devices and also my new instance_create and instance_destroy) calls methods that are only available in Hotplug plugins, but without checking whether we're actually dealing with a Hotplug plugin. Is that a bug in the existing instance_acquire_devices? Should I just add an isinstance() check to all three functions?

I think this is indeed a bug. It hasn't caused any issues yet because there aren't many non-hotplug plugins with device support, and for plugins that do not support devices at all, the existing function returns earlier. I would add the type check.

I've added checks.

non-active instances are not listed by the get_instances dbus call, so it seems they don't exist, but instance_get_devices can be called successfully -> the view exposed via the dbus interface is not consistent.

The only issue I see here is the name of the API function get_instances, which probably should have been named get_active_instances. Nonetheless, it's well-documented that it returns only active instances, so I wouldn't worry about it much.

If inactive instances are "invisible" then any code using the interface to create a new instance would have to explicitly check (using instance_get_devices) if the chosen name is already taken, or just try to create and catch the error. But I can live with that, and you're right, it's the documented behavior.
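
As a sketch of that calling pattern: the bus name com.redhat.tuned, object path /Tuned, and interface com.redhat.tuned.control follow TuneD's usual D-Bus naming, but the argument list of instance_create and whether failures are raised or returned depend on the final implementation, so treat this as illustrative only.

```python
import dbus

bus = dbus.SystemBus()
proxy = bus.get_object("com.redhat.tuned", "/Tuned")
tuned = dbus.Interface(proxy, "com.redhat.tuned.control")

try:
    # Hypothetical arguments: plugin name, new instance name, options.
    tuned.instance_create("net", "my_dynamic_instance", {"devices": "eth1"})
except dbus.exceptions.DBusException as err:
    # If the chosen name is already taken (even by an inactive instance),
    # handle the failure here instead of probing the instance list first.
    print("instance creation failed: %s" % err)
```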

for non-active instances, instance_apply_tuning (https://github.com/redhat-performance/tuned/blob/master/tuned/plugins/base.py#L250, and related functions) immediately returns, so in particular, the static tuning is not executed -> it is not possible to have static instance-specific tuning that is independent of devices.

Good catch - I also found a related issue, see one of my comments. I think resolving this would require some heavier refactoring where we would decouple the non-device tuning from the device tuning. I don't really think it's a blocker for this PR - I would open a separate issue for that if you're not up for it. @yarda, what do you think?

Functionality-wise, I tested this and it works quite well.

Thanks, I appreciate it! :)

adriaan42 (Contributor, Author)

@yarda anything missing/blocking this?

yarda (Contributor) commented May 21, 2024

for non-active instances, instance_apply_tuning (https://github.com/redhat-performance/tuned/blob/master/tuned/plugins/base.py#L250, and related functions) immediately returns, so in particular, the static tuning is not executed -> it is not possible to have static instance-specific tuning that is independent of devices.

Good catch - I also found a related issue, see one of my comments. I think resolving this would require some heavier refactoring where we would decouple the non-device tuning from the device tuning. I don't really think it's a blocker for this PR - I would open a separate issue for that if you're not up for it. @yarda, what do you think?

OK, NP, it makes sense.

yarda (Contributor) commented May 21, 2024

LGTM. I also tested basic functionality and everything seems to work OK, thanks for the contribution.

yarda merged commit f75d795 into redhat-performance:master on May 21, 2024
14 checks passed