Support for nested parameters/parameterizing objects that can't be called by name. #181
Thanks for raising this! You're absolutely right. Currently, we don't support optimizing both a Keras optimizer and the parameters used to initialize it (like the learn rate) at the same time. While all optimizers do have a learn rate argument, the rest of their parameters differ from one optimizer to the next.

As you mentioned, this could probably be addressed by adding support for nested/conditional hyperparameter optimization, but I still think it gets messy rather quickly in the case of Keras optimizers. This messiness is compounded when we consider that HH is going to automatically read all of our old Experiments and try to match them with the new search space to jump-start optimization. All the old Experiment results, of course, will also have the default values for hyperparameters that weren't explicitly given.

All that said, I would very much like to support nested/conditional dimensions… Could you please clarify what you mean by that? If you're referring to using …
That's correct. I won't bore you with all the technical details, but HH actually rewrites the Keras build_fn you give it. Doing this may seem unnecessarily complicated, but the reason for it is to enable providing search dimensions directly to Keras layers. Otherwise it wouldn't be possible to do something like declaring a layer's size as an Integer dimension right inside build_fn. If you have any ideas or would actually like some more technical details, I'd be more than happy to discuss it further, but that's the "short" answer to why you're getting that NameError.

Your wrapper function approach does seem really interesting, though, and I'm trying to figure out how we could integrate it into HH to support nested/conditional dimensions.
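To illustrate, the rewriting is what makes something like this work (a rough sketch along the lines of our Keras examples; the layer sizes and dimension bounds here are just illustrative):

```python
from keras.models import Sequential
from keras.layers import Dense, Dropout
from hyperparameter_hunter import Integer, Real

def build_fn(input_shape):
    # Search dimensions go straight into the layers; HH rewrites this
    # function's source and substitutes concrete values before Keras
    # ever sees them
    model = Sequential([
        Dense(Integer(50, 150), input_shape=input_shape, activation="relu"),
        Dropout(Real(0.2, 0.7)),
        Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy")
    return model
```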
adamax and nadam are callable by name? I wasn't aware. Either way, I think we'd have the same problem when using custom optimizers like adabound, which really have to be defined in a wrapper function.
Maybe it is possible to allow a list of methods as params in the build_fn function? Then, if we annotated our wrappers, HH could possibly supply the appropriate function by name. This way we could supply our own wrapper functions. I'm probably oversimplifying things...
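Purely hypothetical sketch of what I'm imagining; none of this is an API that HH actually has today:

```python
from keras.optimizers import Adam, Nadam
from hyperparameter_hunter import Categorical

# Hypothetical wrapper functions that HH could look up by name
def adam_wrapper(lr):
    return Adam(lr=lr)

def nadam_wrapper(lr):
    return Nadam(lr=lr)

# The imagined API: a Categorical over our own callables rather than
# over optimizer name strings, with sampled params forwarded to
# whichever wrapper gets chosen
optimizer = Categorical([adam_wrapper, nadam_wrapper])
```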
There are a few different scenarios where you'd want nested parameters, but sometimes a parameter isn't just a numerical/string value but instead a layer or object. An obvious example would be an optimizer with a learn rate: you need to pass the learn rate to the optimizer's initialization function.
So while you can have an optimizer as a hyperparameter, like in the default example...
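Something like this sketch (simplified from what I remember of the docs example; exact layer details are illustrative):

```python
from keras.models import Sequential
from keras.layers import Dense
from hyperparameter_hunter import Categorical

def build_fn(input_shape):
    model = Sequential([Dense(1, input_shape=input_shape, activation="sigmoid")])
    # The optimizer itself is the hyperparameter, chosen by string name
    model.compile(
        optimizer=Categorical(["adam", "rmsprop"]),
        loss="binary_crossentropy",
    )
    return model
```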
And you can have the learn rate as a param for a static optimizer...
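Roughly like this sketch, assuming HH substitutes the Real dimension before Adam is actually instantiated:

```python
from keras.models import Sequential
from keras.layers import Dense
from keras.optimizers import Adam
from hyperparameter_hunter import Real

def build_fn(input_shape):
    model = Sequential([Dense(1, input_shape=input_shape, activation="sigmoid")])
    # Optimizer fixed to Adam; only its learn rate is searched
    model.compile(
        optimizer=Adam(lr=Real(0.001, 0.01)),
        loss="binary_crossentropy",
    )
    return model
```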
I'm not sure how you can have both as parameters simultaneously.
What I have done in my own custom setup to get around a similar issue is to have a wrapper function that returns an optimizer. This way I can return optimizers that can't be called by name (i.e. Adamax or Nadam). With this same approach, I think I could also have the learn rate in the wrapper function as well. So I could do something like the following to get the functionality I want:
```python
get_custom_optimizer(Categorical(['adam', 'nadam']), Real(0.001, 0.01))
```
Which should just return an Adam or Nadam optimizer, with a random learn rate.
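For reference, the wrapper itself is nothing fancy; roughly this sketch, matching the call above:

```python
from keras.optimizers import Adam, Nadam

def get_custom_optimizer(name, lr):
    # Map name strings to optimizer classes that can't simply be
    # passed to Keras by name together with a learn rate
    optimizers = {"adam": Adam, "nadam": Nadam}
    return optimizers[name](lr=lr)
```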
The problem is I don't think you can call any non-native function inside your build_fn...
For example:
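A minimal sketch of what I'm doing (get_opt is my own module-level helper; the rest of the names are just illustrative):

```python
from keras.models import Sequential
from keras.layers import Dense
from keras.optimizers import Adam, Nadam
from hyperparameter_hunter import Categorical, Real

def get_opt(name, lr):
    # Module-level helper that maps a name to an optimizer instance
    return {"adam": Adam, "nadam": Nadam}[name](lr=lr)

def build_fn(input_shape):
    model = Sequential([Dense(1, input_shape=input_shape, activation="sigmoid")])
    # Calling a helper defined outside build_fn fails once HH rewrites
    # the function's source: get_opt isn't in scope for the rewritten code
    model.compile(
        optimizer=get_opt(Categorical(["adam", "nadam"]), Real(0.001, 0.01)),
        loss="binary_crossentropy",
    )
    return model
```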
This results in an error:

```
NameError: name 'get_opt' is not defined
```