
Hyper parameter tuning #171

Open
kshitijyad opened this issue Jan 14, 2020 · 1 comment
@kshitijyad
My question might seem basic, but I was trying to understand the project. As the documentation shows, the models take various parameters. Is there an easy way to do hyperparameter tuning in Spotlight?

@jspisak

jspisak commented Jul 27, 2020

HPO doesn't tend to be part of domain-level frameworks like this one. I doubt it has been tested here, but since Spotlight is built on PyTorch you could try Optuna or Ray Tune, which are well supported by Preferred Networks and the UC Berkeley folks, respectively.
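Since Spotlight itself ships no HPO utility, the idea can be wired up externally. Below is a minimal, dependency-free sketch of random search over a search space; `train_and_score` is a hypothetical stand-in for fitting a Spotlight model (e.g. `ExplicitFactorizationModel`) and returning a validation metric, and the parameter names `embedding_dim`, `learning_rate`, and `n_iter` are assumptions mirroring Spotlight's constructor arguments. Optuna or Ray Tune would replace the loop below with smarter samplers and pruning, but the objective-function shape stays the same.

```python
import random

def train_and_score(embedding_dim, learning_rate, n_iter):
    """Stand-in objective: fit a model with these hyperparameters and
    return a validation score (higher is better). In practice this would
    train e.g. a Spotlight ExplicitFactorizationModel on a train split
    and evaluate it on a held-out split. Here it is a toy surrogate."""
    return (-((embedding_dim - 64) ** 2) * 1e-4
            - (learning_rate - 1e-2) ** 2
            - 0.001 * abs(n_iter - 10))

def random_search(n_trials=20, seed=0):
    """Sample hyperparameter configurations at random and keep the best."""
    rng = random.Random(seed)
    best_params, best_score = None, float("-inf")
    for _ in range(n_trials):
        params = {
            "embedding_dim": rng.choice([16, 32, 64, 128]),
            "learning_rate": 10 ** rng.uniform(-4, -1),  # log-uniform
            "n_iter": rng.randint(5, 20),
        }
        score = train_and_score(**params)
        if score > best_score:
            best_params, best_score = params, score
    return best_params, best_score

best_params, best_score = random_search()
print(best_params, best_score)
```

With Optuna, `train_and_score` would become the body of an `objective(trial)` function, with the `rng` calls swapped for `trial.suggest_*` calls.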
