
Hyperparameter sweeps #143

Open
epwalsh opened this issue Jan 19, 2022 · 1 comment · May be fixed by #214

Comments


epwalsh commented Jan 19, 2022

It would be great if Tango provided a simple yet general mechanism for doing hyperparameter searches. Here's an outline of how that could look 👇

We provide a new subcommand: tango sweep. This command takes

  • a "sweep" config,
  • a regular experiment config,
  • and a target step name.

For example:

tango sweep sweep-config.jsonnet target-config.jsonnet step-name

step-name should correspond to the main step of interest in target-config.jsonnet that provides the results we are trying to optimize for (might require #142). For example, this could be a validation/eval step that spits out some metrics of your model on a dataset.

The sweep-config.jsonnet would define which hyperparameters to search and how to search over them. By "hyperparameters" I really just mean any fields in target-config.jsonnet. There are many ways we could do the search, and this is an active area of research. So I think it would be ideal if we were able to integrate with existing hyperparameter sweep frameworks / platforms, like W&B, Optuna, etc. These integrations should be optional, however, and I think we should provide a simple default search method, which could just be grid search.
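To make the default grid-search idea concrete, here's a minimal sketch of expanding a sweep config into one overrides dict per grid point. The sweep dict and its field names are hypothetical, just standing in for whatever sweep-config.jsonnet would contain:

```python
import itertools

# Hypothetical sweep config: maps field paths in target-config.jsonnet to
# candidate values. These field names are made up for illustration.
sweep = {
    "steps.train.optimizer.lr": [1e-4, 3e-4, 1e-3],
    "steps.train.batch_size": [16, 32],
}

def grid(sweep):
    """Yield one overrides dict per point in the Cartesian product (grid search)."""
    keys = list(sweep)
    for values in itertools.product(*(sweep[k] for k in keys)):
        yield dict(zip(keys, values))

combos = list(grid(sweep))  # 3 learning rates x 2 batch sizes = 6 points
```

A smarter search strategy (random, Bayesian, etc.) would just replace `grid` with a different generator of overrides dicts.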

Under the hood tango sweep could use the tango run subcommand with the --overrides parameter to select hyperparameter values. We should also be able to run the search in parallel.
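As a sketch of that under-the-hood step, each hyperparameter point could be turned into one `tango run` invocation by serializing its overrides dict to JSON for the `--overrides` flag (the exact shape `--overrides` expects is an assumption here):

```python
import json

def build_run_command(config_path, overrides):
    """Build one `tango run` invocation selecting a single hyperparameter point.

    `overrides` maps dotted field paths in the experiment config to values;
    it is serialized to a JSON blob for the --overrides flag.
    """
    return ["tango", "run", config_path, "--overrides", json.dumps(overrides)]

cmd = build_run_command("target-config.jsonnet", {"steps.train.optimizer.lr": 0.001})
```

Running the search in parallel would then amount to dispatching these command lines to a pool of workers.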


dhruvdcoder commented Dec 9, 2022

This feature would be very helpful. With allennlp I used its support for registering new subcommands to add a train-with-wandb command and use it with sweeps on W&B. All that command did was translate the hyperparameters supplied by the wandb server into a JSON string that could be passed as --overrides to the function responsible for training with the allennlp train command.
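That translation step could look roughly like this. The hyperparameters are modeled here as a plain dict (in a real run they would come from `wandb.config`), and the underscore-prefix filter for W&B bookkeeping keys is an assumption:

```python
import json

def wandb_config_to_overrides(config):
    """Serialize hyperparameters from a W&B sweep agent into the JSON string
    expected by --overrides.

    `config` is a plain dict here; a real run would read wandb.config.
    Keys starting with "_" (assumed W&B bookkeeping) are dropped.
    """
    return json.dumps({k: v for k, v in config.items() if not k.startswith("_")})

overrides = wandb_config_to_overrides({"trainer.lr": 0.001, "_wandb": {}})
```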

However, I could not find a way to add a new subcommand with the click-based CLI for tango. Is there a way I could replicate the setup mentioned above and call tango's _run() function from my subcommand?

Is using forwarding, as described in the click documentation, a good candidate?
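The click forwarding pattern from the docs could look like this. The `run` command below is only a stand-in for tango's real `run` command (which this sketch does not call), and `train_with_wandb` is the hypothetical custom subcommand:

```python
import click
from click.testing import CliRunner

@click.command()
@click.option("--overrides", default="{}")
def run(overrides):
    # Stand-in for tango's own `run` command; the real one lives in tango's CLI.
    click.echo(f"running with {overrides}")

@click.command()
@click.option("--overrides", default="{}")
@click.pass_context
def train_with_wandb(ctx, overrides):
    # ctx.forward() re-invokes another command, filling its parameters from
    # the current context -- the pattern the click docs describe.
    ctx.forward(run)

result = CliRunner().invoke(train_with_wandb, ["--overrides", '{"lr": 0.001}'])
```

Whether tango's internal `_run()` is callable this way depends on its signature, so wrapping the public CLI entry point may be the safer route.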
