LightningArgumentParser.instantiate_classes() does not instantiate optimizer and scheduler #19004
torsten-wilhelm started this conversation in General
Replies: 1 comment 1 reply
-
You should use dependency injection, as explained in multiple-optimizers-and-schedulers. In the config, the parameters of the optimizer and scheduler would then be nested inside the model. Note that optimizers require the model's parameters as input to be instantiated, so they cannot be instantiated outside the module. Likewise, schedulers require an optimizer as input, so they can only be instantiated inside the module.
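In config terms, the nesting the reply describes might look like the following. This is a sketch only: `MyModule` and the `optimizer`/`lr_scheduler` argument names are assumptions about your module's signature, and the class paths are examples.

```yaml
model:
  class_path: MyModule
  init_args:
    optimizer:
      class_path: torch.optim.Adam
      init_args:
        lr: 0.001
    lr_scheduler:
      class_path: torch.optim.lr_scheduler.StepLR
      init_args:
        step_size: 10
```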
-
I use LightningCLI to create my LightningModule and LightningDataModule. Specifying optimizer and lr_scheduler in the config file instantiates these without the need to implement configure_optimizers() in the LightningModule. This is very convenient.
Now I want to do parameter sweeps using Optuna. In this setup, I want to parse the same config file, let Optuna modify some settings, and then instantiate all classes and run the training. Instead of LightningCLI, I use LightningArgumentParser:
import lightning.pytorch as lp
from lightning.pytorch.cli import LightningArgumentParser
from jsonargparse import ActionConfigFile

parser = LightningArgumentParser()
parser.add_class_arguments(MyDataModule, "data")
parser.add_class_arguments(MyModule, "model")
parser.add_class_arguments(lp.Trainer, "trainer")
parser.add_optimizer_args()
parser.add_lr_scheduler_args()
parser.add_argument("-c", "--config", action=ActionConfigFile)
args = parser.parse_args()
Then I let Optuna modify settings, e.g.:
args.data.batch_size = trial.suggest_categorical("batch_size", [256, 512])
to finally create the experiment (instantiate classes):
experiment = parser.instantiate_classes(args)
The LightningModule and LightningDataModule are created just as they are when using LightningCLI, but the optimizer and scheduler entries are still of type <class 'jsonargparse._namespace.Namespace'>.
The only way to get an optimizer is by implementing configure_optimizers(). However, I would prefer to define the optimizer and its parameters in the config file and potentially also include it in the optuna sweeps.
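As a framework-free sketch of the dependency-injection pattern the reply points to (all class names here are stand-ins, not Lightning or torch APIs): configure_optimizers() only calls factories that were injected through __init__, so a config file can still decide which optimizer and scheduler get built.

```python
from typing import Callable, List

# Hypothetical stand-ins for torch.optim classes, just to keep the sketch runnable.
class TinySGD:
    def __init__(self, params: List[float], lr: float = 0.01):
        self.params, self.lr = params, lr

class TinyStepLR:
    def __init__(self, optimizer: TinySGD, step_size: int = 10):
        self.optimizer, self.step_size = optimizer, step_size

class MyModule:
    """Sketch of a module that receives optimizer/scheduler factories via __init__."""
    def __init__(
        self,
        optimizer: Callable[[List[float]], TinySGD] = lambda p: TinySGD(p, lr=0.01),
        scheduler: Callable[[TinySGD], TinyStepLR] = lambda o: TinyStepLR(o, step_size=5),
    ):
        self._optimizer_factory = optimizer
        self._scheduler_factory = scheduler

    def parameters(self) -> List[float]:
        return [0.0, 0.0]

    def configure_optimizers(self):
        # The module supplies its own parameters; the config supplied the factories.
        opt = self._optimizer_factory(self.parameters())
        sched = self._scheduler_factory(opt)
        return {"optimizer": opt, "lr_scheduler": sched}
```

In recent Lightning versions this pattern is spelled with the OptimizerCallable / LRSchedulerCallable type hints from lightning.pytorch.cli, and the config then chooses the concrete classes.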
I guess I could instantiate the optimizer myself, but that would add boilerplate code.
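That boilerplate could look roughly like this: a small helper that resolves a class_path/init_args pair, which is the shape jsonargparse typically leaves in an un-instantiated namespace. The demo uses a stdlib class standing in for an optimizer class; the helper itself is an assumption, not a jsonargparse API.

```python
import importlib
from typing import Any, Dict

def instantiate_from_config(class_path: str, init_args: Dict[str, Any], *args: Any) -> Any:
    # Resolve a dotted "module.Class" path and instantiate it with the
    # positional args (e.g. model parameters) plus the config's keyword args.
    module_name, _, cls_name = class_path.rpartition(".")
    cls = getattr(importlib.import_module(module_name), cls_name)
    return cls(*args, **init_args)

# Demo with a stdlib class; for an optimizer you would pass something like
# "torch.optim.Adam", {"lr": 1e-3}, model.parameters() instead.
counter = instantiate_from_config("collections.Counter", {}, "mississippi")
```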
There should be a standard solution for this.