
f'The provided lr scheduler "{scheduler}" is invalid' #91

Open

luoclab opened this issue Apr 8, 2024 · 4 comments

Comments

@luoclab

luoclab commented Apr 8, 2024

My torch version is 2.0.1; I don't know how to solve this error.

@dai-jiuhun

me too

@MisterBourbaki

Hi @luoclab, @dai-jiuhun,
Could you be more specific? Which model did you try, did you change any of the code, which parameters did you choose, etc.?
In particular, of course, the name of the scheduler you are trying to use :)

@arpu-nagar

arpu-nagar commented May 25, 2024

Hey @MisterBourbaki

No changes to the code/YAML files.

I tried running VanillaVAE with torch 2.2 and CUDA 11.8 and got the following error while trying to execute run.py.

[rank0]:   File "/home/arpan/ddpm/PyTorch-VAE/run.py", line 63, in <module>
[rank0]:     runner.fit(experiment, datamodule=data)
[rank0]:   File "/home/arpan/miniconda3/envs/torch-2.2/lib/python3.10/site-packages/pytorch_lightning/trainer/trainer.py", line 737, in fit
[rank0]:     self._call_and_handle_interrupt(
[rank0]:   File "/home/arpan/miniconda3/envs/torch-2.2/lib/python3.10/site-packages/pytorch_lightning/trainer/trainer.py", line 682, in _call_and_handle_interrupt
[rank0]:     return trainer_fn(*args, **kwargs)
[rank0]:   File "/home/arpan/miniconda3/envs/torch-2.2/lib/python3.10/site-packages/pytorch_lightning/trainer/trainer.py", line 772, in _fit_impl
[rank0]:     self._run(model, ckpt_path=ckpt_path)
[rank0]:   File "/home/arpan/miniconda3/envs/torch-2.2/lib/python3.10/site-packages/pytorch_lightning/trainer/trainer.py", line 1140, in _run
[rank0]:     self.accelerator.setup(self)
[rank0]:   File "/home/arpan/miniconda3/envs/torch-2.2/lib/python3.10/site-packages/pytorch_lightning/accelerators/gpu.py", line 46, in setup
[rank0]:     return super().setup(trainer)
[rank0]:   File "/home/arpan/miniconda3/envs/torch-2.2/lib/python3.10/site-packages/pytorch_lightning/accelerators/accelerator.py", line 93, in setup
[rank0]:     self.setup_optimizers(trainer)
[rank0]:   File "/home/arpan/miniconda3/envs/torch-2.2/lib/python3.10/site-packages/pytorch_lightning/accelerators/accelerator.py", line 351, in setup_optimizers
[rank0]:     optimizers, lr_schedulers, optimizer_frequencies = self.training_type_plugin.init_optimizers(
[rank0]:   File "/home/arpan/miniconda3/envs/torch-2.2/lib/python3.10/site-packages/pytorch_lightning/plugins/training_type/training_type_plugin.py", line 245, in init_optimizers
[rank0]:     return trainer.init_optimizers(model)
[rank0]:   File "/home/arpan/miniconda3/envs/torch-2.2/lib/python3.10/site-packages/pytorch_lightning/trainer/optimizers.py", line 44, in init_optimizers
[rank0]:     lr_schedulers = self._configure_schedulers(lr_schedulers, monitor, not pl_module.automatic_optimization)
[rank0]:   File "/home/arpan/miniconda3/envs/torch-2.2/lib/python3.10/site-packages/pytorch_lightning/trainer/optimizers.py", line 192, in _configure_schedulers
[rank0]:     raise ValueError(f'The provided lr scheduler "{scheduler}" is invalid')
[rank0]: ValueError: The provided lr scheduler "<torch.optim.lr_scheduler.ExponentialLR object at 0x7f63e8121ed0>" is invalid

Any solutions?

@MisterBourbaki

Hi @arpu-nagar, I just had a look at the issue, and I think it comes from the age of the code. This repo is great, but the code is quite old. In particular, the torch and especially Lightning APIs have changed a lot since it was written.

So, to avoid spending too much time finding the right way to change a few lines of code here and there, I think it is best to craft a training pipeline from scratch using Lightning. They have really good tutorials :)

And if I may, I am trying to rebuild this repo in a more modern way here. It is still a work in progress, but I hope to catch up quickly.
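For reference, recent Lightning releases (2.x) accept the scheduler from `configure_optimizers` either as a second list next to the optimizers or wrapped in a dictionary. Below is a minimal sketch of that return shape, assuming an up-to-date `lightning` install; the `LitVAE` class, the placeholder loss, and the learning-rate/gamma values are illustrative and not code from this repo.

```python
from torch import nn, optim
import lightning as L  # pip install lightning (2.x)


class LitVAE(L.LightningModule):
    """Minimal sketch; `self.model` stands in for any reconstruction model,
    and the loss below is only a placeholder, not the repo's VAE loss."""

    def __init__(self, model: nn.Module, lr: float = 5e-3, scheduler_gamma: float = 0.95):
        super().__init__()
        self.model = model
        self.lr = lr
        self.scheduler_gamma = scheduler_gamma

    def training_step(self, batch, batch_idx):
        x, _ = batch
        recon = self.model(x)
        loss = nn.functional.mse_loss(recon, x)  # placeholder reconstruction loss
        self.log("train_loss", loss)
        return loss

    def configure_optimizers(self):
        optimizer = optim.Adam(self.model.parameters(), lr=self.lr)
        scheduler = optim.lr_scheduler.ExponentialLR(optimizer, gamma=self.scheduler_gamma)
        # Either `return [optimizer], [scheduler]` or the explicit dict form below
        # is a documented return shape for configure_optimizers in Lightning 2.x.
        return {
            "optimizer": optimizer,
            "lr_scheduler": {"scheduler": scheduler, "interval": "epoch"},
        }
```

With either of those return shapes on a current Lightning version, the `ValueError: The provided lr scheduler ... is invalid` check above should no longer trigger.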
