
ValueError: optimizer got an empty parameter list #29

Open
kk701710 opened this issue Apr 1, 2021 · 1 comment

Comments


kk701710 commented Apr 1, 2021

...... Initialize the network done!!! .......
Traceback (most recent call last):
File "/home/jiannan/project/pytorch_classification-master/train.py", line 76, in
optimizer = optim.Adam(filter(lambda p: p.requires_grad, model.parameters()), lr=cfg.LR)
File "/home/jiannan/anaconda3/envs/pt-gpu/lib/python3.6/site-packages/torch/optim/adam.py", line 42, in init
super(Adam, self).init(params, defaults)
File "/home/jiannan/anaconda3/envs/pt-gpu/lib/python3.6/site-packages/torch/optim/optimizer.py", line 46, in init
raise ValueError("optimizer got an empty parameter list")
ValueError: optimizer got an empty parameter list
How can this problem be solved???
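
For reference, a quick way to confirm what triggers this error is to count the trainable parameters right before the optim.Adam call. The snippet below is only a sketch written against the names that appear in the traceback (model, cfg.LR); it is not code taken from this repository:

import torch.optim as optim

# Sketch: count trainable parameters before building the optimizer.
# If this prints 0, every parameter has requires_grad=False, which is
# exactly what makes the filtered list empty and raises the ValueError.
trainable = [p for p in model.parameters() if p.requires_grad]
print("trainable parameters:", len(trainable), "of", len(list(model.parameters())))

if not trainable:
    # Unfreeze the layers you actually intend to train.
    for p in model.parameters():
        p.requires_grad = True

optimizer = optim.Adam(filter(lambda p: p.requires_grad, model.parameters()), lr=cfg.LR)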


WhyFear commented Jun 5, 2021

Hi, have you solved this yet? I ran into the same problem when using densenet.


I got it working: change the optimizer to SGD, and then add the following line right below loss = criterion(out, labels.long()):
loss.requires_grad_(True)
After that it trains normally.
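
A rough sketch of how that change might look, using the names from the traceback and this comment (model, cfg.LR, criterion, out, labels); the requires_grad filter is dropped here on the assumption that SGD would otherwise raise the same empty-list error:

import torch.optim as optim

# Workaround sketch: plain SGD over all parameters, plus a forced
# requires_grad on the loss so that backward() does not complain.
optimizer = optim.SGD(model.parameters(), lr=cfg.LR)

optimizer.zero_grad()
loss = criterion(out, labels.long())
loss.requires_grad_(True)  # allow backward() even if the graph was detached
loss.backward()
optimizer.step()

Note that if the underlying cause is that every parameter has requires_grad=False, this mainly silences the error: frozen parameters still receive no gradients, so it is worth checking requires_grad on model.parameters() as in the earlier sketch as well.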
