The LayerNormLSTMCell modules initialised in the MetaOptimizer class are not properly registered as parameters of the MetaOptimizer model. Appending them to the self.lstms list (pytorch-meta-optimizer/meta_optimizer.py, line 27 at commit 0154d4d) will not add their trainable parameters to the model parameter list used in pytorch-meta-optimizer/main.py, line 63 at the same commit.

If I am not mistaken, the current version will not train the LSTM weights at all. In general, I would suggest restructuring the initialisation and the MetaOptimizer.forward method, but as a quick fix one could replace the entire self.lstms initialization block with this:
self.lstms = nn.Sequential(*[LayerNormLSTMCell(hidden_size, hidden_size)
for _ in range(num_layers)])
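To see why the original list-based version silently drops the weights, here is a minimal sketch; it uses PyTorch's built-in nn.LSTMCell as a stand-in for LayerNormLSTMCell, and the class names and sizes are made up for illustration. A plain Python list hides the cells from parameters(), while an nn.Sequential (or nn.ModuleList) container registers them:

import torch.nn as nn

class WithPlainList(nn.Module):
    def __init__(self, hidden_size=8, num_layers=2):
        super().__init__()
        # Plain Python list: the cells are never registered as submodules,
        # so their weights do not show up in parameters().
        self.lstms = [nn.LSTMCell(hidden_size, hidden_size) for _ in range(num_layers)]

class WithContainer(nn.Module):
    def __init__(self, hidden_size=8, num_layers=2):
        super().__init__()
        # nn.Sequential (or nn.ModuleList) registers every cell, so its
        # weights appear in parameters() and are seen by the optimizer.
        self.lstms = nn.Sequential(*[nn.LSTMCell(hidden_size, hidden_size)
                                     for _ in range(num_layers)])

print(sum(p.numel() for p in WithPlainList().parameters()))   # 0
print(sum(p.numel() for p in WithContainer().parameters()))   # > 0

Note that nn.Sequential is used here purely as a registration container; MetaOptimizer.forward should still index into self.lstms and call each cell with its own hidden/cell state rather than calling the container's forward.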
This quick fix worked, thanks! By the way, have you been able to reproduce the experiments from the article using MetaOptimizer? For me the final loss of each epoch is really big (around 4 at best), and I cannot figure out what to do. Could you please give a little help?