What is the feature?
Using `nn.LazyLinear` results in an error in the `_dump_init_info` function of `BaseModule`. The main reason is that `_dump_init_info` writes the model's weights into the saved initialization info, but with `nn.LazyLinear` the weights are not initialized until the first forward pass, which leads to the error. It is hoped that support for `nn.LazyLinear` can be provided.

Any other context?
https://pytorch.org/docs/stable/generated/torch.nn.modules.lazy.LazyModuleMixin.html#torch.nn.modules.lazy.LazyModuleMixin
https://pytorch.org/docs/stable/generated/torch.nn.LazyLinear.html#torch.nn.LazyLinear
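To illustrate the failure mode, here is a minimal sketch (this is not MMEngine's actual `_dump_init_info`, just a stand-in routine that reads weight values the same way): before the first forward pass, a `LazyLinear` layer's weight is an `UninitializedParameter` with no shape or data, so any attempt to read or serialize its values raises; after one forward pass the parameter is materialized and reading it works.

```python
import torch
import torch.nn as nn

layer = nn.LazyLinear(out_features=4)

# Before any forward pass, the weight is an UninitializedParameter:
# its shape and values do not exist yet.
uninitialized = isinstance(layer.weight, nn.parameter.UninitializedParameter)

# Mimic what a weight-dumping routine would do: read the values.
try:
    layer.weight.mean()
    failed_before_forward = False
except Exception:
    # Accessing an uninitialized parameter raises here.
    failed_before_forward = True

# One forward pass infers in_features (8) and materializes the weight,
# after which dumping/inspecting the values works normally.
layer(torch.randn(2, 8))
shape_after = tuple(layer.weight.shape)
```

A possible workaround along these lines is to run one dummy forward pass (or call the module's `initialize_parameters`) before any routine that inspects the weights.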