littleflow3r changed the title from "init hidden state is necessary?" to "Is init hidden state necessary?" on Feb 28, 2019.
Hi,
In your hierarchical_att_model.py, you initialize the hidden states for the GRUs with zeros:
self.word_hidden_state = torch.zeros(2, batch_size, self.word_hidden_size)
self.sent_hidden_state = torch.zeros(2, batch_size, self.sent_hidden_size)
According to the PyTorch documentation for nn.GRU, h_0 defaults to zeros if it is not provided. Is there a particular reason to do this manually?
Thanks.
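For reference, here is a minimal sketch (with hypothetical sizes, not the ones from hierarchical_att_model.py) showing that omitting h_0 gives the same result as passing an explicitly zeroed hidden state:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Bidirectional single-layer GRU, so h_0 has shape
# (num_layers * num_directions, batch, hidden) = (2, batch, hidden).
gru = nn.GRU(input_size=8, hidden_size=16, bidirectional=True)

batch_size, seq_len = 4, 5
x = torch.randn(seq_len, batch_size, 8)  # (seq, batch, feature) layout

out_default, h_default = gru(x)              # h_0 omitted -> defaults to zeros
h0 = torch.zeros(2, batch_size, 16)          # explicit zero hidden state
out_explicit, h_explicit = gru(x, h0)

print(torch.allclose(out_default, out_explicit))  # True
print(torch.allclose(h_default, h_explicit))      # True
```

So functionally the manual zero initialization is redundant; it only matters if you later want to initialize the hidden state with something other than zeros (e.g. a learned parameter) or carry state across batches.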