Try to apply softmax to a batch of data with variable length #489

Open
wants to merge 2 commits into master
Conversation

@hfxunlp commented Nov 21, 2017

Hi, I want to make SoftMax support variable-length input, so that a batch of sequences with different lengths can be fed to this module. This is helpful for Natural Language Processing, especially for the attention model in seq2seq and the Attention-over-Attention model for reading comprehension. This pull request corresponds to torch/nn#1297.
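For context, the usual technique here is a masking trick: positions past each sequence's true length are filled with -inf before the softmax, so they receive exactly zero probability and the distribution renormalizes over the real tokens. Below is a minimal sketch of that idea in plain Torch (an illustration only, with made-up tensor names and lengths; it is not the API this PR adds):

```lua
require 'nn'

-- Toy scores for a batch of 2 sequences padded to length 4.
local batch, maxLen = 2, 4
local scores  = torch.randn(batch, maxLen)
local lengths = torch.LongTensor{4, 2}  -- true length of each sequence

-- Build a ByteTensor mask that is 1 wherever the position index
-- exceeds the sequence's true length (i.e. at padded positions).
local positions = torch.range(1, maxLen):long():view(1, maxLen):expand(batch, maxLen)
local mask = torch.gt(positions, lengths:view(batch, 1):expand(batch, maxLen))

-- -inf scores become exp(-inf) = 0, so the softmax renormalizes
-- over the unmasked (real) positions only.
scores:maskedFill(mask, -math.huge)
local probs = nn.SoftMax():forward(scores)
print(probs)  -- row 2, columns 3-4 are exactly zero
```

Presumably the commits in this PR move the equivalent handling inside the module itself, so callers only need to supply the per-sequence lengths instead of masking by hand.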

@hfxunlp (Author) commented Nov 21, 2017

Sorry, there is still a bug that I have to fix.

@hfxunlp closed this Nov 21, 2017
@hfxunlp reopened this Nov 21, 2017
@hfxunlp (Author) commented Nov 21, 2017

I fixed the bug in a somewhat odd way in the second commit. It would be wonderful if someone could explain why it happens and suggest a cleaner solution.
