
Beam Search: Error in attn_decoder_input_fn in concat statement #8

Open
ravibansal opened this issue Jul 15, 2017 · 2 comments

@ravibansal

https://github.com/JayParks/tf-seq2seq/blob/master/seq2seq_model.py#L368
It raises an error saying that dimension 0 of inputs and attention does not match (since the attention memory is tile_batched to batch_size * beam_width). Didn't you get this error when running with beam_search?
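For reference, the mismatch comes from the decoder inputs still having batch_size rows while the attention vector has batch_size * beam_width rows. Below is a minimal sketch (not the repository's code) of the usual tile_batch pattern for beam search with attention in the TF 1.x contrib API; the variable names (encoder_outputs, encoder_inputs_length, hidden_units, etc.) are stand-ins for the corresponding tensors in seq2seq_model.py.

```python
import tensorflow as tf
from tensorflow.contrib import seq2seq

batch_size, max_time, hidden_units, beam_width = 32, 20, 128, 5

# Stand-ins for the encoder tensors built earlier in the model.
encoder_outputs = tf.placeholder(tf.float32, [None, max_time, hidden_units])
encoder_inputs_length = tf.placeholder(tf.int32, [None])

# Everything the attention mechanism reads must be tiled to
# batch_size * beam_width before building the AttentionWrapper; otherwise
# the decoder inputs (batch_size) and the attention vector
# (batch_size * beam_width) disagree on dimension 0 in the concat.
tiled_encoder_outputs = seq2seq.tile_batch(encoder_outputs, multiplier=beam_width)
tiled_sequence_length = seq2seq.tile_batch(encoder_inputs_length, multiplier=beam_width)

attention_mechanism = seq2seq.BahdanauAttention(
    num_units=hidden_units,
    memory=tiled_encoder_outputs,
    memory_sequence_length=tiled_sequence_length)

decoder_cell = seq2seq.AttentionWrapper(
    cell=tf.nn.rnn_cell.BasicLSTMCell(hidden_units),
    attention_mechanism=attention_mechanism)

# The decoder's initial state must also be created at the tiled batch size.
decoder_initial_state = decoder_cell.zero_state(
    batch_size=batch_size * beam_width, dtype=tf.float32)
```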

@ghost

ghost commented Apr 11, 2018

Hi @ravibansal, I have the same problem. Did you manage to solve it?

@ravibansal (Author)

@hastirad No, unfortunately not.
