
Getting Attention Activations to Visualize Attention in Seq2Seq #1668

Closed · nicholaslocascio opened this issue Mar 27, 2016 · 3 comments


@nicholaslocascio

All attention papers feature some visualization of the attention weights on some input. Has anyone been able to run a sample through the Seq2Seq Attention Decoder model in translate.py and get the attention activations to do such a visualization?

@keveman (Contributor) commented Mar 28, 2016

The attention mask is available as a tensor here:
https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/ops/seq2seq.py#L522

It should be easy to fetch it during a run call and visualize it. You could post this on StackOverflow to see whether someone in the general community has done this visualization. I am closing this issue, since the required functionality is available in TensorFlow.
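
For reference, a minimal sketch of that fetch-and-plot pattern, assuming the TF 1.x graph/session API and that the translate.py model graph is already built and restored in `session`. Because the legacy attention decoder unrolls its loop in Python, one Softmax node is created per decoder step; the name prefix used below is only a placeholder, so inspect your own graph for the real names, e.g. `[op.name for op in session.graph.get_operations() if "Softmax" in op.name]`:

```python
import numpy as np
import matplotlib.pyplot as plt

def fetch_attention(session, feed_dict, num_steps,
                    prefix="attention_decoder/Attention_0/Softmax"):
    """Fetch the per-step attention masks and stack them into one array.

    The prefix is a placeholder; the unrolled decoder names its Softmax
    ops Softmax, Softmax_1, Softmax_2, ... under whatever scope your
    model uses.
    """
    names = [prefix + ("" if i == 0 else "_%d" % i) + ":0"
             for i in range(num_steps)]
    tensors = [session.graph.get_tensor_by_name(n) for n in names]
    masks = session.run(tensors, feed_dict=feed_dict)  # each: [batch, input_len]
    return np.stack(masks, axis=1)  # [batch, decoder_steps, input_len]

def plot_attention(mask, input_tokens, output_tokens):
    """Plot one example's mask ([decoder_steps, input_len]) as a heatmap."""
    plt.imshow(mask, cmap="gray", aspect="auto")
    plt.xticks(range(len(input_tokens)), input_tokens, rotation=90)
    plt.yticks(range(len(output_tokens)), output_tokens)
    plt.xlabel("input position")
    plt.ylabel("output step")
    plt.colorbar()
    plt.show()
```

With that in hand, something like `plot_attention(fetch_attention(sess, feed, num_steps)[0], src_tokens, out_tokens)` should give the usual alignment heatmap seen in the attention papers.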

@jarheadfa

The link is broken. What is the correct link?

@guotong1988 (Contributor)

Same problem.
