Categorical Reparameterization with Gumbel-Softmax comments #5

Open

utterances-bot opened this issue Oct 5, 2022 · 1 comment

@utterances-bot
Categorical Reparameterization with Gumbel-Softmax - Reading Collections

https://owen-liuyuxuan.github.io/papers_reading_sharing.github.io/Building_Blocks/GumbelSoftmax/


river-mz commented Oct 5, 2022

Hello, I read your blog post and learned a lot from it. My understanding of Gumbel softmax is that by adjusting the temperature, the distribution can be pushed toward a uniform or a one-hot form while still staying consistent with the original distribution overall (for example, at a non-extreme temperature the position of the maximum should remain unchanged). However, when I actually tried calling it in PyTorch, I found that the distribution changes considerably before and after applying Gumbel softmax. The code and output are below.
import torch
import torch.nn.functional as F

logits = torch.randn(1, 5)
# Note: softmax is applied here, so probabilities (not raw logits) are passed below
logits = torch.softmax(logits, dim=-1)
print(logits)
soft = F.gumbel_softmax(logits, tau=1, hard=False)
print(soft)

tensor([[0.2096, 0.1859, 0.1033, 0.1246, 0.3767]])
tensor([[0.0823, 0.1950, 0.1542, 0.4819, 0.0865]])

I am quite confused by this result. Is there a mistake in my understanding of Gumbel softmax?
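For reference, a minimal sketch (assuming torch and torch.nn.functional are imported) of one way to check this behavior: F.gumbel_softmax draws fresh Gumbel noise on every call, so any single sample is itself random and need not match the input distribution; only the empirical average over many hard (one-hot) samples of the raw logits should approximately recover softmax(logits).

import torch
import torch.nn.functional as F

torch.manual_seed(0)
logits = torch.randn(1, 5)  # raw (unnormalized) logits

# Each call draws fresh Gumbel noise, so individual samples vary;
# averaging many hard one-hot samples approximates softmax(logits).
samples = torch.stack(
    [F.gumbel_softmax(logits, tau=1.0, hard=True) for _ in range(10000)]
)
print(torch.softmax(logits, dim=-1))  # target categorical distribution
print(samples.mean(dim=0))            # empirical frequency of each category

Note also that in the snippet above softmax is applied before the call, so probabilities rather than raw logits are passed in; gumbel_softmax treats its input as logits, which further shifts the resulting distribution.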
