
Question about Embedded Gaussian #45

Open
mymuli opened this issue Aug 31, 2021 · 4 comments

Comments

@mymuli

mymuli commented Aug 31, 2021

I'd like to ask: in the implementation file non_local_embedded_gaussian.py, I don't see an explicit expression for the Embedded Gaussian, only a matrix multiplication followed by a softmax... If this is the Embedded Gaussian implementation, which lines of code realize it?

f = torch.matmul(theta_x, phi_x)

f_div_C = F.softmax(f, dim=-1)

@AlexHex7
Owner

@mymuli Hi, theta_x and phi_x are x after embedding.

@mymuli
Author

mymuli commented Aug 31, 2021

> @mymuli Hi, theta_x and phi_x are x after embedding.

I'd like to follow up: if theta_x and phi_x are x after the Embedded Gaussian embedding, which specific statements in non_local_embedded_gaussian.py compute that embedding?

@AlexHex7
Owner

@mymuli See Section 3.2 of the paper, the Embedded Gaussian part:

A simple extension of the Gaussian function is to compute similarity in an embedding space.

Here θ(x_i) = W_θ x_i and φ(x_j) = W_φ x_j are two embeddings.
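To make that mapping concrete, here is a minimal self-contained sketch of the computation, assuming a 4-D input of shape (B, C, H, W) and 1x1 convolutions as the two embeddings W_θ and W_φ. The tensor names follow the snippet quoted above; the exact layer names and reshapes in non_local_embedded_gaussian.py may differ.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Toy dimensions for illustration; the real module takes these from its constructor.
B, C, inter_C, H, W = 2, 64, 32, 8, 8
x = torch.randn(B, C, H, W)

# The embeddings theta(x_i) = W_theta x_i and phi(x_j) = W_phi x_j are 1x1 convolutions.
theta = nn.Conv2d(C, inter_C, kernel_size=1)  # W_theta
phi = nn.Conv2d(C, inter_C, kernel_size=1)    # W_phi

theta_x = theta(x).view(B, inter_C, -1).permute(0, 2, 1)  # (B, H*W, inter_C): rows are theta(x_i)
phi_x = phi(x).view(B, inter_C, -1)                       # (B, inter_C, H*W): columns are phi(x_j)

# f holds the dot products theta(x_i)^T phi(x_j). The exponential of the
# embedded Gaussian f(x_i, x_j) = exp(theta(x_i)^T phi(x_j)) and the
# normalization C(x) = sum_j f(x_i, x_j) are both carried out by the softmax.
f = torch.matmul(theta_x, phi_x)   # (B, H*W, H*W) pairwise similarities
f_div_C = F.softmax(f, dim=-1)     # attention weights; each row sums to 1
```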

@LAB123-tech


Just adding an exponential function before f_div_C = F.softmax(f, dim=-1) would do it, right?
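For reference, the exponential is already inside the softmax: Section 3.2 of the paper notes that with f(x_i, x_j) = e^{θ(x_i)ᵀφ(x_j)} and C(x) = Σ_j f(x_i, x_j), the ratio f/C becomes the softmax computation along dimension j. A quick standalone check (not repository code):

```python
import torch
import torch.nn.functional as F

# softmax(f)_ij = exp(f_ij) / sum_j exp(f_ij), i.e. the exponential of the
# embedded Gaussian together with the 1/C(x) normalization in one call.
f = torch.randn(2, 5, 5)  # stand-in for torch.matmul(theta_x, phi_x)
manual = torch.exp(f) / torch.exp(f).sum(dim=-1, keepdim=True)
print(torch.allclose(F.softmax(f, dim=-1), manual))  # True
```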
