
Hello, in the simbert model, is it normal that the seq2seq loss is much larger than the similarity loss? #61

EddieChen324 opened this issue Jul 10, 2022 · 3 comments


@EddieChen324

No description provided.

@920232796
Owner

If both of them are decreasing normally, it's most likely fine.
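
For context, a minimal sketch (assuming the standard SimBERT setup, not this repo's exact code) of how the two losses are typically computed: the seq2seq term is a cross-entropy over the full vocabulary, while the similarity term is a cross-entropy over only the batch, so the seq2seq loss being noticeably larger is expected.

```python
import torch
import torch.nn.functional as F

def simbert_losses(lm_logits, target_ids, cls_vecs, scale=30.0):
    """Illustrative sketch of SimBERT's two losses (not this repo's code).

    lm_logits : [num_target_tokens, vocab_size]  decoder logits
    target_ids: [num_target_tokens]              gold token ids
    cls_vecs  : [batch, hidden]                  [CLS] vectors, with rows
                (0,1), (2,3), ... arranged as positive pairs.
    """
    # Seq2seq loss: cross-entropy over the whole vocabulary (~20k classes),
    # so its raw magnitude is naturally larger.
    seq2seq_loss = F.cross_entropy(lm_logits, target_ids)

    # Similarity loss: cross-entropy over only `batch` classes.
    z = F.normalize(cls_vecs, dim=-1)
    sims = z @ z.t() * scale                      # scaled cosine similarities
    sims.fill_diagonal_(-1e12)                    # a sample never matches itself
    labels = torch.arange(z.size(0), device=z.device) ^ 1  # pair index: 0<->1, 2<->3
    sim_loss = F.cross_entropy(sims, labels)

    return seq2seq_loss, sim_loss
```

As long as both terms keep decreasing, the absolute gap between them is not by itself a warning sign.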

@EddieChen324
Author

> If both of them are decreasing normally, it's most likely fine.

Hello, I commented out the argmax at the end of your function that builds the labels for the similarity loss, because it looks like Su Jianlin's bert4keras just passes the labels matrix on directly at that point. Why did you add the argmax there?

@920232796
Owner

That part shouldn't matter much; removing it is fine too.
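
For readers comparing the two implementations, here is a hypothetical PyTorch sketch (not the repo's actual labels function) of the bert4keras-style label construction being discussed; whether the trailing argmax is needed depends only on how the loss is then called.

```python
import torch
import torch.nn.functional as F

def get_labels_of_similarity(cls_vecs):
    """Hypothetical port of bert4keras's label construction to PyTorch.

    With rows (0,1), (2,3), ... as positive pairs, sample i's target is i ^ 1.
    Returns a [batch, batch] 0/1 matrix, which is what bert4keras passes on.
    """
    idxs = torch.arange(cls_vecs.size(0), device=cls_vecs.device)
    return (idxs[None, :] == (idxs ^ 1)[:, None]).float()

# Option A -- keep the argmax: hard class indices, works on any PyTorch version.
#   loss = F.cross_entropy(similarities, labels.argmax(dim=-1))
# Option B -- drop the argmax (bert4keras style): pass the 0/1 matrix as soft
#   targets, which F.cross_entropy only accepts on PyTorch >= 1.10.
#   loss = F.cross_entropy(similarities, labels)
```

For one-hot targets the two options give the same loss value; the argmax is presumably there because older versions of torch.nn.functional.cross_entropy only accepted class-index targets.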
