
Why does LR use cross entropy instead of MSE? #9

Open
pluszeroplus opened this issue Jul 5, 2020 · 0 comments

Comments

@pluszeroplus

I saw that your LR notes include a link explaining why classification problems use cross entropy as the loss rather than MSE.
The explanation in that link is that MSE has many local minima and the loss curve is not smooth (but isn't the derivative of MSE a linear function? Why would it not be smooth?).
I have also thought about this question for a long time, and I think the best explanation is that regression and classification make different assumptions:
regression assumes P(y|x) follows a normal (Gaussian) distribution, while classification assumes P(y|x) follows a Bernoulli distribution.
I wrote up a derivation here: https://github.com/pluszeroplus/Deep-Learning/blob/master/loss/why%20not%20MSE.pdf
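
For reference, a compressed sketch of that MLE argument (my own shorthand, not copied from the PDF; f(x) denotes the model's real-valued output, p(x) the predicted probability, and σ² a fixed noise variance):

```latex
% Regression: assume y | x ~ N(f(x), sigma^2). The negative log-likelihood of one
% example is quadratic in the residual, so MLE is the same as minimizing MSE.
\[
-\log p(y \mid x) \;=\; \frac{(y - f(x))^2}{2\sigma^2} + \frac{1}{2}\log(2\pi\sigma^2)
\]

% Classification: assume y | x ~ Bernoulli(p(x)) with y in {0, 1}. The negative
% log-likelihood is exactly the cross-entropy loss, so MLE under the Bernoulli
% assumption is the same as minimizing cross entropy.
\[
-\log p(y \mid x) \;=\; -\bigl[\, y \log p(x) + (1 - y)\log\bigl(1 - p(x)\bigr) \bigr]
\]
```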
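
On the smoothness point: the derivative of MSE is linear only when the prediction is linear in the parameters. In logistic regression the prediction goes through a sigmoid, and MSE composed with the sigmoid is still smooth but no longer convex, whereas cross entropy composed with the sigmoid stays convex. A minimal numerical sketch of this (a single example with label y = 1 and prediction p = sigmoid(z); the sign change in the second derivative of the MSE curve shows the non-convexity):

```python
import numpy as np

# Compare the curvature of MSE and cross entropy as a function of the logit z,
# for one example with label y = 1 and prediction p = sigmoid(z).

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def mse_loss(z, y=1.0):
    return (sigmoid(z) - y) ** 2

def ce_loss(z, y=1.0):
    p = sigmoid(z)
    return -(y * np.log(p) + (1 - y) * np.log(1 - p))

z = np.linspace(-8, 8, 1601)
h = z[1] - z[0]

def second_derivative(f, z, h):
    # Central finite-difference estimate of f''(z).
    return (f(z + h) - 2 * f(z) + f(z - h)) / h**2

mse_curv = second_derivative(mse_loss, z, h)
ce_curv = second_derivative(ce_loss, z, h)

# MSE o sigmoid: curvature changes sign (non-convex in z, flat for large |z|).
# Cross entropy o sigmoid: curvature is sigmoid(z)*(1-sigmoid(z)) >= 0 (convex).
print("MSE curvature changes sign:", bool(mse_curv.min() < 0 < mse_curv.max()))
print("Cross-entropy curvature always >= 0:", bool((ce_curv >= -1e-9).all()))
```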
