Hi, I believe there is a bug in the loss computation of the `BinomialDeviance` loss function. Following the loss computation in the sklearn package, the correct formula here should be `-2.0 * (y*f - log(1 + exp(f)))`, but the code computes `-2.0 * (y*f - (1 + exp(f)))`.
Correction to my previous comment: I believe there is a bug in the loss computation of the `BinomialDeviance` loss function. Following the loss computation in the sklearn package, the correct formula here should be `-2.0 * (y*f - log(1 + exp(f)))`, but the code computes `-2.0 * (y*f - exp(1 + f))`.
```python
"""Compute the deviance (= 2 * negative log-likelihood).

Parameters
----------
y : array, shape (n_samples,)
    True labels
pred : array, shape (n_samples,)
    Predicted labels
sample_weight : array-like, shape (n_samples,), optional
    Sample weights.
"""
# logaddexp(0, v) == log(1.0 + exp(v))
pred = pred.ravel()
if sample_weight is None:
    return -2.0 * np.mean((y * pred) - np.logaddexp(0.0, pred))
else:
    return (-2.0 / sample_weight.sum() *
            np.sum(sample_weight *
                   ((y * pred) - np.logaddexp(0.0, pred))))
```
I also think the `BinomialDeviance` loss function is wrong; the above is how the code is written in sklearn.
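For illustration, here is a minimal self-contained sketch of the corrected deviance, following the sklearn-style computation quoted above (the function name `binomial_deviance` is just a label for this example, not a name from either codebase). It uses `np.logaddexp(0, f)`, which equals `log(1 + exp(f))` but avoids overflow for large `f`:

```python
import numpy as np

def binomial_deviance(y, pred, sample_weight=None):
    """Binomial deviance = 2 * negative log-likelihood.

    np.logaddexp(0.0, pred) == log(1 + exp(pred)), computed in a
    numerically stable way (no overflow for large pred).
    """
    pred = np.ravel(pred)
    if sample_weight is None:
        return -2.0 * np.mean(y * pred - np.logaddexp(0.0, pred))
    return (-2.0 / sample_weight.sum()
            * np.sum(sample_weight * (y * pred - np.logaddexp(0.0, pred))))

# Sanity check: confident, correct raw scores should give a deviance
# close to 0, which the buggy variants -2*(y*f - (1 + exp(f))) and
# -2*(y*f - exp(1 + f)) do not.
y = np.array([1.0, 0.0])
f = np.array([10.0, -10.0])  # raw scores strongly favoring the true labels
print(binomial_deviance(y, f))  # small positive value, close to 0
```

With the incorrect formulas, the "loss" can even be large for perfect predictions, since `(1 + exp(f))` grows without the `log` damping it.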