Thank you for your interesting work, and for replying to my previous question.
I have another question regarding computing the prediction probability distribution of a given data point x.
When I run eval_prob_adaptive.py and get an array of losses, one per class prompt, I find that the losses are very close to each other.
That is, when I apply a softmax to the array, the probability for each class is close to 1/N, which may not be desired. A minimal example of what I mean is below.
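For concreteness, here is a small sketch with hypothetical loss values (the numbers are made up, but the gaps are as small as what I observe):

```python
import numpy as np

# Hypothetical per-class losses from eval_prob_adaptive.py; the gaps are tiny
losses = np.array([0.2501, 0.2498, 0.2503, 0.2500])

# Softmax over negated losses (a lower loss should mean a higher probability)
logits = -losses
probs = np.exp(logits - logits.max())
probs /= probs.sum()
print(probs)  # every entry is close to 1/4
```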
I found that a similar issue was raised before (#11 (comment)).
Could you look into this issue?
Thank you in advance.
To get calibrated probabilities, you need to find the correct temperature scaling. If you have a small amount of labeled data, you can check out the approach in this paper to find the right scaling of the losses.
Note that eval_prob_adaptive only cares about finding the most likely class, not producing calibrated probabilities for each class. The adaptive approach may not be well calibrated for classes that are pruned early.
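As a minimal sketch (not part of this repo; the validation arrays and helper names below are hypothetical), temperature scaling just divides the negated losses by a fitted temperature before the softmax, with the temperature chosen to minimize the negative log-likelihood on a small labeled set:

```python
import numpy as np
from scipy.optimize import minimize_scalar

def nll(temperature, losses, labels):
    """Average negative log-likelihood of the temperature-scaled softmax over -losses."""
    logits = -losses / temperature
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -log_probs[np.arange(len(labels)), labels].mean()

# Hypothetical validation data: per-class losses and ground-truth labels.
# In practice, collect these by running eval_prob_adaptive.py on a small labeled set.
val_losses = np.array([[0.2498, 0.2510, 0.2507],
                       [0.2509, 0.2496, 0.2505],
                       [0.2511, 0.2506, 0.2497]])
val_labels = np.array([0, 1, 2])

# Fit a single scalar temperature on the validation set.
result = minimize_scalar(nll, bounds=(1e-5, 10.0), args=(val_losses, val_labels),
                         method='bounded')
temperature = result.x

# At test time, rescale new losses by the fitted temperature before the softmax.
def calibrated_probs(losses, temperature):
    logits = -losses / temperature
    logits -= logits.max()
    probs = np.exp(logits)
    return probs / probs.sum()

print(calibrated_probs(np.array([0.2498, 0.2510, 0.2507]), temperature))
```

With nearly identical losses, the fitted temperature will be small, which sharpens the distribution away from the near-uniform 1/N you are seeing.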