Enable uncertainty quantification on newer models #3833
I followed up on this and applied a monkey patch, basically removing
@z-q-y This is a very worthwhile project, but definitely a larger one. Can you come by OH sometime? If not, could you put together a full design doc for review? I can read through a proposed approach and give you feedback.
Uncertainty quantification based on Gal et al.'s work [https://arxiv.org/pdf/1703.04977.pdf] has been implemented in many models since issues #1119 and #1211. However, many of the newer models, e.g. deepchem.models.torch_models.DMPNNModel, do not support it yet.
In the meantime, I would like guidance on writing an ad hoc function to estimate epistemic uncertainty via Monte Carlo dropout. My main difficulty is applying dropout at inference time. For PyTorch models, I suppose it could be done by switching the model back into training mode, but I'm not sure how.
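For what it's worth, here is a minimal sketch of what I have in mind for plain PyTorch (not tied to DeepChem's model wrappers, whose internals I haven't assumed anything about). The trick is to call `model.eval()` first, so batch norm and similar layers stay in inference mode, and then flip only the `nn.Dropout` modules back into training mode so they keep sampling masks at prediction time. The `toy` network below is purely illustrative.

```python
import torch
import torch.nn as nn


def enable_mc_dropout(model: nn.Module) -> None:
    """Put the model in eval mode, then reactivate only dropout layers.

    This keeps batch norm etc. in inference mode while dropout keeps
    sampling random masks, which is what Monte Carlo dropout needs.
    """
    model.eval()
    for module in model.modules():
        if isinstance(module, nn.Dropout):
            module.train()


def mc_dropout_predict(model: nn.Module, x: torch.Tensor, n_samples: int = 50):
    """Run n_samples stochastic forward passes; return mean and std.

    The standard deviation across passes serves as an estimate of
    epistemic uncertainty in the sense of Gal & Ghahramani.
    """
    enable_mc_dropout(model)
    with torch.no_grad():
        preds = torch.stack([model(x) for _ in range(n_samples)])
    return preds.mean(dim=0), preds.std(dim=0)


# Hypothetical toy model, for illustration only.
toy = nn.Sequential(
    nn.Linear(4, 16),
    nn.ReLU(),
    nn.Dropout(p=0.5),
    nn.Linear(16, 1),
)
mean, std = mc_dropout_predict(toy, torch.randn(8, 4), n_samples=100)
print(mean.shape, std.shape)
```

With dropout active, repeated forward passes on the same input disagree, and that disagreement (the `std` tensor) is the uncertainty estimate; with `model.eval()` alone the passes would be identical and `std` would be zero everywhere.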