
Tensorflow 1.x backend: add dropout to DeepONet #1579

Open · wants to merge 1 commit into base: master

Conversation

@vl-dud (Contributor) commented Dec 3, 2023

With this change, DeepONet supports the dropout technique.
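For reference, the dropout mechanism itself (inverted dropout, as commonly applied at training time) can be sketched in plain NumPy — this is an illustration of the technique, not the PR's TensorFlow 1.x implementation:

```python
import numpy as np

def dropout(x, rate, training, rng=np.random.default_rng(0)):
    # Inverted dropout: zero each unit with probability `rate`, then
    # rescale the survivors so the expected activation is unchanged.
    if not training or rate == 0.0:
        return x
    keep_prob = 1.0 - rate
    mask = rng.random(x.shape) < keep_prob
    return x * mask / keep_prob
```

At inference time the layer is an identity map, unless dropout is deliberately kept active for Monte Carlo uncertainty estimates.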

@lululxvi (Owner) commented Dec 4, 2023

Do you use dropout to prevent overfitting?

@vl-dud (Author) commented Dec 4, 2023

> Do you use dropout to prevent overfitting?

Yes, I'm currently using dropout_rate during hyperparameter tuning. I'll report how useful it turns out to be in my case.

@lululxvi (Owner) commented Dec 4, 2023

I am curious how useful it is. In general, I have found that dropout is not that useful, and L1/L2 regularization seems good enough.

@vl-dud (Author) commented Apr 8, 2024

> I am curious how useful it is. In general, I found dropout is not that useful, and L1/L2 regularization seems good enough.

Hyperparameter tuning showed that in my case neither dropout nor regularization is required. However, I will still use dropout for uncertainty quantification (UQ).

@lululxvi (Owner) commented Apr 8, 2024

Yes, dropout is useful for UQ. How do you implement DeepONet UQ?

@vl-dud (Author) commented Apr 8, 2024

I just run model.predict many times (with dropout active) and aggregate the results to get the final prediction with a confidence interval:

import numpy as np

def predict_with_uncertainty(model, x, trial_num=100):
    # Each call to predict is a stochastic forward pass when dropout stays on.
    predictions = [model.predict(x) for _ in range(trial_num)]
    return np.mean(predictions, axis=0), np.std(predictions, axis=0)
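As a self-contained illustration of this approach, here is the same aggregation run against a toy stochastic model standing in for a dropout-enabled DeepONet (the toy model and the 95% band are assumptions for the demo, not part of the PR):

```python
import numpy as np

class ToyStochasticModel:
    """Stand-in for a dropout-enabled network: each predict() is noisy."""
    def __init__(self, seed=0):
        self.rng = np.random.default_rng(seed)

    def predict(self, x):
        return np.sin(x) + self.rng.normal(0.0, 0.1, size=np.shape(x))

def predict_with_uncertainty(model, x, trial_num=100):
    # Repeat stochastic forward passes and aggregate mean and spread.
    predictions = [model.predict(x) for _ in range(trial_num)]
    return np.mean(predictions, axis=0), np.std(predictions, axis=0)

model = ToyStochasticModel()
x = np.linspace(0.0, 1.0, 5)
mean, std = predict_with_uncertainty(model, x, trial_num=200)
# A 95% confidence band can then be taken as mean ± 1.96 * std.
lower, upper = mean - 1.96 * std, mean + 1.96 * std
```

With enough trials the mean converges to the noise-free prediction, while std reflects the dropout-induced spread.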

@lululxvi (Owner) commented Apr 8, 2024

In fact, we have this callback https://deepxde.readthedocs.io/en/latest/modules/deepxde.html#deepxde.callbacks.DropoutUncertainty . Does this work for your case?

@vl-dud (Author) commented Apr 9, 2024

Yes, I used DropoutUncertainty. The snippet above is more convenient in my case, since I use it with already-trained models. In addition, it lets me set trial_num, which is a fixed constant in DropoutUncertainty.
