
Model fails to train on v0.6.0 with default settings, having trained on v0.4.0 #218

joshnevin opened this issue Feb 16, 2022
I have found that upgrading to v0.6.0 produces the following warning and then a breaking error, neither of which I see when using v0.4.0:

warning:
/home/zcicjne/dbpgpe/labgit/LabCode/gpenv/lib64/python3.6/site-packages/scipy/stats/_distn_infrastructure.py:1844: RuntimeWarning: divide by zero encountered in double_scalars
x = np.asarray((x - loc)/scale, dtype=dtyp)
Prior solver failed to converge

error:
File "/home/zcicjne/dbpgpe/labgit/LabCode/gpenv/lib64/python3.6/site-packages/scipy/stats/_distn_infrastructure.py", line 1844, in cdf                                   
    x = np.asarray((x - loc)/scale, dtype=dtyp)
FloatingPointError: divide by zero encountered in double_scalars
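For context, here is a minimal standalone sketch (my own illustration, not mogp_emulator's code) of how this kind of FloatingPointError can be produced: if numpy is set to raise on divide-by-zero and a scipy distribution is evaluated with a zero scale, the (x - loc)/scale step inside cdf() divides by zero before scipy checks that scale > 0. The zero scale here is only an assumed trigger; I have not confirmed that this is exactly what the default priors produce.

import numpy as np
from scipy import stats

# Assumption: something in the fitting path has numpy set to raise on divide-by-zero.
np.seterr(divide="raise")

try:
    # A zero scale makes the (x - loc)/scale division inside cdf() divide by zero.
    stats.norm.cdf(1.0, loc=0.0, scale=0.0)
except FloatingPointError as exc:
    print("caught:", exc)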

steps to reproduce:
import mogp_emulator

# good_points_m / good_targets_m come from the attached pickle files (see below)
kern = 'SquaredExponential'
gp_m = mogp_emulator.GaussianProcess(good_points_m, good_targets_m, kernel=kern)
gp_m = mogp_emulator.fit_GP_MAP(gp_m)

Data (attached):
inputs: good_points_m_issue.pkl
targets: good_targets_m_issue.pkl
data.zip
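For completeness, a short sketch of loading the attached data before running the snippet above; this assumes the pickle files have been extracted from data.zip into the working directory and unpickle directly to the input/target arrays:

import pickle

# Assumed paths: the two attached files from data.zip, in the working directory.
with open("good_points_m_issue.pkl", "rb") as f:
    good_points_m = pickle.load(f)
with open("good_targets_m_issue.pkl", "rb") as f:
    good_targets_m = pickle.load(f)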

Environment (output of pip list), running Python 3.6.8:

Package        Version
-------------  -------
mogp-emulator  0.6.0
numpy          1.19.5
patsy          0.5.2
pickle-mixin   1.0.2
pip            21.3.1
scipy          1.5.4
setuptools     39.2.0
six            1.16.0

As a workaround, specifying the priors like this recovers the old behaviour from v0.4.0 (the model fits fine once this is added):

priors = mogp_emulator.Priors.GPPriors(nugget_type="adaptive", n_corr=3)
gp_m = mogp_emulator.GaussianProcess(good_points_m, good_targets_m, kernel=kern, priors=priors)
gp_m = mogp_emulator.fit_GP_MAP(gp_m)
