Heteroscedastic GPR model #1704
base: develop
Conversation
Use HeteroskedasticTFP likelihood
This is the kind of model I'd love GPflow to support in terms of the API needed to implement it, but I'd be a bit wary about including it as an explicitly supported model. Perhaps this is better shown as a notebook example?
Start to simplify hetgpr
This is looking good. I suppose the next steps are to tidy up the script and add some tests?
from ..utilities import add_linear_noise_cov

class het_GPR(GPR_with_posterior):
It would be useful to follow the existing GPflow model naming convention in this case.
class het_GPR(GPR_with_posterior):
    """
    While the vanilla GPR enforces a constant noise variance across the input
    space, here we allow the noise amplitude to vary linearly (and hence the
    noise variance to change quadratically) across the input space.
    """
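As a rough illustration of the noise model described in the docstring (a numpy sketch, not the PR's actual implementation; the function name and signature here are hypothetical), the per-point noise scale is affine in the input, so the variance added to the covariance diagonal grows quadratically:

```python
import numpy as np

def linear_noise_variance(X, constant_noise, noise_gradient):
    """Noise scale is affine in X, so the noise variance is quadratic in X."""
    scale = constant_noise + X @ noise_gradient  # [N], affine noise amplitude
    return scale ** 2  # [N], added to the diagonal of the GP covariance

X = np.array([[0.0], [1.0], [2.0]])
var = linear_noise_variance(X, constant_noise=1.0, noise_gradient=np.array([0.5]))
# scales are [1.0, 1.5, 2.0], so variances are [1.0, 2.25, 4.0]
```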
Is it true that the noise amplitude varies linearly with this model? Even with the PseudoPoissonLikelihood?
        self,
        data: RegressionData,
        kernel: Kernel,
        likelihood,
Could this type annotation be MultiLatentTFPConditional?
    def __init__(self, ndims: int = 1, **kwargs):
        self.noise_gradient = Parameter(np.ones(ndims), prior=GRADIENT_PRIOR)
        self.constant_noise = Parameter(1.0, transform=positive(lower=MIN_NOISE_SCALE))
I'm not sure it really makes sense to constrain the constant to be positive: with the gradient, the scaled value could end up very different from the constant. Maybe you should instead be doing something down in scale_transform to ensure positivity?
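One way to realise this suggestion (a numpy sketch under assumed names, not the PR's code) is to leave the constant unconstrained and push the whole affine combination through a positive transform such as softplus inside scale_transform, so the combined scale is positive everywhere rather than just the constant term:

```python
import numpy as np

def softplus(x):
    # Numerically stable softplus: log(1 + exp(x))
    return np.logaddexp(0.0, x)

def scale_transform(X, constant_noise, noise_gradient, min_noise_scale=1e-3):
    """Make the combined noise scale positive, not just the constant."""
    raw = constant_noise + X @ noise_gradient  # unconstrained affine scale
    return min_noise_scale + softplus(raw)     # strictly positive everywhere

X = np.array([[-10.0], [0.0], [10.0]])
scales = scale_transform(X, constant_noise=0.0, noise_gradient=np.array([1.0]))
# positive even where the affine part is very negative
```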
        self.noise_gradient = Parameter(np.ones(ndims), prior=GRADIENT_PRIOR)
        self.noise_curvature = Parameter(np.ones(ndims), prior=GRADIENT_PRIOR)
        self.constant_noise = Parameter(1.0, transform=positive(lower=MIN_NOISE_SCALE))
As above, I'm not sure this positive transform is really the right way to solve the problem...
PR type: enhancement
Related issue(s)/PRs:
Resolves #1703
Summary
Proposed changes
Work in progress...
What alternatives have you considered?
Minimal working example
# Put your example code in here
Release notes
Fully backwards compatible: yes / no
If not, why is it worth breaking backwards compatibility:
PR checklist
make format
make check-all