
Heteroscedastic GPR model #1704

Draft · wants to merge 19 commits into develop
Conversation

frgsimpson (Contributor):

PR type: enhancement

Related issue(s)/PRs:
Resolves #1703

Summary

Proposed changes
Work in progress...

What alternatives have you considered?

Minimal working example

# Put your example code in here

Release notes

Fully backwards compatible: yes / no

If not, why is it worth breaking backwards compatibility:

PR checklist

  • New features: code is well-documented
    • detailed docstrings (API documentation)
    • notebook examples (usage demonstration)
  • The bug case / new feature is covered by unit tests
  • Code has type annotations
  • Build checks
    • I ran the black+isort formatter (make format)
    • I locally tested that the tests pass (make check-all)
  • Release management
    • RELEASE.md updated with entry for this change
    • New contributors: I've added myself to CONTRIBUTORS.md

@frgsimpson requested a review from vdutor on July 27, 2021 at 16:20
st-- (Member) commented on Aug 27, 2021:

This is the kind of model I'd love for GPflow to support in terms of the API needed to implement it, but I would be a bit wary about including it as an explicitly supported model - perhaps this is better shown as a notebook example?

johnamcleod (Contributor) left a comment:

This is looking good. I suppose the next steps are to tidy up the script and add some tests?
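
As a concrete starting point, one shape such a test could take is sketched below. It assumes het_GPR keeps GPflow's standard predict_y interface (plausible, since it extends GPR_with_posterior), and the property checked - that the predictive noise grows along the input axis after fitting data whose noise amplitude grows with the input - is my reading of the model rather than a test taken from this PR.

import numpy as np


def check_predictive_noise_grows_with_input(model) -> None:
    # `model` is assumed to be a trained het_GPR instance fitted to data whose
    # noise amplitude increases with x. predict_y returns the mean and variance
    # of new observations, so the variance should increase along the input axis.
    X_test = np.linspace(0.0, 4.0, 10)[:, None]
    _, var = model.predict_y(X_test)
    var = var.numpy()
    assert np.all(np.diff(var[:, 0]) > 0.0)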

from ..utilities import add_linear_noise_cov


class het_GPR(GPR_with_posterior):
Contributor comment:

It would be useful if there were a way to follow the existing GPflow model naming convention in this case.


class het_GPR(GPR_with_posterior):
""" While the vanilla GPR enforces a constant noise variance across the input space, here we allow the
noise amplitude to vary linearly (and hence the noise variance to change quadratically) across the input space.
Contributor comment:

Is it true that the noise amplitude varies linearly with this model? Even with the PseudoPoissonLikelihood?
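
For the Gaussian case the docstring describes, the linear-amplitude / quadratic-variance relationship is easy to check numerically; whether it still holds under the PseudoPoissonLikelihood is exactly the open question here. A minimal, self-contained sketch (plain NumPy, independent of the code in this PR):

import numpy as np

# Toy data in the spirit of the docstring: the noise *amplitude* sigma(x) = a + b * x
# grows linearly with the input, so the noise *variance* (a + b * x) ** 2 grows quadratically.
rng = np.random.default_rng(0)
a, b = 0.05, 0.3  # constant noise and noise gradient
X = np.linspace(0.0, 4.0, 2000)[:, None]
sigma = a + b * X  # linear noise amplitude
Y = np.sin(2.0 * X) + sigma * rng.standard_normal(X.shape)

# The spread of the residuals grows roughly linearly in x, their variance quadratically.
residuals = Y - np.sin(2.0 * X)
print(residuals[:200].std(), residuals[-200:].std())  # small noise near x=0, large near x=4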

self,
data: RegressionData,
kernel: Kernel,
likelihood,
Contributor comment:

Could this type annotation be MultiLatentTFPConditional?
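
For reference, MultiLatentTFPConditional is part of GPflow's public likelihoods API, so the suggestion would look roughly like the sketch below; the RegressionData import path is an assumption, and whether this bound is too tight (e.g. for the PseudoPoissonLikelihood mentioned above) is the open question.

from gpflow.kernels import Kernel
from gpflow.likelihoods import MultiLatentTFPConditional
from gpflow.models.gpr import GPR_with_posterior
from gpflow.models.training_mixins import RegressionData  # import path is an assumption


class het_GPR(GPR_with_posterior):  # name as in this PR
    def __init__(
        self,
        data: RegressionData,
        kernel: Kernel,
        likelihood: MultiLatentTFPConditional,  # the annotation suggested above
    ):
        ...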

"\n",
" def __init__(self, ndims: int = 1, **kwargs):\n",
" self.noise_gradient = Parameter(np.ones(ndims), prior=GRADIENT_PRIOR)\n",
" self.constant_noise = Parameter(1.0, transform=positive(lower=MIN_NOISE_SCALE))\n",
Member comment:

I'm not sure it really makes sense to constrain the constant to be positive - with the gradient, the scaled value could end up very different from the constant. Maybe you should instead be doing something down in scale_transform to ensure positivity?
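
A sketch of the alternative I read this as suggesting: leave constant_noise unconstrained and enforce positivity inside the scale transform instead, for example with a softplus over the whole linear function. The function name, signature, and softplus choice are illustrations, not code from this PR or its notebook.

import numpy as np
import tensorflow as tf

from gpflow import Parameter

# Unconstrained parameters: the constant no longer needs its own positive() transform.
constant_noise = Parameter(1.0)
noise_gradient = Parameter(np.ones(1))


def scale_transform(X: tf.Tensor) -> tf.Tensor:
    # X: [N, D] (float64). Map the possibly negative linear function a + b^T x
    # through softplus so the resulting noise scale is strictly positive everywhere.
    linear = constant_noise + tf.linalg.matvec(X, noise_gradient)
    return tf.math.softplus(linear)[:, None]

This would let the prior and initialisation of constant_noise live on the unconstrained scale, which seems to be the point of the comment above.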

"\n",
" self.noise_gradient = Parameter(np.ones(ndims), prior=GRADIENT_PRIOR)\n",
" self.noise_curvature = Parameter(np.ones(ndims), prior=GRADIENT_PRIOR)\n",
" self.constant_noise = Parameter(1.0, transform=positive(lower=MIN_NOISE_SCALE))\n",
Member comment:

As above, I'm not sure this positive transform is really the right way to solve the problem...

Labels: none yet
Projects: none yet
Development: successfully merging this pull request may close the issue "Modelling linear noise in GPR".
4 participants