
Extend gp.Latent to multi-outputs #7226

Open · wants to merge 8 commits into main
Conversation

@hchen19 (Contributor) commented Mar 28, 2024

Description

Add an argument n_outputs to the gp.Latent class, with a default value of 1.
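
For context, a minimal sketch of the intended usage, assuming the parameter is named `n_outputs` as in this description (elsewhere in the diff it appears as `num_outputs`); this is not merged PyMC code:

```python
# Hypothetical usage of the proposed feature; the n_outputs parameter is
# assumed from this PR's description, not from released PyMC.
import numpy as np
import pymc as pm

X = np.linspace(0, 10, 100)[:, None]
with pm.Model():
    gp = pm.gp.Latent(cov_func=pm.gp.cov.ExpQuad(1, ls=1))
    f = gp.prior("f", X=X, n_outputs=3)  # f would have shape (3, 100)
```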

Related Issue

Checklist

Type of change

  • New feature / enhancement
  • Bug fix
  • Documentation
  • Maintenance
  • Other (please specify):

📚 Documentation preview 📚: https://pymc--7226.org.readthedocs.build/en/7226/

welcome bot commented Mar 28, 2024

💖 Thanks for opening this pull request! 💖 The PyMC community really appreciates your time and effort to contribute to the project. Please make sure you have read our Contributing Guidelines and filled in our pull request template to the best of your ability.

@hchen19 changed the title from "ENH: Extend gp.Latent to multi-outputs" to "Extend gp.Latent to multi-outputs" on Mar 28, 2024
pymc/gp/gp.py (outdated)

```diff
         else:
-            f = pm.MvNormal(name, mu=mu, cov=cov, **kwargs)
+            f_single = pm.MvNormal(name, mu=mu, cov=cov, **kwargs)
+            f = pt.stack([f_single] * num_outputs, axis=0) if num_outputs > 1 else f_single
```
Contributor commented:
This approach won't work, because what you're doing here is repeating a single GP, so each output is the same. We want each output to behave as if it were an independent draw from the same GP.
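
To illustrate the point, here is a sketch of mine (not code from the PR): stacking one MvNormal replicates a single draw, while giving the MvNormal a batch dimension yields independent draws from the same GP.

```python
# Illustrative sketch (not from this PR): stacking vs. batching.
import numpy as np
import pymc as pm
import pytensor.tensor as pt

n, n_outputs = 50, 3
X = np.linspace(0, 10, n)[:, None]

with pm.Model():
    K = pm.gp.cov.ExpQuad(1, ls=1)(X) + 1e-6 * pt.eye(n)  # jittered covariance

    # Stacked: every row of f_stacked is the *same* GP draw.
    f_single = pm.MvNormal("f_single", mu=pt.zeros(n), cov=K)
    f_stacked = pt.stack([f_single] * n_outputs, axis=0)

    # Batched: each row is an independent draw from the same GP.
    f_indep = pm.MvNormal("f_indep", mu=pt.zeros(n), cov=K, shape=(n_outputs, n))

    prior = pm.sample_prior_predictive(samples=1)
```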

```diff
-    def _build_prior(self, name, X, reparameterize=True, jitter=JITTER_DEFAULT, **kwargs):
+    def _build_prior(
+        self, name, X, num_outputs=1, reparameterize=True, jitter=JITTER_DEFAULT, **kwargs
+    ):
         mu = self.mean_func(X)
         cov = stabilize(self.cov_func(X), jitter)
         if reparameterize:
             size = np.shape(X)[0]
             v = pm.Normal(name + "_rotated_", mu=0.0, sigma=1.0, size=size, **kwargs)
```
@bwengals (Contributor) commented Mar 29, 2024

As a starting point, you'll need to make v here have size=(n_outputs, X.shape[0]) (or transpose of that). To check, you can draw some samples from the prior predictive.

```python
import numpy as np
import pymc as pm
import arviz as az

x = np.linspace(0, 10, 100)
with pm.Model() as model:
    cov = pm.gp.cov.ExpQuad(1, ls=1)
    gp = pm.gp.Latent(cov_func=cov)
    f = gp.prior("f", X=x[:, None], num_outputs=3)

    idata = pm.sample_prior_predictive(samples=1)

f = az.extract(idata.prior, var_names="f")
# plot and you should see 3 different draws from the same GP
```

You'd have to do the same for the case reparameterize=False. But honestly, that path is so rarely used that it'd be fine to just raise a NotImplementedError when n_outputs > 1.
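
Put together, a hedged sketch of what the batched reparameterization could look like (my approximation of the suggestion above, following the (n_outputs, n) convention; helper names and the inlined stabilization are assumptions, not the PR's final code):

```python
# Sketch only: an assumed shape of the batched _build_prior.
import numpy as np
import pymc as pm
import pytensor.tensor as pt

def _build_prior(self, name, X, n_outputs=1, reparameterize=True, jitter=1e-6, **kwargs):
    mu = self.mean_func(X)
    cov = self.cov_func(X) + jitter * pt.eye(np.shape(X)[0])  # stabilized covariance
    if reparameterize:
        size = (n_outputs, np.shape(X)[0]) if n_outputs > 1 else np.shape(X)[0]
        v = pm.Normal(name + "_rotated_", mu=0.0, sigma=1.0, size=size, **kwargs)
        # Rows of v @ L.T are independent GP draws; mu broadcasts over outputs.
        L = pt.slinalg.cholesky(cov)
        f = pm.Deterministic(name, mu + pt.dot(v, pt.transpose(L)))
    else:
        if n_outputs > 1:
            raise NotImplementedError("n_outputs > 1 requires reparameterize=True")
        f = pm.MvNormal(name, mu=mu, cov=cov, **kwargs)
    return f
```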

@hchen19 (Contributor, Author) commented Mar 29, 2024

Thanks for your comments and suggestions; I'll update the PR accordingly.

Contributor commented:

Glad I could help; looking forward to it!

@bwengals (Contributor) commented:

Thanks for picking this up!

```diff
     ):
         mu = self.mean_func(X)
         cov = stabilize(self.cov_func(X), jitter)
         if reparameterize:
-            size = np.shape(X)[0]
+            size = (X.shape[0], n_outputs) if n_outputs > 1 else X.shape[0]
```
@hchen19 (Contributor, Author) commented Apr 1, 2024

I updated the _build_prior() method by changing the size of v

```diff
-        v = solve_lower(L, rxx)
-        mu = self.mean_func(Xnew) + pt.dot(pt.transpose(A), v)
+        v = solve_lower(L, rxx.T)
+        mu = self.mean_func(Xnew) + pt.dot(pt.transpose(A), v).T
```
Contributor (Author) commented:

Extend _build_conditional() to multiple outputs for the case where marginal_likelihood() returns a tensor of size (n_outputs, X.shape[0]).
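
As a sanity check on the shapes in that diff (plain NumPy with stand-in values of mine, not the PR's code):

```python
# Shape check only: stand-in arrays illustrating the transposed solve above.
import numpy as np
from scipy.linalg import solve_triangular

n, m, n_outputs = 100, 25, 3
L = np.tril(np.random.rand(n, n)) + n * np.eye(n)  # stand-in for chol(Kxx)
A = np.random.rand(n, m)                           # stand-in for solve_lower(L, Kxs)
rxx = np.random.rand(n_outputs, n)                 # residuals, one row per output

v = solve_triangular(L, rxx.T, lower=True)         # (n, n_outputs)
mu_cond = (A.T @ v).T                              # (n_outputs, m): one conditional mean per output
assert mu_cond.shape == (n_outputs, m)
```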

@bwengals (Contributor) commented Apr 6, 2024

This is looking good, though I still need to check out your branch and make sure. It looks like pre-commit isn't happy though. This page should show you how to fix it. Once that's green we can trigger the tests.

@hchen19 (Contributor, Author) commented Apr 6, 2024

> This is looking good, though I still need to check out your branch and make sure. It looks like pre-commit isn't happy though. This page should show you how to fix it. Once that's green we can trigger the tests.

I fixed the pre-commit issue
