GPax

A codebase for Gaussian processes in Jax.

Disclaimer: This is not an officially supported Google product.

Gaussian process probes (GPP)

Please find algorithm descriptions in Gaussian Process Probes (GPP) for Uncertainty-Aware Probing.

To use GPP, simply call gpp with embeddings of queries and observed data.
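
A minimal sketch of this call, assuming gpp is importable from the top-level gpax package and takes query embeddings together with observed embeddings and labels; the import path and argument names are illustrative, not the verified signature:

import jax.numpy as jnp
from gpax import gpp  # import path is an assumption

# Hypothetical shapes: 100 observed examples and 20 probe queries,
# each a 512-dimensional embedding from a frozen model.
observed_embeddings = jnp.zeros((100, 512))
observed_labels = jnp.zeros((100,), dtype=jnp.int32)
query_embeddings = jnp.zeros((20, 512))

# Argument order is illustrative; consult the gpp docstring for the real API.
predictions = gpp(query_embeddings, observed_embeddings, observed_labels)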

To obtain uncertainty measurements of a Beta GP, call beta_gp_uncertainty with predictions returned by beta_gp_predict.
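
A sketch of this two-step pattern, reusing the embeddings from the sketch above; the import path and arguments are assumptions:

from gpax import beta_gp_predict, beta_gp_uncertainty  # import path is an assumption

# beta_gp_predict produces Beta GP predictions for the queries given observed data;
# beta_gp_uncertainty turns those predictions into uncertainty measurements.
beta_predictions = beta_gp_predict(query_embeddings, observed_embeddings, observed_labels)
uncertainty = beta_gp_uncertainty(beta_predictions)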

Baseline probabilistic probing methods are also implemented in this codebase, including linear probe ensembles and Gaussian process regression.

For out-of-distribution (OOD) detection, use latent_var, the variance of the latent function returned by gpp.
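
For example, a query can be flagged as OOD when its latent variance exceeds a threshold; the field name latent_var follows the text above, everything else in this sketch is illustrative:

# predictions comes from the gpp sketch above; the threshold is task-specific.
ood_scores = predictions.latent_var
is_ood = ood_scores > 1.0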

Other score-based OOD detection methods implemented in this codebase are Mahalanobis distance and maximum predicted softmax probabilities.
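
For reference, generic versions of these two scores can be written directly in JAX; the functions below are illustrative re-implementations, not the codebase's own:

import jax
import jax.numpy as jnp

def mahalanobis_scores(query_embeddings, observed_embeddings):
  # Squared Mahalanobis distance of each query to a Gaussian fit on the observed data.
  mean = observed_embeddings.mean(axis=0)
  cov = jnp.cov(observed_embeddings, rowvar=False)
  precision = jnp.linalg.inv(cov + 1e-6 * jnp.eye(cov.shape[0]))
  centered = query_embeddings - mean
  return jnp.einsum('nd,de,ne->n', centered, precision, centered)

def max_softmax_scores(logits):
  # Maximum predicted softmax probability; lower values suggest OOD inputs.
  return jax.nn.softmax(logits, axis=-1).max(axis=-1)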

Citation

@article{wang2023gpp,
  title={{Gaussian Process Probes (GPP) for Uncertainty-Aware Probing}},
  author={Zi Wang and
          Alexander Ku and
          Jason Baldridge and
          Thomas L Griffiths and
          Been Kim},
  journal={arXiv preprint arXiv:2305.18213},
  year={2023}
}

Pre-trained Gaussian processes

Please find algorithm descriptions in Pre-trained Gaussian processes for Bayesian optimization. An alternative implementation can be found at https://github.com/google-research/hyperbo.

Implemented models include vanilla Gaussian processes (GaussianProcess) as well as meta and multi-task Gaussian processes (MultiTaskGaussianProcess).

For pre-training the multi-task Gaussian process, you can run an optimizer to minimize either the empirical KL divergence (EKL) objective or the negative log likelihood (NLL) objective. Examples of evaluating these objectives can be found in the test for EKL and the test for NLL.
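
A sketch of such a pre-training loop using optax, where objective_fn(params, data), params, and data are placeholders for the EKL or NLL objective, the multi-task GP parameters, and the training tasks exercised in the linked tests:

import jax
import optax

optimizer = optax.adam(learning_rate=1e-3)
opt_state = optimizer.init(params)  # params is a placeholder for the GP parameters

@jax.jit
def train_step(params, opt_state):
  # objective_fn is a placeholder for the EKL or NLL objective being minimized.
  loss, grads = jax.value_and_grad(objective_fn)(params, data)
  updates, opt_state = optimizer.update(grads, opt_state)
  params = optax.apply_updates(params, updates)
  return params, opt_state, loss

for _ in range(1000):
  params, opt_state, loss = train_step(params, opt_state)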

We also implemented classic acquisition functions for Bayesian optimization. See GPax/bayesopt/acquisitions_test.py for an example of how to evaluate these acquisition functions.
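
For orientation, the classic expected-improvement acquisition can be computed from a GP posterior mean and standard deviation as below; this is the generic formula, not the codebase's acquisitions API (see the linked test for that):

import jax.numpy as jnp
from jax.scipy.stats import norm

def expected_improvement(mean, std, best_observed):
  # EI for minimization: expected amount by which a candidate improves on best_observed.
  z = (best_observed - mean) / std
  return (best_observed - mean) * norm.cdf(z) + std * norm.pdf(z)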

Citation

@article{wang2023hyperbo,
  title={{Pre-trained Gaussian processes for Bayesian optimization}},
  author={Zi Wang and
          George E. Dahl and
          Kevin Swersky and
          Chansoo Lee and
          Zachary Nado and
          Justin Gilmer and
          Jasper Snoek and
          Zoubin Ghahramani},
  journal={arXiv preprint arXiv:2109.08215},
  year={2023}
}

Installation

We recommend using Python 3.7 for stability.

To install the latest development version inside a virtual environment, run

python3 -m venv env-pd
source env-pd/bin/activate
pip install --upgrade pip
pip install "git+https://github.com/google-research/gpax.git#egg=gpax"
