Latent Bottlenecked Attentive Neural Processes (LBANPs)

This is the official implementation of the paper Latent Bottlenecked Attentive Neural Processes.

Why LBANPs

LBANPs are a new computationally efficient, sub-quadratic Neural Process (NP) variant whose querying complexity is independent of the number of context datapoints. LBANPs achieve results competitive with the state of the art on meta-regression, image completion, and contextual multi-armed bandits. Furthermore, the number of latent vectors lets them trade off computational cost against performance, and they scale beyond existing attention-based NP variants to larger dataset settings. A rough sketch of the idea is shown below.
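The following is a minimal, illustrative PyTorch sketch of the latent-bottleneck idea: a small, learned set of latent vectors cross-attends to the context, and target queries then attend only to those latents, so per-query cost depends on the number of latents rather than the number of context points. This is not the repository's model code; the class names, single-layer structure, and dimensions are assumptions made for exposition.

# Illustrative sketch only (not the repository's actual model code).
import torch
import torch.nn as nn

class LatentBottleneckEncoder(nn.Module):
    def __init__(self, dim=128, num_latents=8, num_heads=4):
        super().__init__()
        # Learned latent vectors: the "bottleneck" (num_latents << num_context).
        self.latents = nn.Parameter(torch.randn(num_latents, dim))
        # Latents cross-attend to the context set.
        self.cross_attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)

    def forward(self, context):              # context: (batch, num_context, dim)
        batch = context.size(0)
        latents = self.latents.unsqueeze(0).expand(batch, -1, -1)
        # Compress the variable-size context into a fixed-size latent set.
        latents, _ = self.cross_attn(latents, context, context)
        return latents                       # (batch, num_latents, dim)

class QueryDecoder(nn.Module):
    def __init__(self, dim=128, num_heads=4):
        super().__init__()
        # Queries attend only to the latents, so querying cost is
        # independent of the number of context datapoints.
        self.cross_attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)

    def forward(self, queries, latents):     # queries: (batch, num_targets, dim)
        out, _ = self.cross_attn(queries, latents, latents)
        return out                           # (batch, num_targets, dim)

# Example: 512 context points are compressed into 8 latents; querying
# 100 targets then only attends over those 8 latents.
encoder, decoder = LatentBottleneckEncoder(), QueryDecoder()
context = torch.randn(2, 512, 128)
targets = torch.randn(2, 100, 128)
out = decoder(targets, encoder(context))     # shape: (2, 100, 128)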

Install

Create and activate a conda environment, then install the dependencies listed in requirements.txt:

conda create --name lbanp python=3.7
conda activate lbanp
pip install -r requirements.txt

Usage

Please see each task's directory for task-specific usage instructions.

Reference

For technical details, please refer to the conference version of our paper.

@inproceedings{feng2023latent,
    title={Latent Bottlenecked Attentive Neural Processes},
    author={Leo Feng and Hossein Hajimirsadeghi and Yoshua Bengio and Mohamed Osama Ahmed},
    booktitle={International Conference on Learning Representations},
    year={2023},
    url={https://openreview.net/forum?id=yIxtevizEA}
}

Acknowledgement

This code is based on the official code base of Transformer Neural Processes (which is in turn based on the official code base of Bootstrapping Neural Processes). We also use related parts from the PyTorch implementation of Perceiver.
