
MTMLDA: Within-Chain Parallelism for Multilevel MCMC based on Prefetching

Important

MTMLDA is a library developed in the course of a research project, not as a dedicated tool. As such, it has been tested for a number of example use cases, but not with an exhaustive test suite. Therefore, we currently do not intend to upload this library to a public index.

This repository contains a specialized implementation of the Multilevel Delayed Acceptance (MLDA) algorithm. MLDA is a multilevel Markov chain Monte Carlo (MCMC) method proposed in earlier work. Like other multilevel sampling procedures, MLDA utilizes a hierarchy of models that approximate the target distribution with varying fidelities. The implemented version adds within-chain parallelism through prefetching: the speculative expansion of possible future states of the Markov chain into a binary decision tree. The target density evaluations at these states can be performed in parallel, potentially increasing MCMC execution speed. This is particularly useful in scenarios where burn-in is significant, such that parallelization over multiple chains is inefficient. We combine MLDA with asynchronous prefetching to make full use of a hierarchy of models with potentially vastly different evaluation times. The theoretical background and conceptual setup of the implementation can be found in the accompanying publication, Scalable Bayesian Inference of Large Simulations via Asynchronous Prefetching Multilevel Delayed Acceptance (to be published).
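To illustrate the prefetching idea itself, the sketch below shows plain, single-level prefetching for a random-walk Metropolis sampler; it is a conceptual example and does not use the MTMLDA API. All names (`log_density`, `prefetching_mh`, the Gaussian toy target) are illustrative assumptions. The sampler expands the next few accept/reject decisions into a binary tree, evaluates the target density at all speculative proposals concurrently, and then resolves the decisions along one root-to-leaf path without further evaluations.

```python
import math
import random
from concurrent.futures import ThreadPoolExecutor


def log_density(x):
    # Stand-in for an expensive posterior evaluation (standard normal).
    return -0.5 * x * x


def prefetching_mh(x0, num_decisions, depth=3, step=0.5, seed=0):
    rng = random.Random(seed)
    current, logp_current = x0, log_density(x0)
    chain = [current]
    decisions_done = 0

    with ThreadPoolExecutor() as pool:
        while decisions_done < num_decisions:
            # Binary decision tree in heap layout: node 1 is the current state,
            # node 2*i is "proposal at node i accepted", node 2*i + 1 is "rejected".
            num_internal = 2 ** depth  # internal nodes are 1 .. num_internal - 1
            states, logps = {1: current}, {1: logp_current}
            for node in range(1, num_internal):
                proposal = states[node] + rng.gauss(0.0, step)
                states[2 * node] = proposal          # accept branch: move to proposal
                states[2 * node + 1] = states[node]  # reject branch: stay put

            # Prefetch: evaluate the density at all speculative proposals in
            # parallel (in a real setting these would be expensive model runs).
            accept_nodes = [2 * node for node in range(1, num_internal)]
            evaluations = pool.map(log_density, [states[n] for n in accept_nodes])
            for node, logp in zip(accept_nodes, evaluations):
                logps[node] = logp
            for node in range(1, num_internal):  # reject branches reuse known values
                logps[2 * node + 1] = logps[node]

            # Resolve `depth` accept/reject decisions along one root-to-leaf
            # path; no further density evaluations are needed here.
            node = 1
            while node < num_internal and decisions_done < num_decisions:
                if rng.random() < math.exp(min(0.0, logps[2 * node] - logps[node])):
                    node = 2 * node
                else:
                    node = 2 * node + 1
                chain.append(states[node])
                decisions_done += 1
            current, logp_current = states[node], logps[node]

    return chain


samples = prefetching_mh(x0=0.0, num_decisions=1000, depth=3)
```

A tree of depth d costs 2^d − 1 speculative density evaluations for d chain steps, which is why combining prefetching with a model hierarchy of vastly different evaluation times, as described above, is attractive.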

Installation

The library in this repository is a Python package that can be readily installed via pip. Simply run

pip install .

For development, we recommend using the great uv project management tool, for which MTMLDA provides a universal lock file. To set up a reproducible environment, run

uv sync --all-groups

To render images from generated dot files, you also need to have Graphviz installed on your system.
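For instance, assuming a tree has been exported to a file named mltree.dot (a placeholder name), an image can be rendered with Graphviz's dot command:

dot -Tpng mltree.dot -o mltree.png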

Usage

The documentation provides further information regarding usage, technical setup and API. Alternatively, you can check out the runnable examples.

License

This Software is distributed under the MIT license.

It has been developed as part of the innovation study ScalaMIDA. It has received funding through the Inno4scale project, which is funded by the European High-Performance Computing Joint Undertaking (JU) under Grant Agreement No 101118139. The JU receives support from the European Union’s Horizon Europe Programme.
