This is a Python implementation of the following paper:
[1] Gao, M., Aragam, B. (2021). Efficient Bayesian network structure learning via local Markov boundary search (NeurIPS 2021).
If you find this code useful, please consider citing:
@inproceedings{gao2021tam,
    author = {Gao, Ming and Aragam, Bryon},
    booktitle = {Advances in Neural Information Processing Systems},
    title = {{Efficient Bayesian network structure learning via local Markov boundary search}},
    year = {2021}
}
The TAM algorithm learns the structure of a directed acyclic graph (DAG) G. Given data from a distribution P(X) that satisfies the entropic conditions in the paper above, TAM efficiently recovers the DAG G that generated the samples.
Requirements:
- python
- numpy
- igraph
- networkx
Files:
- tam.py: Main function to run the algorithm; see the demo below
- utils.py: Helper functions to simulate data and evaluate results
- entropy.py: Minimax optimal entropy estimator (see the reference below and the sketch after this list)
- coeffs.txt: Best polynomial approximation coefficients used by entropy.py
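As a rough illustration of the quantity that entropy.py estimates, the sketch below computes a simple plug-in (maximum likelihood) entropy of a discrete sample. This is only an illustrative baseline, not the minimax estimator used by the repository; the function name `plugin_entropy` is ours.

```python
import numpy as np

def plugin_entropy(x):
    """Plug-in (maximum likelihood) entropy of a discrete sample, in nats.

    Illustration only: entropy.py uses a minimax optimal estimator with
    better bias properties when the alphabet is large relative to the
    sample size; this naive version is just the entropy of the
    empirical distribution.
    """
    _, counts = np.unique(np.asarray(x), return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log(p)).sum())

# Entropy of a fair coin is log(2) ~= 0.693 nats.
print(plugin_entropy([0, 1, 0, 1, 1, 0]))
```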
Generate a Tree graph with 5 nodes, then generate data from a MOD model with p=0.2 and sample size 1000.
from utils import *
from tam import TAM

G = simulate_dag(d=5, s0=5, graph_type='Tree')
X = sample_from_mod(G=G, n=1000, prob=0.2)

Apply the algorithm with kappa=0.005 and omega=0.001.
model = TAM(X)
model.train(kappa=0.005, omega=0.001)

Print the result and the SHD against the true graph.
print(model.Gr)
print(count_accuracy(G, model.Gr)['shd'])

We adopt the minimax optimal entropy estimator from here.
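For reference, the SHD (structural Hamming distance) reported above counts the edge additions, deletions, and reversals needed to turn the estimated DAG into the true one. The sketch below is a minimal stand-alone version under the assumption that both graphs are given as binary adjacency matrices; it may differ from the exact convention used by count_accuracy in utils.py, so treat it as illustrative only.

```python
import numpy as np

def shd(A_true, A_est):
    """Structural Hamming distance between two DAG adjacency matrices.

    Counts missing edges, extra edges, and reversed edges, with a
    reversal counted as a single error (one common convention).
    Assumes binary matrices with no self-loops; this is a sketch,
    not the repository's count_accuracy implementation.
    """
    A_true = np.asarray(A_true, dtype=bool)
    A_est = np.asarray(A_est, dtype=bool)
    # Compare skeletons (undirected versions); each undirected mismatch
    # appears twice in the symmetric difference, hence the division by 2.
    und_true = A_true | A_true.T
    und_est = A_est | A_est.T
    skeleton_errors = int((und_true != und_est).sum()) // 2
    # Edges present in both graphs but pointing in opposite directions.
    reversed_edges = int((A_true & A_est.T).sum())
    return skeleton_errors + reversed_edges

# Example: true graph 0 -> 1 -> 2; estimate reverses one edge and adds another.
A_true = np.array([[0, 1, 0],
                   [0, 0, 1],
                   [0, 0, 0]])
A_est = np.array([[0, 1, 1],
                  [0, 0, 0],
                  [0, 1, 0]])
print(shd(A_true, A_est))  # 2: one reversed edge (1->2 vs 2->1) plus one extra edge (0->2)
```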