official repo for the NeurIPS 2022 paper "A Solver-Free Framework for Scalable Learning in Neural ILP Architectures"

dair-iitd/ilploss


A Solver-Free Framework for Scalable Learning in Neural ILP Architectures

This repository contains the code to reproduce the results reported in the paper A Solver-Free Framework for Scalable Learning in Neural ILP Architectures, accepted at NeurIPS 2022. We also provide the core components of our technique as a lightweight Python package.

Install

```
git clone https://github.com/dair-iitd/ilploss
cd ilploss
conda env create -f env_export.yaml
conda activate ilploss
```

Download and unzip the data from here into a directory named data/.

We recommend mamba, which drastically speeds up conda environment creation.

The installation has been tested on Linux.

Run

```
./trainer.py --config <path-to-config>
```

All our experiments are available as config files in the conf/ directory. For example, to train and test ILP-Loss on random constraints in the binary domain with 8 ground-truth constraints and dataset seed 0, run:

```
./trainer.py --config conf/random_constraints/binary_random/ilploss/8x16/0.yaml
```
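Other settings follow the same path pattern, varying only the config file. As a sketch, the commands for a sweep over dataset seeds can be generated with a small shell loop (the seed range 0-4 is an assumption; check which seed files are actually present under conf/):

```shell
# Print the training commands for dataset seeds 0-4 of the binary 8x16
# setting; pipe the output to `sh` to execute them. The seed range is an
# assumption -- list the conf/ subdirectory to see which seeds are provided.
for seed in 0 1 2 3 4; do
    echo "./trainer.py --config conf/random_constraints/binary_random/ilploss/8x16/${seed}.yaml"
done
```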

Citation

```
@inproceedings{ilploss,
  author = {Nandwani, Yatin and Ranjan, Rishabh and Mausam and Singla, Parag},
  title = {A Solver-Free Framework for Scalable Learning in Neural ILP Architectures},
  booktitle = {Advances in Neural Information Processing Systems 35: Annual Conference on Neural Information Processing Systems 2022, NeurIPS 2022, November 28 - December 9, 2022},
  year = {2022},
}
```
