Living Survey for papers on Compositional Generalization in NLP (Vue; updated Jun 3, 2024)
Minimal model of tool discovery and tool innovation using active inference
A curated list of Composable AI methods: building AI systems by composing modules.
Object-Centric Disentangled Mechanisms
Multiple paper open-source codes of the Microsoft Research Asia DKI group
Official PyTorch implementation of the CVPR 2023 paper "Learning Conditional Attributes for Compositional Zero-Shot Learning"
ReCOGS: How Incidental Details of a Logical Form Overshadow an Evaluation of Semantic Interpretation
Diverse Demonstrations Improve In-context Compositional Generalization
Official implementation of ICLR 2023 paper "A Minimalist Dataset for Systematic Generalization of Perception, Syntax, and Semantics"
This repository shares the most important sources used for the research paper "More Diverse Training, Better Compositionality! Evidence from Multimodal Language Learning" by Caspar Volquardsen, Jae Hee Lee, Cornelius Weber, and Stefan Wermter.
Code from the article "Lost in Latent Space: Examining Failures of Disentangled Models at Combinatorial Generalisation" (NeurIPS 2022)
Distributional Generalization in NLP. A roadmap.
Baby Abstract Reasoning Corpus (BabyARC) dataset engine, for generating grid-world-based abstract reasoning tasks on a large scale.
The codebase for Inducing Causal Structure for Interpretable Neural Networks
Causal Abstraction of Neural Models Trained to Solve ReaSCAN
ReaSCAN is a synthetic navigation task that requires models to reason about surroundings over syntactically difficult languages. (NeurIPS '21)