Code for the model presented in the paper: "code2seq: Generating Sequences from Structured Representations of Code"
This repository contains a PyTorch implementation of the paper "The Lottery Ticket Hypothesis: Finding Sparse, Trainable Neural Networks" by Jonathan Frankle and Michael Carbin that can be easily adapted to any model/dataset.
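The lottery ticket procedure that repository implements (iterative magnitude pruning with weight rewinding) can be sketched as follows. This is a minimal NumPy stand-in, not the repository's API; the function names and the tie-breaking at the pruning threshold are illustrative assumptions.

```python
import numpy as np

def magnitude_prune_mask(weights, prune_fraction):
    """Binary mask that removes the smallest-magnitude weights.

    prune_fraction is the fraction of weights to zero out; ties at the
    threshold are handled coarsely (a sketch, not the paper's exact code).
    """
    flat = np.abs(weights).ravel()
    k = int(len(flat) * prune_fraction)  # number of weights to remove
    if k == 0:
        return np.ones_like(weights)
    threshold = np.partition(flat, k - 1)[k - 1]
    return (np.abs(weights) > threshold).astype(weights.dtype)

def lottery_ticket_round(trained_weights, init_weights, prune_fraction):
    """One round of iterative magnitude pruning: prune the smallest
    *trained* weights, then rewind the survivors to their original
    initialization -- the candidate 'winning ticket'."""
    mask = magnitude_prune_mask(trained_weights, prune_fraction)
    return mask * init_weights, mask
```

In the full procedure this round is repeated: retrain the rewound sparse network, prune again, and rewind again, increasing sparsity each time.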
Lanczos Network, Graph Neural Networks, Deep Graph Convolutional Networks, Deep Learning on Graph Structured Data, QM8 Quantum Chemistry Benchmark, ICLR 2019
[ICLR'19] Meta-learning with differentiable closed-form solvers
PyTorch code for ICLR 2019 paper: Self-Monitoring Navigation Agent via Auxiliary Progress Estimation
The Reinforcement-Learning-Related Papers of ICLR 2019
A simplified PyTorch implementation of GANsynth
[ICLR'19] Complement Objective Training
Code for the paper 'Neural Persistence: A Complexity Measure for Deep Neural Networks Using Algebraic Topology'
Variance Networks: When Expectation Does Not Meet Your Expectations, ICLR 2019
Single shot neural network pruning before training the model, based on connection sensitivity
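The connection-sensitivity idea behind that single-shot, before-training pruning method can be sketched as scoring each weight by the magnitude of (weight × gradient) on a batch of data and keeping only the top-scoring connections. The snippet below is a hedged NumPy illustration of that scoring rule, not the repository's implementation; the function name and keep-ratio handling are assumptions.

```python
import numpy as np

def connection_sensitivity_mask(weights, grads, keep_ratio):
    """Score each connection by |weight * gradient| (its sensitivity:
    roughly, how much the loss changes if the connection is removed)
    and keep only the top `keep_ratio` fraction -- a single-shot
    pruning decision made before any training."""
    scores = np.abs(weights * grads)
    k = max(1, int(scores.size * keep_ratio))
    threshold = np.sort(scores.ravel())[::-1][k - 1]
    return (scores >= threshold).astype(weights.dtype)
```

The pruned network is then trained normally with the mask held fixed, so no prune-retrain cycles are needed.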
We propose a Seed-Augment-Train/Transfer (SAT) framework that includes a procedure for generating synthetic seed image datasets for languages with different numeral systems, using freely available open font files
PyTorch implementation of "Variational Autoencoders with Jointly Optimized Latent Dependency Structure" [ICLR 2019]
✂️ Repository for our ICLR 2019 paper: Discovery of Natural Language Concepts in Individual Units of CNNs
Implementation of https://arxiv.org/pdf/1805.12352.pdf (ICLR 2019)