
# Awesome-Code

- Links to a curated list of awesome implementations of neural network models (TensorFlow, Torch, Theano, Keras, ...).
- Mainly question answering, machine comprehension, and sentiment analysis.
- Contributions are welcome.

## Table of Contents

- [Python](#python)
- [TensorFlow](#tensorflow)
- [Theano](#theano)
- [Keras](#keras)
- [Torch](#torch)
- [Matlab](#matlab)
- [Deep Reinforcement Learning](#deep-reinforcement-learning)
- [Machine learning and deep learning tutorials, articles and other resources](#machine-learning-and-deep-learning-tutorials-articles-and-other-resources)
- [People](#people)

## Python

- [context2vec: Learning Generic Context Embedding with Bidirectional LSTM](https://github.com/orenmel/context2vec)
- [Deep Unordered Composition Rivals Syntactic Methods for Text Classification (Deep Averaging Networks, ACL 2015)](https://github.com/miyyer/dan)

## TensorFlow

- [Neural Turing Machine (NTM)](https://github.com/carpedm20/NTM-tensorflow) (Taehoon Kim, TensorFlow)
- [Neural Turing Machine (NTM)](https://github.com/kaishengtai/torch-ntm) (Kai Sheng Tai, Torch)
- [Neural Turing Machine (NTM)](https://github.com/shawntan/neural-turing-machines) (Shawn Tan, Theano)
- [Neural Turing Machine (NTM)](https://github.com/fumin/ntm) (Fumin, Go)
- [Neural Turing Machine (NTM)](https://github.com/snipsco/ntm-lasagne) (Snips, Lasagne)
- [Neural GPUs Learn Algorithms](https://github.com/tensorflow/models/tree/master/neural_gpu)
- [A Neural Attention Model for Abstractive Summarization](https://github.com/BinbinBian/neural-summary-tensorflow)
- [Recurrent Convolutional Memory Network](https://github.com/carpedm20/RCMN)
- [End-To-End Memory Networks](https://github.com/carpedm20/MemN2N-tensorflow) (@carpedm20; a minimal sketch of one memory hop follows this list)
- [End-To-End Memory Networks](https://github.com/domluna/memn2n) (@domluna)
- [Neural Variational Inference for Text Processing](https://github.com/carpedm20/variational-text-tensorflow) (dataset: [WikiQA Corpus]())
- [Word2Vec](https://github.com/carpedm20/word2vec-tensorflow)
- [CNN code for insurance QA (question-answer matching)](https://github.com/BinbinBian/insuranceQA-cnn) (dataset: [InsuranceQA Corpus](https://github.com/shuzi/insuranceQA))
- [Some experiments on MovieQA with Hsieh, Tom and Huang in AMLDS](https://github.com/YCKung/MovieQA)
- [Teaching Machines to Read and Comprehend](https://github.com/carpedm20/attentive-reader-tensorflow)
- [Convolutional Neural Networks for Sentence Classification (Kim, EMNLP 2014)](https://github.com/dennybritz/cnn-text-classification-tf) (TensorFlow)
- [Convolutional Neural Networks for Sentence Classification (Kim, EMNLP 2014)](https://github.com/yoonkim/CNN_sentence) (Theano)
- [Separating Answers from Queries for Neural Reading Comprehension](https://github.com/dirkweissenborn/qa_network)
- [Neural Associative Memory for Dual-Sequence Modeling](https://github.com/dirkweissenborn/dual_am_rnn)
- [The Ubuntu Dialogue Corpus: A Large Dataset for Research in Unstructured Multi-Turn Dialogue Systems](https://github.com/dennybritz/chatbot-retrieval)
- [Key-Value Memory Networks for Directly Reading Documents](https://github.com/siyuanzhao/key-value-memory-networks)
- [A statistical natural language generator for spoken dialogue systems (SIGDIAL 2016 short paper)](https://github.com/UFAL-DSG/tgen)
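Several entries above (and in the Theano list below) implement End-To-End Memory Networks (MemN2N). The snippet below is not taken from any of the linked repositories; it is a minimal NumPy sketch of a single memory hop, with illustrative names (`memn2n_hop`, `memory_A`, `memory_C`) introduced here for clarity: the query attends over the input-memory embeddings, the attention weights pool the output-memory embeddings, and the result updates the controller state.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over the last axis."""
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def memn2n_hop(query_u, memory_A, memory_C, H=None):
    """One memory hop of an End-To-End Memory Network (MemN2N).

    query_u  : (d,)   embedded question / current controller state
    memory_A : (n, d) input-memory embeddings of the n context sentences
    memory_C : (n, d) output-memory embeddings of the same sentences
    H        : (d, d) optional linear map on the controller state between hops
    """
    p = softmax(memory_A @ query_u)    # attention over the n memory slots
    o = memory_C.T @ p                 # attention-weighted output memory
    u_next = (H @ query_u if H is not None else query_u) + o
    return u_next, p

# Toy example: 5 memory slots, 8-dimensional embeddings.
rng = np.random.default_rng(0)
u = rng.normal(size=8)
A = rng.normal(size=(5, 8))
C = rng.normal(size=(5, 8))
u_next, attention = memn2n_hop(u, A, C)
print(attention.round(3), u_next.shape)
```

Real implementations stack several such hops and learn the embedding matrices jointly with the final answer-prediction layer; the linked repositories add those training details on top of this core step.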
## Theano

- [End-To-End Memory Networks, formerly known as Weakly Supervised Memory Networks](https://github.com/npow/MemN2N)
- [Memory Networks](https://github.com/npow/MemNN)
- [Dynamic Memory Networks](https://github.com/swstarlab/DynamicMemoryNetworks)
- [Ask Me Anything: Dynamic Memory Networks for Natural Language Processing](https://github.com/YerevaNN/Dynamic-memory-networks-in-Theano) (YerevaNN, Theano)
- [Memory Networks](https://github.com/facebook/MemNN) (Facebook, Torch/Matlab)
- [Recurrent Neural Networks with External Memory for Language Understanding](https://github.com/npow/RNN-EM)
- [Attention Sum Reader model as presented in "Text Comprehension with the Attention Sum Reader Network"](https://github.com/rkadlec/asreader) (dataset: [CNN and Daily Mail news data QA](); a sketch of the pointer-sum scoring step follows the framework lists below)
- [Character-level language models](https://github.com/lipiji/rnn-theano)
- [Hierarchical Encoder-Decoder](https://github.com/BinbinBian/hierarchical-encoder-decoder)
- [A Recurrent Latent Variable Model for Sequential Data](https://github.com/jych/nips2015_vrnn)
- [A Fast Unified Model for Sentence Parsing and Understanding (Stack-augmented Parser-Interpreter Neural Network)](https://github.com/stanfordnlp/spinn)
- [Semi-supervised Question Retrieval with Gated Convolutions (NAACL 2016)](https://github.com/taolei87/rcnn)
- [Molding CNNs for text: non-linear, non-consecutive convolutions (EMNLP 2015)](https://github.com/taolei87/rcnn)
- [Tree RNNs](https://github.com/ofirnachum/tree_rnn)
- [A Character-Level Decoder without Explicit Segmentation for Neural Machine Translation (ACL 2016)](https://github.com/nyu-dl/dl4mt-cdec)
- [Charagram: Embedding Words and Sentences via Character n-grams](https://github.com/jwieting/charagram)
- [Towards Universal Paraphrastic Sentence Embeddings](https://github.com/jwieting/iclr2016)
- [Dependency-based Convolutional Neural Networks for Sentence Embedding](https://github.com/cosmmb/DCNN)
- [Siamese-LSTM: Siamese Recurrent Neural Network with LSTM for evaluating semantic similarity between sentences (AAAI 2016)](https://github.com/aditya1503/Siamese-LSTM)

## Keras

- [Learning text representation using recurrent convolutional neural network with highway layers](https://github.com/wenying45/deep_learning_tutorial/tree/master/rcnn-hw)

## Torch

- [Sequence-to-sequence model with LSTM encoder/decoders and attention](https://github.com/harvardnlp/seq2seq-attn)
- [Chains of Reasoning over Entities, Relations, and Text using Recurrent Neural Networks](https://github.com/rajarshd/ChainsOfReasoning/tree/master/model)
- [Recurrent Memory Network for Language Modeling](https://github.com/ketranm/RMN)
- [Bag of Tricks for Efficient Text Classification (FastText)](https://github.com/kemaswill/fasttext_torch)
- [Bag of Tricks for Efficient Text Classification (FastText)](https://github.com/facebookresearch/fastText) (Facebook, C++)
- [Character-Aware Neural Language Models (AAAI 2016)](https://github.com/yoonkim/lstm-char-cnn)
- [Improved Semantic Representations From Tree-Structured Long Short-Term Memory Networks (Tree-LSTM)](https://github.com/stanfordnlp/treelstm)
- [A Neural Attention Model for Abstractive Summarization](https://github.com/facebook/NAMAS)
- [Text Understanding with the Attention Sum Reader Network, Kadlec et al., ACL 2016](https://github.com/ganeshjawahar/torch-teacher)
- [A Thorough Examination of the CNN/Daily Mail Reading Comprehension Task, Chen et al., ACL 2016](https://github.com/ganeshjawahar/torch-teacher)
- [The Goldilocks Principle: Reading Children's Books with Explicit Memory Representations, Hill et al., ICLR 2016](https://github.com/ganeshjawahar/torch-teacher)

## Matlab

- [When Are Tree Structures Necessary for Deep Learning of Representations](https://github.com/jiweil/Sequence-Models-on-Stanford-Treebank)
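The Attention Sum Reader entries in the Theano and Torch lists above (asreader and torch-teacher) share the same pointer-sum answer-scoring idea. The snippet below is not taken from either repository; it is a minimal NumPy sketch under the assumption that contextual token embeddings and a query embedding have already been produced by a bidirectional recurrent encoder, with illustrative names (`attention_sum_read`, `doc_states`, `candidates`) introduced here: attention over document positions is computed against the query, and each candidate's score is the sum of attention mass over the positions where it occurs.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_sum_read(doc_states, query_state, doc_tokens, candidates):
    """Pointer-sum answer scoring in the style of the Attention Sum Reader.

    doc_states  : (T, d) contextual embedding of each document token
                  (assumed to come from a bidirectional recurrent encoder)
    query_state : (d,)   embedding of the question
    doc_tokens  : length-T sequence of token ids aligned with doc_states
    candidates  : iterable of candidate answer token ids
    """
    attention = softmax(doc_states @ query_state)       # attention over positions
    scores = {c: sum(a for a, t in zip(attention, doc_tokens) if t == c)
              for c in candidates}                      # sum attention per candidate
    best = max(scores, key=scores.get)
    return best, scores

# Toy example: 6 document tokens, 3 candidate entities.
rng = np.random.default_rng(1)
states = rng.normal(size=(6, 4))
query = rng.normal(size=4)
tokens = [101, 7, 102, 7, 103, 101]
best, scores = attention_sum_read(states, query, tokens, [101, 102, 103])
print(best, {k: round(float(v), 3) for k, v in scores.items()})
```

The paper's key point is that the answer probability is accumulated directly over token positions where a candidate appears, rather than passing a blended context vector through a further prediction layer.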

## Deep Reinforcement Learning

---

## Machine learning and deep learning tutorials, articles and other resources

## People

- [carpedm20](https://github.com/carpedm20)
