
Practical Neural Networks for NLP

A tutorial on "Practical Neural Networks for NLP: From Theory to Code" given by Chris Dyer, Yoav Goldberg, and Graham Neubig at EMNLP 2016 in Austin. The tutorial covers the basics of neural networks for NLP and how to implement a variety of networks simply and efficiently in the DyNet toolkit.

  • Slides, part 1: Basics

    • Computation graphs and their construction
    • Neural networks in DyNet (a minimal training sketch follows this list)
    • Recurrent neural networks
    • Minibatching
    • Adding new differentiable functions
  • Slides, part 2: Case studies in NLP

    • Tagging with bidirectional RNNs and character-based embeddings (a simplified tagger sketch follows this list)
    • Transition-based dependency parsing
    • Structured prediction meets deep learning
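
The part 1 topics map onto a handful of core DyNet idioms. The sketch below is not taken from this repository; it is a minimal illustration, assuming the `dynet` Python package (2.x API), of building a fresh computation graph per example, wiring up a one-hidden-layer network, and training it on XOR with plain SGD:

```python
import dynet as dy

pc = dy.ParameterCollection()        # container for all trainable parameters
p_W = pc.add_parameters((8, 2))      # input -> hidden weights
p_b = pc.add_parameters(8)           # hidden bias
p_V = pc.add_parameters((1, 8))      # hidden -> output weights
trainer = dy.SimpleSGDTrainer(pc)

data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]
for epoch in range(200):
    for (x1, x2), y in data:
        dy.renew_cg()                               # start a new computation graph
        # dy.parameter() adds parameters to the graph (newer DyNet also
        # allows using p_W, p_b, p_V directly as expressions)
        W, b, V = dy.parameter(p_W), dy.parameter(p_b), dy.parameter(p_V)
        x = dy.inputVector([x1, x2])                # input expression
        h = dy.tanh(W * x + b)                      # hidden layer
        y_pred = dy.logistic(V * h)                 # prediction in (0, 1)
        loss = dy.squared_distance(y_pred, dy.scalarInput(y))
        loss.value()                                # forward pass
        loss.backward()                             # backward pass (gradients)
        trainer.update()                            # SGD step
```

Because `dy.renew_cg()` discards the previous graph, the graph is rebuilt on the fly for every example, which is what makes DyNet's define-by-run style convenient for variably sized NLP inputs.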
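For the part 2 case studies, the following rough sketch (again not the repository's code, with hypothetical vocabulary and tag-set sizes) shows the bidirectional-RNN tagging pattern: embed each word, run forward and backward LSTMs over the sentence, concatenate the two hidden states at each position, and score tags with a softmax. Character-based embeddings are omitted for brevity:

```python
import dynet as dy

VOCAB, TAGS, EMB, HID = 1000, 10, 64, 50             # hypothetical sizes
pc = dy.ParameterCollection()
E = pc.add_lookup_parameters((VOCAB, EMB))            # word-embedding table
fwd = dy.LSTMBuilder(1, EMB, HID, pc)                 # left-to-right LSTM
bwd = dy.LSTMBuilder(1, EMB, HID, pc)                 # right-to-left LSTM
p_O = pc.add_parameters((TAGS, 2 * HID))              # tag-scoring projection
trainer = dy.SimpleSGDTrainer(pc)

def sentence_loss(word_ids, tag_ids):
    dy.renew_cg()
    O = dy.parameter(p_O)
    embs = [E[w] for w in word_ids]                   # one embedding expression per word
    f_outs = fwd.initial_state().transduce(embs)      # forward hidden states
    b_outs = list(reversed(bwd.initial_state().transduce(list(reversed(embs)))))
    losses = []
    for f, b, t in zip(f_outs, b_outs, tag_ids):
        scores = O * dy.concatenate([f, b])           # tag scores at this position
        losses.append(dy.pickneglogsoftmax(scores, t))
    return dy.esum(losses)

# one toy update on a made-up three-word sentence
loss = sentence_loss([4, 27, 3], [1, 0, 2])
loss.value()
loss.backward()
trainer.update()
```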
