Tutorial on how to use AllenNLP for sequence modeling (including hierarchical LSTMs and CRF decoding)

AllenNLP Tutorial

New (July 20, 2020): The tutorial now uses AllenNLP 1.0, so it is up to date with the latest release of AllenNLP.

This tutorial is meant to teach you both how to use AllenNLP and a principled approach to doing deep learning research in NLP. The content is mirrored (and updated) on my personal site, jbarrow.ai; if you're interested in reading the latest version, you can find it there. The code, however, will always live in this repository. The tutorial consists of ten sections plus an appendix, and I recommend working through them in order:

  1. Setup
  2. Building a Dataset Reader
  3. Building a Baseline Model
  4. Configuring Experiments
  5. Tackling Your Own Experiments
  6. Predictors
  7. Debugging [WIP]
  8. Advanced Modeling: Hierarchical LSTMs, CRF Decoding, and BERT [WIP]
  9. Digging Into the Documentation [WIP]
  10. Hyperparameter Search: AllenTune [WIP]
  11. Appendix: Migrating from AllenNLP 0.9 to 1.0 [WIP]

The tutorial assumes no prior familiarity with AllenNLP, and walks through using it as an experimentation platform driven by JSON configurations.
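To give a flavor of what those JSON configurations look like, here is a minimal sketch of the kind of config the tutorial builds up to. The registered names `sequence_tagging` and `simple_tagger` are components that ship with AllenNLP; the file paths, dimensions, and hyperparameters below are placeholders, not values from the tutorial:

```json
{
  "dataset_reader": { "type": "sequence_tagging" },
  "train_data_path": "data/train.tsv",
  "validation_data_path": "data/dev.tsv",
  "model": {
    "type": "simple_tagger",
    "text_field_embedder": {
      "token_embedders": {
        "tokens": { "type": "embedding", "embedding_dim": 50 }
      }
    },
    "encoder": {
      "type": "lstm",
      "input_size": 50,
      "hidden_size": 100
    }
  },
  "data_loader": { "batch_size": 32, "shuffle": true },
  "trainer": { "optimizer": "adam", "num_epochs": 10 }
}
```

A config like this is passed to the `allennlp train` command (e.g. `allennlp train config.json -s output/`), which constructs the dataset reader, model, and trainer from the registered types and runs the experiment; the tutorial's configuration section covers each of these blocks in detail.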
