GitHub release (latest by date) · GitHub Workflow Status · DOI

An introduction to Automatic Differentiation

theory/: AD background theory, introducing the concepts of forward and reverse mode as well as Jacobian-vector / vector-Jacobian products. To go deeper, check out the excellent JAX autodiff cookbook and @mattjj's talk on autograd.
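
As a quick orientation (a minimal sketch, not taken from this repository), jax exposes the two modes directly: `jax.jvp` computes a Jacobian-vector product in forward mode and `jax.vjp` a vector-Jacobian product in reverse mode, in both cases without ever forming the full Jacobian. The function `f` below is just an arbitrary small example.

```python
import jax
import jax.numpy as jnp

# A small vector-valued function f: R^2 -> R^2 (purely illustrative)
def f(x):
    return jnp.stack([x[0] * jnp.sin(x[1]), x[0] ** 2 + x[1]])

x = jnp.array([1.5, 0.3])
v = jnp.array([1.0, 0.0])   # tangent vector (forward mode)
u = jnp.array([0.0, 1.0])   # cotangent vector (reverse mode)

# Forward mode: Jacobian-vector product J(x) @ v
y, jvp_out = jax.jvp(f, (x,), (v,))

# Reverse mode: vector-Jacobian product u^T @ J(x)
y2, vjp_fn = jax.vjp(f, x)
(vjp_out,) = vjp_fn(u)

print(jvp_out)  # a column of the Jacobian, since v is a unit vector
print(vjp_out)  # a row of the Jacobian, since u is a unit vector
```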

talk/: Talk version of those notes. The talk was given at the @hzdr Helmholtz AI (http://helmholtz.ai) local unit's Machine Learning journal club and at an @hzdr and @casus workshop on physics-informed neural networks, both organized by Nico Hoffmann (@nih23).

examples/: AD examples using autograd, jax and pytorch. The examples focus mostly on how to define custom derivatives in jax (and autograd), which helps to understand how Jacobian-vector products actually work. More examples to come!
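
To give a flavor of what such a custom derivative looks like, here is a minimal sketch using `jax.custom_jvp` with the numerically stable `log1pexp` function known from the jax documentation; the functions actually covered in examples/ may differ.

```python
import jax
import jax.numpy as jnp

# Register a hand-written JVP rule instead of letting jax trace through the primal.
@jax.custom_jvp
def log1pexp(x):
    return jnp.log1p(jnp.exp(x))

@log1pexp.defjvp
def log1pexp_jvp(primals, tangents):
    (x,), (x_dot,) = primals, tangents
    ans = log1pexp(x)
    # d/dx log(1 + e^x) = sigmoid(x); the JVP multiplies this by the incoming tangent
    ans_dot = jax.nn.sigmoid(x) * x_dot
    return ans, ans_dot

print(jax.jacfwd(log1pexp)(2.0))  # forward mode uses the custom JVP directly
print(jax.grad(log1pexp)(2.0))    # reverse mode reuses it via linearization + transpose
```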

Download the talk and theory PDF files from the Releases page or from the latest CI run; you can also click the badges above. The talk is also available via figshare.