
TensorFlow Experiments on Neural Ordinary Differential Equations

You can contact me on Twitter as @mandubian.

This notebook is a sandbox to test concepts exposed in this amazing paper:

Neural Ordinary Differential Equations http://arxiv.org/abs/1806.07366

Authors: Ricky T. Q. Chen, Yulia Rubanova, Jesse Bettencourt, David Duvenaud

My idea is to reproduce the concepts exposed in the paper entirely in TensorFlow, using eager mode and GradientTape.

I didn't want to depend on Autograd, and I wanted to be able to use classic Keras models.

For the ODE solver, I wanted to compare TF implementations with SciPy's very robust ones (the only one I found being the Runge-Kutta Dopri5).

  • Added a batched TF augmented gradient
  • Added mini-batch optimization, inspired by the cool PyTorch implementation https://github.com/rtqichen/torchdiffeq, allowing much faster-converging and deterministic training
  • First implementation of the TF augmented gradient (a minimal sketch of the augmented dynamics follows this list)
  • Samples with basic optimization on the whole dataset

Licensed under the MIT License.
