
Releases: patrick-kidger/diffrax

Diffrax v0.5.1

19 May 15:41

New research paper

One of the new features of this release is the simulation of space-time Lévy area, over arbitrary intervals, deterministically with respect to a PRNG key. This is a required component for adaptive step-size and higher-order SDE solvers in particular.

Well, this turned out to be an unsolved research question! And so with huge credit to @andyElking for diligently figuring out all the details -- we now have a new paper on arXiv presenting them! So far as we know this sets a new state of the art for numerical Brownian simulation, and it is what now powers all of the numerical SDE solving inside Diffrax.

If you're interested in numerical methods for SDEs, then check out the arXiv paper here.

New features for SDEs

  • Added a suite of Stochastic Runge--Kutta methods! These are higher-order solvers for SDEs, especially when the noise has a particular structure (additive, commutative, ...). A huge thank-you to @andyElking for implementing all of these:

    • GeneralShARK: recommended when the drift is expensive to evaluate;
    • SEA: recommended when the noise is additive and a cheap, low-accuracy solve is acceptable;
    • ShARK: recommended default choice when the noise is additive;
    • SlowRK: recommended for commutative noise;
    • SPaRK: recommended when performing adaptive time stepping;
    • SRA1: alternative to ShARK (this is a now-classical SRK method).
  • Added support for simulating space-time Lévy area to VirtualBrownianTree and UnsafeBrownianPath. This is the bit discussed in the "new research paper" section above! The main thing here is the ability to sample random variables like the space-time Lévy area, which is the time-average of the Brownian bridge over $[s, t]$: $H_{s,t} = \frac{1}{t-s} \int_s^t ((W_r - W_s) - \frac{r-s}{t-s} (W_t - W_s)) dr$.
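
As a quick sanity check of the formula above -- a plain NumPy Monte-Carlo simulation, not Diffrax's actual algorithm -- one can discretise a Brownian path, average the bridged increment over $[s, t]$, and verify that $H_{s,t}$ is centred with variance $(t-s)/12$:

```python
import numpy as np

rng = np.random.default_rng(0)
s, t = 0.0, 1.0
n_steps, n_paths = 400, 20_000
dt = (t - s) / n_steps

# Discretised Brownian paths with W_s = 0, shape (n_paths, n_steps + 1).
dW = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
W = np.concatenate([np.zeros((n_paths, 1)), np.cumsum(dW, axis=1)], axis=1)

# The integrand (W_r - W_s) - ((r - s)/(t - s)) (W_t - W_s) is a Brownian bridge.
r = np.linspace(s, t, n_steps + 1)
bridge = W - ((r - s) / (t - s)) * W[:, -1:]

# Trapezoidal rule for H_{s,t} = (1/(t-s)) * integral of the bridge over [s, t].
H = 0.5 * dt * (bridge[:, :-1] + bridge[:, 1:]).sum(axis=1) / (t - s)
```

Empirically the sample mean of H comes out near zero and the sample variance near 1/12, matching the known distribution of space-time Lévy area.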

New features for all differential equations

  • Added TextProgressMeter and TqdmProgressMeter, which can be used to track how far through a differential equation solve things have progressed. (Thanks @abocquet! #357, #398)
  • Added support for using adaptive step size controllers on TPUs (Thanks @stefanocortinovis! #366, #369)
  • All AbstractPaths are now typing.Generics parameterised by their return type; all AbstractTerms are now typing.Generics parameterised by their vector field and control. (Thanks @tttc3! #359, #364)
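
To illustrate what this parameterisation buys -- using toy stand-ins rather than Diffrax's real classes -- a path can be generic in its return type and a term generic in its vector field and control, so a static type checker can flag mismatches:

```python
from dataclasses import dataclass
from typing import Generic, TypeVar

Y = TypeVar("Y")            # what the path returns
VF = TypeVar("VF")          # vector field type
Control = TypeVar("Control")

@dataclass
class ToyPath(Generic[Y]):
    # Hypothetical stand-in for an AbstractPath parameterised by return type.
    value: Y

    def evaluate(self, t0: float, t1: float) -> Y:
        return self.value

@dataclass
class ToyTerm(Generic[VF, Control]):
    # Hypothetical stand-in for an AbstractTerm parameterised by its vector
    # field and control.
    vf: VF
    control: Control

path: ToyPath[float] = ToyPath(1.0)
term: ToyTerm[str, ToyPath[float]] = ToyTerm("dummy-vf", path)
```

A type checker can now reject e.g. assigning `ToyPath("hello")` to a `ToyPath[float]` annotation.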

Other

  • Improved documentation for PIDController (Thanks @ParticularlyPythonicBS! #371, #372)
  • Now have a py.typed file to declare that we're static-type-checking compatible. (Thanks @lockwo! #408)
  • Bugfix for CubicInterpolation when there is nothing to interpolate. (Thanks @allen-adastra! #360)
  • Compatibility with future versions of JAX by removing the now-deprecated jax.config import. (Thanks @jakevdp! #377)

Full Changelog: v0.5.0...v0.5.1

Diffrax v0.5.0

08 Jan 23:23

This is a fun release. :)

Diffrax was the very first project I ever released for the JAX ecosystem. Since then, many new libraries have grown up around it -- most notably jaxtyping, Lineax, and Optimistix.

All of these other libraries actually got their start because I wanted to use them for some purpose in Diffrax!

And with this release... we are now finally doing that. Diffrax now depends on jaxtyping for its type annotations, Lineax for linear solves, and Optimistix for root-finding!

That makes this release mostly just a huge internal refactor, so it shouldn't affect you (as a downstream user) very much at all.

Features

  • Added diffrax.VeryChord, which is a chord-type quasi-Newton method typically used as part of an implicit solver. (This is the most common root-finding method used in implicit differential equation solvers.)
  • Added diffrax.with_stepsize_controller_tols, which can be used to mark that a root-finder should inherit its tolerances from the stepsize_controller. For example, this is used as:
    root_finder = diffrax.with_stepsize_controller_tols(diffrax.VeryChord)()
    solver = diffrax.Kvaerno5(root_finder=root_finder)
    diffrax.diffeqsolve(..., solver=solver, ...)
    This tolerance-inheritance is the default for all implicit solvers.
    (Previously this tolerance-inheritance business was done by passing rtol/atol=None to the nonlinear solver -- and again was the default. However now that Optimistix owns the nonlinear solvers, it's up to Diffrax to handle tolerance-inheritance in a slightly different way.)
  • Added the arguments diffrax.ImplicitAdjoint(linear_solver=..., tags=...). Implicit backpropagation can now be done using any choice of Lineax solver.
  • Now static-type-checking compatible. No more having your IDE yell at you for incorrect types.
  • Diffrax should now be compatible with JAX_NUMPY_DTYPE_PROMOTION=strict and JAX_NUMPY_RANK_PROMOTION=raise. (These are JAX flags that can be used to disable dtype promotion and broadcasting, to help write more reliable code.)
  • diffrax.{ControlTerm, WeaklyDiagonalControlTerm} now support using a callable as their control, in which case it is treated as the evaluate method of an AbstractPath defined over [-inf, inf].
  • Experimental support for complex numbers in explicit solvers. This may still go wrong, so please report bugs / send fixing PRs as you encounter them.
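
The tolerance-inheritance mechanism can be sketched in miniature -- all names below are hypothetical toys, not Diffrax internals: a factory marks the root finder, and the tolerances are filled in from the controller at solve time:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ToyController:
    # Hypothetical stand-in for a stepsize controller carrying tolerances.
    rtol: float
    atol: float

@dataclass
class ToyRootFinder:
    # Hypothetical root finder; None means "not yet chosen".
    rtol: Optional[float] = None
    atol: Optional[float] = None
    inherit_tols: bool = False

def with_controller_tols(cls):
    # Returns a factory producing root finders marked for tolerance inheritance.
    def make():
        return cls(inherit_tols=True)
    return make

def resolve_tols(root_finder, controller):
    # At solve time: fill in inherited tolerances from the controller.
    if root_finder.inherit_tols:
        return type(root_finder)(rtol=controller.rtol, atol=controller.atol)
    return root_finder

rf = with_controller_tols(ToyRootFinder)()
rf = resolve_tols(rf, ToyController(rtol=1e-4, atol=1e-7))
```

The deferred-resolution design is why a wrapper factory is used rather than plain constructor arguments: the controller's tolerances are not known when the solver is built.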

Breaking changes

  • diffrax.{AbstractNonlinearSolver, NewtonNonlinearSolver, NonlinearSolution} have been removed in favour of using Optimistix. If you were using these explicitly, e.g. Kvaerno5(nonlinear_solver=NewtonNonlinearSolver(...)), then the equivalent behaviour is now given by Kvaerno5(root_finder=VeryChord(...)). You can also use any other Optimistix root-finder too.
  • The result of a solve is now an Enumeration rather than a plain integer. For example, this means that you should write something like jnp.where(sol.result == diffrax.RESULTS.successful, ...), not jnp.where(sol.result == 0, ...).
  • A great many modules have been renamed from foo.py to _foo.py to explicitly indicate that they're private. Make sure to access features via the public API.
  • Removed the AbstractStepSizeController.wrap_solver method.
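
The motivation for enum-valued results can be seen with a toy example (the names below mirror, but are not, Diffrax's RESULTS): comparing against a named member stays correct even if the underlying numbering changes, whereas a magic integer silently breaks:

```python
import enum

import numpy as np

class ToyRESULTS(enum.IntEnum):
    # Hypothetical stand-in for Diffrax's RESULTS enumeration.
    successful = 0
    max_steps_reached = 1

results = np.array(
    [ToyRESULTS.successful, ToyRESULTS.max_steps_reached, ToyRESULTS.successful]
)
# Keep values from successful solves; mask out the rest with NaN.
values = np.array([1.0, 2.0, 3.0])
kept = np.where(results == ToyRESULTS.successful, values, np.nan)
```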

Bugfixes

  • Crash fix when using an implicit solver together with DirectAdjoint.
  • Crash fix when using dt0=None, stepsize_controller=diffrax.PIDController(...) with SDEs.
  • Crash fix when using adjoint=BacksolveAdjoint(...) with VirtualBrownianTree with jax.disable_jit on the TPU backend.

Full Changelog: v0.4.1...v0.5.0

Diffrax v0.4.1

03 Aug 20:45

Minor release to fix two bugs, and to introduce a performance improvement.

Full Changelog: v0.4.0...v0.4.1

Diffrax v0.4.0

22 May 15:33
a5e160a

Features

  • Highlight: added IMEX solvers! These solve the "easy" part of the diffeq using an explicit solver, and the "hard" part using an implicit solver. We now have:
    • diffrax.KenCarp3
    • diffrax.KenCarp4
    • diffrax.KenCarp5
    • diffrax.Sil3
      Each of these should be called with e.g. diffeqsolve(terms=MultiTerm(explicit_term, implicit_term), solver=diffrax.KenCarp4(), ...)
  • diffrax.ImplicitEuler now supports adaptive time stepping, by using an embedded Heun method. (#251)
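
The IMEX idea can be shown with a toy first-order method, far simpler than the KenCarp schemes above (everything in this sketch is illustrative, not Diffrax code): for the stiff problem y' = -50 (y - cos t), treat the stiff linear part implicitly and the forcing explicitly, so the implicit update is solvable in closed form:

```python
import numpy as np

# IMEX Euler for y' = -50*(y - cos(t)):
#   implicit part: -50*y        explicit part: 50*cos(t)
#   y_{n+1} = y_n + h*(-50*y_{n+1}) + h*50*cos(t_n)
#   =>  y_{n+1} = (y_n + 50*h*cos(t_n)) / (1 + 50*h)
h, T = 0.05, 2.0
n = int(round(T / h))
y_imex = 1.0
y_explicit = 1.0  # fully explicit Euler, for comparison
for i in range(n):
    t = i * h
    y_imex = (y_imex + 50 * h * np.cos(t)) / (1 + 50 * h)
    # Explicit Euler is unstable here: the amplification factor |1 - 50*h| > 1.
    y_explicit = y_explicit + h * (-50 * (y_explicit - np.cos(t)))
```

With h = 0.05 the IMEX iterate tracks the slow solution cos(t) closely, while the fully explicit iterate diverges.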

Backward incompatibilities

  • scan_stages, e.g. Tsit5(scan_stages=True), no longer exists. All Runge--Kutta solvers now scan-over-stages by default.
    • If you were using scan_stages, then you should simply delete this argument.
    • If you were using the interactive API together with forward-mode autodiff then you should pass scan_kind="bounded" to the solver, e.g. Tsit5(scan_kind="bounded").

Bugfixes

  • Fixed AbstractTerm.vf_prod being ignored, which meant that naive prod(vf(...), control) calls were used instead of the optimised vf_prod routines, where available. (#239)
  • Implicit solvers now use the correct stage value predictors. This should help the implicit solvers converge faster, so that overall runtime is decreased. This should mean that they occasionally take a different number of steps than before -- usually fewer.

Performance

  • Overall compilation should be faster. (Due to patrick-kidger/equinox#353)
  • Initial step size selection should now compile faster. (#257)
  • Fixed dense output consuming far too much memory. (#252)
  • Backsolve adjoint should now be much more efficient (due to the vf_prod bugfix).

Full Changelog: v0.3.1...v0.4.0

Diffrax v0.3.1

23 Feb 03:53
16b08d5

See the previous v0.3.0 release notes for the most significant recent changes.

This hotfix

Hotfix for the previous release breaking backprop through SaveAt(dense=True).

Full Changelog: v0.3.0...v0.3.1

Diffrax v0.3.0

21 Feb 03:30
9280c3a

Highlights

This release is primarily a performance improvement: the default adjoint method now uses an asymptotically more efficient checkpointing implementation.

New features

  • Added diffrax.citation for automatically generating BibTeX references of the numerical methods being used.
  • diffrax.SaveAt can now save different selections of outputs at different times, using diffrax.SubSaveAt.
  • diffrax.SaveAt now supports a fn argument for controlling what to save, e.g. only statistics of the solution. (#113, #221, thanks @joglekara in #220!)
  • Can now use SaveAt(dense=True) in the edge case when t0 == t1.

Performance improvements

  • The default adjoint method RecursiveCheckpointAdjoint now uses a dramatically improved implementation for reverse-mode autodifferentiation of while loops. This should be asymptotically faster, and generally produces both runtime and compile-time speed-ups.
    • The previous implementation is available as DirectAdjoint. This is still useful in a handful of less-common cases, such as using forward-mode autodifferentiation. (Once JAX gets bounded while loops as native operations then this will be tidied up further.)
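
The idea behind checkpointed reverse-mode autodiff of a loop can be sketched as a toy recursion -- not Diffrax's actual implementation: instead of storing every intermediate state, recompute forward segments from checkpoints, keeping only O(log n) live states at the cost of extra forward work:

```python
def backprop(step, vjp_step, y0, ct, n):
    # Cotangent with respect to y0 after n applications of `step`.
    # step(y) -> next state; vjp_step(y, ct) pulls a cotangent back through one
    # step linearised at state y. Live memory is O(log n) recursion frames.
    if n == 1:
        return vjp_step(y0, ct)
    m = n // 2
    y_mid = y0
    for _ in range(m):  # recompute forward to a midpoint checkpoint
        y_mid = step(y_mid)
    ct = backprop(step, vjp_step, y_mid, ct, n - m)  # second half first...
    return backprop(step, vjp_step, y0, ct, m)       # ...then the first half

# Linear toy dynamics: each step multiplies by 1.01, so d(y_n)/d(y_0) = 1.01**n.
step = lambda y: 1.01 * y
vjp_step = lambda y, ct: 1.01 * ct
grad = backprop(step, vjp_step, 1.0, 1.0, 64)
```

The trade is recomputation for memory: forward work grows by a logarithmic factor, while peak memory drops from O(n) saved states to O(log n).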

Backward-incompatible changes

  • Removed NoAdjoint. It existed as a performance improvement when not using autodifferentiation, but RecursiveCheckpointAdjoint (the default) has now incorporated this performance improvement automatically.
  • Removed ConstantStepSize(compile_steps=...) and StepTo(compile_steps=...), as these are now unnecessary when using the new RecursiveCheckpointAdjoint.
  • Removed the undocumented Fehlberg2 solver. (It's just not useful compared to Heun/Midpoint/Ralston.)
  • AbstractSolver.term_structure should now be e.g. (ODETerm, AbstractTerm) rather than jtu.tree_structure((ODETerm, AbstractTerm)), i.e. it now encodes the term type as well.
  • Dropped support for Python 3.7.

Fixes

  • Fixed the breakage of UnsafeBrownianPath and VirtualBrownianTree caused by an upstream change in JAX (#225).
  • The sum of Runge--Kutta stages now happens at the highest available precision, which should improve numerical stability on some accelerators.
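
Why accumulation precision matters can be demonstrated in plain NumPy (a generic illustration, not Diffrax's actual code): summing many small stage-sized contributions onto a large state in float32 loses them entirely, while accumulating at higher precision and casting once at the end does not:

```python
import numpy as np

base = np.float32(1e4)   # large "state"
term = np.float32(1e-4)  # small per-stage contribution
n = 10_000

naive = base
for _ in range(n):
    # Each addend is below half an ulp of the running sum, so it rounds away.
    naive = np.float32(naive + term)

# Accumulate the contributions in float64, then cast once.
accurate = np.float32(base + np.float64(term) * n)

exact = 1e4 + 1e-4 * n  # 10001.0
```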

Examples

  • The documentation now has an introductory "coupled ODEs" example.
  • The documentation now has an advanced "nonlinear heat PDE" example.
  • The "symbolic regression" example has been updated to use sympy2jax.

Full Changelog: v0.2.2...v0.3.0

Diffrax v0.2.2

15 Nov 18:39
ea1bdc9

Performance improvements

  • Now makes fewer vector field traces in several cases (#172, #174)

Fixes

  • Many documentation improvements.
  • Fixed several warnings about jax.{tree_map,tree_leaves,...} being moved to jax.tree_util.{tree_map,tree_leaves,...}. (Thanks @jacobusmmsmit!)
  • Fixed the step size controller choking if the error is ever NaN. (#143, #152)
  • Fixed some crashes due to JAX-internal changes. (If you've ever seen it throw an error about not knowing how to rewrite closed_call_p, it's this one.)
  • Fixed an obscure edge-case NaN on the backward pass, if you were using an implicit solver with an adaptive step size controller, got a rejected step due to the implicit solve failing to converge, and happened to also be backpropagating wrt the controller_state.

Other

  • Added a new Kalman filter example (#159) (Thanks @SimiPixel!)
  • Brownian motion classes accept pytrees for shape and dtype arguments (#183) (Thanks @ciupakabra!)
  • Internal refactor: a lot of functionality has moved from diffrax.misc to equinox.internal.

Full Changelog: v0.2.1...v0.2.2

Diffrax v0.2.1

03 Aug 22:59
7548c49

Full Changelog: v0.2.0...v0.2.1

Diffrax v0.2.0

20 Jul 19:50
115997e
  • Feature: event handling. In particular it is now possible to interrupt a diffeqsolve early. See the events page in the docs and the new steady state example.
  • Compilation time improvements:
    • The compilation speed of NewtonNonlinearSolver (and thus in practice also all implicit solvers like Kvaerno3 etc.) has been improved by a factor of roughly 1.5.
    • The compilation speed of all Runge--Kutta solvers can be dramatically reduced (by a factor of roughly 3) by passing e.g. Dopri5(scan_stages=True). This may increase runtime slightly. At the moment the default is scan_stages=False for all solvers, but this default may change in the future.
  • Various documentation improvements.
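
The kind of early termination this enables can be sketched in plain Python (a toy explicit-Euler loop, not Diffrax's events API): integrate y' = -y and stop as soon as a steady-state condition fires, instead of running to the final time:

```python
def solve_until_steady(y0, h=0.01, t_max=100.0, tol=1e-4):
    # Explicit Euler on y' = -y, terminating early when |dy/dt| < tol.
    y, t = y0, 0.0
    while t < t_max:
        dy = -y
        if abs(dy) < tol:  # the "event": (approximate) steady state reached
            return t, y
        y, t = y + h * dy, t + h
    return t, y

t_stop, y_stop = solve_until_steady(1.0)
```

Here the solve returns after roughly t = 9 rather than running all the way to t_max = 100.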

Full Changelog: v0.1.2...v0.2.0

Diffrax v0.1.2

18 May 21:14
997d0e3

The main change here is a minor technical one: Diffrax no longer initialises the JAX backend as a side effect of being imported.


Full Changelog: v0.1.1...v0.1.2