Source-to-Source Debuggable Derivatives in Pure Python
Updated Sep 29, 2022 - Python
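The headline project generates derivative functions directly from Python source. As a toy illustration of the source-to-source idea (not that project's actual API), here is a sketch that differentiates a restricted Python expression using the standard `ast` module and emits the derivative as new source code:

```python
import ast

def derive(node, wrt):
    """Differentiate an expression AST node with respect to the name `wrt`.

    Handles only names, constants, +, and * -- enough to show the idea.
    """
    if isinstance(node, ast.Constant):
        return ast.Constant(0)
    if isinstance(node, ast.Name):
        return ast.Constant(1 if node.id == wrt else 0)
    if isinstance(node, ast.BinOp) and isinstance(node.op, ast.Add):
        return ast.BinOp(derive(node.left, wrt), ast.Add(),
                         derive(node.right, wrt))
    if isinstance(node, ast.BinOp) and isinstance(node.op, ast.Mult):
        # Product rule: (u*v)' = u'*v + u*v'
        return ast.BinOp(
            ast.BinOp(derive(node.left, wrt), ast.Mult(), node.right),
            ast.Add(),
            ast.BinOp(node.left, ast.Mult(), derive(node.right, wrt)))
    raise NotImplementedError(ast.dump(node))

def grad_source(expr_src, wrt):
    """Parse, differentiate, and unparse back to Python source (3.9+)."""
    tree = ast.parse(expr_src, mode="eval").body
    return ast.unparse(derive(tree, wrt))

df_src = grad_source("x*x + 3*x", "x")
print(df_src)                    # the derivative is ordinary Python source
print(eval(df_src, {"x": 2.0}))  # f'(x) = 2x + 3, so f'(2) = 7
```

Because the output is plain source code, it can be stepped through in a normal debugger, which is the point of a source-to-source approach.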
Burn is a new, comprehensive, dynamic deep-learning framework built in Rust, with extreme flexibility, compute efficiency, and portability as its primary goals.
Automatic differentiation made easier for C++.
Deep learning in Rust, with shape checked tensors and neural networks
Transparent calculations with uncertainties on the quantities involved (aka "error propagation"); calculation of derivatives.
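First-order ("linear") error propagation, which such libraries automate, follows sigma_f ≈ |f'(x)| * sigma_x. A minimal self-contained sketch using a central finite difference for the derivative (the names here are illustrative, not the library's API):

```python
def propagate(f, x, sigma_x, h=1e-6):
    """First-order uncertainty propagation: sigma_f ~= |f'(x)| * sigma_x.

    The derivative is estimated with a central difference; a real library
    would compute it via automatic differentiation instead.
    """
    dfdx = (f(x + h) - f(x - h)) / (2 * h)
    return abs(dfdx) * sigma_x

# f(x) = x**2 at x = 3 with sigma_x = 0.1: f'(3) = 6, so sigma_f ~= 0.6
print(propagate(lambda x: x**2, 3.0, 0.1))
```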
DiffSharp: Differentiable Functional Programming
Betty: an automatic differentiation library for generalized meta-learning and multilevel optimization
A Tensor Library written in C++.
A JIT compiler for hybrid quantum programs in PennyLane
Drop-in autodiff for NumPy.
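"Drop-in" autodiff typically works by overloading arithmetic so that existing numeric code computes derivatives unchanged. A pure-Python forward-mode sketch using dual numbers (illustrative only, not the library's API):

```python
class Dual:
    """Forward-mode dual number: a value paired with its derivative."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule carried alongside the value.
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)
    __rmul__ = __mul__

def deriv(f, x):
    """Evaluate f at a dual seeded with dx/dx = 1 and read off f'(x)."""
    return f(Dual(x, 1.0)).dot

# f(x) = 3x^2 + 2x  =>  f'(x) = 6x + 2, so f'(4) = 26
print(deriv(lambda x: 3 * x * x + 2 * x, 4.0))
```

Any function written against plain `+` and `*` works unmodified, which is what makes this style of autodiff "drop-in."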
AutoBound automatically computes upper and lower bounds on functions.
Utilities for testing custom AD primitives.
Minimal deep learning library written from scratch in Python, using NumPy/CuPy.
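A from-scratch library like this usually centers on a small reverse-mode autodiff node. A minimal sketch of that core (hypothetical names, in the spirit of micrograd-style implementations, not this repository's code):

```python
class Value:
    """Minimal reverse-mode autodiff node for a scalar computation graph."""
    def __init__(self, data, parents=(), grad_fns=()):
        self.data = data
        self.grad = 0.0
        self._parents = parents
        self._grad_fns = grad_fns  # local gradient w.r.t. each parent

    def __add__(self, other):
        return Value(self.data + other.data, (self, other),
                     (lambda g: g, lambda g: g))

    def __mul__(self, other):
        return Value(self.data * other.data, (self, other),
                     (lambda g: g * other.data, lambda g: g * self.data))

    def backward(self, g=1.0):
        # Naive recursive accumulation; a real library would do a
        # topological sort to avoid revisiting shared subgraphs.
        self.grad += g
        for parent, grad_fn in zip(self._parents, self._grad_fns):
            parent.backward(grad_fn(g))

x, y = Value(2.0), Value(3.0)
z = x * y + x          # dz/dx = y + 1 = 4, dz/dy = x = 2
z.backward()
print(x.grad, y.grad)
```

The NumPy/CuPy version replaces scalars with arrays and the local gradient lambdas with tensor ops, but the backward-accumulation structure is the same.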
A probabilistic programming language that combines automatic differentiation, automatic marginalization, and automatic conditioning within Monte Carlo methods.
Autodifferentiation package in Rust.
A new lightweight auto-differentiation library that builds directly on NumPy. Used as homework for CMU 11785/11685/11485.
A .NET library that provides fast, accurate, and automatic differentiation (computing derivatives/gradients) of mathematical functions.
A library of C++ functions that support applications of Stan in pharmacometrics.