Automatic differentiation made easier for C++
Deep learning in Rust, with shape checked tensors and neural networks
Transparent calculations with uncertainties on the quantities involved (aka "error propagation"); calculation of derivatives.
Tensors and dynamic neural networks in pure Rust.
Drop-in autodiff for NumPy.
FastAD is a C++ implementation of automatic differentiation supporting both forward and reverse mode.
Chemical Explosive Mode Analysis for computational/experimental combustion diagnostics using Julia SciML features
Assignments for the Data Intensive Systems for Machine Learning coursework
Fazang is a Fortran library for reverse-mode automatic differentiation, inspired by the Stan Math library.
A universal probabilistic programming language embedded in Scala
Algorithmic differentiation with hyper-dual numbers in C++ and Python (see the dual-number sketch after this list)
A minimalist neural-network library built on a tiny autograd engine (a minimal reverse-mode sketch follows this list)
A toy deep-learning framework implemented from scratch in pure NumPy, a.k.a. a homemade PyTorch.
Differentiate Python calls from Julia
A tiny autograd library made for educational purposes.
Fork of Matt Loper's autodifferentiation framework for Python
Forward mode automatic differentiation for Fortran
Yet another tensor automatic differentiation framework
Testing the capabilities of Trilinos-Sacado in combination with, e.g., tensors
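
Several of the entries above (FastAD's forward mode, the hyper-dual-number library, the forward-mode Fortran package) rest on the same dual-number idea: each value carries the derivative of that value along with it, so differentiating a function costs little more than evaluating it. The C++ sketch below is a minimal illustration of that idea only; the `Dual` type, its operators, and the function `f` are invented for the example and are not the API of any repository listed here.

```cpp
#include <cmath>
#include <iostream>

// Illustrative sketch only; not the API of any library listed above.
// Minimal forward-mode AD: a dual number carries a value and the
// derivative of that value with respect to one chosen input.
struct Dual {
    double val;  // primal value
    double dot;  // derivative (tangent) carried alongside the value
};

Dual operator+(Dual a, Dual b) { return {a.val + b.val, a.dot + b.dot}; }
Dual operator*(Dual a, Dual b) {
    // Product rule: (ab)' = a'b + ab'
    return {a.val * b.val, a.dot * b.val + a.val * b.dot};
}
Dual sin(Dual a) { return {std::sin(a.val), std::cos(a.val) * a.dot}; }

// Any function written against Dual is differentiated as a side effect
// of evaluating it.
Dual f(Dual x) { return x * x + sin(x); }

int main() {
    // Seed dot = 1 to differentiate with respect to x.
    Dual x{2.0, 1.0};
    Dual y = f(x);
    std::cout << "f(2)  = " << y.val << "\n";   // 4 + sin(2)
    std::cout << "f'(2) = " << y.dot << "\n";   // 2*2 + cos(2)
    return 0;
}
```

Seeding `dot = 1` on the input of interest yields one directional derivative per evaluation, which is why forward mode suits functions with few inputs and many outputs.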
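
The tiny autograd engines listed above take the opposite route: operations are recorded on a tape as they run, and one backward sweep propagates adjoints from the output back to every input. The following sketch is a hypothetical minimal tape in the same C++ style; the `Tape` and `Node` names and their methods are likewise assumptions made for illustration, not any listed library's design.

```cpp
#include <cstddef>
#include <iostream>
#include <vector>

// Illustrative sketch only; not the API of any library listed above.
// Minimal reverse-mode tape: each recorded node remembers its parents and
// the local partial derivatives of its output with respect to them.
struct Node {
    std::size_t lhs, rhs;  // parent indices on the tape
    double dlhs, drhs;     // local partial derivatives
};

struct Tape {
    std::vector<Node> nodes;
    std::vector<double> vals;

    std::size_t leaf(double v) {
        vals.push_back(v);
        nodes.push_back({nodes.size(), nodes.size(), 0.0, 0.0});  // no parents
        return nodes.size() - 1;
    }
    std::size_t add(std::size_t a, std::size_t b) {
        vals.push_back(vals[a] + vals[b]);
        nodes.push_back({a, b, 1.0, 1.0});          // d(a+b)/da = d(a+b)/db = 1
        return nodes.size() - 1;
    }
    std::size_t mul(std::size_t a, std::size_t b) {
        vals.push_back(vals[a] * vals[b]);
        nodes.push_back({a, b, vals[b], vals[a]});  // product rule
        return nodes.size() - 1;
    }

    // One sweep backwards over the tape yields d(output)/d(every input).
    std::vector<double> grad(std::size_t output) const {
        std::vector<double> adj(nodes.size(), 0.0);
        adj[output] = 1.0;
        for (std::size_t i = nodes.size(); i-- > 0;) {
            adj[nodes[i].lhs] += nodes[i].dlhs * adj[i];
            adj[nodes[i].rhs] += nodes[i].drhs * adj[i];
        }
        return adj;
    }
};

int main() {
    Tape t;
    std::size_t x = t.leaf(3.0), y = t.leaf(4.0);
    std::size_t z = t.add(t.mul(x, x), t.mul(x, y));  // z = x*x + x*y
    std::vector<double> g = t.grad(z);
    std::cout << "dz/dx = " << g[x] << ", dz/dy = " << g[y] << "\n";  // 10, 3
    return 0;
}
```

Because the tape is swept once from the output, reverse mode produces the gradient with respect to all inputs at roughly the cost of a single extra pass, which is what makes it the standard choice for training neural networks.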