tiny_autodf : Tiny, header-only Automatic Differentiation (AD) library (C++)


tiny_autodf is an attempt to create a simple-to-use AD mechanism that can be easily integrated into existing code, formulas, and algorithms, essentially making them differentiable. It also contains a basic gradient-descent implementation, which turns the library into a non-linear problem solver.

It was originally developed as part of another project, Tiny PGA, but was extracted because it proved useful in other projects as well.
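
To give an idea of what such a solver does, below is a minimal, library-free sketch of gradient descent on a one-variable function. It only illustrates the concept; the library's actual solver API may look different:

#include <iostream>

// Conceptual sketch only: minimize f(x) = (x - 3)^2 by stepping against f'(x).
// The library's built-in solver automates this kind of loop, using derivatives
// produced by AutoDf instead of a hand-written f'.
int main()
{
    float x = 0.0F;           // initial guess
    const float rate = 0.1F;  // learning rate (step size)
    for (int i = 0; i < 100; i++)
    {
        const float gradient = 2.0F * (x - 3.0F);  // hand-written derivative
        x -= rate * gradient;                      // step downhill
    }
    std::cout << "minimum near x = " << x << std::endl;  // prints ~3.0
    return 0;
}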

Usage Example (drop-in replacement)

Assume you have an equation (or function) that depends on one or more variables, for instance:

#include <cmath>

float function(float x, float y)
{
    return 0.5F * x * x + 2.0F * x * y + 4.F * std::sin(y);
}
...
float result = function(0.1F, 0.2F);

You can turn it into a differentiable version by templating it and redefining the numerical type:

#include "tiny_autodf.h"  // header name assumed; adjust to the actual include path

using Float = tiny_autodf::AutoDf<float>;

template <typename type>
type function(type x, type y)
{
    using std::sin;  // picks std::sin for float, the AutoDf overload via ADL
    return 0.5F * x * x + 2.0F * x * y + 4.F * sin(y);
}

Float x = 0.1F;
Float y = 0.2F;
Float result = function<Float>(x, y);

Now result is "differentiable" with respect to x and y. To obtain the gradient at the current point (0.1, 0.2), call the eval() member function:

auto eval = result.eval();
std::cout << "dx = " << eval.derivatives[x.ID()] << ", dy = " << eval.derivatives[y.ID()] << std::endl;
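
Putting it all together, a complete program might look like the sketch below. The header name and the exact shape of the eval() result are assumptions based on the snippets above; the printed values can be checked against the analytic gradient df/dx = x + 2y = 0.5 and df/dy = 2x + 4·cos(y) ≈ 4.12:

#include <cmath>
#include <iostream>
#include "tiny_autodf.h"  // assumed header name

using Float = tiny_autodf::AutoDf<float>;

template <typename type>
type function(type x, type y)
{
    using std::sin;
    return 0.5F * x * x + 2.0F * x * y + 4.F * sin(y);
}

int main()
{
    Float x = 0.1F;
    Float y = 0.2F;
    Float result = function<Float>(x, y);

    auto eval = result.eval();
    // Analytic check: df/dx = x + 2*y = 0.5
    //                 df/dy = 2*x + 4*cos(y) ≈ 4.1203
    std::cout << "dx = " << eval.derivatives[x.ID()]
              << ", dy = " << eval.derivatives[y.ID()] << std::endl;
    return 0;
}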
