
Can DifferentiationInterface be useful for Turing? #2187

Open · gdalle opened this issue Apr 2, 2024 · 4 comments

gdalle commented Apr 2, 2024

Hi there!
@adrhill and I recently started https://github.com/gdalle/DifferentiationInterface.jl to provide a common interface for automatic differentiation (AD) in Julia. We're currently talking with the Lux.jl, Flux.jl, and Optimization.jl teams to see how they could benefit from it, and Turing.jl came to mind as another AD power user :)
DifferentiationInterface.jl only guarantees support for functions of the form f(x) = y or f!(y, x), with standard numbers or arrays as inputs and outputs. Within these restrictions, we are compatible with 13 different AD backends, including the cool kids like Enzyme.jl and even the hipsters like Tapir.jl. Do you think it could come in handy?
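For context, here is a minimal sketch of what a call looks like (the quadratic `f` and the choice of ForwardDiff as a backend are just for illustration; backend objects come from ADTypes.jl, and the API may have evolved since this issue was opened):

```julia
using DifferentiationInterface
using ADTypes: AutoForwardDiff  # backend objects are ADTypes.jl types
import ForwardDiff              # load the backend package itself

f(x) = sum(abs2, x)          # a function of the supported form f(x) = y
backend = AutoForwardDiff()

x = rand(3)
grad = gradient(f, backend, x)                # gradient alone
y, grad = value_and_gradient(f, backend, x)   # value and gradient together
```

Swapping backends is then a one-line change, e.g. `AutoZygote()` instead of `AutoForwardDiff()` (with `import Zygote`).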
Ping @yebai @willtebbutt

yebai (Member) commented Apr 2, 2024

Turing's current interface to autodiff backends is based on LogDensityProblemsAD -- a fairly lightweight package that collects glue code for various autodiff backends. You can take a look there to see how things might work.
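For reference, a minimal sketch of that pattern (the `NormalPosterior` type is made up for illustration; `ADgradient` and the `LogDensityProblems` trait functions are the actual entry points):

```julia
using LogDensityProblems, LogDensityProblemsAD
import ForwardDiff  # backend used below

# A toy log-density problem implementing the LogDensityProblems API
struct NormalPosterior end
LogDensityProblems.logdensity(::NormalPosterior, x) = -sum(abs2, x) / 2
LogDensityProblems.dimension(::NormalPosterior) = 2
LogDensityProblems.capabilities(::Type{NormalPosterior}) =
    LogDensityProblems.LogDensityOrder{0}()

# LogDensityProblemsAD wraps the problem with an AD backend
∇ℓ = ADgradient(:ForwardDiff, NormalPosterior())
ℓx, g = LogDensityProblems.logdensity_and_gradient(∇ℓ, [1.0, 2.0])
```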

torfjelde (Member) commented

Looks great!

Is there a summary somewhere of how this differs from AbstractDifferentiation.jl? :)

gdalle (Author) commented Apr 2, 2024

I updated the summary in this issue: JuliaDiff/AbstractDifferentiation.jl#131
