Feature Request: mix arguments of different-sized arrays (& floats)? #982
Comments
Hi, I have considered a similar feature every now and again, but there didn't seem to be a good enough use case to justify putting in the time. evermore sounds like an interesting library; I have considered writing something like that myself (I am also a fan of JAX). If you have differentiable likelihoods in JAX, I don't see why you would need iminuit. You can use the optax minimizers, and you can compute uncertainties as well, at least the analog of the HESSE algorithm in MINUIT: compute the Hessian at the minimum with JAX and invert it. If your original function is a negative log-likelihood, this produces the covariance matrix of the parameters.
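The HESSE analog described above can be sketched in a few lines of JAX. This is a minimal illustration with a toy Gaussian negative log-likelihood (the data, parametrization, and `nll` function are made up for the example; only the Hessian-inversion step is the technique being described):

```python
import jax
import jax.numpy as jnp

# Toy data and a toy negative log-likelihood: Gaussian with
# unknown mean and log-sigma (log-parametrized for stability).
data = jnp.array([1.2, 0.8, 1.1, 0.9, 1.0])

def nll(params):
    mu, log_sigma = params
    sigma = jnp.exp(log_sigma)
    return jnp.sum(0.5 * ((data - mu) / sigma) ** 2 + jnp.log(sigma))

# Suppose this is the minimum found by any first-order optimizer;
# for a Gaussian the MLE is known in closed form.
params_min = jnp.array([jnp.mean(data), jnp.log(jnp.std(data))])

# HESSE analog: Hessian of the NLL at the minimum, inverted,
# gives the covariance matrix of the parameters.
hess = jax.hessian(nll)(params_min)
cov = jnp.linalg.inv(hess)
errors = jnp.sqrt(jnp.diag(cov))
```

The diagonal of `cov` gives the parameter variances; for this toy case `errors[0]` reproduces the familiar sigma/sqrt(n) uncertainty on the mean.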
Hi @HDembinski, thank you very much for your reply :) Indeed it is possible to just use a first-order minimizer, compute the Hessian at the minimum with JAX, and invert it. However, I am currently in the process of comparing evermore's features with similar tools. These tools always/only use Minuit for minimization, so a fair comparison of e.g. a likelihood profile between […]. These two points are my main motivation to use […].

Best, Peter

PS: But I agree fully with you... I personally have had a pretty robust and fast experience with […].
Regarding Barlow-Beeston, I recommend having a look at our new method if you are not already aware of it. It is implemented as the default in the class […].
Dear iminuit developers,
Thank you very much for this great package!
I am the author of evermore, a pure-JAX package for building binned likelihoods in HEP. With it, one can construct arbitrary PyTrees of nuisance parameters and use them in a loss function. It is highly efficient to group parameters into arrays and modify bin contents in a vectorized fashion (especially for Barlow-Beeston[-lite]). Users have some parameters that are just single values (e.g. a single cross-section uncertainty), and some that are represented as arrays (e.g. Barlow-Beeston statistical uncertainties).
Thus, I'd like to ask whether it would be possible to add support for mixing arguments of different-sized arrays (and floats), e.g.:
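A minimal sketch of what such a mixed-signature cost function could look like (the function body and parameter names here are hypothetical, invented for illustration; the commented-out `Minuit` call shows the requested, not currently supported, usage):

```python
import numpy as np

# Hypothetical cost function mixing a scalar parameter (mu) and an
# array-valued parameter (stat), e.g. per-bin Barlow-Beeston factors.
def cost(mu, stat):
    return float(np.sum((stat - 1.0) ** 2) + (mu - 0.5) ** 2)

# The requested iminuit usage might look like this (NOT currently supported):
#   from iminuit import Minuit
#   m = Minuit(cost, mu=0.0, stat=np.ones(3))

print(cost(0.5, np.ones(3)))  # minimum of this toy cost
```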
This is particularly handy when working with JAX loss functions, where the parameters (`x`, `c`) are often in practice a nested PyTree of `jax.Array`s of arbitrary size. In this example `params` is just a simple dictionary, but this would also work with arbitrary (nested) PyTree structures if `iminuit` could support arrays of arbitrary size for the loss function kwargs.

Best, Peter
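One workaround available today, under the assumption that the loss takes a single PyTree argument as described above, is to flatten the PyTree into one 1D vector with `jax.flatten_util.ravel_pytree` and minimize over that vector (the `params` dictionary and `loss` below are illustrative stand-ins, not evermore code):

```python
import jax.numpy as jnp
from jax.flatten_util import ravel_pytree

# Illustrative nested PyTree of parameters with mixed sizes.
params = {"mu": jnp.array(0.5), "stat": jnp.ones(3)}

def loss(p):
    return jnp.sum((p["stat"] - 1.0) ** 2) + (p["mu"] - 0.5) ** 2

# ravel_pytree returns a flat 1D vector plus a function that
# restores the original PyTree structure from such a vector.
flat, unravel = ravel_pytree(params)

def wrapped_loss(x):
    # x is a plain 1D array, as iminuit (or any array-based
    # minimizer) expects; unravel rebuilds the PyTree.
    return loss(unravel(x))
```

The fitted flat vector can then be passed through `unravel` to recover the fitted PyTree.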
PS: JAX optimisers such as `optax` can minimise directly w.r.t. these PyTree structures. The minimiser returns the original PyTree structure, but its leaves contain the fitted parameter values. Here one does not even need the `wrapped_fun` step to convert a PyTree to a list of arguments.