
default to float64 and LBFGS from differentiable? #85

Open
BalzaniEdoardo opened this issue Jan 23, 2024 · 1 comment

Comments

@BalzaniEdoardo
Collaborator

Should we default to float64 precision and "LBFGS", as sklearn does?
Rationale:

  • "jaxopt.GradientDescent" seems to stop early on real data compared to "LBFGS", and so underperforms
  • "LBFGS" may require float64

How would we do that? At import, with a warning? We are waiting for a response from US-RSE.

@billbrod
Member

I posted about this on the US-RSE Slack, and they said that running code on import / in __init__.py is a very bad idea: "If you are a library you can not guess what else your users are doing and what other libraries they are importing. Consider the case where some other library the user imports makes the opposite decision about setting this option, then you can end up in a case where the behavior the user sees is dependent on the import order of the libraries!"

The recommendation was to have a function like our_opinionated_config() at the top level that does this (and any other necessary "tuning of other libraries"), discuss it in the docs, and use it in all tutorials. The user then has to opt in, but they don't need to understand all the details.
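A minimal sketch of what that opt-in function could look like. The name our_opinionated_config comes from the comment above; the body enabling float64 via jax.config.update("jax_enable_x64", True) is a real JAX setting, but the exact contents of the function are an assumption about what this library would tune:

```python
import jax
import jax.numpy as jnp


def our_opinionated_config():
    """Hypothetical opt-in configuration (name from the discussion above).

    Enables float64 precision in JAX, which otherwise defaults to float32.
    Because the user calls this explicitly instead of it running on import,
    it cannot silently conflict with another library's import-order choices.
    """
    jax.config.update("jax_enable_x64", True)


# The user opts in explicitly, e.g. at the top of a script or tutorial:
our_opinionated_config()
print(jnp.asarray(1.0).dtype)  # float64 once x64 is enabled
```

Keeping this as an explicit top-level call (rather than import-time side effects) is exactly the behavior the US-RSE advice recommends: the global flag only changes when the user asks for it.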
