Hi @tianhuil! Adding a callback API is a fairly large undertaking, and indeed already in progress (#22000)!
I'll leave this issue open for now, since afaik it is a new/unique use-case and is helpful to keep in mind, but bear in mind that this feature is probably not going to be released for some time and still requires much work.
Describe the workflow you want to enable
I would like to save the results of all runs in GridSearchCV to MLflow. See https://mlflow.org/docs/latest/tutorials-and-examples/tutorial.html for more details.
I would like to use GridSearchCV to do the above because it comes with many other features (e.g. HalvingGridSearchCV, multi-threading, etc.).

Describe your proposed solution
A callback parameter to GridSearchCV. Perhaps:

Describe alternatives you've considered, if relevant
To hack the scorer for this purpose: https://danielhnyk.cz/adding-callback-to-a-sklearn-gridsearch/
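The hack from that post amounts to passing a custom callable as the scoring parameter so that every evaluation is recorded as a side effect. A minimal sketch, where the logged_runs list stands in for real MLflow calls (e.g. mlflow.log_metric); the wiring to MLflow is an assumption, not part of the post:

```python
# Sketch of the scorer hack: wrap scoring so each evaluation is logged.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import GridSearchCV

logged_runs = []  # stand-in for an MLflow tracking store


def logging_scorer(estimator, X, y):
    score = accuracy_score(y, estimator.predict(X))
    # In a real setup this would be something like:
    #   mlflow.log_metric("accuracy", score)
    logged_runs.append({"C": estimator.get_params()["C"], "score": score})
    return score


X, y = make_classification(n_samples=100, random_state=0)
search = GridSearchCV(
    LogisticRegression(max_iter=1000),
    param_grid={"C": [0.1, 1.0]},
    scoring=logging_scorer,
    cv=3,
)
search.fit(X, y)
# One logged entry per (candidate, fold) test evaluation: 2 candidates * 3 folds
```

Note that the scorer only receives the fitted estimator and the data, not which candidate or fold it belongs to, which is exactly where the limitations below come from.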
This is suboptimal because:
1. If you want to return multiple metrics, you cannot save multiple scores using the provided API. This is because we have to pass multiple scorers, not a single function that generates multiple scores.
2. Enabling return_train_score will call the scorer callback too many times, and it is not easy to distinguish between the training and testing scoring.

Additional context
No response
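For illustration only: scikit-learn has no callback parameter today, but the requested behaviour can be approximated by a wrapper that replays cv_results_ into a callback after fitting. The function name and the callback signature below are hypothetical, not an existing or proposed scikit-learn API:

```python
# Hypothetical sketch of a per-candidate callback on top of GridSearchCV.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV


def grid_search_with_callback(estimator, param_grid, X, y, callback, **kwargs):
    """Fit GridSearchCV, then invoke callback(params, mean_test_score)
    once per candidate. The callback argument is the hypothetical part."""
    search = GridSearchCV(estimator, param_grid, **kwargs)
    search.fit(X, y)
    results = search.cv_results_
    for params, score in zip(results["params"], results["mean_test_score"]):
        # In the MLflow workflow this is where one would call, e.g.,
        # mlflow.log_params(params) and mlflow.log_metric("mean_test_score", score)
        callback(params, score)
    return search


X, y = make_classification(n_samples=100, random_state=0)
runs = []
grid_search_with_callback(
    LogisticRegression(max_iter=1000),
    {"C": [0.01, 0.1, 1.0]},
    X, y,
    callback=lambda params, score: runs.append((params, score)),
    cv=3,
)
# runs now holds one (params, mean_test_score) pair per candidate
```

This only fires after the whole search finishes, so it cannot stream per-fold results the way a real callback hook (as discussed in #22000) could.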