Replies: 5 comments 8 replies
-
Some discussion already took place in the past: https://scikit-learn.org/stable/faq.html#will-you-add-gpu-support
Also, to better frame the discussion (if any): GPU support won't make …
-
I think better use of warm starting is the next big opportunity here (see
#8230).
But you may be overly confident that this is "a waste".
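For context, a minimal sketch of what warm starting already looks like in scikit-learn (the dataset and sizes here are made up purely for illustration):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier

# Toy data, purely for illustration.
X, y = make_classification(n_samples=500, random_state=0)

# Fit an initial ensemble of 50 trees.
model = GradientBoostingClassifier(n_estimators=50, warm_start=True, random_state=0)
model.fit(X, y)

# Grow the same ensemble to 100 trees; the first 50 are reused, not refit.
model.set_params(n_estimators=100)
model.fit(X, y)

print(len(model.estimators_))  # 100 boosting stages in total
```

The point of #8230, as I read it, is to exploit this kind of reuse more systematically during hyper-parameter search, so consecutive fits don't start from scratch.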
-
CatBoost has GPU support, I notice. Does this imply it might help for scikit-learn's gradient-boosted trees?
-
If you want to try an alternative GPU implementation of some popular scikit-learn algorithms, you can give RAPIDS AI's cuML a try. Those estimators should be reasonably compatible with the scikit-learn API, and in most cases you should be able to tune them with …
-
Grid search is simply not usable with big models. Training time is 10 seconds on the GPU and ten minutes on the CPU. If you have to compare 9 parameter settings, that is already an hour and a half on CPU. It's faster to compare models by hand. With XGBRegressor, what tool do you use to search for parameters using the GPU? Optuna? Thanks
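One cheaper option than an exhaustive grid (not a GPU solution, just far fewer fits) is scikit-learn's `RandomizedSearchCV`, which samples a fixed number of settings instead of trying them all. A minimal sketch, with a made-up estimator and dataset for illustration (Optuna plays a similar role, but samples adaptively):

```python
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import RandomizedSearchCV

# Toy data, purely for illustration.
X, y = make_regression(n_samples=200, n_features=10, random_state=0)

# The full grid has 3 * 3 * 3 = 27 settings; sample only 5 of them.
search = RandomizedSearchCV(
    GradientBoostingRegressor(random_state=0),
    param_distributions={
        "n_estimators": [50, 100, 200],
        "max_depth": [2, 3, 5],
        "learning_rate": [0.01, 0.1, 0.3],
    },
    n_iter=5,
    cv=3,
    random_state=0,
)
search.fit(X, y)
print(search.best_params_)
```

With 10-minute fits, `n_iter=5` caps the search at roughly 50 minutes of training per CV fold rather than growing with the grid size.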
-
Let's just discuss, not negate or attack.