This repository has been archived by the owner on Nov 14, 2023. It is now read-only.

[feature request] sklearn cross_val_score #179

Open
r0f1 opened this issue Feb 1, 2021 · 1 comment

Comments

@r0f1

r0f1 commented Feb 1, 2021

Hi,
Often I want to compare different classifiers on the same dataset, and I find myself writing code that looks like this:

import numpy as np
from sklearn.dummy import DummyClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

list_of_feature_lists = [...] # could look like this: [[0, 1, 2], [2, 4, 5], [0, 1, 2, 3, 4, 5]]

for feature_list in list_of_feature_lists:
    X = Xbig[feature_list]  # features
    y = ...                 # target

    model = LogisticRegression()
    scores = cross_val_score(model, X, y)

    dummy = DummyClassifier(strategy="most_frequent")
    dummy_score = np.mean(cross_val_score(dummy, X, y))

    print(scores)
    print(dummy_score)

I was wondering if ray could be used to speed up the process. Or more specifically: Can I use ray to do cross validation and use it instead of cross_val_score()? If not, I think that would be a useful feature to add.

Thank you.

@Yard1
Member

Yard1 commented Feb 1, 2021

I think you could just use grid search with a single possible combination of parameters to essentially get the equivalent of sklearn's CV on Ray.
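To illustrate the idea, here is a minimal sketch using plain scikit-learn: a grid search whose grid contains exactly one candidate degenerates to ordinary k-fold cross-validation, so its mean test score matches cross_val_score. The toy dataset and the choice of C=1.0 are assumptions for demonstration; the premise is that a Ray-backed drop-in for GridSearchCV would follow the same pattern and run the folds in parallel.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV, cross_val_score

# Hypothetical toy data standing in for the user's Xbig / y.
X, y = make_classification(n_samples=200, n_features=6, random_state=0)

# A "grid" with exactly one candidate: the search reduces to plain 5-fold CV.
grid = GridSearchCV(LogisticRegression(max_iter=1000),
                    param_grid={"C": [1.0]}, cv=5)
grid.fit(X, y)

# Same folds, same default scorer -> same mean score as cross_val_score.
cv_mean = cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=5).mean()
assert np.isclose(grid.cv_results_["mean_test_score"][0], cv_mean)
```

Both paths use the same default splitter (stratified 5-fold, no shuffling) and the estimator's default scorer, which is why the scores agree exactly.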
