Use scikit-optimize for tuning search space for model selection #95
Comments
This is how one would invoke it:

```python
from inverse_covariance import QuicGraphLasso  # from the skggm package
from skopt import BayesSearchCV
from skopt.space import Real

# `metric`, `num_folds`, and `X` are assumed to be defined elsewhere,
# e.g. metric = 'log_likelihood', num_folds = 5, and X the data matrix.
model = BayesSearchCV(
    QuicGraphLasso(init_method='corrcoef', score_metric=metric),
    cv=num_folds,
    n_iter=16,
    refit=True,
)
# Note: add_spaces comes from an early skopt API; current skopt releases
# pass search_spaces to the BayesSearchCV constructor instead.
model.add_spaces('space_1', {
    'lam': Real(1e-2, 1e1, prior='log-uniform'),
})
model.fit(X)
```
Based on your example here, this feels more like something that should go in
@jasonlaska The example I made was just a brute-force example to show usage. skopt's Optimizer lets you adaptively pick the grid of points to evaluate. For example, this would not be possible to accomplish for EBIC by invoking the existing interface. The moment we have more than one path to optimize over, with multiple penalties or multiple parameters, this starts to become a lot more handy as well.
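The adaptivity referred to here is skopt Optimizer's ask/tell interface: you `ask()` for the next point, evaluate your objective (e.g. EBIC at that penalty), and `tell()` the result back so the next proposal accounts for everything seen so far. Below is a minimal pure-Python sketch of that same ask/tell loop — it is not skopt (no Gaussian-process model; new points are just drawn near the best point seen so far), and the quadratic `score` is a hypothetical stand-in for an EBIC evaluation:

```python
import random


class TinyOptimizer:
    """Toy ask/tell optimizer over a 1-D interval.

    Mimics the shape of skopt.Optimizer's interface: ask() proposes the
    next point, tell() records the observed score. The "model" here is
    deliberately trivial: explore uniformly at first, then sample in a
    shrinking window around the best point seen so far.
    """

    def __init__(self, low, high, rng=None):
        self.low, self.high = low, high
        self.history = []  # list of (x, score) pairs
        self.rng = rng or random.Random(0)

    def ask(self):
        if len(self.history) < 3:  # initial uniform exploration
            return self.rng.uniform(self.low, self.high)
        best_x, _ = min(self.history, key=lambda p: p[1])
        width = (self.high - self.low) / (1 + len(self.history))
        x = best_x + self.rng.uniform(-width, width)
        return min(self.high, max(self.low, x))  # clip to bounds

    def tell(self, x, score):
        self.history.append((x, score))


def score(lam):
    # Hypothetical stand-in for an EBIC evaluation at penalty `lam`
    # (assumed convex with its minimum at lam = 0.3).
    return (lam - 0.3) ** 2


opt = TinyOptimizer(0.01, 10.0)
for _ in range(30):
    lam = opt.ask()
    opt.tell(lam, score(lam))

best_lam, best_score = min(opt.history, key=lambda p: p[1])
```

With skopt itself, the loop body would be `x = opt.ask()` followed by `opt.tell(x, objective(x))` on a real `skopt.Optimizer`; the point of the sketch is only the control flow a path-mode solver cannot replicate on its own.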
Create a drop-in replacement for GridSearchCV using scikit-optimize. The main benefit here is that scikit-optimize can ensure that not all points in the search space or grid are evaluated, as is done by GridSearchCV. Currently QuicGraphLassoCV is modeled after GridSearchCV while taking advantage of the path mode of the solver. We would instead need to model this estimator class after BayesSearchCV. Similarly, other model selection criteria that involve searching along a regularization path (as opposed to pure model ensembling) could also have analogues, such as BayesSearchEBIC and so forth.