
Bayesian optimization with Random Forest #113

Open
mikolajwojciuk opened this issue Mar 22, 2021 · 1 comment

@mikolajwojciuk

Hi there,

I am having a problem implementing Bayesian optimization with a Random Forest model. No matter how I set up the Sherpa study, I constantly get an error saying:

"InvalidConfigError: local_penalization evaluator can only be used with GP models"

My Sherpa config:

```python
algorithm = sherpa.algorithms.GPyOpt(model_type='RF',
                                     acquisition_type='MPI',
                                     verbosity=True,
                                     max_num_trials=8)
study = sherpa.Study(parameters=parameters,
                     algorithm=algorithm,
                     lower_is_better=True,
                     disable_dashboard=True)
```

P.S. Overall great library!

@LarsHH (Collaborator)

LarsHH commented Apr 9, 2021

Hi @mikolajwojciuk !

Apologies for the slow reply. It looks like the issue is due to the evaluator_type in GPyOpt, i.e. how GPyOpt evaluates concurrent trials. I didn't realize RF doesn't work with local_penalization, which I had hardcoded as the evaluator type. Unless you have already resolved the issue, could you try setting max_concurrent=1? That is, for your code:

```python
algorithm = sherpa.algorithms.GPyOpt(model_type='RF',
                                     acquisition_type='MPI',
                                     verbosity=True,
                                     max_num_trials=8,
                                     max_concurrent=1)
```

With max_concurrent=1, GPyOpt should ignore the evaluator setting. If that doesn't work, you could try going to the line

```python
evaluator_type='local_penalization',
```

in the Sherpa GPyOpt wrapper source and changing it to evaluator_type='random'.
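Putting the suggestion together, a complete setup would look roughly like the sketch below. This is illustrative, not a confirmed fix: the parameters list (a single log-scaled learning rate) is a placeholder standing in for the reporter's actual search space.

```python
import sherpa

# Illustrative search space; replace with your actual hyperparameters.
parameters = [sherpa.Continuous(name='learning_rate',
                                range=[1e-4, 1e-1],
                                scale='log')]

# max_concurrent=1 sidesteps the local_penalization evaluator,
# which is only valid for GP models, not model_type='RF'.
algorithm = sherpa.algorithms.GPyOpt(model_type='RF',
                                     acquisition_type='MPI',
                                     verbosity=True,
                                     max_num_trials=8,
                                     max_concurrent=1)

study = sherpa.Study(parameters=parameters,
                     algorithm=algorithm,
                     lower_is_better=True,
                     disable_dashboard=True)
```

This is a configuration fragment; running the study loop (iterating over `study` and calling `study.add_observation` / `study.finalize` per trial) proceeds as usual once the algorithm is constructed.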

Thanks for raising the issue.

Best,
Lars
