Add parameter typing (only float and integer) #131

Open
jmehault wants to merge 10 commits into master
Conversation

@jmehault commented Nov 29, 2018

Add an optional parameter "ptypes" to the BayesianOptimization class to indicate the type of each parameter to optimize, given as a dictionary. For now, every parameter must be typed. The goal is mainly to validate the method.
If a parameter is declared as integer, the extremal position is chosen as follows: the float outcome of the acquisition minimization is transformed into two candidate positions (the lower and the upper integer), and the minimum is whichever of the two yields the smaller acquisition value (see the sketch below).
In the case of multiple integer parameters, the current implementation does not combine every lower and upper integer value; the only combinations tested are all lowers versus all uppers.

Waiting for comments!
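A minimal sketch of the proposed usage and of the rounding rule above. The ptypes argument is the addition proposed in this PR (not part of the released API), and snap_to_integers is an assumed helper name used purely for illustration:

import numpy as np
from bayes_opt import BayesianOptimization

def black_box(x, n):
    # `n` is meant to be used as an integer; `x` stays a float.
    return -(x - 0.3) ** 2 - (n - 5) ** 2

optimizer = BayesianOptimization(
    f=black_box,
    pbounds={'x': (0, 1), 'n': (1, 10)},
    ptypes={'x': float, 'n': int},  # proposed here: every parameter is typed
    random_state=1,
)
optimizer.maximize(init_points=2, n_iter=5)

def snap_to_integers(x_float, int_mask, acq):
    # The float minimizer of the acquisition is split into two candidates:
    # all integer dimensions rounded down vs. all rounded up (mixed
    # floor/ceil combinations are deliberately not explored). The candidate
    # with the smaller acquisition value wins, since the acquisition is
    # being minimized.
    lower, upper = x_float.copy(), x_float.copy()
    lower[int_mask] = np.floor(x_float[int_mask])
    upper[int_mask] = np.ceil(x_float[int_mask])
    return lower if acq(lower) <= acq(upper) else upper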

@codecov-io commented Nov 29, 2018

Codecov Report

Merging #131 into master will increase coverage by 0.59%.
The diff coverage is 81.96%.


@@            Coverage Diff             @@
##           master     #131      +/-   ##
==========================================
+ Coverage   70.92%   71.52%   +0.59%     
==========================================
  Files           6        6              
  Lines         399      446      +47     
  Branches       44       61      +17     
==========================================
+ Hits          283      319      +36     
- Misses        112      118       +6     
- Partials        4        9       +5
Impacted Files                        Coverage Δ
bayes_opt/bayesian_optimization.py    77.31% <100%> (ø) ⬆️
bayes_opt/target_space.py             95.14% <80%> (-4.86%) ⬇️
bayes_opt/util.py                     83.33% <82.35%> (-1.43%) ⬇️

Continue to review full report at Codecov.

Legend
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 9b8bafb...7c026d7.

@yuanjie-ai

Author, have you updated this?

@jmehault (Author) commented Jan 10, 2019

I am using these updates in my own work. Improvements could still be made, but I am waiting for initial comments on the proposed solution. I don't know whether it should be merged in the end.

@alonfnt (Contributor) commented May 18, 2020

May I suggest an alternative way of indicating types? To avoid requiring a whole dictionary just to type a single parameter, something like this:

>>> pbounds = {'p1': (0, 1), 'p2': (0, 100, int)}
>>> [item[1][:2] for item in pbounds.items()]
[(0, 1), (0, 100)]
>>> [float if len(item[1])<3 else item[1][2] for item in pbounds.items()]
[<class 'float'>, <class 'int'>]

where a third element carrying the type is added to the tuple; otherwise float is assumed.
Thoughts?
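A minimal sketch of that parsing rule (the helper name is assumed, not part of the library):

def split_pbounds(pbounds):
    """Split an extended pbounds dict into plain bounds and a type map."""
    bounds = {key: value[:2] for key, value in pbounds.items()}
    # An optional third tuple element carries the type; default to float.
    types = {key: (value[2] if len(value) > 2 else float)
             for key, value in pbounds.items()}
    return bounds, types

>>> split_pbounds({'p1': (0, 1), 'p2': (0, 100, int)})
({'p1': (0, 1), 'p2': (0, 100)}, {'p1': <class 'float'>, 'p2': <class 'int'>})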

@mjkvaak commented Sep 24, 2020

I very much like the value that parameter typing adds to BO! However, this could still be improved so that the parameters are returned cast to their corresponding type. I mean, even if I declare, e.g., max_depth: int, the parameter still gets fed to the function-to-be-optimized as a float. This caused me some headache when I was trying to figure out why my classifier kept throwing fit errors at me.
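Something along these lines would address it; a sketch assuming the ptypes map from this PR is available (the helper name is hypothetical):

def cast_params(params, ptypes):
    # Cast each suggested value to its declared type before calling the
    # target function; parameters without a declared type stay float.
    return {name: ptypes.get(name, float)(value)
            for name, value in params.items()}

>>> cast_params({'max_depth': 6.7, 'lr': 0.1}, {'max_depth': int})
{'max_depth': 6, 'lr': 0.1}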

@ChanderJindal left a comment

The code looks fine.

@ChanderJindal

Nice idea here. Until now, I was converting the float to int inside the black_box_function; it isn't the best approach, but it happened to be a convenient solution.
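For reference, a minimal sketch of that workaround, with a toy objective and hypothetical parameter names:

def black_box_function(max_depth, subsample):
    # BayesianOptimization always suggests floats, so snap to int here.
    max_depth = int(round(max_depth))
    # Toy objective standing in for a real model score.
    return -(max_depth - 5) ** 2 - (subsample - 0.8) ** 2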
