Cost bo loop #195
Conversation
Codecov Report
@@ Coverage Diff @@
## master #195 +/- ##
=======================================
Coverage 89.47% 89.47%
=======================================
Files 105 105
Lines 2861 2861
Branches 292 292
=======================================
Hits 2560 2560
Misses 249 249
Partials 52 52
Just one naming thing, otherwise good to go
@@ -59,6 +59,7 @@ def create_loop_state(x_init: np.ndarray, y_init: np.ndarray, cost: np.ndarray = None
    :param x_init: x values for initial function evaluations.
    :param y_init: y values for initial function evaluations.
    :param cost: observed costs for initial function evaluations.
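To make the change above concrete, here is a minimal, self-contained sketch of how an optional `cost` argument can be threaded into the loop state. The `LoopState` class below is an illustrative stand-in, not emukit's actual class; only the `create_loop_state` signature mirrors the diff.

```python
import numpy as np


class LoopState:
    """Illustrative stand-in for the real loop state class (assumption, not emukit's API)."""

    def __init__(self, X: np.ndarray, Y: np.ndarray, cost: np.ndarray = None):
        self.X = X
        self.Y = Y
        self.cost = cost


def create_loop_state(x_init: np.ndarray, y_init: np.ndarray,
                      cost: np.ndarray = None) -> LoopState:
    """Bundle initial evaluations, and optionally their observed costs, into a loop state."""
    if x_init.shape[0] != y_init.shape[0]:
        raise ValueError("x_init and y_init must contain the same number of points")
    if cost is not None and cost.shape[0] != x_init.shape[0]:
        raise ValueError("cost must contain one entry per initial point")
    return LoopState(x_init, y_init, cost)
```

Keeping `cost` optional preserves backwards compatibility: existing callers that only pass `x_init` and `y_init` keep working, and `state.cost` is simply `None` when no costs were observed.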
Good catch!
@@ -14,13 +14,15 @@
def create_bayesian_optimization_loop(x_init: np.ndarray, y_init: np.ndarray, parameter_space: ParameterSpace,
                                      acquisition_type: AcquisitionType, model_type: ModelType,
                                      c_init: np.ndarray = None,
Let's be more expressive
- c_init: np.ndarray = None,
+ cost_init: np.ndarray = None,
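The rename only touches the public keyword; internally the value is still forwarded to the loop-state helper. A stripped-down sketch of that forwarding (the `parameter_space`, `acquisition_type` and `model_type` arguments are omitted, and `create_loop_state` here is a simplified stand-in, not the real implementation):

```python
import numpy as np


def create_loop_state(x_init, y_init, cost=None):
    # simplified stand-in for the helper from the other hunk:
    # just record what was passed so the forwarding is visible
    return {"X": x_init, "Y": y_init, "cost": cost}


def create_bayesian_optimization_loop(x_init: np.ndarray, y_init: np.ndarray,
                                      cost_init: np.ndarray = None):
    # the more expressive keyword `cost_init` is forwarded as the state's cost
    return create_loop_state(x_init, y_init, cost=cost_init)
```

Since `cost_init` defaults to `None`, callers that never track evaluation cost are unaffected by the rename.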
good point
Actually one more thing. Can you add a unit test that verifies initial cost is used as expected? Can be as simple as "create bo loop, make sure cost is in the state, done"
ok I've added a simple unit test that checks whether all the initial data is in the loop state
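A test in the spirit of what was requested could look like the sketch below. The `create_loop_state` stub here is only a stand-in so the example is self-contained; the real test would call the project's own loop constructor instead.

```python
import numpy as np


def create_loop_state(x_init, y_init, cost=None):
    # minimal stand-in so this test sketch runs on its own;
    # the actual test would exercise the real loop constructor
    return {"X": x_init, "Y": y_init, "cost": cost}


def test_initial_data_in_loop_state():
    x = np.array([[0.0], [1.0]])
    y = np.array([[1.0], [4.0]])
    c = np.array([[0.2], [0.3]])

    state = create_loop_state(x, y, cost=c)

    # all three pieces of initial data should survive the round trip into the state
    assert np.array_equal(state["X"], x)
    assert np.array_equal(state["Y"], y)
    assert np.array_equal(state["cost"], c)
```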
Looks good, but the build fails for some reason
hmm weird, I don't understand why it suddenly throws an import error
Ok, after some digging, the problem seems to be a bug in PyTorch here
Ok, a few options:
2 and 3 would be best I guess. If you go down that route, can you please create an issue and reference the PR or whatever we can from the PyTorch repo?
ok let's go for 2, since I don't know which PyTorch version introduced that bug
So the build has passed, but running times are concerning: over 20 minutes on macOS. @aaronkl how long does this new test take locally for you?
it seems that the tests for Bohamiann take much longer than necessary. I will send another PR to fix that.
Much better times now, merging |
Currently, we can only pass initial data points and their corresponding function values to the BO loop. However, if we also want to keep track of the cost per function evaluation, we should be able to pass the initial costs as well; otherwise the internal loop state becomes inconsistent.
By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license.