
add support for continuous parameter ranges #40

Open
SimonBlanke opened this issue Apr 23, 2023 · 4 comments
@SimonBlanke
Owner

In this issue I will track the progress of adding support for continuous parameter ranges in the search-space.

For most optimization algorithms it should be easy to add support for continuous parameter ranges:

  • Hill-climbing-based algorithms already produce continuous positions (float) in the search-space, which are then converted into discrete positions (int).
  • SMBO algorithms can sample from the continuous search-space to calculate the acquisition-function. I have already seen this implemented in other Bayesian-optimization packages.
  • PSO, spiral optimization and the downhill-simplex optimization already work by calculating float positions, similarly to hill-climbing-based algorithms.

So in conclusion: Adding support for continuous search-spaces should be possible with reasonable effort.
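As a rough illustration of the conversion step mentioned for hill-climbing above, this is how a float position could be snapped to a discrete grid, and how it could simply be clipped instead for a continuous dimension (a minimal sketch with made-up helper names, not the actual gradient-free-optimizers internals):

import numpy as np

def to_discrete(value, dim_values):
    # snap the continuous position to the nearest entry of the discrete grid
    idx = int(np.argmin(np.abs(dim_values - value)))
    return dim_values[idx]

def to_continuous(value, low, high):
    # for a continuous dimension, only clip the position to the bounds
    return float(np.clip(value, low, high))

x1_grid = np.arange(-100, 101, 0.1)
print(to_discrete(3.14159, x1_grid))      # nearest grid point, roughly 3.1
print(to_continuous(3.14159, -1, 1))      # clipped to the bounds: 1.0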

The next problem to discuss is how this will be integrated into the current API. It is important to me that the API design stays simple and intuitive.
Also: it would be very interesting if the search-space could have discrete parameter ranges in some dimensions and continuous ones in others.

The current search-space looks something like this:

search_space = {
    "x1": np.arange(-100, 101, 0.1),
    "x2": np.arange(-100, 101, 0.1),
}
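For context, such a search-space is passed to an optimizer together with an objective function that reads the named dimensions; a sketch following the usual gradient-free-optimizers pattern:

import numpy as np
from gradient_free_optimizers import HillClimbingOptimizer

def objective_function(para):
    # toy objective using the named dimensions
    return -(para["x1"] ** 2 + para["x2"] ** 2)

search_space = {
    "x1": np.arange(-100, 101, 0.1),
    "x2": np.arange(-100, 101, 0.1),
}

opt = HillClimbingOptimizer(search_space)
opt.search(objective_function, n_iter=100)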

What would a continuous dimension look like? It cannot be a numpy array, and it should be clearly distinguishable from a discrete dimension. Maybe a tuple:

search_space = {
    "x1": np.arange(-100, 101, 0.1),
    "x2": np.arange(-100, 101, 0.1),
    "x3": (-1, 1),
}
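One way the optimizer could tell the two kinds of dimensions apart when sampling a position, assuming the tuple convention above (just a sketch to illustrate the idea, not an implemented feature):

import numpy as np

rng = np.random.default_rng(0)

def sample_position(search_space):
    # treat numpy arrays as discrete grids and 2-tuples as continuous bounds
    position = {}
    for name, dim in search_space.items():
        if isinstance(dim, np.ndarray):
            position[name] = rng.choice(dim)          # discrete dimension
        elif isinstance(dim, tuple) and len(dim) == 2:
            low, high = dim
            position[name] = rng.uniform(low, high)   # continuous dimension
        else:
            raise TypeError(f"unsupported dimension type for {name!r}")
    return position

search_space = {
    "x1": np.arange(-100, 101, 0.1),
    "x2": np.arange(-100, 101, 0.1),
    "x3": (-1, 1),
}
print(sample_position(search_space))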

I will brainstorm some ideas and write some prototype code to get a clear vision for this feature.

@SimonBlanke SimonBlanke added the enhancement New feature or request label Apr 23, 2023
@SimonBlanke SimonBlanke self-assigned this Apr 23, 2023
@logan-dunbar

Hi @SimonBlanke, I was also looking for continuous parameter ranges. Perhaps you could look at the Gymnasium Spaces code for ideas on handling discrete/continuous dimensions (it's one of the standard reinforcement-learning toolkits). The Box space has low and high attributes which specify the bounds. Also, the use of a generic vector for the bounds alleviates the need to name each dimension as in your current implementation.
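For reference, a minimal Box example (not from this repo, only to illustrate the low/high bounds idea):

import numpy as np
from gymnasium.spaces import Box

# a 2-dimensional continuous space with per-dimension bounds
space = Box(low=np.array([-1.0, 0.0]), high=np.array([1.0, 10.0]), dtype=np.float64)
sample = space.sample()   # random point inside the bounds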

@SimonBlanke
Owner Author

Hello @logan-dunbar,

sorry for this very late answer. I read your comment and looked into the link you provided, but answering you somehow fell off my radar.

Using low and high is more verbose, but it might not be necessary. In this example:

search_space = {
    "x1": np.arange(-100, 101, 0.1),
    "x2": np.arange(-100, 101, 0.1),
    "x3": (-1, 1),
}

The low and high values (in parameter x3) are like positional arguments of a function, so it is very similar to the numpy call in x1 and x2. It is intuitive that the first one is low and the second one is high.
I would consider using the solution you provided if there are additional parameters for a continuous dimension in the future. That would look like this:

search_space = {
    "x1": np.arange(-100, 101, 0.1),
    "x2": np.arange(-100, 101, 0.1),
    "x3": {
        "low": -1,
        "high": 1,
        "additional parameter 1": ...,
        "additional parameter 2": ...,
    },
}

In this case the naming improves readability, because putting multiple values into a tuple gets confusing at some point.
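Internally, both spellings could be normalized to the same bounds representation, e.g. with a hypothetical helper like this (a sketch, not actual library code):

def parse_continuous_dim(dim):
    # accept the short tuple form (-1, 1) or the verbose dict form
    # {"low": -1, "high": 1, ...} and return (low, high, extra_options)
    if isinstance(dim, tuple) and len(dim) == 2:
        low, high = dim
        return low, high, {}
    if isinstance(dim, dict):
        extra = {k: v for k, v in dim.items() if k not in ("low", "high")}
        return dim["low"], dim["high"], extra
    raise TypeError("continuous dimension must be a 2-tuple or a dict")

print(parse_continuous_dim((-1, 1)))
print(parse_continuous_dim({"low": -1, "high": 1}))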


> the use of a generic vector for the bounds alleviates the need to name each dimension as in your current implementation.

What would this look like? I guess you mean the way the dimensions are accessed in the objective-function.
In that case: the names for the dimensions are generic in this example and look like they could just be indices of a vector. But if you want to do something like hyperparameter-optimization, the search-space looks very different and the dimension names help readability.
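For example, in a hyperparameter-optimization setting the dimension names carry real meaning (an illustrative search-space, using the proposed tuple form for the continuous dimension):

import numpy as np

search_space = {
    "n_estimators": np.arange(10, 500, 10),
    "max_depth": np.arange(2, 20),
    "learning_rate": (0.001, 0.3),   # continuous dimension (proposed tuple form)
}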

@mxv001

mxv001 commented Nov 9, 2023

Just wanted to chime in here. Perhaps what @logan-dunbar is asking for is multi-dimensional parameter declaration. Something like

search_space = {
    "x1": np.arange(-100, 101, 0.1),
    "x2": np.arange(-100, 101, 0.1),
    # something like below, where there is an array declaration
    # (obviously this doesn't work; even the numpy syntax is wrong)
    "x3": np.array(4, 2)
}

For example, in the nevergrad package there is an interface for parameter arrays: see https://facebookresearch.github.io/nevergrad/parametrization.html

The explicit API they expose is

import nevergrad

search_space = nevergrad.p.Dict(
    param_array=nevergrad.p.Array(shape=(2, 4)).set_bounds(0, 2),
)

Perhaps this is a different topic from the original post. If it is, I can create it as a new feature request. It would be a great feature! Many models have arrays of parameters, and it can be a pain to fold/unfold everything.
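To illustrate the fold/unfold pain: with the current scalar-dimension API, a (2, 4) parameter array has to be flattened into one named dimension per entry and rebuilt inside the objective function (a sketch with hypothetical names):

import numpy as np

shape = (2, 4)

# fold: one named scalar dimension per array entry
search_space = {
    f"w_{i}_{j}": np.arange(0, 2.01, 0.01)
    for i in range(shape[0])
    for j in range(shape[1])
}

def objective_function(para):
    # unfold: rebuild the (2, 4) array inside the objective function
    w = np.array([[para[f"w_{i}_{j}"] for j in range(shape[1])]
                  for i in range(shape[0])])
    return -np.sum(w ** 2)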

@SimonBlanke
Owner Author

Hello @mxv001,

thanks for your suggestion. I looked into the nevergrad package. The interface you have shown is somewhat related to this issue, because it also enables continuous parameter ranges. But it is also a much broader topic because of the multi-dimensional parameter declaration.

I would suggest that you open another issue (feature request).

I am not sure if a "nevergrad-style" search-space creation will find its way into gradient-free-optimizers (I like to keep the API very simple), but I think it would be valuable to discuss it.
