add support for continuous parameter ranges #40

Open
@SimonBlanke

In this issue I will track the progress of adding support for continuous parameter ranges to the search-space.

For most optimization algorithms it should be easy to add support for continuous parameter ranges:

  • Hill-climbing-based algorithms already produce continuous positions (floats) in the search-space, which are then converted into discrete positions (ints).
  • SMBO algorithms can sample from the continuous search-space to evaluate the acquisition function. I have already seen this implementation in other Bayesian-optimization packages.
  • PSO, Spiral Optimization, and Downhill-Simplex Optimization already work by calculating float positions, similar to the hill-climbing-based algorithms.

So in conclusion: Adding support for continuous search-spaces should be possible with reasonable effort.
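To make the first bullet point concrete, here is a minimal sketch of the float-to-discrete conversion a hill-climbing step could use. The function name and convention are hypothetical, not the package's actual internals: each float coordinate is snapped to the nearest value of that dimension's discrete array.

```python
import numpy as np

def to_discrete_position(float_pos, search_space):
    """Map a continuous (float) position onto the nearest discrete
    values of the search-space. Hypothetical helper, for illustration."""
    discrete_pos = {}
    for dim, values in search_space.items():
        # index of the discrete value closest to the float coordinate
        idx = int(np.argmin(np.abs(values - float_pos[dim])))
        discrete_pos[dim] = values[idx]
    return discrete_pos

search_space = {
    "x1": np.arange(-100, 101, 0.1),
    "x2": np.arange(-100, 101, 0.1),
}
# snaps to the nearest grid values: x1 ≈ 0.1, x2 ≈ -3.1
print(to_discrete_position({"x1": 0.07, "x2": -3.14}, search_space))
```

This keeps the optimizer's internal arithmetic continuous while the objective function only ever sees values that exist in the discrete search-space.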

The next problem to discuss is how this will be integrated into the current API. It is important to me that the API design stays simple and intuitive.
Also: it would be very interesting if the search-space could have discrete parameter ranges in some dimensions and continuous ones in others.

The current search-space looks something like this:

search_space = {
    "x1": np.arange(-100, 101, 0.1),
    "x2": np.arange(-100, 101, 0.1),
}

What would a continuous dimension look like? It cannot be a numpy array, and it should be distinguishable enough from a discrete dimension. Maybe a tuple:

search_space = {
    "x1": np.arange(-100, 101, 0.1),
    "x2": np.arange(-100, 101, 0.1),
    "x3": (-1, 1),
}
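The tuple convention could be dispatched on by type, as in this hypothetical sketch (`sample_position` is not part of the package; it just shows that mixed spaces stay easy to handle): numpy arrays are treated as discrete dimensions, 2-tuples `(low, high)` as continuous ranges.

```python
import random
import numpy as np

def sample_position(search_space, rng=None):
    """Draw one random position from a mixed search-space.
    Convention (hypothetical): numpy array -> discrete dimension,
    tuple (low, high) -> continuous range."""
    rng = rng or random.Random()
    position = {}
    for dim, space in search_space.items():
        if isinstance(space, tuple):
            low, high = space
            position[dim] = rng.uniform(low, high)  # continuous sample
        else:
            position[dim] = rng.choice(list(space))  # discrete sample
    return position

search_space = {
    "x1": np.arange(-100, 101, 0.1),
    "x3": (-1, 1),
}
pos = sample_position(search_space)
```

An `isinstance` check like this keeps the user-facing API unchanged for existing discrete spaces while making the continuous case a one-line addition per dimension.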

I will brainstorm some ideas and write some prototype code to get a clear vision for this feature.

Labels

enhancement (New feature or request)