🚀 Feature Request
The function `optimize_acqf_mixed` in `botorch/optim/optimize.py` currently runs a sequential greedy optimization for `q > 1`. Because of this, the function cannot consider inter-parameter constraints and even raises an `IndexError` when trying to normalize such constraints.
Motivation
Is your feature request related to a problem? Please describe.
The motivation is two-fold:
- Better alignment of the signature of `optimize_acqf_mixed` with that of `optimize_acqf`. After implementing this request, both functions would accept the same types of constraints.
- I am using botorch to optimize the parameters of chemical experiments. These experiments usually combine continuous parameters (e.g., substance concentrations) with discrete ones (e.g., the type of catalyst), so `optimize_acqf_mixed` is the optimization function to use. Moreover, batched optimization with `q > 1` allows optimized experiments to be executed in parallel, saving lab resources. Inter-parameter constraints are then needed for parameters such as the experiment's temperature, which is applied to all samples in a batch. Support for inter-parameter constraints in `optimize_acqf_mixed` would enable the planning of batched chemistry experiments.
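For concreteness, such a constraint can be expressed in the `(indices, coefficients, rhs)` tuple format that `optimize_acqf` accepts, where inter-point constraints use two-dimensional indices of `(batch_index, parameter_index)` pairs. The sketch below is a hypothetical illustration: the temperature dimension is made up, and plain Python lists stand in for the `torch` tensors the real API expects.

```python
# Hypothetical example: force the temperature parameter to be equal for the
# first two points of a q-batch, i.e. x[0, d] - x[1, d] == 0.
# NOTE: the dimension index (3) is assumed for illustration, and plain lists
# stand in for torch tensors to keep the sketch dependency-free.
TEMPERATURE_DIM = 3  # assumed position of the temperature parameter

equality_constraints = [
    (
        # indices: one (q_index, parameter_index) row per term
        [[0, TEMPERATURE_DIM], [1, TEMPERATURE_DIM]],
        # coefficients: 1 * x[0, d] - 1 * x[1, d]
        [1.0, -1.0],
        # rhs: the weighted sum must equal 0.0, i.e. both temperatures match
        0.0,
    )
]
```

To tie the temperature of all `q` points together, one such pairwise constraint per additional batch point would be added.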
Pitch
Describe the solution you'd like
One option would be to implement a case distinction:
- If inter-parameter constraints are present: run a joint optimization.
- Otherwise: run the sequential optimization, as it may save computational resources.

For the joint optimization, one can either (a) enumerate all `n_combos ** q` possible combinations of fixed features, which will probably be expensive, or (b) directly iterate over the provided `fixed_features_list` without enumerating all combinations, as proposed in the code snippet below. If we implement option (b), will there be any loss in optimality if only inter-parameter constraints are present? I am open to discussion.
```python
def optimize_acqf_mixed(...):
    ...
    if _has_inter_parameter_constraints(
        inequality_constraints,
        equality_constraints,
        nonlinear_inequality_constraints,
    ):
        # Joint optimization: optimize all q points at once for each set of
        # fixed features, so inter-parameter constraints can be honored.
        ff_candidate_list, ff_acq_value_list = [], []
        for fixed_features in fixed_features_list:
            candidate, acq_value = optimize_acqf(
                acq_function=acq_function,
                bounds=bounds,
                q=q,
                num_restarts=num_restarts,
                raw_samples=raw_samples,
                options=options or {},
                inequality_constraints=inequality_constraints,
                equality_constraints=equality_constraints,
                nonlinear_inequality_constraints=nonlinear_inequality_constraints,
                fixed_features=fixed_features,
                post_processing_func=post_processing_func,
                batch_initial_conditions=batch_initial_conditions,
                ic_generator=ic_generator,
                return_best_only=True,
                **ic_gen_kwargs,
            )
            ff_candidate_list.append(candidate)
            ff_acq_value_list.append(acq_value)
        # Pick the fixed-feature assignment with the best acquisition value.
        ff_acq_values = torch.stack(ff_acq_value_list)
        best = torch.argmax(ff_acq_values)
        return ff_candidate_list[best], ff_acq_values[best]
    ...
```
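The snippet above assumes a helper `_has_inter_parameter_constraints`, which does not currently exist in botorch. A minimal sketch of what it could look like is shown below, assuming (as in `optimize_acqf`) that inter-point constraints are recognized by two-dimensional `indices` entries; nonlinear constraint callables are opaque, so this sketch does not inspect them.

```python
def _has_inter_parameter_constraints(
    inequality_constraints=None,
    equality_constraints=None,
    nonlinear_inequality_constraints=None,
):
    """Sketch: detect whether any linear constraint couples several q-batch points.

    Assumption: an inter-point constraint is encoded by a two-dimensional
    `indices` entry whose rows are (q_index, parameter_index) pairs, while
    intra-point constraints use one-dimensional indices. Nonlinear constraint
    callables cannot be inspected and are ignored here.
    """
    for constraints in (inequality_constraints, equality_constraints):
        for indices, _coefficients, _rhs in constraints or []:
            # Works for torch tensors (via .ndim) as well as nested lists.
            if getattr(indices, "ndim", None) == 2:
                return True
            if (
                isinstance(indices, (list, tuple))
                and len(indices) > 0
                and isinstance(indices[0], (list, tuple))
            ):
                return True
    return False
```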
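For comparison, option (a) from above could be sketched as follows. `enumerate_fixed_feature_combos` is a hypothetical helper name; the point is only to show why the enumeration grows as `n_combos ** q`.

```python
from itertools import product


def enumerate_fixed_feature_combos(fixed_features_list, q):
    # Option (a): every assignment of one fixed-feature dict to each of the
    # q batch points -- len(fixed_features_list) ** q combinations in total.
    return list(product(fixed_features_list, repeat=q))


# E.g., 3 fixed-feature dicts and q = 2 already yield 3 ** 2 = 9 combinations.
combos = enumerate_fixed_feature_combos(
    [{4: 0.0}, {4: 1.0}, {4: 2.0}],  # hypothetical fixed values for feature 4
    q=2,
)
```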
Are you willing to open a pull request? (See CONTRIBUTING)
It would be my first one, but if we find a good solution, I am happy to help implement it.