…valuations in the BO loop
…where we pass default arguments, we can just omit arguments here
…valuator from smt. when the time comes we will implement our own base class if needed
I am including some plots here which constitute the tests of the added capabilities. I modified BODriverEX.py to a version with no constraints besides the bound constraints. I ran 10 BO iterations with a batch size of 2, using the various strategies. I repeated each experiment 10 times and compare against the mean behavior of EI with a batch size of 1. The error bars show the variance across the repeated experiments. The horizontal axis is the number of batches, so the batch-size-2 runs perform twice as many function evaluations. Note: the number of function evaluations done in the BO loop is equal to the number of batches times the batch size.
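As a point of reference, here is a minimal sketch of how the repeated-experiment comparison above can be aggregated; `run_bo_experiment` is a hypothetical stand-in for a single run of the modified BODriverEX.py loop and is not part of hiopbbpy.

```python
import numpy as np

def compare_strategies(run_bo_experiment, strategies, n_repeats=10, n_batches=10):
    """Aggregate best-so-far objective values over repeated BO runs."""
    results = {}
    for strategy in strategies:
        # Each run returns the best objective value observed after each batch.
        runs = np.array([run_bo_experiment(strategy, n_batches) for _ in range(n_repeats)])
        # Mean curve plus standard deviation across repeats for the error bars.
        results[strategy] = (runs.mean(axis=0), runs.std(axis=0))
    return results
```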
nychiang left a comment:

Some minor comments and questions.
```python
    problem = LpNormProblem(nx, xlimits)
else:
    problem = BraninProblem(constraints=None)
    problem = BraninProblem()
    problem.set_constraints(user_constraint)
```
We have two ways to add the constraints:
- in the constructor, setting `constraints=user_constraint`, or
- using the function `set_constraints`.

I should leave a comment in this example.
Yes, I understand that there are two methods to set constraints. However, passing None as the constraints to BraninProblem is equivalent to instantiating a BraninProblem with no arguments, since the default constraints argument of BraninProblem is None. Or at least it was, until I recently changed the default to `[]`.
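To illustrate the equivalence (assuming the constructor signature is roughly `BraninProblem(constraints=None)` and that `user_constraint` is defined as in BODriverEX.py):

```python
# Equivalent while the default constraints argument is None (now []):
problem_a = BraninProblem(constraints=None)
problem_b = BraninProblem()

# The constraint can be supplied in the constructor ...
problem_c = BraninProblem(constraints=user_constraint)
# ... or after construction via the setter.
problem_d = BraninProblem()
problem_d.set_constraints(user_constraint)
```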
```python
for batch in range(self.batch_size):
    # Get a new sample point
    x_new = self._find_best_point(x_train, y_train_virtual)

    # Evaluate the new sample point
    y_new = self.prob.evaluate(np.atleast_2d(x_new))
    # Get a virtual point
    y_virtual = self._get_virtual_point(np.atleast_2d(x_new))

    # Update training set
    x_train = np.vstack([x_train, x_new])
    y_train = np.vstack([y_train, y_new])
    # Update training set with the virtual point
    x_train = np.vstack([x_train, x_new])
    y_train_virtual = np.vstack([y_train_virtual, y_virtual])
```
If `batch_size == 1`, we don't need to compute the virtual point, do we?
No matter which `batch_type` is used, the virtual point only works with `batch_size >= 2`, right?
```python
x_new = self._find_best_point(x_train, y_train)
if self.batch_size > 1:
    y_train_virtual = y_train.copy()
    for batch in range(self.batch_size):
        # Get a virtual point
        y_virtual = self._get_virtual_point(np.atleast_2d(x_new))
        # Update training set with the virtual point
        x_train = np.vstack([x_train, x_new])
        y_train_virtual = np.vstack([y_train_virtual, y_virtual])
        x_new = self._find_best_point(x_train, y_train_virtual)
```
This has now been addressed. The code was fine as is, but it is not strictly necessary to call `_get_virtual_point` for the final point in the batch.
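For context, the virtual-point idea used in the loop above is commonly implemented along the lines of the sketch below (kriging believer uses the surrogate's predicted mean, constant liar a constant such as the best value observed so far). This is a generic sketch, not the actual hiopbbpy `_get_virtual_point`; the SMT-style `predict_values` call and the `batch_type` names are assumptions.

```python
import numpy as np

def get_virtual_point(x_new, surrogate, y_train, batch_type="kriging_believer"):
    """Return a stand-in objective value for a point that has not been evaluated yet."""
    x_new = np.atleast_2d(x_new)
    if batch_type == "kriging_believer":
        # Believe the surrogate: use its predicted mean at the candidate point.
        return surrogate.predict_values(x_new)
    if batch_type == "constant_liar":
        # Lie with a constant: here the best (minimum) observed value so far.
        return np.min(y_train, axis=0, keepdims=True)
    raise ValueError(f"unknown batch_type: {batch_type}")
```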
```python
Nai-Yuan Chiang <chiang7@llnl.gov>
"""
import numpy as np
import collections.abc
```
Is `collections` a new dependency?
`collections` is a built-in module from the Python standard library, so it is not pip installable.
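For what it's worth, `collections.abc` is usually imported for duck-typing checks like the one sketched below; this is an assumption about its purpose here, not taken from the diff.

```python
import collections.abc

def normalize_constraints(constraints):
    # Accept None, a single constraint, or any sequence of constraints.
    if constraints is None:
        return []
    if isinstance(constraints, collections.abc.Sequence):
        return list(constraints)
    return [constraints]
```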
…ecessary virtual_point calls, bug fix for x_opt, removing use of erroneously labeled index batch
@nychiang -- I believe I have addressed all of the changes that you requested. Let me know if there is anything left for me to update.





This PR adds capabilities to hiopbbpy for batched Bayesian optimization.