
Releases: pytorch/botorch

Compatibility Release

13 Sep 23:37

Compatibility

  • Pin GPyTorch == 1.9.0 (#1397).
  • Pin linear_operator == 0.1.1 (#1397).

New Features

  • Implement SaasFullyBayesianMultiTaskGP and related utilities (#1181, #1203).

Other Changes

  • Support loading a state dict for SaasFullyBayesianSingleTaskGP (#1120).
  • Update load_state_dict for ModelList to support fully Bayesian models (#1395).
  • Add is_one_to_many attribute to input transforms (#1396).
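
The new is_one_to_many attribute marks input transforms whose forward pass maps each input point to several transformed points (e.g. perturbation-based transforms). A minimal pure-Python sketch of the idea — these classes are illustrative stand-ins, not BoTorch's actual implementations:

```python
import random

class InputTransform:
    """Base class: most transforms map one input point to one output point."""
    is_one_to_many = False

    def transform(self, x):
        return [x]

class PerturbInput(InputTransform):
    """Maps each input to several noisy copies, so it is one-to-many."""
    is_one_to_many = True

    def __init__(self, n_copies, scale=0.1, seed=0):
        self.n_copies = n_copies
        self.scale = scale
        self.rng = random.Random(seed)

    def transform(self, x):
        return [[v + self.rng.gauss(0.0, self.scale) for v in x]
                for _ in range(self.n_copies)]

# Downstream code can branch on the attribute instead of type-checking:
t = PerturbInput(n_copies=4)
outputs = t.transform([1.0, 2.0])
assert t.is_one_to_many and len(outputs) == 4
```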

Bug Fixes

  • Fix PairwiseGP on GPU (#1388).

Compatibility Release

07 Sep 05:04

Compatibility

  • Require Python >= 3.8 (via #1347).
  • Add support for Python 3.10 (via #1379).
  • Require PyTorch >= 1.11 (via #1363).
  • Require GPyTorch >= 1.9.0 (#1347).
    • GPyTorch 1.9.0 is a major refactor that factors out the lazy tensor functionality into a new LinearOperator library, which required a number of adjustments to BoTorch (#1363, #1377).
  • Require pyro >= 1.8.2 (#1379).

New Features

  • Add ability to generate the features appended in the AppendFeatures input transform via a generic callable (#1354).
  • Add new synthetic test functions for sensitivity analysis (#1355, #1361).

Other Changes

  • Use time.monotonic() instead of time.time() to measure duration (#1353).
  • Allow passing Y_samples directly in MARS.set_baseline_Y (#1364).
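
The switch to time.monotonic() matters because time.time() follows the wall clock, which can jump backwards or forwards (NTP adjustments, DST changes), while a monotonic clock cannot. A minimal illustration of the pattern:

```python
import time

def timed(fn, *args, **kwargs):
    """Run fn and return (result, elapsed_seconds) using a monotonic clock."""
    start = time.monotonic()  # immune to wall-clock adjustments
    result = fn(*args, **kwargs)
    return result, time.monotonic() - start

result, elapsed = timed(sum, range(1000))
assert result == 499500 and elapsed >= 0.0
```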

Bug Fixes

  • Patch state_dict loading for PairwiseGP (#1359).
  • Fix batch_shape handling in Normalize and InputStandardize transforms (#1360).

Maintenance release

12 Aug 21:19

[0.6.6] - Aug 12, 2022

Compatibility

  • Require GPyTorch >= 1.8.1 (#1347).

New Features

  • Support batched models in RandomFourierFeatures (#1336).
  • Add a skip_expand option to AppendFeatures (#1344).

Other Changes

  • Allow qProbabilityOfImprovement to use batch-shaped best_f (#1324).
  • Make optimize_acqf re-attempt failed optimization runs, and improve handling of
    optimization errors in optimize_acqf and gen_candidates_scipy (#1325).
  • Reduce memory overhead in MARS.set_baseline_Y (#1346).

Bug Fixes

  • Fix bug where outcome_transform was ignored for ModelListGP.fantasize (#1338).
  • Fix bug causing get_polytope_samples to sample incorrectly when variables
    live in multiple dimensions (#1341).

Robust Multi-Objective BO, Multi-Objective Multi-Fidelity BO, Scalable Constrained BO, Improvements to Ax Integration

15 Jul 17:08

Compatibility

  • Require PyTorch >=1.10 (#1293).
  • Require GPyTorch >=1.7 (#1293).

New Features

  • Add MOMF (Multi-Objective Multi-Fidelity) acquisition function (#1153).
  • Support PairwiseLogitLikelihood and modularize PairwiseGP (#1193).
  • Add a transformed_weighting flag to ProximalAcquisitionFunction (#1194).
  • Add FeasibilityWeightedMCMultiOutputObjective (#1202).
  • Add outcome_transform to FixedNoiseMultiTaskGP (#1255).
  • Support Scalable Constrained Bayesian Optimization (#1257).
  • Support SaasFullyBayesianSingleTaskGP in prune_inferior_points (#1260).
  • Implement MARS as a risk measure (#1303).
  • Add MARS tutorial (#1305).

Other Changes

  • Add Bilog outcome transform (#1189).
  • Make get_infeasible_cost return a cost value for each outcome (#1191).
  • Modify risk measures to accept List[float] for weights (#1197).
  • Support SaasFullyBayesianSingleTaskGP in prune_inferior_points_multi_objective (#1204).
  • BotorchContainers and BotorchDatasets: Large refactor of the original TrainingData API to allow for more diverse types of datasets (#1205, #1221).
  • Proximal biasing support for multi-output SingleTaskGP models (#1212).
  • Improve error handling in optimize_acqf_discrete with a check that choices is non-empty (#1228).
  • Handle X_pending properly in FixedFeatureAcquisition (#1233, #1234).
  • PE and PLBO support in Ax (#1240, #1241).
  • Remove model.train call from get_X_baseline for better caching (#1289).
  • Support inf values in bounds argument of optimize_acqf (#1302).

Bug Fixes

  • Update get_gp_samples to support input / outcome transforms (#1201).
  • Fix cached Cholesky sampling in qNEHVI when using Standardize outcome transform (#1215).
  • Make task_feature a required input in MultiTaskGP.construct_inputs (#1246).
  • Fix CUDA tests (#1253).
  • Fix FixedSingleSampleModel dtype/device conversion (#1254).
  • Prevent inappropriate transforms by putting input transforms into train mode before converting models (#1283).
  • Fix sample_points_around_best when using 20 dimensional inputs or prob_perturb (#1290).
  • Skip bound validation in optimize_acqf if inequality constraints are specified (#1297).
  • Properly handle RFFs when used with a ModelList with individual transforms (#1299).
  • Update PosteriorList to support deterministic-only models and fix event_shape (#1300).

Documentation

  • Add a note about observation noise in the posterior in fit_model_with_torch_optimizer notebook (#1196).
  • Fix custom botorch model in Ax tutorial to support new interface (#1213).
  • Update MOO docs (#1242).
  • Add SMOKE_TEST option to MOMF tutorial (#1243).
  • Fix ModelListGP.condition_on_observations/fantasize bug (#1250).
  • Replace space with underscore for proper doc generation (#1256).
  • Update PBO tutorial to use EUBO (#1262).

Maintenance Release

21 Apr 23:47

New Features

  • Implement ExpectationPosteriorTransform (#903).
  • Add PairwiseMCPosteriorVariance, a cheap active learning acquisition function (#1125).
  • Support computing quantiles in the fully Bayesian posterior, add FullyBayesianPosteriorList (#1161).
  • Add expectation risk measures (#1173).
  • Implement Multi-Fidelity GIBBON (Lower Bound MES) acquisition function (#1185).

Other Changes

  • Add an error message for one-shot acquisition functions in optimize_acqf_discrete (#939).
  • Validate the shape of the bounds argument in optimize_acqf (#1142).
  • Minor tweaks to SAASBO (#1143, #1183).
  • Minor updates to tutorials (24f7fda, #1144, #1148, #1159, #1172, #1180).
  • Make it easier to specify a custom PyroModel (#1149).
  • Allow passing in a mean_module to SingleTaskGP/FixedNoiseGP (#1160).
  • Add a note about acquisitions using gradients to base class (#1168).
  • Remove deprecated box_decomposition module (#1175).

Bug Fixes

  • Bug-fixes for ProximalAcquisitionFunction (#1122).
  • Fix missing warnings on failed optimization in fit_gpytorch_scipy (#1170).
  • Ignore data related buffers in PairwiseGP.load_state_dict (#1171).
  • Make fit_gpytorch_model properly honor the debug flag (#1178).
  • Fix missing posterior_transform in gen_one_shot_kg_initial_conditions (#1187).

Bayesian Optimization with Preference Exploration, SAASBO for High-Dimensional Bayesian Optimization

28 Mar 00:30

New Features

  • Implement SAASBO - SaasFullyBayesianSingleTaskGP model for sample-efficient high-dimensional Bayesian optimization (#1123).
  • Add SAASBO tutorial (#1127).
  • Add LearnedObjective (#1131), AnalyticExpectedUtilityOfBestOption acquisition function (#1135), and a few auxiliary classes to support Bayesian optimization with preference exploration (BOPE).
  • Add BOPE tutorial (#1138).

Other Changes

  • Use qKG.evaluate in optimize_acqf_mixed (#1133).
  • Add construct_inputs to SAASBO (#1136).

Bug Fixes

  • Fix "Constraint Active Search" tutorial (#1124).
  • Update "Discrete Multi-Fidelity BO" tutorial (#1134).

Bug fix release

09 Mar 22:48

New Features

  • Use BOTORCH_MODULAR in tutorials with Ax (#1105).
  • Add optimize_acqf_discrete_local_search for discrete search spaces (#1111).

Bug Fixes

  • Fix missing posterior_transform in qNEI and get_acquisition_function (#1113).

Non-linear input constraints, new MOO problems, bug fixes, and performance improvements

28 Feb 22:41

New Features

  • Add Standardize input transform (#1053).
  • Low-rank Cholesky updates for NEI (#1056).
  • Add support for non-linear input constraints (#1067).
  • New MOO problems: MW7 (#1077), disc brake (#1078), penicillin (#1079), RobustToy (#1082), GMM (#1083).
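
Non-linear inequality constraints are supplied to the optimizer as callables that are non-negative when a candidate is feasible (the c(x) >= 0 convention). The feasibility check itself is easy to sketch in plain Python — this is a conceptual illustration, not BoTorch's optimizer code:

```python
def is_feasible(x, constraints, tol=1e-8):
    """A point is feasible if every constraint callable is non-negative
    (the c(x) >= 0 convention for non-linear inequality constraints)."""
    return all(c(x) >= -tol for c in constraints)

# Example: restrict candidates to the unit disk, x0^2 + x1^2 <= 1,
# rewritten in the "non-negative when feasible" form.
unit_disk = lambda x: 1.0 - (x[0] ** 2 + x[1] ** 2)

assert is_feasible([0.5, 0.5], [unit_disk])
assert not is_feasible([1.0, 1.0], [unit_disk])
```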

Other Changes

  • Add Dispatcher (#1009).
  • Modify qNEHVI to support deterministic models (#1026).
  • Store tensor attributes of input transforms as buffers (#1035).
  • Modify NEHVI to support MTGPs (#1037).
  • Make Normalize input transform input column-specific (#1047).
  • Improve find_interior_point (#1049).
  • Remove deprecated botorch.distributions module (#1061).
  • Avoid costly application of posterior transform in Kronecker & HOGP models (#1076).
  • Support heteroscedastic perturbations in InputPerturbation (#1088).

Performance Improvements

  • Make risk measures more memory efficient (#1034).

Bug Fixes

  • Properly handle empty fixed_features in optimization (#1029).
  • Fix missing weights in VaR risk measure (#1038).
  • Fix find_interior_point for negative variables & allow unbounded problems (#1045).
  • Filter out indefinite bounds in constraint utilities (#1048).
  • Make non-interleaved base samples use intuitive shape (#1057).
  • Pad small diagonalization with zeros for KroneckerMultitaskGP (#1071).
  • Disable learning of bounds in preprocess_transform (#1089).
  • Catch runtime errors with ill-conditioned covar (#1095).
  • Fix compare_mc_analytic_acquisition tutorial (#1099).

Approximate GP model, Multi-Output Risk Measures, Bug Fixes and Performance Improvements

09 Dec 00:16

Compatibility

  • Require PyTorch >=1.9 (#1011).
  • Require GPyTorch >=1.6 (#1011).

New Features

  • New ApproximateGPyTorchModel wrapper for various (variational) approximate GP models (#1012).
  • New SingleTaskVariationalGP stochastic variational Gaussian Process model (#1012).
  • Support for Multi-Output Risk Measures (#906, #965).
  • Introduce ModelList and PosteriorList (#829).
  • New Constraint Active Search tutorial (#1010).
  • Add additional multi-objective optimization test problems (#958).

Other Changes

  • Add covar_module as an optional input of MultiTaskGP models (#941).
  • Add min_range argument to Normalize transform to prevent division by zero (#931).
  • Add initialization heuristic for acquisition function optimization that samples around best points (#987).
  • Update initialization heuristic to perturb a subset of the dimensions of the best points if the dimension is > 20 (#988).
  • Modify apply_constraints utility to work with multi-output objectives (#994).
  • Short-cut t_batch_mode_transform decorator on non-tensor inputs (#991).
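
The min_range argument guards against division by zero when an input dimension is (near-)constant in the training data: the observed range is clamped from below before normalizing. A stdlib-only sketch of the idea (not BoTorch's actual Normalize code):

```python
def normalize_columns(rows, min_range=1e-8):
    """Min-max normalize each column, clamping the range at min_range
    so constant columns do not cause division by zero."""
    cols = list(zip(*rows))
    mins = [min(c) for c in cols]
    ranges = [max(max(c) - min(c), min_range) for c in cols]
    return [[(v - lo) / r for v, lo, r in zip(row, mins, ranges)]
            for row in rows]

# The second column is constant; without clamping, normalizing it
# would divide by zero. With clamping it maps to 0.0.
data = [[0.0, 5.0], [2.0, 5.0], [4.0, 5.0]]
normalized = normalize_columns(data)
assert normalized[2][0] == 1.0 and normalized[0][1] == 0.0
```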

Performance Improvements

  • Use lazy covariance matrix in BatchedMultiOutputGPyTorchModel.posterior (#976).
  • Fast low-rank Cholesky updates for qNoisyExpectedHypervolumeImprovement (#747, #995, #996).

Bug Fixes

  • Update error handling to new PyTorch linear algebra messages (#940).
  • Avoid test failures on Ampere devices (#944).
  • Fixes to the Griewank test function (#972).
  • Handle empty base_sample_shape in Posterior.rsample (#986).
  • Handle NotPSDError and hitting maxiter in fit_gpytorch_model (#1007).
  • Use TransformedPosterior for subclasses of GPyTorchPosterior (#983).
  • Propagate best_f argument to qProbabilityOfImprovement in input constructors (f5a5f8b).

Maintenance Release + New Tutorials

02 Sep 20:44

Compatibility

  • Require GPyTorch >=1.5.1 (#928).

New Features

  • Add HigherOrderGP composite Bayesian Optimization tutorial notebook (#864).
  • Add Multi-Task Bayesian Optimization tutorial (#867).
  • New multi-objective test problems (#876).
  • Add PenalizedMCObjective and L1PenaltyObjective (#913).
  • Add a ProximalAcquisitionFunction for regularizing new candidates towards previously generated ones (#919, #924).
  • Add a Power outcome transform (#925).

Bug Fixes

  • Batch mode fix for HigherOrderGP initialization (#856).
  • Improve CategoricalKernel precision (#857).
  • Fix an issue with qMultiFidelityKnowledgeGradient.evaluate (#858).
  • Fix an issue with transforms in HigherOrderGP (#889).
  • Fix initial candidate generation when parameter constraints are on different device (#897).
  • Fix bad in-place op in _generate_unfixed_lin_constraints (#901).
  • Fix an input transform bug in fantasize call (#902).
  • Fix outcome transform bug in batched_to_model_list (#917).

Other Changes

  • Make variance optional for TransformedPosterior.mean (#855).
  • Support transforms in DeterministicModel (#869).
  • Support batch_shape in RandomFourierFeatures (#877).
  • Add a maximize flag to PosteriorMean (#881).
  • Ignore categorical dimensions when validating training inputs in MixedSingleTaskGP (#882).
  • Refactor HigherOrderGPPosterior for memory efficiency (#883).
  • Support negative weights for minimization objectives in get_chebyshev_scalarization (#884).
  • Move train_inputs transforms to model.train/eval calls (#894).
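
Negative weights in get_chebyshev_scalarization let one scalarization mix maximization and minimization objectives. The core idea, sketched generically — the real helper also normalizes outcomes and is more elaborate, so treat this as a conceptual sketch, not BoTorch's implementation:

```python
def chebyshev(y, weights):
    """Weighted Chebyshev scalarization to be maximized: the worst-case
    weighted objective. A negative weight turns maximization of that
    objective into minimization."""
    return min(w * v for w, v in zip(weights, y))

# Maximize y0, minimize y1 (weight -1): the point with the smaller y1
# scores higher under the scalarization.
weights = [1.0, -1.0]
assert chebyshev([2.0, 1.0], weights) > chebyshev([2.0, 3.0], weights)
```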