Releases · pytorch/botorch
Maintenance Release, SCoreBO
Compatibility
- Require Python >= 3.10 (#2293).
New Features
- SCoreBO and Bayesian Active Learning acquisition functions (#2163).
Bug Fixes
- Fix non-None constraint noise levels in some constrained test problems (#2241).
- Fix inverse cost-weighted utility behavior for non-positive acquisition values (#2297).
Other Changes
- Don't allow unused keyword arguments in `Model.construct_inputs` (#2186).
- Re-map task values in MTGP if they are not contiguous integers starting from zero (#2230).
- Unify `ModelList` and `ModelListGP` `subset_output` behavior (#2231).
- Ensure `mean` and `interior_point` of `LinearEllipticalSliceSampler` have correct shapes (#2245).
- Speed up task covariance of `LCEMGP` (#2260).
- Improvements to `batch_cross_validation`, support for model init kwargs (#2269).
- Support custom `all_tasks` for MTGPs (#2271).
- Error out if scipy optimizer does not support bounds / constraints (#2282).
- Support diagonal covariance root with fixed indices for `LinearEllipticalSliceSampler` (#2283).
- Make `qNIPV` a subclass of `AcquisitionFunction` rather than `AnalyticAcquisitionFunction` (#2286).
- Increase code-sharing of `LCEMGP` & define `construct_inputs` (#2291).
Deprecations
- Remove deprecated args from base `MCSampler` (#2228).
- Remove deprecated `botorch/generation/gen/minimize` (#2229).
- Remove `fit_gpytorch_model` (#2250).
- Remove `requires_grad_ctx` (#2252).
- Remove `base_samples` argument of `GPyTorchPosterior.rsample` (#2254).
- Remove deprecated `mvn` argument to `GPyTorchPosterior` (#2255).
- Remove deprecated `Posterior.event_shape` (#2320).
- Remove `**kwargs` & deprecated `indices` argument of `Round` transform (#2321).
- Remove `Standardize.load_state_dict` (#2322).
- Remove `FixedNoiseMultiTaskGP` (#2323).
Maintenance Release, Updated Community Contributions
New Features
- Introduce updated guidelines and a new directory for community contributions (#2167).
- Add `qEUBO` preferential acquisition function (#2192).
- Add Multi Information Source Augmented GP (#2152).
Bug Fixes
- Fix `condition_on_observations` in fully Bayesian models (#2151).
- Fix for bug that occurs when splitting single-element bins; use default BoTorch kernel for BAxUS (#2165).
- Fix a bug when non-linear constraints are used with `q > 1` (#2168).
- Remove unsupported `X_pending` from `qMultiFidelityLowerBoundMaxValueEntropy` constructor (#2193).
- Don't allow `data_fidelities=[]` in `SingleTaskMultiFidelityGP` (#2195).
- Fix `EHVI`, `qEHVI`, and `qLogEHVI` input constructors (#2196).
- Fix input constructor for `qMultiFidelityMaxValueEntropy` (#2198).
- Add ability to not deduplicate points in `_is_non_dominated_loop` (#2203).
Other Changes
- Minor improvements to `MVaR` risk measure (#2150).
- Add support for multitask models to `ModelListGP` (#2154).
- Support unspecified noise in `ContextualDataset` (#2155).
- Update `HVKG` sampler to reflect the number of model outputs (#2160).
- Release restriction in `OneHotToNumeric` that the categoricals are the trailing dimensions (#2166).
- Standardize broadcasting logic of `q(Log)EI`'s `best_f` and `compute_best_feasible_objective` (#2171).
- Use regular inheritance instead of dispatcher to special-case `PairwiseGP` logic (#2176).
- Support `PBO` in `EUBO`'s input constructor (#2178).
- Add `posterior_transform` to `qMaxValueEntropySearch`'s input constructor (#2181).
- Do not normalize or standardize dimension if all values are equal (#2185).
- Reap deprecated support for objective with 1 arg in `GenericMCObjective` (#2199).
- Consistent signature for `get_objective_weights_transform` (#2200).
- Update context order handling in `ContextualDataset` (#2205).
- Update contextual models for use in MBM (#2206).
- Remove `(Identity)AnalyticMultiOutputObjective` (#2208).
- Reap deprecated support for `soft_eval_constraint` (#2223). Please use `botorch.utils.sigmoid` instead.
Compatibility
- Pin `mpmath <= 1.3.0` to avoid CI breakages due to removed modules in the latest alpha release (#2222).
Hypervolume Knowledge Gradient (HVKG)
New features
Hypervolume Knowledge Gradient (HVKG):
- Add `qHypervolumeKnowledgeGradient`, which seeks to maximize the difference in hypervolume of the hypervolume-maximizing set of a fixed size after conditioning on the unknown observation(s) that would be received if X were evaluated (#1950, #1982, #2101).
- Add tutorial on decoupled Multi-Objective Bayesian Optimization (MOBO) with HVKG (#2094).
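For intuition, the dominated hypervolume that HVKG seeks to increase is easy to compute directly in two dimensions. The sketch below is a plain-Python illustration assuming a 2D maximization problem; `hypervolume_2d` is a hypothetical helper written for this example, not part of the BoTorch API (BoTorch's own hypervolume utilities operate on tensors and handle arbitrary dimensions).

```python
def hypervolume_2d(front, ref):
    """Hypervolume dominated by `front` (list of (f1, f2) points, maximization)
    with respect to reference point `ref`. Illustrative helper, not BoTorch API."""
    # Keep only points that strictly dominate the reference point.
    pts = [p for p in front if p[0] > ref[0] and p[1] > ref[1]]
    # Sweep in decreasing f1; each point adds a disjoint rectangle.
    pts.sort(key=lambda p: p[0], reverse=True)
    hv, prev_f2 = 0.0, ref[1]
    for f1, f2 in pts:
        if f2 > prev_f2:  # dominated points add nothing
            hv += (f1 - ref[0]) * (f2 - prev_f2)
            prev_f2 = f2
    return hv

# The quantity HVKG estimates (in expectation over the posterior) is the
# increase of this hypervolume after conditioning on a new observation:
front = [(3.0, 1.0), (1.0, 3.0)]
gain = hypervolume_2d(front + [(2.5, 2.5)], (0.0, 0.0)) - hypervolume_2d(front, (0.0, 0.0))
```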
Other new features:
- Add `MultiOutputFixedCostModel`, which is useful for decoupled scenarios where the objectives have different costs (#2093).
- Enable `q > 1` in acquisition function optimization when nonlinear constraints are present (#1793).
- Support different noise levels for different outputs in test functions (#2136).
Bug fixes
- Fix fantasization with a `FixedNoiseGaussianLikelihood` when `noise` is known and `X` is empty (#2090).
- Make `LearnedObjective` compatible with constraints in acquisition functions regardless of `sample_shape` (#2111).
- Make input constructors for `qExpectedImprovement`, `qLogExpectedImprovement`, and `qProbabilityOfImprovement` compatible with `LearnedObjective` regardless of `sample_shape` (#2115).
- Fix handling of constraints in `qSimpleRegret` (#2141).
Other changes
- Increase default sample size for `LearnedObjective` (#2095).
- Allow passing in `X` with or without fidelity dimensions in `project_to_target_fidelity` (#2102).
- Use full-rank task covariance matrix by default in SAAS MTGP (#2104).
- Rename `FullyBayesianPosterior` to `GaussianMixturePosterior`; add `_is_ensemble` and `_is_fully_bayesian` attributes to `Model` (#2108).
- Various improvements to tutorials, including speedups, improved explanations, and compatibility with newer versions of libraries.
Bugfix release
Compatibility
- Re-establish compatibility with PyTorch 1.13.1 (#2083).
Multi-Objective "Log" acquisition functions
Highlights
- Additional "Log" acquisition functions for multi-objective optimization with better numerical behavior, which often lead to significantly improved BO performance over their non-"Log" counterparts.
- `FixedNoiseGP` and `FixedNoiseMultiFidelityGP` have been deprecated; their functionality has been merged into `SingleTaskGP` and `SingleTaskMultiFidelityGP`, respectively (#2052, #2053).
- Removed deprecated legacy model fitting functions: `numpy_converter`, `fit_gpytorch_scipy`, `fit_gpytorch_torch`, `_get_extra_mll_args` (#1995, #2050).
New Features
- Support multiple data fidelity dimensions in `SingleTaskMultiFidelityGP` and (deprecated) `FixedNoiseMultiFidelityGP` models (#1956).
- Add `logsumexp` and `fatmax` to handle infinities and control asymptotic behavior in "Log" acquisition functions (#1999).
- Add outcome and feature names to datasets, implement `MultiTaskDataset` (#2015, #2019).
- Add constrained Hartmann and constrained Gramacy synthetic test problems (#2022, #2026, #2027).
- Support observed noise in `MixedSingleTaskGP` (#2054).
- Add `PosteriorStandardDeviation` acquisition function (#2060).
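The `logsumexp` mentioned above refers to the standard max-shift stabilization of log(Σᵢ exp(xᵢ)). As a rough, stdlib-only sketch of the idea (not BoTorch's implementation, which operates on batched tensors and also offers the smooth `fatmax` alternative): shifting by the maximum keeps `exp` from overflowing and lets `-inf` summands drop out cleanly.

```python
import math

def logsumexp(xs):
    """Numerically stable log(sum(exp(x) for x in xs)) for a non-empty list.

    Stdlib sketch of the max-shift trick; not BoTorch's implementation.
    """
    m = max(xs)
    if math.isinf(m) and m < 0:
        # All summands are -inf: exp of each is 0, so the log-sum is -inf.
        # (Without this guard, x - m would produce nan below.)
        return float("-inf")
    # After shifting, every exponent is <= 0, so exp() cannot overflow.
    return m + math.log(sum(math.exp(x - m) for x in xs))
```

A naive `math.log(sum(math.exp(x) for x in xs))` raises `OverflowError` for inputs like `[1000.0, 999.0]`; the shifted version returns the correct value.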
Bug fixes
- Fix input constructors for `qMaxValueEntropy` and `qMultiFidelityKnowledgeGradient` (#1989).
- Fix precision issue that arises from inconsistent data types in `LearnedObjective` (#2006).
- Fix fantasization with `FixedNoiseGP` and outcome transforms and use `FantasizeMixin` (#2011).
- Fix `LearnedObjective` base sample shape (#2021).
- Apply constraints in `prune_inferior_points` (#2069).
- Support non-batch evaluation of `PenalizedMCObjective` (#2073).
- Fix `Dataset` equality checks (#2077).
Other changes
- Don't allow unused `**kwargs` in input_constructors except for a defined set of exceptions (#1872, #1985).
- Merge inferred and fixed noise LCE-M models (#1993).
- Fix import structure in `botorch.acquisition.utils` (#1986).
- Remove deprecated functionality: `weights` argument of `RiskMeasureMCObjective` and `squeeze_last_dim` (#1994).
- Make `X`, `Y`, `Yvar` into properties in datasets (#2004).
- Make synthetic constrained test functions subclass from `SyntheticTestFunction` (#2029).
- Add `construct_inputs` to contextual GP models `LCEAGP` and `SACGP` (#2057).
Bug fix release
This release fixes bugs that affected Ax's modular `BotorchModel` and caused outcome constraints to be silently ignored due to naming mismatches.
Bug fixes
- Hot fix (#1973) for a few issues:
  - A naming mismatch between Ax's modular `BotorchModel` and BoTorch's acquisition input constructors, which led to outcome constraints in Ax not being used with single-objective acquisition functions in Ax's modular `BotorchModel`. The naming has been updated in Ax, and consistent naming is now used in input constructors for single- and multi-objective acquisition functions in BoTorch.
  - A naming mismatch in the acquisition input constructor argument `constraints` in `qNoisyLogExpectedImprovement`, which kept constraints from being used.
  - A bug in `compute_best_feasible_objective` that could lead to `-inf` incumbent values.
- Fix setting seed in `get_polytope_samples` (#1968).
Dependency fix release
This is a very minor release; the only change from v0.9.0 is that the `linear_operator` dependency was bumped to 0.5.1 (#1963). This was needed since a bug in `linear_operator` 0.5.0 caused failures with some BoTorch models.
LogEI acquisition functions, L0 regularization & homotopy optimization, PiBO, orthogonal additive kernel, nonlinear constraints
Compatibility
- Require Python >= 3.9.0 (#1924).
- Require PyTorch >= 1.13.1 (#1960).
- Require linear_operator == 0.5.0 (#1961).
- Require GPyTorch == 1.11 (#1961).
Highlights
- Introduce `OrthogonalAdditiveKernel` (#1869).
- Speed up LCE-A kernel by over an order of magnitude (#1910).
- Introduce `optimize_acqf_homotopy`, for optimizing acquisition functions with homotopy (#1915).
- Introduce `PriorGuidedAcquisitionFunction` (PiBO) (#1920).
- Introduce `qLogExpectedImprovement`, which provides more accurate numerics than `qExpectedImprovement` and can lead to significant optimization improvements (#1936).
- Similarly, introduce `qLogNoisyExpectedImprovement`, which is analogous to `qNoisyExpectedImprovement` (#1937).
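A brief illustration of why log-space acquisition values improve numerics: far from the incumbent, the Gaussian tail weight in the EI integrand underflows to exactly 0.0 in float64, so distant candidates become indistinguishable and gradient-based optimization stalls; the same quantity kept in log space never underflows. This is a generic float64 sketch, not BoTorch's actual computation.

```python
import math

def raw_weight(z):
    """Gaussian tail weight exp(-z**2 / 2); underflows to 0.0 for large |z|."""
    return math.exp(-0.5 * z * z)

def log_weight(z):
    """The same quantity kept in log space; never underflows."""
    return -0.5 * z * z

# 60 and 70 standard deviations out: the raw values are both exactly 0.0,
# so an optimizer sees a flat landscape with zero gradient...
assert raw_weight(60.0) == 0.0 and raw_weight(70.0) == 0.0
# ...while the log-space values still rank the two candidates correctly.
assert log_weight(60.0) > log_weight(70.0)
```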
New Features
- Add constrained synthetic test functions `PressureVesselDesign`, `WeldedBeam`, `SpeedReducer`, and `TensionCompressionString` (#1832).
- Support decoupled fantasization (#1853) and decoupled evaluations in cost-aware utilities (#1949).
- Add `PairwiseBayesianActiveLearningByDisagreement`, an active learning acquisition function for PBO and BOPE (#1855).
- Support custom mean and likelihood in `MultiTaskGP` (#1909).
- Enable candidate generation (via `optimize_acqf`) with both `non_linear_constraints` and `fixed_features` (#1912).
- Introduce `L0PenaltyApproxObjective` to support L0 regularization (#1916).
- Enable batching in `PriorGuidedAcquisitionFunction` (#1925).
Other changes
- Deprecate `FixedNoiseMultiTaskGP`; allow `train_Yvar` optionally in `MultiTaskGP` (#1818).
- Implement `load_state_dict` for SAAS multi-task GP (#1825).
- Improvements to `LinearEllipticalSliceSampler` (#1859, #1878, #1879, #1883).
- Allow passing in task features as part of `X` in `MTGP.posterior` (#1868).
- Improve numerical stability of log densities in pairwise GPs (#1919).
- Python 3.11 compliance (#1927).
- Enable using constraints with `SampleReducingMCAcquisitionFunction`s when using `input_constructor`s and `get_acquisition_function` (#1932).
- Enable use of `qLogExpectedImprovement` and `qLogNoisyExpectedImprovement` with Ax (#1941).
Bug Fixes
- Enable pathwise sampling modules to be converted to GPU (#1821).
- Allow `Standardize` modules to be loaded once trained (#1874).
- Fix memory leak in Inducing Point Allocators (#1890).
- Correct einsum computation in `LCEAKernel` (#1918).
- Properly whiten bounds in MVNXPB (#1933).
- Make `FixedFeatureAcquisitionFunction` convert floats to double-precision tensors rather than single-precision (#1944).
- Fix memory leak in `FullyBayesianPosterior` (#1951).
- Make `AnalyticExpectedUtilityOfBestOption` input constructor work correctly with multi-task GPs (#1955).
Maintenance Release
Compatibility
- Require GPyTorch == 1.10 and linear_operator == 0.4.0 (#1803).
New Features
- Polytope sampling for linear constraints along the q-dimension (#1757).
- Single-objective joint entropy search with additional conditioning, various improvements to entropy-based acquisition functions (#1738).
Other changes
- Various updates to improve numerical stability of `PairwiseGP` (#1754, #1755).
- Change batch range for `FullyBayesianPosterior` (1176a38, #1773).
- Make `gen_batch_initial_conditions` more flexible (#1779).
- Deprecate `objective` in favor of `posterior_transform` for `MultiObjectiveAnalyticAcquisitionFunction` (#1781).
- Use `prune_baseline=True` as default for `qNoisyExpectedImprovement` (#1796).
- Add `batch_shape` property to `SingleTaskVariationalGP` (#1799).
- Change minimum inferred noise level for `SaasFullyBayesianSingleTaskGP` (#1800).