# Releases: secondmind-labs/trieste
## Release 3.0.0

### Breaking changes

This release includes a minor breaking change:

- The `Record`, `FrozenRecord` and `OptimizationResult` classes are now also generic in the model type. Any type-annotated code using these may need to be updated to include model type annotations as appropriate. Runtime behaviour is unaffected.
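To illustrate the kind of annotation update involved, here is a minimal stand-alone sketch. The `Record` class and type variables below are hypothetical stand-ins that mimic the change, not trieste's actual classes:

```python
from dataclasses import dataclass
from typing import Generic, TypeVar

StateType = TypeVar("StateType")
ModelType = TypeVar("ModelType")


# Hypothetical stand-in: a record class that was previously generic
# only in the state type gains a second, model type parameter.
@dataclass
class Record(Generic[StateType, ModelType]):
    state: StateType
    models: dict[str, ModelType]


class MyModel:
    """Hypothetical model class used only for the annotation example."""


# Annotated code must now supply both parameters,
# e.g. Record[None, MyModel] rather than Record[None].
record: Record[None, MyModel] = Record(state=None, models={"OBJECTIVE": MyModel()})
```

At runtime nothing changes; only static type checkers (e.g. mypy) will flag annotations that omit the new parameter.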
### New features

- Customisable initial point selection for optimization (#808)
- Customisable handling of model updates in `AskTellOptimizer`, including an `AskTellOptimizerNoTraining` class for use with non-trainable models (#815)
- `DeepEnsemble` support for pass-through Keras compile arguments (#816)
### Improvements/fixes

- Support for TensorFlow 2.15 (#819)
- Filter out local datasets when calling the base rule (#805)
- Speed up `get_unique_points_mask`, used by `BatchTrustRegionBox` (#813)
### Documentation

- README and CONTRIBUTING updates (#804, #802, #810)
- Mixed search space notebook (#818)

New contributors: @nfergu (#816)

Full Changelog: v2.0.0...v3.0.0
## Release 2.0.0

### Breaking changes

This release includes a few minor breaking changes:

- `ProbabilisticModel` is now a pure interface (#797)
  - `ProbabilisticModel.log` is now an abstract method and needs to be explicitly specified in any concrete model class implementation (though it can be empty)
  - `ProbabilisticModel.get_module_with_variables` is now a utility function in `trieste.models.utils`
- TREGO (`TrustRegion`) and TURBO were reimplemented using the new batch-trust-region classes (#778, #791)
  - a TURBO rule must now be initialised as `BatchTrustRegionBox(TURBOBox(search_space))` instead of `TURBO(search_space)`
  - a TREGO rule must now be initialised as `BatchTrustRegionBox(TREGOBox(search_space))` instead of `TrustRegion()`
  - the internal `State` structures exposed by these rules are now also different: `BatchTrustRegion.State` instead of `TURBO.State` or `TrustRegion.State`, with additional values such as `eps` accessible via the subspaces in `rule._subspaces` instead
### New features

- Multi trust region acquisition rules (#773, #777, #778, #783)
- Local models and datasets (#788, #791)
- Expose model optimization result in the `optimize` method and the `get_last_optimization_result` function (#774, #797)
### Improvements/fixes

- Stop trajectory sampling ignoring active dims in the kernel (#790)
- Stop `randomize_hyperparameters` generating repeated values (#785)
- Handle unconstrained priors in `randomize_hyperparameters` (#796)
- Support optimization of multiple points in batch spaces (#787)
- Allow `Box`es with zero width (#780)
- Deepcopy search spaces (#776)
- Use int64 when calculating data size in `split_acquisition_function` (#795)
- Start using `check_shape` for shape checking (#770)
- Clean up tutorials (#769, #771)
### Build changes

- Parallelise integration test runs (#775)
- Fix integration test fragility (#786, #798)

Full Changelog: v1.2.0...v2.0.0
## Release 1.2.0

### New features

- `SeparateIndependent` kernel support (#711, #719, #736, #745, #747, #753)
- TuRBO acquisition rule (#739, #762)
- Saving models as `tf.saved_model` (#750)

### Improvements

- Fix deepcopy of posterior cache and VGP parameters (#741, #752)
- Fix learning rate reset error with GPflux and Keras models (#740)
- Support TensorFlow 2.11 and 2.12 (#746)
- Make `IndependentReparametrizationSampler` support XLA (#718, #748)
- Handle broadcasting in `DeepEnsemble` models (#727)
- Support calling `qmc_normal_samples` with tensors (#723)
- Make default TensorBoard metric names unique (#726)
- Make `Reducer` acquisition functions generic in model type and add `Map` reducer (#759)
- Add `continue_optimization` method to `BayesianOptimizer` (#755)
- Additional test problems (#744)

### Build changes

- Add default PR template (#756)
- Support installation from PyPI on macOS (#730)
- Get notebooks to build with Python 3.10 (#728, #732, #734, #738, #742)
- Parallelise test runs (#743, #749)
- Reduce notebook file sizes (#721)

Full Changelog: v1.1.2...v1.2.0
## Release 1.1.2

### Bug fixes

- Support disabling skip in QMC sampling (#715)

### Documentation

- REMBO notebook (#710)

Full Changelog: v1.1.1...v1.1.2
## Release 1.1.1

### Bug fixes

- Fix batch QMC sampling (#713)

Full Changelog: v1.1.0...v1.1.1
## Release 1.1.0

### New features

- MUMBO: MUlti-task Max-value Bayesian Optimization (#699)
- Support QMC sampling in reparameterization samplers (#708)

### Improvements

- Fix SVGP update equations (#709)
- Support tf compilation in `MultifidelityAutoregressive` and `DeepEnsemble` (#691, #698)

### Build changes

- New README.md (#690, #705, #706)
- Track code coverage (#702)
- Unpin mypy and black versions (#701)
- Test against Python 3.10 (#688, #696)
- Work around docs issue with setuptools+gym (#695)

### New contributors

- @eltociear made their first contribution in #700

Full Changelog: v1.0.0...v1.1.0
## Release 1.0.0

Note: this release marks the first 1.x release but is compatible with 0.13.3. Future releases will (try to) conform to semantic versioning. Since trieste is a research-led toolbox, this may result in reasonably frequent major version increments.

### New features

- NARGP multifidelity model (#665)
- BO-specific inducing point allocators (#683)

### Improvements

- Support for explicit constraints in `ExpectedImprovement` (#664)
- Support broadcasting in the search space `contains` method (#677)
- Improved logging for GPflow models (#680)
- Faster sampler for deep ensembles (#682)

### Build changes

- `.gitignore` additions (#679)
- Upgrade to GPflow 2.7.0 (#684)
- Workaround for slowtest OOM crash (#685)

Full Changelog: v0.13.3...v1.0.0
## Release 0.13.3

This release reintroduces `ProbabilityOfImprovement` (#638) and the Pareto diverse sample method (#643), which were temporarily removed in 0.13.2.

Note that the minimum supported TensorFlow version has been raised to 2.5.

### New features

- Batch Expected Improvement (#641, #653)
- Portfolio method for batch BO (#651, #659, #663)
- Multifidelity modeling (#621, #654)
- Explicit constraints (#656, #660)

### Improvements

- Allow the scipy optimizer to be changed (#655)
- Allow arbitrary dataset/model tags (#668); note that this may break type checking for existing code
- Slight improvements to synthetic objective functions (#671)

### Build changes

- Support latest GPflow and GPflux (#649)
- Support more recent tox versions (#669)

Full Changelog: v0.13.1...v0.13.3
## Release 0.13.2

This release fixes the 0.13.1 release by temporarily removing the two new features (#638 and #643), as they were preventing trieste from being used with the latest release of GPflow. A future release will reintroduce them.

The fix to handle constant priors in `randomize_hyperparameters` (#646) remains in the release.

Full Changelog: v0.13.0...v0.13.2
## Release 0.13.1

### New features

- `ProbabilityOfImprovement` acquisition function (#638)
- Pareto diverse sample method (#643)

### Improvements

- Handle constant priors in `randomize_hyperparameters` (#646)

Full Changelog: v0.13.0...v0.13.1