Getting Started
This section shows you how to get your feet wet with BoTorch.
Before jumping the gun, we recommend you start with the high-level Overview to learn about the basic concepts in BoTorch.
Installing BoTorch
Installation Requirements:
BoTorch is easily installed via pip (recommended). It is also possible to use the (unofficial) Anaconda package from the conda-forge channel.
- pip:
  pip install botorch
- Conda:
  conda install botorch -c gpytorch -c conda-forge
For more installation options and detailed instructions, please see the Project Readme on GitHub.
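To verify that the installation worked, you can import the package and print its version (a minimal sanity check; the exact version string depends on the release you installed):

import botorch
print(botorch.__version__)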
Basic Components
Here's a quick rundown of the main components of a Bayesian optimization loop.
- Fit a Gaussian Process model to data
import torch
from botorch.models import SingleTaskGP
from botorch.models.transforms import Normalize, Standardize
from botorch.fit import fit_gpytorch_mll
from gpytorch.mlls import ExactMarginalLogLikelihood
train_X = torch.rand(10, 2, dtype=torch.double) * 2
# explicit output dimension -- Y is 10 x 1
train_Y = 1 - (train_X - 0.5).norm(dim=-1, keepdim=True)
train_Y += 0.1 * torch.rand_like(train_Y)
gp = SingleTaskGP(
    train_X=train_X,
    train_Y=train_Y,
    input_transform=Normalize(d=2),
    outcome_transform=Standardize(m=1),
)
mll = ExactMarginalLogLikelihood(gp.likelihood, gp)
fit_gpytorch_mll(mll)
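Once fitted, the model's posterior can be queried at new points, for example to inspect the predictive mean and variance. A minimal sketch, assuming some hypothetical test points test_X drawn from the same 2-dimensional domain:

# query the fitted model at a few hypothetical test points
test_X = torch.rand(5, 2, dtype=torch.double) * 2
posterior = gp.posterior(test_X)
print(posterior.mean.shape, posterior.variance.shape)  # both are 5 x 1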
- Construct an acquisition function
from botorch.acquisition import LogExpectedImprovement
logEI = LogExpectedImprovement(model=gp, best_f=train_Y.max())
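Acquisition functions are evaluated on batches of candidate sets; for the analytic LogExpectedImprovement above, each candidate set contains a single point, so an explicit q-dimension of size one is needed. A minimal sketch, again using hypothetical test points:

# evaluate log-EI at 5 points, each treated as its own q=1 candidate set
test_X = torch.rand(5, 2, dtype=torch.double) * 2
acq_values = logEI(test_X.unsqueeze(-2))  # shape: 5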
- Optimize the acquisition function
from botorch.optim import optimize_acqf
bounds = torch.stack([torch.zeros(2), torch.ones(2)]).to(torch.double)
candidate, acq_value = optimize_acqf(
    logEI, bounds=bounds, q=1, num_restarts=5, raw_samples=20,
)
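These pieces combine into a closed loop: evaluate the objective at the suggested candidate, append the new observation to the training data, refit the model, and optimize the acquisition function again. A minimal sketch using the synthetic objective from above (the number of iterations is arbitrary):

for _ in range(10):  # number of BO iterations is arbitrary
    # evaluate the noisy synthetic objective at the suggested candidate
    new_Y = 1 - (candidate - 0.5).norm(dim=-1, keepdim=True)
    new_Y += 0.1 * torch.rand_like(new_Y)
    # append the new observation and refit the model
    train_X = torch.cat([train_X, candidate])
    train_Y = torch.cat([train_Y, new_Y])
    gp = SingleTaskGP(
        train_X=train_X,
        train_Y=train_Y,
        input_transform=Normalize(d=2),
        outcome_transform=Standardize(m=1),
    )
    mll = ExactMarginalLogLikelihood(gp.likelihood, gp)
    fit_gpytorch_mll(mll)
    # optimize the updated acquisition function to get the next candidate
    logEI = LogExpectedImprovement(model=gp, best_f=train_Y.max())
    candidate, acq_value = optimize_acqf(
        logEI, bounds=bounds, q=1, num_restarts=5, raw_samples=20,
    )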
Tutorials
Our Jupyter notebook tutorials help you get off the ground with BoTorch. View and download them here.
API Reference
For an in-depth reference of the various BoTorch internals, see our API Reference.
Contributing
You'd like to contribute to BoTorch? Great! Please see here for how to help out.