botorch.fit¶
Utilities for model fitting.
- botorch.fit.fit_gpytorch_model(mll, optimizer=<function fit_gpytorch_scipy>, **kwargs)[source]¶
Fit hyperparameters of a GPyTorch model.
On optimizer failures, a new initial condition is sampled from the hyperparameter priors and optimization is retried. The maximum number of retries can be passed in as a max_retries kwarg (default is 5).
Optimizer functions are in botorch.optim.fit.
- Parameters:
mll (MarginalLogLikelihood) – MarginalLogLikelihood to be maximized.
optimizer (Callable) – The optimizer function.
kwargs (Any) – Arguments passed along to the optimizer function. These include max_retries, sequential (controls whether ModelListGP and BatchedMultiOutputGPyTorchModel models are fitted sequentially), and approx_mll (whether to use gpytorch’s approximate MLL computation).
- Returns:
MarginalLogLikelihood with optimized parameters.
- Return type:
MarginalLogLikelihood
Example
>>> gp = SingleTaskGP(train_X, train_Y)
>>> mll = ExactMarginalLogLikelihood(gp.likelihood, gp)
>>> fit_gpytorch_model(mll)
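A self-contained sketch of passing the documented max_retries kwarg; the tensor shapes and the toy objective are illustrative assumptions, not part of the original example:
>>> import torch
>>> from botorch.models import SingleTaskGP
>>> from botorch.fit import fit_gpytorch_model
>>> from gpytorch.mlls import ExactMarginalLogLikelihood
>>> train_X = torch.rand(20, 2, dtype=torch.float64)
>>> train_Y = train_X.sum(dim=-1, keepdim=True)  # toy objective
>>> gp = SingleTaskGP(train_X, train_Y)
>>> mll = ExactMarginalLogLikelihood(gp.likelihood, gp)
>>> # retry up to 10 times on optimizer failures (the default is 5)
>>> fit_gpytorch_model(mll, max_retries=10)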
- botorch.fit.fit_fully_bayesian_model_nuts(model, max_tree_depth=6, warmup_steps=512, num_samples=256, thinning=16, disable_progbar=False)[source]¶
Fit a fully Bayesian model using the No-U-Turn Sampler (NUTS).
- Parameters:
model (Union[SaasFullyBayesianSingleTaskGP, SaasFullyBayesianMultiTaskGP]) – The fully Bayesian model to be fitted.
max_tree_depth (int) – Maximum tree depth for NUTS.
warmup_steps (int) – The number of burn-in steps for NUTS.
num_samples (int) – The number of MCMC samples. Note that with thinning, num_samples / thinning samples are retained.
thinning (int) – The amount of thinning. Every nth sample is retained.
disable_progbar (bool) – A boolean indicating whether to disable the progress bar and diagnostics during MCMC.
- Return type:
None
Example
>>> gp = SaasFullyBayesianSingleTaskGP(train_X, train_Y)
>>> fit_fully_bayesian_model_nuts(gp)
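A self-contained sketch that overrides the documented MCMC settings for a cheaper fit; the tensor shapes, toy objective, and the chosen warmup/num_samples/thinning values are illustrative assumptions, not defaults:
>>> import torch
>>> from botorch.models import SaasFullyBayesianSingleTaskGP
>>> from botorch.fit import fit_fully_bayesian_model_nuts
>>> train_X = torch.rand(10, 4, dtype=torch.float64)
>>> train_Y = train_X.norm(dim=-1, keepdim=True)  # toy objective
>>> gp = SaasFullyBayesianSingleTaskGP(train_X, train_Y)
>>> fit_fully_bayesian_model_nuts(
...     gp, warmup_steps=256, num_samples=128, thinning=8, disable_progbar=True
... )
>>> # num_samples / thinning = 16 hyperparameter samples are retained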