botorch.posteriors¶
Posterior APIs¶
Abstract Posterior API¶
Abstract base module for all botorch posteriors.
- class botorch.posteriors.posterior.Posterior[source]¶
Bases:
abc.ABC
Abstract base class for botorch posteriors.
- property base_sample_shape: torch.Size¶
The shape of a base sample used for constructing posterior samples.
This property may be overridden by subclasses in case base_sample_shape and event_shape do not agree (e.g. if the posterior is a Multivariate Gaussian that is not full rank).
- abstract property device: torch.device¶
The torch device of the posterior.
- abstract property dtype: torch.dtype¶
The torch dtype of the posterior.
- abstract property event_shape: torch.Size¶
The event shape (i.e. the shape of a single sample).
- property mean: torch.Tensor¶
The mean of the posterior as a (b) x n x m-dim Tensor.
- property variance: torch.Tensor¶
The variance of the posterior as a (b) x n x m-dim Tensor.
- abstract rsample(sample_shape=None, base_samples=None)[source]¶
Sample from the posterior (with gradients).
- Parameters
sample_shape (Optional[torch.Size]) – A torch.Size object specifying the sample shape. To draw n samples, set to torch.Size([n]). To draw b batches of n samples each, set to torch.Size([b, n]).
base_samples (Optional[torch.Tensor]) – An (optional) Tensor of N(0, I) base samples of appropriate dimension, typically obtained from a Sampler. This is used for deterministic optimization.
- Returns
A sample_shape x event_shape-dim Tensor of samples from the posterior.
- Return type
torch.Tensor
- sample(sample_shape=None, base_samples=None)[source]¶
Sample from the posterior (without gradients).
This is a simple wrapper that calls rsample within a torch.no_grad() context.
- Parameters
sample_shape (Optional[torch.Size]) – A torch.Size object specifying the sample shape. To draw n samples, set to torch.Size([n]). To draw b batches of n samples each, set to torch.Size([b, n]).
base_samples (Optional[torch.Tensor]) – An (optional) Tensor of N(0, I) base samples of appropriate dimension, typically obtained from a Sampler object. This is used for deterministic optimization.
- Returns
A sample_shape x event_shape-dim Tensor of samples from the posterior.
- Return type
torch.Tensor
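Example
A minimal sketch (with hypothetical random training data; any model posterior behaves analogously) showing the difference between the two sampling methods: rsample retains gradients for use in optimization, while sample wraps the draw in torch.no_grad():
>>> import torch
>>> from botorch.models import SingleTaskGP
>>> train_X = torch.rand(10, 2, dtype=torch.double)
>>> train_Y = train_X.sum(dim=-1, keepdim=True)
>>> model = SingleTaskGP(train_X, train_Y)
>>> posterior = model.posterior(torch.rand(4, 2, dtype=torch.double))
>>> samples = posterior.rsample(sample_shape=torch.Size([16]))
>>> samples.shape  # sample_shape x n x m
torch.Size([16, 4, 1])
>>> posterior.sample(sample_shape=torch.Size([16])).requires_grad
False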
- class botorch.posteriors.posterior.PosteriorList(*posteriors)[source]¶
Bases:
botorch.posteriors.posterior.Posterior
A Posterior represented by a list of independent Posteriors.
- Parameters
*posteriors (Posterior) – A variable number of single-outcome posteriors.
- Return type
None
Example
>>> p_1 = model_1.posterior(test_X)
>>> p_2 = model_2.posterior(test_X)
>>> p_12 = PosteriorList(p_1, p_2)
Note: This is typically produced automatically in ModelList; it should generally not be necessary for the end user to invoke it manually.
- property base_sample_shape: torch.Size¶
The shape of a base sample used for constructing posterior samples.
- property device: torch.device¶
The torch device of the posterior.
- property dtype: torch.dtype¶
The torch dtype of the posterior.
- property event_shape: torch.Size¶
The event shape (i.e. the shape of a single sample).
- property mean: torch.Tensor¶
The mean of the posterior as a (b) x n x m-dim Tensor.
- property variance: torch.Tensor¶
The variance of the posterior as a (b) x n x m-dim Tensor.
- rsample(sample_shape=None, base_samples=None)[source]¶
Sample from the posterior (with gradients).
- Parameters
sample_shape (Optional[torch.Size]) – A torch.Size object specifying the sample shape. To draw n samples, set to torch.Size([n]). To draw b batches of n samples each, set to torch.Size([b, n]).
base_samples (Optional[torch.Tensor]) – An (optional) Tensor of N(0, I) base samples of appropriate dimension, typically obtained from a Sampler. This is used for deterministic optimization.
- Returns
A sample_shape x event_shape-dim Tensor of samples from the posterior.
- Return type
torch.Tensor
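Example
A minimal sketch (with two hypothetical single-output models) illustrating how a PosteriorList concatenates its constituent posteriors along the output dimension:
>>> import torch
>>> from botorch.models import SingleTaskGP
>>> from botorch.posteriors.posterior import PosteriorList
>>> train_X = torch.rand(10, 2, dtype=torch.double)
>>> model_1 = SingleTaskGP(train_X, torch.rand(10, 1, dtype=torch.double))
>>> model_2 = SingleTaskGP(train_X, torch.rand(10, 1, dtype=torch.double))
>>> test_X = torch.rand(4, 2, dtype=torch.double)
>>> p_12 = PosteriorList(model_1.posterior(test_X), model_2.posterior(test_X))
>>> p_12.event_shape
torch.Size([4, 2])
>>> p_12.rsample(sample_shape=torch.Size([8])).shape
torch.Size([8, 4, 2])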
Posteriors¶
GPyTorch Posterior¶
Posterior Module to be used with GPyTorch models.
- class botorch.posteriors.gpytorch.GPyTorchPosterior(mvn)[source]¶
Bases:
botorch.posteriors.posterior.Posterior
A posterior based on GPyTorch’s multi-variate Normal distributions.
- Parameters
mvn (MultivariateNormal) – A GPyTorch MultivariateNormal (single-output case) or MultitaskMultivariateNormal (multi-output case).
- Return type
None
- property base_sample_shape: torch.Size¶
The shape of a base sample used for constructing posterior samples.
- property device: torch.device¶
The torch device of the posterior.
- property dtype: torch.dtype¶
The torch dtype of the posterior.
- property event_shape: torch.Size¶
The event shape (i.e. the shape of a single sample) of the posterior.
- rsample(sample_shape=None, base_samples=None)[source]¶
Sample from the posterior (with gradients).
- Parameters
sample_shape (Optional[torch.Size]) – A torch.Size object specifying the sample shape. To draw n samples, set to torch.Size([n]). To draw b batches of n samples each, set to torch.Size([b, n]).
base_samples (Optional[torch.Tensor]) – An (optional) Tensor of N(0, I) base samples of appropriate dimension, typically obtained from a Sampler. This is used for deterministic optimization.
- Returns
A sample_shape x event_shape-dim Tensor of samples from the posterior.
- Return type
torch.Tensor
- property mean: torch.Tensor¶
The posterior mean.
- property variance: torch.Tensor¶
The posterior variance.
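Example
A minimal sketch of wrapping a GPyTorch MultivariateNormal directly; in practice, GPyTorchPosterior objects are typically produced by a model’s posterior(…) method rather than constructed by hand:
>>> import torch
>>> from gpytorch.distributions import MultivariateNormal
>>> from botorch.posteriors.gpytorch import GPyTorchPosterior
>>> mvn = MultivariateNormal(torch.zeros(3, dtype=torch.double), torch.eye(3, dtype=torch.double))
>>> posterior = GPyTorchPosterior(mvn)
>>> posterior.event_shape  # an output dimension of size 1 is appended in the single-output case
torch.Size([3, 1])
>>> posterior.rsample(sample_shape=torch.Size([5])).shape
torch.Size([5, 3, 1])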
- botorch.posteriors.gpytorch.scalarize_posterior(posterior, weights, offset=0.0)[source]¶
Affine transformation of a multi-output posterior.
- Parameters
posterior (botorch.posteriors.gpytorch.GPyTorchPosterior) – The posterior over m outcomes to be scalarized. Supports t-batching.
weights (torch.Tensor) – A tensor of weights of size m.
offset (float) – The offset of the affine transformation.
- Returns
The transformed (single-output) posterior. If the input posterior has mean mu and covariance matrix Sigma, this posterior has mean weights^T * mu and variance weights^T * Sigma * weights.
- Return type
botorch.posteriors.gpytorch.GPyTorchPosterior
Example
Example for a model with two outcomes:
>>> X = torch.rand(1, 2)
>>> posterior = model.posterior(X)
>>> weights = torch.tensor([0.5, 0.25])
>>> new_posterior = scalarize_posterior(posterior, weights=weights)
Deterministic Posterior¶
Deterministic (degenerate) posteriors. Used in conjunction with deterministic models.
- class botorch.posteriors.deterministic.DeterministicPosterior(values)[source]¶
Bases:
botorch.posteriors.posterior.Posterior
Deterministic posterior.
- Parameters
values (Tensor) – The values of the posterior, as a (b) x n x m-dim Tensor.
- Return type
None
- property base_sample_shape: torch.Size¶
The shape of a base sample used for constructing posterior samples.
This property may be overridden by subclasses in case base_sample_shape and event_shape do not agree (e.g. if the posterior is a Multivariate Gaussian that is not full rank).
- property device: torch.device¶
The torch device of the posterior.
- property dtype: torch.dtype¶
The torch dtype of the posterior.
- property event_shape: torch.Size¶
The event shape (i.e. the shape of a single sample).
- property mean: torch.Tensor¶
The mean of the posterior as a (b) x n x m-dim Tensor.
- property variance: torch.Tensor¶
The variance of the posterior as a (b) x n x m-dim Tensor.
As this is a deterministic posterior, this is a tensor of zeros.
- rsample(sample_shape=None, base_samples=None)[source]¶
Sample from the posterior (with gradients).
For the deterministic posterior, this just returns the values expanded to the requested shape.
- Parameters
sample_shape (Optional[torch.Size]) – A torch.Size object specifying the sample shape. To draw n samples, set to torch.Size([n]). To draw b batches of n samples each, set to torch.Size([b, n]).
base_samples (Optional[torch.Tensor]) – An (optional) Tensor of N(0, I) base samples of appropriate dimension, typically obtained from a Sampler. Ignored in construction of the samples (used only for shape validation).
- Returns
A sample_shape x event_shape-dim Tensor of samples from the posterior.
- Return type
torch.Tensor
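Example
A minimal sketch of the degenerate behavior: the mean equals the supplied values, the variance is zero, and rsample simply expands the values to the requested sample shape:
>>> import torch
>>> from botorch.posteriors.deterministic import DeterministicPosterior
>>> values = torch.tensor([[1.0], [2.0], [3.0]])  # n x m
>>> post = DeterministicPosterior(values=values)
>>> torch.equal(post.mean, values)
True
>>> post.rsample(sample_shape=torch.Size([4])).shape
torch.Size([4, 3, 1])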
Higher Order GP Posterior¶
- class botorch.posteriors.higher_order.HigherOrderGPPosterior(mvn, joint_covariance_matrix, train_train_covar, test_train_covar, train_targets, output_shape, num_outputs)[source]¶
Bases:
botorch.posteriors.gpytorch.GPyTorchPosterior
Posterior class for a Higher Order Gaussian process model [Zhe2019hogp]. Extends the standard GPyTorch posterior class by overriding the rsample method. The posterior variance is handled internally by the HigherOrderGP model. HOGP is a tensorized GP model, so the posterior covariance grows to be extremely large, but it is highly structured, which means that we can exploit Kronecker identities to sample from the posterior using Matheron’s rule as described in [Doucet2010sampl].
In general, this posterior should ONLY be used for HOGP models that have highly structured covariances. It should also only be used internally when called from the HigherOrderGP.posterior(…) method. At this time, the posterior does not support gradients with respect to the training data.
A Posterior for HigherOrderGP models.
- Parameters
mvn (gpytorch.distributions.multivariate_normal.MultivariateNormal) – The posterior multivariate normal distribution.
joint_covariance_matrix (gpytorch.lazy.lazy_tensor.LazyTensor) – The joint test-train covariance matrix over the entire tensor.
train_train_covar (gpytorch.lazy.lazy_tensor.LazyTensor) – The covariance matrix of the train points in the data space.
test_train_covar (gpytorch.lazy.lazy_tensor.LazyTensor) – The covariance matrix of the test x train points in the data space.
train_targets (torch.Tensor) – The vectorized training responses.
output_shape (torch.Size) – The shape of the output training responses.
num_outputs (int) – The number of model outputs, used for batch shaping.
- Return type
None
- property base_sample_shape¶
The shape of a base sample used for constructing posterior samples.
- property event_shape¶
The event shape (i.e. the shape of a single sample) of the posterior.
- rsample(sample_shape=None, base_samples=None)[source]¶
Sample from the posterior (with gradients).
As the posterior covariance is difficult to draw from in this model, we implement Matheron’s rule as described in [Doucet2010sampl]. This may not work entirely correctly for deterministic base samples unless the provided base samples are of shape n + 2 * n_train, because the sampling method draws 2 * n_train samples in addition to the standard n samples.
- Parameters
sample_shape (Optional[torch.Size]) – A torch.Size object specifying the sample shape. To draw n samples, set to torch.Size([n]). To draw b batches of n samples each, set to torch.Size([b, n]).
base_samples (Optional[torch.Tensor]) – An (optional) Tensor of N(0, I) base samples of appropriate dimension, typically obtained from a Sampler. This is used for deterministic optimization.
- Returns
A sample_shape x event_shape-dim Tensor of samples from the posterior.
- Return type
torch.Tensor
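Example
A minimal usage sketch with hypothetical random data; as noted above, this posterior is obtained via HigherOrderGP.posterior(…) rather than constructed directly:
>>> import torch
>>> from botorch.models import HigherOrderGP
>>> train_X = torch.rand(10, 2, dtype=torch.double)
>>> train_Y = torch.rand(10, 3, 5, dtype=torch.double)  # tensor-valued responses
>>> model = HigherOrderGP(train_X, train_Y)
>>> posterior = model.posterior(torch.rand(4, 2, dtype=torch.double))  # a HigherOrderGPPosterior
>>> samples = posterior.rsample(sample_shape=torch.Size([8]))  # sample_shape x n x output_shape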
Multitask GP Posterior¶
- class botorch.posteriors.multitask.MultitaskGPPosterior(mvn, joint_covariance_matrix, test_train_covar, train_diff, test_mean, train_train_covar, train_noise, test_noise=None)[source]¶
Bases:
botorch.posteriors.gpytorch.GPyTorchPosterior
Posterior class for a Kronecker multi-task GP model using an ICM kernel. Extends the standard GPyTorch posterior class by overriding the rsample method. In general, this posterior should ONLY be used for MTGP models that have structured covariances. It should also only be used internally when called from the KroneckerMultiTaskGP.posterior(…) method.
- Parameters
mvn (gpytorch.distributions.multivariate_normal.MultivariateNormal) – The posterior multivariate normal distribution.
joint_covariance_matrix (gpytorch.lazy.lazy_tensor.LazyTensor) – The joint test-train covariance matrix over the entire tensor.
test_train_covar (gpytorch.lazy.lazy_tensor.LazyTensor) – The covariance matrix of the test x train points in the data space.
train_diff (torch.Tensor) – The difference between the train mean and the train responses.
test_mean (torch.Tensor) –
train_train_covar (gpytorch.lazy.lazy_tensor.LazyTensor) – The covariance matrix of the train points in the data space.
train_noise (Union[gpytorch.lazy.lazy_tensor.LazyTensor, torch.Tensor]) – The training noise covariance.
test_noise (Optional[Union[gpytorch.lazy.lazy_tensor.LazyTensor, torch.Tensor]]) – The testing noise covariance; only used if the posterior should contain observation noise.
- property base_sample_shape: torch.Size¶
The shape of a base sample used for constructing posterior samples.
- property device: torch.device¶
The torch device of the posterior.
- property dtype: torch.dtype¶
The torch dtype of the posterior.
- rsample(sample_shape=None, base_samples=None, train_diff=None)[source]¶
Sample from the posterior (with gradients).
- Parameters
sample_shape (Optional[torch.Size]) – A torch.Size object specifying the sample shape. To draw n samples, set to torch.Size([n]). To draw b batches of n samples each, set to torch.Size([b, n]).
base_samples (Optional[torch.Tensor]) – An (optional) Tensor of N(0, I) base samples of appropriate dimension, typically obtained from a Sampler. This is used for deterministic optimization.
train_diff (Optional[torch.Tensor]) –
- Returns
A sample_shape x event_shape-dim Tensor of samples from the posterior.
- Return type
torch.Tensor
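Example
A minimal usage sketch with hypothetical random data; this posterior is produced by KroneckerMultiTaskGP.posterior(…):
>>> import torch
>>> from botorch.models import KroneckerMultiTaskGP
>>> train_X = torch.rand(10, 2, dtype=torch.double)
>>> train_Y = torch.rand(10, 3, dtype=torch.double)  # 3 tasks
>>> model = KroneckerMultiTaskGP(train_X, train_Y)
>>> posterior = model.posterior(torch.rand(4, 2, dtype=torch.double))  # a MultitaskGPPosterior
>>> posterior.rsample(sample_shape=torch.Size([8])).shape  # sample_shape x n x num_tasks
torch.Size([8, 4, 3])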
Transformed Posterior¶
- class botorch.posteriors.transformed.TransformedPosterior(posterior, sample_transform, mean_transform=None, variance_transform=None)[source]¶
Bases:
botorch.posteriors.posterior.Posterior
A generic, implicitly represented transformation of a posterior.
- Parameters
posterior (Posterior) – The posterior object to be transformed.
sample_transform (Callable[[Tensor], Tensor]) – A callable applying a sample-level transform to a sample_shape x batch_shape x q x m-dim tensor of samples from the original posterior, returning a tensor of samples of the same shape.
mean_transform (Optional[Callable[[Tensor, Tensor], Tensor]]) – A callable transforming a 2-tuple of mean and variance (both of shape batch_shape x m x o) of the original posterior to the mean of the transformed posterior.
variance_transform (Optional[Callable[[Tensor, Tensor], Tensor]]) – A callable transforming a 2-tuple of mean and variance (both of shape batch_shape x m x o) of the original posterior to a variance of the transformed posterior.
- Return type
None
- property base_sample_shape: torch.Size¶
The shape of a base sample used for constructing posterior samples.
- property device: torch.device¶
The torch device of the posterior.
- property dtype: torch.dtype¶
The torch dtype of the posterior.
- property event_shape: torch.Size¶
The event shape (i.e. the shape of a single sample).
- property mean: torch.Tensor¶
The mean of the posterior as a batch_shape x n x m-dim Tensor.
- property variance: torch.Tensor¶
The variance of the posterior as a batch_shape x n x m-dim Tensor.
- rsample(sample_shape=None, base_samples=None)[source]¶
Sample from the posterior (with gradients).
- Parameters
sample_shape (Optional[torch.Size]) – A torch.Size object specifying the sample shape. To draw n samples, set to torch.Size([n]). To draw b batches of n samples each, set to torch.Size([b, n]).
base_samples (Optional[torch.Tensor]) – An (optional) Tensor of N(0, I) base samples of appropriate dimension, typically obtained from a Sampler. This is used for deterministic optimization.
- Returns
A sample_shape x event_shape-dim Tensor of samples from the posterior.
- Return type
torch.Tensor
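Example
A minimal sketch of exponentiating a posterior, using the standard lognormal moment formulas as the mean and variance transforms (this is the kind of transformation applied when un-transforming a posterior for a log-transformed outcome); the model and data are hypothetical:
>>> import torch
>>> from botorch.models import SingleTaskGP
>>> from botorch.posteriors.transformed import TransformedPosterior
>>> train_X = torch.rand(10, 2, dtype=torch.double)
>>> model = SingleTaskGP(train_X, torch.rand(10, 1, dtype=torch.double))
>>> posterior = model.posterior(torch.rand(4, 2, dtype=torch.double))
>>> exp_posterior = TransformedPosterior(
...     posterior=posterior,
...     sample_transform=torch.exp,
...     mean_transform=lambda m, v: torch.exp(m + 0.5 * v),
...     variance_transform=lambda m, v: (torch.exp(v) - 1) * torch.exp(2 * m + v),
... )
>>> samples = exp_posterior.rsample(sample_shape=torch.Size([8]))  # strictly positive samples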
Fully Bayesian Posterior¶
- class botorch.posteriors.fully_bayesian.FullyBayesianPosterior(mvn, marginalize_over_mcmc_samples=False)[source]¶
Bases:
botorch.posteriors.gpytorch.GPyTorchPosterior
A posterior for a fully Bayesian model.
The MCMC batch dimension is -3.
- Parameters
mvn (gpytorch.distributions.multivariate_normal.MultivariateNormal) – A GPyTorch MultivariateNormal (single-output case)
marginalize_over_mcmc_samples (bool) – If True, use the law of total variance to marginalize over the hyperparameter samples. This should always be False when computing acquisition functions.
- Return type
None
- property mean: torch.Tensor¶
The posterior mean.
- property variance: torch.Tensor¶
The posterior variance.