botorch.sampling¶
Monte-Carlo Sampler API¶
The base class for sampler modules to be used with MC-evaluated acquisition functions.
- class botorch.sampling.base.MCSampler(sample_shape, seed=None)[source]¶
Bases:
Module, ABC
Abstract base class for Samplers.
Subclasses must implement the forward method.
Example
This method is usually not called directly, but via the sampler’s __call__ method:
>>> posterior = model.posterior(test_X)
>>> samples = sampler(posterior)
- Parameters:
sample_shape (torch.Size) – The sample_shape of the samples to generate. The full shape of the samples is given by posterior._extended_shape(sample_shape).
seed (int | None) – An optional seed to use for sampling.
Index Sampler¶
Sampler to be used with EnsemblePosteriors to enable deterministic optimization of acquisition functions with ensemble models.
- class botorch.sampling.index_sampler.IndexSampler(sample_shape, seed=None)[source]¶
Bases:
MCSampler
A sampler that calls posterior.rsample_from_base_samples to generate the samples via index base samples.
- Parameters:
sample_shape (torch.Size) – The sample_shape of the samples to generate. The full shape of the samples is given by posterior._extended_shape(sample_shape).
seed (int | None) – An optional seed to use for sampling.
- forward(posterior)[source]¶
Draws MC samples from the posterior.
- Parameters:
posterior (EnsemblePosterior) – The ensemble posterior to sample from.
- Returns:
The samples drawn from the posterior.
- Return type:
Tensor
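A minimal construction sketch (illustrative; the ensemble model and its EnsemblePosterior are assumptions and appear only in comments):
>>> import torch
>>> from botorch.sampling.index_sampler import IndexSampler
>>> sampler = IndexSampler(sample_shape=torch.Size([64]), seed=0)
>>> # posterior = ensemble_model.posterior(test_X)  # an EnsemblePosterior (assumed)
>>> # samples = sampler(posterior)  # draws generated by indexing into the ensemble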
Get Sampler Helper¶
- botorch.sampling.get_sampler.get_sampler(posterior, sample_shape, *, seed=None)[source]¶
Get the sampler for the given posterior.
The sampler can be used as sampler(posterior) to produce samples suitable for use in acquisition function optimization via Sample Average Approximation (SAA).
- Parameters:
posterior (TorchPosterior) – A Posterior to get the sampler for.
sample_shape (Size) – The sample shape of the samples produced by the given sampler. The full shape of the resulting samples is given by posterior._extended_shape(sample_shape).
seed (int | None) – Seed used to initialize sampler.
- Returns:
The MCSampler object for the given posterior.
- Return type:
MCSampler
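Illustrative sketch (toy data; the model, shapes, and seed are assumptions). For a GP posterior, get_sampler typically resolves to a quasi-MC normal sampler:
>>> import torch
>>> from botorch.models import SingleTaskGP
>>> from botorch.sampling.get_sampler import get_sampler
>>> train_X = torch.rand(8, 2, dtype=torch.double)
>>> train_Y = train_X.sum(dim=-1, keepdim=True)
>>> model = SingleTaskGP(train_X, train_Y)
>>> posterior = model.posterior(torch.rand(4, 2, dtype=torch.double))
>>> sampler = get_sampler(posterior, sample_shape=torch.Size([128]), seed=0)
>>> samples = sampler(posterior)  # shape: 128 x 4 x 1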
List Sampler¶
A SamplerList for sampling from a PosteriorList.
- class botorch.sampling.list_sampler.ListSampler(*samplers)[source]¶
Bases:
MCSampler
A list of samplers for sampling from a PosteriorList.
- Parameters:
samplers (MCSampler) – A variable number of samplers. This should include a sampler for each posterior.
- property sample_shape: Size¶
The sample shape of the underlying samplers.
- forward(posterior)[source]¶
Samples from the posteriors and concatenates the samples.
- Parameters:
posterior (PosteriorList) – A PosteriorList to sample from.
- Returns:
The samples drawn from the posterior.
- Return type:
Tensor
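Construction sketch (illustrative; the PosteriorList is an assumption and is assumed to hold two posteriors, in the same order as the samplers):
>>> import torch
>>> from botorch.sampling.list_sampler import ListSampler
>>> from botorch.sampling.normal import IIDNormalSampler, SobolQMCNormalSampler
>>> sampler = ListSampler(
...     SobolQMCNormalSampler(sample_shape=torch.Size([256])),
...     IIDNormalSampler(sample_shape=torch.Size([256])),
... )
>>> # samples = sampler(posterior_list)  # posterior_list: a PosteriorList (assumed)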
Gaussian Monte-Carlo Samplers¶
Sampler modules producing N(0,1) samples, to be used with MC-evaluated acquisition functions and Gaussian posteriors.
- class botorch.sampling.normal.NormalMCSampler(sample_shape, seed=None)[source]¶
Bases:
MCSampler, ABC
Base class for samplers producing (possibly QMC) N(0,1) samples.
Subclasses must implement the _construct_base_samples method.
- Parameters:
sample_shape (torch.Size) – The sample_shape of the samples to generate. The full shape of the samples is given by posterior._extended_shape(sample_shape).
seed (int | None) – An optional seed to use for sampling.
- class botorch.sampling.normal.IIDNormalSampler(sample_shape, seed=None)[source]¶
Bases:
NormalMCSampler
Sampler for MC base samples using iid N(0,1) samples.
Example
>>> sampler = IIDNormalSampler(sample_shape=torch.Size([1000]), seed=1234)
>>> posterior = model.posterior(test_X)
>>> samples = sampler(posterior)
- Parameters:
sample_shape (torch.Size) – The sample_shape of the samples to generate. The full shape of the samples is given by posterior._extended_shape(sample_shape).
seed (int | None) – An optional seed to use for sampling.
- class botorch.sampling.normal.SobolQMCNormalSampler(sample_shape, seed=None)[source]¶
Bases:
NormalMCSampler
Sampler for quasi-MC N(0,1) base samples using Sobol sequences.
Example
>>> sampler = SobolQMCNormalSampler(torch.Size([1024]), seed=1234)
>>> posterior = model.posterior(test_X)
>>> samples = sampler(posterior)
- Parameters:
sample_shape (torch.Size) – The sample_shape of the samples to generate. The full shape of the samples is given by posterior._extended_shape(sample_shape).
seed (int | None) – An optional seed to use for sampling.
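The sketch below (illustrative; the toy model, candidate shapes, and seed are assumptions) passes a Sobol QMC sampler to an MC-evaluated acquisition function:
>>> import torch
>>> from botorch.acquisition.monte_carlo import qExpectedImprovement
>>> from botorch.models import SingleTaskGP
>>> from botorch.sampling.normal import SobolQMCNormalSampler
>>> train_X = torch.rand(8, 2, dtype=torch.double)
>>> train_Y = train_X.sum(dim=-1, keepdim=True)
>>> model = SingleTaskGP(train_X, train_Y)
>>> sampler = SobolQMCNormalSampler(sample_shape=torch.Size([256]), seed=0)
>>> qEI = qExpectedImprovement(model, best_f=train_Y.max(), sampler=sampler)
>>> acq_value = qEI(torch.rand(1, 3, 2, dtype=torch.double))  # one batch of q=3 candidates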
Pairwise Monte-Carlo Samplers¶
- class botorch.sampling.pairwise_samplers.PairwiseMCSampler(max_num_comparisons=None, seed=None)[source]¶
Bases:
MCSampler
Abstract class for Pairwise MC Sampler.
This sampler draws pairwise comparisons. It is to be used together with PairwiseGP and BoTorch acquisition functions (e.g., qKnowledgeGradient).
- Parameters:
max_num_comparisons (int) – Max number of comparisons drawn within samples. If None, use all possible pairwise comparisons.
seed (int) – The seed for np.random.seed. If omitted, use a random seed. May be overwritten by sibling classes or subclasses.
- class botorch.sampling.pairwise_samplers.PairwiseIIDNormalSampler(sample_shape, seed=None, max_num_comparisons=None, **kwargs)[source]¶
Bases:
PairwiseMCSampler, IIDNormalSampler
- Parameters:
sample_shape (torch.Size) – The sample_shape of the samples to generate.
seed (int | None) – The seed for the RNG. If omitted, use a random seed.
max_num_comparisons (int) – Max number of comparisons drawn within samples. If None, use all possible pairwise comparisons.
kwargs (Any) – Catch-all for deprecated arguments.
- class botorch.sampling.pairwise_samplers.PairwiseSobolQMCNormalSampler(sample_shape, seed=None, max_num_comparisons=None, **kwargs)[source]¶
Bases:
PairwiseMCSampler, SobolQMCNormalSampler
- Parameters:
sample_shape (torch.Size) – The sample_shape of the samples to generate.
seed (int | None) – The seed for the RNG. If omitted, use a random seed.
max_num_comparisons (int) – Max number of comparisons drawn within samples. If None, use all possible pairwise comparisons.
kwargs (Any) – Catch-all for deprecated arguments.
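A construction sketch (illustrative; the fitted PairwiseGP and its posterior are assumptions and appear only in comments):
>>> import torch
>>> from botorch.sampling.pairwise_samplers import PairwiseSobolQMCNormalSampler
>>> sampler = PairwiseSobolQMCNormalSampler(
...     sample_shape=torch.Size([128]), seed=0, max_num_comparisons=64
... )
>>> # posterior = pairwise_gp.posterior(test_X)  # pairwise_gp: a fitted PairwiseGP (assumed)
>>> # comparisons = sampler(posterior)  # MC samples of pairwise comparisons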
QMC Base Functionality¶
Quasi Monte-Carlo sampling from Normal distributions.
References:
[Pages2018numprob] G. Pagès. Numerical Probability: An Introduction with Applications to Finance. Universitext. Springer, 2018.
- class botorch.sampling.qmc.NormalQMCEngine(d, seed=None, inv_transform=False)[source]¶
Bases:
object
Engine for qMC sampling from a Multivariate Normal N(0, I_d).
By default, this implementation uses Box-Muller transformed Sobol samples following pg. 123 in [Pages2018numprob]. To use the inverse transform instead, set inv_transform=True.
Example
>>> engine = NormalQMCEngine(3)
>>> samples = engine.draw(16)
- Parameters:
d (int) – The dimension of the samples.
seed (int | None) – The seed with which to seed the random number generator of the underlying SobolEngine.
inv_transform (bool) – If True, use inverse transform instead of Box-Muller.
- draw(n=1, out=None, dtype=None)[source]¶
Draw n qMC samples from the standard Normal.
- Parameters:
n (int) – The number of samples to draw. As a best practice, use powers of 2.
out (Tensor | None) – An optional output tensor. If provided, draws are put into this tensor, and the function returns None.
dtype (dtype | None) – The desired torch data type (ignored if out is provided). If None, uses torch.get_default_dtype().
- Returns:
A n x d tensor of samples if out=None and None otherwise.
- Return type:
Tensor | None
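Sketch of the draw options described above (illustrative; the dimension, seed, and transform choice are assumptions):
>>> import torch
>>> from botorch.sampling.qmc import NormalQMCEngine
>>> engine = NormalQMCEngine(d=3, seed=0, inv_transform=True)
>>> samples = engine.draw(16, dtype=torch.double)  # 16 x 3, approximately N(0, I_3)
>>> out = torch.empty(16, 3, dtype=torch.double)
>>> engine.draw(16, out=out)  # fills out in place and returns None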
- class botorch.sampling.qmc.MultivariateNormalQMCEngine(mean, cov, seed=None, inv_transform=False)[source]¶
Bases:
object
Engine for qMC sampling from a multivariate Normal N(mu, Sigma).
By default, this implementation uses Box-Muller transformed Sobol samples following pg. 123 in [Pages2018numprob]. To use the inverse transform instead, set inv_transform=True.
Example
>>> mean = torch.tensor([1.0, 2.0])
>>> cov = torch.tensor([[1.0, 0.25], [0.25, 2.0]])
>>> engine = MultivariateNormalQMCEngine(mean, cov)
>>> samples = engine.draw(16)
- Parameters:
mean (Tensor) – The mean vector.
cov (Tensor) – The covariance matrix.
seed (int | None) – The seed with which to seed the random number generator of the underlying SobolEngine.
inv_transform (bool) – If True, use inverse transform instead of Box-Muller.
- draw(n=1, out=None)[source]¶
Draw n qMC samples from the multivariate Normal.
- Parameters:
n (int) – The number of samples to draw. As a best practice, use powers of 2.
out (Tensor | None) – An optional output tensor. If provided, draws are put into this tensor, and the function returns None.
- Returns:
A n x d tensor of samples if out=None and None otherwise.
- Return type:
Tensor | None
Stochastic Samplers¶
Samplers to enable use cases that are not base sample driven, such as stochastic optimization of acquisition functions.
- class botorch.sampling.stochastic_samplers.ForkedRNGSampler(sample_shape, seed=None)[source]¶
Bases:
MCSampler
A sampler using torch.fork_rng to enable replicable sampling from a posterior that does not support base samples.
NOTE: This approach is not a one-to-one replacement for base sample driven sampling. The main missing piece in this approach is that its outputs are not replicable across the batch dimensions. As a result, when an acquisition function is batch evaluated with repeated candidates, each candidate will produce a different acquisition value, which is not compatible with Sample Average Approximation.
- Parameters:
sample_shape (torch.Size) – The sample_shape of the samples to generate. The full shape of the samples is given by posterior._extended_shape(sample_shape).
seed (int | None) – An optional seed to use for sampling.
- class botorch.sampling.stochastic_samplers.StochasticSampler(sample_shape, seed=None)[source]¶
Bases:
MCSampler
A sampler that simply calls posterior.rsample to generate the samples. This should only be used for stochastic optimization of the acquisition functions, e.g., via gen_candidates_torch. This should not be used with optimize_acqf, which uses deterministic optimizers under the hood.
NOTE: This ignores the seed option.
- Parameters:
sample_shape (torch.Size) – The sample_shape of the samples to generate. The full shape of the samples is given by posterior._extended_shape(sample_shape).
seed (int | None) – An optional seed to use for sampling.
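A minimal sketch contrasting the two samplers in this module (illustrative; the posterior is an assumption and appears only in comments):
>>> import torch
>>> from botorch.sampling.stochastic_samplers import ForkedRNGSampler, StochasticSampler
>>> stochastic = StochasticSampler(sample_shape=torch.Size([64]))  # seed is ignored
>>> forked = ForkedRNGSampler(sample_shape=torch.Size([64]), seed=0)  # replicable draws
>>> # samples = stochastic(posterior)  # calls posterior.rsample under the hood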
Pathwise Sampling¶
Feature Maps¶
- class botorch.sampling.pathwise.features.maps.FeatureMap(*args, **kwargs)[source]¶
Bases:
TransformedModuleMixin, Module
Base class for feature maps used in pathwise sampling.
- num_outputs: int¶
- batch_shape: Size¶
- input_transform: TInputTransform | None¶
- output_transform: TOutputTransform | None¶
- class botorch.sampling.pathwise.features.maps.KernelEvaluationMap(kernel, points, input_transform=None, output_transform=None)[source]¶
Bases:
FeatureMap
A feature map defined by centering a kernel at a set of points.
Initializes a KernelEvaluationMap instance:
feature_map(x) = output_transform(kernel(input_transform(x), points)).
- Parameters:
kernel (Kernel) – The kernel \(k\) used to define the feature map.
points (Tensor) – A tensor passed as the kernel’s second argument.
input_transform (TInputTransform | None) – An optional input transform for the module.
output_transform (TOutputTransform | None) – An optional output transform for the module.
- forward(x)[source]¶
Evaluates the feature map at the given inputs, i.e. output_transform(kernel(input_transform(x), points)).
- Parameters:
x (Tensor)
- Return type:
Tensor | LinearOperator
- property num_outputs: int¶
- property batch_shape: Size¶
- class botorch.sampling.pathwise.features.maps.KernelFeatureMap(kernel, weight, bias=None, input_transform=None, output_transform=None)[source]¶
Bases:
FeatureMap
Representation of a kernel \(k: \mathcal{X}^2 \to \mathbb{R}\) as an n-dimensional feature map \(\phi: \mathcal{X} \to \mathbb{R}^n\) satisfying: \(k(x, x') ≈ \phi(x)^\top \phi(x')\).
Initializes a KernelFeatureMap instance:
feature_map(x) = output_transform(input_transform(x)^{T} weight + bias).
- Parameters:
kernel (Kernel) – The kernel \(k\) used to define the feature map.
weight (Tensor) – A tensor of weights used to linearly combine the module’s inputs.
bias (Tensor | None) – A tensor of biases to be added to the linearly combined inputs.
input_transform (TInputTransform | None) – An optional input transform for the module.
output_transform (TOutputTransform | None) – An optional output transform for the module.
- forward(x)[source]¶
Evaluates the feature map at the given inputs, i.e. output_transform(input_transform(x)^{T} weight + bias).
- Parameters:
x (Tensor)
- Return type:
Tensor
- property num_outputs: int¶
- property batch_shape: Size¶
Feature Map Generators¶
[rahimi2007random] A. Rahimi and B. Recht. Random features for large-scale kernel machines. Advances in Neural Information Processing Systems 20 (2007).
[sutherland2015error] D. J. Sutherland and J. Schneider. On the error of random Fourier features. arXiv preprint arXiv:1506.02785 (2015).
- botorch.sampling.pathwise.features.generators.gen_kernel_features(kernel, num_inputs, num_outputs, **kwargs)[source]¶
Generates a feature map \(\phi: \mathcal{X} \to \mathbb{R}^{n}\) such that \(k(x, x') ≈ \phi(x)^{T} \phi(x')\). For stationary kernels \(k\), defaults to the method of random Fourier features. For more details, see [rahimi2007random] and [sutherland2015error].
- Parameters:
kernel (Kernel) – The kernel \(k\) to be represented via a finite-dim basis.
num_inputs (int) – The number of input features.
num_outputs (int) – The number of kernel features.
kwargs (Any)
- Return type:
KernelFeatureMap
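A sketch of the finite-dimensional kernel approximation (illustrative; the kernel choice, feature count, and shape comments are assumptions):
>>> import torch
>>> from gpytorch.kernels import MaternKernel, ScaleKernel
>>> from botorch.sampling.pathwise.features.generators import gen_kernel_features
>>> kernel = ScaleKernel(MaternKernel(nu=2.5, ard_num_dims=2))
>>> feature_map = gen_kernel_features(kernel, num_inputs=2, num_outputs=1024)
>>> X = torch.rand(5, 2)
>>> phi = feature_map(X)  # 5 x 1024 random Fourier features
>>> approx_K = phi @ phi.transpose(-2, -1)  # approximately kernel(X, X)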
Sample Paths¶
- class botorch.sampling.pathwise.paths.SamplePath(*args, **kwargs)[source]¶
Bases:
ABC, TransformedModuleMixin, Module
Abstract base class for BoTorch sample paths.
- class botorch.sampling.pathwise.paths.PathDict(paths=None, join=None, input_transform=None, output_transform=None)[source]¶
Bases:
SamplePath
A dictionary of SamplePaths.
Initializes a PathDict instance.
- Parameters:
paths (Mapping[str, SamplePath] | None) – An optional mapping of strings to sample paths.
join (Callable[[list[Tensor]], Tensor] | None) – An optional callable used to combine each path’s outputs.
input_transform (InputTransform | Callable[[Tensor], Tensor] | None) – An optional input transform for the module.
output_transform (OutcomeTransform | Callable[[Tensor], Tensor] | None) – An optional output transform for the module.
- forward(x, **kwargs)[source]¶
Evaluates each sample path at the given inputs; if a join callable was provided, the per-path outputs are combined into a single tensor, otherwise a dictionary of outputs is returned.
- Parameters:
x (Tensor)
kwargs (Any)
- Return type:
Tensor | dict[str, Tensor]
- items()[source]¶
- Return type:
Iterable[tuple[str, SamplePath]]
- values()[source]¶
- Return type:
Iterable[SamplePath]
- class botorch.sampling.pathwise.paths.PathList(paths=None, join=None, input_transform=None, output_transform=None)[source]¶
Bases:
SamplePath
A list of SamplePaths.
Initializes a PathList instance.
- Parameters:
paths (Iterable[SamplePath] | None) – An optional iterable of sample paths.
join (Callable[[list[Tensor]], Tensor] | None) – An optional callable used to combine each path’s outputs.
input_transform (InputTransform | Callable[[Tensor], Tensor] | None) – An optional input transform for the module.
output_transform (OutcomeTransform | Callable[[Tensor], Tensor] | None) – An optional output transform for the module.
- forward(x, **kwargs)[source]¶
Evaluates each sample path at the given inputs; if a join callable was provided, the per-path outputs are combined into a single tensor, otherwise a list of outputs is returned.
- Parameters:
x (Tensor)
kwargs (Any)
- Return type:
Tensor | list[Tensor]
- class botorch.sampling.pathwise.paths.GeneralizedLinearPath(feature_map, weight, bias_module=None, input_transform=None, output_transform=None)[source]¶
Bases:
SamplePath
A sample path in the form of a generalized linear model.
Initializes a GeneralizedLinearPath instance.
path(x) = output_transform(bias_module(z) + feature_map(z)^T weight), where z = input_transform(x).
- Parameters:
feature_map (FeatureMap) – A map used to featurize the module’s inputs.
weight (Parameter | Tensor) – A tensor of weights used to combine input features.
bias_module (Module | None) – An optional module used to define additive offsets.
input_transform (InputTransform | Callable[[Tensor], Tensor] | None) – An optional input transform for the module.
output_transform (OutcomeTransform | Callable[[Tensor], Tensor] | None) – An optional output transform for the module.
- forward(x, **kwargs)[source]¶
Evaluates the path at the given inputs, i.e. output_transform(bias_module(z) + feature_map(z)^T weight) with z = input_transform(x).
- Parameters:
x (Tensor)
- Return type:
Tensor
Pathwise Prior Samplers¶
- botorch.sampling.pathwise.prior_samplers.draw_kernel_feature_paths(model, sample_shape, **kwargs)[source]¶
Draws functions from a Bayesian-linear-model-based approximation to a GP prior.
When evaluated, sample paths produced by this method return Tensors with dimensions sample_dims x batch_dims x [joint_dim], where joint_dim denotes the penultimate dimension of the input tensor. For multioutput models, outputs are returned as the final batch dimension.
- Parameters:
model (GP) – The prior over functions.
sample_shape (Size) – The shape of the sample paths to be drawn.
kwargs (Any)
- Return type:
GeneralizedLinearPath
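A usage sketch (illustrative; toy data, and the shape comment assumes a single-output model without transforms):
>>> import torch
>>> from botorch.models import SingleTaskGP
>>> from botorch.sampling.pathwise.prior_samplers import draw_kernel_feature_paths
>>> train_X = torch.rand(10, 2, dtype=torch.double)
>>> train_Y = train_X.sum(dim=-1, keepdim=True)
>>> model = SingleTaskGP(train_X, train_Y)
>>> paths = draw_kernel_feature_paths(model, sample_shape=torch.Size([16]))
>>> values = paths(torch.rand(5, 2, dtype=torch.double))  # 16 x 5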
Pathwise Posterior Samplers¶
- class botorch.sampling.pathwise.posterior_samplers.MatheronPath(prior_paths, update_paths, input_transform=None, output_transform=None)[source]¶
Bases:
PathDict
Represents function draws from a GP posterior via Matheron’s rule:
"Prior path" v (f | y)(·) = f(·) + Cov(f(·), y) Cov(y, y)^{-1} (y - f(X) - ε), \_______________________________________/ v "Update path"
where = denotes equality in distribution, \(f \sim GP(0, k)\), \(y \sim N(f(X), \Sigma)\), and \(\epsilon \sim N(0, \Sigma)\). For more information, see [wilson2020sampling] and [wilson2021pathwise].
Initializes a MatheronPath instance.
- Parameters:
prior_paths (SamplePath) – Sample paths used to represent the prior.
update_paths (SamplePath) – Sample paths used to represent the data.
input_transform (InputTransform | Callable[[Tensor], Tensor] | None) – An optional input transform for the module.
output_transform (OutcomeTransform | Callable[[Tensor], Tensor] | None) – An optional output transform for the module.
- botorch.sampling.pathwise.posterior_samplers.get_matheron_path_model(model, sample_shape=None)[source]¶
Generates a deterministic model using a single Matheron path drawn from the model’s posterior.
The deterministic model evaluates the output of draw_matheron_paths and reshapes it to mimic the output behavior of the model’s posterior.
- Parameters:
model (GP) – The model whose posterior is to be sampled.
sample_shape (Size | None) – The shape of the sample paths to be drawn, if an ensemble of sample paths is desired. If this is specified, the resulting deterministic model will behave as if the sample_shape is prepended to the batch_shape of the model. The inputs used to evaluate the model must be adjusted to match.
- Returns:
A deterministic model that evaluates the Matheron path.
- Return type:
- botorch.sampling.pathwise.posterior_samplers.draw_matheron_paths(model, sample_shape, prior_sampler=<function draw_kernel_feature_paths>, update_strategy=<function gaussian_update>)[source]¶
Generates function draws from (an approximate) Gaussian process posterior.
When evaluated, sample paths produced by this method return Tensors with dimensions sample_dims x batch_dims x [joint_dim], where joint_dim denotes the penultimate dimension of the input tensor. For multioutput models, outputs are returned as the final batch dimension.
- Parameters:
model (GP) – Gaussian process whose posterior is to be sampled.
sample_shape (Size) – Sizes of sample dimensions.
prior_sampler (Callable[[GP, Size], SamplePath]) – A callable that takes a model and a sample shape and returns a set of sample paths representing the prior.
update_strategy (Callable[[GP, Tensor], SamplePath]) – A callable that takes a model and a tensor of prior process values and returns a set of sample paths representing the data.
- Return type:
MatheronPath
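A usage sketch combining the two functions above (illustrative; toy data, single-output model assumed):
>>> import torch
>>> from botorch.models import SingleTaskGP
>>> from botorch.sampling.pathwise.posterior_samplers import (
...     draw_matheron_paths, get_matheron_path_model,
... )
>>> train_X = torch.rand(10, 2, dtype=torch.double)
>>> train_Y = train_X.sum(dim=-1, keepdim=True)
>>> model = SingleTaskGP(train_X, train_Y)
>>> paths = draw_matheron_paths(model, sample_shape=torch.Size([16]))
>>> samples = paths(torch.rand(5, 2, dtype=torch.double))  # 16 x 5
>>> path_model = get_matheron_path_model(model)  # deterministic model from one path
>>> mean = path_model.posterior(torch.rand(5, 2, dtype=torch.double)).mean  # 5 x 1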
Pathwise Update Strategies¶
- botorch.sampling.pathwise.update_strategies.gaussian_update(model, sample_values, likelihood=<class 'botorch.utils.types.DEFAULT'>, **kwargs)[source]¶
Computes a Gaussian pathwise update in exact arithmetic:
\[(f \mid y)(\cdot) = f(\cdot) + \underbrace{\operatorname{Cov}(f(\cdot), y)\, \operatorname{Cov}(y, y)^{-1}\, (y - f(X) - \epsilon)}_{\text{"Gaussian pathwise update"}},\]
where = denotes equality in distribution, \(f \sim GP(0, k)\), \(y \sim N(f(X), \Sigma)\), and \(\epsilon \sim N(0, \Sigma)\). For more information, see [wilson2020sampling] and [wilson2021pathwise].
- Parameters:
model (GP) – A Gaussian process prior together with a likelihood.
sample_values (Tensor) – Assumed values for \(f(X)\).
likelihood (Likelihood | None) – An optional likelihood used to help define the desired update. Defaults to model.likelihood if it exists else None.
kwargs (Any)
- Return type:
Utilities¶
- class botorch.sampling.pathwise.utils.TransformedModuleMixin[source]¶
Bases:
object
Mixin that wraps a module’s __call__ method with optional transforms.
- input_transform: InputTransform | Callable[[Tensor], Tensor] | None¶
- output_transform: OutcomeTransform | Callable[[Tensor], Tensor] | None¶
- class botorch.sampling.pathwise.utils.TensorTransform(*args, **kwargs)[source]¶
Bases:
ABC, Module
Abstract base class for transforms that map tensor to tensor.
- abstract forward(values, **kwargs)[source]¶
Abstract method; applies the transform to the given tensor of values.
- Parameters:
values (Tensor)
kwargs (Any)
- Return type:
Tensor
- class botorch.sampling.pathwise.utils.ChainedTransform(*transforms)[source]¶
Bases:
TensorTransform
A composition of TensorTransforms.
Initializes a ChainedTransform instance.
- Parameters:
transforms (TensorTransform) – A set of transforms to be applied from right to left.
- forward(values)[source]¶
Applies the chained transforms to the given values, from right to left.
- Parameters:
values (Tensor)
- Return type:
Tensor
- class botorch.sampling.pathwise.utils.SineCosineTransform(scale=None)[source]¶
Bases:
TensorTransform
A transform that returns concatenated sine and cosine features.
Initializes a SineCosineTransform instance.
- Parameters:
scale (Tensor | None) – An optional tensor used to rescale the module’s outputs.
- forward(values)[source]¶
Returns the concatenated sine and cosine features of the given values, optionally rescaled.
- Parameters:
values (Tensor)
- Return type:
Tensor
- class botorch.sampling.pathwise.utils.InverseLengthscaleTransform(kernel)[source]¶
Bases:
TensorTransform
A transform that divides its inputs by a kernel’s lengthscales.
Initializes an InverseLengthscaleTransform instance.
- Parameters:
kernel (Kernel) – The kernel whose lengthscales are to be used.
- forward(values)[source]¶
Divides the given values by the kernel’s lengthscales.
- Parameters:
values (Tensor)
- Return type:
Tensor
- class botorch.sampling.pathwise.utils.OutputscaleTransform(kernel)[source]¶
Bases:
TensorTransform
A transform that multiplies its inputs by the square root of a kernel’s outputscale.
Initializes an OutputscaleTransform instance.
- Parameters:
kernel (ScaleKernel) – A ScaleKernel whose outputscale is to be used.
- forward(values)[source]¶
Multiplies the given values by the square root of the kernel’s outputscale.
- Parameters:
values (Tensor)
- Return type:
Tensor
- class botorch.sampling.pathwise.utils.FeatureSelector(indices, dim=-1)[source]¶
Bases:
TensorTransform
A transform that returns a subset of its input’s features along a given tensor dimension.
Initializes a FeatureSelector instance.
- Parameters:
indices (Iterable[int]) – A LongTensor of feature indices.
dim (int | LongTensor) – The dimension along which to index features.
- forward(values)[source]¶
Selects the configured feature indices from the given values along the specified dimension.
- Parameters:
values (Tensor)
- Return type:
Tensor
- class botorch.sampling.pathwise.utils.OutcomeUntransformer(transform, num_outputs)[source]¶
Bases:
TensorTransform
Module acting as a bridge for OutcomeTransform.untransform.
Initializes an OutcomeUntransformer instance.
- Parameters:
transform (OutcomeTransform) – The wrapped OutcomeTransform instance.
num_outputs (int | LongTensor) – The number of outcome features that the OutcomeTransform transforms.
- forward(values)[source]¶
Applies the wrapped OutcomeTransform’s untransform to the given values.
- Parameters:
values (Tensor)
- Return type:
Tensor
- botorch.sampling.pathwise.utils.get_input_transform(model)[source]¶
Returns a model’s input_transform or None.
- Parameters:
model (GPyTorchModel)
- Return type:
InputTransform | None
- botorch.sampling.pathwise.utils.get_output_transform(model)[source]¶
Returns a wrapped version of a model’s outcome_transform or None.
- Parameters:
model (GPyTorchModel)
- Return type:
OutcomeUntransformer | None
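A sketch of the two helpers on a model with transforms (illustrative; the toy data and transform choices are assumptions):
>>> import torch
>>> from botorch.models import SingleTaskGP
>>> from botorch.models.transforms import Normalize, Standardize
>>> from botorch.sampling.pathwise.utils import get_input_transform, get_output_transform
>>> train_X = torch.rand(8, 2, dtype=torch.double)
>>> train_Y = train_X.sum(dim=-1, keepdim=True)
>>> model = SingleTaskGP(
...     train_X, train_Y,
...     input_transform=Normalize(d=2),
...     outcome_transform=Standardize(m=1),
... )
>>> get_input_transform(model)  # the model's Normalize instance
>>> get_output_transform(model)  # an OutcomeUntransformer wrapping Standardize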