Version: v0.14.0

BoTorch Tutorials

The tutorials here will help you understand and use BoTorch in your own work. They assume that you are familiar with both Bayesian optimization (BO) and PyTorch.

  • If you are new to BO, we recommend you start with the Ax docs and the following tutorial paper.
  • If you are new to PyTorch, the easiest way to get started is with the What is PyTorch? tutorial.

Using BoTorch with Ax

For practitioners who are interested in running experiments to optimize various objectives using Bayesian optimization, we recommend using Ax rather than BoTorch. Ax provides a user-friendly interface for experiment configuration and orchestration, while choosing an appropriate Bayesian optimization algorithm to optimize the given objective, following BoTorch best practices.

For researchers who are interested in running experiments with custom BoTorch models and acquisition functions, Ax's Modular BoTorch Interface offers a convenient way to leverage custom BoTorch objects while utilizing Ax's experiment configuration and orchestration. Check out the Modular BoTorch tutorial to learn how to use custom BoTorch objects in Ax! See this documentation for additional information.

Full Optimization Loops

In some situations (e.g. when working in a non-standard setting, or when you want to understand and control the details of the BO loop), you may prefer to work purely in BoTorch. The tutorials in this section illustrate this approach.

Bite-Sized Tutorials

Rather than guiding you through full end-to-end BO loops, the tutorials in this section focus on specific tasks that you will encounter when customizing your BO algorithms. For instance, you may want to write a custom acquisition function and then use a custom zeroth-order optimizer to optimize it.

Advanced Usage

Tutorials in this section showcase more advanced ways of using BoTorch. For instance, this tutorial shows how to perform BO when the objective is defined over images, by optimizing in the latent space of a variational auto-encoder (VAE).