<img width="300" src="./botorch_logo_lockup.svg" alt="BoTorch Logo" />
</a>

[![CircleCI](https://circleci.com/gh/pytorch/botorch.svg?style=shield&circle-token=19c388387063692b6d33eecc243c8990a43ae655)](https://circleci.com/gh/pytorch/botorch)
[![Conda](https://img.shields.io/conda/v/pytorch/botorch.svg)](https://anaconda.org/pytorch/botorch)
[![PyPI](https://img.shields.io/pypi/v/botorch.svg)](https://pypi.org/project/botorch)
[![License](https://img.shields.io/badge/license-MIT-green.svg)](LICENSE.md)

BoTorch is a library for Bayesian Optimization built on PyTorch.

*BoTorch is currently in beta and under active development!*

#### Why BoTorch?

BoTorch
* Provides a modular and easily extensible interface for composing Bayesian
  optimization primitives, including probabilistic models, acquisition functions,
  [...]
  Processes (GPs), deep kernel learning, deep GPs, and approximate inference.

#### Target Audience

The primary audience for hands-on use of BoTorch consists of researchers and
sophisticated practitioners in Bayesian Optimization and AI.

We recommend using BoTorch as a low-level API for implementing new algorithms
for [Ax](https://ax.dev). Ax has been designed to be an easy-to-use platform
for end-users, which at the same time is flexible enough for Bayesian
Optimization researchers to plug into for handling of feature transformations,
(meta-)data management, storage, etc.

We recommend that end-users who are not actively doing research on Bayesian
Optimization simply use Ax.


## Installation

**Installation Requirements**
- Python >= 3.6
- PyTorch >= 1.1
- gpytorch >= 0.3.2
- scipy
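
If you are unsure what is already available in your environment, a quick check like the following (a minimal sketch that just prints the relevant version strings) can save some debugging:
```python
import sys

import gpytorch
import scipy
import torch

# Compare these against the minimum versions listed above.
print("Python  ", sys.version.split()[0])
print("PyTorch ", torch.__version__)
print("GPyTorch", gpytorch.__version__)
print("SciPy   ", scipy.__version__)
```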


#### Installing BoTorch

##### Installing the latest release

The latest release of BoTorch is easily installed either via
[Anaconda](https://www.anaconda.com/distribution/#download-section) (recommended):
```bash
conda install botorch -c pytorch
```
or via `pip`:
```bash
pip install botorch
```
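
Either way, a quick smoke test confirms the install. This is a minimal sketch; if your installed version does not expose a `__version__` attribute, a bare `import botorch` is already a useful check:
```python
import botorch

# A successful import means the package is installed; the version string is a bonus.
print(botorch.__version__)
```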

**Important note for MacOS users:**
* You will want to make sure your PyTorch build is linked against MKL (the
  non-optimized version of BoTorch can be up to an order of magnitude slower in
  [...]
  consult the PyTorch installation instructions above.
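
To check whether your PyTorch build is in fact linked against MKL, you can query PyTorch directly (a minimal sketch using the `torch.backends.mkl` flag):
```python
import torch

# True if this PyTorch build is linked against Intel MKL.
print(torch.backends.mkl.is_available())
```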

##### Installing from latest master

If you'd like to try our bleeding edge features (and don't mind potentially
running into the occasional bug here or there), you can install the latest
master directly from GitHub (this will also require installing the current
GPyTorch master):
```bash
pip install git+https://github.com/cornellius-gp/gpytorch.git
pip install git+https://github.com/pytorch/botorch.git
```

**Manual / Dev install**

Alternatively, you can do a manual install. For a basic install, run:
```bash
git clone https://github.com/pytorch/botorch.git
cd botorch
pip install -e .
```

To customize the installation, you can also run the following variants of the
above:
* `pip install -e .[dev]`: Also installs all tools necessary for development
  (testing, linting, docs building; see [Contributing](#contributing) below).
* `pip install -e .[tutorials]`: Also installs all packages necessary for running
  the tutorial notebooks.


## Getting Started

Here's a quick rundown of the main components of a Bayesian optimization loop.
For more details see our [Documentation](https://botorch.org/docs) and the
[Tutorials](https://botorch.org/tutorials).

1. Fit a Gaussian Process model to data
```python
import torch
from botorch.models import SingleTaskGP
from botorch.fit import fit_gpytorch_model
from gpytorch.mlls import ExactMarginalLogLikelihood

# Synthetic training data: 10 points in [0, 1]^2 with a noisy objective that
# peaks at the center of the cube.
train_X = torch.rand(10, 2)
Y = 1 - torch.norm(train_X - 0.5, dim=-1) + 0.1 * torch.rand(10)
train_Y = (Y - Y.mean()) / Y.std()  # standardize the outcomes

# Fit a single-output exact GP by maximizing the marginal log likelihood.
gp = SingleTaskGP(train_X, train_Y)
mll = ExactMarginalLogLikelihood(gp.likelihood, gp)
fit_gpytorch_model(mll)
```

2. Construct an acquisition function
```python
from botorch.acquisition import UpperConfidenceBound

# Upper Confidence Bound acquisition function on the fitted model.
UCB = UpperConfidenceBound(gp, beta=0.1)
```

3. Optimize the acquisition function
```python
from botorch.optim import joint_optimize

# Optimize the acquisition function over the unit square to obtain a new candidate.
bounds = torch.stack([torch.zeros(2), torch.ones(2)])
candidate = joint_optimize(
    UCB, bounds=bounds, q=1, num_restarts=5, raw_samples=20,
)
```
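
Putting the three steps together, a full Bayesian optimization loop simply alternates between evaluating the suggested candidate, refitting the model on the augmented data, and re-optimizing the acquisition function. The sketch below is illustrative only: `objective` is a hypothetical stand-in for whatever expensive function you are actually optimizing, and the data handling mirrors step 1 above.
```python
def objective(X):
    # Hypothetical stand-in for the true, expensive objective function.
    return 1 - torch.norm(X - 0.5, dim=-1)

for _ in range(10):
    # Evaluate the suggested candidate and append it to the training data.
    new_Y = objective(candidate)
    train_X = torch.cat([train_X, candidate])
    Y = torch.cat([Y, new_Y])
    train_Y = (Y - Y.mean()) / Y.std()

    # Refit the model and generate the next candidate.
    gp = SingleTaskGP(train_X, train_Y)
    mll = ExactMarginalLogLikelihood(gp.likelihood, gp)
    fit_gpytorch_model(mll)
    UCB = UpperConfidenceBound(gp, beta=0.1)
    candidate = joint_optimize(
        UCB, bounds=bounds, q=1, num_restarts=5, raw_samples=20,
    )
```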


## Contributing
See the [CONTRIBUTING](CONTRIBUTING.md) file for how to help out.

## License
BoTorch is MIT licensed, as found in the [LICENSE](LICENSE.md) file.