@@ -42,7 +42,7 @@ package indices, you can still use `pip install`, but some
 [additional options](https://pytorch.org/get-started/locally/) are needed:
 
 ```shell
-pip install torch --extra-index-url https://download.pytorch.org/whl/cu113
+pip install torch --index-url https://download.pytorch.org/whl/cu118
 ```
 
 [^1]:
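The wheel index URL in the command above follows a fixed pattern: the computation backend string (`cpu`, `cu113`, `cu118`, ...) is the last path component under `https://download.pytorch.org/whl/`. A minimal sketch of that mapping — the `backend` variable here is purely illustrative:

```shell
# The PyTorch wheel indices are keyed by computation backend; the backend
# string is simply appended to the base URL (illustrative sketch, not an
# official API).
backend=cu118
echo "https://download.pytorch.org/whl/${backend}"
# → https://download.pytorch.org/whl/cu118
```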
@@ -51,14 +51,11 @@ pip install torch --extra-index-url https://download.pytorch.org/whl/cu113
 
 While this is certainly an improvement, it still has a few downsides:
 
-1. You need to know what computation backend, e.g. CUDA 11.3 (`cu113`), is supported on
+1. You need to know what computation backend, e.g. CUDA 11.8 (`cu118`), is supported on
    your local machine. This can be quite challenging for new users and at least tedious
    for more experienced ones.
 2. Besides the stable binaries, PyTorch also offers nightly and test ones. To install
-   them, you need a different `--extra-index-url` for each.
-3. For the nightly and test channel you also need to supply the `--pre` option. Failing
-   to do so, will pull the stable binary from PyPI even if the rest of the installation
-   command is correct.
+   them, you need a different `--index-url` for each.
 
 If any of these points don't sound appealing to you, and you just want to have the same
 user experience as `pip install` for PyTorch distributions, `light-the-torch` was made
@@ -96,11 +93,11 @@ In fact, `ltt` is `pip` with a few added options:
   the computation backend you want to use:
 
   ```shell
-  ltt install --pytorch-computation-backend=cu102 torch torchvision torchaudio
+  ltt install --pytorch-computation-backend=cu121 torch torchvision torchaudio
  ```
 
   Borrowing from the mutex packages that PyTorch provides for `conda` installations,
-  `--cpuonly` is available as shorthand for `--pytorch-computation-backend=cu102`.
+  `--cpuonly` is available as shorthand for `--pytorch-computation-backend=cpu`.
 
   In addition, the computation backend to be installed can also be set through the
   `LTT_PYTORCH_COMPUTATION_BACKEND` environment variable. It will only be honored in
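As a brief illustration of the environment variable mentioned in the hunk above — assuming it accepts the same backend strings as `--pytorch-computation-backend` (e.g. `cpu`, `cu121`; the value shown is only an example):

```shell
# Set the computation backend for subsequent ltt invocations via the
# environment instead of the command line option (example value).
export LTT_PYTORCH_COMPUTATION_BACKEND=cpu
echo "$LTT_PYTORCH_COMPUTATION_BACKEND"
# → cpu
```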
@@ -113,8 +110,8 @@ In fact, `ltt` is `pip` with a few added options:
   ltt install --pytorch-channel=nightly torch torchvision torchaudio
   ```
 
-  If `--pytorch-channel` is not passed, using `pip`'s builtin `--pre` option will
-  install PyTorch test binaries.
+  If `--pytorch-channel` is not passed, using `pip`'s builtin `--pre` option implies
+  `--pytorch-channel=test`.
 
 Of course, you are not limited to install only PyTorch distributions. Everything shown
 above also works if you install packages that depend on PyTorch:
@@ -133,8 +130,8 @@ specific tasks.
133
130
134
131
- While searching for a download link for a PyTorch distribution, ` light-the-torch `
135
132
replaces the default search index with an official PyTorch download link. This is
136
- equivalent to calling ` pip install ` with the ` --extra- index-url ` option only for
137
- PyTorch distributions.
133
+ equivalent to calling ` pip install ` with the ` --index-url ` option only for PyTorch
134
+ distributions.
138
135
- While evaluating possible PyTorch installation candidates, ` light-the-torch ` culls
139
136
binaries incompatible with the hardware.
140
137
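The culling step in the second bullet above can be sketched roughly as follows; the wheel names, the supported-backend list, and the filtering logic are hypothetical stand-ins, not `light-the-torch`'s actual implementation:

```shell
# Illustrative sketch: given the backends the local hardware supports,
# keep only candidate wheels whose local version tag (the part after '+')
# matches one of them. All names here are hypothetical.
supported="cpu cu118"
candidates="torch-2.0.1+cpu torch-2.0.1+cu118 torch-2.0.1+cu121"
for wheel in $candidates; do
  backend="${wheel##*+}"  # strip everything up to and including the '+'
  case " $supported " in
    *" $backend "*) echo "keep $wheel" ;;
    *) echo "drop $wheel" ;;
  esac
done
# → keep torch-2.0.1+cpu
# → keep torch-2.0.1+cu118
# → drop torch-2.0.1+cu121
```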
@@ -144,16 +141,18 @@ A project as large as PyTorch is attractive for malicious actors given the large
 base. For example in December 2022, PyTorch was hit by a
 [supply chain attack](https://pytorch.org/blog/compromised-nightly-dependency/) that
 potentially extracted user information. The PyTorch team mitigated the attack as soon as
-it was detected by temporarily hosting all third party dependencies for the nightly
-Linux releases on their own indices. With that,
+it was detected by temporarily hosting all third party dependencies on their own
+indices. With that,
 `pip install torch --extra-index-url https://download.pytorch.org/whl/cpu` wouldn't pull
-anything from PyPI and thus avoiding malicious packages placed there.
+anything from PyPI and thus avoiding malicious packages placed there. Ultimately, this
+became the permanent solution and the official installation instructions now use
+`--index-url`, preventing installation of anything not hosted on their indices.
 
-However, due to `light-the-torch`'s index patching, this mitigation would have been
+However, due to `light-the-torch`'s index patching, this mitigation was initially
 completely circumvented since only PyTorch distributions would have been installed from
 the PyTorch indices. Since version `0.7.0`, `light-the-torch` will only pull third-party
-dependencies for nightly Linux PyTorch releases from PyPI in case they are specifically
-requested and pinned. For example `ltt install --pytorch-channel=nightly torch` and
+dependencies from PyPI in case they are specifically requested and pinned. For example
+`ltt install --pytorch-channel=nightly torch` and
 `ltt install --pytorch-channel=nightly torch sympy` will install everything from the
 PyTorch indices. However, if you pin a third party dependency, e.g.
 `ltt install --pytorch-channel=nightly torch sympy==1.11.1`, it will be pulled from PyPI
@@ -162,7 +161,8 @@ regardless of whether the version matches the one on the PyTorch index.
 In summary, `light-the-torch` is usually as safe as the regular PyTorch installation
 instructions. However, attacks on the supply chain can lead to situations where
 `light-the-torch` circumvents mitigations done by the PyTorch team. Unfortunately,
-`light-the-torch` is not officially supported and thus also not tested by them.
+`light-the-torch` is not officially supported by PyTorch and thus also not tested by
+them.
 
 ## How do I contribute?
 