Commit 19e77b7

update README (#144)
1 parent ff84146 commit 19e77b7

2 files changed: +21 −21 lines changed


README.md

+19 −19
@@ -42,7 +42,7 @@ package indices, you can still use `pip install`, but some
 [additional options](https://pytorch.org/get-started/locally/) are needed:
 
 ```shell
-pip install torch --extra-index-url https://download.pytorch.org/whl/cu113
+pip install torch --index-url https://download.pytorch.org/whl/cu118
 ```
 
 [^1]:
@@ -51,14 +51,11 @@ pip install torch --extra-index-url https://download.pytorch.org/whl/cu113
 
 While this is certainly an improvement, it still has a few downsides:
 
-1. You need to know what computation backend, e.g. CUDA 11.3 (`cu113`), is supported on
+1. You need to know what computation backend, e.g. CUDA 11.8 (`cu118`), is supported on
    your local machine. This can be quite challenging for new users and at least tedious
    for more experienced ones.
 2. Besides the stable binaries, PyTorch also offers nightly and test ones. To install
-   them, you need a different `--extra-index-url` for each.
-3. For the nightly and test channel you also need to supply the `--pre` option. Failing
-   to do so, will pull the stable binary from PyPI even if the rest of the installation
-   command is correct.
+   them, you need a different `--index-url` for each.
 
 If any of these points don't sound appealing to you, and you just want to have the same
 user experience as `pip install` for PyTorch distributions, `light-the-torch` was made
@@ -96,11 +93,11 @@ In fact, `ltt` is `pip` with a few added options:
   the computation backend you want to use:
 
   ```shell
-  ltt install --pytorch-computation-backend=cu102 torch torchvision torchaudio
+  ltt install --pytorch-computation-backend=cu121 torch torchvision torchaudio
   ```
 
   Borrowing from the mutex packages that PyTorch provides for `conda` installations,
-  `--cpuonly` is available as shorthand for `--pytorch-computation-backend=cu102`.
+  `--cpuonly` is available as shorthand for `--pytorch-computation-backend=cpu`.
 
   In addition, the computation backend to be installed can also be set through the
   `LTT_PYTORCH_COMPUTATION_BACKEND` environment variable. It will only be honored in
@@ -113,8 +110,8 @@ In fact, `ltt` is `pip` with a few added options:
   ltt install --pytorch-channel=nightly torch torchvision torchaudio
   ```
 
-  If `--pytorch-channel` is not passed, using `pip`'s builtin `--pre` option will
-  install PyTorch test binaries.
+  If `--pytorch-channel` is not passed, using `pip`'s builtin `--pre` option implies
+  `--pytorch-channel=test`.
 
 Of course, you are not limited to install only PyTorch distributions. Everything shown
 above also works if you install packages that depend on PyTorch:
@@ -133,8 +130,8 @@ specific tasks.
 
 - While searching for a download link for a PyTorch distribution, `light-the-torch`
   replaces the default search index with an official PyTorch download link. This is
-  equivalent to calling `pip install` with the `--extra-index-url` option only for
-  PyTorch distributions.
+  equivalent to calling `pip install` with the `--index-url` option only for PyTorch
+  distributions.
 - While evaluating possible PyTorch installation candidates, `light-the-torch` culls
   binaries incompatible with the hardware.
 

@@ -144,16 +141,18 @@ A project as large as PyTorch is attractive for malicious actors given the large
 base. For example in December 2022, PyTorch was hit by a
 [supply chain attack](https://pytorch.org/blog/compromised-nightly-dependency/) that
 potentially extracted user information. The PyTorch team mitigated the attack as soon as
-it was detected by temporarily hosting all third party dependencies for the nightly
-Linux releases on their own indices. With that,
+it was detected by temporarily hosting all third party dependencies on their own
+indices. With that,
 `pip install torch --extra-index-url https://download.pytorch.org/whl/cpu` wouldn't pull
-anything from PyPI and thus avoiding malicious packages placed there.
+anything from PyPI and thus avoiding malicious packages placed there. Ultimately, this
+became the permanent solution and the official installation instructions now use
+`--index-url` and thus preventing installing anything not hosted on their indices.
 
-However, due to `light-the-torch`'s index patching, this mitigation would have been
+However, due to `light-the-torch`'s index patching, this mitigation was initially
 completely circumvented since only PyTorch distributions would have been installed from
 the PyTorch indices. Since version `0.7.0`, `light-the-torch` will only pull third-party
-dependencies for nightly Linux PyTorch releases from PyPI in case they are specifically
-requested and pinned. For example `ltt install --pytorch-channel=nightly torch` and
+dependencies from PyPI in case they are specifically requested and pinned. For example
+`ltt install --pytorch-channel=nightly torch` and
 `ltt install --pytorch-channel=nightly torch sympy` will install everything from the
 PyTorch indices. However, if you pin a third party dependency, e.g.
 `ltt install --pytorch-channel=nightly torch sympy==1.11.1`, it will be pulled from PyPI
@@ -162,7 +161,8 @@ regardless of whether the version matches the one on the PyTorch index.
 In summary, `light-the-torch` is usually as safe as the regular PyTorch installation
 instructions. However, attacks on the supply chain can lead to situations where
 `light-the-torch` circumvents mitigations done by the PyTorch team. Unfortunately,
-`light-the-torch` is not officially supported and thus also not tested by them.
+`light-the-torch` is not officially supported by PyTorch and thus also not tested by
+them.
 
 ## How do I contribute?
 
light_the_torch/_patch.py

+2 −2
@@ -123,9 +123,9 @@ def computation_backend_parser_options():
         "--pytorch-computation-backend",
         help=(
             "Computation backend for compiled PyTorch distributions, "
-            "e.g. 'cu102', 'cu115', or 'cpu'. "
+            "e.g. 'cu118', 'cu121', or 'cpu'. "
             "Multiple computation backends can be passed as a comma-separated "
-            "list, e.g 'cu102,cu113,cu116'. "
+            "list, e.g 'cu118,cu121'. "
             "If not specified, the computation backend is detected from the "
             "available hardware, preferring CUDA over CPU."
         ),
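The help text above says multiple computation backends can be passed as a comma-separated list such as `cu118,cu121`. A minimal parsing sketch might look like the following; the helper name `parse_computation_backends` is made up for illustration and is not the library's actual parser:

```python
def parse_computation_backends(value):
    # Split a comma-separated backend string such as "cu118,cu121"
    # into individual backend identifiers, dropping surrounding
    # whitespace and empty entries.
    return [backend.strip() for backend in value.split(",") if backend.strip()]
```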

0 commit comments
