README.md
[](https://codecov.io/gh/pmeier/light-the-torch)
`light-the-torch` is a small utility that wraps `pip` to ease the installation process
for PyTorch distributions like `torch`, `torchvision`, `torchaudio`, and so on as well
as third-party packages that depend on them. It auto-detects compatible CUDA versions
from the local setup and installs the correct PyTorch binaries without user
interference.
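As a quick sketch of what that looks like in practice: `light-the-torch` ships an `ltt` command that mirrors `pip`'s interface (the exact invocation below follows the project's documented CLI; your environment may differ):

```shell
# install the wrapper itself; this provides the `ltt` command
pip install light-the-torch

# like `pip install`, but resolves a PyTorch binary compatible
# with the locally detected CUDA setup
ltt install torch torchvision
```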
- [Why do I need it?](#why-do-i-need-it)
- [How do I install it?](#how-do-i-install-it)
## Why do I need it?
PyTorch distributions like `torch`, `torchvision`, `torchaudio`, and so on are fully
`pip install`'able, but PyPI, the default `pip` search index, has some limitations:
1. PyPI regularly only allows binaries up to a size of
   [approximately 60 MB](https://github.com/pypa/packaging-problems/issues/86). One can
   […] hand your NVIDIA driver version simply doesn't support the CUDA version the
   binary was compiled with, you can't use any of the GPU features.
To overcome this, PyTorch also hosts _most_[^1] binaries
[on their own package indices](https://download.pytorch.org/whl). To access PyTorch's
package indices, you can still use `pip install`, but some
[additional options](https://pytorch.org/get-started/locally/) are needed:
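For example, pulling a CUDA build directly from PyTorch's index looks roughly like this (the index URL and `cu121` tag are illustrative; check the linked page for the options matching your setup):

```shell
# point pip at PyTorch's own index instead of PyPI;
# the /whl/cu121 suffix selects CUDA 12.1 builds here
pip install torch --index-url https://download.pytorch.org/whl/cu121
```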