Commit 8c33979

Update 2.3 readme (#6957)

1 parent 7674030 commit 8c33979

1 file changed: README.md (3 additions, 155 deletions)
````diff
@@ -14,14 +14,7 @@ TPUs](https://cloud.google.com/tpu/). You can try it right now, for free, on a
 single Cloud TPU VM with
 [Kaggle](https://www.kaggle.com/discussions/product-feedback/369338)!
 
-Take a look at one of our [Kaggle
-notebooks](https://github.com/pytorch/xla/tree/master/contrib/kaggle) to get
-started:
-
-* [Stable Diffusion with PyTorch/XLA
-  2.0](https://github.com/pytorch/xla/blob/master/contrib/kaggle/pytorch-xla-2-0-on-kaggle.ipynb)
-* [Distributed PyTorch/XLA
-  Basics](https://github.com/pytorch/xla/blob/master/contrib/kaggle/distributed-pytorch-xla-basics-with-pjrt.ipynb)
+Please find tutorials on our [GitHub page](https://github.com/pytorch/xla) for the latest release.
 
 ## Installation
 
@@ -148,153 +141,8 @@ Our comprehensive user guides are available at:
 
 ## Available docker images and wheels
 
-### Python packages
-
-PyTorch/XLA releases starting with version r2.1 will be available on PyPI. You
-can now install the main build with `pip install torch_xla`. To also install the
-Cloud TPU plugin, install the optional `tpu` dependencies:
-
-```
-pip install torch_xla[tpu] -f https://storage.googleapis.com/libtpu-releases/index.html
-```
-
-GPU, XRT (legacy runtime), and nightly builds are available in our public GCS
-bucket.
-
-| Version | Cloud TPU/GPU VMs Wheel |
-| --- | ----------- |
-| 2.2 (Python 3.8) | `https://storage.googleapis.com/pytorch-xla-releases/wheels/tpuvm/torch_xla-2.2.0-cp38-cp38-manylinux_2_28_x86_64.whl` |
-| 2.2 (Python 3.10) | `https://storage.googleapis.com/pytorch-xla-releases/wheels/tpuvm/torch_xla-2.2.0-cp310-cp310-manylinux_2_28_x86_64.whl` |
-| 2.2 (CUDA 12.1 + Python 3.8) | `https://storage.googleapis.com/pytorch-xla-releases/wheels/cuda/12.1/torch_xla-2.2.0-cp38-cp38-manylinux_2_28_x86_64.whl` |
-| 2.2 (CUDA 12.1 + Python 3.10) | `https://storage.googleapis.com/pytorch-xla-releases/wheels/cuda/12.1/torch_xla-2.2.0-cp310-cp310-manylinux_2_28_x86_64.whl` |
-| nightly (Python 3.8) | `https://storage.googleapis.com/pytorch-xla-releases/wheels/tpuvm/torch_xla-nightly-cp38-cp38-linux_x86_64.whl` |
-| nightly (Python 3.10) | `https://storage.googleapis.com/pytorch-xla-releases/wheels/tpuvm/torch_xla-nightly-cp310-cp310-linux_x86_64.whl` |
-| nightly (CUDA 12.1 + Python 3.8) | `https://storage.googleapis.com/pytorch-xla-releases/wheels/cuda/12.1/torch_xla-nightly-cp38-cp38-linux_x86_64.whl` |
-
-<details>
-
-<summary>older versions</summary>
-
-| Version | Cloud TPU VMs Wheel |
-|---------|-------------------|
-| 2.1 (XRT + Python 3.10) | `https://storage.googleapis.com/pytorch-xla-releases/wheels/xrt/tpuvm/torch_xla-2.1.0%2Bxrt-cp310-cp310-manylinux_2_28_x86_64.whl` |
-| 2.1 (Python 3.8) | `https://storage.googleapis.com/tpu-pytorch/wheels/tpuvm/torch_xla-2.1-cp38-cp38-linux_x86_64.whl` |
-| 2.0 (Python 3.8) | `https://storage.googleapis.com/tpu-pytorch/wheels/tpuvm/torch_xla-2.0-cp38-cp38-linux_x86_64.whl` |
-| 1.13 | `https://storage.googleapis.com/tpu-pytorch/wheels/tpuvm/torch_xla-1.13-cp38-cp38-linux_x86_64.whl` |
-| 1.12 | `https://storage.googleapis.com/tpu-pytorch/wheels/tpuvm/torch_xla-1.12-cp38-cp38-linux_x86_64.whl` |
-| 1.11 | `https://storage.googleapis.com/tpu-pytorch/wheels/tpuvm/torch_xla-1.11-cp38-cp38-linux_x86_64.whl` |
-| 1.10 | `https://storage.googleapis.com/tpu-pytorch/wheels/tpuvm/torch_xla-1.10-cp38-cp38-linux_x86_64.whl` |
-
-<br/>
-
-Note: For TPU Pod customers using XRT (our legacy runtime), we have custom
-wheels for `torch` and `torch_xla` at
-`https://storage.googleapis.com/tpu-pytorch/wheels/xrt`.
-
-| Package | Cloud TPU VMs Wheel (XRT on Pod, Legacy Only) |
-| --- | ----------- |
-| torch_xla | `https://storage.googleapis.com/pytorch-xla-releases/wheels/xrt/tpuvm/torch_xla-2.1.0%2Bxrt-cp310-cp310-manylinux_2_28_x86_64.whl` |
-| torch | `https://storage.googleapis.com/pytorch-xla-releases/wheels/xrt/tpuvm/torch-2.1.0%2Bxrt-cp310-cp310-linux_x86_64.whl` |
-
-<br/>
-
-| Version | GPU Wheel + Python 3.8 |
-| --- | ----------- |
-| 2.1+ CUDA 11.8 | `https://storage.googleapis.com/pytorch-xla-releases/wheels/cuda/11.8/torch_xla-2.1.0-cp38-cp38-manylinux_2_28_x86_64.whl` |
-| 2.0 + CUDA 11.8 | `https://storage.googleapis.com/tpu-pytorch/wheels/cuda/118/torch_xla-2.0-cp38-cp38-linux_x86_64.whl` |
-| 2.0 + CUDA 11.7 | `https://storage.googleapis.com/tpu-pytorch/wheels/cuda/117/torch_xla-2.0-cp38-cp38-linux_x86_64.whl` |
-| 1.13 | `https://storage.googleapis.com/tpu-pytorch/wheels/cuda/112/torch_xla-1.13-cp38-cp38-linux_x86_64.whl` |
-| nightly + CUDA 12.0 >= 2023/06/27| `https://storage.googleapis.com/pytorch-xla-releases/wheels/cuda/12.0/torch_xla-nightly-cp38-cp38-linux_x86_64.whl` |
-| nightly + CUDA 11.8 <= 2023/04/25| `https://storage.googleapis.com/tpu-pytorch/wheels/cuda/118/torch_xla-nightly-cp38-cp38-linux_x86_64.whl` |
-| nightly + CUDA 11.8 >= 2023/04/25| `https://storage.googleapis.com/pytorch-xla-releases/wheels/cuda/11.8/torch_xla-nightly-cp38-cp38-linux_x86_64.whl` |
-
-<br/>
-
-| Version | GPU Wheel + Python 3.7 |
-| --- | ----------- |
-| 1.13 | `https://storage.googleapis.com/tpu-pytorch/wheels/cuda/112/torch_xla-1.13-cp37-cp37m-linux_x86_64.whl` |
-| 1.12 | `https://storage.googleapis.com/tpu-pytorch/wheels/cuda/112/torch_xla-1.12-cp37-cp37m-linux_x86_64.whl` |
-| 1.11 | `https://storage.googleapis.com/tpu-pytorch/wheels/cuda/112/torch_xla-1.11-cp37-cp37m-linux_x86_64.whl` |
-| nightly | `https://storage.googleapis.com/tpu-pytorch/wheels/cuda/112/torch_xla-nightly-cp37-cp37-linux_x86_64.whl` |
-
-<br/>
-
-| Version | Colab TPU Wheel |
-| --- | ----------- |
-| 2.0 | `https://storage.googleapis.com/tpu-pytorch/wheels/colab/torch_xla-2.0-cp310-cp310-linux_x86_64.whl` |
-
-You can also add `+yyyymmdd` after `torch_xla-nightly` to get the nightly wheel
-of a specified date. To get the companion pytorch and torchvision nightly wheel,
-replace the `torch_xla` with `torch` or `torchvision` on above wheel links.
-
-#### Installing libtpu (before PyTorch/XLA 2.0)
-
-For PyTorch/XLA release r2.0 and older and when developing PyTorch/XLA, install
-the `libtpu` pip package with the following command:
-
-```
-pip3 install torch_xla[tpuvm]
-```
-
-This is only required on Cloud TPU VMs.
-
-</details>
-
-### Docker
-
-| Version | Cloud TPU VMs Docker |
-| --- | ----------- |
-| 2.2 | `us-central1-docker.pkg.dev/tpu-pytorch-releases/docker/xla:r2.2.0_3.10_tpuvm` |
-| 2.1 | `us-central1-docker.pkg.dev/tpu-pytorch-releases/docker/xla:r2.1.0_3.10_tpuvm` |
-| 2.0 | `gcr.io/tpu-pytorch/xla:r2.0_3.8_tpuvm` |
-| 1.13 | `gcr.io/tpu-pytorch/xla:r1.13_3.8_tpuvm` |
-| nightly python | `us-central1-docker.pkg.dev/tpu-pytorch-releases/docker/xla:nightly_3.10_tpuvm` |
-
-<br/>
-
-| Version | GPU CUDA 12.1 Docker |
-| --- | ----------- |
-| 2.2 | `us-central1-docker.pkg.dev/tpu-pytorch-releases/docker/xla:r2.2.0_3.10_cuda_12.1` |
-| 2.1 | `us-central1-docker.pkg.dev/tpu-pytorch-releases/docker/xla:r2.1.0_3.10_cuda_12.1` |
-| nightly | `us-central1-docker.pkg.dev/tpu-pytorch-releases/docker/xla:nightly_3.8_cuda_12.1` |
-| nightly at date | `us-central1-docker.pkg.dev/tpu-pytorch-releases/docker/xla:nightly_3.8_cuda_12.1_YYYYMMDD` |
-
-<br/>
-
-| Version | GPU CUDA 11.8 + Docker |
-| --- | ----------- |
-| 2.1 | `us-central1-docker.pkg.dev/tpu-pytorch-releases/docker/xla:r2.1.0_3.10_cuda_11.8` |
-| 2.0 | `gcr.io/tpu-pytorch/xla:r2.0_3.8_cuda_11.8` |
-| nightly | `us-central1-docker.pkg.dev/tpu-pytorch-releases/docker/xla:nightly_3.8_cuda_11.8` |
-| nightly at date | `us-central1-docker.pkg.dev/tpu-pytorch-releases/docker/xla:nightly_3.8_cuda_11.8_YYYYMMDD` |
-
-<br/>
-
-<details>
-
-<summary>older versions</summary>
-
-| Version | GPU CUDA 11.7 + Docker |
-| --- | ----------- |
-| 2.0 | `gcr.io/tpu-pytorch/xla:r2.0_3.8_cuda_11.7` |
-
-<br/>
-
-| Version | GPU CUDA 11.2 + Docker |
-| --- | ----------- |
-| 1.13 | `gcr.io/tpu-pytorch/xla:r1.13_3.8_cuda_11.2` |
-
-<br/>
-
-| Version | GPU CUDA 11.2 + Docker |
-| --- | ----------- |
-| 1.13 | `gcr.io/tpu-pytorch/xla:r1.13_3.7_cuda_11.2` |
-| 1.12 | `gcr.io/tpu-pytorch/xla:r1.12_3.7_cuda_11.2` |
-
-</details>
-
-To run on [compute instances with
-GPUs](https://cloud.google.com/compute/docs/gpus/create-vm-with-gpus).
+For all builds and all versions of `torch-xla`, see our main [GitHub
+README](https://github.com/pytorch/xla).
 
 ## Troubleshooting
````

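The removed README text describes a naming scheme for nightly wheels: append `+yyyymmdd` after `torch_xla-nightly` to pin a date, and swap `torch_xla` for `torch` or `torchvision` to get the companion wheels. A minimal sketch of that scheme, assuming the TPU VM bucket path from the removed tables (`build_nightly_url` is a hypothetical helper, not part of torch_xla):

```python
# Sketch of the nightly-wheel URL scheme described in the removed README
# text. The base URL comes from the removed wheel tables; the helper name
# is ours, for illustration only.
BASE = "https://storage.googleapis.com/pytorch-xla-releases/wheels/tpuvm"


def build_nightly_url(package="torch_xla", date=None, py="cp310"):
    """Build a nightly wheel URL; pass date='yyyymmdd' to pin a build date."""
    version = "nightly" if date is None else f"nightly+{date}"
    return f"{BASE}/{package}-{version}-{py}-{py}-linux_x86_64.whl"


print(build_nightly_url())                            # latest nightly
print(build_nightly_url(date="20240401", py="cp38"))  # date-pinned nightly
print(build_nightly_url(package="torch"))             # companion torch wheel
```

The same substitution logic applies to the dated nightly Docker tags in the removed tables (`nightly_3.8_cuda_12.1_YYYYMMDD`).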
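The wheel filenames in the removed tables (e.g. `torch_xla-2.2.0-cp310-cp310-manylinux_2_28_x86_64.whl`) follow the standard Python wheel naming convention: `name-version-pythontag-abitag-platform.whl`, where `cp310` means CPython 3.10. A small illustrative parser (the helper is ours, not part of any library):

```python
# Parse a wheel filename into its standard components:
# name-version-pythontag-abitag-platform.whl
def parse_wheel_name(filename):
    stem = filename.removesuffix(".whl")
    # The platform tag may itself contain underscores, so split on the
    # first four dashes only.
    name, version, py_tag, abi_tag, platform = stem.split("-", 4)
    return {
        "name": name,
        "version": version,
        "python": py_tag,
        "abi": abi_tag,
        "platform": platform,
    }


info = parse_wheel_name("torch_xla-2.2.0-cp310-cp310-manylinux_2_28_x86_64.whl")
print(info)  # shows version 2.2.0, CPython 3.10, manylinux_2_28_x86_64
```

Reading the tags this way explains why the removed tables list a separate wheel per Python version and platform.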