
Commit f9b03cc

[Doc] Remove rocm build from source instructions (#26733)
The ROCm EP was removed in onnxruntime 1.23. Documentation updates:
* Remove the build-from-source instructions.
1 parent d1da8a0 commit f9b03cc

6 files changed (+4, -84 lines)


docs/build/eps.md

Lines changed: 1 addition & 32 deletions
@@ -652,10 +652,8 @@ See more information on the MIGraphX Execution Provider [here](../execution-prov
 ### Prerequisites
 {: .no_toc }

-* Install [ROCm](https://rocm.docs.amd.com/projects/install-on-linux/en/docs-6.3.1/)
-* The MIGraphX execution provider for ONNX Runtime is built and tested with ROCm6.3.1
+* Install [ROCm](https://rocm.docs.amd.com/projects/install-on-linux/en/latest/index.html)
 * Install [MIGraphX](https://github.com/ROCmSoftwarePlatform/AMDMIGraphX)
-* The path to MIGraphX installation must be provided via the `--migraphx_home parameter`.

 ### Build Instructions
 {: .no_toc }
@@ -676,35 +674,6 @@ Then the python wheels(*.whl) could be found at ```./build/Linux/Release/dist```

 ---

-## AMD ROCm
-
-See more information on the ROCm Execution Provider [here](../execution-providers/ROCm-ExecutionProvider.md).
-
-### Prerequisites
-{: .no_toc }
-
-* Install [ROCm](https://rocm.docs.amd.com/projects/install-on-linux/en/docs-6.3.1/)
-* The ROCm execution provider for ONNX Runtime is built and tested with ROCm6.3.1
-
-### Build Instructions
-{: .no_toc }
-
-#### Linux
-
-```bash
-./build.sh --config <Release|Debug|RelWithDebInfo> --parallel --use_rocm --rocm_home <path to ROCm home>
-```
-
-Dockerfile instructions are available [here](https://github.com/microsoft/onnxruntime/tree/main/dockerfiles#rocm).
-
-#### Build Phython Wheel
-
-`./build.sh --config Release --build_wheel --parallel --use_rocm --rocm_home /opt/rocm`
-
-Then the python wheels(*.whl) could be found at ```./build/Linux/Release/dist```.
-
----
-
 ## NNAPI

 Usage of NNAPI on Android platforms is via the NNAPI Execution Provider (EP).
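With the ROCm section gone, MIGraphX is the remaining AMD GPU build path documented in eps.md. As a quick post-build sanity check, a minimal Python sketch (assuming the wheel from `./build/Linux/Release/dist` has been installed; the provider string is the standard registered name) can confirm the MIGraphX EP made it into the build:

```python
import onnxruntime as ort

# List every execution provider compiled into this onnxruntime build.
available = ort.get_available_providers()
print(available)

# A MIGraphX-enabled build is expected to report "MIGraphXExecutionProvider";
# if it is absent, the wheel was most likely built without MIGraphX support.
assert "MIGraphXExecutionProvider" in available, "MIGraphX EP not found in this build"
```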

docs/build/training.md

Lines changed: 0 additions & 30 deletions
@@ -128,36 +128,6 @@ The default NVIDIA GPU build requires CUDA runtime libraries installed on the sy

 That's it! Once the build is complete, you should be able to use the ONNX Runtime libraries and executables in your projects. Note that these steps are general and may need to be adjusted based on your specific environment and requirements. For more information, you can ask for help on the [ONNX Runtime GitHub community](https://github.com/microsoft/onnxruntime/discussions/new?category=q-a).

-## GPU / ROCm
-### Prerequisites
-{: .no_toc }
-
-The default AMD GPU build requires ROCm software toolkit installed on the system:
-
-* [ROCm](https://rocm.docs.amd.com/projects/install-on-linux/en/docs-6.0.0/) 6.0.0
-
-### Build instructions
-{: .no_toc }
-
-1. Checkout this code repo with
-
-```bash
-git clone https://github.com/microsoft/onnxruntime
-cd onnxruntime
-```
-
-2. Create the ONNX Runtime Python wheel
-
-```bash
-./build.sh --config Release --enable_training --build_wheel --parallel --skip_tests --use_rocm --rocm_home /opt/rocm
-```
-
-3. Install the .whl file in `./build/Linux/RelWithDebInfo/dist` for ONNX Runtime Training.
-
-```bash
-python -m pip install build/Linux/RelWithDebInfo/dist/*.whl
-```
-
 ## DNNL and MKLML

 ### Build Instructions

docs/execution-providers/ROCm-ExecutionProvider.md

Lines changed: 2 additions & 10 deletions
@@ -11,11 +11,9 @@ redirect_from: /docs/reference/execution-providers/ROCm-ExecutionProvider

 The ROCm Execution Provider enables hardware accelerated computation on AMD ROCm-enabled GPUs.

-** NOTE ** As of ROCm 7.1 There will be no more ROCm Execution Provider support provider by Microsoft
+** NOTE ** ROCm Execution Provider has been removed since 1.23 release. Please Migrate your applications to use the [MIGraphX Execution Provider](https://onnxruntime.ai/docs/execution-providers/MIGraphX-ExecutionProvider.html#migraphx-execution-provider)

-Please Migrate your applications to use the [MIGraphX Execution Provider](https://onnxruntime.ai/docs/execution-providers/MIGraphX-ExecutionProvider.html#migraphx-execution-provider)
-
-ROCm 7.0 is the last offiicaly AMD supported distribution of this provider and all builds going forward (ROCm 7.1+) Will have ROCm EP removed.
+ROCm 7.0 is the last offiicaly AMD supported distribution of this provider and all builds going forward (ROCm 7.1+) will have ROCm EP removed.

 Please refer to this [Pull Request](https://github.com/microsoft/onnxruntime/pull/25181) for background.

@@ -31,12 +29,6 @@ Please refer to this [Pull Request](https://github.com/microsoft/onnxruntime/pul

 For Nightly PyTorch builds please see [Pytorch home](https://pytorch.org/) and select ROCm as the Compute Platform.

-Pre-built binaries of ONNX Runtime with ROCm EP are published for most language bindings. Please reference [Install ORT](../install).
-
-## Build from source
-
-For build instructions, please see the [BUILD page](../build/eps.md#amd-rocm). Prebuild .whl files are provided below in the requirements section and are hosted on [repo.radeon.com](https://repo.radeon.com/rocm/manylinux/). Ubuntu based docker development environments are provided in the Docker Support section. New wheels and dockers are published each ROCm release.
-
 ## Requirements

 Below is the matrix of supported ROCm versions corresponding to Ubuntu builds.
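For applications affected by the migration note above, the change is typically a one-line provider swap. A minimal Python sketch, where the model path is a placeholder and the CPU fallback is an illustrative choice rather than part of this commit:

```python
import onnxruntime as ort

# Before (onnxruntime < 1.23):
#   providers = ["ROCMExecutionProvider", "CPUExecutionProvider"]

# After migrating to the MIGraphX EP, keeping CPU as a fallback.
session = ort.InferenceSession(
    "model.onnx",  # placeholder path to your ONNX model
    providers=["MIGraphXExecutionProvider", "CPUExecutionProvider"],
)
print(session.get_providers())  # shows which providers were actually registered
```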

docs/execution-providers/index.md

Lines changed: 1 addition & 3 deletions
@@ -28,9 +28,7 @@ ONNX Runtime supports many different execution providers today. Some of the EPs
 |[TVM](../execution-providers/community-maintained/TVM-ExecutionProvider.md) (*preview*)|[DirectML](../execution-providers/DirectML-ExecutionProvider.md)|[Android Neural Networks API](../execution-providers/NNAPI-ExecutionProvider.md)|[Huawei CANN](../execution-providers/community-maintained/CANN-ExecutionProvider.md) (*preview*)|
 |[Intel OpenVINO](../execution-providers/OpenVINO-ExecutionProvider.md)|[AMD MIGraphX](../execution-providers/MIGraphX-ExecutionProvider.md)|[Arm NN](../execution-providers/community-maintained/ArmNN-ExecutionProvider.md) (*preview*)|[AZURE](../execution-providers/Azure-ExecutionProvider.md) (*preview*)|
 |[XNNPACK](../execution-providers/Xnnpack-ExecutionProvider.md)|[Intel OpenVINO](../execution-providers/OpenVINO-ExecutionProvider.md)|[CoreML](../execution-providers/CoreML-ExecutionProvider.md) (*preview*)|
-||[AMD ROCm](../execution-providers/ROCm-ExecutionProvider.md)|[TVM](../execution-providers/community-maintained/TVM-ExecutionProvider.md) (*preview*)|
-||[TVM](../execution-providers/community-maintained/TVM-ExecutionProvider.md) (*preview*)|[Qualcomm QNN](../execution-providers/QNN-ExecutionProvider.md)|
-|||[XNNPACK](../execution-providers/Xnnpack-ExecutionProvider.md)|
+|[AMD ROCm](../execution-providers/ROCm-ExecutionProvider.md)(*deprecated*)|[Qualcomm QNN](../execution-providers/QNN-ExecutionProvider.md)|[XNNPACK](../execution-providers/Xnnpack-ExecutionProvider.md)||

 ## Add an Execution Provider

docs/genai/reference/config.md

Lines changed: 0 additions & 1 deletion
@@ -468,7 +468,6 @@ Options passed to ONNX Runtime for model execution.
 - NvTensorRtRtx
 - OpenVINO
 - QNN
-- rocm
 - WebGPU
 - VitisAI


docs/install/index.md

Lines changed: 0 additions & 8 deletions
@@ -119,14 +119,6 @@ pip install coloredlogs flatbuffers numpy packaging protobuf sympy
 pip install --pre --index-url https://aiinfra.pkgs.visualstudio.com/PublicPackages/_packaging/ORT-Nightly/pypi/simple/ onnxruntime-qnn
 ```

-### Install ONNX Runtime GPU (ROCm)
-
-For ROCm, please follow instructions to install it at the [AMD ROCm install docs](https://rocm.docs.amd.com/projects/install-on-linux/en/docs-6.2.0/). The ROCm execution provider for ONNX Runtime is built and tested with ROCm 6.2.0.
-
-To build from source on Linux, follow the instructions [here](https://onnxruntime.ai/docs/build/eps.html#amd-rocm).
-
-
-
 ## C#/C/C++/WinML Installs

 ### Install ONNX Runtime
