Merged
33 changes: 1 addition & 32 deletions docs/build/eps.md
Original file line number Diff line number Diff line change
@@ -652,10 +652,8 @@ See more information on the MIGraphX Execution Provider [here](../execution-prov
### Prerequisites
{: .no_toc }

* Install [ROCm](https://rocm.docs.amd.com/projects/install-on-linux/en/docs-6.3.1/)
* The MIGraphX execution provider for ONNX Runtime is built and tested with ROCm6.3.1
* Install [ROCm](https://rocm.docs.amd.com/projects/install-on-linux/en/latest/index.html)
* Install [MIGraphX](https://github.com/ROCmSoftwarePlatform/AMDMIGraphX)
* The path to MIGraphX installation must be provided via the `--migraphx_home parameter`.

### Build Instructions
{: .no_toc }
@@ -676,35 +674,6 @@ Then the python wheels(*.whl) could be found at ```./build/Linux/Release/dist```

---

## AMD ROCm

See more information on the ROCm Execution Provider [here](../execution-providers/ROCm-ExecutionProvider.md).

### Prerequisites
{: .no_toc }

* Install [ROCm](https://rocm.docs.amd.com/projects/install-on-linux/en/docs-6.3.1/)
* The ROCm execution provider for ONNX Runtime is built and tested with ROCm6.3.1

### Build Instructions
{: .no_toc }

#### Linux

```bash
./build.sh --config <Release|Debug|RelWithDebInfo> --parallel --use_rocm --rocm_home <path to ROCm home>
```

Dockerfile instructions are available [here](https://github.com/microsoft/onnxruntime/tree/main/dockerfiles#rocm).

#### Build Phython Wheel

`./build.sh --config Release --build_wheel --parallel --use_rocm --rocm_home /opt/rocm`

Then the python wheels(*.whl) could be found at ```./build/Linux/Release/dist```.

---

## NNAPI

Usage of NNAPI on Android platforms is via the NNAPI Execution Provider (EP).
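With the ROCm EP build docs above removed in favor of MIGraphX, applications that hard-coded the ROCm provider need a fallback path. A minimal sketch, assuming ONNX Runtime's standard provider strings (the values returned by `onnxruntime.get_available_providers()`); the helper itself is hypothetical, not part of the API:

```python
# Hypothetical helper (not an ONNX Runtime API): choose execution
# providers now that the ROCm EP is being retired in favor of MIGraphX.
# Provider strings follow ONNX Runtime's naming, e.g. the values
# returned by onnxruntime.get_available_providers().
def pick_providers(available):
    preferred = [
        "MIGraphXExecutionProvider",  # AMD GPU path going forward
        "ROCmExecutionProvider",      # only present on pre-1.23 builds
        "CPUExecutionProvider",       # universal fallback
    ]
    chosen = [p for p in preferred if p in available]
    return chosen or ["CPUExecutionProvider"]
```

The returned list would be passed as the `providers` argument when constructing an `onnxruntime.InferenceSession`.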
30 changes: 0 additions & 30 deletions docs/build/training.md
@@ -128,36 +128,6 @@ The default NVIDIA GPU build requires CUDA runtime libraries installed on the sy

That's it! Once the build is complete, you should be able to use the ONNX Runtime libraries and executables in your projects. Note that these steps are general and may need to be adjusted based on your specific environment and requirements. For more information, you can ask for help on the [ONNX Runtime GitHub community](https://github.com/microsoft/onnxruntime/discussions/new?category=q-a).

## GPU / ROCm
### Prerequisites
{: .no_toc }

The default AMD GPU build requires ROCm software toolkit installed on the system:

* [ROCm](https://rocm.docs.amd.com/projects/install-on-linux/en/docs-6.0.0/) 6.0.0

### Build instructions
{: .no_toc }

1. Checkout this code repo with

```bash
git clone https://github.com/microsoft/onnxruntime
cd onnxruntime
```

2. Create the ONNX Runtime Python wheel

```bash
./build.sh --config Release --enable_training --build_wheel --parallel --skip_tests --use_rocm --rocm_home /opt/rocm
```

3. Install the .whl file in `./build/Linux/RelWithDebInfo/dist` for ONNX Runtime Training.

```bash
python -m pip install build/Linux/RelWithDebInfo/dist/*.whl
```

## DNNL and MKLML

### Build Instructions
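A side note on the removed training steps: they build with `--config Release` yet point at `./build/Linux/RelWithDebInfo/dist` for the wheel — in practice the wheel lands under whatever `--config` name was passed to `build.sh`. A hypothetical helper illustrating the layout both files reference:

```python
# Hypothetical helper illustrating the build output layout referenced
# in docs/build (e.g. ./build/Linux/Release/dist): the wheel lands
# under whatever --config name was passed to build.sh.
def wheel_dist_dir(config, os_name="Linux"):
    return f"build/{os_name}/{config}/dist"
```

So a `--config Release` build produces its wheel under `build/Linux/Release/dist`, not `RelWithDebInfo`.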
12 changes: 2 additions & 10 deletions docs/execution-providers/ROCm-ExecutionProvider.md
@@ -11,11 +11,9 @@ redirect_from: /docs/reference/execution-providers/ROCm-ExecutionProvider

The ROCm Execution Provider enables hardware accelerated computation on AMD ROCm-enabled GPUs.

** NOTE ** As of ROCm 7.1 There will be no more ROCm Execution Provider support provider by Microsoft
** NOTE ** The ROCm Execution Provider has been removed as of the 1.23 release. Please migrate your applications to the [MIGraphX Execution Provider](https://onnxruntime.ai/docs/execution-providers/MIGraphX-ExecutionProvider.html#migraphx-execution-provider)

Please Migrate your applications to use the [MIGraphX Execution Provider](https://onnxruntime.ai/docs/execution-providers/MIGraphX-ExecutionProvider.html#migraphx-execution-provider)

ROCm 7.0 is the last offiicaly AMD supported distribution of this provider and all builds going forward (ROCm 7.1+) Will have ROCm EP removed.
ROCm 7.0 is the last officially AMD-supported distribution of this provider, and all builds going forward (ROCm 7.1+) will have the ROCm EP removed.

Please refer to this [Pull Request](https://github.com/microsoft/onnxruntime/pull/25181) for background.

@@ -31,12 +29,6 @@ Please refer to this [Pull Request](https://github.com/microsoft/onnxruntime/pull/25181

For Nightly PyTorch builds please see [Pytorch home](https://pytorch.org/) and select ROCm as the Compute Platform.

Pre-built binaries of ONNX Runtime with ROCm EP are published for most language bindings. Please reference [Install ORT](../install).

## Build from source

For build instructions, please see the [BUILD page](../build/eps.md#amd-rocm). Prebuild .whl files are provided below in the requirements section and are hosted on [repo.radeon.com](https://repo.radeon.com/rocm/manylinux/). Ubuntu based docker development environments are provided in the Docker Support section. New wheels and dockers are published each ROCm release.

## Requirements

Below is the matrix of supported ROCm versions corresponding to Ubuntu builds.
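Since the note above pins the removal to the 1.23 release, code that must run across versions can gate on the runtime version string (e.g. `onnxruntime.__version__`). A sketch — the parsing helper is an assumption, not an ONNX Runtime API:

```python
# Per the note above, the ROCm EP was removed in the 1.23 release, so it
# can only be present on older builds. Hypothetical version gate (not an
# ONNX Runtime API); expects a version string like "1.22.1".
def rocm_ep_expected(ort_version):
    major, minor = (int(x) for x in ort_version.split(".")[:2])
    return (major, minor) < (1, 23)
```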
4 changes: 1 addition & 3 deletions docs/execution-providers/index.md
@@ -28,9 +28,7 @@ ONNX Runtime supports many different execution providers today. Some of the EPs
|[TVM](../execution-providers/community-maintained/TVM-ExecutionProvider.md) (*preview*)|[DirectML](../execution-providers/DirectML-ExecutionProvider.md)|[Android Neural Networks API](../execution-providers/NNAPI-ExecutionProvider.md)|[Huawei CANN](../execution-providers/community-maintained/CANN-ExecutionProvider.md) (*preview*)|
|[Intel OpenVINO](../execution-providers/OpenVINO-ExecutionProvider.md)|[AMD MIGraphX](../execution-providers/MIGraphX-ExecutionProvider.md)|[Arm NN](../execution-providers/community-maintained/ArmNN-ExecutionProvider.md) (*preview*)|[AZURE](../execution-providers/Azure-ExecutionProvider.md) (*preview*)|
|[XNNPACK](../execution-providers/Xnnpack-ExecutionProvider.md)|[Intel OpenVINO](../execution-providers/OpenVINO-ExecutionProvider.md)|[CoreML](../execution-providers/CoreML-ExecutionProvider.md) (*preview*)|
||[AMD ROCm](../execution-providers/ROCm-ExecutionProvider.md)|[TVM](../execution-providers/community-maintained/TVM-ExecutionProvider.md) (*preview*)|
||[TVM](../execution-providers/community-maintained/TVM-ExecutionProvider.md) (*preview*)|[Qualcomm QNN](../execution-providers/QNN-ExecutionProvider.md)|
|||[XNNPACK](../execution-providers/Xnnpack-ExecutionProvider.md)|
|[AMD ROCm](../execution-providers/ROCm-ExecutionProvider.md) (*deprecated*)|[Qualcomm QNN](../execution-providers/QNN-ExecutionProvider.md)|[XNNPACK](../execution-providers/Xnnpack-ExecutionProvider.md)||

## Add an Execution Provider

1 change: 0 additions & 1 deletion docs/genai/reference/config.md
@@ -468,7 +468,6 @@ Options passed to ONNX Runtime for model execution.
- NvTensorRtRtx
- OpenVINO
- QNN
- rocm
- WebGPU
- VitisAI

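With `rocm` dropped from the provider list above, a GenAI configuration targeting one of the remaining providers selects it under `session_options.provider_options`. A sketch of the relevant fragment of `genai_config.json`, assuming the documented decoder layout and using `cuda` as the example provider (the empty options object is just a placeholder):

```json
{
  "model": {
    "decoder": {
      "session_options": {
        "provider_options": [
          { "cuda": {} }
        ]
      }
    }
  }
}
```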
8 changes: 0 additions & 8 deletions docs/install/index.md
@@ -119,14 +119,6 @@ pip install coloredlogs flatbuffers numpy packaging protobuf sympy
pip install --pre --index-url https://aiinfra.pkgs.visualstudio.com/PublicPackages/_packaging/ORT-Nightly/pypi/simple/ onnxruntime-qnn
```

### Install ONNX Runtime GPU (ROCm)

For ROCm, please follow instructions to install it at the [AMD ROCm install docs](https://rocm.docs.amd.com/projects/install-on-linux/en/docs-6.2.0/). The ROCm execution provider for ONNX Runtime is built and tested with ROCm 6.2.0.

To build from source on Linux, follow the instructions [here](https://onnxruntime.ai/docs/build/eps.html#amd-rocm).



## C#/C/C++/WinML Installs

### Install ONNX Runtime