2 changes: 1 addition & 1 deletion docs/source/amdgpu/overview.mdx
@@ -49,7 +49,7 @@ Hosted wheels are available for ROCm, please check out the [installation instruc

### Text Generation Inference library

-Hugging Face's [Text Generation Inference](https://huggingface.co/docs/text-generation-inference/index) library (TGI) is designed for low latency LLMs serving, and natively supports AMD Instinct MI210, MI250 and MI3O0 GPUs. Please refer to the [Quick Tour section](https://huggingface.co/docs/text-generation-inference/quicktour) for more details.
+Hugging Face's [Text Generation Inference](https://huggingface.co/docs/text-generation-inference/index) library (TGI) is designed for low latency LLMs serving, and natively supports AMD Instinct MI210, MI250 and MI300 GPUs. Please refer to the [Quick Tour section](https://huggingface.co/docs/text-generation-inference/quicktour) for more details.

Using TGI on ROCm with AMD Instinct MI210 or MI250 or MI300 GPUs is as simple as using the docker image [`ghcr.io/huggingface/text-generation-inference:latest-rocm`](https://huggingface.co/docs/text-generation-inference/quicktour).
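As an illustrative sketch of what that looks like in practice (the model ID, port, and volume path below are placeholders, and the exact flag set may differ from the authoritative command in the linked Quick Tour):

```shell
# Hedged sketch: serve an example model with the TGI ROCm docker image.
# The model id and paths are placeholders, not a recommendation.
model=teknium/OpenHermes-2.5-Mistral-7B   # example model id
volume=$PWD/data                          # host dir for cached weights

# --device=/dev/kfd and --device=/dev/dri expose the AMD GPUs to the
# container (no --gpus flag as with the NVIDIA runtime).
docker run --device=/dev/kfd --device=/dev/dri --group-add video \
    --ipc=host --shm-size 1g -p 8080:80 -v "$volume:/data" \
    ghcr.io/huggingface/text-generation-inference:latest-rocm \
    --model-id "$model"
```

Once the server is up, it can be queried over HTTP, e.g. `curl http://localhost:8080/generate -X POST -H 'Content-Type: application/json' -d '{"inputs":"Hello","parameters":{"max_new_tokens":20}}'`.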
