Commit 9f35757
Setting env var and checking install in a single test
1 parent: 1a75482

1 file changed (17 additions, 3 deletions):

  • playbooks/supplemental/vllm-inference/README.md
````diff
@@ -109,21 +109,35 @@ python -m pip install \
 
 Set the environment variables required by the ROCm pip packages before starting vLLM:
 
-<!-- @test:id=set-env-var timeout=600 hidden=True setup=activate-venv -->
 ```bash
 export PYTHONPATH=.venv/lib/python3.12/site-packages/_rocm_sdk_core/share/amd_smi
 export FLASH_ATTENTION_TRITON_AMD_ENABLE=TRUE
 ```
-<!-- @test:end -->
 
 Check the installation:
 
-<!-- @test:id=check-install timeout=600 hidden=True setup=activate-venv -->
 ```bash
 echo "=== vLLM ===" && python -c "import vllm; print('vLLM version:', vllm.__version__)"
 echo "=== PyTorch ===" && python -c "import torch; print('PyTorch:', torch.__version__); print('HIP available:', torch.cuda.is_available()); print('HIP built:', torch.backends.hip.is_built() if hasattr(torch.backends, 'hip') else 'N/A')"
 echo "=== flash-attn ===" && python -c "import flash_attn; print('flash-attn:', flash_attn.__version__)"
 ```
+
+<!-- @test:id=python-env-check-linux timeout=30 hidden=True setup=activate-venv -->
+```bash
+set -euo pipefail
+python3 --version
+which python3
+```
+<!-- @test:end -->
+
+<!-- @test:id=set-env-var-and-check-install timeout=600 hidden=True setup=activate-venv -->
+```bash
+export PYTHONPATH=.venv/lib/python3.12/site-packages/_rocm_sdk_core/share/amd_smi
+export FLASH_ATTENTION_TRITON_AMD_ENABLE=TRUE
+echo "=== vLLM ===" && python -c "import vllm; print('vLLM version:', vllm.__version__)"
+echo "=== PyTorch ===" && python -c "import torch; print('PyTorch:', torch.__version__); print('HIP available:', torch.cuda.is_available()); print('HIP built:', torch.backends.hip.is_built() if hasattr(torch.backends, 'hip') else 'N/A')"
+echo "=== flash-attn ===" && python -c "import flash_attn; print('flash-attn:', flash_attn.__version__)"
+```
 <!-- @test:end -->
 
 ## Quick Start
````
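The merged test block above does two things in one shell session: export the environment variables, then import each package to report its version. A minimal Python sketch of the same idea is shown below; the variable values and package names are copied from the diff, but the graceful "probe instead of ImportError" behavior is a variation for illustration, not what the playbook itself does, and the `.venv` path assumes a local Python 3.12 virtual environment as in the README.

```python
import importlib.util
import os

# Values copied from the diff above; the venv path is an assumption
# about the local layout (a ".venv" created with Python 3.12).
os.environ["PYTHONPATH"] = (
    ".venv/lib/python3.12/site-packages/_rocm_sdk_core/share/amd_smi"
)
os.environ["FLASH_ATTENTION_TRITON_AMD_ENABLE"] = "TRUE"

# Probe each package the check-install step imports, without failing
# hard when one is absent (find_spec returns None for missing modules).
packages = ["vllm", "torch", "flash_attn"]
results = {
    name: importlib.util.find_spec(name) is not None for name in packages
}

for name, present in results.items():
    print(f"{name}: {'installed' if present else 'MISSING'}")
```

Keeping the exports and the checks in one block matters because each fenced test block runs in its own process: variables exported in one block are not visible to the next, which is presumably why this commit folds them together.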
