diff --git a/README.md b/README.md
index 8a349f66..3da64f60 100644
--- a/README.md
+++ b/README.md
@@ -19,7 +19,7 @@
| Module | Role | Key Features | Dependencies / Integration |
|--------------|------|--------------|-----------------------------|
-| [**Primus-LM**](https://github.com/AMD-AGI/Primus) | End-to-end training framework | - Supports multiple training backends (Megatron, TorchTitan, etc.) <br> - Provides high-performance, scalable distributed training <br> - Deeply integrates with Turbo and Safe | - Can invoke Primus-Turbo kernels and modules <br> - Runs on top of Primus-Safe for stable scheduling |
+| [**Primus-LM**](https://github.com/AMD-AGI/Primus) | End-to-end training framework | - Supports multiple training backends (Megatron, TorchTitan, etc.) <br> - Provides high-performance, scalable distributed training <br> - Deeply integrates with Primus-Turbo and Primus-SaFE | - Can invoke Primus-Turbo kernels and modules <br> - Runs on top of Primus-SaFE for stable scheduling |
| [**Primus-Turbo**](https://github.com/AMD-AGI/Primus-Turbo) | High-performance operators & modules | - Provides common LLM training operators (FlashAttention, GEMM, Collectives, GroupedGemm, etc.) <br> - Modular design, directly pluggable into Primus-LM <br> - Optimized for different architectures and precisions | - Built on [**AITER**](https://github.com/ROCm/aiter), [**CK**](https://github.com/ROCm/composable_kernel), [**hipBLASLt**](https://github.com/ROCm/hipBLASLt), [**Triton**](https://github.com/ROCm/triton) and other operator libraries <br> - Can be enabled via configuration inside Primus-LM |
| [**Primus-SaFE**](https://github.com/AMD-AGI/Primus-SaFE) | Stability & platform layer | - Cluster sanity checks and benchmarking <br> - Kubernetes scheduling with topology awareness <br> - Fault tolerance <br> - Stability enhancements | - Builds a training platform on top of the K8s and Slurm ecosystems |