One-stop shop for running AI/ML on AWS
AWS Doc · Available Images · Tutorials
- [2025/12/19] Released v0.13.0 vLLM DLCs
  - EC2/EKS/ECS: `public.ecr.aws/deep-learning-containers/vllm:0.13-gpu-py312-cu130-ubuntu22.04-ec2`
  - SageMaker: `public.ecr.aws/deep-learning-containers/vllm:0.13.0-gpu-py312`
- [2025/11/17] Released first SGLang DLCs
  - SageMaker: `public.ecr.aws/deep-learning-containers/sglang:0.5.5-gpu-py312`
- [2026/02/10] Extended support for PyTorch 2.6 Inference containers until June 30, 2026
  - PyTorch 2.6 Inference images will continue to receive security patches and updates through the end of June 2026
  - For complete framework support timelines, see our Support Policy
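As a quick orientation for the SageMaker image URI listed above, here is a minimal, hedged sketch of deploying the vLLM SageMaker image to a real-time endpoint with the SageMaker Python SDK. The IAM role ARN, instance type, endpoint name, and the `MODEL_ID` environment variable are placeholders (the exact serving variables depend on the image), and depending on your account setup you may need to mirror the public image into your own Amazon ECR repository before SageMaker can pull it.

```python
"""Minimal sketch (not an official recipe): deploy the SageMaker vLLM DLC
listed above to a real-time endpoint using the SageMaker Python SDK.
Role ARN, instance type, endpoint name, and env vars are placeholders."""
import sagemaker
from sagemaker.model import Model

session = sagemaker.Session()
role = "arn:aws:iam::123456789012:role/MySageMakerExecutionRole"  # placeholder role

model = Model(
    image_uri="public.ecr.aws/deep-learning-containers/vllm:0.13.0-gpu-py312",
    role=role,
    sagemaker_session=session,
    # Serving options (e.g., which model to load) are passed via environment
    # variables; the variable name below is a hypothetical placeholder, so
    # consult the image documentation for the supported settings.
    env={"MODEL_ID": "meta-llama/Llama-3.2-1B-Instruct"},
)

# Provision a GPU-backed real-time endpoint; pick an instance type that fits
# the model you intend to serve.
model.deploy(
    initial_instance_count=1,
    instance_type="ml.g5.2xlarge",
    endpoint_name="vllm-dlc-demo",
)
```

For EC2, EKS, or ECS, the corresponding `-ec2` image tag above is run directly as a container rather than through SageMaker hosting.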
- Master Distributed Training on Amazon EKS - Set up and validate a distributed training environment on Amazon EKS for scalable ML model training across multiple nodes.
- Level Up with Amazon SageMaker AI & MLflow - Integrate AWS DLCs with Amazon SageMaker AI's managed MLflow service for streamlined experiment tracking and model management.
- Deploy LLMs Like a Pro on Amazon EKS - Deploy and serve Large Language Models efficiently on Amazon EKS using vLLM Deep Learning Containers (see the client sketch after this list).
- Web Automation with Meta Llama 3.2 Vision - Fine-tune and deploy Meta's Llama 3.2 Vision model for AI-powered web automation.
- Supercharge Your DL Environment - Integrate AWS DLCs with Amazon Q Developer and the Model Context Protocol (MCP).
- LLM Deployment on Amazon EKS Workshop - Deploy and optimize LLMs on Amazon EKS using vLLM Deep Learning Containers. For more information, see Sample Code.
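For the EC2/EKS serving paths covered in these tutorials, a running vLLM container is typically reached through vLLM's OpenAI-compatible HTTP API. The sketch below is an illustrative client only; the base URL, port, and model name are assumptions about a particular deployment (on EKS you would usually reach the service through a port-forward or a load balancer).

```python
"""Illustrative client for a vLLM DLC serving endpoint on EC2 or EKS.
Assumes the container runs vLLM's OpenAI-compatible server; the base URL
and model name below are placeholders for your deployment."""
import requests

BASE_URL = "http://localhost:8000/v1"  # e.g., a port-forwarded EKS Service

payload = {
    "model": "meta-llama/Llama-3.2-1B-Instruct",  # whichever model the server loaded
    "messages": [{"role": "user", "content": "Say hello in one sentence."}],
    "max_tokens": 64,
}

resp = requests.post(f"{BASE_URL}/chat/completions", json=payload, timeout=60)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```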
AWS Deep Learning Containers (DLCs) are a suite of Docker images that streamline the deployment of AI/ML workloads on Amazon SageMaker AI, Amazon EKS, and Amazon EC2.
- Pre-optimized Environments - Production-ready containers with optimized deep learning frameworks
- Latest AI/ML Tools - Quick access to cutting-edge frameworks like vLLM, SGLang, and PyTorch
- Multi-Platform Support - Run seamlessly on Amazon SageMaker AI, Amazon EKS, or Amazon EC2
- Enterprise-Ready - Built with security, performance, and scalability in mind
- Rapid Deployment - Get started in minutes with pre-configured environments
- Framework Flexibility - Support for popular frameworks like PyTorch, TensorFlow, and more
- Performance Optimized - Containers tuned for AWS infrastructure
- Regular Updates - Quick access to latest framework releases and security patches
- AWS Integration - Seamless compatibility with AWS AI/ML services
- Data Scientists building and training models
- ML Engineers deploying production workloads
- DevOps teams managing ML infrastructure
- Researchers exploring cutting-edge AI capabilities
Our containers undergo rigorous security scanning and are regularly updated to address vulnerabilities, ensuring your ML workloads run on a secure foundation.
For more information on our security policy, see Security.
- Getting Started - Get up and running in minutes
- Tutorials - Step-by-step guides
- Available Images - Browse all container images
- Support Policy - Framework versions and timelines
- Security - Security policy
- GitHub Issues - Report bugs or request features
This project is licensed under the Apache-2.0 License.