This image contains the official Open Data Hub Llama Stack distribution, with all the packages and configuration needed to run a Llama Stack server in a containerized environment.
The image currently ships with upstream Llama Stack version 0.2.22.
The table below gives an overview of the APIs and providers the image ships with.
| API | Provider |
|---|---|
| agents | inline::meta-reference |
| datasetio | inline::localfs |
| datasetio | remote::huggingface |
| eval | remote::trustyai_lmeval |
| files | inline::localfs |
| inference | inline::sentence-transformers |
| inference | remote::azure |
| inference | remote::bedrock |
| inference | remote::openai |
| inference | remote::vertexai |
| inference | remote::vllm |
| inference | remote::watsonx |
| safety | remote::trustyai_fms |
| scoring | inline::basic |
| scoring | inline::braintrust |
| scoring | inline::llm-as-judge |
| telemetry | inline::meta-reference |
| tool_runtime | inline::rag-runtime |
| tool_runtime | remote::brave-search |
| tool_runtime | remote::model-context-protocol |
| tool_runtime | remote::tavily-search |
| vector_io | inline::milvus |
| vector_io | remote::milvus |
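The providers above are enabled through the server's run configuration. As a rough sketch of how one of these providers is wired up, a fragment of a Llama Stack `run.yaml` might look like the following; the provider ID, environment variable name, and default URL here are illustrative assumptions, not values taken from the shipped configuration:

```yaml
# Hypothetical run.yaml fragment -- the configuration actually shipped
# in this image may differ.
providers:
  inference:
    - provider_id: vllm            # illustrative ID
      provider_type: remote::vllm  # matches the table above
      config:
        url: ${env.VLLM_URL}       # assumed env var for the vLLM endpoint
```

Each API listed in the table gets a similar entry under `providers`, pairing a `provider_type` from the table with its provider-specific `config` block.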