# Open Data Hub Llama Stack Distribution Image

This image contains the official Open Data Hub Llama Stack distribution, with all the packages and configuration needed to run a Llama Stack server in a containerized environment.
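As a minimal sketch, the server can be started by running the image with a container engine such as Podman or Docker and publishing the Llama Stack API port (8321 by default). The image reference below is a placeholder, not the published name:

```shell
# Placeholder image reference; substitute the actual Open Data Hub
# Llama Stack image from your registry.
IMAGE=quay.io/example/odh-llama-stack:latest

# Run the server and publish the default Llama Stack port (8321)
# so clients on the host can reach the API.
podman run --rm -p 8321:8321 "$IMAGE"
```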

The image currently ships with the Open Data Hub version of Llama Stack at commit 57b22a0.

The table below gives an overview of the APIs and providers the image ships with.

| API | Provider | External? | Enabled by default? | How to enable |
| --- | --- | --- | --- | --- |
| agents | inline::meta-reference | No | Yes | N/A |
| batches | inline::reference | No | Yes | N/A |
| datasetio | inline::localfs | No | Yes | N/A |
| datasetio | remote::huggingface | No | Yes | N/A |
| eval | inline::trustyai_ragas | Yes (version 0.5.1) | No | Set the `EMBEDDING_MODEL` environment variable |
| eval | remote::trustyai_lmeval | Yes (version 0.4.1) | Yes | N/A |
| eval | remote::trustyai_ragas | Yes (version 0.5.1) | No | Set the `KUBEFLOW_LLAMA_STACK_URL` environment variable |
| files | inline::localfs | No | Yes | N/A |
| inference | inline::sentence-transformers | No | Yes | N/A |
| inference | remote::azure | No | No | Set the `AZURE_API_KEY` environment variable |
| inference | remote::bedrock | No | No | Set the `AWS_ACCESS_KEY_ID` environment variable |
| inference | remote::openai | No | No | Set the `OPENAI_API_KEY` environment variable |
| inference | remote::vertexai | No | No | Set the `VERTEX_AI_PROJECT` environment variable |
| inference | remote::vllm | No | No | Set the `VLLM_URL` environment variable |
| inference | remote::watsonx | No | No | Set the `WATSONX_API_KEY` environment variable |
| safety | remote::trustyai_fms | Yes (version 0.3.0) | Yes | N/A |
| scoring | inline::basic | No | Yes | N/A |
| scoring | inline::braintrust | No | Yes | N/A |
| scoring | inline::llm-as-judge | No | Yes | N/A |
| tool_runtime | inline::rag-runtime | No | Yes | N/A |
| tool_runtime | remote::brave-search | No | Yes | N/A |
| tool_runtime | remote::model-context-protocol | No | Yes | N/A |
| tool_runtime | remote::tavily-search | No | Yes | N/A |
| vector_io | inline::faiss | No | No | Set the `ENABLE_FAISS` environment variable |
| vector_io | inline::milvus | No | Yes | N/A |
| vector_io | remote::milvus | No | No | Set the `MILVUS_ENDPOINT` environment variable |
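Providers that are not enabled by default are switched on by passing the environment variable from the last column into the container. A sketch of the pattern, with an assumed placeholder image reference and illustrative values for `VLLM_URL` and `ENABLE_FAISS`:

```shell
# Placeholder image reference; substitute the actual Open Data Hub
# Llama Stack image from your registry.
IMAGE=quay.io/example/odh-llama-stack:latest

# Enable the remote vLLM inference provider and the FAISS vector store
# by setting the variables listed in the table. The values shown here
# (service URL, "true") are illustrative assumptions.
podman run --rm -p 8321:8321 \
  -e VLLM_URL=http://my-vllm-service:8000/v1 \
  -e ENABLE_FAISS=true \
  "$IMAGE"
```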