feat: add s3 provider to files API #1950

Draft · wants to merge 2 commits into base: main
16 changes: 15 additions & 1 deletion .github/workflows/integration-tests.yml
@@ -24,7 +24,7 @@ jobs:
matrix:
# Listing tests manually since some of them currently fail
# TODO: generate matrix list from tests/integration when fixed
test-type: [agents, inference, datasets, inspect, scoring, post_training, providers]
test-type: [agents, inference, datasets, inspect, scoring, post_training, providers, files]
client-type: [library, http]
fail-fast: false # we want to run all tests regardless of failure

@@ -52,6 +52,20 @@ jobs:
uv pip install -e .
llama stack build --template ollama --image-type venv

- name: Setup minio when testing files
if: matrix.test-type == 'files'
run: |
mkdir -p ~/minio/data
docker run \
-d \
-p 9000:9000 \
-p 9001:9001 \
--name minio \
-v ~/minio/data:/data \
-e "MINIO_ROOT_USER=ROOTNAME" \
-e "MINIO_ROOT_PASSWORD=CHANGEME123" \
quay.io/minio/minio server /data --console-address ":9001"

- name: Start Llama Stack server in background
if: matrix.client-type == 'http'
env:
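The workflow step above starts MinIO detached but does not wait for it to become healthy before the tests run. A minimal readiness check could poll MinIO's documented liveness endpoint before starting the integration tests; the sketch below uses only the standard library, and the function names are illustrative rather than part of this PR:

```python
import time
import urllib.error
import urllib.request

def minio_health_url(host="localhost", port=9000):
    # MinIO exposes a liveness probe at /minio/health/live
    return f"http://{host}:{port}/minio/health/live"

def wait_for_minio(url, timeout=30.0, interval=1.0):
    """Poll the health endpoint until it returns 200 or the timeout expires."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with urllib.request.urlopen(url, timeout=2) as resp:
                if resp.status == 200:
                    return True
        except (urllib.error.URLError, OSError):
            pass
        time.sleep(interval)
    return False
```

In the workflow this would run between the `docker run` step and the test step, failing fast if the container never comes up.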
93 changes: 36 additions & 57 deletions docs/_static/llama-stack-spec.html
@@ -568,11 +568,11 @@
"get": {
"responses": {
"200": {
"description": "OK",
"description": "PaginatedResponse with the list of buckets",
"content": {
"application/json": {
"schema": {
"$ref": "#/components/schemas/ListBucketResponse"
"$ref": "#/components/schemas/PaginatedResponse"
}
}
}
@@ -596,11 +596,21 @@
"description": "List all buckets.",
"parameters": [
{
"name": "bucket",
"name": "page",
"in": "query",
"required": true,
"description": "The page number (1-based). If None, starts from first page.",
"required": false,
"schema": {
"type": "string"
"type": "integer"
}
},
{
"name": "size",
"in": "query",
"description": "Number of items per page. If None or -1, returns all items.",
"required": false,
"schema": {
"type": "integer"
}
}
]
@@ -1850,7 +1860,7 @@
"parameters": []
}
},
"/v1/files/session:{upload_id}": {
"/v1/files/session/{upload_id}": {
"get": {
"responses": {
"200": {
@@ -2631,11 +2641,11 @@
"get": {
"responses": {
"200": {
"description": "OK",
"description": "PaginatedResponse with the list of files",
"content": {
"application/json": {
"schema": {
"$ref": "#/components/schemas/ListFileResponse"
"$ref": "#/components/schemas/PaginatedResponse"
}
}
}
@@ -2666,6 +2676,24 @@
"schema": {
"type": "string"
}
},
{
"name": "page",
"in": "query",
"description": "The page number (1-based). If None, starts from first page.",
"required": false,
"schema": {
"type": "integer"
}
},
{
"name": "size",
"in": "query",
"description": "Number of items per page. If None or -1, returns all items.",
"required": false,
"schema": {
"type": "integer"
}
}
]
}
@@ -9085,37 +9113,6 @@
],
"title": "Job"
},
"BucketResponse": {
"type": "object",
"properties": {
"name": {
"type": "string"
}
},
"additionalProperties": false,
"required": [
"name"
],
"title": "BucketResponse"
},
"ListBucketResponse": {
"type": "object",
"properties": {
"data": {
"type": "array",
"items": {
"$ref": "#/components/schemas/BucketResponse"
},
"description": "List of FileResponse entries"
}
},
"additionalProperties": false,
"required": [
"data"
],
"title": "ListBucketResponse",
"description": "Response representing a list of file entries."
},
"ListBenchmarksResponse": {
"type": "object",
"properties": {
@@ -9148,24 +9145,6 @@
],
"title": "ListDatasetsResponse"
},
"ListFileResponse": {
"type": "object",
"properties": {
"data": {
"type": "array",
"items": {
"$ref": "#/components/schemas/FileResponse"
},
"description": "List of FileResponse entries"
}
},
"additionalProperties": false,
"required": [
"data"
],
"title": "ListFileResponse",
"description": "Response representing a list of file entries."
},
"ListModelsResponse": {
"type": "object",
"properties": {
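The new `page`/`size` query parameters follow the semantics the spec documents: `page` is 1-based and defaults to the first page, while a `size` of None or -1 returns all items. That slicing logic can be sketched as follows (a hypothetical helper for illustration, not code from this PR):

```python
def paginate(items, page=None, size=None):
    """Apply the documented semantics: page is 1-based; size of None or -1 means no paging."""
    if size is None or size == -1:
        return list(items)
    if page is None:
        page = 1  # "If None, starts from first page."
    start = (page - 1) * size
    return list(items[start:start + size])
```

A server backing `PaginatedResponse` would apply this over the bucket or file listing before serializing the `data` field.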
77 changes: 32 additions & 45 deletions docs/_static/llama-stack-spec.yaml
@@ -379,11 +379,12 @@ paths:
get:
responses:
'200':
description: OK
description: >-
PaginatedResponse with the list of buckets
content:
application/json:
schema:
$ref: '#/components/schemas/ListBucketResponse'
$ref: '#/components/schemas/PaginatedResponse'
'400':
$ref: '#/components/responses/BadRequest400'
'429':
@@ -398,11 +399,20 @@
- Files
description: List all buckets.
parameters:
- name: bucket
- name: page
in: query
required: true
description: >-
The page number (1-based). If None, starts from first page.
required: false
schema:
type: string
type: integer
- name: size
in: query
description: >-
Number of items per page. If None or -1, returns all items.
required: false
schema:
type: integer
post:
responses:
'200':
@@ -1261,7 +1271,7 @@ paths:
- PostTraining (Coming Soon)
description: ''
parameters: []
/v1/files/session:{upload_id}:
/v1/files/session/{upload_id}:
get:
responses:
'200':
@@ -1816,11 +1826,11 @@ paths:
get:
responses:
'200':
description: OK
description: PaginatedResponse with the list of files
content:
application/json:
schema:
$ref: '#/components/schemas/ListFileResponse'
$ref: '#/components/schemas/PaginatedResponse'
'400':
$ref: '#/components/responses/BadRequest400'
'429':
@@ -1841,6 +1851,20 @@ paths:
required: true
schema:
type: string
- name: page
in: query
description: >-
The page number (1-based). If None, starts from first page.
required: false
schema:
type: integer
- name: size
in: query
description: >-
Number of items per page. If None or -1, returns all items.
required: false
schema:
type: integer
/v1/models:
get:
responses:
@@ -6277,29 +6301,6 @@ components:
- job_id
- status
title: Job
BucketResponse:
type: object
properties:
name:
type: string
additionalProperties: false
required:
- name
title: BucketResponse
ListBucketResponse:
type: object
properties:
data:
type: array
items:
$ref: '#/components/schemas/BucketResponse'
description: List of FileResponse entries
additionalProperties: false
required:
- data
title: ListBucketResponse
description: >-
Response representing a list of file entries.
ListBenchmarksResponse:
type: object
properties:
@@ -6322,20 +6323,6 @@ components:
required:
- data
title: ListDatasetsResponse
ListFileResponse:
type: object
properties:
data:
type: array
items:
$ref: '#/components/schemas/FileResponse'
description: List of FileResponse entries
additionalProperties: false
required:
- data
title: ListFileResponse
description: >-
Response representing a list of file entries.
ListModelsResponse:
type: object
properties:
7 changes: 7 additions & 0 deletions docs/source/distributions/self_hosted_distro/ollama.md
@@ -18,6 +18,7 @@ The `llamastack/distribution-ollama` distribution consists of the following prov
| agents | `inline::meta-reference` |
| datasetio | `remote::huggingface`, `inline::localfs` |
| eval | `inline::meta-reference` |
| files | `remote::s3` |
| inference | `remote::ollama` |
| safety | `inline::llama-guard` |
| scoring | `inline::basic`, `inline::llm-as-judge`, `inline::braintrust` |
@@ -36,6 +37,12 @@ The following environment variables can be configured:
- `OLLAMA_URL`: URL of the Ollama server (default: `http://127.0.0.1:11434`)
- `INFERENCE_MODEL`: Inference model loaded into the Ollama server (default: `meta-llama/Llama-3.2-3B-Instruct`)
- `SAFETY_MODEL`: Safety model loaded into the Ollama server (default: `meta-llama/Llama-Guard-3-1B`)
- `AWS_ACCESS_KEY_ID`: AWS access key ID for S3 access (default: ``)
- `AWS_SECRET_ACCESS_KEY`: AWS secret access key for S3 access (default: ``)
- `AWS_REGION_NAME`: AWS region name for S3 access (default: ``)
- `AWS_ENDPOINT_URL`: AWS endpoint URL for S3 access (for custom endpoints) (default: ``)
- `AWS_BUCKET_NAME`: AWS bucket name for S3 access (default: ``)
- `AWS_VERIFY_TLS`: Whether to verify TLS for S3 connections (default: `true`)
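Pointing the S3 provider at the local MinIO container from the workflow means setting `AWS_ENDPOINT_URL` to the MinIO address and reusing the root credentials passed to `docker run`. A sketch of how these variables might be collected into a provider config — the dict keys here are illustrative, not the provider's actual schema:

```python
import os

def s3_files_config(env=None):
    """Collect S3 provider settings from the environment, mirroring the documented defaults."""
    if env is None:
        env = os.environ
    return {
        "aws_access_key_id": env.get("AWS_ACCESS_KEY_ID", ""),
        "aws_secret_access_key": env.get("AWS_SECRET_ACCESS_KEY", ""),
        "region_name": env.get("AWS_REGION_NAME", ""),
        "endpoint_url": env.get("AWS_ENDPOINT_URL", ""),      # e.g. http://localhost:9000 for MinIO
        "bucket_name": env.get("AWS_BUCKET_NAME", ""),
        "verify_tls": env.get("AWS_VERIFY_TLS", "true").lower() == "true",
    }
```

For the CI setup above, `AWS_VERIFY_TLS=false` would typically accompany the plain-HTTP MinIO endpoint.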


## Setting up Ollama server
Expand Down