dmr-llm-model #52

Merged

mathieu-benoit merged 1 commit into main from llm-model-dmr on Jan 3, 2026
Conversation

@mathieu-benoit (Contributor) commented Jan 3, 2026

Fixes #28. Now that score-spec/score-compose#392 is included in score-compose 0.30.0 (https://github.com/score-spec/score-compose/releases/tag/0.30.0), we can add the official LLM model support with Docker Compose via the Docker Model Runner (DMR).
Given this score.yaml file:

apiVersion: score.dev/v1b1
metadata:
  name: open-webui
containers:
  open-webui:
    image: .
    variables:
      OPENAI_API_BASE_URL: "${resources.smollm2.url}"
      WEBUI_NAME: "Hello, DMR with Score Compose!"
    volumes:
      /app/backend/data:
        source: ${resources.data}
resources:
  data:
    type: volume
  gemma3:
    type: llm-model
    params:
      model: ai/gemma3:270M-UD-IQ2_XXS
  smollm2:
    type: llm-model
    params:
      model: ai/smollm2:135M-Q2_K
service:
  ports:
    tcp:
      port: 8080
      targetPort: 8080

By running these commands:

score-compose init \
  --provisioners ./llm-model/score-compose/10-dmr-llm-model.provisioners.yaml

score-compose generate score.yaml \
  --image ghcr.io/open-webui/open-webui:main-slim \
  --publish 8080:open-webui:8080 \
  --output compose.yaml
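Since the llm-model resource type first shipped in score-compose 0.30.0, it may be worth checking the installed version before running init. A minimal sketch, assuming the CLI prints a semantic version string on `score-compose --version` (the parsing here is an assumption about the output format):

```shell
# llm-model resources require score-compose >= 0.30.0 (score-spec/score-compose#392).
required="0.30.0"
# Extract the first x.y.z token from the version output; empty if not installed.
current="$(score-compose --version 2>/dev/null | grep -oE '[0-9]+\.[0-9]+\.[0-9]+' | head -n1)"
# sort -V puts the smaller version first, so if "required" sorts first (or equal),
# the installed version is new enough.
if [ -n "$current" ] && [ "$(printf '%s\n%s\n' "$required" "$current" | sort -V | head -n1)" = "$required" ]; then
  echo "score-compose $current supports llm-model resources"
else
  echo "need score-compose >= $required (found: ${current:-none})"
fi
```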

They generate the following compose.yaml file:

name: community-provisioners
services:
    open-webui-open-webui:
        annotations:
            compose.score.dev/workload-name: open-webui
        environment:
            OPENAI_API_BASE_URL: http://172.17.0.1:12434/engines/v1/
            WEBUI_NAME: Hello, DMR with Score Compose!
        hostname: open-webui
        image: ghcr.io/open-webui/open-webui:main-slim
        models:
            open-webui.gemma3: {}
            open-webui.smollm2: {}
        ports:
            - target: 8080
              published: "8080"
        volumes:
            - type: volume
              source: open-webui-data-7BKJ1L
              target: /app/backend/data
volumes:
    open-webui-data-7BKJ1L:
        name: open-webui-data-7BKJ1L
        driver: local
        labels:
            dev.score.compose.res.uid: volume.default#open-webui.data
models:
    open-webui.gemma3:
        model: ai/gemma3:270M-UD-IQ2_XXS
        context_size: 2048
    open-webui.smollm2:
        model: ai/smollm2:135M-Q2_K
        context_size: 2048
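The generated compose.yaml points the app at DMR's OpenAI-compatible API via OPENAI_API_BASE_URL (http://172.17.0.1:12434/engines/v1/, the Docker bridge address as seen from the container). A quick smoke test of that API from the host, assuming the same endpoint is reachable on localhost:12434 (this depends on DMR's host-side TCP support being enabled):

```shell
# OpenAI-compatible endpoint exposed by Docker Model Runner; from inside the
# container the compose file uses 172.17.0.1 instead of localhost.
base_url="http://localhost:12434/engines/v1"

# List the models DMR is serving (should include gemma3 and smollm2).
curl -sf "$base_url/models" || echo "DMR endpoint not reachable"

# Send a standard OpenAI-style chat-completion request to one of the models.
payload='{"model":"ai/smollm2:135M-Q2_K","messages":[{"role":"user","content":"Say hello in one word."}]}'
curl -sf "$base_url/chat/completions" \
  -H "Content-Type: application/json" \
  -d "$payload" || echo "DMR endpoint not reachable"
```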

Finally, run docker compose up -d. Once everything is up:

docker model list:

MODEL NAME              PARAMETERS  QUANTIZATION  ARCHITECTURE  MODEL ID      CREATED       CONTEXT  SIZE       
gemma3:270M-UD-IQ2_XXS  268.10 M    IQ4_NL        gemma3        bae1ee104a16  3 months ago           165.54 MiB  
smollm2:135M-Q2_K       134.52 M    Q2_K          llama         eba11bf8f361  8 months ago           82.41 MiB

docker images:

IMAGE                                     ID             DISK USAGE   CONTENT SIZE   EXTRA
docker/model-runner:latest                5c7b30452d7f        442MB             0B    U   
ghcr.io/open-webui/open-webui:main-slim   bc3fcbf38b91       4.08GB             0B    U

The app is then available on localhost:8080, ready to use the two local LLM models:

(screenshot: Open WebUI running with the two local models)

Signed-off-by: Mathieu Benoit <mathieu-benoit@hotmail.fr>
@mathieu-benoit mathieu-benoit linked an issue Jan 3, 2026 that may be closed by this pull request
@mathieu-benoit mathieu-benoit self-assigned this Jan 3, 2026
@mathieu-benoit mathieu-benoit merged commit c594152 into main Jan 3, 2026
2 checks passed
@mathieu-benoit mathieu-benoit deleted the llm-model-dmr branch January 3, 2026 16:21


Development

Successfully merging this pull request may close these issues.

score-compose - example with Docker Model Runner
