Enhance description and fix URL in provisioners.yaml #55

Merged: mathieu-benoit merged 2 commits into main from mathieu-benoit-patch-2 on Jan 15, 2026
Conversation

@mathieu-benoit (Contributor) commented Jan 15, 2026

  • The default llm-model provisioner now returns the url output for Ollama-compatible clients
  • Added a new llm-model provisioner for OpenAI SDK clients
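
The first change above could look roughly like the entry below. This is a sketch only, assuming the score-compose template-provisioner file schema (a YAML list with uri, type, description, and outputs fields); the uri, provisioner name, and URL value are illustrative, not taken from the actual provisioners.yaml.

```yaml
# Hypothetical provisioners.yaml entry (field names assume the
# score-compose template-provisioner schema; values are illustrative).
- uri: template://community/llm-model
  type: llm-model
  description: Provisions an LLM model with the Docker Model Runner and
    returns the url output for Ollama-compatible clients.
  outputs: |
    url: http://model-runner.docker.internal
```

The OpenAI SDK variant would differ mainly in the URL it emits, since OpenAI clients expect a different base path than Ollama-compatible ones.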

Resources:

The associated blog post has been updated to use the default provisioner with OLLAMA_BASE_URL: https://medium.com/google-cloud/score-docker-compose-to-deploy-your-local-llm-models-10aff89686ce
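
On the consuming side, a Score workload can wire the provisioner's url output into the OLLAMA_BASE_URL variable mentioned in the blog post. The workload name and resource name below are illustrative; the structure follows the Score v1b1 specification.

```yaml
# Hypothetical Score file consuming the llm-model resource type;
# "my-app" and the "model" resource name are placeholders.
apiVersion: score.dev/v1b1
metadata:
  name: my-app
containers:
  app:
    image: my-app:latest
    variables:
      # Resolved by score-compose from the provisioner's url output
      OLLAMA_BASE_URL: ${resources.model.url}
resources:
  model:
    type: llm-model
```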

Commits:

1. Updated the description to specify the output for Ollama-compatible clients and corrected the URL format.
   Signed-off-by: Mathieu Benoit <mathieu-benoit@hotmail.fr>
2. This file defines a provisioner for generating an LLM model using the Docker Model Runner, including supported parameters and expected outputs.
   Signed-off-by: Mathieu Benoit <mathieu-benoit@hotmail.fr>
@mathieu-benoit mathieu-benoit merged commit 5589e93 into main Jan 15, 2026
2 checks passed
@mathieu-benoit mathieu-benoit deleted the mathieu-benoit-patch-2 branch January 15, 2026 19:15