feat: embedding provider now defaults to vLLM #533
Annotations
1 warning

build-test-push (linux/amd64)
No file matched to [/home/runner/work/llama-stack-distribution/llama-stack-distribution/**/*requirements*.txt,/home/runner/work/llama-stack-distribution/llama-stack-distribution/**/*requirements*.in,/home/runner/work/llama-stack-distribution/llama-stack-distribution/**/*constraints*.txt,/home/runner/work/llama-stack-distribution/llama-stack-distribution/**/*constraints*.in,/home/runner/work/llama-stack-distribution/llama-stack-distribution/**/pyproject.toml,/home/runner/work/llama-stack-distribution/llama-stack-distribution/**/uv.lock,/home/runner/work/llama-stack-distribution/llama-stack-distribution/**/*.py.lock]. The cache will never get invalidated. Make sure you have checked out the target repository and configured the cache-dependency-glob input correctly.
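This warning pattern is typically emitted by the astral-sh/setup-uv action when its dependency glob matches nothing. A minimal sketch of a fix, assuming that action is in use in this workflow (the step order and version tags below are illustrative, not taken from the actual workflow file):

```yaml
# Hypothetical workflow fragment; assumes the warning comes from astral-sh/setup-uv.
jobs:
  build-test-push:
    runs-on: ubuntu-latest
    steps:
      # Checkout must run before setup-uv, otherwise the glob has no files to match.
      - uses: actions/checkout@v4
      - uses: astral-sh/setup-uv@v5
        with:
          enable-cache: true
          # Point cache-dependency-glob at a file that actually exists in the repo,
          # so the cache key changes when dependencies change.
          cache-dependency-glob: "**/uv.lock"
```

The key points are that the repository must be checked out before the setup step runs, and `cache-dependency-glob` must match at least one tracked file; otherwise the cache key never changes and the cache is never invalidated.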
Artifacts
Produced during runtime
| Name | Size | Digest |
|---|---|---|
| ci-logs-63a8451b9fffe79643860348e38f06896c234beb (Expired) | 13.6 KB | sha256:cc73421ca1f36be065b7d338e49a8a31a46055548b7c809ec3a43d2d946b816c |
| opendatahub-io~llama-stack-distribution~ZM8X4W.dockerbuild (Expired) | 46.9 KB | sha256:b86c600c0ce8e39d9a444df8835603277410cb18b3c2f6fe37c7db591d131669 |