feat: bump llama-stack to 0.2.23 #175
Annotations
1 error and 1 warning
build-test-push (linux/amd64): error
```
buildx failed with: ERROR: failed to build: failed to solve: process "/bin/sh -c pip install aiosqlite asyncpg autoevals boto3 chardet 'datasets>=4.0.0' fastapi fire google-cloud-aiplatform httpx ibm_watsonx_ai litellm matplotlib 'mcp>=1.8.1' nltk numpy opentelemetry-exporter-otlp-proto-http opentelemetry-sdk pandas pillow psycopg2-binary 'pymilvus[milvus-lite]>=2.4.10' 'pymilvus[milvus-lite][milvus-lite]>=2.4.10' pymongo pypdf redis requests scikit-learn scipy sentencepiece sqlalchemy[asyncio] tqdm transformers uvicorn" did not complete successfully: exit code: 1
```
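One suspect is visible in the failed command itself: the package list contains both `'pymilvus[milvus-lite]>=2.4.10'` and the malformed `'pymilvus[milvus-lite][milvus-lite]>=2.4.10'`. Doubled extras brackets are not valid PEP 508 syntax, and pip rejects an invalid requirement with exit code 1. The log above is truncated before the actual pip error, so this is only a hypothesis, but it can be checked locally with the `packaging` library (the same parser pip uses):

```python
# Hedged check: does each requirement string from the failing layer parse
# as a valid PEP 508 dependency specifier?
from packaging.requirements import Requirement, InvalidRequirement

def is_valid(spec: str) -> bool:
    """Return True if `spec` parses as a PEP 508 requirement."""
    try:
        Requirement(spec)
        return True
    except InvalidRequirement:
        return False

print(is_valid("pymilvus[milvus-lite]>=2.4.10"))               # True
print(is_valid("pymilvus[milvus-lite][milvus-lite]>=2.4.10"))  # False: doubled extras
```

If the second spec is the culprit, deduplicating the pymilvus entry wherever the dependency list is generated (e.g. in the build definition that bumped llama-stack to 0.2.23) should fix the layer.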
build-test-push (linux/amd64): warning
```
No file matched to [/home/runner/work/llama-stack-distribution/llama-stack-distribution/**/*requirements*.txt,/home/runner/work/llama-stack-distribution/llama-stack-distribution/**/*requirements*.in,/home/runner/work/llama-stack-distribution/llama-stack-distribution/**/*constraints*.txt,/home/runner/work/llama-stack-distribution/llama-stack-distribution/**/*constraints*.in,/home/runner/work/llama-stack-distribution/llama-stack-distribution/**/pyproject.toml,/home/runner/work/llama-stack-distribution/llama-stack-distribution/**/uv.lock]. The cache will never get invalidated. Make sure you have checked out the target repository and configured the cache-dependency-glob input correctly.
```
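This warning comes from the uv cache step: its `cache-dependency-glob` matched no files in the workspace, so the cache key never changes. The usual causes are running the cache step before `actions/checkout`, or a glob that doesn't match any dependency file in this repo. A minimal sketch of the fix (action versions and the glob are assumptions; adjust to this repo's actual workflow and dependency files):

```yaml
# Hedged sketch: check out the repo before the uv setup step, and point
# cache-dependency-glob at files that actually exist in the workspace.
- uses: actions/checkout@v4
- uses: astral-sh/setup-uv@v5
  with:
    enable-cache: true
    # Assumption: dependencies are pinned in a requirements file somewhere
    # in the repo; widen or narrow this glob to match the real layout.
    cache-dependency-glob: "**/requirements*.txt"
```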
Artifacts
Produced during runtime
| Name | Size | Digest |
|---|---|---|
| ci-logs-13b51d3b66526b01385e7345cf44b2cb44f84363 (Expired) | 1.02 KB | sha256:408ad8fb8675a6d55d89153b8b8995c4c8b0d12d4742c42ef47e29039b99be0f |
| opendatahub-io~llama-stack-distribution~MTEWT1.dockerbuild (Expired) | 16.9 KB | sha256:357adbd2a55f6ecba0a2996c76f79a9f9afabd61d51b727749f6333bb77b1831 |