Conversation

@zhihangdeng (Collaborator)

This pull request adds support for building and testing the document summarization microservice using the openEuler OS in addition to the existing setup. It introduces a new Dockerfile.openEuler for openEuler-based builds and updates the test scripts to build and validate both the default and openEuler Docker images. The test scripts are also refactored for improved maintainability.

The most important changes are:

openEuler Docker support:

  • Added a new Dockerfile.openEuler in comps/llms/src/doc-summarization that builds the document summarization microservice on an openEuler Python 3.11 base image, with the dependency installation and environment setup needed for air-gapped operation and model caching (a sketch follows below).
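A minimal sketch of what such a Dockerfile might look like; the base image tag, file layout, and entrypoint name below are illustrative assumptions, not the merged file:

```dockerfile
# Hypothetical sketch, not the merged Dockerfile.openEuler; base image tag,
# file layout, and entrypoint name are assumptions for illustration.
FROM openeuler/python:3.11.13-oe2403lts

# Run the service as a non-root user.
RUN useradd -m -s /bin/bash user

COPY comps /home/user/comps

RUN pip install --no-cache-dir --upgrade pip && \
    pip install --no-cache-dir -r /home/user/comps/llms/src/doc-summarization/requirements.txt

# Point the Hugging Face cache at a mountable volume so pre-downloaded
# models can be injected for air-gapped runs (e.g. -v $DATA_PATH:/data).
ENV PYTHONPATH=/home/user
ENV HF_HOME=/data

USER user
WORKDIR /home/user/comps/llms/src/doc-summarization

ENTRYPOINT ["python", "opea_docsum_microservice.py"]
```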

Test script enhancements:

  • Updated both test_llms_doc-summarization_tgi.sh and test_llms_doc-summarization_vllm.sh to:
    • Allow building Docker images from a specified Dockerfile, supporting both the default and openEuler versions (sketched after this list).
    • Add steps to build and test with the new openEuler Dockerfile, including air-gapped scenarios when DATA_PATH is set.
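A minimal bash sketch of the parameterized build step described above; the function name, the WORKPATH variable, and the image tags are assumptions for illustration, not the actual script contents:

```bash
# Hypothetical sketch of a parameterized build helper; function name,
# WORKPATH, and image tags are assumptions, not the actual test script.
build_docker_images() {
    local dockerfile_name=${1:-Dockerfile}      # Dockerfile or Dockerfile.openEuler
    local image_tag=${2:-opea/llm-docsum:comps}

    cd "$WORKPATH"
    if ! docker build --no-cache -t "$image_tag" \
        -f "comps/llms/src/doc-summarization/${dockerfile_name}" .; then
        echo "${image_tag} build failed"
        exit 1
    fi
}

# Build both variants; exercise the air-gapped path only when DATA_PATH
# points at a pre-populated model cache.
build_docker_images "Dockerfile"
build_docker_images "Dockerfile.openEuler" "opea/llm-docsum:comps-openeuler"
if [ -n "$DATA_PATH" ]; then
    docker run -d --name llm-docsum-openeuler \
        -v "$DATA_PATH":/data -e HF_HUB_OFFLINE=1 \
        opea/llm-docsum:comps-openeuler
fi
```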

Test script refactoring:

  • Renamed the stop_docker function to stop_service in both test scripts for clearer intent, and updated all call sites accordingly.

@zhihangdeng zhihangdeng force-pushed the main branch 3 times, most recently from 529a381 to 5c1875e on November 6, 2025 at 02:47
@zhihangdeng zhihangdeng requested a review from lianhao November 11, 2025 01:55
@joshuayao joshuayao self-requested a review November 18, 2025 01:55
@ZePan110 (Collaborator) commented Nov 20, 2025

llms_doc-summarization_vllm timeout failure. @zhihangdeng Could you take a look at it? I tested the main branch and it works fine. https://github.com/opea-project/GenAIComps/actions/runs/19491488676/job/55784405366

@zhihangdeng (Collaborator, Author)

> llms_doc-summarization_vllm timeout failure. @zhihangdeng Could you take a look at it? I tested the main branch and it works fine. https://github.com/opea-project/GenAIComps/actions/runs/19491488676/job/55784405366

I tried to figure out the reason, and the timeout does not appear to be related to this PR. The logs show that the workflow hangs and eventually times out after the following part of the test:

2025-11-19T08:03:55.6111789Z + docker build --no-cache -t opea/llm-docsum:comps --build-arg https_proxy= --build-arg http_proxy= -f comps/llms/src/doc-summarization/Dockerfile .
...
2025-11-19T08:08:40.2116224Z ++ curl -s -o /dev/null -w '%{http_code}' -X POST -d '{"messages":"Text Embeddings Inference (TEI) is a toolkit for deploying and serving open source text embeddings and sequence classification models. TEI enables high-performance extraction for the most popular models, including FlagEmbedding, Ember, GTE and E5.", "max_tokens":32, "language":"en", "summary_type": "truncate", "chunk_size": 2000}' -H 'Content-Type: application/json' http://192.168.122.213:10507/v1/docsum

logs_50254316498.zip
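Not part of this PR, but for anyone reproducing the hang: bounding the validation request keeps the job from idling until the workflow-level timeout. A generic debugging sketch, where host_ip and the 300-second limit are placeholders:

```bash
# Generic debugging sketch: --max-time bounds the whole request, so a hung
# service yields "000" quickly instead of blocking until the job times out.
http_code=$(curl -s -o /dev/null -w '%{http_code}' --max-time 300 -X POST \
    -d '{"messages":"TEI is a toolkit for serving text embedding models.", "max_tokens":32, "language":"en", "summary_type": "truncate", "chunk_size": 2000}' \
    -H 'Content-Type: application/json' \
    "http://${host_ip}:10507/v1/docsum")
echo "HTTP status: ${http_code}"
```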
