Closed as duplicate of #1947
Description
The `llms_doc-summarization_vllm` test fails with a timeout. @zhihangdeng Could you take a look at it? I tested the main branch and it works fine. https://github.com/opea-project/GenAIComps/actions/runs/19491488676/job/55784405366
I tried to track down the reason, and the cause of the timeout does not appear to be related to this PR.
The logs show that the workflow hangs after the following part of the test and eventually hits the job time limit:
2025-11-19T08:03:55.6111789Z + docker build --no-cache -t opea/llm-docsum:comps --build-arg https_proxy= --build-arg http_proxy= -f comps/llms/src/doc-summarization/Dockerfile .
...
2025-11-19T08:08:40.2116224Z ++ curl -s -o /dev/null -w '%{http_code}' -X POST -d '{"messages":"Text Embeddings Inference (TEI) is a toolkit for deploying and serving open source text embeddings and sequence classification models. TEI enables high-performance extraction for the most popular models, including FlagEmbedding, Ember, GTE and E5.", "max_tokens":32, "language":"en", "summary_type": "truncate", "chunk_size": 2000}' -H 'Content-Type: application/json' http://192.168.122.213:10507/v1/docsum
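The `curl` call above has no per-request timeout, so if the docsum container never becomes ready the test script blocks until the workflow-level limit kills the job. A minimal sketch of a bounded readiness check is below; the function name, the `--max-time`/retry values, and the probe URL handling are illustrative assumptions, not the repo's actual test script.

```shell
#!/bin/sh
# Sketch: poll an endpoint with a per-request timeout and a retry cap so the
# CI job fails fast with a clear message instead of hanging indefinitely.
wait_for_http_200() {
  url=$1
  tries=${2:-10}            # illustrative default: 10 attempts
  i=0
  while [ "$i" -lt "$tries" ]; do
    # --max-time bounds each probe; on failure curl prints nothing, so fall
    # back to "000" as the status code.
    code=$(curl -s -o /dev/null -w '%{http_code}' --max-time 5 "$url" || echo 000)
    [ "$code" = "200" ] && return 0
    i=$((i + 1))
    sleep 1
  done
  echo "service at $url not ready after $tries attempts" >&2
  return 1
}
```

With a guard like this in front of the POST to `http://192.168.122.213:10507/v1/docsum`, a container that fails to start would surface as an explicit test failure within seconds rather than a multi-hour hang.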
Originally posted by @zhihangdeng in #1939 (comment)