[bug] llm-docsum test fails on vllm backend #1948

@zhihangdeng

Description

llms_doc-summarization_vllm timeout failure. @zhihangdeng Could you take a look at it? I tested the main branch and it works fine. https://github.com/opea-project/GenAIComps/actions/runs/19491488676/job/55784405366

I tried to figure out the cause, and the timeout does not appear to be related to this PR.
The logs show that the workflow hangs and eventually times out after the following part of the test:

```
2025-11-19T08:03:55.6111789Z + docker build --no-cache -t opea/llm-docsum:comps --build-arg https_proxy= --build-arg http_proxy= -f comps/llms/src/doc-summarization/Dockerfile .
...
2025-11-19T08:08:40.2116224Z ++ curl -s -o /dev/null -w '%{http_code}' -X POST -d '{"messages":"Text Embeddings Inference (TEI) is a toolkit for deploying and serving open source text embeddings and sequence classification models. TEI enables high-performance extraction for the most popular models, including FlagEmbedding, Ember, GTE and E5.", "max_tokens":32, "language":"en", "summary_type": "truncate", "chunk_size": 2000}' -H 'Content-Type: application/json' http://192.168.122.213:10507/v1/docsum
```
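
For local debugging, here is a minimal sketch of the same request with a client-side timeout, which turns the hang into a fast failure instead of blocking until the CI job's timeout fires. The endpoint address is copied from the log above and would need to be replaced with the local service address; the timeout values are arbitrary, and this is a diagnostic suggestion, not part of the existing test script.

```bash
#!/usr/bin/env bash
# Diagnostic sketch, not part of the test script.
# Assumption: endpoint copied from the CI log; point it at your local llm-docsum service.
ENDPOINT="http://192.168.122.213:10507/v1/docsum"

# --max-time bounds the whole request, so a hung service makes curl exit
# (exit code 28) instead of stalling the workflow indefinitely.
http_code=$(curl -s -o /dev/null -w '%{http_code}' \
  --connect-timeout 10 --max-time 60 \
  -X POST -H 'Content-Type: application/json' \
  -d '{"messages":"Text Embeddings Inference (TEI) is a toolkit for deploying and serving open source text embeddings and sequence classification models.", "max_tokens":32, "language":"en", "summary_type": "truncate", "chunk_size": 2000}' \
  "$ENDPOINT")

echo "HTTP status: ${http_code:-none}"
[ "$http_code" = "200" ] || { echo "docsum request failed or timed out"; exit 1; }
```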

Full workflow logs: logs_50254316498.zip

Originally posted by @zhihangdeng in #1939 (comment)
