
[Bug] Qwen3-8B - Qwen3-8B vllm deployment and invocation - 李娇娇 - vllm serve argument bug #459

@nlnlnl123

Description


Affected model

Qwen3-8B

Affected tutorial

Qwen3-8B vllm 部署调用 (Qwen3-8B vllm deployment and invocation)

Tutorial maintainer

李娇娇

Bug description

Running vllm serve with the --enable-reasoning flag set causes an error.

(screenshot: error output)

Steps to reproduce

1. Run the command: VLLM_USE_MODELSCOPE=true vllm serve /root/autodl-tmp/Qwen/Qwen3-8B --served-model-name Qwen3-8B --max_model_len 8192 --enable-reasoning --reasoning-parser deepseek_r1
2. The server fails to start with the error: "vllm: error: unrecognized arguments: --enable-reason"

Expected behavior

(screenshot: expected behavior)

Environment information

(screenshot: environment details)

Additional information

Workaround: remove the --enable-reasoning flag from the command; see the adjusted command below.
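Removing the flag is consistent with newer vLLM releases, where --enable-reasoning was deprecated and later dropped and passing --reasoning-parser by itself is enough to enable reasoning-content parsing; treat this as an assumption about the installed version rather than a confirmed root cause. A minimal sketch of the adjusted launch command, identical to the original except for the removed flag:

# Assumes a vLLM build that enables reasoning via --reasoning-parser alone
VLLM_USE_MODELSCOPE=true vllm serve /root/autodl-tmp/Qwen/Qwen3-8B \
    --served-model-name Qwen3-8B \
    --max_model_len 8192 \
    --reasoning-parser deepseek_r1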

(screenshot: output after removing the flag)
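Once the server is up, a quick way to confirm the deployment and the reasoning parser is to query the OpenAI-compatible endpoint that vllm serve exposes. The localhost host and port 8000 below are the defaults and are assumptions; adjust them if the server was started with --host or --port:

# Hypothetical smoke test against the served model
curl http://localhost:8000/v1/chat/completions \
    -H "Content-Type: application/json" \
    -d '{"model": "Qwen3-8B", "messages": [{"role": "user", "content": "Which is larger, 9.11 or 9.9?"}], "max_tokens": 512}'

If the deepseek_r1 parser is active, the returned message should include a reasoning_content field alongside the usual content field.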

Verification

  • This issue has not been reported in a previous issue
