Conversation

@ChangyiYang
Contributor

Motivation

Related issue: #16132

Diffusers sets the max token length to 1024:
https://github.com/huggingface/diffusers/blob/1cdb8723b85f1b427031e390e0bd0bebfe92454e/src/diffusers/pipelines/qwenimage/pipeline_qwenimage.py#L175
Increase it to align with diffusers.

Thanks to Yuhao for pointing this out!

Modifications

Set the max token length for Qwen-Image from 256 to 1024.
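The effect of this change can be sketched as follows. This is an illustrative snippet, not the actual SGLang or diffusers code: `truncate_prompt_tokens` and the constants are hypothetical names standing in for the tokenizer's `max_length` truncation.

```python
# Hypothetical sketch of why a low max token length breaks long prompts.
OLD_MAX_TOKENS = 256   # previous SGLang limit for Qwen-Image
NEW_MAX_TOKENS = 1024  # aligned with the diffusers qwenimage pipeline

def truncate_prompt_tokens(token_ids, max_length):
    """Drop tokens past max_length, mirroring typical tokenizer truncation."""
    return token_ids[:max_length]

# A long prompt tokenized to 600 ids (simulated here as a list of ints).
long_prompt_ids = list(range(600))

kept_old = truncate_prompt_tokens(long_prompt_ids, OLD_MAX_TOKENS)
kept_new = truncate_prompt_tokens(long_prompt_ids, NEW_MAX_TOKENS)

print(len(kept_old))  # 256 -> the tail of the prompt is silently lost
print(len(kept_new))  # 600 -> the full prompt survives
```

With the old limit, everything after token 256 was dropped before reaching the text encoder, so long prompts were only partially followed; with the 1024 limit, prompts of this length pass through intact.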

Accuracy Tests

A max length of 256 caused the model to fail on long prompts; with 1024, the model now correctly follows long prompts word for word.

Benchmarking and Profiling

Checklist

Review Process

  1. Ping Merge Oncalls to start the PR flow. See the PR Merge Process.
  2. Get approvals from CODEOWNERS and other reviewers.
  3. Trigger CI tests with comments (/tag-run-ci-label, /rerun-failed-ci, /tag-and-rerun-ci) or contact authorized users to do so.
  4. After green CI and required approvals, ask Merge Oncalls to merge.

@mickqian
Collaborator

mickqian commented Jan 1, 2026

/tag-and-rerun-ci

@github-actions github-actions bot added the run-ci label Jan 1, 2026
Collaborator

@mickqian mickqian left a comment

brilliant!

Labels

diffusion SGLang Diffusion run-ci

2 participants