
Fix smoothquant DDP test and refactor the test #2557

Merged

HDCharles merged 3 commits into main from 91_fix_ddp_smooth
Apr 2, 2026
Conversation

@HDCharles
Collaborator

The original test had an issue where it would call tok.apply_chat_template without a chat template being defined.

This was hard to catch because the whole script was in a string.

I turned everything into functions and pulled the DDP part into a separate file to make it all more readable.

Test Plan:
pytest /home/HDCharles/repos/llm-compressor/tests/llmcompressor/modifiers/transform/smoothquant/test_smoothquant_distributed.py

@HDCharles HDCharles requested a review from dsikka April 2, 2026 02:27
@github-actions

github-actions bot commented Apr 2, 2026

👋 Hi! Thank you for contributing to llm-compressor. Please add the ready label when the PR is ready for review.

Note: This is required to complete the testing suite, please only add the label once the PR is code complete and local testing has been performed.

@HDCharles HDCharles added the ready When a PR is ready for review label Apr 2, 2026
Contributor

@gemini-code-assist gemini-code-assist bot left a comment


Code Review

This pull request introduces a distributed runner for SmoothQuant testing and updates the corresponding integration tests to use this runner. The review identified a potential issue where the number of calibration samples passed to the oneshot function in the distributed runner should be based on the local dataset shard size rather than the total sample count. Additionally, improvements were suggested for the command-line argument validation to use standard error output and cleaner string formatting.
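The shard-size point can be illustrated with a minimal sketch (hypothetical helper and round-robin sharding; the oneshot call appears only as a comment, since its real signature belongs to llm-compressor):

```python
# Sketch: shard calibration data across DDP ranks, then size
# num_calibration_samples from the *local* shard, not the global count.
def shard_for_rank(samples, rank, world_size):
    """Round-robin shard: each rank keeps every world_size-th sample."""
    return samples[rank::world_size]

total = list(range(10))  # stand-in for a calibration dataset
world_size = 4
for rank in range(world_size):
    local = shard_for_rank(total, rank, world_size)
    # oneshot(..., num_calibration_samples=len(local))  # per-rank count
    print(rank, len(local))
```

Passing the global count to every rank would tell each worker to expect more samples than its shard actually contains.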

Collaborator

@brian-dellabetta brian-dellabetta left a comment


LGTM!

Collaborator

@kylesayrs kylesayrs left a comment


This will run as part of the per-commit tests. Typically we want the oneshot tests to run as part of the transformers tests.

@HDCharles HDCharles force-pushed the 91_fix_ddp_smooth branch from 93b3219 to fb91c72 Compare April 2, 2026 19:16
@HDCharles HDCharles merged commit 9c33c00 into main Apr 2, 2026
13 checks passed
@HDCharles HDCharles deleted the 91_fix_ddp_smooth branch April 2, 2026 20:32
2imi9 pushed a commit to 2imi9/llm-compressor that referenced this pull request Apr 3, 2026
Signed-off-by: HDCharles <charlesdavidhernandez@gmail.com>
Signed-off-by: Ziming <frankziming26@outlook.com>

Labels

ready When a PR is ready for review

3 participants