
Conversation


@jangel97 jangel97 commented Dec 11, 2025

While running llmcompressor, I saw the following warning:

"`torch_dtype` is deprecated! Use `dtype` instead!"

This PR replaces the deprecated torch_dtype parameter with dtype in transformers.from_pretrained() calls.

Since support for the dtype argument was introduced in transformers v4.56.1, this change also bumps the minimum supported transformers version accordingly.

Transformers ≥4.56.1 requires Python ≥3.9; however, llmcompressor already requires Python ≥3.10, so this change does not reduce Python compatibility for existing users.
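The version boundary described above can be sketched as a small compatibility helper (hypothetical, not part of this PR; `dtype_kwarg` and `_parse` are illustrative names) that picks the keyword the installed transformers release accepts:

```python
def _parse(v: str) -> tuple:
    """Parse a dotted version string like "4.56.1" into a comparable tuple."""
    return tuple(int(p) for p in v.split(".")[:3])

def dtype_kwarg(transformers_version: str, dtype):
    """Pick the from_pretrained keyword for the given transformers version.

    transformers >= 4.56.1 accepts `dtype`; older releases only accept the
    deprecated `torch_dtype` spelling.
    """
    if _parse(transformers_version) >= _parse("4.56.1"):
        return {"dtype": dtype}
    return {"torch_dtype": dtype}
```

Since this PR raises the lower bound to 4.56.1, llmcompressor itself no longer needs such a shim; it only illustrates where the cutoff sits.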

The goal of this PR is to eliminate the deprecation warning and align with current transformers APIs.

Related discussion: vllm-project/vllm#26293

@github-actions

👋 Hi! Thank you for contributing to llm-compressor. Please add the ready label when the PR is ready for review.

Note: this is required to complete the testing suite; please only add the label once the PR is code complete and local testing has been performed.

@gemini-code-assist
Contributor

Summary of Changes

Hello @jangel97, I'm Gemini Code Assist! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!

This pull request modernizes the project's interaction with the Hugging Face transformers library by adopting the non-deprecated dtype parameter for model loading. This update resolves a deprecation warning and ensures compatibility with newer transformers versions, requiring a corresponding bump in the minimum supported transformers dependency. The changes are straightforward, involving direct parameter name replacements and a version constraint update.

Highlights

  • API Modernization: Replaced the deprecated torch_dtype parameter with the current dtype parameter in all transformers.from_pretrained() calls across the codebase. This change aligns the project with the latest transformers API standards.
  • Dependency Update: Updated the minimum required transformers library version from 4.54.0 to 4.56.1 in setup.py. This version bump is necessary as support for the dtype argument was introduced in transformers v4.56.1.
  • Warning Elimination: The primary goal of these changes is to eliminate the "torch_dtype is deprecated! Use dtype instead!" warning observed during llmcompressor execution, leading to a cleaner console output.
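The dependency bump in the highlights above amounts to a one-line constraint change. The fragment below is an illustrative sketch, not the actual `setup.py` from the repository (the real `install_requires` list contains more entries, and `torch>=2.0` here is a placeholder):

```python
# Illustrative sketch of the setup.py dependency change.
install_requires = [
    "transformers>=4.56.1",  # was "transformers>=4.54.0"; 4.56.1 adds the `dtype` kwarg
    "torch>=2.0",            # placeholder for the project's other pinned dependencies
]
```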


@gemini-code-assist gemini-code-assist bot left a comment


Code Review

This pull request effectively addresses the deprecation warning related to torch_dtype by replacing it with dtype in transformers.from_pretrained() calls across the codebase. The setup.py file has been appropriately updated to reflect the new minimum transformers version (4.56.1), which is necessary for the dtype argument to be supported. The changes are clear, concise, and directly resolve the stated objective of aligning with current transformers APIs and eliminating the deprecation warning. No further issues were identified in the reviewed changes.

@jangel97 jangel97 changed the title Replace deprecated torch_dtype parameter with dtype in transformers Replace deprecated torch_dtype with dtype in transformers.from_pretrained Dec 11, 2025
@jangel97 jangel97 changed the title Replace deprecated torch_dtype with dtype in transformers.from_pretrained Replace deprecated torch_dtype with dtype in transformers.from_pretrained method invocations Dec 11, 2025
@jangel97 jangel97 changed the title Replace deprecated torch_dtype with dtype in transformers.from_pretrained method invocations Fix deprecated torch_dtype usage in transformers loading Dec 11, 2025
@jangel97 jangel97 marked this pull request as ready for review December 11, 2025 00:48
Collaborator

@kylesayrs kylesayrs left a comment


Thanks for the contribution!

@dsikka @dhuangnm Thoughts on updating the LLM Compressor lower bound?

@dsikka
Collaborator

dsikka commented Dec 11, 2025

Looks like a reasonable change.

@dsikka dsikka requested a review from dhuangnm December 11, 2025 17:32
@dsikka dsikka added the ready When a PR is ready for review label Dec 11, 2025
@dhuangnm
Collaborator

Thanks for implementing the fix @jangel97 !

We also have tests, examples, etc. that use torch_dtype, e.g.:

./tests/e2e/e2e_utils.py
./examples/quantizing_moe/qwen_example.py
./src/llmcompressor/entrypoints/utils.py
...

Can you please do a find-and-replace across all the files that are using torch_dtype?
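One way to script the requested sweep (a hypothetical helper, not code from this PR): it only rewrites the keyword-argument spelling `torch_dtype=`, so other occurrences of the name still need manual review, and the resulting diff should be inspected before committing.

```python
from pathlib import Path

def rename_kwarg(source: str) -> str:
    """Rewrite the deprecated keyword-argument spelling to the new one."""
    return source.replace("torch_dtype=", "dtype=")

def sweep(root: str) -> list:
    """Apply rename_kwarg to every .py file under root; return changed paths."""
    changed = []
    for path in Path(root).rglob("*.py"):
        text = path.read_text()
        new = rename_kwarg(text)
        if new != text:
            path.write_text(new)
            changed.append(str(path))
    return changed
```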

Replace deprecated torch_dtype parameter with dtype in transformers
from_pretrained calls. Bump minimum transformers version to 4.56.1
where dtype parameter was introduced.

Signed-off-by: Jose Angel Morena <[email protected]>
@jangel97 jangel97 force-pushed the deprecation/torch-dtype-to-dtype branch from a089bfb to 6cb1ec0 on December 14, 2025 16:56
@jangel97
Author

@dhuangnm good point, I just did!

Collaborator

@dhuangnm dhuangnm left a comment


Thanks, LGTM!



4 participants