Aanuf/data free awq #3315


Open

wants to merge 30 commits into base: develop

Conversation

andreyanufr
Collaborator

@andreyanufr andreyanufr commented Feb 26, 2025

Changes

Data-free AWQ: smooth the down_proj input channels and merge the extra scale into the up_proj output channels.

The new method is enabled when AWQ is requested but no dataset is available.
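The rescaling idea can be sketched as follows. In a LLaMA-style MLP the input to down_proj is act(gate_proj(x)) * up_proj(x), which is linear in up_proj's output, so a per-channel scale on down_proj's input channels can be folded into up_proj's output channels without changing the result. The scale heuristic below (weight magnitudes raised to a fixed alpha) is an illustrative assumption, not the exact NNCF implementation:

```python
# Sketch of the data-free AWQ rescaling (illustrative; the scale heuristic
# is an assumption, not the actual NNCF code).
import numpy as np

rng = np.random.default_rng(0)
hidden, inter = 8, 16
up_proj = rng.normal(size=(inter, hidden))    # y = up_proj @ x
down_proj = rng.normal(size=(hidden, inter))  # z = down_proj @ y

# Data-free scale: derived from down_proj weight magnitudes only, no activations.
alpha = 0.5                                   # assumed; the PR searches alpha in [alpha_min, alpha_max]
s = np.abs(down_proj).max(axis=0) ** alpha    # one scale per down_proj input channel
s = np.clip(s, 1e-8, None)                    # avoid division by zero

down_smoothed = down_proj * s[None, :]        # smooth down_proj input channels
up_merged = up_proj / s[:, None]              # merge the extra 1/s into up_proj output channels

# The rescaled pair is functionally equivalent to the original.
x = rng.normal(size=hidden)
assert np.allclose(down_proj @ (up_proj @ x), down_smoothed @ (up_merged @ x))
```

The equivalence holds because `(D diag(s)) (diag(1/s) U) = D U`; quantization error changes, but the float model does not.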

Reason for changes

@github-actions github-actions bot added the NNCF PTQ Pull requests that updates NNCF PTQ label Feb 26, 2025
@github-actions github-actions bot added the NNCF OpenVINO Pull requests that updates NNCF OpenVINO label Feb 27, 2025
@github-actions github-actions bot added the API Public API-impacting changes label Mar 4, 2025
@andreyanufr andreyanufr requested a review from ljaljushkin March 13, 2025 08:16
@github-actions github-actions bot removed the API Public API-impacting changes label Apr 9, 2025
@andreyanufr andreyanufr marked this pull request as ready for review April 10, 2025 08:47
@andreyanufr andreyanufr requested a review from a team as a code owner April 10, 2025 08:47
@andreyanufr andreyanufr requested a review from ljaljushkin April 10, 2025 08:47
Contributor

@ljaljushkin ljaljushkin left a comment

The test coverage is poor for the new algorithm, which is going to be the default option.

A functionality test for the data-free method is definitely needed: one that verifies it finds the expected alpha (as in the template test) and that accuracy improves (conformance test).

As far as I understand, this PR introduces new behavior for weight compression with the dataset and awq options. It would be great to cover the typical user scenarios and check that the expected path is launched (e.g. via mocker.spy on a specific function).

Contributor

@alexsu52 alexsu52 left a comment

@andreyanufr, please fill in the PR description.

@github-actions github-actions bot added the API Public API-impacting changes label Apr 16, 2025
@@ -542,7 +543,7 @@ def apply(
nodes_to_compress = self.get_nodes_to_compress(graph)

statistics = None
if self._data_aware_mixed_precision or self._data_aware_compression:
if (self._data_aware_mixed_precision or self._data_aware_compression) and dataset:
Collaborator

Could you please redefine self._data_aware_compression as follows:

self._data_aware_compression = (self._awq and self._advanced_parameters.awq_params.is_data_aware) or self._scale_estimation or self._lora_correction or self._gptq

Then we can roll back this if statement to its original form.

Suggested change
if (self._data_aware_mixed_precision or self._data_aware_compression) and dataset:
if self._data_aware_mixed_precision or self._data_aware_compression:

Collaborator Author

Done

Collaborator

I have to change my original suggestion here 🙂 After the recent changes (is_data_aware -> prefer_data_aware), I think it actually makes more sense to define

self._data_aware_compression = self._scale_estimation or self._lora_correction or self._gptq

because otherwise self._data_aware_compression can be True even though data-aware compression is never actually applied. This happens when self._awq is True but no dataset is provided.

And then we can do:

Suggested change
if (self._data_aware_mixed_precision or self._data_aware_compression) and dataset:
data_aware_awq = dataset and self._awq and self._advanced_parameters.awq_params.prefer_data_aware
if self._data_aware_mixed_precision or self._data_aware_compression or data_aware_awq:
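Put together, the dispatch this suggestion produces can be sketched in a minimal, self-contained form (attribute names follow this thread; the class is a simplified stand-in for the real compression algorithm, not NNCF code):

```python
# Simplified stand-in for the weight compression algorithm's dispatch logic
# (hypothetical class; attribute names mirror this review thread).
from dataclasses import dataclass, field

@dataclass
class AWQParams:
    prefer_data_aware: bool = True

@dataclass
class Compressor:
    awq: bool = False
    scale_estimation: bool = False
    lora_correction: bool = False
    gptq: bool = False
    awq_params: AWQParams = field(default_factory=AWQParams)

    @property
    def data_aware_compression(self):
        # Per the updated suggestion: AWQ alone no longer makes this True.
        return self.scale_estimation or self.lora_correction or self.gptq

    def needs_statistics(self, dataset):
        # Statistics are collected only when a data-aware path will run.
        data_aware_awq = bool(dataset) and self.awq and self.awq_params.prefer_data_aware
        return self.data_aware_compression or data_aware_awq

c = Compressor(awq=True)
assert c.needs_statistics(dataset=["sample"])  # data-aware AWQ path
assert not c.needs_statistics(dataset=None)    # falls back to data-free AWQ
```

With this shape, data_aware_compression stays an honest description of the non-AWQ data-aware algorithms, and the AWQ/dataset interaction is decided at the call site.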

@github-actions github-actions bot added the NNCF PT Pull requests that updates NNCF PyTorch label Apr 17, 2025
Comment on lines +279 to +288
:param prefer_data_aware: Determines whether to use activations to calculate scales if activations are presented.
:type prefer_data_aware: bool
"""

subset_size: int = 32
percent_to_apply: float = 0.002
alpha_min: float = 0.0
alpha_max: float = 1.0
steps: int = 100
prefer_data_aware: bool = True
Contributor

Suggested change
:param prefer_data_aware: Determines whether to use activations to calculate scales if activations are presented.
:type prefer_data_aware: bool
"""
subset_size: int = 32
percent_to_apply: float = 0.002
alpha_min: float = 0.0
alpha_max: float = 1.0
steps: int = 100
prefer_data_aware: bool = True
:param use_data_aware_scaling: Whether to use activation data for scale calculation when available.
:type use_data_aware_scaling: bool
"""
subset_size: int = 32
percent_to_apply: float = 0.002
alpha_min: float = 0.0
alpha_max: float = 1.0
steps: int = 100
use_data_aware_scaling: bool = True

What do you think about this?
