🐞 Bug: cannot export model with INT8_PTQ and INT8_ACQ #3196

@maxxgx

Description

Describe the bug

Exporting a trained model with the INT8_PTQ or INT8_ACQ compression type fails with `ValueError: Datamodule must be provided for OpenVINO INT8_PTQ compression` (full traceback in the Logs section below).

Dataset

Other (please specify in the text field below)

Model

N/A

Steps to reproduce the behavior

Using Geti inspect:

  1. Train a model
  2. Export it with INT8_PTQ or INT8_ACQ

OS information

Tested on macOS

Expected behavior

Export completes successfully

Screenshots

No response

Pip/GitHub

GitHub

What version/branch did you use?

feature/geti-inspect

Configuration YAML

?

Logs

  File "/Users/xiangxi2/forks/anomalib/application/backend/src/services/model_service.py", line 209, in _run_export
    engine.export(
    │      └ <function Engine.export at 0x3375cd120>
    └ <anomalib.engine.engine.Engine object at 0x33b363b10>

  File "/Users/xiangxi2/forks/anomalib/src/anomalib/engine/engine.py", line 876, in export
    exported_model_path = model.to_openvino(
                          │     └ <function ExportMixin.to_openvino at 0x3326ca7a0>
                          └ Padim(
                              (pre_processor): PreProcessor(
                                (transform): Compose(
                                      Resize(size=[256, 256], interpolation=Interpolati...

  File "/Users/xiangxi2/forks/anomalib/src/anomalib/models/components/base/export_mixin.py", line 267, in to_openvino
    model = self._compress_ov_model(model, compression_type, datamodule, metric, task, max_drop)
            │    │                  │      │                 │           │       │     └ 0.01
            │    │                  │      │                 │           │       └ None
            │    │                  │      │                 │           └ None
            │    │                  │      │                 └ None
            │    │                  │      └ <CompressionType.INT8_PTQ: 'int8_ptq'>
            │    │                  └ <Model: 'main_graph'
            │    │                    inputs[
            │    │                    <ConstOutput: names[input] shape[?,3,?,?] type: f32>
            │    │                    ]
            │    │                    outputs[
            │    │                    <ConstOutput: names[pred_score] ...
            │    └ <function ExportMixin._compress_ov_model at 0x3326ca8e0>
            └ Padim(
                (pre_processor): PreProcessor(
                  (transform): Compose(
                        Resize(size=[256, 256], interpolation=Interpolati...

  File "/Users/xiangxi2/forks/anomalib/src/anomalib/models/components/base/export_mixin.py", line 315, in _compress_ov_model
    model = self._post_training_quantization_ov(model, datamodule)
            │    │                              │      └ None
            │    │                              └ <Model: 'main_graph'
            │    │                                inputs[
            │    │                                <ConstOutput: names[input] shape[?,3,?,?] type: f32>
            │    │                                ]
            │    │                                outputs[
            │    │                                <ConstOutput: names[pred_score] ...
            │    └ <staticmethod(<function ExportMixin._post_training_quantization_ov at 0x3326ca980>)>
            └ Padim(
                (pre_processor): PreProcessor(
                  (transform): Compose(
                        Resize(size=[256, 256], interpolation=Interpolati...

  File "/Users/xiangxi2/forks/anomalib/src/anomalib/models/components/base/export_mixin.py", line 346, in _post_training_quantization_ov
    raise ValueError(msg)
                     └ 'Datamodule must be provided for OpenVINO INT8_PTQ compression'

ValueError: Datamodule must be provided for OpenVINO INT8_PTQ compression
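For reference, a minimal self-contained sketch of the guard that raises this error, reconstructed from the traceback above. This is a simplified stand-in, not the actual anomalib code (`ExportMixin._post_training_quantization_ov` in `export_mixin.py`): post-training quantization needs calibration data, so the real method raises the same `ValueError` whenever no datamodule reaches the quantization step.

```python
from enum import Enum


class CompressionType(Enum):
    """Compression types visible in the traceback above."""
    INT8_PTQ = "int8_ptq"
    INT8_ACQ = "int8_acq"


def compress_ov_model(model, compression_type, datamodule=None):
    """Simplified stand-in for ExportMixin._compress_ov_model.

    INT8 post-training quantization requires calibration data, so a
    datamodule is mandatory; without one, the same ValueError as in
    the logs is raised.
    """
    if compression_type in (CompressionType.INT8_PTQ, CompressionType.INT8_ACQ):
        if datamodule is None:
            msg = (
                f"Datamodule must be provided for OpenVINO "
                f"{compression_type.name} compression"
            )
            raise ValueError(msg)
    # Actual quantization omitted in this sketch; the real code calls
    # into NNCF with samples drawn from the datamodule.
    return model
```

The fix on the geti-inspect side is presumably to pass a datamodule through to `engine.export()` whenever an INT8 compression type is selected, or to disable those export options when no calibration data is available.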

Code of Conduct

  • I agree to follow this project's Code of Conduct
