
How to use stable-video-diffusion-img2vid-xt-int8? No module named 'onediff_quant' #1186


Description

@soumik-kanad

I wanted to check out the img2vid model speedups claimed on https://www.reddit.com/r/StableDiffusion/comments/1adu2hn/accelerating_stable_video_diffusion_3x_faster/.

I am using the checkpoint from https://huggingface.co/siliconflow/stable-video-diffusion-img2vid-xt-int8/tree/main.

I am trying to use the benchmarks/image_to_video.py benchmark script, and I was able to run it with --model stabilityai/stable-video-diffusion-img2vid-xt.

But as soon as I switched to siliconflow/stable-video-diffusion-img2vid-xt-int8, I started getting a RuntimeError: InferDataType Failed. Expected kFloat16, but got kInt8 error. I realised this was because [calibrate_info.txt](https://huggingface.co/siliconflow/stable-video-diffusion-img2vid-xt-int8/blob/main/calibrate_info.txt) was not being downloaded automatically, so I downloaded it manually, placed it in the model directory, and passed its path to the script.
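
For reference, this is roughly what I did to pull the whole repository so that calibrate_info.txt sits next to the weights (just a sketch using huggingface_hub; the local directory path is an example, adjust it to your setup):

```python
# Sketch: download the full int8 repo so calibrate_info.txt is included
# alongside the model weights. The local_dir below is only an example path.
from huggingface_hub import snapshot_download

model_dir = snapshot_download(
    repo_id="siliconflow/stable-video-diffusion-img2vid-xt-int8",
    local_dir="./stable-video-diffusion-img2vid-xt-int8",  # example path
)
print("model files in:", model_dir)
```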

But now I'm getting the following error:

Traceback (most recent call last):
  File "/fs/cfar-projects/iarpa_wriva_as/workspaces/soumik/diffusion_refinement_2025/onediff/onediff/benchmarks/image_to_video.py", line 321, in <module>
    main()
  File "/fs/cfar-projects/iarpa_wriva_as/workspaces/soumik/diffusion_refinement_2025/onediff/onediff/benchmarks/image_to_video.py", line 178, in main
    pipe = load_pipe(
  File "/fs/cfar-projects/iarpa_wriva_as/workspaces/soumik/diffusion_refinement_2025/onediff/onediff/benchmarks/image_to_video.py", line 123, in load_pipe
    from onediff.quantization import QuantPipeline
  File "/fs/cfar-projects/iarpa_wriva_as/workspaces/soumik/diffusion_refinement_2025/onediff/onediff/src/onediff/quantization/__init__.py", line 1, in <module>
    from .quantize_pipeline import QuantPipeline
  File "/fs/cfar-projects/iarpa_wriva_as/workspaces/soumik/diffusion_refinement_2025/onediff/onediff/src/onediff/quantization/quantize_pipeline.py", line 5, in <module>
    from onediff_quant import quantize_pipeline, save_quantized
ModuleNotFoundError: No module named 'onediff_quant'
(onediff) bash-4.4$ python3 -c "import onediff_quant" && echo "enable quant model"
Traceback (most recent call last):
  File "<string>", line 1, in <module>
ModuleNotFoundError: No module named 'onediff_quant'
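
For what it's worth, I also probed for the module programmatically before the pipeline import, which confirms the same thing (just a sketch of the check, nothing onediff-specific):

```python
# Sketch: verify the optional onediff_quant package is importable before
# going down the quantized-pipeline code path.
import importlib.util

if importlib.util.find_spec("onediff_quant") is None:
    raise SystemExit(
        "onediff_quant is not installed; the int8 checkpoint seems to need it."
    )

# Only reached if onediff_quant is actually available.
from onediff.quantization import QuantPipeline
```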

I couldn't find a solution for this.
I think I saw somewhere that onediff_quant is part of the enterprise offering. Is there no way to test the int8 model with the community edition, or am I missing something / doing something wrong?
