
Unable to run inference using Segmind Tiny SD model #603

Open
@Rehaman1429

Description


Hi there,

I am trying to run Segmind's distilled Stable Diffusion model (segmind/tiny-sd) from Hugging Face on my machine with the following specifications:

  • Processor: 12th Gen Intel(R) Core(TM) i5-1235U 1.30 GHz
  • Installed RAM: 16.0 GB (15.7 GB usable)
  • System Type: 64-bit operating system, x64-based processor

I successfully converted the model to a single checkpoint file (.safetensors, .ckpt) as shown below:

[Screenshot: the converted single-file checkpoint (.safetensors / .ckpt)]
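Before converting further, I can double-check that the single-file checkpoint contains the full set of weights by inspecting it with the `safetensors` Python package. This is only a minimal sketch; the path below is a placeholder for my local file:

```python
# Minimal sketch: list the tensors in the single-file checkpoint and count
# parameters. Assumes the `safetensors` and `torch` packages are installed;
# the path is a placeholder for the locally converted file.
from safetensors import safe_open

ckpt_path = "tiny-sd.safetensors"  # placeholder path

with safe_open(ckpt_path, framework="pt", device="cpu") as f:
    keys = list(f.keys())
    total_params = sum(f.get_tensor(k).numel() for k in keys)

print(f"{len(keys)} tensors, {total_params / 1e6:.1f}M parameters")
```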

Then, I used the convert function in stable-diffusion.cpp to convert the .safetensors file to the GGUF format. While the conversion completed without any errors, the resulting GGUF file is significantly smaller than the original model, as shown here:

[Screenshots: file size of the original .safetensors checkpoint vs. the converted GGUF file]
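To make the size gap concrete, the tensor list can be read back out of the GGUF file and compared against the original checkpoint. A minimal sketch, assuming the `gguf` Python package from the llama.cpp project; the path is again a placeholder:

```python
# Minimal sketch: enumerate the tensors stored in the converted GGUF file and
# sum their on-disk size, so the list can be compared with the .safetensors
# checkpoint. Assumes the `gguf` package (pip install gguf); placeholder path.
from gguf import GGUFReader

gguf_path = "tiny-sd.gguf"  # placeholder path

reader = GGUFReader(gguf_path)
total_bytes = 0
for t in reader.tensors:
    total_bytes += int(t.n_bytes)
    print(t.name, list(t.shape), t.tensor_type.name)

print(f"{len(reader.tensors)} tensors, ~{total_bytes / 1e6:.1f} MB of tensor data")
```

If the shrinkage comes only from lower-precision storage, the tensor names should still match the original checkpoint; if tensors are missing entirely, that would point at the converter mishandling this model.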

When I attempt to run inference with the converted GGUF file, I encounter the following error:

[Screenshot: error output when running inference with the converted GGUF file]

It appears that the issue may lie in the conversion to GGUF, possibly because the model is a distilled version (Tiny-SD): its UNet is smaller than the standard Stable Diffusion 1.5 UNet, so the converter may be dropping or mishandling tensors it does not expect. Has anyone worked with distilled models in this context and found a fix?
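As a sanity check that the problem is on the conversion side and not in the upstream weights, I can run the original diffusers model directly on CPU. A minimal sketch using the standard diffusers text-to-image API (fp32 since this machine has no dedicated GPU; the prompt is arbitrary):

```python
# Minimal sketch: run the original segmind/tiny-sd pipeline on CPU to confirm
# the upstream weights work before debugging the GGUF conversion.
# Assumes the `diffusers`, `transformers`, and `torch` packages.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "segmind/tiny-sd",
    torch_dtype=torch.float32,  # fp32 on a CPU-only machine
)
pipe = pipe.to("cpu")

image = pipe(
    "a portrait of a robot reading a book",  # arbitrary test prompt
    num_inference_steps=25,
).images[0]
image.save("tiny_sd_cpu_test.png")
```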

Any insights or suggestions would be greatly appreciated.

Thank you for your time and assistance!
