Enable Llama 3.2 11B Vision Instruct in OpenVINO #1571

@abhijitsinha17

Description

Please add support for the "Llama 3.2 11B Vision Instruct" model so that it can be quantized with optimum-cli. Currently, attempting to export and quantize the model fails with:

"ValueError: Trying to export a mllama model, that is a custom or unsupported architecture, but no custom export configuration was passed as custom_export_configs"
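For reference, the error above is typical of an `optimum-cli export openvino` run against an architecture the exporter does not yet recognize. A command along these lines reproduces it (the exact model ID, weight format, and output directory are assumptions, not taken from the report):

```shell
# Attempt to export the model to OpenVINO IR with weight quantization.
# Fails with the ValueError above because the "mllama" architecture
# has no registered export configuration.
optimum-cli export openvino \
    --model meta-llama/Llama-3.2-11B-Vision-Instruct \
    --weight-format int4 \
    llama-3.2-11b-vision-ov
```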
