LOCAL Model for Surya #24

@RobinM30

Description

Hi,

First of all, thank you for developing and maintaining this project. It’s been incredibly useful, and I really appreciate your efforts!

I have a feature request: would it be possible to add support for loading the Surya model from a local path, so that inference can run fully offline? Currently, Surya downloads the model from Hugging Face at load time, which is a problem in environments without internet access.

Having an option to download and configure the model locally would make the tool more versatile and applicable in secure or offline environments.

Looking forward to your thoughts! Please let me know if there’s any way I can assist with this.

Thank you,

Robin
