
Vector dimension 3072 is too large. LanternDB currently supports up to 2000dim vectors #374

@ganesh-rao

Description

When I attempt to index a table with a column containing vector embeddings from OpenAI's text-embedding-3-large model, I get the following error:

error: vector dimension 3072 is too large. LanternDB currently supports up to 2000dim vectors
I'd like to utilise the full 3072 dimensions if at all possible and would rather avoid reducing them to 1536. Is there a way I can make this work with Lantern?

I've seen examples using the aforementioned model, as well as Cohere's model, which produces vectors with 4096 dimensions. Is it possible to make these work without quantization?
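For context, a minimal reproduction sketch of the failing step. The table and column names here are made up, and the `lantern_hnsw` index method, `dist_l2sq_ops` operator class, and `dim` option are assumptions based on typical Lantern usage; adjust to your actual schema.

```sql
-- Hypothetical table holding 3072-dim embeddings
-- from OpenAI's text-embedding-3-large model.
CREATE TABLE documents (
    id bigserial PRIMARY KEY,
    embedding real[]  -- 3072 elements per row
);

-- This index build is the step that fails with:
--   vector dimension 3072 is too large.
--   LanternDB currently supports up to 2000dim vectors
CREATE INDEX ON documents
    USING lantern_hnsw (embedding dist_l2sq_ops)
    WITH (dim = 3072);
```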
