
Exposing config_kwargs in sentence_transformers component #8425

Open · bglearning opened this issue Sep 30, 2024 · 0 comments · May be fixed by #8432 or #8433
Labels: 2.x (Related to Haystack v2.0) · P2 (Medium priority, add to the next sprint if no P1 available)

Comments

@bglearning (Contributor)

The main SentenceTransformer class has a config_kwargs parameter:

config_kwargs (Dict[str, Any], optional) – Additional model configuration parameters to be passed to the Hugging Face Transformers config. See the AutoConfig.from_pretrained documentation for more details.
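For reference, this is how the parameter is used with sentence-transformers directly; a minimal sketch (assuming a recent sentence-transformers release that supports config_kwargs), using the model from the example further below:

```python
from sentence_transformers import SentenceTransformer

# config_kwargs entries are forwarded to AutoConfig.from_pretrained,
# so config-level flags can be overridden when the model is loaded.
model = SentenceTransformer(
    "dunzhang/stella_en_400M_v5",
    trust_remote_code=True,  # this model ships custom modeling code
    config_kwargs={"use_memory_efficient_attention": False},
)

embeddings = model.encode(["A sentence to embed."])
```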

This is not available through the SentenceTransformers Embedders in Haystack (though model_kwargs and tokenizer_kwargs are exposed).

It would be good to expose config_kwargs as well.

For instance, turning off use_memory_efficient_attention to run dunzhang/stella_en_400M_v5 does not seem possible when loading through SentenceTransformersDocumentEmbedder.
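A hypothetical sketch of what the requested passthrough could look like on the embedder; the config_kwargs argument shown here is the proposed addition and does not exist on the component yet:

```python
from haystack.components.embedders import SentenceTransformersDocumentEmbedder

embedder = SentenceTransformersDocumentEmbedder(
    model="dunzhang/stella_en_400M_v5",
    trust_remote_code=True,
    # Proposed (not yet existing) parameter, which would be forwarded
    # to the underlying SentenceTransformer constructor:
    config_kwargs={"use_memory_efficient_attention": False},
)
embedder.warm_up()  # loads the underlying SentenceTransformer model
```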

@bglearning bglearning added the 2.x Related to Haystack v2.0 label Sep 30, 2024
@julian-risch julian-risch added the P2 Medium priority, add to the next sprint if no P1 available label Oct 1, 2024