Replies: 7 comments 2 replies
-
It's much cleaner indeed ;)
-
I like it. Much cleaner and more flexible.
-
I agree, it's cleaner. Also, slight typo (form vs. from):
-
What about also allowing… And using `outlines.model` for all language models, so you can also do…
-
I like this in particular:

```python
import outlines
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "gpt2"
model = outlines.from_transformers(
    AutoModelForCausalLM.from_pretrained(model_name),
    AutoTokenizer.from_pretrained(model_name),
)
```

because it's flexible enough to work with models that follow the transformers API even if they don't necessarily come from that library.
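That flexibility is just duck typing: the factory receives already-constructed objects, so anything exposing the same methods works. A minimal sketch with invented stand-in classes (`MyCustomModel` and `MyCustomTokenizer` are illustrative assumptions, not part of outlines or transformers):

```python
# Hypothetical stand-ins: not from the transformers library, but they
# expose the same call surface, so a factory that accepts constructed
# objects could work with them unchanged.

class MyCustomModel:
    def generate(self, input_ids, **kwargs):
        # Echo the prompt back; a real model would produce new tokens.
        return input_ids

class MyCustomTokenizer:
    def __call__(self, text):
        return {"input_ids": [ord(c) for c in text]}

    def decode(self, ids, **kwargs):
        return "".join(chr(i) for i in ids)

tokenizer = MyCustomTokenizer()
model = MyCustomModel()
ids = tokenizer("hello")["input_ids"]
print(tokenizer.decode(model.generate(ids)))  # hello
```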
-
I don't have a lot of skin in the game here, but my 2 cents: instead of… replace `source` with…
-
Love it. Wouldn't change a thing. 🚀 Plus, I believe this will make it easier to integrate future models, such as litellm-style packages. You can even imagine models that have a bunch of configs to set up a workflow behind them, something like Perplexity AI.
-
Model integrations currently work via factory functions implemented in `outlines.models`. There are several issues with this interface:

- You have to import `outlines.models.Transformers` and pass the model and tokenizer.
- The interface for `transformers_vision` is obviously ugly.

I propose the following interface instead:
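The proposed snippet itself did not survive extraction here, but judging from the `from_transformers` call quoted in the replies, the shape of the interface can be sketched as follows. This is a rough illustration only; `TransformersModel` and the body of `from_transformers` are assumptions, not the actual outlines source:

```python
# Sketch of a factory-function interface (names are illustrative).

class TransformersModel:
    """Thin wrapper around a user-constructed model/tokenizer pair."""

    def __init__(self, model, tokenizer):
        self.model = model
        self.tokenizer = tokenizer

def from_transformers(model, tokenizer):
    # The caller constructs the model and tokenizer however they like
    # (AutoModelForCausalLM.from_pretrained, a custom class, ...);
    # the factory only wraps them, so it needs no library-specific
    # loading logic of its own.
    return TransformersModel(model, tokenizer)

wrapped = from_transformers("fake-model", "fake-tokenizer")
print(type(wrapped).__name__)  # TransformersModel
```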