To avoid having each user re-download the model weights every time, which wastes both time and storage, Julian has made this library:
from dlab_utils.paths import model_path, available_models
from transformers import AutoModelForCausalLM, AutoTokenizer
model_name = "meta-llama/Meta-Llama-3-8B-Instruct"
model = AutoModelForCausalLM.from_pretrained(model_path(model_name))
tokenizer = AutoTokenizer.from_pretrained(model_path(model_name))
print(available_models())
This is really good and simple. It would be even better with autodownloading.
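For the autodownloading idea, here is a minimal sketch of how it could be layered on top. This is not the actual `dlab_utils` API: the function name `resolve_model_path`, the `shared_root` layout, and the `download` hook are all assumptions for illustration. The `download` callable would typically be something like `huggingface_hub.snapshot_download`, invoked only when the shared copy is missing.

```python
import os


def resolve_model_path(name, shared_root="/shared/models", download=None):
    """Return the shared on-disk path for `name` if it exists.

    Hypothetical sketch (not the dlab_utils implementation):
    - `shared_root` is an assumed location for the shared weights;
    - `download` is an optional callable (e.g. huggingface_hub's
      snapshot_download) called only on a cache miss, returning the
      path it downloaded to;
    - with no `download` hook, the bare model name is returned so
      transformers falls back to its own Hub download.
    """
    candidate = os.path.join(shared_root, name)
    if os.path.isdir(candidate):
        return candidate  # shared copy exists, no download needed
    if download is not None:
        return download(name)  # populate the shared dir on first use
    return name  # let transformers resolve the name itself
```

With a hook like this, the first user to request a model pays the download cost and everyone else hits the shared copy.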
In addition, once I get sudo, I can periodically delete the cache directory so that people have to use this library if they don't want to re-download every time.