Description
Reproducible example
Fresh environment:
mkdir test
cd test
uv init
uv add "sentence-transformers[onnx]"
mkdir model_cache

Populate the cache:
from sentence_transformers import SentenceTransformer
t = SentenceTransformer(
"sentence-transformers/all-MiniLM-L6-v2",
cache_folder="./model_cache/",
backend="onnx",
device="cpu",
local_files_only=False,
)
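At this point the ONNX weights should be present in the custom cache. A quick way to verify (a sketch, assuming huggingface_hub's usual models--org--name/snapshots/... layout under ./model_cache/):

from pathlib import Path

# List any ONNX files downloaded into the custom cache folder; for this model
# there should be at least one entry such as .../onnx/model.onnx.
print(sorted(Path("./model_cache").rglob("*.onnx")))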
Now use the cache (local_files_only=True):

from sentence_transformers import SentenceTransformer
t = SentenceTransformer(
"sentence-transformers/all-MiniLM-L6-v2",
cache_folder="./model_cache/",
backend="onnx",
device="cpu",
local_files_only=True,
)

This results in the traceback pasted below.
Note that loading works without cache_folder, and also without the onnx backend (see the sketch below).
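For comparison, a sketch of the two variants that do load successfully; both assume the corresponding cache was populated with local_files_only=False first:

from sentence_transformers import SentenceTransformer

# Variant 1: onnx backend, but the default Hugging Face cache instead of cache_folder.
t1 = SentenceTransformer(
    "sentence-transformers/all-MiniLM-L6-v2",
    backend="onnx",
    device="cpu",
    local_files_only=True,
)

# Variant 2: cache_folder as above, but the default (torch) backend instead of onnx.
t2 = SentenceTransformer(
    "sentence-transformers/all-MiniLM-L6-v2",
    cache_folder="./model_cache/",
    device="cpu",
    local_files_only=True,
)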
Traceback (most recent call last):
File "/Users/lcorcodilos/Projects/temp/test.py", line 3, in <module>
t = SentenceTransformer(
^^^^^^^^^^^^^^^^^^^^
File "/Users/lcorcodilos/Projects/temp/.venv/lib/python3.12/site-packages/sentence_transformers/SentenceTransformer.py", line 327, in __init__
modules, self.module_kwargs = self._load_sbert_model(
^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/lcorcodilos/Projects/temp/.venv/lib/python3.12/site-packages/sentence_transformers/SentenceTransformer.py", line 2305, in _load_sbert_model
module = module_class.load(
^^^^^^^^^^^^^^^^^^
File "/Users/lcorcodilos/Projects/temp/.venv/lib/python3.12/site-packages/sentence_transformers/models/Transformer.py", line 365, in load
return cls(model_name_or_path=model_name_or_path, **init_kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/lcorcodilos/Projects/temp/.venv/lib/python3.12/site-packages/sentence_transformers/models/Transformer.py", line 88, in __init__
self._load_model(model_name_or_path, config, cache_dir, backend, is_peft_model, **model_args)
File "/Users/lcorcodilos/Projects/temp/.venv/lib/python3.12/site-packages/sentence_transformers/models/Transformer.py", line 200, in _load_model
self.auto_model = load_onnx_model(
^^^^^^^^^^^^^^^^
File "/Users/lcorcodilos/Projects/temp/.venv/lib/python3.12/site-packages/sentence_transformers/backend/load.py", line 73, in load_onnx_model
model = model_cls.from_pretrained(
^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/lcorcodilos/Projects/temp/.venv/lib/python3.12/site-packages/optimum/onnxruntime/modeling.py", line 575, in from_pretrained
return super().from_pretrained(
^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/lcorcodilos/Projects/temp/.venv/lib/python3.12/site-packages/optimum/modeling_base.py", line 356, in from_pretrained
library_name = TasksManager.infer_library_from_model(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/lcorcodilos/Projects/temp/.venv/lib/python3.12/site-packages/optimum/exporters/tasks.py", line 994, in infer_library_from_model
library_name = cls._infer_library_from_model_name_or_path(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/lcorcodilos/Projects/temp/.venv/lib/python3.12/site-packages/optimum/exporters/tasks.py", line 956, in _infer_library_from_model_name_or_path
raise ValueError(
ValueError: The library name could not be automatically inferred. If using the command-line, please provide the argument --library {transformers,diffusers,timm,sentence_transformers}. Example: `--library diffusers`.
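A possible workaround (untested, only a sketch): point huggingface_hub at the same directory via the HF_HUB_CACHE environment variable instead of passing cache_folder, so the default cache-resolution path is used while the files stay under ./model_cache:

import os

# HF_HUB_CACHE must be set before huggingface_hub is imported, since the cache
# location is read at import time.
os.environ["HF_HUB_CACHE"] = os.path.abspath("./model_cache")

from sentence_transformers import SentenceTransformer

t = SentenceTransformer(
    "sentence-transformers/all-MiniLM-L6-v2",
    backend="onnx",
    device="cpu",
    local_files_only=True,
)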