Description
Hi,
I thought this was completely self-hosted and that I could use it with Ollama, but it relies on either OpenAI or Jina, which are paid subscriptions. I couldn't even get it to install because I don't have an OpenAI key. Is there a branch I can use with just Ollama, without relying on any paid subscriptions?
Here is my config:
```yaml
llm_base_url: http://192.168.1.98:11434/v1
llm_api_key: ollama
best_llm_model: qwen2.5:7b
```
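For reference, the LLM side of this config can be sanity-checked directly with the same openai client the server uses. A minimal sketch, assuming the Ollama OpenAI-compatible endpoint at the base URL above and the qwen2.5:7b model from my config ("ollama" is just a dummy key):

```python
# Sketch: verify the Ollama /v1 endpoint accepts OpenAI-style chat calls.
# Assumes the base URL and model from the config above; "ollama" is a dummy key.
import asyncio
from openai import AsyncOpenAI

client = AsyncOpenAI(base_url="http://192.168.1.98:11434/v1", api_key="ollama")

async def main() -> None:
    resp = await client.chat.completions.create(
        model="qwen2.5:7b",
        messages=[{"role": "user", "content": "ping"}],
    )
    print(resp.choices[0].message.content)

asyncio.run(main())
```

So the chat endpoint itself appears reachable; the failure below is specifically on the embedding side.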
Here are the errors:
```
memobase-server-api | memobase_server | ERROR - 2025-10-09 16:48:47,406 - Error in get_embedding: Error code: 404 - {'error': {'message': 'model "text-embedding-3-small" not found, try pulling it first', 'type': 'api_error', 'param': None, 'code': None}}
memobase-server-api | Traceback (most recent call last):
memobase-server-api |   File "/app/memobase_server/llms/embeddings/__init__.py", line 48, in get_embedding
memobase-server-api |     results = await FACTORIES[CONFIG.embedding_provider](model, texts, phase)
memobase-server-api |               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
memobase-server-api |   File "/app/memobase_server/llms/embeddings/openai_embedding.py", line 11, in openai_embedding
memobase-server-api |     response = await openai_async_client.embeddings.create(
memobase-server-api |                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
memobase-server-api |   File "/app/.venv/lib/python3.12/site-packages/openai/resources/embeddings.py", line 251, in create
memobase-server-api |     return await self._post(
memobase-server-api |            ^^^^^^^^^^^^^^^^^
memobase-server-api |   File "/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 1791, in post
memobase-server-api |     return await self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
memobase-server-api |            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
memobase-server-api |   File "/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 1591, in request
memobase-server-api |     raise self._make_status_error_from_response(err.response) from None
memobase-server-api | openai.NotFoundError: Error code: 404 - {'error': {'message': 'model "text-embedding-3-small" not found, try pulling it first', 'type': 'api_error', 'param': None, 'code': None}}
memobase-server-api |
memobase-server-api | Traceback (most recent call last):
memobase-server-api |   File "/app/.venv/lib/python3.12/site-packages/starlette/routing.py", line 694, in lifespan
memobase-server-api |     async with self.lifespan_context(app) as maybe_state:
memobase-server-api |                ^^^^^^^^^^^^^^^^^^^^^^^^^^
memobase-server-api |   File "/usr/local/lib/python3.12/contextlib.py", line 210, in __aenter__
memobase-server-api |     return await anext(self.gen)
memobase-server-api |            ^^^^^^^^^^^^^^^^^^^^^
memobase-server-api |   File "/app/.venv/lib/python3.12/site-packages/fastapi/routing.py", line 134, in merged_lifespan
memobase-server-api |     async with original_context(app) as maybe_original_state:
memobase-server-api |                ^^^^^^^^^^^^^^^^^^^^^
memobase-server-api |   File "/usr/local/lib/python3.12/contextlib.py", line 210, in __aenter__
memobase-server-api |     return await anext(self.gen)
memobase-server-api |            ^^^^^^^^^^^^^^^^^^^^^
memobase-server-api |   File "/app/api.py", line 24, in lifespan
memobase-server-api |     await check_embedding_sanity()
memobase-server-api |   File "/app/memobase_server/llms/embeddings/__init__.py", line 27, in check_embedding_sanity
memobase-server-api |     raise ValueError(
memobase-server-api | ValueError: Embedding API check failed! Make sure the embedding API key is valid.
memobase-server-api |
memobase-server-api | Application startup failed. Exiting.
```
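From the traceback, the startup failure comes from the embedding sanity check asking Ollama for OpenAI's default text-embedding-3-small, which isn't pulled locally. A minimal sketch of reproducing that call outside the server with the same openai client; nomic-embed-text here is only a placeholder for whatever embedding model one would pull into Ollama, not something the project prescribes:

```python
# Sketch: reproduce the embedding call the server makes at startup.
# Assumes the same Ollama endpoint as in the config above; the model names are
# illustrative -- "text-embedding-3-small" is the default that 404s, and
# "nomic-embed-text" is a placeholder for a locally pulled Ollama embedding model.
import asyncio
from openai import AsyncOpenAI, NotFoundError

client = AsyncOpenAI(base_url="http://192.168.1.98:11434/v1", api_key="ollama")

async def main() -> None:
    try:
        await client.embeddings.create(model="text-embedding-3-small", input=["hello"])
    except NotFoundError as e:
        # Mirrors the 404 in the log: the model is not available in Ollama.
        print("default embedding model not found:", e)

    # A locally pulled embedding model should succeed through the same endpoint.
    resp = await client.embeddings.create(model="nomic-embed-text", input=["hello"])
    print("embedding dim:", len(resp.data[0].embedding))

asyncio.run(main())
```

So the question stands: is there a supported way to point the embedding step at Ollama (or disable it), so the server can run fully self-hosted without an OpenAI or Jina key?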