
Exception when calling recall and embedding over HTTP, and I don't know how to resolve it #203

@jkwlstv

Description

The HTTP call is issued from ReMeClient.java in agentscope-java. The exception stack trace is as follows:

2026-04-08 10:33:27 | ERROR | base_embedding_model.py:126 | embedding model name=bge-m3 encounter error with e=('Error code: 400 - {\'error\': {\'message\': \'Model "bge-m3" does not support matryoshka representation, changing output dimensions will lead to poor results.\', \'type\': \'BadRequestError\', \'param\': \'\', \'code\': 400}}',)
Traceback (most recent call last):

  File "/opt/reme/.venv/bin/reme", line 6, in <module>
    sys.exit(main())
    │   │    └ <function main at 0x73b6151dc9a0>
    │   └ <built-in function exit>
    └ <module 'sys' (built-in)>

  File "/opt/reme/reme_ai/main.py", line 231, in main
    app.run_service()
    │   └ <function Application.run_service at 0x73b6151dd580>
    └ <reme_ai.main.ReMeApp object at 0x73b62a862b70>

  File "/opt/reme/.venv/lib/python3.12/site-packages/flowllm/core/application.py", line 384, in run_service
    service.run()
    │       └ <function HttpService.run at 0x73b6162abe20>
    └ <flowllm.core.service.http_service.HttpService object at 0x73b6281913d0>
  File "/opt/reme/.venv/lib/python3.12/site-packages/flowllm/core/service/http_service.py", line 123, in run
    uvicorn.run(
    │       └ <function run at 0x73b6284a5b20>
    └ <module 'uvicorn' from '/opt/reme/.venv/lib/python3.12/site-packages/uvicorn/__init__.py'>
  File "/opt/reme/.venv/lib/python3.12/site-packages/uvicorn/main.py", line 606, in run
    server.run()
    │      └ <function Server.run at 0x73b6284a5ee0>
    └ <uvicorn.server.Server object at 0x73b614d549b0>
  File "/opt/reme/.venv/lib/python3.12/site-packages/uvicorn/server.py", line 75, in run
    return asyncio_run(self.serve(sockets=sockets), loop_factory=self.config.get_loop_factory())
           │           │    │             │                      │    │      └ <function Config.get_loop_factory at 0x73b6284a58a0>
           │           │    │             │                      │    └ <uvicorn.config.Config object at 0x73b614d8fcb0>
           │           │    │             │                      └ <uvicorn.server.Server object at 0x73b614d549b0>
           │           │    │             └ None
           │           │    └ <function Server.serve at 0x73b6284a5f80>
           │           └ <uvicorn.server.Server object at 0x73b614d549b0>
           └ <function run at 0x73b629f21ee0>
  File "/usr/lib/python3.12/asyncio/runners.py", line 194, in run
    return runner.run(main)
           │      │   └ <coroutine object Server.serve at 0x73b614bf82e0>
           │      └ <function Runner.run at 0x73b629f6f880>
           └ <asyncio.runners.Runner object at 0x73b614d8cd70>
  File "/usr/lib/python3.12/asyncio/runners.py", line 118, in run
    return self._loop.run_until_complete(task)
           │    │     │                  └ <Task pending name='Task-4' coro=<Server.serve() running at /opt/reme/.venv/lib/python3.12/site-packages/uvicorn/server.py:79...
           │    │     └ <cyfunction Loop.run_until_complete at 0x73b614d81300>
           │    └ <uvloop.Loop running=True closed=False debug=False>
           └ <asyncio.runners.Runner object at 0x73b614d8cd70>
> File "/opt/reme/.venv/lib/python3.12/site-packages/flowllm/core/embedding_model/base_embedding_model.py", line 123, in async_get_embeddings
    return await self._async_get_embeddings(input_text)
                 │    │                     └ ['Xiao Bai, 29, apples, Hami melons']
                 │    └ <function OpenAICompatibleEmbeddingModel._async_get_embeddings at 0x73b626d7c400>
                 └ <flowllm.core.embedding_model.openai_compatible_embedding_model.OpenAICompatibleEmbeddingModel object at 0x73b6153c5d90>
  File "/opt/reme/.venv/lib/python3.12/site-packages/flowllm/core/embedding_model/openai_compatible_embedding_model.py", line 124, in _async_get_embeddings
    completion = await self._async_client.embeddings.create(
                       │    │             │          └ <function AsyncEmbeddings.create at 0x73b611c89440>
                       │    │             └ <openai.resources.embeddings.AsyncEmbeddings object at 0x73b614c7a900>
                       │    └ <openai.AsyncOpenAI object at 0x73b61517d070>
                       └ <flowllm.core.embedding_model.openai_compatible_embedding_model.OpenAICompatibleEmbeddingModel object at 0x73b6153c5d90>
  File "/opt/reme/.venv/lib/python3.12/site-packages/openai/resources/embeddings.py", line 259, in create
    return await self._post(
                 │    └ <bound method AsyncAPIClient.post of <openai.AsyncOpenAI object at 0x73b61517d070>>
                 └ <openai.resources.embeddings.AsyncEmbeddings object at 0x73b614c7a900>
  File "/opt/reme/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 1884, in post
    return await self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
                 │    │       │        │            │                  └ None
                 │    │       │        │            └ False
                 │    │       │        └ FinalRequestOptions(method='post', url='/embeddings', params={}, headers=NOT_GIVEN, max_retries=NOT_GIVEN, timeout=NOT_GIVEN,...
                 │    │       └ <class 'openai.types.create_embedding_response.CreateEmbeddingResponse'>
                 │    └ <function AsyncAPIClient.request at 0x73b626e13e20>
                 └ <openai.AsyncOpenAI object at 0x73b61517d070>
  File "/opt/reme/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 1669, in request
    raise self._make_status_error_from_response(err.response) from None
          │    └ <function BaseClient._make_status_error_from_response at 0x73b626e10fe0>
          └ <openai.AsyncOpenAI object at 0x73b61517d070>

openai.BadRequestError: Error code: 400 - {'error': {'message': 'Model "bge-m3" does not support matryoshka representation, changing output dimensions will lead to poor results.', 'type': 'BadRequestError', 'param': '', 'code': 400}}
2026-04-08 10:33:27 | ERROR | base_async_op.py:190 | op=update_vector_store_op async execute failed, error=('Error code: 400 - {\'error\': {\'message\': \'Model "bge-m3" does not support matryoshka representation, changing output dimensions will lead to poor results.\', \'type\': \'BadRequestError\', \'param\': \'\', \'code\': 400}}',)
Traceback (most recent call last):

  File "/opt/reme/.venv/bin/reme", line 6, in <module>
    sys.exit(main())
    │   │    └ <function main at 0x73b6151dc9a0>
    │   └ <built-in function exit>
    └ <module 'sys' (built-in)>

  File "/opt/reme/reme_ai/main.py", line 231, in main
    app.run_service()
    │   └ <function Application.run_service at 0x73b6151dd580>
    └ <reme_ai.main.ReMeApp object at 0x73b62a862b70>

  File "/opt/reme/.venv/lib/python3.12/site-packages/flowllm/core/application.py", line 384, in run_service
    service.run()
    │       └ <function HttpService.run at 0x73b6162abe20>
    └ <flowllm.core.service.http_service.HttpService object at 0x73b6281913d0>
  File "/opt/reme/.venv/lib/python3.12/site-packages/flowllm/core/service/http_service.py", line 123, in run
    uvicorn.run(
    │       └ <function run at 0x73b6284a5b20>
    └ <module 'uvicorn' from '/opt/reme/.venv/lib/python3.12/site-packages/uvicorn/__init__.py'>
  File "/opt/reme/.venv/lib/python3.12/site-packages/uvicorn/main.py", line 606, in run
    server.run()
    │      └ <function Server.run at 0x73b6284a5ee0>
    └ <uvicorn.server.Server object at 0x73b614d549b0>
  File "/opt/reme/.venv/lib/python3.12/site-packages/uvicorn/server.py", line 75, in run
    return asyncio_run(self.serve(sockets=sockets), loop_factory=self.config.get_loop_factory())
           │           │    │             │                      │    │      └ <function Config.get_loop_factory at 0x73b6284a58a0>
           │           │    │             │                      │    └ <uvicorn.config.Config object at 0x73b614d8fcb0>
           │           │    │             │                      └ <uvicorn.server.Server object at 0x73b614d549b0>
           │           │    │             └ None
           │           │    └ <function Server.serve at 0x73b6284a5f80>
           │           └ <uvicorn.server.Server object at 0x73b614d549b0>
           └ <function run at 0x73b629f21ee0>
  File "/usr/lib/python3.12/asyncio/runners.py", line 194, in run
    return runner.run(main)
           │      │   └ <coroutine object Server.serve at 0x73b614bf82e0>
           │      └ <function Runner.run at 0x73b629f6f880>
           └ <asyncio.runners.Runner object at 0x73b614d8cd70>
  File "/usr/lib/python3.12/asyncio/runners.py", line 118, in run
    return self._loop.run_until_complete(task)
           │    │     │                  └ <Task pending name='Task-4' coro=<Server.serve() running at /opt/reme/.venv/lib/python3.12/site-packages/uvicorn/server.py:79...
           │    │     └ <cyfunction Loop.run_until_complete at 0x73b614d81300>
           │    └ <uvloop.Loop running=True closed=False debug=False>
           └ <asyncio.runners.Runner object at 0x73b614d8cd70>
  File "/opt/reme/.venv/lib/python3.12/site-packages/uvicorn/protocols/http/httptools_impl.py", line 416, in run_asgi
    result = await app(  # type: ignore[func-returns-value]
                   └ <uvicorn.middleware.proxy_headers.ProxyHeadersMiddleware object at 0x73b614d8e8a0>
  File "/opt/reme/.venv/lib/python3.12/site-packages/uvicorn/middleware/proxy_headers.py", line 60, in __call__
    return await self.app(scope, receive, send)
                 │    │   │      │        └ <bound method RequestResponseCycle.send of <uvicorn.protocols.http.httptools_impl.RequestResponseCycle object at 0x73b614ce5e...
                 │    │   │      └ <bound method RequestResponseCycle.receive of <uvicorn.protocols.http.httptools_impl.RequestResponseCycle object at 0x73b614c...
                 │    │   └ {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.3'}, 'http_version': '1.1', 'server': ('172.18.0.2', 28202), '...
                 │    └ <fastapi.applications.FastAPI object at 0x73b61516aed0>
                 └ <uvicorn.middleware.proxy_headers.ProxyHeadersMiddleware object at 0x73b614d8e8a0>
  File "/opt/reme/.venv/lib/python3.12/site-packages/fastapi/applications.py", line 1163, in __call__
    await super().__call__(scope, receive, send)
                           │      │        └ <bound method RequestResponseCycle.send of <uvicorn.protocols.http.httptools_impl.RequestResponseCycle object at 0x73b614ce5e...
                           │      └ <bound method RequestResponseCycle.receive of <uvicorn.protocols.http.httptools_impl.RequestResponseCycle object at 0x73b614c...
                           └ {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.3'}, 'http_version': '1.1', 'server': ('172.18.0.2', 28202), '...
  File "/opt/reme/.venv/lib/python3.12/site-packages/starlette/applications.py", line 90, in __call__
    await self.middleware_stack(scope, receive, send)
          │    │                │      │        └ <bound method RequestResponseCycle.send of <uvicorn.protocols.http.httptools_impl.RequestResponseCycle object at 0x73b614ce5e...
          │    │                │      └ <bound method RequestResponseCycle.receive of <uvicorn.protocols.http.httptools_impl.RequestResponseCycle object at 0x73b614c...
          │    │                └ {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.3'}, 'http_version': '1.1', 'server': ('172.18.0.2', 28202), '...
          │    └ <starlette.middleware.errors.ServerErrorMiddleware object at 0x73b614ce56a0>
          └ <fastapi.applications.FastAPI object at 0x73b61516aed0>
  File "/opt/reme/.venv/lib/python3.12/site-packages/starlette/middleware/errors.py", line 164, in __call__
    await self.app(scope, receive, _send)
          │    │   │      │        └ <function ServerErrorMiddleware.__call__.<locals>._send at 0x73b6148ef740>
          │    │   │      └ <bound method RequestResponseCycle.receive of <uvicorn.protocols.http.httptools_impl.RequestResponseCycle object at 0x73b614c...
          │    │   └ {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.3'}, 'http_version': '1.1', 'server': ('172.18.0.2', 28202), '...
          │    └ <starlette.middleware.cors.CORSMiddleware object at 0x73b614ce5640>
          └ <starlette.middleware.errors.ServerErrorMiddleware object at 0x73b614ce56a0>
  File "/opt/reme/.venv/lib/python3.12/site-packages/starlette/middleware/cors.py", line 88, in __call__
    await self.app(scope, receive, send)
          │    │   │      │        └ <function ServerErrorMiddleware.__call__.<locals>._send at 0x73b6148ef740>
          │    │   │      └ <bound method RequestResponseCycle.receive of <uvicorn.protocols.http.httptools_impl.RequestResponseCycle object at 0x73b614c...
          │    │   └ {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.3'}, 'http_version': '1.1', 'server': ('172.18.0.2', 28202), '...
          │    └ <starlette.middleware.exceptions.ExceptionMiddleware object at 0x73b614e1e270>
          └ <starlette.middleware.cors.CORSMiddleware object at 0x73b614ce5640>
  File "/opt/reme/.venv/lib/python3.12/site-packages/starlette/middleware/exceptions.py", line 63, in __call__
    await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
          │                            │    │    │     │      │        └ <function ServerErrorMiddleware.__call__.<locals>._send at 0x73b6148ef740>
          │                            │    │    │     │      └ <bound method RequestResponseCycle.receive of <uvicorn.protocols.http.httptools_impl.RequestResponseCycle object at 0x73b614c...
          │                            │    │    │     └ {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.3'}, 'http_version': '1.1', 'server': ('172.18.0.2', 28202), '...
          │                            │    │    └ <starlette.requests.Request object at 0x73b614ce6120>
          │                            │    └ <fastapi.middleware.asyncexitstack.AsyncExitStackMiddleware object at 0x73b614e1d0a0>
          │                            └ <starlette.middleware.exceptions.ExceptionMiddleware object at 0x73b614e1e270>
          └ <function wrap_app_handling_exceptions at 0x73b628876c00>
  File "/opt/reme/.venv/lib/python3.12/site-packages/starlette/_exception_handler.py", line 42, in wrapped_app
    await app(scope, receive, sender)
          │   │      │        └ <function wrap_app_handling_exceptions.<locals>.wrapped_app.<locals>.sender at 0x73b6148ef880>
          │   │      └ <bound method RequestResponseCycle.receive of <uvicorn.protocols.http.httptools_impl.RequestResponseCycle object at 0x73b614c...
          │   └ {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.3'}, 'http_version': '1.1', 'server': ('172.18.0.2', 28202), '...
          └ <fastapi.middleware.asyncexitstack.AsyncExitStackMiddleware object at 0x73b614e1d0a0>
  File "/opt/reme/.venv/lib/python3.12/site-packages/fastapi/middleware/asyncexitstack.py", line 18, in __call__
    await self.app(scope, receive, send)
          │    │   │      │        └ <function wrap_app_handling_exceptions.<locals>.wrapped_app.<locals>.sender at 0x73b6148ef880>
          │    │   │      └ <bound method RequestResponseCycle.receive of <uvicorn.protocols.http.httptools_impl.RequestResponseCycle object at 0x73b614c...
          │    │   └ {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.3'}, 'http_version': '1.1', 'server': ('172.18.0.2', 28202), '...
          │    └ <fastapi.routing.APIRouter object at 0x73b61517e090>
          └ <fastapi.middleware.asyncexitstack.AsyncExitStackMiddleware object at 0x73b614e1d0a0>
  File "/opt/reme/.venv/lib/python3.12/site-packages/starlette/routing.py", line 660, in __call__
    await self.middleware_stack(scope, receive, send)
          │    │                │      │        └ <function wrap_app_handling_exceptions.<locals>.wrapped_app.<locals>.sender at 0x73b6148ef880>
          │    │                │      └ <bound method RequestResponseCycle.receive of <uvicorn.protocols.http.httptools_impl.RequestResponseCycle object at 0x73b614c...
          │    │                └ {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.3'}, 'http_version': '1.1', 'server': ('172.18.0.2', 28202), '...
          │    └ <bound method Router.app of <fastapi.routing.APIRouter object at 0x73b61517e090>>
          └ <fastapi.routing.APIRouter object at 0x73b61517e090>
  File "/opt/reme/.venv/lib/python3.12/site-packages/starlette/routing.py", line 680, in app
    await route.handle(scope, receive, send)
          │     │      │      │        └ <function wrap_app_handling_exceptions.<locals>.wrapped_app.<locals>.sender at 0x73b6148ef880>
          │     │      │      └ <bound method RequestResponseCycle.receive of <uvicorn.protocols.http.httptools_impl.RequestResponseCycle object at 0x73b614c...
          │     │      └ {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.3'}, 'http_version': '1.1', 'server': ('172.18.0.2', 28202), '...
          │     └ <function Route.handle at 0x73b6288991c0>
          └ APIRoute(path='/summary_personal_memory', name='execute_endpoint', methods=['POST'])
  File "/opt/reme/.venv/lib/python3.12/site-packages/starlette/routing.py", line 276, in handle
    await self.app(scope, receive, send)
          │    │   │      │        └ <function wrap_app_handling_exceptions.<locals>.wrapped_app.<locals>.sender at 0x73b6148ef880>
          │    │   │      └ <bound method RequestResponseCycle.receive of <uvicorn.protocols.http.httptools_impl.RequestResponseCycle object at 0x73b614c...
          │    │   └ {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.3'}, 'http_version': '1.1', 'server': ('172.18.0.2', 28202), '...
          │    └ <function request_response.<locals>.app at 0x73b614d40ae0>
          └ APIRoute(path='/summary_personal_memory', name='execute_endpoint', methods=['POST'])
  File "/opt/reme/.venv/lib/python3.12/site-packages/fastapi/routing.py", line 134, in app
    await wrap_app_handling_exceptions(app, request)(scope, receive, send)
          │                            │    │        │      │        └ <function wrap_app_handling_exceptions.<locals>.wrapped_app.<locals>.sender at 0x73b6148ef880>
          │                            │    │        │      └ <bound method RequestResponseCycle.receive of <uvicorn.protocols.http.httptools_impl.RequestResponseCycle object at 0x73b614c...
          │                            │    │        └ {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.3'}, 'http_version': '1.1', 'server': ('172.18.0.2', 28202), '...
          │                            │    └ <starlette.requests.Request object at 0x73b614ce5c70>
          │                            └ <function request_response.<locals>.app.<locals>.app at 0x73b6148ef920>
          └ <function wrap_app_handling_exceptions at 0x73b628876c00>
  File "/opt/reme/.venv/lib/python3.12/site-packages/starlette/_exception_handler.py", line 42, in wrapped_app
    await app(scope, receive, sender)
          │   │      │        └ <function wrap_app_handling_exceptions.<locals>.wrapped_app.<locals>.sender at 0x73b6148efa60>
          │   │      └ <bound method RequestResponseCycle.receive of <uvicorn.protocols.http.httptools_impl.RequestResponseCycle object at 0x73b614c...
          │   └ {'type': 'http', 'asgi': {'version': '3.0', 'spec_version': '2.3'}, 'http_version': '1.1', 'server': ('172.18.0.2', 28202), '...
          └ <function request_response.<locals>.app.<locals>.app at 0x73b6148ef920>
  File "/opt/reme/.venv/lib/python3.12/site-packages/fastapi/routing.py", line 120, in app
    response = await f(request)
                     │ └ <starlette.requests.Request object at 0x73b614ce5c70>
                     └ <function get_request_handler.<locals>.app at 0x73b614d17d80>
  File "/opt/reme/.venv/lib/python3.12/site-packages/fastapi/routing.py", line 674, in app
    raw_response = await run_endpoint_function(
                         └ <function run_endpoint_function at 0x73b6162a8360>
  File "/opt/reme/.venv/lib/python3.12/site-packages/fastapi/routing.py", line 328, in run_endpoint_function
    return await dependant.call(**values)
                 │         │      └ {'request': SummaryPersonalMemoryModel(query='', messages=[], workspace_id='example_user_id', metadata={}, trajectories=[{'me...
                 │         └ <function HttpService.integrate_tool_flow.<locals>.execute_endpoint at 0x73b614e02e80>
                 └ Dependant(path_params=[], query_params=[], header_params=[], cookie_params=[], body_params=[ModelField(field_info=Body(Pydant...
  File "/opt/reme/.venv/lib/python3.12/site-packages/flowllm/core/service/http_service.py", line 61, in execute_endpoint
    return await flow.async_call(**request.model_dump(exclude_none=True))
                 │    │            │       └ <function BaseModel.model_dump at 0x73b629562480>
                 │    │            └ SummaryPersonalMemoryModel(query='', messages=[], workspace_id='example_user_id', metadata={}, trajectories=[{'messages': [{'...
                 │    └ <function BaseFlow.async_call at 0x73b6164e3e20>
                 └ <flowllm.core.flow.expression_tool_flow.ExpressionToolFlow object at 0x73b614e041d0>
  File "/opt/reme/.venv/lib/python3.12/site-packages/flowllm/core/flow/base_flow.py", line 255, in async_call
    result = await self._async_call(context=context)
                   │    │                   └ FlowContext({'query': '', 'messages': [PersonalMemory(workspace_id='example_user_id', memory_id='56dc81c4a3a945fb9aaf8cc671bf...
                   │    └ <function BaseFlow._async_call at 0x73b6164e3d80>
                   └ <flowllm.core.flow.expression_tool_flow.ExpressionToolFlow object at 0x73b614e041d0>
  File "/opt/reme/.venv/lib/python3.12/site-packages/flowllm/core/flow/base_flow.py", line 225, in _async_call
    await flow_op.async_call(context=context)
          │       │                  └ FlowContext({'query': '', 'messages': [PersonalMemory(workspace_id='example_user_id', memory_id='56dc81c4a3a945fb9aaf8cc671bf...
          │       └ <function BaseAsyncOp.async_call at 0x73b6164e22a0>
          └ <flowllm.core.op.sequential_op.SequentialOp object at 0x73b611bb3200>
  File "/opt/reme/.venv/lib/python3.12/site-packages/flowllm/core/op/base_async_op.py", line 185, in async_call
    result = await self.async_execute()
                   │    └ <function SequentialOp.async_execute at 0x73b6164e37e0>
                   └ <flowllm.core.op.sequential_op.SequentialOp object at 0x73b611bb3200>
  File "/opt/reme/.venv/lib/python3.12/site-packages/flowllm/core/op/sequential_op.py", line 51, in async_execute
    result = await op.async_call(context=self.context)
                   │  │                  │    └ FlowContext({'query': '', 'messages': [PersonalMemory(workspace_id='example_user_id', memory_id='56dc81c4a3a945fb9aaf8cc671bf...
                   │  │                  └ <flowllm.core.op.sequential_op.SequentialOp object at 0x73b611bb3200>
                   │  └ <function BaseAsyncOp.async_call at 0x73b6164e22a0>
                   └ <reme_ai.vector_store.update_vector_store_op.UpdateVectorStoreOp object at 0x73b614ce7170>
> File "/opt/reme/.venv/lib/python3.12/site-packages/flowllm/core/op/base_async_op.py", line 185, in async_call
    result = await self.async_execute()
                   │    └ <function UpdateVectorStoreOp.async_execute at 0x73b61504c9a0>
                   └ <reme_ai.vector_store.update_vector_store_op.UpdateVectorStoreOp object at 0x73b614ce7170>

  File "/opt/reme/reme_ai/vector_store/update_vector_store_op.py", line 54, in async_execute
    await self.vector_store.async_insert(nodes=insert_nodes, workspace_id=workspace_id)
          │    │                               │                          └ 'example_user_id'
          │    │                               └ [VectorNode(unique_id='49dc2f8253d34deaa2e57c185cc62ac2', workspace_id='example_user_id', content='Xiao Bai, 29, apples, Hami...
          │    └ <property object at 0x73b6164dc900>
          └ <reme_ai.vector_store.update_vector_store_op.UpdateVectorStoreOp object at 0x73b614ce7170>

  File "/opt/reme/.venv/lib/python3.12/site-packages/flowllm/core/vector_store/local_vector_store.py", line 546, in async_insert
    await super().async_insert(nodes, workspace_id, **kwargs)
                               │      │               └ {}
                               │      └ 'example_user_id'
                               └ [VectorNode(unique_id='49dc2f8253d34deaa2e57c185cc62ac2', workspace_id='example_user_id', content='Xiao Bai, 29, apples, Hami...
  File "/opt/reme/.venv/lib/python3.12/site-packages/flowllm/core/vector_store/memory_vector_store.py", line 755, in async_insert
    nodes = await self.async_get_node_embeddings(nodes)
                  │    │                         └ [VectorNode(unique_id='49dc2f8253d34deaa2e57c185cc62ac2', workspace_id='example_user_id', content='Xiao Bai, 29, apples, Hami...
                  │    └ <function BaseVectorStore.async_get_node_embeddings at 0x73b616494360>
                  └ <flowllm.core.vector_store.local_vector_store.LocalVectorStore object at 0x73b616cca2a0>
  File "/opt/reme/.venv/lib/python3.12/site-packages/flowllm/core/vector_store/base_vector_store.py", line 91, in async_get_node_embeddings
    return await self.embedding_model.async_get_node_embeddings(nodes)
                 │    │               │                         └ [VectorNode(unique_id='49dc2f8253d34deaa2e57c185cc62ac2', workspace_id='example_user_id', content='Xiao Bai, 29, apples, Hami...
                 │    │               └ <function BaseEmbeddingModel.async_get_node_embeddings at 0x73b627423f60>
                 │    └ <flowllm.core.embedding_model.openai_compatible_embedding_model.OpenAICompatibleEmbeddingModel object at 0x73b6153c5d90>
                 └ <flowllm.core.vector_store.local_vector_store.LocalVectorStore object at 0x73b616cca2a0>
  File "/opt/reme/.venv/lib/python3.12/site-packages/flowllm/core/embedding_model/base_embedding_model.py", line 207, in async_get_node_embeddings
    batch_results = await asyncio.gather(*batch_tasks)
                          │       │       └ [<coroutine object BaseEmbeddingModel.async_get_embeddings at 0x73b614c8ab40>]
                          │       └ <function gather at 0x73b629f5dee0>
                          └ <module 'asyncio' from '/usr/lib/python3.12/asyncio/__init__.py'>
  File "/opt/reme/.venv/lib/python3.12/site-packages/flowllm/core/embedding_model/base_embedding_model.py", line 129, in async_get_embeddings
    raise e
  File "/opt/reme/.venv/lib/python3.12/site-packages/flowllm/core/embedding_model/base_embedding_model.py", line 123, in async_get_embeddings
    return await self._async_get_embeddings(input_text)
                 │    │                     └ ['Xiao Bai, 29, apples, Hami melons']
                 │    └ <function OpenAICompatibleEmbeddingModel._async_get_embeddings at 0x73b626d7c400>
                 └ <flowllm.core.embedding_model.openai_compatible_embedding_model.OpenAICompatibleEmbeddingModel object at 0x73b6153c5d90>
  File "/opt/reme/.venv/lib/python3.12/site-packages/flowllm/core/embedding_model/openai_compatible_embedding_model.py", line 124, in _async_get_embeddings
    completion = await self._async_client.embeddings.create(
                       │    │             │          └ <function AsyncEmbeddings.create at 0x73b611c89440>
                       │    │             └ <openai.resources.embeddings.AsyncEmbeddings object at 0x73b614c7a900>
                       │    └ <openai.AsyncOpenAI object at 0x73b61517d070>
                       └ <flowllm.core.embedding_model.openai_compatible_embedding_model.OpenAICompatibleEmbeddingModel object at 0x73b6153c5d90>
  File "/opt/reme/.venv/lib/python3.12/site-packages/openai/resources/embeddings.py", line 259, in create
    return await self._post(
                 │    └ <bound method AsyncAPIClient.post of <openai.AsyncOpenAI object at 0x73b61517d070>>
                 └ <openai.resources.embeddings.AsyncEmbeddings object at 0x73b614c7a900>
  File "/opt/reme/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 1884, in post
    return await self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
                 │    │       │        │            │                  └ None
                 │    │       │        │            └ False
                 │    │       │        └ FinalRequestOptions(method='post', url='/embeddings', params={}, headers=NOT_GIVEN, max_retries=NOT_GIVEN, timeout=NOT_GIVEN,...
                 │    │       └ <class 'openai.types.create_embedding_response.CreateEmbeddingResponse'>
                 │    └ <function AsyncAPIClient.request at 0x73b626e13e20>
                 └ <openai.AsyncOpenAI object at 0x73b61517d070>
  File "/opt/reme/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 1669, in request
    raise self._make_status_error_from_response(err.response) from None
          │    └ <function BaseClient._make_status_error_from_response at 0x73b626e10fe0>
          └ <openai.AsyncOpenAI object at 0x73b61517d070>

openai.BadRequestError: Error code: 400 - {'error': {'message': 'Model "bge-m3" does not support matryoshka representation, changing output dimensions will lead to poor results.', 'type': 'BadRequestError', 'param': '', 'code': 400}}

The Dockerfile contents are as follows:

FROM ubuntu:24.04

RUN sed -i 's|http://archive.ubuntu.com/ubuntu|http://mirrors.ustc.edu.cn/ubuntu|g' /etc/apt/sources.list.d/ubuntu.sources && \
    sed -i 's|http://security.ubuntu.com/ubuntu|http://mirrors.ustc.edu.cn/ubuntu|g' /etc/apt/sources.list.d/ubuntu.sources

ENV DEBIAN_FRONTEND=noninteractive \
    TZ=Asia/Shanghai

RUN apt-get update && apt-get install -y \
    locales \
    tzdata \
    ca-certificates \
    python3 \
    python3-venv \
    python3-pip \
    && rm -rf /var/lib/apt/lists/*

RUN locale-gen zh_CN.UTF-8 en_US.UTF-8

ENV LANG=zh_CN.UTF-8 \
    LANGUAGE=zh_CN:zh \
    LC_ALL=zh_CN.UTF-8

RUN ln -sf /usr/share/zoneinfo/Asia/Shanghai /etc/localtime && \
    dpkg-reconfigure -f noninteractive tzdata

WORKDIR /opt

COPY ReMe-0.3.1.8 ./reme

RUN cd reme \
    && python3 -m venv .venv \
    && .venv/bin/pip install --no-cache-dir --upgrade pip \
    && .venv/bin/pip install --no-cache-dir -e ".[light]"

ENV PATH="/opt/reme/.venv/bin:$PATH"

WORKDIR /opt/reme

The docker-compose.yaml contents are as follows:

services:
  reme:
    image: reme:0.3.1.8
    container_name: reme
    restart: on-failure:3
    volumes:
      - ./local_vector_store:/opt/reme/local_vector_store
      - ./.env:/opt/reme/.env
    ports:
      - "28202:28202"
    command: [
      "reme",
      "http.port=28202",
      "llm.default.model_name=qwen2.5",
      "embedding_model.default.model_name=bge-m3",
      "vector_store.default.backend=local"
    ]
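The `command` entries above use a dotted `key=value` override syntax. As a rough mental model (the exact merge semantics inside flowllm are an assumption, and `apply_override` is a hypothetical illustration, not flowllm code), each dotted key expands into a path in a nested config tree:

```python
# Expand dotted "key=value" CLI arguments into a nested config dict,
# mirroring the override style used in the compose `command` above.

def apply_override(config: dict, arg: str) -> dict:
    key, _, value = arg.partition("=")
    node = config
    parts = key.split(".")
    for part in parts[:-1]:
        node = node.setdefault(part, {})  # create intermediate levels
    node[parts[-1]] = value               # set the leaf value
    return config

config: dict = {}
for arg in [
    "http.port=28202",
    "llm.default.model_name=qwen2.5",
    "embedding_model.default.model_name=bge-m3",
    "vector_store.default.backend=local",
]:
    apply_override(config, arg)

print(config["embedding_model"]["default"]["model_name"])  # bge-m3
```

This is only to show which config node each argument targets; whether an extra override (e.g. for embedding dimensions) exists under these keys would need to be confirmed against the flowllm configuration reference.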

The .env contents are as follows (private values redacted; the API address, API key, and model names have all been confirmed to be working):

LLM_API_KEY=sk-***
LLM_BASE_URL=http://IP:PORT/v1
EMBEDDING_API_KEY=sk-***
EMBEDDING_BASE_URL=http://IP:PORT/v1
FLOW_EMBEDDING_API_KEY=sk-***
FLOW_EMBEDDING_BASE_URL=http://IP:PORT/v1
FLOW_LLM_API_KEY=sk-***
FLOW_LLM_BASE_URL=http://IP:PORT/v1
