
[BUG] Brief description of the problem / Calling the LLM and embedding models via API in a LangChain deployment fails; knowledge base initialization errors out #5431


Problem Description
In a LangChain deployment, calling the LLM and embedding models through an API service fails, and knowledge base initialization does not complete.

Steps to Reproduce

  1. I modified model_settings.yaml to use Alibaba DashScope's API service, with qwen-plus as the LLM and text-embedding-v3 as the embedding model.
  2. I ran chatchat kb -r to initialize the vector store.
  3. The following errors occurred (see the sketch after this list for a direct API call that isolates the embedding request):
    3.1 openai.BadRequestError: Error code: 400 - {'error': {'message': '<400> InternalError.Algo.InvalidParameter: Value error, contents is neither str nor list of str.: input.contents', 'type': 'InvalidParameter', 'param': None, 'code': 'InvalidParameter'}, 'id': '928d36f0-6097-9f7f-964d-f521ae99bfa0', 'request_id': '928d36f0-6097-9f7f-964d-f521ae99bfa0'}
    3.2 ValueError: Tag "<400>" does not correspond to any known color directive, make sure you did not misspelled it (or prepend '\' to escape it)
    3.3 ValueError: Tag "<400>" does not correspond to any known color directive, make sure you did not misspelled it (or prepend '\' to escape it)
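To narrow down whether the 400 comes from the DashScope endpoint itself or from how the request is built on the way there, a minimal sketch like the one below calls the embedding API directly through the OpenAI-compatible interface. The base URL, the DASHSCOPE_API_KEY environment variable, and the token-id payload are my assumptions for illustration, not something taken from the Langchain-Chatchat code or config.

```python
# Minimal isolation sketch (assumptions: DashScope's OpenAI-compatible endpoint
# and a DASHSCOPE_API_KEY environment variable; text-embedding-v3 is the model
# I configured in model_settings.yaml).
import os

from openai import OpenAI, BadRequestError

client = OpenAI(
    api_key=os.environ["DASHSCOPE_API_KEY"],
    base_url="https://dashscope.aliyuncs.com/compatible-mode/v1",
)

# A plain list of strings succeeds, so the endpoint and key look fine.
ok = client.embeddings.create(model="text-embedding-v3", input=["测试文本"])
print("string input ->", len(ok.data[0].embedding), "dimensions")

# The traceback below goes through langchain_openai's _get_len_safe_embeddings,
# which pre-tokenizes documents and sends lists of token ids. Sending token ids
# here (an assumed payload) seems to trigger the same
# "contents is neither str nor list of str" 400 from DashScope.
try:
    client.embeddings.create(model="text-embedding-v3", input=[[3923, 374, 264]])
except BadRequestError as e:
    print("token-id input ->", e)
```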

Full error output
(chatchat) PS D:\llm_project\Langchain-Chatchat-master> chatchat kb -r
C:\Users\Jacob\.conda\envs\chatchat\lib\site-packages\pydantic_settings\main.py:426: UserWarning: Config key json_file is set in model_config but will be ignored because no JsonConfigSettingsSource source is configured. To use this config key,
add a JsonConfigSettingsSource source to the settings sources via the settings_customise_sources hook.
self._settings_warn_unused_config_keys(sources, self.model_config)
C:\Users\Jacob\.conda\envs\chatchat\lib\site-packages\langchain\_api\module_import.py:87: LangChainDeprecationWarning: Importing GuardrailsOutputParser from langchain.output_parsers is deprecated. Please replace the import with the following:
from langchain_community.output_parsers.rail_parser import GuardrailsOutputParser
warnings.warn(
C:\Users\Jacob\.conda\envs\chatchat\lib\site-packages\pydantic_settings\main.py:426: UserWarning: Config key json_file is set in model_config but will be ignored because no JsonConfigSettingsSource source is configured. To use this config key,
add a JsonConfigSettingsSource source to the settings sources via the settings_customise_sources hook.
self._settings_warn_unused_config_keys(sources, self.model_config)
C:\Users\Jacob\.conda\envs\chatchat\lib\site-packages\langchain\_api\module_import.py:87: LangChainDeprecationWarning: Importing GuardrailsOutputParser from langchain.output_parsers is deprecated. Please replace the import with the following:
from langchain_community.output_parsers.rail_parser import GuardrailsOutputParser
warnings.warn(
recreating all vector stores
2025-12-26 15:25:55.437 | INFO | chatchat.server.knowledge_base.kb_cache.faiss_cache:load_vector_store:109 - loading vector store in 'samples/vector_store/text-embedding-v4' from disk.
Process Process-1:
Traceback (most recent call last):
File "D:\llm_project\Langchain-Chatchat-master\libs\chatchat-server\chatchat\server\knowledge_base\kb_cache\faiss_cache.py", line 126, in load_vector_store
vector_store = self.new_vector_store(
File "D:\llm_project\Langchain-Chatchat-master\libs\chatchat-server\chatchat\server\knowledge_base\kb_cache\faiss_cache.py", line 63, in new_vector_store
vector_store = FAISS.from_documents([doc], embeddings, normalize_L2=True)
File "C:\Users\Jacob.conda\envs\chatchat\lib\site-packages\langchain_core\vectorstores.py", line 550, in from_documents
return cls.from_texts(texts, embedding, metadatas=metadatas, **kwargs)
File "C:\Users\Jacob.conda\envs\chatchat\lib\site-packages\langchain_community\vectorstores\faiss.py", line 930, in from_texts
embeddings = embedding.embed_documents(texts)
File "C:\Users\Jacob.conda\envs\chatchat\lib\site-packages\langchain_openai\embeddings\base.py", line 480, in embed_documents
return self._get_len_safe_embeddings(texts, engine=engine)
File "C:\Users\Jacob.conda\envs\chatchat\lib\site-packages\langchain_openai\embeddings\base.py", line 323, in _get_len_safe_embeddings
response = self.client.create(
File "C:\Users\Jacob.conda\envs\chatchat\lib\site-packages\openai\resources\embeddings.py", line 132, in create
return self._post(
File "C:\Users\Jacob.conda\envs\chatchat\lib\site-packages\openai_base_client.py", line 1259, in post
return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
File "C:\Users\Jacob.conda\envs\chatchat\lib\site-packages\openai_base_client.py", line 1047, in request
raise self._make_status_error_from_response(err.response) from None
openai.BadRequestError: Error code: 400 - {'error': {'message': '<400> InternalError.Algo.InvalidParameter: Value error, contents is neither str nor list of str.: input.contents', 'type': 'InvalidParameter', 'param': None, 'code': 'InvalidParameter'}, 'id': '928d36f0-6097-9f7f-964d-f521ae99bfa0', 'request_id': '928d36f0-6097-9f7f-964d-f521ae99bfa0'}

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "D:\llm_project\Langchain-Chatchat-master\libs\chatchat-server\chatchat\init_database.py", line 40, in worker
folder2db(
File "D:\llm_project\Langchain-Chatchat-master\libs\chatchat-server\chatchat\server\knowledge_base\migrate.py", line 156, in folder2db
kb.create_kb()
File "D:\llm_project\Langchain-Chatchat-master\libs\chatchat-server\chatchat\server\knowledge_base\kb_service\base.py", line 98, in create_kb
self.do_create_kb()
File "D:\llm_project\Langchain-Chatchat-master\libs\chatchat-server\chatchat\server\knowledge_base\kb_service\faiss_kb_service.py", line 57, in do_create_kb
self.load_vector_store()
File "D:\llm_project\Langchain-Chatchat-master\libs\chatchat-server\chatchat\server\knowledge_base\kb_service\faiss_kb_service.py", line 32, in load_vector_store
return kb_faiss_pool.load_vector_store(
File "D:\llm_project\Langchain-Chatchat-master\libs\chatchat-server\chatchat\server\knowledge_base\kb_cache\faiss_cache.py", line 140, in load_vector_store
logger.exception(e)
File "C:\Users\Jacob.conda\envs\chatchat\lib\site-packages\loguru_logger.py", line 2099, in exception
__self._log("ERROR", False, options, __message, args, kwargs)
File "C:\Users\Jacob.conda\envs\chatchat\lib\site-packages\loguru_logger.py", line 2051, in _log
colored_message = Colorizer.prepare_simple_message(str(message))
File "C:\Users\Jacob.conda\envs\chatchat\lib\site-packages\loguru_colorizer.py", line 378, in prepare_simple_message
parser.feed(string)
File "C:\Users\Jacob.conda\envs\chatchat\lib\site-packages\loguru_colorizer.py", line 260, in feed
raise ValueError(
ValueError: Tag "<400>" does not correspond to any known color directive, make sure you did not misspelled it (or prepend '' to escape it)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "C:\Users\Jacob.conda\envs\chatchat\lib\multiprocessing\process.py", line 314, in _bootstrap
self.run()
File "C:\Users\Jacob.conda\envs\chatchat\lib\multiprocessing\process.py", line 108, in run
self._target(*self._args, **self._kwargs)
File "D:\llm_project\Langchain-Chatchat-master\libs\chatchat-server\chatchat\init_database.py", line 61, in worker
logger.exception(e)
File "C:\Users\Jacob.conda\envs\chatchat\lib\site-packages\loguru_logger.py", line 2099, in exception
__self._log("ERROR", False, options, __message, args, kwargs)
File "C:\Users\Jacob.conda\envs\chatchat\lib\site-packages\loguru_logger.py", line 2051, in _log
colored_message = Colorizer.prepare_simple_message(str(message))
File "C:\Users\Jacob.conda\envs\chatchat\lib\site-packages\loguru_colorizer.py", line 378, in prepare_simple_message
parser.feed(string)
File "C:\Users\Jacob.conda\envs\chatchat\lib\site-packages\loguru_colorizer.py", line 260, in feed
raise ValueError(
ValueError: Tag "<400>" does not correspond to any known color directive, make sure you did not misspelled it (or prepend '' to escape it)
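As far as I can tell from the traceback, the second and third ValueError come from loguru rather than from the embedding call: the exception is logged with colorization enabled, and the literal <400> in DashScope's error message is parsed as an unknown color tag, which ends up masking the original BadRequestError. Below is a minimal sketch of that behavior, assuming a colorized loguru call; it is not Langchain-Chatchat's actual logging setup, and the message string is abbreviated.

```python
# Minimal sketch of the secondary failure, assuming colors are enabled on the
# loguru call (an assumption; chatchat's own logger configuration may differ).
from loguru import logger

msg = "Error code: 400 - <400> InternalError.Algo.InvalidParameter ..."

try:
    # With colors enabled, loguru parses "<400>" as markup and raises:
    # ValueError: Tag "<400>" does not correspond to any known color directive ...
    logger.opt(colors=True).error(msg)
except ValueError as e:
    print("colorizer rejected the message:", e)

# Logging the same message with colors disabled, or with "<" escaped as "\<",
# prints it verbatim instead of raising.
logger.opt(colors=False).error(msg)
logger.opt(colors=True).error(msg.replace("<", r"\<"))
```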
