Describe the bug
The OpenAI-compatible ("OpenAI-like") API type is not supported; an error is raised instead.
To Reproduce
Steps to reproduce the behavior:
- I followed the setup steps from https://github.com/test-zeus-ai/testzeus-hercules
- agents_llm_config.json configuration:
{
  "mistral": {
    "planner_agent": {
      "model_name": "mistralai/Mistral-Small-3.1-24B-Instruct-2503",
      "model_api_key": "XXX",
      "model_base_url": "<local_vllm>",
      "model_api_type": "openai",
      "llm_config_params": {
        "cache_seed": null,
        "temperature": 0.0,
        "seed": 12345
      }
    },
    "nav_agent": {
      "model_name": "mistralai/Mistral-Small-3.1-24B-Instruct-2503",
      "model_api_key": "XXX",
      "model_base_url": "<local_vllm>",
      "model_api_type": "openai",
      "llm_config_params": {
        "cache_seed": null,
        "temperature": 0.0,
        "seed": 12345
      }
    }
  }
}
- .env configuration:
AGENTS_LLM_CONFIG_FILE=agents_llm_config.json
AGENTS_LLM_CONFIG_FILE_REF_KEY=mistral
TOKENIZERS_PARALLELISM=false
MODE=prod
HEADLESS=false
RECORD_VIDEO=false
TAKE_SCREENSHOTS=true
BROWSER_TYPE=chromium
CAPTURE_NETWORK=true
HF_HOME=./.cache
- Used the unchanged example from opt/input/test.feature.
- I used one of these commands:
A.
make run
This error is given:
openai.BadRequestError: Error code: 400 - {'detail': '400: Open WebUI: Server Connection Error'}
Full log: make_run.log
or
B.
testzeus-hercules --input-file opt/input/test.feature --output-path opt/output --test-data-path opt/test_data --agents-llm-config-file ./agents_llm_config.json --agents-llm-config-file-ref-key mistral
This error is given:
pydantic_core._pydantic_core.ValidationError: 1 validation error for _LLMConfig
config_list.0.openai.model
Field required [type=missing, input_value={'api_type': 'openai'}, input_type=dict]
Full log: testzeus.log
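For what it's worth, the validation error suggests the entry that reaches AutoGen's _LLMConfig contains only {'api_type': 'openai'}, i.e. the model_name / model_api_key / model_base_url keys from agents_llm_config.json are not being translated into the flat "model" / "api_key" / "base_url" shape that AutoGen's config_list entries normally use. A minimal sketch of that mapping (the flat key names come from the pydantic error and AutoGen's documented config_list format; the mapping function itself is my assumption, not Hercules code):

```python
# Sketch: translating one Hercules-style agent entry into the flat
# dict shape that AutoGen's config_list validation appears to expect,
# based on the "config_list.0.openai.model Field required" error above.
# The key mapping is an assumption, not confirmed against Hercules source.
def to_autogen_entry(agent_cfg: dict) -> dict:
    return {
        "model": agent_cfg["model_name"],          # pydantic reports "model" as required
        "api_key": agent_cfg["model_api_key"],
        "base_url": agent_cfg["model_base_url"],
        "api_type": agent_cfg["model_api_type"],
    }

planner = {
    "model_name": "mistralai/Mistral-Small-3.1-24B-Instruct-2503",
    "model_api_key": "XXX",
    "model_base_url": "<local_vllm>",  # placeholder from this report
    "model_api_type": "openai",
}
entry = to_autogen_entry(planner)
```

If the loader drops these keys instead of renaming them, that would explain why validation sees only {'api_type': 'openai'}.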
Expected behavior
I'm using a Mistral model served with vLLM (https://github.com/vllm-project/vllm), which exposes an OpenAI-compatible API.
Using "openai" as "model_api_type" doesn't work, but I expected it to.
Perhaps a different api_type is needed for OpenAI-compatible models.
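In case it helps triage error A: when pointing an OpenAI client at a vLLM server, the base URL normally has to include the /v1 prefix, since vLLM's OpenAI-compatible server exposes routes like /v1/chat/completions. A tiny hypothetical helper to illustrate the expected URL shape (not part of Hercules; the function name is mine):

```python
def normalize_openai_base_url(url: str) -> str:
    """Append the /v1 prefix that vLLM's OpenAI-compatible server
    expects, if it is missing. Hypothetical helper for illustration."""
    url = url.rstrip("/")
    if not url.endswith("/v1"):
        url += "/v1"
    return url

print(normalize_openai_base_url("http://localhost:8000"))    # -> http://localhost:8000/v1
print(normalize_openai_base_url("http://localhost:8000/v1")) # unchanged
```

If <local_vllm> in my config is missing the /v1 suffix, that could be a contributing factor, though the "Open WebUI: Server Connection Error" message suggests the request is reaching a proxy rather than vLLM directly.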
Desktop
- OS: macOS
- Version: Sonoma 14.6