The LLM API is the OpenAI-compatible API that the bot uses to generate responses from an LLM.
You choose which LLM provider API to use:

- the Qwen Code API on your VM
- the OpenRouter API
The API key for your LLM provider API:

- For the Qwen Code API: the value of `QWEN_CODE_API_KEY` from `qwen-code-api/.env.secret`.
- For the OpenRouter API: your OpenRouter API key.

The LLM API key (without `<` and `>`).
The base URL of the OpenAI-compatible API endpoint:

- For the Qwen Code API on your VM: `<lms-api-url>/utils/qwen-code-api/v1`. See `<lms-api-url>`.
- For the OpenRouter API: `https://openrouter.ai/api/v1`.

The LLM API base URL (without `<` and `>`).
The name of the LLM model to use via the LLM provider API:

- For the Qwen Code API: `coder-model`.
- For the OpenRouter API: `meta-llama/llama-3.3-70b-instruct:free`.

The LLM API model (without `<` and `>`).
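To see how the three settings fit together, here is a hedged sketch of the request the bot would send to an OpenAI-compatible API. Only the `/chat/completions` path and the `Authorization: Bearer` header follow the OpenAI API convention; the helper name, the example key, and the prompt are illustrative.

```python
# Sketch: combine base URL, API key, and model name into an
# OpenAI-compatible chat-completions request (not sent here).
import json
import urllib.request


def build_chat_request(base_url: str, api_key: str, model: str,
                       prompt: str) -> urllib.request.Request:
    """Build a POST request to <base-url>/chat/completions."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        url=base_url.rstrip("/") + "/chat/completions",
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {api_key}",  # the LLM API key
            "Content-Type": "application/json",
        },
        method="POST",
    )


# Example with the OpenRouter values above; "sk-example" is a
# hypothetical key, not a real credential.
req = build_chat_request(
    "https://openrouter.ai/api/v1",
    "sk-example",
    "meta-llama/llama-3.3-70b-instruct:free",
    "Hello!",
)
```

Sending the request (e.g. with `urllib.request.urlopen(req)`) would return a JSON body whose generated text the bot reads from the response's `choices` list.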