@morning3tar
This pull request adds support for using OpenRouter.ai models as LLM backends in AgentLaboratory.
It allows users to select any OpenRouter-supported model (including free models) via the YAML config and routes requests to the OpenRouter API.
The OpenRouter API logic is separated into its own module (openrouter_inference.py) for maintainability.
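
A minimal sketch of what such a module looks like, assuming OpenRouter's OpenAI-compatible chat-completions endpoint at https://openrouter.ai/api/v1; the function name, signature, and environment variable here are illustrative, not the exact code in this PR:

```python
# Illustrative sketch, not the exact contents of openrouter_inference.py.
import os
import requests

OPENROUTER_URL = "https://openrouter.ai/api/v1/chat/completions"

def query_openrouter(model, messages, api_key=None):
    """Send a chat request to OpenRouter and return the reply text.

    `model` is the OpenRouter model id, i.e. the llm-backend value with
    the leading "openrouter/" prefix stripped, e.g.
    "deepseek/deepseek-chat-v3-0324:free".
    """
    api_key = api_key or os.environ["OPENROUTER_API_KEY"]
    resp = requests.post(
        OPENROUTER_URL,
        headers={"Authorization": f"Bearer {api_key}"},
        json={"model": model, "messages": messages},
        timeout=120,
    )
    resp.raise_for_status()  # surface HTTP errors (bad key, rate limit, ...)
    return resp.json()["choices"][0]["message"]["content"]
```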
For OpenRouter models and any other model name the tokenizer does not recognize, the code falls back to the cl100k_base tokenizer so token counting does not fail.
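
A sketch of that fallback, assuming token counting goes through tiktoken, whose encoding_for_model raises KeyError for names it does not know:

```python
# Sketch of the tokenizer fallback; assumes tiktoken is the library in use.
import tiktoken

def get_encoding(model):
    try:
        return tiktoken.encoding_for_model(model)  # known OpenAI model names
    except KeyError:
        # OpenRouter / unrecognized names are not in tiktoken's registry,
        # so fall back to a safe default instead of raising.
        return tiktoken.get_encoding("cl100k_base")
```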

How to Use

  • Add your OpenRouter API key to your YAML config
  • Specify the backend model in your YAML config, e.g. llm-backend: "openrouter/deepseek/deepseek-chat-v3-0324:free" (see the example below)
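
A minimal example config; the llm-backend value comes from this PR's description, while the API-key field name is a placeholder for illustration, so check the repo's config schema for the exact key:

```yaml
# Example config: llm-backend matches this PR; the openrouter-api-key
# field name is hypothetical.
openrouter-api-key: "sk-or-..."
llm-backend: "openrouter/deepseek/deepseek-chat-v3-0324:free"
```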

Run AgentLaboratory as usual.
