feat(openrouter): add new models [bot] #1012
/test-models
Gateway test results
Failures (2)
Error: Code snippet

from openai import OpenAI

client = OpenAI(api_key="***", base_url="https://internal.devtest.truefoundry.tech/api/llm")
response = client.chat.completions.create(
    model="test-v2-openrouter/anthropic-claude-opus-4.7-fast",
    messages=[
        {"role": "user", "content": "List 3 colors with their hex codes in JSON."},
    ],
    response_format={"type": "json_object"},
    stream=False,
)

import json as _json

_content = response.choices[0].message.content
print(_content)
if not _content:
    raise Exception("VALIDATION FAILED: json-output - response content is empty")
_json.loads(_content)
print("VALIDATION: json-output SUCCESS")
Error: Code snippet

from openai import OpenAI

client = OpenAI(api_key="***", base_url="https://internal.devtest.truefoundry.tech/api/llm")
response = client.chat.completions.create(
    model="test-v2-openrouter/anthropic-claude-opus-4.7-fast",
    messages=[
        {"role": "user", "content": "List 3 colors with their hex codes in JSON."},
    ],
    response_format={"type": "json_object"},
    stream=True,
)

import json as _json

_accumulated = ""
for chunk in response:
    if chunk.choices and len(chunk.choices) > 0:
        delta = chunk.choices[0].delta
        if delta.content is not None:
            _accumulated += delta.content
            print(delta.content, end="", flush=True)
if not _accumulated:
    raise Exception("VALIDATION FAILED: json-output stream - no content received")
_json.loads(_accumulated)
print("\nVALIDATION: json-output stream SUCCESS")
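Both failures come from `_json.loads` on the returned content. One possible cause (an assumption; the actual error text is truncated in these logs) is the model wrapping its JSON in markdown code fences, which `json.loads` rejects. A minimal fence-tolerant parse, using a hypothetical helper name:

```python
import json


def parse_possibly_fenced_json(text: str):
    """Parse JSON, tolerating a surrounding markdown code fence."""
    stripped = text.strip()
    if stripped.startswith("```"):
        lines = stripped.splitlines()
        # drop the opening fence line (e.g. ```json) and the closing fence if present
        if lines and lines[-1].strip() == "```":
            lines = lines[1:-1]
        else:
            lines = lines[1:]
        stripped = "\n".join(lines)
    return json.loads(stripped)


print(parse_possibly_fenced_json('```json\n{"red": "#FF0000"}\n```'))
# → {'red': '#FF0000'}
```

This is only a debugging sketch for local reproduction; it does not change what the gateway test harness itself validates.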
Cursor Bugbot has reviewed your changes and found 1 potential issue.
Reviewed by Cursor Bugbot for commit a7945cf.

Auto-generated by model-addition-agent for provider openrouter.

Note (Low Risk)
Low risk: adds new OpenRouter model definition YAMLs without modifying runtime logic; the main risk is incorrect metadata (costs/limits/features) impacting pricing or capability selection.

Overview
Adds two new OpenRouter model definition YAMLs: anthropic/claude-opus-4.7-fast and perceptron/perceptron-mk1. Each config declares pricing, context/output limits, modalities, and supported features (e.g., prompt caching/tooling for Claude; structured output for Perceptron), and marks both models as active chat serverless offerings with thinking: true.

Reviewed by Cursor Bugbot for commit c3f4d06.
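The overview describes each model definition declaring pricing, limits, modalities, and features. A sketch of what one such YAML might look like; all field names and values below are illustrative assumptions, not the repo's actual schema:

```yaml
# Hypothetical model definition sketch (field names/values assumed, not confirmed)
name: anthropic/claude-opus-4.7-fast
provider: openrouter
active: true
type: chat
serverless: true
thinking: true
modalities:
  input: [text, image]
  output: [text]
limits:
  context_length: 200000
  max_output_tokens: 32000
features:
  - prompt-caching
  - tool-calling
cost:
  input_per_million_tokens: 5.00
  output_per_million_tokens: 25.00
```

Since these files feed pricing and capability selection, the metadata here is exactly the "low risk" surface the review note calls out.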