feat: add venice support #142
📝 Walkthrough

Adds Venice AI provider and runtime selection: new env vars and dependencies, a Venice provider config in models.ts, and a getModel() function used by the chat route to choose between Anthropic and Venice at runtime.
Sequence Diagram

```mermaid
sequenceDiagram
    actor Client
    participant ChatRoute as Chat Route
    participant getModel as getModel()
    participant Env as Environment
    participant Provider as Provider<br/>(Anthropic or Venice)
    Client->>ChatRoute: POST /chat
    ChatRoute->>getModel: request model instance
    getModel->>Env: read AI_PROVIDER
    alt AI_PROVIDER == "venice"
        getModel->>Provider: configure Venice (openai-compatible)
    else
        getModel->>Provider: configure Anthropic
    end
    getModel-->>ChatRoute: return LanguageModelV2
    ChatRoute->>Provider: stream completion/request
    Provider-->>ChatRoute: stream tokens
    ChatRoute-->>Client: streamed response
```
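To make that flow concrete, here is a minimal sketch of how a provider map plus getModel() selection could be wired. It assumes createAnthropic from @ai-sdk/anthropic and createOpenAICompatible from @ai-sdk/openai-compatible; the Anthropic model ID is a placeholder, and the actual models.ts in the diff may differ (see the review notes below).

```typescript
// Sketch only: illustrative wiring of runtime provider selection, not the PR's exact code.
import { createAnthropic } from '@ai-sdk/anthropic'
import { createOpenAICompatible } from '@ai-sdk/openai-compatible'
import type { LanguageModelV2 } from '@ai-sdk/provider'

const models = {
  anthropic: {
    provider: createAnthropic({ apiKey: process.env.ANTHROPIC_API_KEY }),
    model: 'claude-sonnet-4-5', // placeholder model ID
  },
  venice: {
    provider: createOpenAICompatible({
      name: 'venice',
      baseURL: 'https://api.venice.ai/api/v1', // OpenAI-compatible endpoint noted in the review
      apiKey: process.env.VENICE_API_KEY, // sent as an Authorization: Bearer header
    }),
    model: 'zai-org-glm-4.6',
  },
}

type AIProvider = keyof typeof models

// Pick a provider at request time from AI_PROVIDER, falling back to Anthropic.
export function getModel(): LanguageModelV2 {
  const provider = (process.env.AI_PROVIDER || 'anthropic') as AIProvider
  const config = models[provider] || models.anthropic
  return config.provider(config.model)
}
```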
Estimated code review effort: 🎯 3 (Moderate) | ⏱️ ~20 minutes
Pre-merge checks: ❌ Failed checks (1 warning), ✅ Passed checks (2 passed)
Actionable comments posted: 1
🧹 Nitpick comments (1)
.env.example (1)
2-3: Clarify provider config and align with .env linting (optional)

The new VENICE_API_KEY and AI_PROVIDER entries make the provider setup clear. To satisfy dotenv-linter and avoid any parsing surprises, consider:

- Quoting the AI_PROVIDER value so the inline comment is unambiguous, e.g. `AI_PROVIDER="anthropic" # 'anthropic' or 'venice'`.
- (Optional) Moving AI_PROVIDER above ANTHROPIC_API_KEY if you want to follow the linter's key-ordering rule.

Functionally this is fine as-is; these tweaks are mainly for tooling cleanliness.
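For illustration only, a version of .env.example that applies both optional suggestions (quoted value, AI_PROVIDER ordered before ANTHROPIC_API_KEY) might look like this; the key names come from this PR, the ordering is just the linter's preference:

```
# AI provider selection: 'anthropic' (default) or 'venice'
AI_PROVIDER="anthropic"
ANTHROPIC_API_KEY=sk-ant-
VENICE_API_KEY=
```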
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
Disabled knowledge base sources:
- Linear integration is disabled by default for public repositories
You can enable these sources in your CodeRabbit configuration.
⛔ Files ignored due to path filters (1)
`bun.lock` is excluded by `!**/*.lock`
📒 Files selected for processing (4)
- .env.example (1 hunks)
- apps/agentic-server/package.json (1 hunks)
- apps/agentic-server/src/models.ts (1 hunks)
- apps/agentic-server/src/routes/chat.ts (2 hunks)
🧰 Additional context used
🧬 Code graph analysis (1)
apps/agentic-server/src/routes/chat.ts (1)
apps/agentic-server/src/models.ts (1)
getModel (37-44)
🪛 dotenv-linter (4.0.0)
.env.example
[warning] 3-3: [UnorderedKey] The AI_PROVIDER key should go before the ANTHROPIC_API_KEY key
(UnorderedKey)
[warning] 3-3: [ValueWithoutQuotes] This value needs to be surrounded in quotes
(ValueWithoutQuotes)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (1)
- GitHub Check: main
🔇 Additional comments (2)
apps/agentic-server/src/routes/chat.ts (1)
17-17: Type compatibility confirmed: getModel() correctly implements provider switching

The implementation properly centralizes model selection. getModel() explicitly returns LanguageModelV2 (line 37 of models.ts), which is the correct type for streamText in [email protected]. Both Anthropic (via createAnthropic) and Venice (via createOpenAICompatible) providers correctly instantiate models that satisfy the LanguageModelV2 interface.

The env-based provider switching via AI_PROVIDER works as intended, defaulting to anthropic. No per-provider configuration differences require adjustment; both providers are configured with their respective API keys and endpoints.
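As a rough illustration of what that call site can look like, here is a hedged sketch of a chat handler that feeds getModel() into streamText from ai v5; the handler name, message handling, and response helper are assumptions, not the PR's actual route code.

```typescript
// Hypothetical chat handler: the real route in apps/agentic-server/src/routes/chat.ts may differ.
import { streamText, convertToModelMessages, type UIMessage } from 'ai'
import { getModel } from '../models'

export function handleChat(messages: UIMessage[]): Response {
  const result = streamText({
    model: getModel(), // provider chosen from AI_PROVIDER at request time
    messages: convertToModelMessages(messages),
  })
  // Stream tokens back to the client as they arrive.
  return result.toUIMessageStreamResponse()
}
```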
apps/agentic-server/src/models.ts (1)

2-44: Venice integration and getModel() design are correctly configured

The setup is clean and well-structured: one central models map, AIProvider derived from it, and getModel() handling env selection with a safe fallback to Anthropic plus useful logging.

All Venice configuration details are accurate:

- Base URL https://api.venice.ai/api/v1 is correct for the OpenAI-compatible endpoint.
- Authorization via Bearer <VENICE_API_KEY> in the header is the expected auth mechanism.
- Model ID zai-org-glm-4.6 is valid, available, and supports ~128k token context.

The fallback logic `const config = models[provider] || models.anthropic` provides good runtime safety if AI_PROVIDER is mis-set. API key handling (lazy loading on first use) matches your existing Anthropic approach and is acceptable for this use case.

If you want slightly clearer observability, consider logging when an unknown AI_PROVIDER value falls back to Anthropic, but this is optional:

```typescript
export function getModel(): LanguageModelV2 {
  const raw = process.env.AI_PROVIDER || 'anthropic'
  const provider = (raw in models ? raw : 'anthropic') as AIProvider
  const config = models[provider]
  if (raw !== provider) {
    console.warn(`[AI] Unknown AI_PROVIDER="${raw}", falling back to "${provider}".`)
  }
  console.log(`[AI] Using provider: ${provider}, model: ${config.model}`)
  return config.provider(config.model)
}
```
7db24f0 to a8910d4
Actionable comments posted: 0
♻️ Duplicate comments (1)
apps/agentic-server/package.json (1)
24-25: Dependency compatibility issue already flagged.

This concern was already raised in a previous review: @ai-sdk/openai-compatible@^1.0.28 is incompatible with ai@^5.0.76 and will cause runtime errors. Please refer to the existing review comment for resolution options.
🧹 Nitpick comments (1)
.env.example (1)
2-3: LGTM! Clear documentation for Venice integration.

The new environment variables are well-documented with sensible defaults. The inline comment clearly indicates the valid options for AI_PROVIDER.

Optional: The static analysis tool suggests alphabetical ordering and adding quotes around the value. While not required, you could apply this diff for consistency:

```diff
 ANTHROPIC_API_KEY=sk-ant-
+AI_PROVIDER="anthropic" # 'anthropic' or 'venice'
 VENICE_API_KEY=
-AI_PROVIDER=anthropic # 'anthropic' or 'venice'
```
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
Disabled knowledge base sources:
- Linear integration is disabled by default for public repositories
You can enable these sources in your CodeRabbit configuration.
⛔ Files ignored due to path filters (1)
`bun.lock` is excluded by `!**/*.lock`
📒 Files selected for processing (4)
- .env.example (1 hunks)
- apps/agentic-server/package.json (1 hunks)
- apps/agentic-server/src/models.ts (1 hunks)
- apps/agentic-server/src/routes/chat.ts (2 hunks)
🚧 Files skipped from review as they are similar to previous changes (1)
- apps/agentic-server/src/models.ts
🧰 Additional context used
🪛 dotenv-linter (4.0.0)
.env.example
[warning] 3-3: [UnorderedKey] The AI_PROVIDER key should go before the ANTHROPIC_API_KEY key
(UnorderedKey)
[warning] 3-3: [ValueWithoutQuotes] This value needs to be surrounded in quotes
(ValueWithoutQuotes)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (1)
- GitHub Check: main
🔇 Additional comments (1)
apps/agentic-server/src/routes/chat.ts (1)
17-17: Error handling in getModel() is adequate.

The getModel() function correctly handles the key error scenarios:

- Missing/invalid AI_PROVIDER: Defaults to 'anthropic' and falls back to the anthropic configuration for unknown providers (lines 38-39).
- Return type: Properly annotated as LanguageModelV2 (line 37) and correctly invoked as config.provider(config.model) (line 43), ensuring compatibility with streamText.
- Missing API keys: While keys (ANTHROPIC_API_KEY, VENICE_API_KEY) are passed to providers at creation time, validation is appropriately deferred to runtime when API calls occur. This is standard practice with provider SDKs.

The refactoring is sound and ready to merge.
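If fail-fast behavior were ever preferred over deferring to the first API call, a small guard along these lines could be called from getModel(); this is a hypothetical sketch, not something the PR includes or the review requests.

```typescript
// Hypothetical helper (not in the PR): surface a missing key before any request is made.
function assertApiKey(provider: 'anthropic' | 'venice'): void {
  const requiredKey = provider === 'venice' ? 'VENICE_API_KEY' : 'ANTHROPIC_API_KEY'
  if (!process.env[requiredKey]) {
    throw new Error(`[AI] AI_PROVIDER="${provider}" requires ${requiredKey} to be set`)
  }
}
```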
Summary by CodeRabbit
New Features
Chores