Commit 5b14e3e
feat: complete vLLM RunPod integration with modern chat interface
- Add comprehensive vLLM service with dual API support (Native + OpenAI compatible)
- Create modern chat interface similar to Qwen/DeepSeek with light/dark mode
- Implement useInference hook for model management and cost optimization
- Add ThemeProvider integration with next-themes
- Update marketplace with real inference capabilities and testing
- Add TypeScript interfaces for vLLM requests/responses
- Configure RunPod environment variables for serverless deployment
- Fix Next.js config compatibility issues
- Integrate organization-specific model configurations
🤖 Generated with Claude Code
Co-Authored-By: Claude <noreply@anthropic.com>

1 parent a0c66ba · commit 5b14e3e
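The commit mentions TypeScript interfaces for vLLM requests and responses via the OpenAI-compatible API. A minimal sketch of what such interfaces might look like is below; the type and function names are hypothetical, not taken from the commit, though the field names follow the OpenAI chat-completions wire format that vLLM's compatible server exposes.

```typescript
// Hypothetical request/response types for vLLM's OpenAI-compatible
// chat endpoint. Field names mirror the OpenAI chat-completions format.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

interface VllmChatRequest {
  model: string;
  messages: ChatMessage[];
  temperature?: number;
  max_tokens?: number;
  stream?: boolean;
}

interface VllmChatResponse {
  id: string;
  choices: { index: number; message: ChatMessage; finish_reason: string }[];
  usage: {
    prompt_tokens: number;
    completion_tokens: number;
    total_tokens: number;
  };
}

// Build a single-turn request payload with conservative defaults.
function buildChatRequest(model: string, prompt: string): VllmChatRequest {
  return {
    model,
    messages: [{ role: "user", content: prompt }],
    temperature: 0.7,
    max_tokens: 512,
    stream: false,
  };
}
```

A `VllmChatResponse` would then be parsed from the endpoint's JSON body; only `choices[0].message.content` is needed to render a reply in the chat UI.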
16 files changed: 2,700 additions and 24 deletions
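The `useInference` hook is described as handling model management and cost optimization. One plausible core of such a hook, stripped of React wiring, is a pure model-selection step: pick the cheapest model whose context window fits the prompt. This is an illustrative sketch; the model names, prices, and function names are invented, not taken from the commit.

```typescript
// Hypothetical cost-optimization step a useInference hook might perform:
// choose the cheapest model whose context window can hold the prompt.
interface ModelOption {
  name: string;
  costPer1kTokens: number; // USD, illustrative pricing
  maxContext: number;      // context window in tokens
}

function pickModel(
  options: ModelOption[],
  promptTokens: number
): ModelOption | undefined {
  return options
    .filter((m) => m.maxContext >= promptTokens)
    .sort((a, b) => a.costPer1kTokens - b.costPer1kTokens)[0];
}

const models: ModelOption[] = [
  { name: "small-7b", costPer1kTokens: 0.0002, maxContext: 8192 },
  { name: "large-72b", costPer1kTokens: 0.0012, maxContext: 32768 },
];
```

Keeping this logic pure makes it trivial to unit-test outside React; the hook itself would wrap it with state for the selected model and the in-flight request.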
File tree
- public
- src
  - app
    - chat
    - marketplace
  - components
    - chat
    - terminal
  - hooks
  - providers
  - services/runpod
  - types
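The commit configures RunPod environment variables for serverless deployment under `services/runpod`. A minimal sketch of the request plumbing is below; the endpoint URL shape follows RunPod's documented OpenAI-compatible serverless route, but the helper names are assumptions, and the endpoint ID and API key would come from environment variables whose exact names the commit does not show.

```typescript
// Hypothetical helpers for calling a RunPod serverless vLLM worker via
// its OpenAI-compatible route. In the app, endpointId and apiKey would be
// read from environment variables (names assumed, not from the commit).
function runpodChatUrl(endpointId: string): string {
  return `https://api.runpod.ai/v2/${endpointId}/openai/v1/chat/completions`;
}

function authHeaders(apiKey: string): Record<string, string> {
  return {
    "Content-Type": "application/json",
    Authorization: `Bearer ${apiKey}`,
  };
}
```

Usage would be a single `fetch(runpodChatUrl(id), { method: "POST", headers: authHeaders(key), body: JSON.stringify(request) })`; keeping the API key server-side (never in client-bundled `NEXT_PUBLIC_` variables) is the important deployment detail.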