
[Provider] Add DeepSeek support #1846

@Konstantinov-Innokentii

Description


Provider Name: DeepSeek

Provider Website: https://deepseek.com/

API Documentation: https://api-docs.deepseek.com/

Implementation Checklist

Adding a new provider involves integrating it into both LLM Proxy and Chat. For detailed guidance, see our Adding LLM Providers documentation.

Requirements

When submitting a PR to add this provider, please ensure:

1. API Key Instructions

Include clear instructions on how to obtain an API key for testing. This helps reviewers verify the integration works correctly.
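For reviewers, a quick way to confirm a key works is to hit DeepSeek's OpenAI-compatible model-listing endpoint. This is only a sketch based on the public API docs; confirm the base URL and path against https://api-docs.deepseek.com/.

```ts
// Sketch: verify a DeepSeek API key by listing models.
// Assumes DeepSeek's OpenAI-compatible REST API at https://api.deepseek.com
// (confirm the base URL and path against the API docs).
const apiKey = process.env.DEEPSEEK_API_KEY;

const res = await fetch("https://api.deepseek.com/models", {
  headers: { Authorization: `Bearer ${apiKey}` },
});

if (!res.ok) {
  throw new Error(`Key check failed: ${res.status} ${res.statusText}`);
}

const { data } = await res.json();
console.log(data.map((m: { id: string }) => m.id)); // e.g. ["deepseek-chat", "deepseek-reasoner"]
```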

2. Streaming Support

  • Non-streaming responses work correctly
  • Streaming responses work correctly (if supported by the provider)

If the provider doesn't support streaming, document this limitation.
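As a reference for implementers: DeepSeek exposes an OpenAI-compatible chat completions API, so both modes can be exercised with the OpenAI SDK pointed at DeepSeek's base URL. A minimal sketch (model name and base URL per the DeepSeek docs; adapt to however the LLM Proxy actually wires providers):

```ts
// Sketch: non-streaming and streaming completions against DeepSeek's
// OpenAI-compatible API, using the official `openai` npm package.
import OpenAI from "openai";

const client = new OpenAI({
  baseURL: "https://api.deepseek.com",
  apiKey: process.env.DEEPSEEK_API_KEY,
});

// Non-streaming: a single response object comes back.
const completion = await client.chat.completions.create({
  model: "deepseek-chat",
  messages: [{ role: "user", content: "Say hello" }],
});
console.log(completion.choices[0].message.content);

// Streaming: chunks arrive as an async iterable of deltas.
const stream = await client.chat.completions.create({
  model: "deepseek-chat",
  messages: [{ role: "user", content: "Say hello" }],
  stream: true,
});
for await (const chunk of stream) {
  process.stdout.write(chunk.choices[0]?.delta?.content ?? "");
}
```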

3. Feature Completeness

LLM Proxy:

  • Tool invocation and persistence
  • Token/cost limits
  • Model optimization
  • Tool results compression
  • Dual LLM verification
  • Metrics and observability

Chat:

  • Chat conversations work
  • Model listing and selection
  • Streaming responses
  • Error handling

4. Demo Video

Please include a demo video showing all the LLM Proxy and Chat features mentioned above working correctly with both:

  • Non-streaming responses
  • Streaming responses (it's ok to use Archestra Chat UI for this)

5. Documentation

Update the Supported LLM Providers page to include the new provider.

6. Testing

Make sure you add e2e tests for the new provider. See E2E Tests documentation for guidance.
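The real conventions live in the E2E Tests documentation; as an illustration only, a provider test usually boils down to sending a completion request through the proxy and asserting on the response shape. Everything below (the Vitest runner, the PROXY_URL, the endpoint path, the payload) is a hypothetical placeholder, not the project's actual API:

```ts
// Hypothetical sketch of an e2e test for the DeepSeek provider.
// The endpoint path, payload shape, and test runner (Vitest here) are
// placeholders; follow the E2E Tests documentation for the real setup.
import { describe, expect, it } from "vitest";

const PROXY_URL = process.env.PROXY_URL ?? "http://localhost:3000"; // placeholder

describe("DeepSeek provider", () => {
  it("returns a non-streaming completion through the proxy", async () => {
    const res = await fetch(`${PROXY_URL}/v1/chat/completions`, {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        Authorization: `Bearer ${process.env.DEEPSEEK_API_KEY}`,
      },
      body: JSON.stringify({
        model: "deepseek-chat",
        messages: [{ role: "user", content: "ping" }],
      }),
    });

    expect(res.ok).toBe(true);
    const body = await res.json();
    expect(body.choices[0].message.content).toBeTruthy();
  });
});
```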

Acceptance Criteria: We expect all LLM Proxy and Chat functionality to be supported by the provider and demonstrated in the demo video. This is required for PR approval.
