
Commit d72e7a7

Merge pull request #114 from jamesrochabrun/jroch-reasoning-object
DeepSeek reasoning content support.
2 parents: c581d02 + 328ad63

File tree

3 files changed: +67 -5 lines changed

README.md

+61 -5
@@ -134,7 +134,6 @@ let service = OpenAIServiceFactory.service(apiKey: apiKey, organizationID: ogani
 
 That's all you need to begin accessing the full range of OpenAI endpoints.
 
-
 ### How to get the status code of network errors
 
 You may want to build UI around the type of error that the API returns.

@@ -3289,19 +3288,76 @@ For more information about the `OpenRouter` api visit its [documentation](https:
 
 The [DeepSeek](https://api-docs.deepseek.com/) API uses an API format compatible with OpenAI. By modifying the configuration, you can use SwiftOpenAI to access the DeepSeek API.
 
+Creating the service
+
 ```swift
-// Creating the service
 
 let apiKey = "your_api_key"
 let service = OpenAIServiceFactory.service(
    apiKey: apiKey,
    overrideBaseURL: "https://api.deepseek.com")
+```
 
-// Making a request
+Non-Streaming Example
 
+```swift
 let prompt = "What is the Manhattan project?"
-let parameters = ChatCompletionParameters(messages: [.init(role: .user, content: .text(prompt))], model: .custom("deepseek-reasoner"))
-let stream = service.startStreamedChat(parameters: parameters)
+let parameters = ChatCompletionParameters(
+   messages: [.init(role: .user, content: .text(prompt))],
+   model: .custom("deepseek-reasoner")
+)
+
+do {
+   let result = try await service.chat(parameters: parameters)
+
+   // Access the response content
+   if let content = result.choices.first?.message.content {
+      print("Response: \(content)")
+   }
+
+   // Access reasoning content if available
+   if let reasoning = result.choices.first?.message.reasoningContent {
+      print("Reasoning: \(reasoning)")
+   }
+} catch {
+   print("Error: \(error)")
+}
+```
+
+Streaming Example
+
+```swift
+let prompt = "What is the Manhattan project?"
+let parameters = ChatCompletionParameters(
+   messages: [.init(role: .user, content: .text(prompt))],
+   model: .custom("deepseek-reasoner")
+)
+
+// Start the stream
+do {
+   let stream = try await service.startStreamedChat(parameters: parameters)
+   for try await result in stream {
+      let content = result.choices.first?.delta.content ?? ""
+      self.message += content
+
+      // Optional: Handle reasoning content if available
+      if let reasoning = result.choices.first?.delta.reasoningContent {
+         self.reasoningMessage += reasoning
+      }
+   }
+} catch APIError.responseUnsuccessful(let description, let statusCode) {
+   self.errorMessage = "Network error with status code: \(statusCode) and description: \(description)"
+} catch {
+   self.errorMessage = error.localizedDescription
+}
+```
+
+Notes
+
+- The DeepSeek API is compatible with OpenAI's format but uses different model names
+- Use `.custom("deepseek-reasoner")` to specify the DeepSeek model
+- The `reasoningContent` field is optional and specific to DeepSeek's API
+- Error handling follows the same pattern as standard OpenAI requests.
 ```
 
 For more information about the `DeepSeek` api visit its [documentation](https://api-docs.deepseek.com).

Sources/OpenAI/Public/ResponseModels/Chat/ChatCompletionChunkObject.swift

+3
@@ -45,6 +45,8 @@ public struct ChatCompletionChunkObject: Decodable {
 
   /// The contents of the chunk message.
   public let content: String?
+  /// The reasoning content generated by the model, if available.
+  public let reasoningContent: String?
   /// The tool calls generated by the model, such as function calls.
   public let toolCalls: [ToolCall]?
   /// The name and arguments of a function that should be called, as generated by the model.

@@ -57,6 +59,7 @@ public struct ChatCompletionChunkObject: Decodable {
 
   enum CodingKeys: String, CodingKey {
     case content
+    case reasoningContent = "reasoning_content"
     case toolCalls = "tool_calls"
     case functionCall = "function_call"
     case role
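The new `reasoning_content` mapping can be exercised in isolation. The sketch below uses `DeltaStub`, a hypothetical stand-in for the library's chunk-delta type that only mirrors the `CodingKeys` mapping added above; it shows that the field decodes normally from a DeepSeek-style payload and falls back to `nil` when a provider (such as OpenAI itself) omits the key:

```swift
import Foundation

// Hypothetical stand-in mirroring the CodingKeys mapping added in this PR;
// not the library's actual ChatCompletionChunkObject type.
struct DeltaStub: Decodable {
    let content: String?
    let reasoningContent: String?

    enum CodingKeys: String, CodingKey {
        case content
        case reasoningContent = "reasoning_content"  // snake_case wire key
    }
}

let decoder = JSONDecoder()

// DeepSeek-style chunk delta: reasoning_content present.
let deepSeekJSON = #"{"content": "", "reasoning_content": "Let me think..."}"#.data(using: .utf8)!
let deepSeekDelta = try decoder.decode(DeltaStub.self, from: deepSeekJSON)
print(deepSeekDelta.reasoningContent ?? "nil")  // "Let me think..."

// OpenAI-style chunk delta: key absent, so the optional decodes to nil.
let openAIJSON = #"{"content": "Hello"}"#.data(using: .utf8)!
let openAIDelta = try decoder.decode(DeltaStub.self, from: openAIJSON)
print(openAIDelta.reasoningContent ?? "nil")  // "nil"
```

Declaring the property as `String?` is what makes the same model work against both providers without a custom decoding strategy.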

Sources/OpenAI/Public/ResponseModels/Chat/ChatCompletionObject.swift

+3
@@ -50,6 +50,8 @@ public struct ChatCompletionObject: Decodable {
   public let functionCall: FunctionCall?
   /// The role of the author of this message.
   public let role: String
+  /// The reasoning content generated by the model, if available.
+  public let reasoningContent: String?
   /// Provided by the Vision API.
   public let finishDetails: FinishDetails?
   /// The refusal message generated by the model.

@@ -86,6 +88,7 @@
     case functionCall = "function_call"
     case role
     case finishDetails = "finish_details"
+    case reasoningContent = "reasoning_content"
     case refusal
     case audio
   }
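The README's streaming example accumulates `delta.reasoningContent` chunk by chunk alongside the answer text. That accumulation can be sketched independently of the network layer; `ChunkStub` below is a hypothetical stand-in, not the library's `ChatCompletionChunkObject`, and the simulated chunks assume DeepSeek's behavior of emitting reasoning before the final answer:

```swift
import Foundation

// Hypothetical stand-in for one streamed chunk's delta.
struct ChunkStub {
    let reasoningContent: String?
    let content: String?
}

var message = ""
var reasoningMessage = ""

// Simulated stream: reasoning chunks arrive before the answer chunk.
let chunks: [ChunkStub] = [
    ChunkStub(reasoningContent: "Recall the Manhattan Project ", content: nil),
    ChunkStub(reasoningContent: "was a WWII program. ", content: nil),
    ChunkStub(reasoningContent: nil, content: "The Manhattan Project was a U.S. research effort."),
]

// Same accumulation pattern as the README's for-await loop.
for chunk in chunks {
    message += chunk.content ?? ""
    if let reasoning = chunk.reasoningContent {
        reasoningMessage += reasoning
    }
}

print(reasoningMessage)  // "Recall the Manhattan Project was a WWII program. "
print(message)           // "The Manhattan Project was a U.S. research effort."
```

Keeping the two buffers separate lets a UI render the chain of thought and the final answer in distinct views.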

0 commit comments