No thinking/reason content provided from DeepSeek-R1 (Ollama) #1460

@Cheezy096

Description

I'm unable to get any thinking content back from DeepSeek-R1, and I'm wondering whether it's something I'm doing wrong on my client side.

How I am setting up:

llm, err = ollama.New(
	ollama.WithServerURL(config.Maid.Ollama.URL),
	ollama.WithModel(config.Maid.Ollama.Model), // "deepseek-r1"
)

complete, err := llm.GenerateContent(llmContextMan.Ctx, msgContent,
	llms.WithTemperature(temperature), // 0.7
	llms.WithReturnThinking(config.Maid.Ollama.ThinkingInResponse), // false (though ReturnThinking as true yields nothing)
	llms.WithInterleaveThinking(config.Maid.Ollama.InterleaveThinking), // false
	llms.WithThinkingMode(llms.ThinkingMode(config.Maid.Ollama.ThinkingMode)), // medium
	llms.WithStreamThinking(config.Maid.Ollama.StreamThinking), // true
	llms.WithStreamingReasoningFunc(func(ctx context.Context, reasoningChunk, chunk []byte) error {
		// Note: reasoningChunk (not chunk) carries the thinking tokens here.
		log.Debug(fmt.Sprintf("[AI-ReasonStream]: %s", string(reasoningChunk)))
		return nil
	}),
)

I've logged both .Choices ([AI-Stuff]) and .Choices[0].GenerationInfo ([AI-Stuff2]) using the prompt "what is 2+2?":
[screenshot of logged output]

ThinkingTokens is 0 and ThinkingContent is empty, so honestly I'm at a loss as to what is going on.
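For what it's worth, DeepSeek-R1 typically emits its reasoning inline as `<think>…</think>` tags in the raw completion text. If the library isn't separating them for you, a workaround while debugging is to check whether the tags appear in `.Choices[0].Content` and split them out yourself. A minimal sketch (the `splitThinking` helper is my own, not a langchaingo API):

```go
package main

import (
	"fmt"
	"strings"
)

// splitThinking separates DeepSeek-R1-style inline reasoning
// (wrapped in <think>...</think> tags) from the final answer.
// If no tag pair is found, it returns ("", content) unchanged.
func splitThinking(content string) (thinking, answer string) {
	const openTag, closeTag = "<think>", "</think>"
	start := strings.Index(content, openTag)
	end := strings.Index(content, closeTag)
	if start == -1 || end == -1 || end < start {
		return "", content
	}
	thinking = strings.TrimSpace(content[start+len(openTag) : end])
	answer = strings.TrimSpace(content[end+len(closeTag):])
	return thinking, answer
}

func main() {
	// Example raw output as a thinking model might return it.
	raw := "<think>2+2 is simple addition.</think>\n2 + 2 = 4."
	thinking, answer := splitThinking(raw)
	fmt.Printf("thinking: %q\n", thinking)
	fmt.Printf("answer: %q\n", answer)
}
```

If the tags are present in the raw content, the model is producing reasoning and the problem is on the parsing side; if they're absent, the thinking options may not be reaching Ollama at all.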
