feat: mcp support, openai update, refactor #486

Open · wants to merge 21 commits into main
Conversation

Member

@caarlos0 caarlos0 commented Apr 22, 2025

This ended up being a way bigger change than I thought it would, and I think we should probably refactor further.

So far, this is what I've done:

Since we support a myriad of providers, I think for now we can add MCP to only anthropic and openai, to get a feel for how it'll hold up.

That said, some stuff we probably need to do at some point:

  • the current stream API is based on an outdated openai Go library; we should probably replace all of that. I think a better approach would also be to use channels
  • we also force everything into openai's types, which seems like wasted effort most of the time
  • our streaming API doesn't differentiate between roles: it assumes all responses come from the assistant role, which isn't always true, as a response can also be the result of a tool call
  • the bubbletea lifecycle is too tightly coupled to everything else, IMO

On that note, I think we could have an API like so:

```go
// T is the type of the message for each implementation
type Streamer[T any] interface {
	Start(context.Context) <-chan Chunk
	Close() ([]T, error)
}
```

Callers could then:

```go
for {
	select {
	case chunk := <-chunks:
		appendToOutput(chunk.Content)
	case <-ctx.Done():
		messages, err := stream.Close()
		// handle messages and err, then stop reading
		return
	}
}
```

Caching should probably store the messages array as JSON as well.

IMHO this would work better, but it would also break too many things.

That said, I still did refactor quite a bit in there, like using the official openai library and more.

It looks like this:

[screenshot: CleanShot 2025-04-25 at 16 40 03@2x]

closes #460
closes #397

@caarlos0 caarlos0 self-assigned this Apr 22, 2025
@caarlos0 caarlos0 requested review from Copilot, meowgorithm, aymanbagabas and andreynering and removed request for Copilot April 25, 2025 19:40
@caarlos0 caarlos0 added the enhancement New feature or request label Apr 25, 2025

@Copilot Copilot AI left a comment


Pull Request Overview

This PR aims to support multiple MCP providers while refactoring and standardizing client interactions for OpenAI, Anthropic, Cohere, Ollama, and Google streams. The key changes include:

  • Replacing the legacy openai package with the new "github.com/openai/openai-go" package and updating client configuration types.
  • Refactoring stream handling methods and message conversion logic across multiple modules.
  • Adding support and CLI flags for MCP functionality (listing MCP servers and tools).

Reviewed Changes

Copilot reviewed 17 out of 18 changed files in this pull request and generated 1 comment.

Show a summary per file

  • stream.go: Updated client configuration and message role remapping for OpenAI streams.
  • openai.go: Introduced new OpenAI client config and stream logic, with minor error formatting issues.
  • ollama.go: Adjusted stream response types and error handling to match new API types.
  • mods.go: Updated config usage, error handling, and stream command logic.
  • mcp.go: Added MCP support functions and tool-calling logic.
  • main.go: Introduced new CLI flags for MCP features and updated flag registration.
  • google.go, format.go, config.go, cohere.go, cache.go, anthropic.go: Updated type conversions, error handling, and message encoding/decoding.
Files not reviewed (1)
  • go.mod: Language not supported

@caarlos0 caarlos0 changed the title wip: mcp support and refactoring feat: mcp support, openai update, refactor Apr 25, 2025
@caarlos0 caarlos0 requested a review from Copilot April 25, 2025 19:52
@caarlos0 caarlos0 marked this pull request as ready for review April 25, 2025 19:52

@Copilot Copilot AI left a comment


Pull Request Overview

This PR adds MCP provider support while migrating from the legacy go-openai library to the official openai-go library. Key changes include updating API client initialization, refactoring message and stream handling across providers (OpenAI, Anthropic, Cohere, Ollama), and incorporating MCP tools integration.

Reviewed Changes

Copilot reviewed 17 out of 18 changed files in this pull request and generated 1 comment.

Show a summary per file

  • stream.go: Refactored OpenAI stream client and message role conversions for updated API types.
  • openai.go: Replaced the legacy go-openai client with the official openai-go client and updated initialization logic.
  • ollama.go: Adjusted stream response types and numerical type conversions for the Ollama API.
  • mods.go: Modified client configuration usage and float handling, and updated role constants for clarity.
  • messages.go: Updated message role comparisons to use string literals consistent with the new API.
  • mcp.go: Introduced MCP support functions for tool listing and tool calls across enabled MCP servers.
  • google.go: Updated stream response conversion and error handling, though some error messages reference incorrect function names.
  • format.go: Added utility functions for converting between the internal modsMessage and openai.ChatCompletionMessage.
  • config.go: Converted numerical configuration fields to int64/float64 and extended the configuration with MCP settings.
  • cohere.go: Updated Cohere client error conversion and stream handling using new OpenAI types.
  • cache.go, cache_test.go: Updated caching logic to use modsMessage for storing and reloading conversation history.
  • anthropic.go: Refactored stream handling and tool call logic for Anthropic API support.
  • main.go: Updated flag definitions and application logic to handle new types and MCP-related flags.
Files not reviewed (1)
  • go.mod: Language not supported
Comments suppressed due to low confidence (1)

google.go:195

  • Error messages in google.go still reference 'ollamaStreamReader.processLines' instead of 'googleStreamReader.processLines'. Consider updating the error message to reflect the correct function name for clarity.

```go
return *new(openai.ChatCompletionChunk), fmt.Errorf("ollamaStreamReader.processLines: %w", unmarshalErr)
```

```go
AuthToken: accessToken.Token,
BaseURL:   api.BaseURL,
}
// TODO: port this
```

Copilot AI Apr 25, 2025


[nitpick] The TODO comment regarding ghCopilotHTTPClient is ambiguous; consider clarifying or removing it if no longer needed to improve code clarity.

Suggested change

```diff
-// TODO: port this
+// TODO: Integrate ghCopilotHTTPClient with OpenAIClientConfig.
+// The following lines are commented out because the integration with
+// ghCopilotHTTPClient is not yet complete. To finalize this, ensure that
+// ghCopilotHTTPClient is properly configured and compatible with the
+// OpenAIClientConfig structure.
```


Co-authored-by: Andrey Nering <[email protected]>
Labels
enhancement New feature or request
Projects
None yet
Development

Successfully merging this pull request may close these issues.

Feature Request: Add MCP Support
3 participants