feat: mcp support, openai update, refactor #486
base: main
Conversation
Co-authored-by: Dex <[email protected]>
Signed-off-by: Carlos Alexandro Becker <[email protected]>
Signed-off-by: Carlos Alexandro Becker <[email protected]>
- migrating to new lib - adding mcp support todo: - fix azure - test copilot
Pull Request Overview
This PR aims to support multiple MCP providers while refactoring and standardizing client interactions for OpenAI, Anthropic, Cohere, Ollama, and Google streams. The key changes include:
- Replacing the legacy openai package with the new "github.com/openai/openai-go" package and updating client configuration types.
- Refactoring stream handling methods and message conversion logic across multiple modules.
- Adding support and CLI flags for MCP functionality (listing MCP servers and tools).
Reviewed Changes
Copilot reviewed 17 out of 18 changed files in this pull request and generated 1 comment.
Show a summary per file
File | Description |
---|---|
stream.go | Updated client configuration and message role remapping for OpenAI streams. |
openai.go | Introduced new OpenAI client config and stream logic with minor error formatting issues. |
ollama.go | Adjusted stream response types and error handling to match new API types. |
mods.go | Updated config usage, error handling, and stream command logic. |
mcp.go | Added MCP support functions and tool-calling logic. |
main.go | Introduced new CLI flags for MCP features and updated flag registration. |
google.go, format.go, config.go, cohere.go, cache.go, anthropic.go | Updated type conversions, error handling, and message encoding/decoding. |
Files not reviewed (1)
- go.mod: Language not supported
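The stream.go changes above remap message roles for the new API types. As a purely illustrative, stdlib-only sketch (the function and role names here are hypothetical, not the PR's actual code), such a remapping might look like:

```go
package main

import "fmt"

// remapRole is a hypothetical sketch of mapping internal message roles to
// the role strings a chat-completions API expects; the PR's real mapping
// may use different names and constants.
func remapRole(role string) string {
	switch role {
	case "bot":
		return "assistant"
	case "human":
		return "user"
	default:
		return role // "system", "tool", etc. pass through unchanged
	}
}

func main() {
	fmt.Println(remapRole("bot"), remapRole("human"), remapRole("system"))
	// → assistant user system
}
```

Keeping the remap in one small pure function makes it trivial to unit-test as provider APIs change.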
Pull Request Overview
This PR adds MCP provider support while migrating from the legacy go-openai library to the official openai-go library. Key changes include updating API client initialization, refactoring message and stream handling across providers (OpenAI, Anthropic, Cohere, Ollama), and incorporating MCP tools integration.
Reviewed Changes
Copilot reviewed 17 out of 18 changed files in this pull request and generated 1 comment.
Show a summary per file
File | Description |
---|---|
stream.go | Refactored OpenAI stream client and message role conversions for updated API types. |
openai.go | Replaced legacy go-openai client with the official openai-go client and updated initialization logic. |
ollama.go | Adjusted stream response types and numerical type conversions for Ollama API. |
mods.go | Modified client configuration usage and float handling, and updated role constants for clarity. |
messages.go | Updated message role comparisons to use string literals consistent with the new API. |
mcp.go | Introduced MCP support functions for tool listing and tool calls across enabled MCP servers. |
google.go | Updated stream response conversion and error handling, though some error messages reference incorrect function names. |
format.go | Added utility functions for converting between internal modsMessage and openai.ChatCompletionMessage. |
config.go | Converted numerical configuration fields to int64/float64 and extended configuration with MCP settings. |
cohere.go | Updated Cohere client error conversion and stream handling using new OpenAI types. |
cache.go and cache_test.go | Updated caching logic to use modsMessage for storing and reloading conversation history. |
anthropic.go | Refactored stream handling and tool call logic for Anthropic API support. |
main.go | Updated flag definitions and application logic to handle new types and MCP-related flags. |
Files not reviewed (1)
- go.mod: Language not supported
Comments suppressed due to low confidence (1)
google.go:195
- Error messages in google.go still reference 'ollamaStreamReader.processLines' instead of 'googleStreamReader.processLines'. Consider updating the error message to reflect the correct function name for clarity.
```go
return *new(openai.ChatCompletionChunk), fmt.Errorf("ollamaStreamReader.processLines: %w", unmarshalErr)
```
```go
AuthToken: accessToken.Token,
BaseURL:   api.BaseURL,
}
// TODO: port this
```
[nitpick] The TODO comment regarding ghCopilotHTTPClient is ambiguous; consider clarifying or removing it if no longer needed to improve code clarity.
```suggestion
// TODO: Integrate ghCopilotHTTPClient with OpenAIClientConfig.
// The following lines are commented out because the integration with
// ghCopilotHTTPClient is not yet complete. To finalize this, ensure that
// ghCopilotHTTPClient is properly configured and compatible with the
// OpenAIClientConfig structure.
```
Co-authored-by: Andrey Nering <[email protected]>
This ended up being a way bigger change than I thought it would be, and I think we should probably refactor further.
So far, this is what I've done:
Since we support a myriad of providers, for now I think we can add MCP to only Anthropic and OpenAI to get a feel for how it'll hold up.
That said, some stuff we probably need to do at some point:
On that note, I think we could have an API like so:
Callers could then:
Caching should probably store the `messages` array as JSON as well. IMHO this would work better, but would break too many things as well.
That said, I still did refactor quite a bit in there, like using the official openai library and more.
It looks like this: (screenshot)
closes #460
closes #397