Labels: .NET (Issue or Pull requests regarding .NET code), bug (Something isn't working)
### Describe the bug
After calling `Kernel.AddOpenAIChatClient`, `InvokePromptAsync` throws an exception. This is fixed by providing an `HttpClient` as a parameter to `AddOpenAIChatClient`.
### To Reproduce

```csharp
private const string ApiKey = "secret";
private const string Endpoint =
    "https://mistral-small-3-2-24b-instruct-2506.endpoints.kepler.ai.cloud.ovh.net/api/openai_compat/v1";
private const string Model = "Mistral-Small-3.2-24B-Instruct-2506";

[Fact]
public async Task ProducesException()
{
    var kernel = Kernel.CreateBuilder()
        .AddOpenAIChatClient(
            modelId: Model,
            apiKey: ApiKey,
            endpoint: new Uri(Endpoint)
        )
        .Build();

    KernelArguments arguments = new() { { "topic", "sea" } };
    Console.WriteLine(await kernel.InvokePromptAsync("What color is the {{$topic}}?", arguments));
}

[Fact]
public async Task Works()
{
    var kernel = Kernel.CreateBuilder()
        .AddOpenAIChatClient(
            httpClient: new HttpClient
            {
                BaseAddress = new Uri(Endpoint)
            },
            modelId: Model,
            apiKey: ApiKey
        )
        .Build();

    KernelArguments arguments = new() { { "topic", "sea" } };
    Console.WriteLine((await kernel.InvokePromptAsync("What color is the {{$topic}}?", arguments)).ToString());
}

// Manually creating the client without Kernel also works.
[Fact]
public async Task CreateChatClient()
{
    var client = new OpenAIClient(new ApiKeyCredential(ApiKey), new OpenAIClientOptions
    {
        Endpoint = new Uri(Endpoint),
    });
    var chatClient = client.GetChatClient(Model);
    var completeChat = chatClient.CompleteChat([new UserChatMessage("hello")]);
    Console.WriteLine(completeChat.Value.Content.First().Text);
}
```
### Full stacktrace

```
System.ClientModel.ClientResultException: The SSL connection could not be established, see inner exception.
   at System.ClientModel.Primitives.HttpClientPipelineTransport.ProcessSyncOrAsync(PipelineMessage message, Boolean async)
   at System.ClientModel.Primitives.HttpClientPipelineTransport.ProcessCoreAsync(PipelineMessage message)
   at System.ClientModel.Primitives.PipelineTransport.ProcessSyncOrAsync(PipelineMessage message, Boolean async)
   at System.ClientModel.Primitives.PipelineTransport.ProcessAsync(PipelineMessage message)
   at System.ClientModel.Primitives.PipelineTransport.ProcessAsync(PipelineMessage message, IReadOnlyList`1 pipeline, Int32 currentIndex)
   at System.ClientModel.Primitives.PipelinePolicy.ProcessNextAsync(PipelineMessage message, IReadOnlyList`1 pipeline, Int32 currentIndex)
   at System.ClientModel.Primitives.MessageLoggingPolicy.ProcessSyncOrAsync(PipelineMessage message, IReadOnlyList`1 pipeline, Int32 currentIndex, Boolean async)
   at System.ClientModel.Primitives.MessageLoggingPolicy.ProcessAsync(PipelineMessage message, IReadOnlyList`1 pipeline, Int32 currentIndex)
   at System.ClientModel.Primitives.PipelinePolicy.ProcessNextAsync(PipelineMessage message, IReadOnlyList`1 pipeline, Int32 currentIndex)
   at System.ClientModel.Primitives.ApiKeyAuthenticationPolicy.ProcessAsync(PipelineMessage message, IReadOnlyList`1 pipeline, Int32 currentIndex)
   at System.ClientModel.Primitives.PipelinePolicy.ProcessNextAsync(PipelineMessage message, IReadOnlyList`1 pipeline, Int32 currentIndex)
   at System.ClientModel.Primitives.ClientRetryPolicy.ProcessSyncOrAsync(PipelineMessage message, IReadOnlyList`1 pipeline, Int32 currentIndex, Boolean async)
   at System.ClientModel.Primitives.ClientRetryPolicy.ProcessSyncOrAsync(PipelineMessage message, IReadOnlyList`1 pipeline, Int32 currentIndex, Boolean async)
   at System.ClientModel.Primitives.ClientRetryPolicy.ProcessAsync(PipelineMessage message, IReadOnlyList`1 pipeline, Int32 currentIndex)
   at GenericActionPipelinePolicy.ProcessAsync(PipelineMessage message, IReadOnlyList`1 pipeline, Int32 currentIndex)
   at OpenAI.GenericActionPipelinePolicy.ProcessAsync(PipelineMessage message, IReadOnlyList`1 pipeline, Int32 currentIndex)
   at System.ClientModel.Primitives.ClientPipeline.SendAsync(PipelineMessage message)
   at OpenAI.ClientPipelineExtensions.ProcessMessageAsync(ClientPipeline pipeline, PipelineMessage message, RequestOptions options)
   at OpenAI.Chat.ChatClient.CompleteChatAsync(BinaryContent content, RequestOptions options)
   at OpenAI.Chat.ChatClient.CompleteChatAsync(IEnumerable`1 messages, ChatCompletionOptions options, CancellationToken cancellationToken)
   at Microsoft.Extensions.AI.OpenAIChatClient.GetResponseAsync(IEnumerable`1 messages, ChatOptions options, CancellationToken cancellationToken)
   at Microsoft.Extensions.AI.OpenTelemetryChatClient.GetResponseAsync(IEnumerable`1 messages, ChatOptions options, CancellationToken cancellationToken)
   at Microsoft.Extensions.AI.FunctionInvokingChatClient.GetResponseAsync(IEnumerable`1 messages, ChatOptions options, CancellationToken cancellationToken)
   at Microsoft.SemanticKernel.KernelFunctionFromPrompt.GetChatClientResultAsync(IChatClient chatClient, Kernel kernel, PromptRenderingResult promptRenderingResult, CancellationToken cancellationToken)
   at Microsoft.SemanticKernel.KernelFunctionFromPrompt.InvokeCoreAsync(Kernel kernel, KernelArguments arguments, CancellationToken cancellationToken)
   at Microsoft.SemanticKernel.KernelFunction.<>c__DisplayClass32_0.<<InvokeAsync>b__0>d.MoveNext()
--- End of stack trace from previous location ---
   at Microsoft.SemanticKernel.Kernel.InvokeFilterOrFunctionAsync(NonNullCollection`1 functionFilters, Func`2 functionCallback, FunctionInvocationContext context, Int32 index)
   at Microsoft.SemanticKernel.Kernel.OnFunctionInvocationAsync(KernelFunction function, KernelArguments arguments, FunctionResult functionResult, Boolean isStreaming, Func`2 functionCallback, CancellationToken cancellationToken)
   at Microsoft.SemanticKernel.KernelFunction.InvokeAsync(Kernel kernel, KernelArguments arguments, CancellationToken cancellationToken)
   at SemanticKernelRag.Tests.TestiTest.ProducesException()
```
### Expected behavior
`InvokePromptAsync` should work without an `HttpClient` being provided explicitly. It appears the `HttpClient` that is created internally during configuration is misconfigured, since the inner exception reports that the SSL connection could not be established.
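To help narrow down which host the internally created client actually contacts, the outgoing request URI can be logged by instrumenting the `HttpClient` passed to `AddOpenAIChatClient`. This is a diagnostic sketch, not part of the repro; `LoggingHandler` is a hypothetical helper, not a Semantic Kernel type:

```csharp
// Diagnostic sketch: logs every outgoing request URI so the contacted host
// can be compared against the custom endpoint. LoggingHandler is a
// hypothetical helper written for this issue.
public sealed class LoggingHandler : DelegatingHandler
{
    public LoggingHandler() : base(new HttpClientHandler()) { }

    protected override Task<HttpResponseMessage> SendAsync(
        HttpRequestMessage request, CancellationToken cancellationToken)
    {
        Console.WriteLine($"Request: {request.Method} {request.RequestUri}");
        return base.SendAsync(request, cancellationToken);
    }
}

// Usage: pass the instrumented client in place of the plain HttpClient
// in the Works test and inspect the logged URIs.
var httpClient = new HttpClient(new LoggingHandler())
{
    BaseAddress = new Uri(Endpoint)
};
```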
### Platform
- Language: C#
- Versions:

```xml
<PackageReference Include="Microsoft.SemanticKernel.Agents.OpenAI" Version="1.65.0-preview" />
<PackageReference Include="Microsoft.SemanticKernel.Connectors.OpenAI" Version="1.65.0" />
```