Unable to get message content from chat functionality #303
Unanswered
achenbarrett asked this question in Q&A
Replies: 0 comments
Hi All,
I'm trying to use a custom endpoint with the extension, where my custom endpoint takes the chat messages from the extension, forwards them to an LLM service provider, and then sends the response from the LLM provider back to the extension. To test for now, my custom endpoint is a simple Flask app running on localhost. The Flask app routes the `/chat/completions` POST request, authenticates using the API key in the request header, sends the body of the request to an Azure OpenAI endpoint as the messages payload, and then returns the Azure OpenAI endpoint's ChatCompletions object as a JSON response.

I receive the request payload from the extension and am able to successfully parse the body of the request to send to Azure OpenAI. The Azure OpenAI service successfully sends me back a ChatCompletions object, and that is successfully returned to the extension as JSON; however, the message content does not appear in the chat window. I can see in the extension output window that the correct number of prompt tokens, completion tokens, total tokens, and session tokens are getting parsed out of the response I am returning to the extension. As far as I can tell, the extension is simply using the TypeScript SDK completions API, which expects a JSON response from which it builds the ChatCompletions object; the ChatCompletions object is then used to get the message content and token counts, which are rendered in the extension's chat window. Am I missing something here? Is there a specific format I need to use when sending responses back to the extension so they are properly parsed and displayed?

Here's the Flask app that I'm using as the custom endpoint for reference. I stripped out the authentication stuff since it's not relevant to the problem I'm facing here.
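A minimal sketch of the proxy (the environment variable names, deployment name, and port below are placeholders for my real configuration; it uses the `openai` package's `AzureOpenAI` client):

```python
import os

from flask import Flask, jsonify, request
from openai import AzureOpenAI  # openai >= 1.x

app = Flask(__name__)

# Placeholder configuration -- real values come from my environment.
client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",
)
DEPLOYMENT_NAME = os.environ.get("AZURE_OPENAI_DEPLOYMENT", "gpt-4o")


@app.route("/chat/completions", methods=["POST"])
def chat_completions():
    # (API-key authentication against the request header stripped out.)
    body = request.get_json()

    # Forward the extension's messages to the Azure OpenAI deployment.
    completion = client.chat.completions.create(
        model=DEPLOYMENT_NAME,
        messages=body["messages"],
    )

    # Return the ChatCompletion object to the extension as JSON.
    return jsonify(completion.model_dump())


if __name__ == "__main__":
    app.run(host="127.0.0.1", port=5000)
```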
And here is an example of the OpenAI ChatCompletions response that I am sending back to the extension as JSON:
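The response follows the standard OpenAI chat completion shape; the id, timestamp, model name, content, and token counts below are illustrative values:

```json
{
  "id": "chatcmpl-abc123",
  "object": "chat.completion",
  "created": 1715000000,
  "model": "gpt-4o",
  "choices": [
    {
      "index": 0,
      "message": {
        "role": "assistant",
        "content": "Hello! How can I help you today?"
      },
      "finish_reason": "stop"
    }
  ],
  "usage": {
    "prompt_tokens": 25,
    "completion_tokens": 9,
    "total_tokens": 34
  }
}
```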