
Change OpenAI default mode to streaming #7

@HavenDV

Description


Anti — 01/22/2024 1:05 PM
@HavenDV Is it possible to add a TokenGenerated event to the OpenAI provider? Maybe we should make those events part of the common interface?
HavenDV — 01/23/2024 2:48 AM
It's possible, but it will only work in streaming mode - https://platform.openai.com/docs/api-reference/chat/create#chat-create-stream
And I'm not sure if we should use this as the default
Anti — 01/23/2024 11:45 AM
Streaming mode is no slower than the regular one, and it looks better when you see the output of the LLM right away instead of waiting 10 seconds for a response.
If you want, you can actually check whether there are any subscribers to the event and pick the mode based on that.
Oh, and also a PromptSent event. That also helps quite a lot with debugging.
HavenDV — 01/23/2024 10:50 PM
Yes, I think you are right, and we should make this the default behavior.
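The behavior agreed on above (raise a PromptSent event, and pick streaming mode only when someone is actually subscribed to TokenGenerated) can be sketched as follows. This is a minimal illustration in Python, not the library's actual API: the class name `OpenAiProvider`, the event-list attributes, and the `_stream_completion`/`_blocking_completion` helpers (stand-ins for real API calls with and without `stream=True`) are all hypothetical.

```python
from typing import Callable, Iterable, List


class OpenAiProvider:
    """Sketch of subscriber-aware mode selection (hypothetical names)."""

    def __init__(self) -> None:
        # Subscribers are plain callables appended to these lists.
        self.on_prompt_sent: List[Callable[[str], None]] = []
        self.on_token_generated: List[Callable[[str], None]] = []

    def _stream_completion(self, prompt: str) -> Iterable[str]:
        # Stand-in for a real streaming API call (stream=True),
        # which would yield tokens as they arrive over the wire.
        for token in ["Hello", ", ", "world", "!"]:
            yield token

    def _blocking_completion(self, prompt: str) -> str:
        # Stand-in for a regular, non-streaming API call.
        return "Hello, world!"

    def complete(self, prompt: str) -> str:
        # PromptSent fires in both modes, which helps with debugging.
        for handler in self.on_prompt_sent:
            handler(prompt)
        # Pick the mode based on whether anyone subscribed to token events.
        if self.on_token_generated:
            parts = []
            for token in self._stream_completion(prompt):
                for handler in self.on_token_generated:
                    handler(token)
                parts.append(token)
            return "".join(parts)
        return self._blocking_completion(prompt)
```

With no subscribers, `complete` makes a single blocking call; once a handler is attached to `on_token_generated`, the same call streams and reports each token as it arrives, so callers that don't care about events see no change in behavior.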

Metadata


Labels

enhancement (New feature or request)


Projects

Status

Todo
