
Feature Request: Streaming Support for Model Invocations #675

Open
@bannawandoor27

Description:

I kindly request streaming support for model invocations in the Modus SDK. This feature is crucial for real-time applications that require incremental, token-by-token responses.

Expected Changes:

  • Introduce a stream: true option in the ChatModelInput configuration.
  • Provide a mechanism to handle tokens as they are received (see the sketch after this list).
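
To make the request more concrete, below is a rough sketch of how a Modus function might opt in to streaming, based on the invoke() pattern from the current AssemblyScript SDK docs. The streaming pieces (the stream field, invokeStream, and the token callback) are hypothetical and shown only to illustrate the requested behavior; the model name "text-generator" is likewise just an example.

```ts
// Sketch only: the stream option, invokeStream, and the token callback are
// the requested additions and do not exist in the Modus SDK today.
import { models } from "@hypermode/modus-sdk-as";
import {
  OpenAIChatModel,
  UserMessage,
} from "@hypermode/modus-sdk-as/models/openai/chat";

export function streamText(prompt: string): string {
  const model = models.getModel<OpenAIChatModel>("text-generator");
  const input = model.createInput([new UserMessage(prompt)]);

  // Proposed: opt in to streaming on the input object.
  // input.stream = true;

  // Proposed: handle tokens as they are received instead of waiting for the
  // full completion (callback name and shape are illustrative only).
  // model.invokeStream(input, (token: string) => {
  //   // forward each token to the client as it arrives
  // });

  // Current behavior: invoke() returns only after the entire response has
  // been generated, which is the latency this request aims to reduce.
  const output = model.invoke(input);
  return output.choices[0].message.content.trim();
}
```

Whether tokens are ultimately surfaced through a callback like this, an iterator, or streamed out of the GraphQL API as server-sent events is of course up to the maintainers; the sketch is only meant to show the shape of the request.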

Benefits:

  • Reduces perceived latency by delivering the first tokens as soon as they are available.
  • Enhances user experience by providing immediate, incremental feedback.

Thank you for considering this request.
