feat: add batch support for LLM.generate #442

Open
@micpst

Description

Feature description

Add support for a list of prompts as input to the LLM.generate method, and adjust the interfaces of generate_streaming and generate_with_metadata accordingly. I'm talking about something like this:

responses = await LLiteLLM.generate([ImageDescriberPrompt(...), ...])

It should also work for strings and other types:

responses = await LLiteLLM.generate(["Tell me the meaning of life", ...])
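A minimal sketch of how the overload could work, assuming a hypothetical `BaseLLM` class with a `_generate_one` single-prompt helper (illustrative names, not the actual library API): when `generate` receives a sequence, fan the prompts out concurrently with `asyncio.gather`, which returns the responses in input order.

```python
import asyncio
from typing import Sequence, Union


class BaseLLM:
    # Hypothetical base class for illustration; not the real library API.
    async def _generate_one(self, prompt: str) -> str:
        # Stand-in for the actual single-prompt provider call.
        return f"response to: {prompt}"

    async def generate(
        self, prompt: Union[str, Sequence[str]]
    ) -> Union[str, list[str]]:
        if isinstance(prompt, (list, tuple)):
            # One task per prompt; gather preserves input order.
            return await asyncio.gather(
                *(self._generate_one(p) for p in prompt)
            )
        return await self._generate_one(prompt)


responses = asyncio.run(
    BaseLLM().generate(["Tell me the meaning of life", "Hi"])
)
```

The same dispatch pattern would carry over to generate_streaming and generate_with_metadata; a real provider implementation could also batch at the request level instead of gathering per-prompt calls.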

Motivation

Currently there is no way to send multiple prompts to the LLM in a single call. Many use cases could take advantage of this feature, such as an LLM-as-judge evaluation pipeline or an LLM reranker, which need to process the same prompt with different input data.

Additional context

No response

Metadata

    Labels

    feature (New feature or request)


    Projects

    Status

    Ready
