
Bulk processing of inputs in an LLM - bulk categorization #3394

Open
@adampolak-vertex

Description


Currently, when running a prompt flow, the "input" is defined as a single "item".

For the classification example, this means only one input can be classified at a time.

There should be a feature that accepts many products at once, so that a single prompt can output many categorizations.

It should be possible to import many inputs at once and have every output linked back to its original input, so that accuracy can be traced.

This way, the "cost" of the prompt tokens spent explaining the task can be "amortized" across many inputs.

Just as inputs can be bulk-processed in an eval, the same should be possible in a general flow. A rough sketch of the intended behaviour follows.
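For illustration only, here is a minimal Python sketch of what is being asked for; it is not the prompt flow API, and `call_llm` is a hypothetical placeholder for whatever model call the flow makes. The instruction text is paid for once, many products ride along in the same prompt, and each returned category is joined back to its source record by id.

```python
import json

INSTRUCTIONS = (
    "Classify each product below into one of: Electronics, Apparel, Grocery, Other.\n"
    "Return a JSON array of objects with fields 'id' and 'category', one per product.\n"
)


def build_bulk_prompt(products: list[dict]) -> str:
    """Prepend the (token-costly) instructions once, then list every product."""
    lines = [f"{p['id']}: {p['description']}" for p in products]
    return INSTRUCTIONS + "\n".join(lines)


def classify_bulk(products: list[dict], call_llm) -> list[dict]:
    """Send one prompt for the whole batch and link results back to the inputs."""
    raw = call_llm(build_bulk_prompt(products))
    by_id = {str(r["id"]): r["category"] for r in json.loads(raw)}
    # Each output row keeps the original input so accuracy can be traced.
    return [{**p, "category": by_id.get(str(p["id"]))} for p in products]


if __name__ == "__main__":
    products = [
        {"id": 1, "description": "USB-C charging cable, 2 m"},
        {"id": 2, "description": "Men's cotton t-shirt, navy"},
    ]
    # Stand-in for a real model call so the sketch runs end to end.
    fake_llm = lambda prompt: json.dumps(
        [{"id": 1, "category": "Electronics"}, {"id": 2, "category": "Apparel"}]
    )
    for row in classify_bulk(products, fake_llm):
        print(row)
```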

