
RunInference: investigate adding optional batching flag #21863

Status: Open
Opened by @yeandy

Description


What would you like to happen?

Look into adding a flag that users can specify to turn off the BatchElements step in RunInference.

Issue Priority

Priority: 2

Issue Component

Component: sdk-py-core

Subtask of: #22117

Activity

Title changed from "[Feature Request]: RunInference: investigate adding optional batching flag" to "RunInference: investigate adding optional batching flag" on Jun 14, 2022
TheNeuralBit (Member) commented on Aug 12, 2022

Note that the ability to elide batching/unbatching is the primary value provided by https://s.apache.org/batched-dofns right now.

damccorm (Contributor) commented on Sep 2, 2022

@yeandy do we actually need this, or is setting the max batch size to 1 enough? Is the idea here just to offer a convenience flag?
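A minimal sketch (plain Python, not the Beam API) of the semantics behind this question: BatchElements greedily groups incoming elements into batches of up to a maximum size, so a max batch size of 1 degenerates to one element per batch, which is behaviorally the same as disabling batching. The `batch_elements` helper below is hypothetical, written only to illustrate that equivalence.

```python
# Hypothetical greedy batcher, loosely modeling what beam.BatchElements
# does with its max_batch_size parameter. Not Beam code.
def batch_elements(elements, max_batch_size):
    """Yield lists of up to max_batch_size consecutive elements."""
    batch = []
    for element in elements:
        batch.append(element)
        if len(batch) >= max_batch_size:
            yield batch
            batch = []
    if batch:  # flush any trailing partial batch
        yield batch

# Normal batching groups elements:
assert list(batch_elements([1, 2, 3, 4, 5], max_batch_size=2)) == [[1, 2], [3, 4], [5]]

# max_batch_size=1 yields singleton batches -- effectively no batching:
assert list(batch_elements([1, 2, 3], max_batch_size=1)) == [[1], [2], [3]]
```

This is the crux of the trade-off in the thread: if singleton batches already give the "no batching" behavior, a dedicated flag would be a convenience rather than new functionality.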

(7 additional activity items not shown)


Metadata

Assignees: no one assigned
Type: no type
Projects: no projects
Milestone: no milestone
Relationships: none yet
Development: no branches or pull requests
Participants: @TheNeuralBit, @tvalentyn, @yeandy, @damccorm
