Description
Maybe there should be a way to 'batch' the execution of actions. This would allow modules to break down complex actions into multiple actions, while still executing them in the same message to the device.
This would be performed by Companion sending a new `executeActionBatch` message, which would be similar to `executeAction`, but would take an array of actions instead.
For simplicity, we should define that a batch is assumed to be a series of actions on one button with the same start time of execution (i.e. in a concurrent group, with no waits between them).
If different buttons/triggers are executing actions at the same time, each source will be a different batch.
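As a rough sketch, the new message payload might look something like the following. The field names here are illustrative assumptions for this proposal, not an existing Companion IPC type:

```typescript
// Hypothetical shape of the proposed executeActionBatch message.
// Field names are assumptions, not the real Companion types.
interface ExecuteActionBatchMessage {
	controlId: string // one batch = one source button/trigger
	actions: Array<{
		actionId: string
		options: Record<string, unknown>
	}>
}

// Two concurrent actions from the same button arrive as a single batch
const msg: ExecuteActionBatchMessage = {
	controlId: 'bank:1:2',
	actions: [
		{ actionId: 'transitionSelection', options: { component: 'bkgd', state: true } },
		{ actionId: 'transitionSelection', options: { component: 'key1', state: true } },
	],
}
```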
In the future we could explore handling sequential groups, but that is more complex and wouldn't benefit from the same execution justification.
I would propose, for the API, some new optional methods for modules to implement on the module class:

```typescript
// TBatchContext will be a new generic argument to the class
beginActionBatch?: () => TBatchContext

// If executeActionBatch is not defined but beginActionBatch is, that should throw an error
executeActionBatch?: (batch: TBatchContext) => Promise<void>
```
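A minimal sketch of a module implementing this pair, assuming a hypothetical batch context that accumulates deferred operations. The class and context shape here are invented for illustration; a real module would extend the Companion instance base class and talk to a device:

```typescript
// Hypothetical batch context: collects operations deferred by action callbacks
interface MyBatchContext {
	pendingOps: Array<() => void>
}

class MyModule {
	public sent: string[] = [] // stands in for commands sent to the device

	// Called by Companion when a concurrent group of actions starts
	beginActionBatch(): MyBatchContext {
		return { pendingOps: [] }
	}

	// Called once every action callback in the batch has run;
	// flush everything as one operation against the device
	async executeActionBatch(batch: MyBatchContext): Promise<void> {
		for (const op of batch.pendingOps) op()
		batch.pendingOps.length = 0
	}
}

// Usage: action callbacks push into pendingOps, then the batch is flushed once
const mod = new MyModule()
const batch = mod.beginActionBatch()
batch.pendingOps.push(() => mod.sent.push('combined-command'))
mod.executeActionBatch(batch)
```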
The action definition would then be updated to:

```typescript
callback: (action: CompanionActionEvent, context: CompanionActionContext, batch: TBatchContext | undefined) => Promise<void> | void
```

Or should this instead be a new property on `CompanionActionContext`?
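Either way, the callback could use the batch to defer its side effect instead of sending immediately. A toy sketch with stand-in types (the real `CompanionActionEvent` and `CompanionActionContext` carry more fields than shown here):

```typescript
// Trimmed stand-ins for Companion's real types, for illustration only
interface ActionEvent {
	actionId: string
}
interface BatchContext {
	deferred: string[]
}

// With a batch, record the work for the batcher to flush later;
// without one, perform it immediately, as actions do today
function runCallback(action: ActionEvent, batch: BatchContext | undefined, sendNow: (cmd: string) => void): void {
	if (batch) {
		batch.deferred.push(action.actionId)
	} else {
		sendNow(action.actionId)
	}
}

// Demo: one call inside a batch, one outside
const sentNow: string[] = []
const demoBatch: BatchContext = { deferred: [] }
runCallback({ actionId: 'toggle-key1' }, demoBatch, (c) => sentNow.push(c))
runCallback({ actionId: 'toggle-key2' }, undefined, (c) => sentNow.push(c))
```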
Use case
In bmd-atem, there would be a few benefits to this:
- The `meTransitionSelection` is a bitmask, so toggling one value in it means sending the whole value to the ATEM. This means that if the user toggles multiple components using separate actions at the same time, each value sent replaces the previous. Today we ensure predictable behaviour with our own debounce on the sending, which queues and delays the sending of values. This can result in timing/order issues that the user doesn't expect.
- Some actions are written with 20+ properties because that translates to a single command in the protocol. This ensures we don't send many large commands immediately following each other, and also ensures that all the changes to one feature apply as a single operation, rather than staggered. With batching, this could be done over multiple actions, with the batcher combining the operations back together.
- In the protocol it is possible to send multiple commands in one packet; doing this ensures that they will be executed in the same frame. Batching would allow us to utilise this. This was actually a problem I saw at work in other software talking to an ATEM, where we flooded the ATEM with messages, resulting in a lot of retransmits and network-usage spikes because of similar one-command-per-packet behaviour.
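To make the bitmask point concrete, here is a sketch of how a batcher could fold several toggles into one value and send it once. The bit constants are illustrative, not the real ATEM protocol values:

```typescript
// Illustrative bit values for transition-selection components
const BKGD = 1 << 0
const KEY1 = 1 << 1
const KEY2 = 1 << 2

// Without batching, each toggle sends the full mask, so concurrent toggles
// race and the last write wins. With batching, all toggles in the batch are
// folded into one mask that is sent as a single command.
function combineToggles(current: number, toggles: Array<{ bit: number; on: boolean }>): number {
	return toggles.reduce((mask, t) => (t.on ? mask | t.bit : mask & ~t.bit), current)
}

// Two toggles from one batch, applied on top of a mask that had BKGD set
const next = combineToggles(BKGD, [
	{ bit: KEY1, on: true },
	{ bit: KEY2, on: true },
])
// next === 7 (BKGD | KEY1 | KEY2)
```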