The current retryable consumer implementation supports consuming records from Kafka with retry and error-handling capabilities.
However, there is no built-in component for producing records as a result of consuming other records, a common need when implementing pipeline or transformation patterns.
To address this, I propose adding a `RetryableConsumerProducer` class that extends `AbstractRetryableConsumer`, allowing users to:

- Consume records from one or more topics,
- Process them via a user-defined `RecordProcessorList` functional interface,
- Produce the resulting records to one or more output topics,
- while still benefiting from the existing retry/error-handling and DLQ mechanisms.
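To make the processing step concrete, here is a minimal sketch of what the `RecordProcessorList` contract could look like. The interface name comes from this proposal, but its exact signature and generics are assumptions for illustration; the sketch deliberately omits Kafka types so it stands alone, with the idea being that one consumed value may fan out into zero or more values to be produced:

```java
import java.util.List;

// Hypothetical shape of the proposed functional interface: it maps one
// consumed value to a list of values destined for the output topic(s).
// The real version would likely operate on ConsumerRecord/ProducerRecord.
@FunctionalInterface
interface RecordProcessorList<V, R> {
    List<R> process(V consumedValue);
}

public class RetryableConsumerProducerSketch {
    public static void main(String[] args) {
        // Example processor: split a CSV line into individual output records.
        RecordProcessorList<String, String> splitter =
                line -> List.of(line.split(","));

        // In the proposed component, each element of this list would be
        // produced to the configured output topic(s), and a failure here
        // would go through the existing retry/DLQ path.
        List<String> out = splitter.process("a,b,c");
        System.out.println(out); // prints [a, b, c]
    }
}
```

Returning a list (rather than a single record) lets a processor filter (empty list), map (singleton), or fan out (multiple records) without needing separate interfaces.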