Subject of the issue
This is an enhancement request. Support for multi-topic ingestion is currently limited to the use case where all topics' messages share the same dataset id, operation, dataflow id, etc.
Messages should instead be grouped by topic and streamed into AEPPublisher.producer.post updates as topic-specific batches.
Each topic batch could then have topic-specific headers appended, driven by config properties such as:
aep.connection.endpoint.topic-x.headers=...
aep.connection.endpoint.topic-y.headers=...
aep.connection.endpoint.topic-z.headers=...
aep.connection.endpoint.headers=...
The existing headers attribute would supply default/common headers, with the topic-specific headers overriding them or adding new ones.
I could create a pull request for this if the committers are interested/supportive. The issues I'm concerned about are:
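The override semantics described above could be sketched roughly as below. This is only an illustration, not the connector's actual code; the class and property names (`HeaderResolver`, the per-topic map) are hypothetical:

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch: resolves the effective headers for a topic by
// layering topic-specific headers over the common defaults.
public class HeaderResolver {
    // Parsed from aep.connection.endpoint.headers
    private final Map<String, String> defaults;
    // Parsed from aep.connection.endpoint.<topic>.headers, keyed by topic
    private final Map<String, Map<String, String>> perTopic;

    public HeaderResolver(Map<String, String> defaults,
                          Map<String, Map<String, String>> perTopic) {
        this.defaults = defaults;
        this.perTopic = perTopic;
    }

    // Defaults first, then topic-specific entries overwrite/add.
    public Map<String, String> resolve(String topic) {
        Map<String, String> merged = new HashMap<>(defaults);
        merged.putAll(perTopic.getOrDefault(topic, Map.of()));
        return merged;
    }
}
```

A topic with no dedicated property would simply fall back to the common headers.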
- How does this change play out in terms of connector performance? (Each set of sink records delivered from Kafka will now produce zero to many HTTP requests to the AEP endpoint.)
- Can we share the same auth token across topics? (I think we can.)
- How does this impact the configuration of the Kafka micro-batching parameters? (Maybe it doesn't.)
Your environment
All
Steps to reproduce
Set the property topics=a,b,c
where a, b, and c are topics whose messages have different AEP dataset ids, or require different operations or flow ids.
Observe that stitching, topic-update logic, and values will be broken within AEP.
Expected behaviour
NA
Actual behaviour
NA