Description
When upgrading from Beam 2.29.0 to 2.36.0, our Kafka Read transform broke.
While debugging, I saw that the key and value coder information is lost after this statement in KafkaIO:
return output.apply(readTransform).setCoder(KafkaRecordCoder.of(keyCoder, valueCoder));
The issue seems to be that output.apply already checks for the presence of key and value coders and ends up trying to infer them with the help of LocalDeserializerProvider, so the process never reaches setCoder.
In our case this inference fails because we implement the Deserializer interface in a superclass of the instance passed as the deserializer. (I assume it would also fail later anyway, since we don't register our coders with the coder registry.)
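To make the failure mode concrete, here is a minimal, self-contained sketch of why resolving the type parameter of Deserializer<T> by reflection can fail when the interface is implemented in a superclass. The Deserializer, BaseDeserializer, and MyDeserializer types below are stand-ins, not Beam's or Kafka's actual code, and inferValueType is a deliberately simplified model of the inference, not LocalDeserializerProvider's real logic:

```java
import java.lang.reflect.ParameterizedType;
import java.lang.reflect.Type;

public class InferenceSketch {
    // Hypothetical stand-in for Kafka's Deserializer<T> interface.
    interface Deserializer<T> {
        T deserialize(String topic, byte[] data);
    }

    // The superclass implements the interface with a concrete type argument...
    static abstract class BaseDeserializer implements Deserializer<String> {
        public String deserialize(String topic, byte[] data) {
            return new String(data);
        }
    }

    // ...while the class handed to KafkaIO is a subclass that declares
    // no interfaces of its own.
    static class MyDeserializer extends BaseDeserializer {}

    // Simplified inference: look only at the interfaces declared directly
    // on the given class and try to read the Deserializer type argument.
    static Type inferValueType(Class<?> clazz) {
        for (Type t : clazz.getGenericInterfaces()) {
            if (t instanceof ParameterizedType
                    && ((ParameterizedType) t).getRawType() == Deserializer.class) {
                return ((ParameterizedType) t).getActualTypeArguments()[0];
            }
        }
        return null; // inference fails
    }

    public static void main(String[] args) {
        // Direct implementor: the String type argument is found.
        System.out.println(inferValueType(BaseDeserializer.class));
        // Subclass: getGenericInterfaces() is empty, so inference yields null.
        System.out.println(inferValueType(MyDeserializer.class));
    }
}
```

With this model, inference succeeds for BaseDeserializer but returns null for MyDeserializer, which mirrors the behavior we hit: the subclass itself carries no generic-interface information, so the value type (and hence a coder) cannot be derived from it without walking the superclass hierarchy.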
This was not yet broken in 2.29.0, so all versions after that could be affected.
Imported from Jira BEAM-13924. Original Jira may contain additional context.
Reported by: jgrabber.