
Conversation

iundarigun

Description

The documentation doesn't mention a way to map between Postgres array types and Flink data types.

Details

This PR adds a way to process arrays. For primitive int and long, the values are transformed into a primitive array to create the GenericArrayData; for String, the elements are wrapped in StringData.
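A minimal sketch of that conversion, assuming Flink's org.apache.flink.table.data classes and a Debezium payload already deserialized into a java.util.List (the PR itself may use the CDC-specific converter factory instead):

import java.util.List;
import org.apache.flink.table.data.GenericArrayData;
import org.apache.flink.table.data.StringData;

public class ArrayConversionSketch {

    // INT32 elements: copy into a primitive int[] before wrapping in GenericArrayData.
    static GenericArrayData toIntArray(List<Integer> values) {
        int[] elements = new int[values.size()];
        for (int i = 0; i < values.size(); i++) {
            elements[i] = values.get(i);
        }
        return new GenericArrayData(elements);
    }

    // INT64 elements: same idea with a primitive long[].
    static GenericArrayData toLongArray(List<Long> values) {
        long[] elements = new long[values.size()];
        for (int i = 0; i < values.size(); i++) {
            elements[i] = values.get(i);
        }
        return new GenericArrayData(elements);
    }

    // STRING elements: wrap each value in StringData before building the array.
    static GenericArrayData toStringArray(List<String> values) {
        StringData[] elements = new StringData[values.size()];
        for (int i = 0; i < values.size(); i++) {
            elements[i] = StringData.fromString(values.get(i));
        }
        return new GenericArrayData(elements);
    }
}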

@lvyanquan
Contributor

Hi @Mrart, maybe you can help to review this.

@Mrart
Contributor

Mrart commented Aug 17, 2025

Hi @Mrart, maybe you can help to review this.

I will review after #4086.

@Mrart
Contributor

Mrart commented Aug 20, 2025

@iundarigun Can you update the code and add test cases for the Postgres pipeline to cover this?

Contributor

@Mrart Mrart left a comment


Unit tests need to be added to cover this requirement.

return null;
}
if (Schema.Type.ARRAY.equals(schema.type()) && dbzObj instanceof List && schema.valueSchema() != null) {
switch (schema.valueSchema().type()) {
Contributor


I didn't realize that Postgres arrays only support these schema types here. There are more element types to cover, such as Map, boolean, and so on.
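For illustration only (a hypothetical extension, not part of this PR), the switch over the value schema could be widened to cover more element types, for example BOOLEAN:

// Hypothetical extra case for the switch on schema.valueSchema().type();
// dbzObj and the surrounding converter are assumed from the snippet above.
case BOOLEAN:
    List<?> raw = (List<?>) dbzObj;
    boolean[] booleans = new boolean[raw.size()];
    for (int i = 0; i < raw.size(); i++) {
        booleans[i] = (Boolean) raw.get(i);
    }
    return new GenericArrayData(booleans);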

});
}

private static Optional<DeserializationRuntimeConverter> createArrayConverter() {
Contributor


There is a more general way to do this; there is no need to implement it only in PostgreSQLDeserializationConverterFactory. It would be more appropriate to implement it inside DebeziumEventDeserializationSchema.
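Roughly, a shared converter in DebeziumEventDeserializationSchema might look like the sketch below. This is only an illustration: it assumes DeserializationRuntimeConverter is a functional interface taking the Debezium object and its Connect Schema, and it falls back to Object[] for element types that need no wrapping.

// Sketch of a general array converter; signatures are assumptions, not the PR's code.
private static Optional<DeserializationRuntimeConverter> createArrayConverter() {
    return Optional.of(
            (dbzObj, schema) -> {
                List<?> values = (List<?>) dbzObj;
                Object[] elements = new Object[values.size()];
                for (int i = 0; i < values.size(); i++) {
                    Object value = values.get(i);
                    elements[i] =
                            value instanceof String
                                    ? StringData.fromString((String) value)
                                    : value;
                }
                return new GenericArrayData(elements);
            });
}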
