S3 Batching with VRL Template syntax Key_prefix #23013
Unanswered · satellite-no asked this question in Q&A
When batching is configured like this:

```yaml
batch:
  max_bytes: 30973400  # ~30 MB
  timeout_secs: 60     # 60 s
```

Vector should flush a batch to S3 when either limit is reached: the batch grows to ~30 MB, or 60 seconds elapse.

However, a separate batch is maintained for each unique S3 object key, and in your case the key is dynamically derived from `.unmapped.dl_partition_class_name`.
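Concretely, that per-key behavior means a sink along these lines (the source name, bucket, and prefix layout are illustrative assumptions, not taken from this thread) keeps one in-flight 30 MB / 60 s batch per distinct `dl_partition_class_name` value:

```yaml
sinks:
  s3_out:
    type: aws_s3
    inputs: ["my_source"]        # assumed source name
    bucket: "my-data-lake"       # assumed bucket
    # Template syntax: each distinct rendered key gets its own batch,
    # so max_bytes / timeout_secs apply per key, not globally.
    key_prefix: "class={{ unmapped.dl_partition_class_name }}/date=%F/"
    encoding:
      codec: json
    batch:
      max_bytes: 30973400  # ~30 MB, per key
      timeout_secs: 60     # 60 s, per key
```

With many distinct class values arriving concurrently, most batches are flushed by the 60-second timeout long before they reach 30 MB, which matches the small-file symptom described below.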
Quick question on the expected behavior of batching to S3. I have the below S3 config for my sink, which accepts a lot of different data sources. Based on this config, would my assumption be correct that we should see either multiple 30 MB files written to S3, or one file every 60 seconds? In reality we are seeing hundreds of thousands of small ~10 KB files. The running theory we are testing is that the VRL template syntax is causing the files to be closed and written to S3 before they hit either condition.

For example, we have a stream of two data sources with different `.unmapped.dl_partition_class_name` values, so since the stream alternates that value, it is causing the files to be written prematurely.

s3:
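If that theory holds, one way to test it (sink and bucket names assumed for illustration) is to remove the high-cardinality field from the key prefix so every event accumulates in a single batch, then compare the resulting file sizes:

```yaml
sinks:
  s3_out:
    type: aws_s3
    inputs: ["my_source"]   # assumed source name
    bucket: "my-data-lake"  # assumed bucket
    # Date-only prefix: no per-event template field, so all events
    # share one batch and the 30 MB / 60 s limits behave as
    # originally expected.
    key_prefix: "date=%F/"
    batch:
      max_bytes: 30973400  # ~30 MB
      timeout_secs: 60     # 60 s
```

If the partition class must remain in the object key, each distinct class value will still maintain its own batch, so low-volume classes will keep producing small, timeout-flushed files.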