fix(benchmarks): Bump max_decoding_message_size to 32MiB to fix batch processor benchmarks#2730

Merged
albertlockett merged 1 commit into open-telemetry:main from JakeDern:batch-processor-continuous on Apr 22, 2026

Conversation

@JakeDern
Contributor

Change Summary

Batch processor benchmarks had a 100% signal drop rate because payloads exceeded the decompression limit on the backend engine. Bumping the limit fixes the issue for both the continuous and nightly benchmarks.
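The failure mode above is worth spelling out: a batch processor coalesces many signals into one large gRPC message, so the receiver's per-message decoding limit can be exceeded even when each individual signal is small. The sketch below is hypothetical and not the repository's actual code; it assumes a tonic-style gRPC stack, where generated clients and servers expose a `max_decoding_message_size` setting (tonic's default is 4 MiB), and shows the arithmetic behind the 32 MiB bump.

```rust
/// 32 MiB expressed in bytes: the raised decoding limit.
const MAX_DECODING_MESSAGE_SIZE: usize = 32 * 1024 * 1024;

fn main() {
    // tonic's default decoding limit is 4 MiB; batched payloads can
    // exceed it, causing the receiver to reject (and thus drop) them.
    let default_limit: usize = 4 * 1024 * 1024;
    assert!(MAX_DECODING_MESSAGE_SIZE > default_limit);
    println!("{}", MAX_DECODING_MESSAGE_SIZE); // 33554432

    // With a tonic-generated client (hypothetical service name), the
    // limit would be applied roughly like this:
    // let client = SomeServiceClient::new(channel)
    //     .max_decoding_message_size(MAX_DECODING_MESSAGE_SIZE);
}
```

Note that the limit only needs to cover the largest single batched message the benchmark scenarios produce; 32 MiB is headroom over the observed payload sizes, not a hard protocol bound.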

What issue does this PR close?

How are these changes tested?

Ran all scenarios locally and observed a dropped rate of 0:

[screenshot: local benchmark run showing a dropped rate of 0]

Are there any user-facing changes?

No

@JakeDern JakeDern changed the title fix(benchmarks): Bump max_decoding_message_size to 32MiB to unblock batch processor tests fix(benchmarks): Bump max_decoding_message_size to 32MiB to fix batch processor tests Apr 22, 2026
@JakeDern JakeDern marked this pull request as ready for review April 22, 2026 15:23
@JakeDern JakeDern requested a review from a team as a code owner April 22, 2026 15:23
@codecov

codecov Bot commented Apr 22, 2026

Codecov Report

✅ All modified and coverable lines are covered by tests.
✅ Project coverage is 87.87%. Comparing base (dd34485) to head (9c8995c).
⚠️ Report is 1 commit behind head on main.

Additional details and impacted files
@@            Coverage Diff             @@
##             main    #2730      +/-   ##
==========================================
- Coverage   87.87%   87.87%   -0.01%     
==========================================
  Files         638      638              
  Lines      244608   244608              
==========================================
- Hits       214950   214944       -6     
- Misses      29134    29140       +6     
  Partials      524      524              
Components Coverage Δ
otap-dataflow 89.90% <ø> (-0.01%) ⬇️
query_abstraction 80.61% <ø> (ø)
query_engine 90.74% <ø> (ø)
otel-arrow-go 51.92% <ø> (ø)
quiver 92.27% <ø> (ø)

@JakeDern JakeDern changed the title fix(benchmarks): Bump max_decoding_message_size to 32MiB to fix batch processor tests fix(benchmarks): Bump max_decoding_message_size to 32MiB to fix batch processor benchmarks Apr 22, 2026
@albertlockett albertlockett enabled auto-merge April 22, 2026 15:43
@albertlockett albertlockett added this pull request to the merge queue Apr 22, 2026
Merged via the queue into open-telemetry:main with commit 3a6d2a3 Apr 22, 2026
83 checks passed

Labels

None yet

Projects

Status: Done

Development

Successfully merging this pull request may close these issues.

OTLP->BATCH-OTLP scenario is showing 100% dropped for all three signal types in nightly and continuous benches

3 participants