
fix: Move batch processor to 100klrps scenario and add otap-batch-otap #2395

Merged
jmacd merged 2 commits into open-telemetry:main from JakeDern:continuous-batch-benches
Mar 23, 2026

Conversation

@JakeDern
Contributor

@JakeDern JakeDern commented Mar 20, 2026

Change Summary

This PR moves the continuous batch processor benchmarks to the 100klrps scenario and adds an otap-batch-otap configuration.

I think the batch processor benchmarks were mistakenly added to the "passthrough" scenario, which states it is for scenarios with no processor in the middle. The dashboard also does not seem to be set up properly for them, and we want to add otap-batch-otap as mentioned here: #2246 (comment)
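The otap-batch-otap configuration itself is not shown in this thread. As a rough illustration only, a benchmark pipeline of that shape, written in a typical OpenTelemetry-style YAML layout, might look like the sketch below. All component names, field names, and values here are assumptions for illustration, not the repository's actual benchmark config:

```yaml
# Hypothetical sketch of an OTAP -> batch -> OTAP pipeline.
# Component and field names are illustrative assumptions,
# not taken from the otel-arrow repository.
receivers:
  otap:
    listening_addr: 127.0.0.1:4317   # assumed field name

processors:
  batch:
    send_batch_size: 8192            # assumed setting
    timeout: 200ms                   # assumed setting

exporters:
  otap:
    endpoint: 127.0.0.1:4318         # assumed field name

service:
  pipelines:
    logs:
      receivers: [otap]
      processors: [batch]
      exporters: [otap]
```

The point of the scenario, per the discussion, is that a batch processor sits between the receiver and exporter, which is why it belongs in a processor scenario like 100klrps rather than in "passthrough".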

@codecov

codecov Bot commented Mar 20, 2026

Codecov Report

✅ All modified and coverable lines are covered by tests.
✅ Project coverage is 88.00%. Comparing base (a88939a) to head (821556f).
⚠️ Report is 1 commit behind head on main.

Additional details and impacted files
@@            Coverage Diff             @@
##             main    #2395      +/-   ##
==========================================
- Coverage   88.01%   88.00%   -0.02%     
==========================================
  Files         582      582              
  Lines      204179   204179              
==========================================
- Hits       179713   179688      -25     
- Misses      23940    23965      +25     
  Partials      526      526              
Components Coverage Δ
otap-dataflow 90.05% <ø> (-0.02%) ⬇️
query_abstraction 80.61% <ø> (ø)
query_engine 90.61% <ø> (ø)
syslog_cef_receivers ∅ <ø> (∅)
otel-arrow-go 52.44% <ø> (ø)
quiver 91.91% <ø> (ø)

@JakeDern JakeDern marked this pull request as ready for review March 20, 2026 23:11
@JakeDern JakeDern requested a review from a team as a code owner March 20, 2026 23:11
@JakeDern
Contributor Author

CC: @cijothomas

Member

@cijothomas cijothomas left a comment


LGTM.
(Regarding using the batch processor in passthrough: I will come back to it later if needed. The passthrough scenarios can now be truly passthrough, with no processor in the middle.)

@albertlockett albertlockett enabled auto-merge March 23, 2026 18:35
@albertlockett albertlockett added this pull request to the merge queue Mar 23, 2026
@github-merge-queue github-merge-queue Bot removed this pull request from the merge queue due to failed status checks Mar 23, 2026
@jmacd jmacd added this pull request to the merge queue Mar 23, 2026
Merged via the queue into open-telemetry:main with commit e8e67bd Mar 23, 2026
68 of 69 checks passed
@JakeDern JakeDern deleted the continuous-batch-benches branch April 8, 2026 15:37

Labels

None yet

Projects

Status: Done

Development

Successfully merging this pull request may close these issues.

Batch processor continuous benchmarks

4 participants