
Commit 524b78d

Make pipeline perf test a required CI check (#2779)
## Summary

Make the pipeline performance test a required CI check so that PRs which break the perf test are caught before merge.

> **Dependency**: #2780 must be merged first (it fixes the currently broken perf test).

#2774 is an example of the kind of breakage this prevents — a route rename broke the perf test, but the PR still merged because the perf test was not a required check.

### Changes

- **rust-ci.yml**: Add a `pipeline_perf_test` job (runs on `ubuntu-latest`) and include it in the `rust-required-status-check` aggregator
- **pipeline-perf-on-label.yaml**: Simplify to only run on dedicated Oracle bare-metal hardware when the `pipelineperf` label is present — the basic validation path is removed since `rust-ci.yml` now covers it

### Motivation

The pipeline perf test has been broken by merged PRs several times because it was not a required check. This change ensures that if a PR breaks the perf test (e.g. build failures, config issues, test infrastructure breakage), it is caught before merge.

Co-authored-by: albertlockett <a.lockett@f5.com>
1 parent 7757946 · commit 524b78d

2 files changed

Lines changed: 43 additions & 65 deletions


.github/workflows/pipeline-perf-on-label.yaml

Lines changed: 6 additions & 65 deletions
```diff
@@ -1,8 +1,8 @@
-# This action runs the pipeline perf continuous benchmarking suite on every PR.
-# - With 'pipelineperf' label: runs on dedicated Oracle bare-metal hardware for accurate benchmarks
-# - Without label: runs on ubuntu-latest for basic validation
-# In either case, the results does not update the charts.
-name: Pipeline Perf Pre-Merge
+# This action runs the pipeline perf benchmarking suite on dedicated Oracle
+# bare-metal hardware when the 'pipelineperf' label is added to a PR.
+# Basic perf validation on ubuntu-latest is handled by Rust-CI (rust-ci.yml).
+# The results from this workflow do not update the charts.
+name: Pipeline Perf Dedicated

 on:
   pull_request:
```
```diff
@@ -18,29 +18,9 @@ concurrency:
   cancel-in-progress: true

 jobs:
-  # Check for the pipelineperf label to determine which runner to use
-  label-check:
-    name: Check for pipelineperf label
-    runs-on: ubuntu-latest
-    outputs:
-      has_label: ${{ steps.check_label.outputs.has_label }}
-    steps:
-      - name: Check if PR has 'pipelineperf' label
-        id: check_label
-        run: |
-          labels=$(echo '${{ toJson(github.event.pull_request.labels) }}' | jq -r '.[].name')
-          if echo "$labels" | grep -q "pipelineperf"; then
-            echo "Label pipelineperf found - will use dedicated hardware"
-            echo "has_label=true" >> $GITHUB_OUTPUT
-          else
-            echo "Label 'pipelineperf' not found - will use ubuntu-latest"
-            echo "has_label=false" >> $GITHUB_OUTPUT
-          fi
-
   # Run on dedicated Oracle hardware when 'pipelineperf' label is present
   pipeline-perf-test-dedicated:
-    needs: label-check
-    if: needs.label-check.outputs.has_label == 'true'
+    if: contains(github.event.pull_request.labels.*.name, 'pipelineperf')
     runs-on: oracle-bare-metal-64cpu-1024gb-x86-64-ubuntu-24
     steps:
       - name: Harden the runner (Audit all outbound calls)
```
```diff
@@ -122,42 +102,3 @@ jobs:
           echo ""
           echo "=== Docker disk usage ==="
           docker system df -v 2>/dev/null || true
-
-  # Run on ubuntu-latest for basic validation when no label is present
-  pipeline-perf-test-basic:
-    needs: label-check
-    if: needs.label-check.outputs.has_label == 'false'
-    runs-on: ubuntu-latest
-    steps:
-      - name: Harden the runner (Audit all outbound calls)
-        uses: step-security/harden-runner@fe104658747b27e96e4f7e80cd0a94068e53901d # v2.16.1
-        with:
-          egress-policy: audit
-
-      - uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
-
-      - name: Set up Python
-        uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
-        with:
-          python-version: "3.14"
-
-      - name: Set up Docker Buildx
-        uses: docker/setup-buildx-action@4d04d5d9486b7bd6fa91e7baf45bbb4f8b9deedd # v4.0.0
-
-      - name: Build dataflow_engine
-        run: |
-          git submodule init
-          git submodule update
-          cd rust/otap-dataflow
-          docker buildx build --load --build-context otel-arrow=../../ -f Dockerfile -t df_engine .
-          cd ../..
-
-      - name: Install dependencies
-        run: |
-          python -m pip install --user --require-hashes -r tools/pipeline_perf_test/orchestrator/requirements.lock.txt
-          python -m pip install --user --require-hashes -r tools/pipeline_perf_test/load_generator/requirements.lock.txt
-
-      - name: Run pipeline performance test suite
-        run: |
-          cd tools/pipeline_perf_test
-          python orchestrator/run_orchestrator.py --config test_suites/integration/continuous/100klrps-docker.yaml
```
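The removed `label-check` job shelled out to `jq` and `grep` to decide whether the `pipelineperf` label was present; the simplified workflow expresses the same membership test inline with `contains(github.event.pull_request.labels.*.name, 'pipelineperf')`. A minimal shell sketch of that membership test, using a hypothetical label list in place of the real PR payload:

```shell
# Hypothetical label names standing in for github.event.pull_request.labels.*.name;
# in the removed job these came from toJson(...) piped through jq -r '.[].name'.
labels="enhancement
pipelineperf
ci"

# grep -x forces a whole-line match, so a label such as "pipelineperf-old" cannot
# match by accident (the removed version's plain grep -q was a substring match).
if echo "$labels" | grep -qx "pipelineperf"; then
  has_label=true
else
  has_label=false
fi
echo "has_label=$has_label"
```

The built-in `contains()` expression makes this whole extra job unnecessary, which is why the label check could collapse into a single `if:` condition on the dedicated-hardware job.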

.github/workflows/rust-ci.yml

Lines changed: 37 additions & 0 deletions
```diff
@@ -737,6 +737,38 @@ jobs:
         reporter: java-junit
         fail-on-error: false

+  # Pipeline performance test - validates that Rust changes don't regress performance.
+  pipeline_perf_test:
+    runs-on: ubuntu-latest
+    steps:
+      - uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
+        with:
+          submodules: true
+      - name: Set up Python
+        uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
+        with:
+          python-version: "3.14"
+      - name: Set up Docker Buildx
+        uses: docker/setup-buildx-action@4d04d5d9486b7bd6fa91e7baf45bbb4f8b9deedd # v4.0.0
+      - name: Free disk space
+        run: |
+          sudo rm -rf /usr/lib/jvm /usr/share/dotnet /usr/share/swift /usr/local/.ghcup
+          sudo rm -rf /usr/local/julia* /usr/local/lib/android /usr/local/share/chromium
+          sudo rm -rf /opt/microsoft /opt/google /opt/az /usr/local/share/powershell
+      - name: Build dataflow_engine
+        run: |
+          cd rust/otap-dataflow
+          docker buildx build --load --build-context otel-arrow=../../ -f Dockerfile -t df_engine .
+          cd ../..
+      - name: Install dependencies
+        run: |
+          python -m pip install --user --require-hashes -r tools/pipeline_perf_test/orchestrator/requirements.lock.txt
+          python -m pip install --user --require-hashes -r tools/pipeline_perf_test/load_generator/requirements.lock.txt
+      - name: Run pipeline performance test suite
+        run: |
+          cd tools/pipeline_perf_test
+          python orchestrator/run_orchestrator.py --config test_suites/integration/continuous/100klrps-docker.yaml
+
   # Aggregated status check - depends only on the required matrix combinations.
   # Add/remove jobs from the needs list to change what is required via PR,
   # rather than updating GitHub branch protection settings directly.
```
```diff
@@ -753,6 +785,7 @@
       - compile_proto
       - pest-fmt
       - no_default_features_check
+      - pipeline_perf_test
     steps:
       - name: Check if all required jobs succeeded
         run: |
```
```diff
@@ -788,4 +821,8 @@ jobs:
             echo "no_default_features_check failed or was cancelled"
             exit 1
           fi
+          if [[ "${{ needs.pipeline_perf_test.result }}" != "success" ]]; then
+            echo "pipeline_perf_test failed or was cancelled"
+            exit 1
+          fi
           echo "All required checks passed!"
```
