
feat: add missing Spark import/export support for metrics #233


Merged (7 commits) on May 6, 2025
2 changes: 1 addition & 1 deletion .github/workflows/ansible-lint.yml
@@ -32,7 +32,7 @@ jobs:
      - name: Install tox, tox-lsr
        run: |
          set -euxo pipefail
-          pip3 install "git+https://github.com/linux-system-roles/tox-lsr@3.5.1"
+          pip3 install "git+https://github.com/linux-system-roles/tox-lsr@3.6.0"

      - name: Convert role to collection format
        id: collection
2 changes: 1 addition & 1 deletion .github/workflows/ansible-managed-var-comment.yml
@@ -30,7 +30,7 @@ jobs:
      - name: Install tox, tox-lsr
        run: |
          set -euxo pipefail
-          pip3 install "git+https://github.com/linux-system-roles/tox-lsr@3.5.1"
+          pip3 install "git+https://github.com/linux-system-roles/tox-lsr@3.6.0"

      - name: Run ansible-plugin-scan
        run: |
2 changes: 1 addition & 1 deletion .github/workflows/ansible-test.yml
@@ -33,7 +33,7 @@ jobs:
      - name: Install tox, tox-lsr
        run: |
          set -euxo pipefail
-          pip3 install "git+https://github.com/linux-system-roles/tox-lsr@3.5.1"
+          pip3 install "git+https://github.com/linux-system-roles/tox-lsr@3.6.0"

      - name: Convert role to collection format
        run: |
27 changes: 21 additions & 6 deletions .github/workflows/qemu-kvm-integration-tests.yml
@@ -74,7 +74,7 @@ jobs:
          python3 -m pip install --upgrade pip
          sudo apt update
          sudo apt install -y --no-install-recommends git ansible-core genisoimage qemu-system-x86
-          pip3 install "git+https://github.com/linux-system-roles/tox-lsr@3.5.1"
+          pip3 install "git+https://github.com/linux-system-roles/tox-lsr@3.6.0"

      - name: Configure tox-lsr
        if: steps.check_platform.outputs.supported
@@ -86,7 +86,7 @@
        if: steps.check_platform.outputs.supported
        run: >-
          tox -e ${{ matrix.scenario.env }} -- --image-name ${{ matrix.scenario.image }} --make-batch
-          --log-level=debug --skip-tags tests::infiniband --
+          --log-level=debug --skip-tags tests::infiniband,tests::nvme,tests::scsi --

      - name: Test result summary
        if: steps.check_platform.outputs.supported && always()
@@ -109,14 +109,29 @@
            echo "$f"
          done < batch.report

-      - name: Show test logs on failure
+      - name: Upload test logs on failure
+        if: failure()
+        uses: actions/upload-artifact@v4
+        with:
+          name: "logs-${{ matrix.scenario.image }}-${{ matrix.scenario.env }}"
+          path: |
+            tests/*.log
+            artifacts/default_provisioners.log
+            artifacts/*.qcow2.*.log
+            batch.txt
+            batch.report
+          retention-days: 30

+      - name: Show test log failures
        if: steps.check_platform.outputs.supported && failure()
        run: |
          set -euo pipefail
          for f in tests/*.log; do
-            echo "::group::$(basename $f)"
-            cat "$f"
-            echo "::endgroup::"
+            if FAIL=$(grep -B100 -A30 "fatal:" "$f"); then
+              echo "::group::$(basename $f)"
+              echo "$FAIL"
+              echo "::endgroup::"
+            fi
          done
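The new step leans on grep's exit status: with no `fatal:` match, grep returns non-zero, the `if` branch is skipped, and a passing log prints nothing. A minimal sketch of the same idiom against a synthetic log (the file name and contents are made up for illustration; the real step iterates over `tests/*.log`):

```shell
#!/usr/bin/env sh
set -eu

# Synthetic stand-in for one of the tests/*.log files.
log=$(mktemp)
printf '%s\n' 'TASK [run test]' 'ok: [managed-host]' \
  'fatal: [managed-host]: FAILED! => {"msg": "boom"}' 'PLAY RECAP' > "$log"

# grep exits non-zero when nothing matches, so the assignment doubles as
# the condition: a log group is only emitted for logs containing a failure,
# with up to 100 lines of leading and 30 lines of trailing context.
if FAIL=$(grep -B100 -A30 "fatal:" "$log"); then
  echo "::group::$(basename "$log")"
  echo "$FAIL"
  echo "::endgroup::"
fi
rm -f "$log"
```

With a clean log (no `fatal:` line), grep returns 1 and the step stays silent for that file, which keeps the CI annotations focused on the actual failure.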

      - name: Set commit status as success with a description that platform is skipped
2 changes: 1 addition & 1 deletion .github/workflows/tft.yml
@@ -154,7 +154,7 @@ jobs:
          targetUrl: ""

      - name: Run test in testing farm
-        uses: sclorg/testing-farm-as-github-action@v3
+        uses: sclorg/testing-farm-as-github-action@v4
        if: contains(needs.prepare_vars.outputs.supported_platforms, matrix.platform)
        with:
          git_ref: main
1 change: 1 addition & 0 deletions .ostree/packages-runtime-CentOS-10.txt
@@ -2,6 +2,7 @@ bpftrace
grafana
grafana-pcp
pcp-export-pcp2elasticsearch
+pcp-export-pcp2spark
pcp-pmda-bpftrace
pcp-pmda-mssql
pcp-system-tools
1 change: 1 addition & 0 deletions .ostree/packages-runtime-CentOS-9.txt
@@ -2,6 +2,7 @@ bpftrace
grafana
grafana-pcp
pcp-export-pcp2elasticsearch
+pcp-export-pcp2spark
pcp-pmda-bpftrace
pcp-pmda-mssql
pcp-system-tools
1 change: 1 addition & 0 deletions .ostree/packages-runtime-Fedora.txt
@@ -2,6 +2,7 @@ bpftrace
grafana
grafana-pcp
pcp-export-pcp2elasticsearch
+pcp-export-pcp2spark
pcp-pmda-bpftrace
pcp-pmda-mssql
pcp-system-tools
1 change: 1 addition & 0 deletions .ostree/packages-runtime-RedHat-10.txt
@@ -2,6 +2,7 @@ bpftrace
grafana
grafana-pcp
pcp-export-pcp2elasticsearch
+pcp-export-pcp2spark
pcp-pmda-bpftrace
pcp-pmda-mssql
pcp-system-tools
1 change: 1 addition & 0 deletions .ostree/packages-runtime-RedHat-8.txt
@@ -2,6 +2,7 @@ bpftrace
grafana
grafana-pcp
pcp-export-pcp2elasticsearch
+pcp-export-pcp2spark
pcp-pmda-bpftrace
pcp-pmda-mssql
pcp-system-tools
1 change: 1 addition & 0 deletions .ostree/packages-runtime-RedHat-9.txt
@@ -2,6 +2,7 @@ bpftrace
grafana
grafana-pcp
pcp-export-pcp2elasticsearch
+pcp-export-pcp2spark
pcp-pmda-bpftrace
pcp-pmda-mssql
pcp-system-tools
1 change: 1 addition & 0 deletions .ostree/packages-testing-CentOS-10.txt
@@ -3,6 +3,7 @@ cyrus-sasl-scram
grafana
grafana-pcp
pcp-export-pcp2elasticsearch
+pcp-export-pcp2spark
pcp-pmda-bpftrace
pcp-pmda-elasticsearch
pcp-pmda-mssql
1 change: 1 addition & 0 deletions .ostree/packages-testing-CentOS-9.txt
@@ -3,6 +3,7 @@ cyrus-sasl-scram
grafana
grafana-pcp
pcp-export-pcp2elasticsearch
+pcp-export-pcp2spark
pcp-pmda-bpftrace
pcp-pmda-elasticsearch
pcp-pmda-mssql
1 change: 1 addition & 0 deletions .ostree/packages-testing-Fedora.txt
@@ -3,6 +3,7 @@ cyrus-sasl-scram
grafana
grafana-pcp
pcp-export-pcp2elasticsearch
+pcp-export-pcp2spark
pcp-pmda-bpftrace
pcp-pmda-elasticsearch
pcp-pmda-mssql
1 change: 1 addition & 0 deletions .ostree/packages-testing-RedHat-10.txt
@@ -3,6 +3,7 @@ cyrus-sasl-scram
grafana
grafana-pcp
pcp-export-pcp2elasticsearch
+pcp-export-pcp2spark
pcp-pmda-bpftrace
pcp-pmda-elasticsearch
pcp-pmda-mssql
1 change: 1 addition & 0 deletions .ostree/packages-testing-RedHat-8.txt
@@ -3,6 +3,7 @@ cyrus-sasl-scram
grafana
grafana-pcp
pcp-export-pcp2elasticsearch
+pcp-export-pcp2spark
pcp-pmda-bpftrace
pcp-pmda-elasticsearch
pcp-pmda-mssql
1 change: 1 addition & 0 deletions .ostree/packages-testing-RedHat-9.txt
@@ -3,6 +3,7 @@ cyrus-sasl-scram
grafana
grafana-pcp
pcp-export-pcp2elasticsearch
+pcp-export-pcp2spark
pcp-pmda-bpftrace
pcp-pmda-elasticsearch
pcp-pmda-mssql
8 changes: 8 additions & 0 deletions README.md
@@ -85,6 +85,14 @@
Boolean flag allowing metric values to be exported into Elasticsearch.

Boolean flag allowing metrics from Elasticsearch to be made available.

+### metrics_into_spark: false
+
+Boolean flag allowing metric values to be exported into Spark.
+
+### metrics_from_spark: false
+
+Boolean flag allowing metrics from Spark to be made available.
+
### metrics_from_postfix: false

Boolean flag allowing metrics from Postfix to be made available.
6 changes: 6 additions & 0 deletions defaults/main.yml
@@ -25,6 +25,12 @@ metrics_into_elasticsearch: false
# Make metrics available from an Elasticsearch instance
metrics_from_elasticsearch: false

+# Export metrics to an Apache Spark data analysis setup
+metrics_into_spark: false
+
+# Make metrics available from an Apache Spark instance
+metrics_from_spark: false
+
# Make metrics available from a SQL Server database
metrics_from_mssql: false
11 changes: 11 additions & 0 deletions examples/pcp_spark.yml
@@ -0,0 +1,11 @@
# SPDX-License-Identifier: MIT
---
- name: Set up Spark for use with PCP
  hosts: monitoring

  roles:
    - role: linux-system-roles.metrics
      vars:
        metrics_provider: pcp
        metrics_from_spark: true
        metrics_into_spark: true
12 changes: 12 additions & 0 deletions tasks/main.yml
@@ -49,6 +49,18 @@
      metrics_into_elasticsearch | d(false) | bool
  # yamllint enable rule:line-length

+- name: Configure Spark metrics
+  vars:
+    spark_metrics_agent: "{{ metrics_from_spark | d(false) | bool }}"
+    spark_export_metrics: "{{ metrics_into_spark | d(false) | bool }}"
+    spark_metrics_provider: "{{ metrics_provider }}"
+  include_role:
+    # noqa role-name[path]
+    name: "{{ role_path }}/roles/spark"
+  when: >
+    metrics_from_spark | d(false) | bool or
+    metrics_into_spark | d(false) | bool
+
- name: Configure SQL Server metrics.
  vars:
    mssql_metrics_provider: "{{ metrics_provider }}"
10 changes: 10 additions & 0 deletions tests/check_from_spark.yml
@@ -0,0 +1,10 @@
# SPDX-License-Identifier: MIT
---
- name: Check if OpenMetrics PMDA has Spark metrics registered
  command: pmprobe -I openmetrics.control.status
  register: status
  until: status.stdout.find("spark") != -1
  retries: 10
  delay: 1
  changed_when: false
  when: "'pcp-pmda-openmetrics' in __spark_packages_pcp"
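The `until`/`retries`/`delay` trio above keeps polling `pmprobe` until the Spark metrics appear (or ten attempts pass). The same control flow in plain shell, with the `pmprobe` call replaced by a stub that only reports the metric on its third invocation — everything in this sketch is illustrative, not part of the role:

```shell
#!/usr/bin/env sh
set -eu

# Stub for `pmprobe -I openmetrics.control.status`: succeeds on the 3rd call.
# A file holds the call count because the stub runs in a pipeline subshell.
count_file=$(mktemp)
echo 0 > "$count_file"
probe() {
  n=$(($(cat "$count_file") + 1))
  echo "$n" > "$count_file"
  if [ "$n" -ge 3 ]; then
    echo 'openmetrics.control.status ["spark metrics registered"]'
  else
    echo 'openmetrics.control.status ["pending"]'
  fi
}

retries=10
delay=0   # the task uses delay: 1; zero keeps this sketch instant
i=0
until probe | grep -q spark; do
  i=$((i + 1))
  if [ "$i" -ge "$retries" ]; then
    echo "gave up waiting for spark metrics" >&2
    exit 1
  fi
  sleep "$delay"
done
attempts=$((i + 1))
echo "spark metrics visible after $attempts probe(s)"
rm -f "$count_file"
```

The Ansible form is preferable in the role itself — `until` re-evaluates the registered result and `retries`/`delay` are declarative — but the retry loop underneath is exactly this.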
11 changes: 11 additions & 0 deletions tests/check_into_spark.yml
@@ -0,0 +1,11 @@
# SPDX-License-Identifier: MIT
---
- name: Check if export to Spark is installed
  command: test -x /usr/bin/pcp2spark
  changed_when: false

- name: Check the ansible_managed header in pcp2spark.service
  vars:
    __test_config_path: >-
      {{ __spark_service_path }}/pcp2spark.service
  include_tasks: check_header.yml
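These two tasks boil down to a file-executability test plus a header check on the generated unit file. A rough shell equivalent, with the unit file faked in a temp path — on a test runner, `/usr/bin/pcp2spark` and the real service file only exist after the role has run, so the paths and header text below are stand-ins:

```shell
#!/usr/bin/env sh
set -eu

checks_passed=0

# Stand-in for /usr/bin/pcp2spark: any executable file works for the sketch.
pcp2spark_bin=/bin/sh
test -x "$pcp2spark_bin" && checks_passed=$((checks_passed + 1))

# Stand-in for {{ __spark_service_path }}/pcp2spark.service, with a first
# line resembling what the ansible_managed banner produces.
unit=$(mktemp)
printf '%s\n' '# Ansible managed' '[Unit]' \
  'Description=pcp-to-spark metrics export' > "$unit"
grep -q '^# Ansible managed' "$unit" && checks_passed=$((checks_passed + 1))
rm -f "$unit"

echo "$checks_passed/2 checks passed"
```

The header check matters because the role regenerates the unit file: the banner is how the test (and an operator) can tell a role-managed file from a hand-edited one.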
1 change: 1 addition & 0 deletions tests/roles/spark
38 changes: 38 additions & 0 deletions tests/tests_verify_from_spark.yml
@@ -0,0 +1,38 @@
# SPDX-License-Identifier: MIT
---
- name: Test import from Spark
  hosts: all
  vars:
    metrics_from_spark: true

  pre_tasks:
    - name: Stop test
      meta: end_host
      when: (ansible_distribution in ['RedHat', 'CentOS'] and
             ansible_distribution_major_version | int < 7) or
            ansible_distribution not in ['Fedora', 'RedHat', 'CentOS']

    - name: Save state of services
      import_tasks: get_services_state.yml

  tasks:
    - name: Run test
      block:
        - name: Run the metrics role to configure Spark
          include_role:
            name: linux-system-roles.metrics
            public: true

        - name: Check if import from Spark works
          include_tasks: check_from_spark.yml

        - name: Flush handlers
          meta: flush_handlers

      rescue:
        - name: Handle failure case
          include_tasks: handle_test_failure.yml

      always:
        - name: Restore state of services
          import_tasks: restore_services_state.yml
38 changes: 38 additions & 0 deletions tests/tests_verify_into_spark.yml
@@ -0,0 +1,38 @@
# SPDX-License-Identifier: MIT
---
- name: Test export to Spark
  hosts: all
  vars:
    metrics_into_spark: true

  pre_tasks:
    - name: Stop test
      meta: end_host
      when: (ansible_distribution in ['RedHat'] and
             (ansible_facts['distribution_version'] is version('8.4', '<'))) or
            ansible_distribution not in ['Fedora', 'RedHat']

    - name: Save state of services
      import_tasks: get_services_state.yml

  tasks:
    - name: Run test
      block:
        - name: Run the role
          include_role:
            name: linux-system-roles.metrics
            public: true

        - name: Check if export to Spark works
          include_tasks: check_into_spark.yml

        - name: Flush handlers
          meta: flush_handlers

      rescue:
        - name: Handle failure case
          include_tasks: handle_test_failure.yml

      always:
        - name: Restore state of services
          import_tasks: restore_services_state.yml

Some generated files are not rendered by default.