1,255 changes: 1,255 additions & 0 deletions dashboards/databricks-cluster-health/databricks-cluster-health.json

1,049 changes: 1,049 additions & 0 deletions dashboards/databricks-job-runs/databricks-job-runs.json

1,102 changes: 1,102 additions & 0 deletions dashboards/databricks-pipeline-updates/databricks-pipeline-updates.json

840 changes: 840 additions & 0 deletions dashboards/databricks-query-metrics/databricks-query-metrics.json

1,578 changes: 1,106 additions & 472 deletions dashboards/databricks-spark/databricks-spark.json

Binary file modified dashboards/databricks-spark/databricks-spark01.png
Binary file modified dashboards/databricks-spark/databricks-spark02.png
Binary file modified dashboards/databricks-spark/databricks-spark03.png
Binary file removed dashboards/databricks-spark/databricks-spark04.png
7 changes: 5 additions & 2 deletions data-sources/databricks/config.yml
@praveenkatha-nr (Collaborator) commented on Jan 28, 2026:
We are unable to deploy this PR because the description should be at most 240 characters. @sdewitt-newrelic Please update the datasource description. Please create a new PR against the release branch for the fix.
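
The limit refers to the description field in data-sources/databricks/config.yml, which this PR expands in the diff below. A minimal sketch of one way that field could be condensed to fit; the wording here is illustrative only and is not the follow-up fix that shipped:

# Illustrative only: a possible shortened datasource description, kept well under 240 characters
description: |
  Collects Apache Spark application metrics, Databricks Lakeflow job run and
  pipeline update metrics, query metrics, cluster health metrics and logs,
  and Databricks consumption and cost data.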

@@ -1,12 +1,15 @@
id: databricks
displayName: Databricks Integration
description: |
This integration collects Spark telemetry, Workflow telemetry, and Cost and Billing information from Databricks.
The Databricks Integration collects Apache Spark application metrics,
Databricks Lakeflow job run metrics, Databricks Lakeflow Spark Declarative
Pipeline update metrics, Databricks query metrics, Databricks cluster health
metrics and logs, and Databricks consumption and cost data.
icon: logo.png
install:
primary:
link:
url: https://github.com/newrelic-experimental/newrelic-databricks-integration
url: https://github.com/newrelic/newrelic-databricks-integration?tab=readme-ov-file#getting-started
keywords:
- nrlabs
- nrlabs-data
51 changes: 42 additions & 9 deletions quickstarts/databricks/config.yml
@@ -1,16 +1,38 @@
id: 533cdd19-8232-42cb-b134-e7d17bfff581
slug: databricks
title: Databricks Spark Integration
title: Databricks Integration
description: |
Databricks is an orchestration platform for Apache Spark. Instantly monitor Databricks Spark clusters with our New Relic Spark integration.
This integration collects Spark telemetry, Workflow telemetry, and Cost and Billing information from Databricks.
## Why monitor Databricks

The New Relic Databricks integration can collect telemetry from Spark running on Databricks. By default, the integration will automatically connect to and collect telemetry from the Spark deployments in all clusters created via the UI or API in the specified workspace.
In the world of big data, Databricks is a mission-critical platform. But
making sure your workloads are running efficiently, cost-effectively, and
reliably can be challenging.

The Databricks Integration from New Relic delivers total visibility for your
entire Databricks estate, allowing you to troubleshoot, optimize, and connect
performance directly to cost - all from a single, unified observability
platform.

### Databricks quickstart highlights

The Databricks integration collects a comprehensive suite of telemetry data,
including the following:

- Apache Spark application metrics
- Databricks Lakeflow job run metrics
- Databricks Lakeflow Spark Declarative Pipeline update metrics
- Databricks query metrics
- Databricks cluster health metrics and logs
- Databricks consumption and cost data

With the pre-built dashboards in this quickstart, you can quickly visualize
and analyze your Databricks workloads to ensure optimal performance and cost
efficiency.
summary: |
Monitor Databricks Spark clusters with the New Relic Databricks integration
Gain full visibility into your entire Databricks estate with the comprehensive
suite of telemetry data collected by the Databricks Integration
icon: logo.png
level: Community

keywords:
- nrlabs
- nrlabs-data
@@ -23,11 +45,22 @@ keywords:
authors:
- New Relic Labs
documentation:
- name: Databricks integration docs
- name: Getting Started
description: |
Follow our Getting Started documentation to quickly instrument your
Databricks environment and start visualizing your data in New Relic.
url: https://github.com/newrelic/newrelic-databricks-integration?tab=readme-ov-file#getting-started
- name: Learn More
description: |
Collect Spark telemetry data with the New Relic Databricks integration
url: https://github.com/newrelic-experimental/newrelic-databricks-integration
Learn more about the Databricks Integration in our Usage Guide, including
configuration options, telemetry collected, and dashboards included.
url: https://github.com/newrelic/newrelic-databricks-integration?tab=readme-ov-file#usage-guide
dataSourceIds:
- databricks
dashboards:
- databricks-spark
- databricks-job-runs
- databricks-pipeline-updates
- databricks-query-metrics
- databricks-cluster-health
- databricks-consumption-cost
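
The dashboard entries added to this list correspond to the dashboard JSON files added and updated in this PR, whose diffs are too large to render above. As a rough orientation only, New Relic dashboard exports follow a name/pages/widgets shape along these lines, sketched here in YAML for readability with placeholder names and a placeholder NRQL query; the actual titles and queries live in the shipped JSON files:

# Hypothetical sketch with placeholder values; the real files are JSON dashboard exports
name: Databricks Job Runs                # dashboard name as it appears in New Relic
pages:
  - name: Overview                       # placeholder page name
    widgets:
      - title: Job run duration          # placeholder widget title
        layout:
          column: 1
          row: 1
          width: 4
          height: 3
        visualization:
          id: viz.line                   # a standard line-chart visualization
        rawConfiguration:
          nrqlQueries:
            - accountIds: [0]            # conventional placeholder replaced at import time
              query: "FROM Metric SELECT count(*) TIMESERIES"   # placeholder NRQL, not a real widget query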