
Conversation

@osullivandonal
Contributor

@osullivandonal osullivandonal commented Jan 21, 2026

Description

This PR adds the scraperID to the error logs for scrapers (metrics, logs, and profiles). Why is this important?

Currently, if an error occurs in any of the scrapers under hostmetricsreceiver/internal/scraper/..., there is no information in the logs about which scraper errored:

2026-01-21T10:42:42.830Z	error	scraperhelper@v0.144.0/obs_metrics.go:61	Error scraping metrics	{"resource": {"service.instance.id": "63071771-056d-40ca-83e3-585dc1a00d57", "service.name": "otelcontribcol", "service.version": "0.144.0-dev"}, "otelcol.component.id": "hostmetrics", "otelcol.component.kind": "receiver", "otelcol.signal": "metrics", "error": "forced test error for scraper identification"}

The update in this PR adds the scraper ID to the logs (in this case the memory scraper errored), which gives us:

2026-01-21T10:23:52.759Z	error	scraperhelper/obs_metrics.go:62	Error scraping metrics	{"resource": {"service.instance.id": "7137f19d-706f-4095-8367-0d68b2732402", "service.name": "otelcontribcol", "service.version": "0.144.0-dev"}, "otelcol.component.id": "hostmetrics", "otelcol.component.kind": "receiver", "otelcol.signal": "metrics", "scraper": "memory", "error": "forced test error for scraper identification"}
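Conceptually, the change boils down to attaching the scraper's component ID as a structured field when a scrape error is logged. A minimal sketch of that pattern (illustrative only, not the exact diff; logger and id stand in for what the helper already has in scope, imports omitted):

if err != nil {
	// Tag the error log with the scraper's component ID, e.g. "memory" or "cpu".
	logger.Error("Error scraping metrics",
		zap.String("scraper", id.String()),
		zap.Error(err),
	)
}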

Link to tracking issue

Fixes open-telemetry/opentelemetry-collector-contrib#35814.

Testing

Metrics scrapers: Locally tested with collector-contrib by updating the mod file; see the sketch and detailed log output below.
Logs scrapers: The same pattern was applied. The logs scrapers (MySQL, PostgreSQL, Oracle, and SQL Server receivers) require database connections for integration testing.
Profiles scrapers (xscraperhelper): The same pattern was applied. Note: there are currently no profiles scrapers in collector-contrib to test against.
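Forcing an error for this kind of local test can be as simple as making a scraper's scrape function return an error. A hypothetical sketch (the receiver type name is made up; the signature matches scraper.ScrapeMetricsFunc, imports omitted):

func (s *memoryScraper) scrape(_ context.Context) (pmetric.Metrics, error) {
	// Always fail so that this scraper shows up in the error logs.
	return pmetric.NewMetrics(), errors.New("forced test error for scraper identification")
}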

Metrics scraper local test: this change was pulled into collector-contrib by updating the mod file, and an error was forced in both the memory and cpu scrapers under hostmetricsreceiver/internal/scraper/.... The logs now show which scraper component had the error:

2026-01-21T11:51:34.176Z	error	scraperhelper/obs_metrics.go:61	Error scraping metrics	{"resource": {"service.instance.id": "e6832705-bc54-43dd-9b1f-0c06d58b7ead", "service.name": "otelcontribcol", "service.version": "0.144.0-dev"}, "otelcol.component.id": "hostmetrics", "otelcol.component.kind": "receiver", "otelcol.signal": "metrics", "scraper": "memory", "error": "forced test error for scraper identification"}
go.opentelemetry.io/collector/scraper/scraperhelper.wrapObsMetrics.func1
	/home/dos/Documents/opentelemetry-collector/scraper/scraperhelper/obs_metrics.go:61
go.opentelemetry.io/collector/scraper.ScrapeMetricsFunc.ScrapeMetrics
	/home/dos/go/pkg/mod/go.opentelemetry.io/collector/scraper@v0.144.0/metrics.go:24
go.opentelemetry.io/collector/scraper/scraperhelper.scrapeMetrics
	/home/dos/Documents/opentelemetry-collector/scraper/scraperhelper/controller.go:167
go.opentelemetry.io/collector/scraper/scraperhelper.NewMetricsController.func1
	/home/dos/Documents/opentelemetry-collector/scraper/scraperhelper/controller.go:139
go.opentelemetry.io/collector/scraper/scraperhelper/internal/controller.(*Controller[...]).startScraping.func1
	/home/dos/Documents/opentelemetry-collector/scraper/scraperhelper/internal/controller/controller.go:118
2026-01-21T11:51:34.177Z	error	scraperhelper/obs_metrics.go:61	Error scraping metrics	{"resource": {"service.instance.id": "e6832705-bc54-43dd-9b1f-0c06d58b7ead", "service.name": "otelcontribcol", "service.version": "0.144.0-dev"}, "otelcol.component.id": "hostmetrics", "otelcol.component.kind": "receiver", "otelcol.signal": "metrics", "scraper": "cpu", "error": "forced test error for scraper identification"}
go.opentelemetry.io/collector/scraper/scraperhelper.wrapObsMetrics.func1
	/home/dos/Documents/opentelemetry-collector/scraper/scraperhelper/obs_metrics.go:61
go.opentelemetry.io/collector/scraper.ScrapeMetricsFunc.ScrapeMetrics
	/home/dos/go/pkg/mod/go.opentelemetry.io/collector/scraper@v0.144.0/metrics.go:24
go.opentelemetry.io/collector/scraper/scraperhelper.scrapeMetrics
	/home/dos/Documents/opentelemetry-collector/scraper/scraperhelper/controller.go:167
go.opentelemetry.io/collector/scraper/scraperhelper.NewMetricsController.func1
	/home/dos/Documents/opentelemetry-collector/scraper/scraperhelper/controller.go:139
go.opentelemetry.io/collector/scraper/scraperhelper/internal/controller.(*Controller[...]).startScraping.func1
	/home/dos/Documents/opentelemetry-collector/scraper/scraperhelper/internal/controller/controller.go:118
2026-01-21T11:51:34.199Z	info	Metrics	{"resource": {"service.instance.id": "e6832705-bc54-43dd-9b1f-0c06d58b7ead", "service.name": "otelcontribcol", "service.version": "0.144.0-dev"}, "otelcol.component.id": "debug", "otelcol.component.kind": "exporter", "otelcol.signal": "metrics", "resource metrics": 1, "metrics": 1, "data points": 1}
2026-01-21T11:51:34.199Z	info	ResourceMetrics #0
Resource SchemaURL: https://opentelemetry.io/schemas/1.9.0

Closes open-telemetry/opentelemetry-collector-contrib#35814

This supplements the error logs for scrapers with the ID of the scraper that
failed. Currently, when a scraper experiences an error, there is no
information in the error logs telling the user which scraper errored.
This fix adds the scraper ID; for example, if the memory scraper errors,
the logs now include "scraper": "memory".
The scraperID is also provided to the obs logs and the xscraperhelper
obs profiles; tests have been added as well.
@osullivandonal osullivandonal requested a review from a team as a code owner January 21, 2026 14:28
@codecov

codecov bot commented Jan 21, 2026

Codecov Report

✅ All modified and coverable lines are covered by tests.
✅ Project coverage is 91.84%. Comparing base (55399d4) to head (abc7c74).
⚠️ Report is 8 commits behind head on main.

Additional details and impacted files
@@            Coverage Diff             @@
##             main   #14461      +/-   ##
==========================================
- Coverage   91.85%   91.84%   -0.01%     
==========================================
  Files         677      677              
  Lines       42680    42683       +3     
==========================================
  Hits        39203    39203              
- Misses       2423     2425       +2     
- Partials     1054     1055       +1     


@dmathieu
Member

I wondered whether we could have a shared context, so we don't have to repeat the attribute every time.
But to do that, we'd have to pass a logger around, or duplicate TelemetryConfig, neither of which seem very good.

@codspeed-hq

codspeed-hq bot commented Jan 21, 2026

Merging this PR will not alter performance

⚠️ Unknown Walltime execution environment detected

Using the Walltime instrument on standard Hosted Runners will lead to inconsistent data.

For the most accurate results, we recommend using CodSpeed Macro Runners: bare-metal machines fine-tuned for performance measurement consistency.

✅ 61 untouched benchmarks
⏩ 20 skipped benchmarks¹


Comparing osullivandonal:add-scraperID-to-error-logs (abc7c74) with main (55399d4)


Footnotes

  1. 20 benchmarks were skipped, so the baseline results were used instead. If they were deleted from the codebase, click here and archive them to remove them from the performance reports.

@osullivandonal
Contributor Author

I wondered whether we could have a shared context, so we don't have to repeat the attribute every time. But to do that, we'd have to pass a logger around, or duplicate TelemetryConfig, neither of which seem very good.

We could also set the logger with scraperID in GetSettings in scraper/internal/controller/controller.go

Something like:

func GetSettings(sType component.Type, rSet receiver.Settings) scraper.Settings {
	id := component.NewID(sType)
	telemetry := rSet.TelemetrySettings
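	// Attach the scraper ID as a log field so every log emitted through these settings carries it.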
	telemetry.Logger = telemetry.Logger.With(zap.String("scraper", id.String()))
	return scraper.Settings{
		ID:                id,
		TelemetrySettings: telemetry,
		BuildInfo:         rSet.BuildInfo,
	}
}

Pros:

  • Single place to set scraperID in logs
  • All the logs for the scraper will get the scraperID
  • Can remove changes from obs_metrics.go, obs_logs.go, and obs_profiles.go.

Cons:

  • All logs will get the scraper field, not just the error logs.
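For what it's worth, a unit test for this could look roughly like the sketch below. It is hypothetical: it assumes zap's observer package (go.uber.org/zap/zaptest/observer) and testify, reuses the field names from the snippet above, and would live next to GetSettings in the internal controller package (imports omitted):

func TestGetSettingsAddsScraperField(t *testing.T) {
	// Capture log output in memory so the injected field can be inspected.
	core, observed := observer.New(zapcore.DebugLevel)
	rSet := receiver.Settings{
		TelemetrySettings: component.TelemetrySettings{Logger: zap.New(core)},
	}

	set := GetSettings(component.MustNewType("memory"), rSet)
	set.TelemetrySettings.Logger.Error("forced test error")

	// The logger handed out via the scraper settings should carry the scraper ID.
	entries := observed.All()
	require.Len(t, entries, 1)
	require.Equal(t, "memory", entries[0].ContextMap()["scraper"])
}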

@jade-guiton-dd
Contributor

I don't really see a downside to tagging non-error logs as well. Adding the tag in GetSettings makes sense to me.

This allows us to inject the scraperID into all logs for the scrapers
(metrics, logs, and profiles). We can make the change once and not have
to do it in multiple places. Tests have also been updated.
@osullivandonal
Contributor Author

I don't really see a downside to tagging non-error logs as well. Adding the tag in GetSettings makes sense to me.

Sounds good. I have updated GetSettings to inject the scraperID and removed the duplication from obs_metrics, obs_logs, and obs_profiles. Retested locally and it looks good:

e.name": "otelcontribcol", "service.version": "0.144.0-dev"}, "otelcol.component.id": "hostmetrics", "otelcol.component.kind": "receiver", "otelcol.signal": "metrics", "scraper": "cpu", "error": "forced test error for scraper identification"}
go.opentelemetry.io/collector/scraper/scraperhelper.wrapObsMetrics.func1
	/home/dos/Documents/opentelemetry-collector/scraper/scraperhelper/obs_metrics.go:61
go.opentelemetry.io/collector/scraper.ScrapeMetricsFunc.ScrapeMetrics
	/home/dos/go/pkg/mod/go.opentelemetry.io/collector/scraper@v0.144.0/metrics.go:24
go.opentelemetry.io/collector/scraper/scraperhelper.scrapeMetrics
	/home/dos/Documents/opentelemetry-collector/scraper/scraperhelper/controller.go:167
go.opentelemetry.io/collector/scraper/scraperhelper.NewMetricsController.func1
	/home/dos/Documents/opentelemetry-collector/scraper/scraperhelper/controller.go:139
go.opentelemetry.io/collector/scraper/scraperhelper/internal/controller.(*Controller[...]).startScraping.func1
	/home/dos/Documents/opentelemetry-collector/scraper/scraperhelper/internal/controller/controller.go:115
2026-01-22T09:29:42.659Z	error	scraperhelper/obs_metrics.go:61	Error scraping metrics	{"resource": {"service.instance.id": "4cff5cdf-e7de-47ae-9286-2f0eba636655", "service.name": "otelcontribcol", "service.version": "0.144.0-dev"}, "otelcol.component.id": "hostmetrics", "otelcol.component.kind": "receiver", "otelcol.signal": "metrics", "scraper": "memory", "error": "forced test error for scraper identification"}
go.opentelemetry.io/collector/scraper/scraperhelper.wrapObsMetrics.func1
	/home/dos/Documents/opentelemetry-collector/scraper/scraperhelper/obs_metrics.go:61
go.opentelemetry.io/collector/scraper.ScrapeMetricsFunc.ScrapeMetrics
	/home/dos/go/pkg/mod/go.opentelemetry.io/collector/scraper@v0.144.0/metrics.go:24
go.opentelemetry.io/collector/scraper/scraperhelper.scrapeMetrics
	/home/dos/Documents/opentelemetry-collector/scraper/scraperhelper/controller.go:167
go.opentelemetry.io/collector/scraper/scraperhelper.NewMetricsController.func1
	/home/dos/Documents/opentelemetry-collector/scraper/scraperhelper/controller.go:139
go.opentelemetry.io/collector/scraper/scraperhelper/internal/controller.(*Controller[...]).startScraping.func1
	/home/dos/Documents/opentelemetry-collector/scraper/scraperhelper/internal/controller/controller.go:115

@jade-guiton-dd jade-guiton-dd added the ready-to-merge Code review completed; ready to merge by maintainers label Jan 22, 2026
@mx-psi mx-psi added this pull request to the merge queue Jan 26, 2026
Merged via the queue into open-telemetry:main with commit 85daf49 Jan 26, 2026
62 checks passed
@osullivandonal osullivandonal deleted the add-scraperID-to-error-logs branch January 27, 2026 10:10

Development

Successfully merging this pull request may close these issues.

[receiver/hostmetrics] should report scraper that failed in case of error

4 participants