Component(s)
receiver/journald
What happened?
Description
When using the journald receiver in opentelemetry-collector-contrib 0.120.0, every message has detected_status = unknown when sent to Loki and viewed in Grafana. After upgrading to 0.121.0, every message is marked detected_status = error instead. The severity levels are correct at the source (verified with dmesg -x). The issue occurs when reading messages from dmesg, and it is unclear how dmesg priorities should be interpreted in OpenTelemetry.
Steps to Reproduce
- Configure OpenTelemetry Collector with the journald receiver.
- Write a test message to the kernel ring buffer (which dmesg reads); the <3> prefix marks it as priority err, and writing to /dev/kmsg requires root:
echo "<3> [Error] $(date)" | sudo tee /dev/kmsg
- Verify logs in Loki/Grafana.
- Observe that in version 0.120.0, logs have detected_status = unknown, while in 0.121.0, they have detected_status = error.
Expected Result
- The OpenTelemetry Collector should correctly interpret the priority of messages coming from dmesg and forward them with their severity levels preserved.
- The detected status should match the system's actual log severity levels and map correctly to the severities Grafana expects.
- Example mapping from dmesg severities to Grafana (see the config sketch after this list):
- <0-1> (Emergency, Alert) → critical
- <2-3> (Critical, Error) → error
- <4> (Warning) → warning
- <5-6> (Notice, Info) → info
- <7> (Debug) → debug
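One way to express this mapping in collector configuration (a hedged sketch, not a confirmed fix: it uses the stanza severity_parser operator, which adapter-based receivers such as journald accept through an operators list, and assumes the journald PRIORITY field is available in the record body):

operators:
  - type: severity_parser
    parse_from: body.PRIORITY  # assumption: journald fields land in the record body
    mapping:
      fatal: ["0", "1"]  # Emergency, Alert
      error: ["2", "3"]  # Critical, Error
      warn: "4"          # Warning
      info: ["5", "6"]   # Notice, Info
      debug: "7"         # Debug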
Actual Result
In OpenTelemetry Collector 0.120.0: All messages have detected_status = unknown.
In OpenTelemetry Collector 0.121.0: All messages have detected_status = error.
Checking the logs with dmesg -x on the system shows the correct severity levels.
Collector version
0.121.0
Environment information
Environment
OS: Fedora 41, Ubuntu 24.04, Rocky Linux 9
Compiler (if manually compiled): n/a, official release binaries
OpenTelemetry Collector configuration
receivers:
  journald:
    directory: /var/log/journal
    dmesg: true
    start_at: end
exporters:
  loki:
    endpoint: http://loki:3100/loki/api/v1/push
service:
  pipelines:
    logs:
      receivers: [journald]
      exporters: [loki]
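For reference, a possible interim workaround (an untested sketch under the same assumptions as the operator snippet above) is to attach the severity mapping directly to the receiver, so records carry explicit severities instead of relying on downstream detection:

receivers:
  journald:
    directory: /var/log/journal
    dmesg: true
    start_at: end
    operators:
      - type: severity_parser
        parse_from: body.PRIORITY  # assumption: PRIORITY is present in the record body
        mapping:
          fatal: ["0", "1"]
          error: ["2", "3"]
          warn: "4"
          info: ["5", "6"]
          debug: "7"
# exporters and service sections as above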
Log output
Mar 7 12:30:12 myhostname systemd[1]: Started otelcol-contrib.service - OpenTelemetry Collector Contrib.
Mar 7 12:30:12 myhostname dbus-daemon[48863]: [session uid=977 pid=48859] SELinux support is enabled
Mar 7 12:30:12 myhostname otelcol-contrib[48784]: 2025-03-10T12:30:12.690-0400 info service@v0.121.0/service.go:193 Setting up own telemetry...
Mar 7 12:30:12 myhostname otelcol-contrib[48784]: 2025-03-10T12:30:12.690-0400 warn service@v0.121.0/service.go:241 service::telemetry::metrics::address is being deprecated in favor of service::telemetry::metrics::readers
Mar 7 12:30:12 myhostname otelcol-contrib[48784]: 2025-03-10T12:30:12.690-0400 info builders/builders.go:26 Deprecated component. Will be removed in future releases. {"otelcol.component.id": "loki", "otelcol.component.kind": "Exporter", "otelcol.signal": "logs"}
Mar 7 12:30:12 myhostname otelcol-contrib[48784]: 2025-03-10T12:30:12.690-0400 info lokiexporter@v0.121.0/exporter.go:43 using the new Loki exporter {"otelcol.component.id": "loki", "otelcol.component.kind": "Exporter", "otelcol.signal": "logs"}
Mar 7 12:30:12 myhostname otelcol-contrib[48784]: 2025-03-10T12:30:12.691-0400 info service@v0.121.0/service.go:258 Starting otelcol-contrib... {"Version": "0.121.0", "NumCPU": 32}
Mar 7 12:30:12 myhostname otelcol-contrib[48784]: 2025-03-10T12:30:12.691-0400 info extensions/extensions.go:40 Starting extensions...
Mar 7 12:30:12 myhostname otelcol-contrib[48784]: 2025-03-10T12:30:12.691-0400 info adapter/receiver.go:41 Starting stanza receiver {"otelcol.component.id": "journald", "otelcol.component.kind": "Receiver", "otelcol.signal": "logs"}
Mar 7 12:30:13 myhostname otelcol-contrib[48784]: 2025-03-10T12:30:13.692-0400 info service@v0.121.0/service.go:281 Everything is ready. Begin running and processing data.
Additional context
No response