Conversation

@swamisriman (Contributor) commented Dec 3, 2025

What was changed?

Updated the data types of the fixed64 and uint64 proto fields in the (Proto)JSON example to strings.

Modified fields

uint64 fields:

ExponentialHistogramDataPoint.Buckets.bucket_counts
HistogramDataPoint.bucket_counts

fixed64 fields:

ExponentialHistogramDataPoint.count
ExponentialHistogramDataPoint.zero_count
HistogramDataPoint.count

Why?

Per specification.md - JSON Protobuf Encoding:

"JSON Protobuf encoded payloads use proto3 standard defined JSON Mapping for mapping between Protobuf and JSON."

Per the ProtoJSON format (representation of each type), fixed64 and uint64 proto fields are represented as strings in ProtoJSON.

Justification

  • While the Protobuf spec (linked above) says "either numbers or strings are accepted", pmetric.JSONMarshaler.MarshalMetrics() (ref) always generates JSON with these fields as strings, which can cause confusion. Keeping the example JSON with string values means it can be used as input and also compared directly against the marshaler's output.
  • Some fixed64 fields, such as start_time_unix_nano and time_unix_nano, are already represented as strings, so the notation within this example is internally inconsistent. This only adds to the confusion.
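As background for why ProtoJSON maps 64-bit integers to decimal strings at all, here is a minimal Python sketch (not part of any OTLP library) of the underlying precision problem: JSON numbers are commonly parsed as IEEE-754 doubles, which are only exact up to 2**53, so large uint64/fixed64 counts can silently lose precision when passed as bare numbers.

```python
# Sketch: why 64-bit counters are safer as JSON strings.
# Many JSON consumers (e.g. JavaScript) parse numbers as IEEE-754
# doubles, which represent integers exactly only up to 2**53.

count = 2**63 - 25  # a uint64 value near the top of the range

# Round-tripping through a double, as a double-based JSON parser would:
as_double = float(count)
print(int(as_double) == count)      # False: the value was rounded

# Values at or below 2**53 survive the round trip intact:
small = 2**53
print(int(float(small)) == small)   # True
```

Encoding such fields as decimal strings sidesteps the double round-trip entirely, which is why the proto3 JSON mapping specifies strings for int64/uint64/fixed64.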

Testing

The data points were sent to an OTel Collector configured to receive the metrics and log them via the debug exporter.

Payload

{
  "resourceMetrics": [
    {
      "resource": {
        "attributes": [
          {
            "key": "service.name",
            "value": {
              "stringValue": "my.service"
            }
          }
        ]
      },
      "scopeMetrics": [
        {
          "scope": {
            "name": "my.library",
            "version": "1.0.0",
            "attributes": [
              {
                "key": "my.scope.attribute",
                "value": {
                  "stringValue": "some scope attribute"
                }
              }
            ]
          },
          "metrics": [
            {
              "name": "my.histogram",
              "unit": "1",
              "description": "I am a Histogram",
              "histogram": {
                "aggregationTemporality": 1,
                "dataPoints": [
                  {
                    "startTimeUnixNano": "1544712660300000000",
                    "timeUnixNano": "1544712660300000000",
                    "count": "2",
                    "sum": 2,
                    "bucketCounts": ["1","1"],
                    "explicitBounds": [1],
                    "min": 0,
                    "max": 2,
                    "attributes": [
                      {
                        "key": "my.histogram.attr",
                        "value": {
                          "stringValue": "some value"
                        }
                      }
                    ]
                  }
                ]
              }
            },
            {
              "name": "my.exponential.histogram",
              "unit": "1",
              "description": "I am an Exponential Histogram",
              "exponentialHistogram": {
                "aggregationTemporality": 1,
                "dataPoints": [
                  {
                    "startTimeUnixNano": "1544712660300000000",
                    "timeUnixNano": "1544712660300000000",
                    "count": "3",
                    "sum": 10,
                    "scale": 0,
                    "zeroCount": "1",
                    "positive": {
                      "offset": 1,
                      "bucketCounts": ["0","2"]
                    },
                    "min": 0,
                    "max": 5,
                    "zeroThreshold": 0,
                    "attributes": [
                      {
                        "key": "my.exponential.histogram.attr",
                        "value": {
                          "stringValue": "some value"
                        }
                      }
                    ]
                  }
                ]
              }
            }
          ]
        }
      ]
    }
  ]
}

Log

The Collector reads the fields as expected, and the log output is as follows.

2025-12-04T02:14:08.797+0530    info    ResourceMetrics #0
Resource SchemaURL:
Resource attributes:
     -> service.name: Str(my.service)
ScopeMetrics #0
ScopeMetrics SchemaURL:
InstrumentationScope my.library 1.0.0
InstrumentationScope attributes:
     -> my.scope.attribute: Str(some scope attribute)
Metric #0
Descriptor:
     -> Name: my.histogram
     -> Description: I am a Histogram
     -> Unit: 1
     -> DataType: Histogram
     -> AggregationTemporality: Delta
HistogramDataPoints #0
Data point attributes:
     -> my.histogram.attr: Str(some value)
StartTimestamp: 2018-12-13 14:51:00.3 +0000 UTC
Timestamp: 2018-12-13 14:51:00.3 +0000 UTC
Count: 2
Sum: 2.000000
Min: 0.000000
Max: 2.000000
ExplicitBounds #0: 1.000000
Buckets #0, Count: 1
Buckets #1, Count: 1
Metric #1
Descriptor:
     -> Name: my.exponential.histogram
     -> Description: I am an Exponential Histogram
     -> Unit: 1
     -> DataType: ExponentialHistogram
     -> AggregationTemporality: Delta
ExponentialHistogramDataPoints #0
Data point attributes:
     -> my.exponential.histogram.attr: Str(some value)
StartTimestamp: 2018-12-13 14:51:00.3 +0000 UTC
Timestamp: 2018-12-13 14:51:00.3 +0000 UTC
Count: 3
Sum: 10.000000
Min: 0.000000
Max: 5.000000
Bucket [0, 0], Count: 1
Bucket (2.000000, 4.000000], Count: 0
Bucket (4.000000, 8.000000], Count: 2
        {"resource": {"service.instance.id": "7d166739-8e2e-449e-a98b-5c8be937b4ba", "service.name": "otelcol-contrib", "service.version": "0.141.0"}, "otelcol.component.id": "debug", "otelcol.component.kind": "exporter", "otelcol.signal": "metrics"}

@tigrannajaryan (Member)

The Protobuf spec says:

"Either numbers or strings are accepted."

This looks like an unnecessary change to me, unless I am misreading the spec.

@swamisriman (Contributor, Author) commented Dec 4, 2025

Hi @tigrannajaryan, yes. While both are accepted, pmetric.JSONMarshaler.MarshalMetrics() generates JSON with these fields as strings only, which is leading to some confusion. E.g.: open-telemetry/opentelemetry-collector#10457

Looking at this example JSON, one would assume it will match the output of the above function.

@dmathieu (Member) commented Dec 4, 2025

Representing counts as strings seems like an issue that needs fixing in the collector, not in this example.

@swamisriman (Contributor, Author) commented Dec 4, 2025

@dmathieu
Please see specification.md and the "Why?" section in this PR description.

@dmathieu (Member) commented Dec 4, 2025

Sorry, I was wrong; there is no issue in the collector. JSON can't reliably handle int64s, as some serializers wouldn't be able to parse them properly.
Passing int64s as strings is therefore the proper practice, as documented in the Proto-to-JSON specification:
https://protobuf.dev/programming-guides/json/

The current example is still valid, though, since the proto specification here accepts both strings and integers. If you intend to ingest OTLP, you will therefore need to be able to receive both (the collector is not the only emitter of OTLP data).
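To illustrate the "receive both" point, a minimal Python sketch of a tolerant ingester: the `parse_uint64` helper below is hypothetical (not part of any OTLP library) and simply accepts a 64-bit count field in either of the two JSON forms the proto3 mapping allows.

```python
def parse_uint64(value):
    """Accept a uint64 field as either a JSON number or a decimal string,
    per the proto3 JSON mapping ("either numbers or strings are accepted")."""
    if isinstance(value, bool):
        # bool is a subclass of int in Python; reject it explicitly.
        raise TypeError("booleans are not valid uint64 values")
    if isinstance(value, int):
        n = value
    elif isinstance(value, str):
        n = int(value, 10)  # decimal string form, as pdata emits
    else:
        raise TypeError(f"unsupported type: {type(value).__name__}")
    if not 0 <= n < 2**64:
        raise ValueError("value out of uint64 range")
    return n

print(parse_uint64("2"))  # 2  (string form, as the collector's marshaler emits)
print(parse_uint64(2))    # 2  (number form, as some SDKs may emit)
```

A receiver written this way works with the string-valued example and with emitters that send plain numbers.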

@swamisriman (Contributor, Author)

@tigrannajaryan @dmathieu
I added a "Justification" section to the PR description explaining the need for this change; please take a look.

@dmathieu (Member) commented Dec 4, 2025

Yes, but your comment assumes pdata/the collector is the only tool that generates OTLP.
Other implementations (such as the SDKs) may emit ints, not strings, so you need to assume both can happen; the current example is correct.

@swamisriman (Contributor, Author) commented Dec 4, 2025

Shouldn't any SDK follow the proto spec and output strings only?
While it says either numbers or strings are "accepted", it also says the JSON value "will" be a decimal string, suggesting that the output is always a string.

@swamisriman (Contributor, Author)

Hi @tigrannajaryan, @dmathieu
Please take a look at the "Justification" section I added and let me know what you think about this change.
