Description
I'm working on using https://github.com/json-schema-org/json-schema-spec to validate SigMF metadata.
I'm finding the schema-dictated maximum value for several properties to be very problematic, and I wonder whether there's a good rationale for the current value:
18446744073709552000 = 2^64 + 384
I would like to suggest that the value 2**63-1 be used instead, as this makes it much easier for libraries like json-schema-spec to implement such a constraint.
This would cause a breakage for all those files out there with values > 9223372036854775807, it's true. But, given the number of elementary particles in the universe, I don't think that's going to be a problem in practice.
It's one thing to enforce such a limit in Python; it's something else to do so in generic C++. The implementation of json-schema-spec uses a long to hold such a constraint, while the JSON element being validated is of type unsigned int (that's a nlohmann/json thing).
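To make the C++ difficulty concrete, here's a minimal sketch (my own illustration, not the validator's actual code) of how the two candidate bounds land in nlohmann/json's number types. The schema fragments are hypothetical and trimmed down to just the `maximum` keyword:

```cpp
#include <cstdint>
#include <iostream>
#include <nlohmann/json.hpp>

int main() {
    // Hypothetical, trimmed-down schema fragments -- just the "maximum" keyword.
    auto current  = nlohmann::json::parse(R"({"maximum": 18446744073709552000})");
    auto proposed = nlohmann::json::parse(R"({"maximum": 9223372036854775807})");

    // 18446744073709552000 exceeds the uint64_t range (2^64 - 1), so nlohmann/json
    // falls back to storing it as a double; comparing it exactly against an
    // unsigned-integer metadata value means going through floating point.
    std::cout << std::boolalpha
              << current["maximum"].is_number_float() << "\n";      // true

    // 2^63 - 1 fits a signed 64-bit integer (a long on LP64 platforms), so a
    // validator that keeps its bound in a long can compare exactly.
    std::cout << proposed["maximum"].is_number_integer() << "\n";   // true
    std::cout << proposed["maximum"].get<std::int64_t>() << "\n";   // 9223372036854775807
}
```

That double fallback is exactly what makes the current bound awkward to enforce from C++.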
What is the origin of 18446744073709552000 anyway?
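My guess, for what it's worth: the number looks like 2^64 after a round trip through an IEEE-754 double. 2^64 is a power of two and therefore exactly representable, but the shortest decimal rendering that round-trips back to that double (which is what JavaScript, for instance, prints by default) is 1.8446744073709552e19, i.e. 2^64 + 384. A quick illustrative sketch:

```cpp
#include <cstdio>

int main() {
    // 2^64 is exactly representable as an IEEE-754 double (it's a power of two).
    double two_to_64 = 18446744073709551616.0;

    // Printed with 17 significant digits (a double's round-trip precision), the
    // low decimal digits are rounded away: 1.8446744073709552e+19, which written
    // out is 18446744073709552000 -- the schema's current maximum.
    std::printf("%.17g\n", two_to_64);

    // The exact value of the double, for comparison: 18446744073709551616.
    std::printf("%.0f\n", two_to_64);
}
```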