There are scenarios where UCAN tokens are almost valid ("validish") and can reasonably be interpreted as valid. Canonicalizing such tokens (`x = serialize(deserialize(x))`) is at odds with emitting only valid tokens while being lenient in what we parse.
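For concreteness, the tension can be sketched as follows. This is a minimal Python sketch, not a real UCAN library: `lenient_decode` and `strict_encode` are hypothetical helpers, and treating `"exp": null` as equivalent to an omitted `exp` is an assumed normalization.

```python
import json

# Minimal sketch (hypothetical helpers, not an existing UCAN API): a lenient
# decoder that accepts validish payloads and normalizes them, paired with a
# strict, deterministic encoder.
def lenient_decode(raw: str) -> dict:
    payload = json.loads(raw)
    if payload.get("prf") == []:
        del payload["prf"]  # spec: an empty "prf" MUST be omitted
    if "exp" in payload and payload["exp"] is None:
        del payload["exp"]  # assumed normalization: null expiry ~ omitted expiry
    return payload

def strict_encode(payload: dict) -> str:
    # deterministic encoding: sorted keys, no extra whitespace
    return json.dumps(payload, separators=(",", ":"), sort_keys=True)

validish = '{"exp":null,"iss":"did:key:zAlice","prf":[]}'
# The round trip changes the bytes, so x == serialize(deserialize(x)) fails:
assert strict_encode(lenient_decode(validish)) != validish
```

The point is that any normalization during decode makes the strict re-encoding diverge from the original bytes, which is exactly the canonicalization property being broken.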
Examples of validish tokens:

- `"prf": []` (which "MUST be omitted")
- `exp` re-encoded as `"exp": null` (feat: Allow nullable expiry per 0.9.0 spec. Fixes #23; #95 will currently trigger this scenario if given a validish token without extra handling)

Some options forward:

- Strictly reject validish tokens.
- Allow validish tokens; retain canonicalization and re-encode them as validish tokens.
- Allow validish tokens; encode them as valid tokens, breaking canonicalization.