Description
| | |
|------------------|-----------------|
|Previous ID | SR-5353 |
|Radar | None |
|Original Reporter | vlas (JIRA User) |
|Type | Bug |
Environment
Apple Swift version 4.0 (swiftlang-900.0.45.6 clang-900.0.26)
Apple Swift version 4.0-dev (LLVM 1a69f372ee, Clang 17f727b492, Swift a8ba0772cd)
Additional Detail from JIRA
| | |
|------------------|-----------------|
|Votes | 0 |
|Component/s | Compiler, Standard Library |
|Labels | Bug, Foundation |
|Assignee | None |
|Priority | Medium |
md5: 9c092cf213be7d5b64c0a4aaa961ffeb
Issue Description:
Attempting to conditionally cast some numbers larger than `90071992547409930` (`0x0C7FFFFFFFFFDFB2`) stored in `NSDecimalNumber` to `Int64` using `as?` fails (results in `nil`). Experimentally, all such numbers are multiples of 10, but not every multiple of 10 fails – only some of them, e.g.:
- 90071992547409930
- 90071992547409970
- 90071992547409990
- 90071992547410010
- 90071992547410030
- 90071992547410070
- 90071992547410090
- 90071992547410130
- 90071992547410150
- etc.
To exercise the issue, simply create a decimal number and cast it to `Int64`: `NSDecimalNumber(value: largeNumber) as? Int64`. Note that this issue does not occur if a plain `NSNumber` is used instead. This is a regression from Swift 3, where such numbers were cast correctly.
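A minimal reproduction sketch, using one of the failing values listed above (the variable names are illustrative):

```swift
import Foundation

// One of the values reported above to fail when cast back to Int64.
let largeNumber: Int64 = 90071992547409930

// Via NSDecimalNumber the conditional cast yields nil on Swift 4:
let decimalNumber = NSDecimalNumber(value: largeNumber)
print(decimalNumber as? Int64 as Any)   // expected Optional(90071992547409930), got nil

// Via a plain NSNumber the same cast succeeds:
let plainNumber = NSNumber(value: largeNumber)
print(plainNumber as? Int64 as Any)     // Optional(90071992547409930)
```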
An example where this behaviour causes problems is decoding JSON using `JSONSerialization` on macOS 10.13 and iOS 11, where (some?) numbers are decoded internally into `NSDecimalNumber`. Casts to `Int` or `Int64` during decoding that previously succeeded now fail.
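For illustration, a sketch of how the `JSONSerialization` case might surface (the JSON payload and the `id` key are made up for this example):

```swift
import Foundation

// A JSON document containing one of the failing values; the key name is illustrative.
let json = "{\"id\": 90071992547409930}".data(using: .utf8)!

let object = try! JSONSerialization.jsonObject(with: json, options: []) as! [String: Any]

// On macOS 10.13 / iOS 11 the decoded number may be backed by NSDecimalNumber,
// so this cast, which previously succeeded, now returns nil.
let id = object["id"] as? Int64
print(id as Any)
```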
Attached is a sample macOS command line project that demonstrates the issue.