
Wouldn't it be better to use arbitrary precision arithmetic instead of floating point arithmetic? #186

@Sl-L

Description

Currently, setting precision to 16 causes operations like 7 / 5 or 13 / 5 to show the inaccuracies of floating point arithmetic.
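
For context, here is a minimal sketch in plain Python (not this project's code) of where the inaccuracy comes from: a 64-bit binary double cannot store 7/5 or 13/5 exactly, and printing the stored value with enough digits exposes the error a 16-digit display can surface.

```python
# Minimal illustration (plain Python, not the calculator's own code) of the
# representation error behind results like 7 / 5 and 13 / 5: converting the
# stored double to Decimal shows every digit the binary format actually holds.
from decimal import Decimal

for expr, value in (("7 / 5", 7 / 5), ("13 / 5", 13 / 5)):
    print(f"{expr} is stored as {Decimal(value)}")
    # 7 / 5  is stored as 1.399999999999999911182158029987476766109466552734375
    # 13 / 5 is stored as 2.600000000000000088817841970012523233890533447265625
```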
I suggest either capping the maximum precision at 15 decimal digits (which is how Google's calculator works around this) or moving away from floating-point arithmetic to arbitrary-precision arithmetic (which is how the Windows Calculator has handled this since Windows 98).
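
A hedged sketch of both suggestions, again in plain Python; the decimal and fractions modules here merely stand in for whatever arbitrary-precision library this calculator might actually adopt:

```python
# Sketch of the two suggested remedies, using Python's standard library as a
# stand-in for whatever the calculator itself would use.
from decimal import Decimal, getcontext
from fractions import Fraction

# Remedy 1: cap the displayed precision at 15 significant digits. Decimal
# values with at most 15 significant digits round-trip through a 64-bit
# double, so the stored representation error rounds away on output.
print(f"{7 / 5:.15g}")    # 1.4
print(f"{13 / 5:.15g}")   # 2.6

# Remedy 2: compute in decimal (or exact rational) arithmetic rather than
# binary floating point, so 7/5 and 13/5 are exact at any requested precision.
getcontext().prec = 50                  # 50 significant decimal digits
print(Decimal(7) / Decimal(5))          # 1.4
print(Decimal(13) / Decimal(5))         # 2.6
print(Fraction(7, 5), Fraction(13, 5))  # 7/5 13/5
```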

Labels

calculator: Issues, pull requests or discussions concerning the calculator mode. intended: Questions concerning intended behaviour.
