Currently, setting precision to 16 causes operations like 7 / 5 or 13 / 5 to show the inaccuracies of floating point arithmetic.
I suggest either capping the maximum precision at 15 decimal digits (which is how Google's calculator works around this) or moving away from floating-point arithmetic to arbitrary-precision arithmetic (which is how Windows Calculator has handled it since Windows 98).
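
To illustrate both points, here's a minimal Python sketch (Python is only used for demonstration; the `decimal` module stands in for whatever arbitrary-precision library the calculator might actually use):

```python
from decimal import Decimal, getcontext

# Formatting an IEEE 754 double to 16 decimal places exposes the binary
# representation error that 15 places still rounds away.
for num, den in [(7, 5), (13, 5)]:
    q = num / den  # binary double; 1.4 and 2.6 are not exactly representable
    print(f"{num}/{den} at 15 places: {q:.15f}")  # e.g. 1.400000000000000 (looks exact)
    print(f"{num}/{den} at 16 places: {q:.16f}")  # e.g. 1.3999999999999999 (error visible)

# Doing the division in decimal arithmetic instead (here via Python's decimal
# module, purely as a stand-in for an arbitrary-precision approach) keeps
# results like these exact at any display precision.
getcontext().prec = 50  # generous working precision
for num, den in [(7, 5), (13, 5)]:
    print(f"{num}/{den} as Decimal:   {Decimal(num) / Decimal(den)}")  # prints 1.4, then 2.6
```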