I'm trying to better understand why large numbers with potentially high precision are handled inconsistently in JavaScript, specifically in its localization facilities (e.g. ECMA-402/Intl). I assume this has to do with the use of floating-point numbers, but I'd like to understand where the limits are and how to avoid these pitfalls.
For example, using Intl.NumberFormat:
console.log(new Intl.NumberFormat('en-US', {
  minimumFractionDigits: 3,
  maximumFractionDigits: 3
}).format(9999999999990.001)); // logs 9,999,999,999,990.000
let test1 = 9999999999990.001;
console.log(test1); // logs 9999999999990.002
How can I figure out where these numbers start to become inconsistent? Is there some kind of limit? Does that limit change as I increase the decimal precision, e.g.:
let test2 = 9999999999990.0004;
console.log(test2); // logs 9999999999990
Yes, and yes. JavaScript numbers are IEEE 754 double-precision floats stored in 64 bits, which limits the precision they can represent: integers are exact only up to Number.MAX_SAFE_INTEGER (2^53 - 1 = 9007199254740991), and fractional values are rounded to the nearest representable double, with the gap between adjacent doubles growing as the magnitude grows. See this answer for more information.
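For instance (a quick sketch using only standard language constants):

// 2^53 - 1 is the largest integer JavaScript represents exactly:
console.log(Number.MAX_SAFE_INTEGER); // 9007199254740991
// Beyond that, adjacent integers collapse onto the same double:
console.log(9007199254740992 === 9007199254740993); // true
// Near 1e13 the gap between adjacent doubles is 2^-9 = 0.001953125,
// which is why the .001 in your example rounds away:
console.log(9999999999990.001); // 9999999999990.002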
Pass your numeric literals to a function in the form of strings, and check whether that string, when coerced to a number and back, round-trips to the same literal:
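A minimal sketch (isExactlyRepresentable is a name chosen here for illustration):

// Coerce the string to a number and back; if the round trip reproduces
// the original string, the value is exactly representable as a double.
function isExactlyRepresentable(literal) {
  return String(Number(literal)) === literal;
}

console.log(isExactlyRepresentable('9999999999990.002'));  // true
console.log(isExactlyRepresentable('9999999999990.001'));  // false (rounds to ...990.002)
console.log(isExactlyRepresentable('9999999999990.0004')); // false (rounds to ...990)

Note that the comparison uses the canonical string form, so inputs like '1e3' or '1.50' report false even though the underlying value is exact; normalize such inputs first if that matters.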