I know that JavaScript has both a normal zero `0` (known as positive zero, `+0`) and a negative zero `-0`, but I have never come across a situation where I had to use `-0`.
There are some existing posts on Stack Overflow about how positive and negative zeros are similar/different, but none of them explain real-life use cases/examples of it.
Assume we're studying the function `y = 1/x` and we'd like to know how it behaves when `x` is small. Let's take `x = 1`, `x = 0.1`, `x = 0.01`, and so on, and calculate the function each time.
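Here's a minimal sketch of such a loop, repeatedly dividing `x` by 10 until `1/x` is no longer finite:

```js
let x = 1;
while (isFinite(1 / x)) {
  x /= 10; // x: 1, 0.1, 0.01, ... while 1/x: 1, 10, 100, ...
}
console.log(x);     // 0 (x has underflowed to exactly zero)
console.log(1 / x); // Infinity
```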
As you can see, it approaches positive infinity. `1/x` is equal to `Infinity` because at some point `x` gets so small that it's indistinguishable from `0`, and `1/0 = Infinity`. Note that this is the "positive" `Infinity`, that is, "a very big number".

Now, let's start with `x = -1` instead of `x = 1`:
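Here's the same sketch of a loop, starting from a negative value:

```js
let x = -1;
while (isFinite(1 / x)) {
  x /= 10; // x: -1, -0.1, -0.01, ... while 1/x: -1, -10, -100, ...
}
console.log(x);     // -0 (the underflow preserves the sign; some consoles display it as 0)
console.log(1 / x); // -Infinity
```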
The answer is now `-Infinity`, that is, the function approaches negative Infinity, "a very small number". Of course, this is also correct, but how did the computer get that? We just learned that `1/0` = (positive) `Infinity`. The secret is that the zero in the last snippet is actually negative: `x` on the last iteration is `-0`, not plain `0`, and `1/-0` gives `-Infinity`. Without the signed zero, the last snippet would give an incorrect result.
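You can verify the key facts directly in a console:

```js
console.log(1 / 0);            // Infinity
console.log(1 / -0);           // -Infinity
console.log(0 === -0);         // true: strict equality treats the two zeros as equal
console.log(Object.is(0, -0)); // false: Object.is tells them apart
```

Hope that explains it a bit.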