Consider the following code:
```go
package main

import (
    "fmt"
    "math"
)

func main() {
    var (
        negativeTwoThirds = -2.0 / 3.0
        negativeSix       = -6.0
        negativeThree     = -3.0
    )
    for _, v := range []float64{negativeTwoThirds} {
        four := negativeSix * v // 4 = (-6.0) * (-2/3)
        print1(negativeThree + four)
        print2(negativeThree, four)
    }
}

func print1(c float64) {
    fmt.Printf("%f (%b)\n", c, math.Float64bits(c))
}

func print2(a, b float64) {
    c := a + b
    fmt.Printf("%f (%b)\n", c, math.Float64bits(c))
}
```
I ran the code on a personal Linux machine as well as on the Go Playground (demo), and both times it produced the output I was expecting:
```
1.000000 (11111111110000000000000000000000000000000000000000000000000000)
1.000000 (11111111110000000000000000000000000000000000000000000000000000)
```
I ran the code on two macOS machines and, both times, got a rounding error:
```
1.000000 (11111111101111111111111111111111111111111111111111111111111110)
1.000000 (11111111110000000000000000000000000000000000000000000000000000)
```
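For reference, here is a small standalone snippet (separate from the reproducer, written just for this question) that decodes the two bit patterns back into float64 values; the binary literals are copied verbatim from the outputs above.

```go
package main

import (
    "fmt"
    "math"
)

func main() {
    // Bit patterns copied verbatim from the two outputs above.
    expected := math.Float64frombits(0b11111111110000000000000000000000000000000000000000000000000000)
    macResult := math.Float64frombits(0b11111111101111111111111111111111111111111111111111111111111110)

    fmt.Println(expected)             // the Linux / Playground result: exactly 1
    fmt.Println(macResult)            // the macOS print1 result: slightly below 1
    fmt.Println(expected - macResult) // how far apart the two results are
}
```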
Can you help me make sense of the discrepancies:

- between `print1` and `print2`?
- between different OS / architecture / Go executables?
Note

- I can only trigger the discrepancy in a for loop (a loop-free sketch follows after this list).
- I've used Go version 1.20.x.
- Both MacBooks used for the test have an Apple M1 chip (ARM instruction set); the Linux machine has an 11th Gen Intel® Core™ i7-1165G7 (x86 instruction set).
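This is the kind of loop-free variant I mean (a reconstructed sketch, not necessarily the exact code I tested): the same multiply-add computed directly instead of ranging over a slice.

```go
package main

import (
    "fmt"
    "math"
)

// Loop-free variant (sketch): the same expression as in the reproducer,
// but computed directly rather than inside a range loop.
func main() {
    var (
        negativeTwoThirds = -2.0 / 3.0
        negativeSix       = -6.0
        negativeThree     = -3.0
    )
    four := negativeSix * negativeTwoThirds // 4 = (-6.0) * (-2/3)
    c := negativeThree + four
    fmt.Printf("%f (%b)\n", c, math.Float64bits(c))
}
```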
EDIT: I reported the issue at https://github.com/golang/go/issues/61061