I am studying for an exam that involves C, but I am encountering an error with the variables even when casting them. The code is copied from the PowerPoint slides the professor gave us.
I was going through the Casting section, which has this program:
#include <stdio.h>

int main()
{
    long i;
    int j = 100000;
    i = (long)j * j;
    printf("%li", i);
}
I don't know why, but that still gives me the output "1410065408", which of course means it's wrong. I even tried casting both variables:
i = (long)j * (long)j;
printf("%li", i);
I don't know how to move on from here. Thanks in advance for the help.
The cast only saves the situation in case long happens to be 64 bits, which is often not the case. In case it is 32 bits, the maximum value of a signed long is 2^31 - 1 ≈ 2.147 * 10^9. But 100000 * 100000 = 10 * 10^9, so we get an integer overflow no matter the cast. (In fact, 10^10 wrapped modulo 2^32 is exactly 1410065408, the output you are seeing, which tells us your long is 32 bits.)
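You can check what you get on your platform with a one-liner like this; if it prints 4, long is 32 bits there (as it is on Windows, even in 64-bit builds):
#include <stdio.h>

int main(void)
{
    /* If this prints 4, long is 32 bits and the cast alone cannot help. */
    printf("sizeof(long) = %zu bytes\n", sizeof(long));
}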
The solution is to use the portable int64_t from stdint.h instead:
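A minimal sketch of the fixed program, assuming you also want the matching printf format macro PRId64 from inttypes.h (which itself includes stdint.h):
#include <stdio.h>
#include <inttypes.h>

int main(void)
{
    int j = 100000;
    /* Casting one operand is enough; the other is implicitly converted. */
    int64_t i = (int64_t)j * j;
    printf("%" PRId64 "\n", i);   /* prints 10000000000 */
}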
(It's sufficient to cast only one operand of the *, since the operand of lower "conversion rank" then gets implicitly promoted to the type of the larger one.)