In .NET 6 I am getting an odd result when adding days, and it only happens in one project but not in another.
var time = 3.2986111640930176;
var firstDate = new DateTime(2023, 7, 1);
var newDate = firstDate.AddDays(time);
var ticks = newDate.Ticks;
I expect newDate to be 4 July 2023 07:10, which it is. But when I look at the ticks I get 638240514000050000, whereas I was expecting 638240514000000000 (which is what I get in another project).
I use the following extension method to round the result, which is how I found the difference:
// Rounds dt to the nearest multiple of d (midpoint values round up).
public static DateTime RoundUp(this DateTime dt, TimeSpan d)
{
    return new DateTime(((dt.Ticks + d.Ticks / 2) / d.Ticks) * d.Ticks);
}
In my other project I get 638240514000000000, so it rounds fine to 07:10, but the wrong result gives me 07:11.
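For reference, a call looks something like this (the one-minute interval here is only an example, not necessarily the one that produced the 07:10/07:11 results):

// Hypothetical call site: round to the nearest minute.
var rounded = newDate.RoundUp(TimeSpan.FromMinutes(1));
Console.WriteLine(rounded);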
I also spun up a new console app, tried the code above, and got the same wrong result (638240514000050000).
I am totally confused about what is causing this and why it differs between the two projects.
The issue you are encountering is that the double data type is not precise enough to represent time values with perfect accuracy. When you add 3.2986111640930176 days to 2023-07-01, the result is indeed displayed as 2023-07-04 07:10:00, but the underlying binary representation of double introduces slight imprecision. This imprecision affects the Ticks property of the resulting DateTime.
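One quick way to see this (purely as an illustration) is to multiply the double value by TimeSpan.TicksPerDay; the result is not a whole number of ticks:

// 3.2986111640930176 days does not correspond to a whole number of ticks.
double days = 3.2986111640930176;
double ticksAsDouble = days * TimeSpan.TicksPerDay;
Console.WriteLine(ticksAsDouble); // roughly 2850000045776.37, a few milliseconds past an exact 07:10:00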
To avoid this issue, you should use the decimal data type instead of double when working with exact time values. The decimal data type offers higher precision for this kind of calculation. Here is how you can modify your code to use decimal:
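A minimal sketch of what that could look like, assuming the decimal day count is converted to whole ticks and added with AddTicks (DateTime.AddDays itself only accepts a double, and the Math.Round call is an assumption about handling the fractional tick):

// Sketch: do the day-to-ticks conversion in decimal, then add whole ticks.
decimal time = 3.2986111640930176m;
var firstDate = new DateTime(2023, 7, 1);
// TimeSpan.TicksPerDay is a long and converts implicitly to decimal here.
long ticksToAdd = (long)Math.Round(time * TimeSpan.TicksPerDay);
var newDate = firstDate.AddTicks(ticksToAdd);
var ticks = newDate.Ticks;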
Using decimal will give you a more precise result and reduce the impact of double's imprecision. However, keep in mind that decimal may have slightly reduced performance compared to double, so it's best to use it only when you need that extra precision.