I am wondering about the behavioral difference that shows up in the output for Test 2 in the example below.
I'm on .NET Core 3.1 for this scenario.
using System;

public class Program
{
    public static void Main()
    {
        var a = new Customer();

        if (a.BirthDate == default)
            Console.WriteLine("Test 1 Result: default");
        else
            Console.WriteLine("Test 1 Result: NOT default");

        if (a?.BirthDate == default)
            Console.WriteLine("Test 2 Result: default");
        else
            Console.WriteLine("Test 2 Result: NOT default");

        if (a.BirthDate == default(DateTime))
            Console.WriteLine("Test 3 Result: default");
        else
            Console.WriteLine("Test 3 Result: NOT default");

        if (a?.BirthDate == default(DateTime))
            Console.WriteLine("Test 4 Result: default");
        else
            Console.WriteLine("Test 4 Result: NOT default");
    }
}

public class Customer
{
    public DateTime BirthDate { get; set; }
}
Output:
Test 1 Result: default
Test 2 Result: NOT default
Test 3 Result: default
Test 4 Result: default
I was expecting all of the outputs to be "default".
The type of the default literal is inferred by the compiler. In the second test, the null-conditional expression a?.BirthDate has type Nullable<DateTime>, so default is inferred as Nullable<DateTime> too, which makes the comparison equivalent to the one sketched below.
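Here is a minimal sketch of what the compiler's inference amounts to, reusing the a variable from the question; the explicit default(DateTime?) stands in for the untyped default literal:

// a?.BirthDate is of type DateTime?, so the untyped default
// is inferred as DateTime?, whose default value is null.
if (a?.BirthDate == default(DateTime?))   // effectively: == (DateTime?)null
    Console.WriteLine("Test 2 Result: default");
else
    Console.WriteLine("Test 2 Result: NOT default");   // this branch runs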
The default value of a Nullable<T> type is null, which you can check with the snippet below. Because the left operand holds a real DateTime value, the comparison against null evaluates to false, so Test 2 prints "NOT default".
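As a quick standalone check (not part of the original program), these lines show that default for Nullable<DateTime> is the null state, and that a real value never compares equal to it:

Console.WriteLine(default(DateTime?) == null);               // True
Console.WriteLine(default(DateTime?).HasValue);              // False
Console.WriteLine(default(DateTime) == default(DateTime?));  // False: a value vs. null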