Why is virtual memory usage so different between .NET Framework and .NET Core?

I am upgrading an old project from .NET Framework 3.5 to the latest .NET 7.0, and I found that the result of a function that outputs the memory usage of the current process is very strange.

The code is like this:

    using System;
    using System.Diagnostics;

    static void Main(string[] args)
    {
        var process = Process.GetCurrentProcess();

        // Peak reserved virtual address space vs. peak physical memory (working set).
        Console.WriteLine(FormatSize(process.PeakVirtualMemorySize64));
        Console.WriteLine(FormatSize(process.PeakWorkingSet64));
        Console.Read();
    }

    public static string FormatSize(long size)
    {
        string[] units = { "bytes", "KB", "MB", "GB", "TB" };
        double s = size;
        int index = 0;

        // Divide by 1024 until the value fits the current unit,
        // stopping at the last unit so the index stays in range.
        while (s >= 1024 && index < units.Length - 1)
        {
            s /= 1024d;
            index++;
        }

        return $"{s:0.00} {units[index]}";
    }

I found that this code gives very different results when running on .NET Framework and on .NET 7.

For .NET Framework 4.8, it looks like:

    4.54 GB
    23.07 MB

Then, for .NET 7.0, the output is:

    2.3 TB
    21.91 MB

Why does the PeakVirtualMemorySize64 property show such a big difference between the two runtimes?

And which one is more accurate if I want to monitor how much memory my program uses?
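
For what it's worth, here is a rough sketch of the other per-process metrics I could read at the same point, in case one of them is the better thing to monitor. The `PrintMemoryMetrics` name is just mine, and it reuses the `FormatSize` helper from the snippet above:

    // Rough sketch: other memory metrics available from the same process.
    using System;
    using System.Diagnostics;

    static void PrintMemoryMetrics()
    {
        var process = Process.GetCurrentProcess();

        // Physical memory currently mapped into the process (RAM in use right now).
        Console.WriteLine($"WorkingSet64:        {FormatSize(process.WorkingSet64)}");

        // Memory committed to this process alone (not shared with other processes).
        Console.WriteLine($"PrivateMemorySize64: {FormatSize(process.PrivateMemorySize64)}");

        // Reserved virtual address space - the value that differs so much between runtimes.
        Console.WriteLine($"VirtualMemorySize64: {FormatSize(process.VirtualMemorySize64)}");

        // Managed heap only, as tracked by the garbage collector.
        Console.WriteLine($"GC.GetTotalMemory:   {FormatSize(GC.GetTotalMemory(forceFullCollection: false))}");
    }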
