Why does GC not run when finalization queue is using a lot of memory?


I'm starting to learn about GC and finalization, and I've come across quite a simple example where the behaviour of the application is quite unexpected to me.

(Note: I'm aware that finalizers should only be used for unmanaged resources, together with the dispose pattern; I just want to understand what is going on here.)

This is a simple console app that generates a "saw-tooth" memory pattern: the memory rises to around 90 MB, a GC runs, the memory drops, and then it begins to rise again, never going beyond 90 MB.

    using System;
    using System.Threading;

    class Program
    {
        static void Main(string[] args)
        {
            for (int i = 0; i < 100000; i++)
            {
                MemoryWaster mw = new MemoryWaster(i);
                Thread.Sleep(250);
            }
        }
    }

    public class MemoryWaster
    {
        long l = 0;
        long[] array = new long[1000000]; // ~8 MB per instance

        public MemoryWaster(long l)
        {
            this.l = l;
        }

        //~MemoryWaster()
        //{
        //    Console.WriteLine("Finalizer called.");
        //}
    }

If I uncomment the finalizer, the behaviour is very different - the application does one or two GCs at the start, but then memory grows linearly until it is using over 1 GB (at which point I terminate the application).

From what I have read, this is because instead of reclaiming the object, the GC moves it to the finalization queue. A dedicated finalizer thread executes the finalizer methods, and the finalized objects are only reclaimed by a later GC. This can be an issue when the finalizer methods are very long-running, but that isn't the case here.
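
As I understand it, fully reclaiming a finalizable object therefore takes two collections with the finalizer thread running in between - roughly this sketch:

    GC.Collect();                   // object is found unreachable and queued for finalization
    GC.WaitForPendingFinalizers();  // blocks until the finalizer thread has run ~MemoryWaster()
    GC.Collect();                   // the finalized object's memory is actually reclaimed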

If I manually call GC.Collect() every few iterations, the app behaves as expected and I see the saw-tooth pattern of the memory getting released.
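
For reference, the modified loop looks roughly like this (collecting every 10 iterations is an arbitrary choice on my part):

    for (int i = 0; i < 100000; i++)
    {
        MemoryWaster mw = new MemoryWaster(i);
        Thread.Sleep(250);

        // Each forced collection queues the new garbage for finalization
        // and reclaims the objects that were finalized after the previous one.
        if (i % 10 == 0)
        {
            GC.Collect();
        }
    }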

My question is - why does the large amount of memory being used by the application not trigger a GC automatically? In the example with finalizers included, would the GC ever run again after the first time?

1 Answer

Answered by Christopher:

Do not rely on finalizers. They are a safety net that you should never actually need, not the first option. If the finalizers have to clean up after you, you have already messed up badly.

I have two basic rules regarding disposables that have always worked:

  • Never split up the creation and disposal of an instance. Create, use, dispose - all in the same piece of code, ideally with a using block.
  • If you cannot follow the first rule - for example, when you are wrapping something that implements IDisposable - then your class implements IDisposable for the sole purpose of relaying the Dispose call (see the sketch after this list).
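
A minimal sketch of both rules, using a made-up wrapper type purely for illustration:

    // Rule 2: a wrapper that owns a disposable and only relays the Dispose call.
    public sealed class ResourceWrapper : IDisposable
    {
        private readonly System.IO.FileStream stream;   // the wrapped disposable

        public ResourceWrapper(string path)
        {
            stream = System.IO.File.OpenRead(path);
        }

        public void Dispose()
        {
            stream.Dispose();   // relay only - no finalizer needed
        }
    }

    // Rule 1: create, use, dispose in the same piece of code, via using.
    using (var wrapper = new ResourceWrapper("data.bin"))
    {
        // use the wrapper here
    }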

As for the GC:

While the GC runs, all other threads have to be paused. This is an absolute rule. As a result, the GC is quite lazy: it tries to avoid running. Indeed, if it only runs once, at application close, that is the ideal case.
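
If you want to see how rarely the collector actually runs, you can print the per-generation collection counts from the standard GC class - a minimal sketch:

    // Prints how many times each generation has been collected so far,
    // plus the currently allocated managed heap size.
    Console.WriteLine($"Gen0: {GC.CollectionCount(0)}, " +
                      $"Gen1: {GC.CollectionCount(1)}, " +
                      $"Gen2: {GC.CollectionCount(2)}, " +
                      $"Heap: {GC.GetTotalMemory(false) / (1024 * 1024)} MB");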