8 Ways You Can Cause Memory Leaks in .NET

Any experienced .NET developer knows that even though .NET applications have a garbage collector, memory leaks occur all the time. It’s not that the garbage collector has bugs, it’s just that there are ways we can (easily) cause memory leaks in a managed language.

Memory leaks are sneakily bad creatures. It’s easy to ignore them for a very long time, while they slowly destroy the application. With memory leaks, your memory consumption grows, creating GC pressure and performance problems. Finally, the program will just crash on an out-of-memory exception.

In this article, we will go over the most common reasons for memory leaks in .NET programs. All examples are in C#, but they are relevant to other languages.

Defining Memory Leaks in .NET

In a garbage-collected environment, the term memory leak is a bit counterintuitive. How can my memory leak when there's a garbage collector (GC) that takes care to collect everything?

There are two related core causes for this. The first is when you have objects that are still referenced but are effectively unused. Since they are referenced, the GC won't collect them and they will remain forever, taking up memory. This can happen, for example, when you subscribe to events but never unsubscribe. Let's call this a managed memory leak.

The second cause is when you somehow allocate unmanaged memory (without garbage collection) and don’t free it. This is not so hard to do. .NET itself has a lot of classes that allocate unmanaged memory. Almost anything that involves streams, graphics, the file system or network calls does that under the hood. Usually, these classes implement a Dispose method, which frees the memory. You can easily allocate unmanaged memory yourself with special .NET classes (like Marshal ) or with PInvoke.

Many share the opinion that managed memory leaks are not memory leaks at all, since the objects are still referenced and can theoretically be de-allocated. It's a matter of definition, and my point of view is that they are indeed memory leaks. They hold memory that can't be reclaimed for other uses and will eventually cause an out-of-memory exception. For this article, I will refer to both managed and unmanaged memory leaks as, well, memory leaks.

Here are 8 of the most common offenders. The first 6 refer to managed memory leaks and the last 2 to unmanaged memory leaks:

1. Subscribing to Events

Events in .NET are notorious for causing memory leaks. The reason is simple: Once you subscribe to an event, that object holds a reference to your class. That is unless you subscribed with an anonymous method that didn’t capture a class member. Consider this example:

public class MyClass
{
    public MyClass(WiFiManager wiFiManager)
    {
        wiFiManager.WiFiSignalChanged += OnWiFiChanged;
    }

    private void OnWiFiChanged(object sender, WifiEventArgs e)
    {
        // do something
    }
}

Assuming the wiFiManager outlives MyClass , you have a memory leak on your hands. Any instance of MyClass is referenced by wiFiManager and will never be collected by the garbage collector.

Events are dangerous indeed and I wrote an entire article about it called 5 Techniques to avoid Memory Leaks by Events in C# .NET you should know.

So what can you do? There are several great patterns to prevent memory leaks from events in the mentioned article. Without going into detail, some of them are:

- Unsubscribe from the event.
- Use weak-handler patterns.
- Subscribe, if possible, with an anonymous function that doesn't capture any members.
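To make the first option concrete, here is a minimal sketch. The WiFiManager type here is a stand-in for the publisher in the earlier example (not a real API, and the RaiseSignalChanged helper is added just so the example is self-contained): the subscriber implements IDisposable and unsubscribes in Dispose .

```csharp
using System;

// Stand-in publisher; a hypothetical version of the article's WiFiManager.
public class WiFiManager
{
    public event EventHandler WiFiSignalChanged;
    public void RaiseSignalChanged() => WiFiSignalChanged?.Invoke(this, EventArgs.Empty);
}

public class MyClass : IDisposable
{
    private readonly WiFiManager _wiFiManager;
    public int HandledCount { get; private set; }

    public MyClass(WiFiManager wiFiManager)
    {
        _wiFiManager = wiFiManager;
        _wiFiManager.WiFiSignalChanged += OnWiFiChanged;
    }

    private void OnWiFiChanged(object sender, EventArgs e) => HandledCount++;

    public void Dispose()
    {
        // Removing the handler drops the publisher's reference to this
        // instance, so it can be collected even if the WiFiManager lives on.
        _wiFiManager.WiFiSignalChanged -= OnWiFiChanged;
    }
}
```

The cost of this approach is that the subscriber's owner has to remember to call Dispose , which is exactly the problem discussed in section 8.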

2. Capturing members in anonymous methods

While it might be obvious that an event-handler method means an object is referenced, it’s less obvious that the same applies when a class member is captured in an anonymous method.

Here’s an example:

public class MyClass
{
    private JobQueue _jobQueue;
    private int _id;

    public MyClass(JobQueue jobQueue)
    {
        _jobQueue = jobQueue;
    }

    public void Foo()
    {
        _jobQueue.EnqueueJob(() =>
        {
            Logger.Log($"Executing job with ID {_id}");
            // do stuff
        });
    }
}

In this code, the member _id is captured in the anonymous method and as a result the instance is referenced as well. This means that while JobQueue exists and references that job delegate, it will also reference an instance of MyClass .

The solution can be quite simple – assigning a local variable:

public class MyClass
{
    private JobQueue _jobQueue;
    private int _id;

    public MyClass(JobQueue jobQueue)
    {
        _jobQueue = jobQueue;
    }

    public void Foo()
    {
        var localId = _id;
        _jobQueue.EnqueueJob(() =>
        {
            Logger.Log($"Executing job with ID {localId}");
            // do stuff
        });
    }
}

By assigning the value to a local variable, nothing is captured and you’ve averted a potential memory leak.

3. Static Variables

Some developers I know consider using static variables to always be a bad practice. While that's a bit extreme, there's a certain point to it when talking about memory leaks.

Let’s consider how the garbage collector works. The basic idea is that the GC goes over all GC Root objects and marks them as not-to-collect. Then, the GC goes to all the objects they reference and marks them as not-to-collect as well. And so on. Finally, the GC collects everything that's left (great article on garbage collection).

So what is considered a GC Root?

- The live stack of the running threads.
- Static variables.
- Managed objects that are passed to COM objects by interop (memory de-allocation is done by reference counting).

This means that static variables and everything they reference will never be garbage collected. Here’s an example:

public class MyClass
{
    static List<MyClass> _instances = new List<MyClass>();

    public MyClass()
    {
        _instances.Add(this);
    }
}

If, for whatever reason, you decide to write the above code, any instance of MyClass will forever stay in memory, causing a memory leak.
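If you genuinely need such a registry of instances, one way out is to hold them through WeakReference<T> so the static list no longer roots the instances. This is a sketch of one possible approach (the GetLiveInstances helper is illustrative, not part of the original example), not the only fix:

```csharp
using System;
using System.Collections.Generic;

public class MyClass
{
    // The static list holds weak references, so it no longer
    // keeps the instances themselves alive.
    private static readonly List<WeakReference<MyClass>> _instances =
        new List<WeakReference<MyClass>>();

    public MyClass()
    {
        _instances.Add(new WeakReference<MyClass>(this));
    }

    // Returns the instances that are still alive and prunes dead entries.
    public static List<MyClass> GetLiveInstances()
    {
        _instances.RemoveAll(wr => !wr.TryGetTarget(out _));

        var alive = new List<MyClass>();
        foreach (var wr in _instances)
        {
            if (wr.TryGetTarget(out var instance))
                alive.Add(instance);
        }
        return alive;
    }
}
```

With this shape, an instance that nothing else references becomes eligible for collection, and the registry simply stops returning it.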

4. Caching functionality

Developers love caching. Why do an operation twice when you can do it once and save the result, right?

That’s true enough, but if you cache indefinitely, you will eventually run out of memory. Consider this example:

public class ProfilePicExtractor
{
    private Dictionary<int, byte[]> PictureCache { get; set; } =
        new Dictionary<int, byte[]>();

    public byte[] GetProfilePicByID(int id)
    {
        // A lock mechanism should be added here, but let's stay on point
        if (!PictureCache.ContainsKey(id))
        {
            var picture = GetPictureFromDatabase(id);
            PictureCache[id] = picture;
        }
        return PictureCache[id];
    }

    private byte[] GetPictureFromDatabase(int id)
    {
        // ...
    }
}

This piece of code might save some expensive trips to the database, but the price is cluttering your memory.

You can do several things to solve this:

- Delete caching that wasn't used for some time.
- Limit the caching size.
- Use WeakReference to hold cached objects. This relies on the garbage collector to decide when to clear the cache, but might not be such a bad idea. The GC will promote objects that are still in use to higher generations in order to keep them longer, which means that objects used often will stay longer in the cache.
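As a sketch of the size-limit option, here is a minimal bounded cache. The oldest-first eviction policy and the generic shape are illustrative choices of mine, not something the original example prescribes; in real code you might prefer an LRU policy or a caching library.

```csharp
using System;
using System.Collections.Generic;

// A minimal bounded cache: once it holds maxEntries items,
// it evicts the oldest entry before inserting a new one.
public class BoundedCache<TKey, TValue>
{
    private readonly int _maxEntries;
    private readonly Dictionary<TKey, TValue> _map = new Dictionary<TKey, TValue>();
    private readonly Queue<TKey> _insertionOrder = new Queue<TKey>();

    public BoundedCache(int maxEntries) => _maxEntries = maxEntries;

    public int Count => _map.Count;

    public bool Contains(TKey key) => _map.ContainsKey(key);

    public TValue GetOrAdd(TKey key, Func<TKey, TValue> factory)
    {
        if (_map.TryGetValue(key, out var cached))
            return cached;

        if (_map.Count >= _maxEntries)
        {
            // Evict the oldest key so memory use stays bounded.
            var oldest = _insertionOrder.Dequeue();
            _map.Remove(oldest);
        }

        var value = factory(key);
        _map[key] = value;
        _insertionOrder.Enqueue(key);
        return value;
    }
}
```

The point is simply that the cache's memory use now has a hard ceiling, so it can no longer grow without bound the way the Dictionary in the example above can.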

5. Incorrect WPF Bindings

WPF Bindings can actually cause memory leaks. The rule of thumb is to always bind to a DependencyObject or to an INotifyPropertyChanged object. When you fail to do so, WPF will create a strong reference to your binding source (meaning the ViewModel) from a static variable, causing a memory leak (explanation).

Here’s an example.

<UserControl x:Class="WpfApp.MyControl"
             xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
             xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml">
    <TextBlock Text="{Binding SomeText}"></TextBlock>
</UserControl>

This View Model will stay in memory forever:

public class MyViewModel
{
    private string _someText = "memory leak";
    public string SomeText
    {
        get { return _someText; }
        set { _someText = value; }
    }
}

Whereas this View Model won’t cause a memory leak:

public class MyViewModel : INotifyPropertyChanged
{
    public event PropertyChangedEventHandler PropertyChanged;

    private string _someText = "not a memory leak";
    public string SomeText
    {
        get { return _someText; }
        set
        {
            _someText = value;
            PropertyChanged?.Invoke(this,
                new PropertyChangedEventArgs(nameof(SomeText)));
        }
    }
}

It actually doesn’t matter if you invoke PropertyChanged or not, the important thing is that the class derives from INotifyPropertyChanged . This tells the WPF infrastructure not to create a strong reference.

The memory leak occurs when the binding mode is OneWay or TwoWay. If the binding is OneTime or OneWayToSource, it’s not a problem.

Another WPF memory leak issue occurs when binding to a collection. If that collection doesn’t implement INotifyCollectionChanged , then you will have a memory leak. You can avoid the problem by using ObservableCollection which does implement that interface.

6. Threads that Never Terminate

We already talked about how the GC works and about GC roots. I mentioned that the Live Stack is considered as a GC root. The Live Stack includes all local variables and members of the call stacks in the running threads.

If, for whatever reason, you were to create an infinitely-running thread that does nothing and holds references to objects, that would be a memory leak. One example of how this can easily happen is with a Timer . Consider this code:

public class MyClass
{
    public MyClass()
    {
        Timer timer = new Timer(HandleTick);
        timer.Change(TimeSpan.FromSeconds(5), TimeSpan.FromSeconds(5));
    }

    private void HandleTick(object state)
    {
        // do something
    }
}

If you don’t actually stop the timer, it will keep firing on a thread-pool thread, referencing an instance of MyClass and preventing it from being collected.
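A minimal sketch of the fix: keep the timer in a field and dispose it when the owning object is done, here via IDisposable (assuming System.Threading.Timer , as in the example above):

```csharp
using System;
using System.Threading;

public class MyClass : IDisposable
{
    private readonly Timer _timer;

    public MyClass()
    {
        _timer = new Timer(HandleTick);
        _timer.Change(TimeSpan.FromSeconds(5), TimeSpan.FromSeconds(5));
    }

    private void HandleTick(object state)
    {
        // do something
    }

    public void Dispose()
    {
        // Stopping and disposing the timer releases its reference
        // to this instance, so the GC can collect it.
        _timer.Dispose();
    }
}
```

As with event handlers, this shifts the burden to whoever owns MyClass to call Dispose , which section 8 discusses.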

7. Not de-allocating unmanaged memory

Up to now, we only talked about managed memory, that is, memory that's managed by the garbage collector. Unmanaged memory is a whole different matter: instead of just avoiding unnecessary references, you will need to de-allocate the memory explicitly.

Here’s a simple example:

public class SomeClass
{
    private IntPtr _buffer;

    public SomeClass()
    {
        _buffer = Marshal.AllocHGlobal(1000);
    }

    // do stuff without freeing the buffer memory
}

In the above class, we've used Marshal.AllocHGlobal , which allocates a buffer of unmanaged memory (documentation). Under the hood, AllocHGlobal calls the LocalAlloc function in Kernel32.dll. Without explicitly freeing the buffer with Marshal.FreeHGlobal , that memory will be considered as taken in the process's memory heap, causing a memory leak.

To deal with such issues you can add a Dispose method that frees any unmanaged resources, like so:

public class SomeClass : IDisposable
{
    private IntPtr _buffer;

    public SomeClass()
    {
        _buffer = Marshal.AllocHGlobal(1000);
        // do stuff without freeing the buffer memory
    }

    public void Dispose()
    {
        Marshal.FreeHGlobal(_buffer);
    }
}

Unmanaged memory leaks are in a way worse than managed memory leaks due to memory fragmentation issues. Managed memory can be moved around by the garbage collector, making space for other objects. Unmanaged memory, however, is forever stuck in place.

8. Adding Dispose without Calling it

In the last example, we added the Dispose method to free any unmanaged resources. That’s great, but what happens when whoever used the class didn’t call Dispose ?

One thing you can do is to use the using statement in C#:

using (var instance = new MyClass())
{
    // ...
}

This works on IDisposable classes, and the compiler translates it to this:

MyClass instance = new MyClass();
try
{
    // ...
}
finally
{
    if (instance != null)
        ((IDisposable)instance).Dispose();
}

This is very useful because even if an exception was thrown, Dispose will still be called.

Another thing you can do is utilize the Dispose Pattern. Here’s an example of how you would implement it:

public class MyClass : IDisposable
{
    private IntPtr _bufferPtr;
    public int BUFFER_SIZE = 1024 * 1024; // 1 MB
    private bool _disposed = false;

    public MyClass()
    {
        _bufferPtr = Marshal.AllocHGlobal(BUFFER_SIZE);
    }

    protected virtual void Dispose(bool disposing)
    {
        if (_disposed)
            return;

        if (disposing)
        {
            // Free any other managed objects here.
        }

        // Free any unmanaged objects here.
        Marshal.FreeHGlobal(_bufferPtr);

        _disposed = true;
    }

    public void Dispose()
    {
        Dispose(true);
        GC.SuppressFinalize(this);
    }

    ~MyClass()
    {
        Dispose(false);
    }
}

This pattern makes sure that even if Dispose wasn’t called, then it will eventually be called when the instance is garbage collected. If, on the other hand, Dispose was called, then the finalizer is suppressed. Suppressing the finalizer is important because finalizers are expensive and can cause performance issues.

The dispose-pattern is not bulletproof, however. If Dispose was never called and your class wasn’t garbage collected due to a managed memory leak, then the unmanaged resources will not be freed.

Summary

Knowing how memory leaks can occur is important, but only part of the whole picture. It’s also important to recognize there are memory leak problems in an existing application, find them, and fix them. You can read my article Find, Fix, and Avoid Memory Leaks in C# .NET: 8 Best Practices for more info on that.

Hope you enjoyed the post, and happy coding.
