TL;DR: Pro .NET Memory Management (Part 1)



Oren F.



Pro .NET Memory Management (Apress), by Konrad Kokosa, was published in 2018 and is a beast of a tome.

The first few chapters cover important fundamentals in depth, but they are mostly informational, aimed at those curious about how things work under the hood.

Progressing further, I thought it would be good to take notes on the sections that might actually change how I code in the near future.

String Interning

[ Potential performance improvement. ]

Strings are a fascinating topic for me because of the variety of special treatment they get across many languages.

String interning is something that happens without you realizing it. The compiler takes string literals like "Hello World!" and registers them for reuse, so effectively there is only one instance of any given string literal. Dynamically generated strings do not automatically gain this benefit.

But it is possible to intern any string:

string internedInstance = string.Intern("my custom interned string!");

The above line of code adds "my custom interned string!" to a pool of strings for reuse. As you might imagine, the disadvantage is that these strings stay in memory for the lifetime of the application, so if you abuse this feature you can fill up memory.

The benefits of reusing string instances can be non-trivial: instead of a character-by-character comparison, an equality check can succeed immediately when both sides reference the same instance, which speeds up repeated evaluations.
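A minimal sketch of what interning buys you. The literal below is interned by the compiler, while the run-time-constructed copy is a distinct instance until we ask `string.Intern` for the pooled one:

```csharp
using System;

class InternDemo
{
    static void Main()
    {
        string literal = "pooled"; // compile-time literal: interned automatically

        // Constructed at run time, so it is a distinct instance with the same contents.
        string runtime = new string(new[] { 'p', 'o', 'o', 'l', 'e', 'd' });

        Console.WriteLine(ReferenceEquals(literal, runtime));                // False
        // Intern returns the single pooled instance for this content.
        Console.WriteLine(ReferenceEquals(literal, string.Intern(runtime))); // True
    }
}
```

Once both references point at the same instance, equality checks can short-circuit on the reference comparison instead of walking the characters.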

Arrays of Doubles

[ Something to be wary of. ]

According to the book, a double is always 8 bytes long, which breaks some assumptions I had picked up about them. And if an array of doubles is 1,000 elements or longer, it is allocated on the Large Object Heap (LOH) even though it may only contain around 8,000 bytes.
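A quick way to probe this yourself is `GC.GetGeneration`: objects allocated directly on the LOH report generation 2 from birth. This is a sketch, and the result depends on the runtime (the book describes the behavior where large-enough double arrays are LOH-allocated early, originally for 8-byte alignment on 32-bit runtimes), so I've left the output unasserted:

```csharp
using System;

class DoubleArrayDemo
{
    static void Main()
    {
        // 999 doubles (~7,992 bytes) vs. 1,000 doubles (~8,000 bytes).
        var small = new double[999];
        var large = new double[1000];

        // LOH objects are born in generation 2; whether the 1,000-element
        // array lands there depends on runtime and bitness.
        Console.WriteLine(GC.GetGeneration(small));
        Console.WriteLine(GC.GetGeneration(large));
    }
}
```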

Memory Partitioning: Rule 12

Avoid Unnecessary Heap References

[ Potential optimization. ]

An example is provided here where linked-list or tree nodes contain references to one another. Instead of storing these references (like previous or next), retain their indexes and do a lookup at run time.
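A hypothetical sketch of the idea (the `Node` struct and its fields are my own illustration, not the book's code): nodes live in a flat array and link to each other by index, so there are no object references for the GC to trace.

```csharp
using System;

// Linked by index rather than by object reference.
struct Node
{
    public int Value;
    public int Next; // index of the next node, or -1 for end-of-list
}

class IndexedListDemo
{
    static void Main()
    {
        var nodes = new Node[]
        {
            new Node { Value = 10, Next = 1 },
            new Node { Value = 20, Next = 2 },
            new Node { Value = 30, Next = -1 },
        };

        // Walk the list via index lookups instead of reference hops.
        for (int i = 0; i != -1; i = nodes[i].Next)
            Console.WriteLine(nodes[i].Value); // 10, 20, 30
    }
}
```

As a bonus, value-type nodes in a single array sit contiguously in memory, which tends to be friendlier to the CPU cache than pointer-chasing between heap objects.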

Value-Tuples

[ Optimization, and code simplification. ]

These are not new to me and have opened a whole new world of convenience when dealing with parameters and return types.

The book recommends using them instead of their reference-type predecessor: Tuple.

If you're not familiar with the benefits of a ValueTuple, I strongly recommend further study. Their simplicity of use, including custom property names and built-in deconstruction, is (IMO) a game-changer.
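A small sketch of both features (`MinMax` is my own example method): a ValueTuple is a struct, so returning one allocates nothing on the heap, and the named fields and deconstruction make the call site read cleanly.

```csharp
using System;

class TupleDemo
{
    // Returns a ValueTuple (a struct: no heap allocation) with named fields.
    public static (int Min, int Max) MinMax(int[] values)
    {
        int min = values[0], max = values[0];
        foreach (var v in values)
        {
            if (v < min) min = v;
            if (v > max) max = v;
        }
        return (min, max);
    }

    static void Main()
    {
        var result = MinMax(new[] { 3, 1, 4, 1, 5 });
        Console.WriteLine(result.Min); // 1  (custom property names)

        var (min, max) = result;       // built-in deconstruction
        Console.WriteLine(max);        // 5
    }
}
```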

Array-Pools & Object-Pools

[ Potential speed and memory optimization. ]

The benefit of object pools might surprise you. Simply put, reusing objects can seriously reduce allocation pressure. An ArrayPool provides a built-in .NET way of optimizing around the reuse of arrays of a specified (minimum) length.
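The basic rent/return pattern with the shared `ArrayPool<T>` from `System.Buffers` looks like this; note that `Rent` only guarantees a *minimum* length, so the buffer you get back is often larger than requested:

```csharp
using System;
using System.Buffers;

class PoolDemo
{
    static void Main()
    {
        // Rent returns an array of *at least* the requested length,
        // frequently a larger, previously used one.
        byte[] buffer = ArrayPool<byte>.Shared.Rent(1000);
        try
        {
            Console.WriteLine(buffer.Length >= 1000); // True
            // ... use the buffer; only trust the first 1000 bytes you wrote ...
        }
        finally
        {
            // Return it so the next Rent can reuse it instead of allocating.
            ArrayPool<byte>.Shared.Return(buffer);
        }
    }
}
```

The try/finally matters: a buffer that is never returned is simply a normal allocation, and you lose the pooling benefit.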

If you are considering an ObjectPool, instead of writing your own, check out the ObjectPool library I wrote. It has been performance benchmarked and includes an implementation similar to the one engineered by the Roslyn team:

Open.Disposable.ObjectPools: Nuget Github

RecyclableMemoryStream

[ Serious memory optimization. ]

This one is probably worth repeated study and definitely immediate consideration.

If you are repeatedly using a MemoryStream in your code, consider using a RecyclableMemoryStream instead.
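A minimal sketch, assuming the Microsoft.IO.RecyclableMemoryStream NuGet package: you create one long-lived manager that owns the pooled buffers, then ask it for streams wherever you would otherwise `new` up a MemoryStream.

```csharp
using System.IO;
using Microsoft.IO; // NuGet: Microsoft.IO.RecyclableMemoryStream

class StreamDemo
{
    // One shared manager for the application; it owns the pooled buffers.
    static readonly RecyclableMemoryStreamManager Manager =
        new RecyclableMemoryStreamManager();

    static void Main()
    {
        // GetStream returns a MemoryStream-compatible stream backed by
        // pooled blocks; Dispose returns those blocks to the pool.
        using (MemoryStream stream = Manager.GetStream())
        {
            var writer = new StreamWriter(stream);
            writer.Write("hello");
            writer.Flush();
        }
    }
}
```

Because the stream is backed by a chain of pooled blocks rather than one growing array, it avoids both the repeated copy-on-resize of MemoryStream and the LOH allocations that large buffers would otherwise cause.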

More to Come

Taking a break midway through Chapter 6: Memory Allocation. This chapter contains a wealth of important information and requires more in-depth study.

I'll return with more bits in Part 2.