I am reading a very large file and extracting some small portions of text from each line. However, at the end of the operation I am left with very little memory to work with. It seems that the garbage collector fails to free the memory after reading in the file.

My question is: Is there any way to free this memory? Or is this a JVM bug?

I created an SSCCE to demonstrate this. It reads in a 1 MB file (2 MB in memory, since Java strings use 16-bit characters) and extracts one character from each line (~4000 lines, so the retained text should total only about 8 KB). At the end of the test, the full 2 MB is still in use!
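The extraction loop is roughly of this shape (a minimal sketch with illustrative names, not the actual code, which appears at the end of the question):

```java
import java.io.BufferedReader;
import java.io.StringReader;
import java.util.stream.Collectors;

public class ExtractDemo {
    // Sketch of the pattern described above: read line by line and
    // keep only a one-character substring of each line.
    static String extractFirstChars(BufferedReader in) {
        return in.lines()
                 .filter(line -> !line.isEmpty())
                 .map(line -> line.substring(0, 1))
                 .collect(Collectors.joining());
    }

    public static void main(String[] args) {
        BufferedReader in = new BufferedReader(new StringReader("alpha\nbeta\ngamma"));
        System.out.println(extractFirstChars(in)); // prints "abg"
    }
}
```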

The initial memory usage:

Allocated: 93847.55 kb Free: 93357.23 kb

Immediately after reading in the file (before any manual garbage collection):

Allocated: 93847.55 kb Free: 77613.45 kb (~16 mb used)

This is to be expected, since reading in the file requires a lot of temporary memory.

However, even after I garbage collect, not all of the memory is freed:

Allocated: 93847.55 kb Free: 91214.78 kb (~2 mb used! That's the entire file!)

I know that manually calling the garbage collector (via System.gc()) gives no guarantees; the JVM may ignore or defer the request. However, the same thing was happening in my larger application, where the file eats up almost all available memory and causes the rest of the program to run out of memory even though that data is no longer needed. This example confirms my suspicion that the excess data read from the file is not being freed.
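For reference, figures in the format shown above can be obtained from Runtime; here is a minimal sketch (the helper name is my own, and the SSCCE's actual reporting code may differ):

```java
import java.util.Locale;

public class MemReport {
    // Print allocated (total) and free heap memory in kb, in the same
    // format as the measurements quoted above.
    static String report() {
        Runtime rt = Runtime.getRuntime();
        double allocatedKb = rt.totalMemory() / 1024.0;
        double freeKb = rt.freeMemory() / 1024.0;
        return String.format(Locale.ROOT, "Allocated: %.2f kb Free: %.2f kb",
                             allocatedKb, freeKb);
    }

    public static void main(String[] args) {
        System.out.println(MemReport.report());
    }
}
```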

Here is the SSCCE used for the test: