I feel your pain. My biggest PITA in my day-to-day programming is dealing with those !@#$%^&( space leaks.

However, if it helps, with time you do learn (the hard way) about how to deal with this, and it does get better. But I'm still waiting for Andy Gill to come out with his magical space leak profiler to fix all of my problems. (I'm taking his off-hand comment to me at the last ICFP that he'd dreamed up this cool idea as a promise to implement it.)

I won't try to convince you that lazy evaluation is the best thing in the world, but it does have certain good points. I've got some stream-processing programs that scoot lazy lists through a variety of combinators and run happily on gigabytes of data while using only 3.5 MB or so of memory (of which more than 2 MB is the GHC runtime). And someone smarter than I am pointed out to me last year that, as a typical Haskell programmer, you would really be quite surprised how much you depend on lazy evaluation.
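A minimal sketch of the sort of thing I mean (not my actual programs, just an illustration): a lazy list flows through a couple of combinators into a strict fold, so only one element needs to be live at a time and the whole pipeline runs in constant space, no matter how long the input is.

```haskell
import Data.List (foldl')

-- A large lazily generated stream pushed through combinators.
-- Because foldl' forces its accumulator at each step, the spine of
-- the list is consumed as fast as it is produced: constant space.
result :: Int
result = foldl' (+) 0 . map (* 2) . filter even $ [1 .. 10000000]

main :: IO ()
main = print result
```

The key detail is `foldl'` rather than `foldl`: the lazy fold is exactly the kind of innocent-looking code that builds a ten-million-thunk chain on the heap before evaluating any of it, which is the classic space leak this thread is about.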

But what we really need is a really good book on dealing with lazy evaluation in the real world (which is not so different from the academic world, really, except that academics simply don't get a paper published, while we get clients coming after us with knives). It should properly cover most of the issues involved and, more importantly, give us an intuitive sense of what's going to explode our heap and what isn't.

I don't think that this is a new thing; I'm sure other languages and architectures have been through this too. How did the first programmers to deal with hardware stacks and all that fare, after all? Not so well, I bet.