TartanLlama's answer is correct. The undefined behavior can happen at any time, even at compile time. This may seem absurd, but it's a key feature that permits compilers to do what they need to do. It's not always easy to be a compiler. You have to do exactly what the spec says, every time. However, sometimes it can be monstrously difficult to prove that a particular behavior is occurring. If you remember the halting problem, it's rather trivial to develop software for which you cannot prove whether it completes or enters an infinite loop when fed a particular input.

We could make compilers pessimistic, constantly compiling in fear that the next instruction might be one of these halting-problem-like issues, but that isn't reasonable. Instead, we give the compiler a pass: on these "undefined behavior" topics, it is freed from any responsibility. Undefined behavior consists of all the behaviors which are so subtly nefarious that we have trouble separating them from the really-nasty-nefarious halting problems and whatnot.

There is an example which I love to post, though I admit I've lost the source, so I have to paraphrase. It was from a particular version of MySQL. They had a circular buffer which was filled with user-provided data. They, of course, wanted to make sure the data didn't overflow the buffer, so they had a check:

if (currentPtr + numberOfNewChars > endOfBufferPtr) { doOverflowLogic(); }

It looks sane enough. However, what if numberOfNewChars is really big and the addition overflows? Then it wraps around and becomes a pointer smaller than endOfBufferPtr, so the overflow logic would never get called. So they added a second check, before that one:

if (currentPtr + numberOfNewChars < currentPtr) { detectWrapAround(); }

It looks like that takes care of the buffer overflow error, right? However, a bug was submitted stating that this buffer overflowed on a particular version of Debian! Careful investigation showed that this version of Debian was the first to use a particularly bleeding-edge version of gcc. On this version of gcc, the compiler recognized that currentPtr + numberOfNewChars can never be a smaller pointer than currentPtr, because pointer overflow is undefined behavior! That was sufficient for gcc to optimize out the entire check, and suddenly you were not protected against buffer overflows even though you wrote code to check for them!

This was spec behavior. Everything was legal (though from what I heard, gcc rolled back this change in the next version). It's not what I would consider intuitive behavior, but if you stretch your imagination a bit, it's easy to see how a slight variant of this situation could become a halting problem for the compiler. Because of this, the spec writers made it "undefined behavior" and stated that the compiler could do absolutely anything it pleased.