Suppose that you are creating an API in C and that you have a return value that is just right for an enum; for example, it communicates either 'all is okay' or some range of errors and exceptional conditions. Here's how not to write this API:

typedef enum { ERROR_1, ERROR_2, ERROR_3, ALL_OK } error_t;

You don't want to do this, because sooner or later you're going to want to add another error condition, ERROR_4, and the end result of putting it after ALL_OK is going to look somewhere between ugly and stupid.

The rule of thumb with enums and similar objects is that the fixed point goes at the start of the range. You are unlikely to have more than one 'all is fine' return code, so it is the fixed point and goes at the start.

The extra special way not to design this API is to do this and then just put ERROR_4 where it belongs, i.e. before ALL_OK. If you do this, any number of people will throttle you because you have just destroyed binary compatibility by renumbering ALL_OK's actual value. Worse, the broken binary compatibility may be subtle, depending on where and how people use the enum, since only one value has shifted.

(Admittedly this is only an issue in C and similar compiled languages that turn enums into actual integers behind the scenes. In other languages, this confusion can't happen; either ALL_OK is silently renumbered in all code that's using it or ALL_OK is purely a symbol with no numeric value attached to it as such.)

You would think that people wouldn't do this. Sadly, I have just seen this mistake made in software from a major vendor (assuming it was a mistake, rather than a deliberate decision to subtly punish people who counted on binary compatibility when it wasn't documented).

(PS: if you want to punish these people, it is much more productive and direct to spectacularly break your ABI so that people can't help but notice. People are kind of slow to notice subtle problems and they may not even realize what's going on for some time.)