When it comes to programming, there are rules, and those rules fall into two types. The first are the ones you should not break, because if you do the compiler will complain or things will explode spectacularly (only a slight exaggeration) at run time. The other rules are more akin to folklore; they're the ones you don't often learn from a book or a course, but from someone who's been around the traps for a while when you're first cutting your teeth as a career developer.

Some Rules You Don't Break

One of the rules you'll hear is "don't divide by zero". That's a rule you don't want to mess with, because dividing by zero simply doesn't work: a division by zero error famously left the cruiser USS Yorktown without propulsion. Rules like this exist because they describe how things are. You don't break them because there's no point in trying.

Don't write to an unassigned pointer: it's self-explanatory. Bad things are likely to result, and the best outcome is that your code doesn't do what you expect it to (OK, you might get super lucky and write somewhere useful, such as the exact memory location you wanted, but the odds aren't on your side).
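The defensive habit here is to make sure a pointer is either valid or NULL before you ever write through it, and to check which. A small sketch of that pattern, with allocate_and_store as an illustrative name of my own invention:

```c
#include <stdlib.h>

/* Hypothetical helper: the pointer is either a valid allocation or
   NULL, never uninitialised garbage, so the write is always checked. */
int *allocate_and_store(int value)
{
    int *p = malloc(sizeof *p);
    if(p == NULL)
        return NULL;    /* allocation failed: never write blindly */
    *p = value;
    return p;
}
```

A pointer you never initialise holds whatever garbage was on the stack; a pointer you initialise to NULL at least fails loudly and checkably.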

Some You Do (Carefully)

Some rules are there because conventional wisdom suggests they're a good thing, but plenty of these rules can be broken if you're in control. Any good software development course will tell you to check the length of input against the size of the buffer assigned to receive it (language dependent, of course). Buffer overflows are bad juju. They're rarer these days, as we've moved up through layers of abstraction and have friendlier languages, but despite its rampaging efforts to become ubiquitous, JavaScript is not the only language around, and these issues still occur in places they shouldn't because people have neglected to protect against them.
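For the C case, the check the course is teaching looks roughly like this: measure the input, reject anything that won't fit, and always leave room for the terminator. This is a sketch under my own naming (copy_bounded is not a standard function):

```c
#include <string.h>

/* Hypothetical bounds-checked copy: rejects input that would
   overflow dest rather than truncating silently or overrunning. */
int copy_bounded(char *dest, size_t dest_size, const char *src)
{
    size_t len = strlen(src);
    if(len >= dest_size)
        return -1;              /* too long: refuse, don't overflow */
    memcpy(dest, src, len + 1); /* +1 copies the terminating NUL */
    return 0;
}
```

The overflow happens precisely when that `len >= dest_size` comparison is skipped and the copy proceeds anyway.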

I've got C code that only accepts ASCII input into fixed-length char arrays. Some of it I wrote in the last year. Make no mistake, these are horrible coding practices.

I leave this code in a 'dangerous' state because I'm aware of the run-time conditions that are imposed on the program. For starters, it runs on an Atari, and secondly I don't think anyone other than me is ever going to run it. I'm happy not to spend time coding for error conditions that I can avoid through knowledge of how the system works. So in my personal use scenario, breaking this particular rule is fine, but I'd definitely rework the code if I expected it to get used by others.

So What About goto?

"Do not use goto " is the one rule just about every programmer has heard of, and many, many people abide by it because they've been told that to use it is bad, and if they use it they're a bad programmer. Bad. Don't be Bad. goto is bad.

I'm pretty confident Randall would use a goto when appropriate

That's often the gist of it. There may be a smattering of hyperbole here, but you get the idea. The thing that bugs me is that people tend not to know why it's bad. You know what? That's bad.

I've seen people come out with all kinds of reasons why it's bad... one claim I've seen online was that it's 'inefficient' at the hardware level, which makes little sense, because an unconditional jump is exactly what the hardware executes when your various high-level control statements are compiled down and assembled.

The primary reason it's considered bad is that it makes it far too easy for the flow of a program to become almost incomprehensible. With a nest of goto statements, reading code suddenly becomes far harder, and maintaining it harder still. For instance, even though this is akin to what you'd write in assembly, I'd never say that this code:

```c
void function(int param1, int param2)
{
    if(param1 < param2)
        goto reverse;

    printf("%i\n", param1 - param2);
    goto end;

reverse:
    printf("%i\n", param2 - param1);

end:
    ;
}
```

is as clean, and as easy to parse as this code:

```c
void function(int param1, int param2)
{
    if(param1 < param2)
        printf("%i\n", param2 - param1);
    else
        printf("%i\n", param1 - param2);
}
```

I don't think there's a single developer on the planet who would claim it is, and this is only a very trivial case of 'spaghetti' code.

goto does have its uses, though. For example, it provides the most elegant way (in terms of code complexity and speed) out of a nested loop:

```c
int i, j;
for(i = 0; i < 10; i++)
{
    for(j = 0; j < 10; j++)
    {
        if(someArray[i][j] == 42)
            goto done;
    }
}
done: ;
```

(Note that i and j are declared outside the loops, so they still hold the position of the match after the jump; redeclaring them in the for statements would shadow the outer pair.)

And you can also use it to ensure that some code is always executed, for the sake of cleanup or similar (yes, this is similar to try/catch/finally):

```c
void someBigFunction(int p1, int p2)
{
    if(someError)
        goto cleanUp;

    // do some work

    if(someOtherError)
        goto cleanUp;

    // do more work

cleanUp:
    // do some work that should happen regardless of success
    ;
}
```

This is not to say that in the real world there won't be better opportunities to restructure your code and avoid these techniques: often there will be, and a good developer should always be looking for ways to refactor and improve the maintainability of code as they go.
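As one example of such a restructuring, the nested-loop search above can be lifted into its own function, where an early return does the job of the goto. This is a sketch with names of my own choosing (find_in_grid and the 10x10 shape are assumptions for illustration):

```c
/* Hypothetical refactor of the nested-loop search: the early
   return replaces "goto done" and also reports success cleanly. */
int find_in_grid(int grid[10][10], int target, int *row, int *col)
{
    for(int i = 0; i < 10; i++)
    {
        for(int j = 0; j < 10; j++)
        {
            if(grid[i][j] == target)
            {
                *row = i;
                *col = j;
                return 1;   /* found */
            }
        }
    }
    return 0;               /* not found */
}
```

Whether this is clearer than the goto version is a judgment call; the point is that both options exist, and you should pick one because you understand the trade-off, not because of a blanket rule.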

I once worked on a commercial game that included a single goto call and it was used (if I recall correctly) to allow the player to restart a match (it was a sports title) at any point. At some point somebody—and I have no idea who because this code went through at least 10 iterations of released titles—decided a goto was the best way to implement that feature. More than a few of us attempted to remove it over the years, but nobody was ever successful in getting things to work correctly without it, and frankly, it was at some level a neat way to solve that problem.

So What, Break All the Rules?

The point I'm trying to convey is this:

You should not simply follow rules without understanding them. If somebody tells you that a technique is bad and shouldn't be used, ask them why.

Understanding why certain rules exist will make you a far more rounded developer than someone who blindly follows and repeats them, and with programming, learning the details often turns out to be far more interesting than not.