Andy Lester's got a great article over at the New Relic site which makes me realize how lucky I was in college.

I graduated in 1979; our computing platform was a 360/75, later upgraded to a 370/145 (I think), still running OS/360 in a VM under VM/360. This meant that for our own projects, we actually ended up doing a number of the things that Andy talks about as a matter of survival.

We did not have a version control system at all; we ended up using generation data sets and meticulous tape backups to manage our source code. A tool that just made that work would have been a miracle. (I remember well having to write and use programs to recover "deleted" partitioned data set members when one realized that it would have been a good idea to back that member up but didn't.)

We did indeed learn to write in the "weeder" class, assembly language programming. It was taught by a professor with a very specific idea of what was good style and what was not - and a meaningful comment on every instruction was a key part of that style. One learned to write a good description of the algorithm one was implementing in the comments, as each "bad" or missing comment was 5% off your grade!

We didn't have full-up Perl-style regular expressions in every language, but there was SNOBOL, with its first cut at a regular-expression-driven language, and a few interesting wrinkles that Perl doesn't have. That was an eye-opening experience into just how powerful regular expressions were, and SNOBOL remained my go-to text and data munging language until I left mainframes.

Understanding libraries was key when one was writing primarily in assembler (it was fastest to compile and used the least resources to run, very important on a not-very-large timeshared machine). The more code you could reuse, whether IBM's or someone else's, the less time you spent waiting for a job to turn around and come back; building libraries, both macro libraries (sequences of configurable assembler code that expanded inline) and binaries, was key to becoming productive and staying that way, simply because code you didn't have to debug each time was time not wasted waiting for runs to finish. This also explains why a lot of us became nocturnal and adept at finding our way into locked buildings overnight - no one else was running anything on the machine then, and if you could find a terminal after 9 PM, you got nearly instant turnaround on jobs.

The first implementations of SQL available for IBM mainframes weren't out until after 1979, and those were quite expensive. I was working at NASA, infamous for having absolutely no money, so SQL databases weren't something we had for quite a while, and even then they were limited to very specific projects. A miss on that one, and definitely something that held me back somewhat until my current job, where I've really had to learn to use it well.

Tools were few and far between in 1979; editing was via WYLBUR, a line-oriented editor. It was still very important to learn to use it well, when the maximum speed you were getting out of your terminal was 30 characters per second (or 11, if you were on a Selectric terminal). Not knowing how to do global changes and finds meant either sending your file to print, waiting for the turnaround, and then hand-editing, or issuing many, many 'list' commands, plus failed jobs because you missed a change you needed to make.

Assembler was fantastic for teaching defensive programming because almost nothing could be taken for granted. If you needed bounds checking, exception handling, error checking, anything - you had to code it. As an object lesson, there were some very interesting, and exploitable, bugs in OS/360 caused by bad bounds checking in the logical equivalent of system calls, like the one that took an offset into a dispatch table to call code in privileged mode - but didn't verify that the offset was actually valid...

I very, very luckily fell in with some very talented programmers; simply because the programming environment was so primitive, we were impelled to build a hack that would let us run a job and communicate to it via a shared file - essentially we built a one-lung version of TSO to help get our work done faster.

We had to work with the existing conversational programming language implementation (CPS, a PL/1 variant adapted to be sort of like BASIC) to figure out how to make the connection between it and the batch job (we eventually ended up using a keyed-access dataset, using one record for the command line, another for a "go-ahead/break" semaphore, and another for returning output from the batch job). This required us to work with what features we had available in CPS, as we couldn't extend it, and to collaborate to add and update features.

This was not a typical experience; I was astoundingly lucky to have had the caliber of classmates that I did, and as forgiving a systems group as we had. We were pretty sure that they had figured out we had created a clandestine time-sharing system, but they were kind enough to let us run it - and we were careful to not abuse their trust.