I woke up in the middle of the night in a cold sweat. I’d had a terrible nightmare that shocked me into consciousness, and then I couldn’t get back to sleep. What horror disturbed me so much?

The 1980s.

Specifically, 1980s technology.

You see, back when I was a graduate student, we were doing research on developing neurons using fluorescent tracer dyes, and the way we’d collect data was to take the camera output from the microscope and record what we were seeing on VHS tape. Do you remember VHS, kiddies? Sure you do. Well, some of you.

And then analysis involved playing back the recording a frame at a time and — don’t laugh — taping plastic transparencies over the monitor and tracing the outlines of the cells with colored Sharpies. That was our life. We had little pieces of paper with approximate frame numbers of sequences we were interested in, and we’d spend a lot of time fast-forwarding and rewinding videotape to get to them, and then converting the images into notebooks full of colored transparencies.

But that wasn’t the nightmare.

Later we thought we’d get smart. We bought a VAX 11/750, a computer that required an air-conditioned room of its own, was the size of a couple of major kitchen appliances, and had das blinkenlights and fancy whirring tape drives, along with an image processing and analysis system. The software was from a company called International Imaging Systems, or I2S, and it was clunky as hell. Think Photoshop, but driven by a command-line interface, with all the code written in FORTRAN. It had apparently been written for petroleum geologists, so it had all these (to us) useless functions and lacked all the functions we did need, and I was constantly patching in bits of FORTRAN to make it do what we wanted. We later got rid of the VAX. The last I heard of I2S was a few years later, when there was some scandal about selling technology to Iran.

But that wasn’t the nightmare.

I went off to a post-doc in Utah, and my goal was to study growth cone behavior in developing grasshopper embryos. We started with nothing. We got a microscope, a silicon intensified target camera (this was all low light level work), micromanipulators, dyes, electrodes, all kinds of electronics, a Macintosh II, a video frame grabber, etc. None of these were integrated. There was no software to do what we wanted. I spent the next several years doing single cell manipulations and simultaneously trying to make all the bits and pieces work together, coding up software to control everything and collect data and store it in a form we could analyze. I came out of that project with 200,000 lines of code, all written by me on my lonesome, in a combination of Object Pascal, C, and assembly language. It was some sweet software.

One thing, though: don’t be the only one coding in a group of biologists. They never understand that on top of the cell work you’re doing, you’re also staying up until 2am every night writing and testing software, and they get the impression that making complicated gadgets stand up and dance when you click a button is trivially easy.

But that wasn’t the nightmare.

Here’s the nightmare. Among all the gadgets I was juggling was something called an OMDR, or Optical Memory Disk Recorder. This was the cat’s pajamas in 1989. It was precisely what someone who’d spent hours interminably winding and rewinding VHS tapes would consider the perfect technology: it stored video like VHS tape, but it was instantly addressable to any video frame you wanted. It was garbage, only we didn’t know it.

So in the 1980s I was handed an OMDR, a $10,000 box, and I was ecstatic. There wasn’t any software to control the damned thing, but that was OK, I’d just write it myself (we thought things like that in the 80s. We were insane.)


So I wrote an OMDR controller for the Mac. It would store experiment data and video frame numbers for you, and gave you little buttons on the screen to jump to any sequence, step through it, play it forward or in reverse, all that stuff. Everyone thought it was handy and easy, and didn’t appreciate the work that went into it, but that was OK.

Here’s the thing about an OMDR. It stored the data on a big platter, like a laser disc (most of you probably don’t remember those, either); it was basically a write-once laser disc. It had three ports.

One was an RS-232 serial port, a kind of crude, slow predecessor of USB; it was used solely for transmitting control codes to tell the OMDR what to do. (By the way, back in the day, RS-232 was far more fun than USB. It used a 25-pin connector: two of the pins were for data, one was ground, and the others had cryptic handshaking functions, like DSR and CTS and so forth. Most devices could get by with a three-wire connection, but every once in a while you’d get something that demanded one or more of the other lines. I spent a lot of time with a soldering iron making custom cables for miscellaneous devices.)
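To give a flavor of what driving a device like this looked like, here’s a minimal sketch of the controller side. The command mnemonics (SEEK, GRAB) and the zero-padded frame format are hypothetical — the OMDR’s actual command set isn’t described here — but the pattern is typical of serial-controlled video gear of the era: short ASCII commands, terminated by a carriage return, trickling out over that three-wire connection.

```python
def make_command(verb, frame=None):
    """Build a carriage-return-terminated ASCII command for the device.

    The verbs and the 6-digit zero-padded frame field are illustrative
    guesses, not the real OMDR protocol.
    """
    if frame is not None:
        body = f"{verb} {frame:06d}"
    else:
        body = verb
    return body.encode("ascii") + b"\r"

# In real use these bytes would be written to the serial port; here we
# just show what the controller would transmit.
seek = make_command("SEEK", 12345)   # jump to a stored video frame
grab = make_command("GRAB")          # store the incoming video frame
```

The nice part of a scheme like this is that the entire "API" of the device is a handful of human-readable strings, which is also why you could bring one up with nothing but a terminal and a homemade cable.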

Another port was an RS-170 out connector. This was the standard for closed-circuit TV; it was the same video signal used by every 7-11 with a camera on the ceiling to catch shoplifters. You’d plug a cable from that OMDR port into your television monitor so you could watch what was stored on the disc. Note: this is an analog signal.

The last port was an RS-170 in connector. This was how you sent data to the device: you’d display it on your computer, pipe that analog video signal into the OMDR, and then send a code through the RS-232 port to tell it to grab the current image and store it in a frame on the disc.

So it was easy. I’d do an experiment, and my computer would digitize the analog video signal from the camera, producing a glorious 640 × 480, 8-bit image in my machine. Data! Then I would convert it back to an analog display signal and send that to the OMDR, which would re-digitize the video and store it on the disc. And then later I could tell it to show me that picture, and it would do so by converting the digital data on the disc into an RS-170 signal and displaying it on the monitor.
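Every one of those conversions chips away at the data. The toy model below is my own illustration, not the actual hardware: it treats each analog hop as an 8-bit DAC followed by an ADC with a small gain error (the 2% figure is an arbitrary stand-in for real analog imperfections), and shows a pixel value drifting a little further with each round trip.

```python
def dac(value, gain=1.02):
    """8-bit digital value -> analog level in [0, 1], with a gain error."""
    return min(1.0, (value / 255.0) * gain)

def adc(level):
    """Analog level in [0, 1] -> nearest 8-bit value."""
    return min(255, int(round(level * 255)))

pixel = 100
for generation in range(3):
    pixel = adc(dac(pixel))  # one digital -> analog -> digital round trip
# After three generations the pixel has drifted from 100 to 106.
```

With a clean digital path the value would survive any number of copies unchanged; through an analog bottleneck, every pass is a small act of corruption.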

There was no way to tell it to give me the digital data. It was video image or nothing. All of my data from five years of work is stored in an inaccessible form on those platters in Utah, readable only by a machine that is probably extremely rare by now, and even if I could read them, the data would only come back as fuzzy, low-resolution video.

I know, that’s a lot of background to explain a brief, silly dream: I recalled my first days as a post-doc, facing a pile of hardware that needed to be made to work, and my eyes snapped open with a shiver of horror when my advisor said one simple acronym, “OMDR”. Beware transitional technology, I say… and the 1980s was a decade full of weird-ass dead-end technology.