Generated text using OpenAI’s GPT-2 and OCR text from mid-1990s WIRED issues

I trained a bleeding-edge machine learning (ML) model with a dozen issues of WIRED magazine from the mid-1990s and things got weird. The new model started generating a unique blend of 90s-era techno-optimism, product advertisements for microprocessors and gaming systems that never existed, odd interviews with tech luminaries and… a profound love for all things Apple.

There’s a popular ML technique going around where people generate text using neural networks based on different source materials. The technology isn’t advanced enough to make us think that a sensible person actually wrote the generated text—a serious problem we’re going to have to deal with—but it’s often good enough to produce bizarre statements that are funny in context.

I first tried this around two years ago to generate an ICO whitepaper, and many other developers, data scientists, and researchers have continued to explore the technique. Janelle Shane’s posts on the subject are wonderful and inspired this work. In the past two years, the effectiveness of these models has significantly improved.

This post is about what happened when I wanted to train one of the newer models going around (OpenAI’s GPT-2) with mid-1990s WIRED magazine issues.
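The post doesn’t include the pipeline itself, but the first step in this kind of experiment is turning OCR’d magazine scans into a clean plain-text training corpus. Below is a minimal sketch of that cleanup, assuming the OCR output lives in per-issue text files; the function names `clean_ocr_text` and `build_corpus`, and the specific dehyphenation and whitespace rules, are my own illustration rather than part of the original experiment.

```python
import re

def clean_ocr_text(raw: str) -> str:
    """Normalize OCR output before feeding it to a language model."""
    # Rejoin words the OCR split across line breaks ("micro-\nprocessor").
    text = re.sub(r"(\w)-\n(\w)", r"\1\2", raw)
    # Collapse single line breaks inside a paragraph into spaces,
    # but keep blank lines as paragraph boundaries.
    text = re.sub(r"(?<!\n)\n(?!\n)", " ", text)
    # Squeeze runs of spaces and tabs left behind by column layouts.
    text = re.sub(r"[ \t]+", " ", text)
    return text.strip()

def build_corpus(issues: list[str]) -> str:
    """Join cleaned issues with a delimiter GPT-2 treats as a boundary."""
    return "\n<|endoftext|>\n".join(clean_ocr_text(i) for i in issues)
```

From there, the common route at the time was fine-tuning a released GPT-2 checkpoint on the resulting single text file, for example with nshepperd’s fine-tuning scripts or the gpt-2-simple package.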

WIRED, Nostalgia and Machine Learning

I grew up reading WIRED magazine, and I’m not immune to the nostalgia for the period of the 1990s when the internet and personal computers were a new and amazing thing for many people in the United States.

Generating text from the model is surprisingly addictive—with a click of a button, it’s like reading something a WIRED journalist might write after not sleeping for three days.

The training data also produces a strange blend of marketing and journalism: WIRED issues in the 90s, after all, carried plenty of print ads. As a result, you get serious-sounding statements about history mixed with product-marketing speak:

Sometimes this veers off into editorialization. After a quote about how you can get the latest PowerBook “when it’s available” (something every tech company would write in announcements if they could get away with it), the paragraph ends with something that sounds like a melody:

When it comes to technology luminaries of the mid-1990s, even seemingly critical quotes of Jobs end up making him look good:

Bill Gates, in contrast, basically has to plead that Microsoft isn’t actually in trouble while chasing an ambitious target of “download hits”:

The machine learning hype is real?

The hype, man, it’s real.

This started as a quick AI experiment to see how text generation models were advancing. From start to finish (including training time), it took only about two hours before text started coming out, including some text that wouldn’t look out of place in an experimental poetry journal.

While it was fun and fascinating to see what was generated, it also seems plausible that more advanced versions of the model could generate entirely convincing WIRED articles that appear to have been written in the 90s. That’s disturbing for anyone who relies on digitized historical sources on the internet.

The creator of the GPT-2 model, OpenAI, has committed to working with research and governmental institutions on the societal impact of widespread use of models like this, including an advanced version of GPT-2 that hasn’t been publicly released yet.

As a creative tool, however, my experiments with WIRED and neural networks were unexpectedly inspiring. Strange alternate histories, quotes, bizarre products, and koan-like statements about the future emerged.

My deep concern for the future of these language models is combined with a desire to read the first few chapters of a book featuring Larry Ellison, his wife FmmL, and Denny Dallas. Maybe they’re all Apple fans.

The plot of 2025’s bestselling tech thriller?

Thanks for reading! I might get around to turning this model into some kind of chat bot in the next few weeks. Find me on Twitter: @smithclay.