Today in Tedium: Computers often seem like they’re above the supply chain. Putting aside hot devices like the latest iPhone and the different variants of the Microsoft Surface, it’s generally easy to get a computer of some kind that will allow you to do all sorts of interesting things. They’re mostly built for the long haul, too—designed to last for a few years and ultimately responsible for nearly infinite amounts of production when it comes to digital content, code, and imagery. We don’t often think about the ways resourcing shapes how these machines are sold, how much they cost, and how many get made. But they are very much affected by these factors—and in at least one case, a shortage was the digital equivalent of long lines at gas stations. Today, we talk about RAM chip droughts throughout history, their causes, and their impacts over the years. — Ernie @ Tedium

The course correction that made RAM really expensive in the summer of 1988

The Chicago Tribune compared “The Great DRAM Shortage of 1988” to the oil shortages of the 1970s. And honestly, the comparison was pretty close to the mark. It was a story filled with trade conflicts and manufacturing challenges, but it was ultimately defined by a complete misunderstanding of the computer market by chip producers, who underestimated the demand for certain kinds of RAM.

The issue was basically this: After the computer industry slumped in 1985, prices for RAM chips fell to extreme lows. According to an analysis by retired computer science professor John C. McCallum, a 256-kilobit DRAM (dynamic random-access memory) chip advertised in BYTE magazine could be had for just $2.95 in September of 1985, a sharp decline from the $8.95 the same-capacity chip sold for in January of that year. Higher-capacity memory remained expensive: A two-megabyte chip, for example, was $599 in September 1985, putting the cost-per-megabyte for larger-capacity RAM at $300, per McCallum’s analysis. But even at the high end, the prices kept dropping: The cost-per-megabyte fell to as low as $133 in October of 1987.

What drove the severe drop in price? To put it simply, an industry slowdown, mixed with extreme overproduction, caught chip makers off-guard.

“About 1985, the personal computer market was tremendously overextended,” Texas Instruments spokesman Stan Victor told the Tribune. “There were a lot of manufacturers, all fighting for the same 10-percent share of the market. Then, demand for computers started slowing, and the computer manufacturers found themselves with a tremendous inventory. So they stopped buying chips. For about six months there was no demand. People started selling chips for whatever price they could get.”

To offer a comparison to the modern day, you know how gas has been really cheap for the last year or so?
Well, that cost decrease has been pretty rough for the oil industry, leading to massive bankruptcies and job losses. The steep drop in chip prices, down to a commodity level, had a similar effect in the U.S., with numerous RAM manufacturers closing their doors in response to the downturn.

Manufacturers were looking for a target for their scorn. That target, as it turned out, was Japan. In June of 1985, the Semiconductor Industry Association filed for regulatory relief with the U.S. Trade Representative, arguing that Japan was unfairly penalizing the U.S. chip industry by favoring local computer chips over American ones.

The plea had its intended effect. In 1986, the U.S. Commerce Department accused the Japanese chip industry of flooding the American market with RAM chips priced below market rates, with the goal of pushing American companies out of the market. Not long after, the Reagan Administration put economic sanctions on the country, effectively doubling the price of electronic products manufactured in Japan and sold in the U.S. It was the first time the U.S. had retaliated against Japan on trade issues since World War II, according to the New York Times.

By late 1987, the sanctions ended after the administration decided that the market was no longer being flooded with underpriced chips. It was a huge win for the U.S. semiconductor industry, and one that the Harvard Business Review highlighted as an example of how an industry can swiftly and successfully influence the U.S. government. “This series of measures is unprecedented for its swiftness, severity, and agreement with industry recommendations,” David B. Yoffie wrote in May of 1988.

The flood of RAM into the U.S. computer market—which cost chip makers as much as $5 billion, according to the Tribune—left chip manufacturers very timid about making too much RAM. Manufacturers shifted away from making 256-kilobit DRAM chips, converting their factories to 1-megabit chips.
It turned out to be a huge miscalculation, and fixing that error was not going to be easy.
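McCallum’s cost-per-megabyte figures above come down to simple division—price divided by capacity in megabytes (the quoted $300-per-megabyte figure implies the $599 part held two megabytes). A minimal sketch of that arithmetic, with an illustrative helper name of our own choosing:

```python
def cost_per_megabyte(price_usd, capacity_mb):
    """Dollars per megabyte for a single memory chip or module."""
    return price_usd / capacity_mb

# September 1985: a $599 two-megabyte part works out to roughly $300/MB.
print(cost_per_megabyte(599, 2))  # → 299.5
```

The same division is how a $505-per-megabyte figure or a $133-per-megabyte figure is derived from any listed chip price and capacity.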

“I think when the year ends, supply will be 10 percent to 15 percent less than what real market demand is.” — Peter T. Main, the marketing vice president of Nintendo, discussing with the Los Angeles Times how chip shortages were hurting the video game market in 1988. The market was squeezed not only by the sharp decrease in DRAM supply—which was affecting the industry as a whole at the time—but also by a decline in SRAM (static random-access memory) chips, which Nintendo most notably used in its battery-backed save cartridges. (Japanese memory manufacturers had moved away from producing SRAM, because DRAM was more lucrative.) The company delayed the American release of Zelda II: The Adventure of Link as a direct result of the SRAM shortage, and its competitor Sega also delayed the release of some games.

(BYTE Magazine ad)

Why was RAM so expensive in the late ’80s? Blame a really terrible miscalculation that was hard to fix

By July of 1988, RAM prices, which had slowly crept up in the early part of the year, surged. A megabyte of RAM, according to John C. McCallum’s price analysis, jumped from $199 to $505 in a single month, and the cost of a 256-kilobit DRAM chip had risen from $2.95 at the beginning of 1988 to $12.45—a price level maintained for nearly a year. Eventually, 1-megabit chips took over the market, but the price spikes led many RAM buyers to hold out before purchasing any more chips.

The RAM prices were clearly affecting sellers—many of whom were hiding their prices in ads in the magazines of the day, such as BYTE and Computer Shopper. In a July 1988 ad, Advanced Computer Products (which hid its prices for every kind of DRAM behind a request that customers call the company) leveled with its audience.

“Don’t feel like the Lone Ranger … we are also suffering from loss of memory,” the company wrote in the full-page ad, which listed prices for literally every other kind of computer part besides memory. “ACP sells more Memory Upgrade IC's than other mail order suppliers ... But! ... the present shortage is driving us up a wall! We can't get them at the right price, but we are getting them. PLEASE BEAR WITH US ... as the market price comes down our price will come down.”

Why did it take so long to fix this problem? To put it simply, a miscalculation in the amount of RAM needed is a tough problem to correct. The production process for 1-megabit chips, notes the Chicago Tribune, was much more difficult than that for the 256-kilobit chips. Chip fabrication took four months, start to finish. And complicating things even further, RAM manufacturers had converted their 256-kilobit factories into 1-megabit facilities, rather than producing the two chips in separate facilities.
That meant that when the problem finally showed itself, it was already too late to fix. “This cannibalism of 256k capacity would not have mattered, but then they found that, in practice, the converted capacity was largely standing idle, because the new one-megabit chips were so difficult to make,” Lamont Wood wrote at the time.

It took until mid-1989 for the problem to work itself out. John C. McCallum’s month-by-month analysis shows that the $505-per-megabyte price for higher-capacity chips remained stagnant until June of 1989, at which point it fell by nearly $500 in a single month. Soon, higher-capacity chips came onto the market, and within two years, four-megabyte chips could be had for less than $200—a massive decrease that came just in time for the internet revolution to take hold. With few hiccups, the cost-per-megabyte of DRAM has been falling ever since.