Shortly before noon on December 1, an ominous wail pierced the air throughout the Hawaiian Islands. The state’s emergency management agency had flipped the switch on its “Attack Warning Tone,” a foreboding siren that heralds an imminent nuclear attack. Not heard in decades, the tone will now ring out on the first business day of each month — one of a series of measures the state is undertaking to adjust to life in the shadow of North Korea’s ongoing ballistic missile tests and the bizarre gamesmanship between the “Rocket Man” and our own explosively tempered president. Meanwhile, on the other side of the world, Vladimir Putin sat down last month to practice the forgotten art of nuclear war, personally pushing the button to launch three ballistic missiles during a military drill, a step further than any U.S. president has ever gone in a nuclear exercise. In response, the Swedish government has begun updating shelters untouched since the fall of the Soviet Union. As governments around the globe dust off moldering contingency plans for an atomic sunset, it’s official: Nuclear Armageddon is hovering over the world once again.

It’s been decades since most Americans have thought seriously about nuclear war, although we’re regularly entertained with reality TV shows about “preppers” readying themselves for it, or a zombie invasion. What if, though, it turns out that they’re the smart ones? If, in the coming months or years, the standoff with North Korea turns hot and we confront a nuclear holocaust, and millions of people flee toward long-forgotten fallout shelters, one of the first questions we’ll face is the simplest: What do you eat when the world ends? It’s actually a question that the government has spent a lot of time — and millions of dollars — struggling with. The answer, though, may not encourage you to survive.

The threat of nuclear annihilation didn’t just hang over American society throughout the 1950s, it wholly reshaped it. In the early years of the Cold War, hundreds of thousands of volunteers manned church steeples, fire towers, and building rooftops in round-the-clock vigils to watch for approaching Soviet bombers. New York City distributed 2.5 million military-style dog tags to schoolchildren to help officials both identify bodies and reunite separated families after an attack, while schools in an Indiana county tattooed children’s blood types under their armpits — rather than on their arms, which could be blown off — to help speed transfusions in the event of injury.

The expectation, which seems astonishing now, was that much of the nation’s population would survive a catastrophic Soviet nuclear attack. Around the country, fallout shelters were constructed beneath backyards, in suburban basements, and in the bowels of sturdy municipal buildings, each helpfully researched by FBI agents and engineers and then marked with yellow-and-black signs — many of which are still visible today, faded and rusting on elementary schools and post offices. Government officials also turned to America’s abundant natural wonders: As engineers hollowed out mountains to serve as secret emergency government bunkers and created small underground cities inside places like Cheyenne Mountain in Colorado and Raven Rock Mountain in Pennsylvania, adventurous Boy Scouts were tasked with mapping abandoned mines and underground caves that could be used to protect civilians. In Tennessee, the state’s civil defense director estimated that about 800,000 of the state’s 3.5 million residents could be housed in its extensive network of underground caverns, while in Hawai‘i, civil defense officials identified 28 lava tube caves for residents to retreat to in the event of an attack.

As these efforts advanced — the Army Corps of Engineers ultimately identified 450 caves around the country that could house atomic refugees — officials still struggled with the task of preparing the population of an entire country for the prospect of living underground. Government planners led several large-scale experiments to determine the optimum conditions and supplies, including one where California prison inmates received a day off their sentences for each day they lived in an underground shelter. Other experiments were little more than publicity stunts, dreamed up by entrepreneurs who seized on public hysteria to market survival kits for basements and prefabricated shelters for backyards. Bomb Shelters, Inc., convinced Melvin Mininson and his new bride, Maria, to spend their honeymoon in Miami, 12 feet below ground in an 8-by-14-foot steel-and-concrete bunker; the pair emerged hot and dusty after 14 days, then promptly left for a real, company-paid, two-week honeymoon in Mexico. All told, during the peak of the fallout shelter craze, from the mid-1950s to the late 1960s, the government tallied that some “7,000 volunteers had participated in over 22,000 man-days of shelter living in occupancy tests ranging from family size to over 1,000 people.”

These experiments ultimately produced enduring national standards for underground shelters, such as a minimum of 10 square feet of space per person — which, while only half the space allotted inmates in crowded jail cells, was more than three times the amount of space given to prisoners at the Nazis’ Bergen-Belsen concentration camp, and six times as much space per person as inside the notorious Black Hole of Calcutta, the government explained helpfully in one report on shelter life. The tests also zeroed in on answers to fundamental questions that had plagued doomsday planners for more than a decade: What’s the minimum level of sustenance one needs to survive the apocalypse, and how do you get that to some 50 million hungry survivors?

In 1955, Eisenhower’s Federal Civil Defense Administration launched a propaganda campaign they called “Grandma’s Pantry,” calling for each household to have ready a seven-day supply of food and water for an attack. “Grandma’s Pantry, the symbol of preparedness. Unexpected company? Grandma always had plenty for everyone,” explained one radio ad. “In an emergency or during evacuation in case of enemy attack it’s too late to plan. You’ll have to depend on your own resources — on Grandma’s Pantry.” Women’s magazines carried articles like “Take these steps now to save your family,” and Sears, Roebuck and Co. erected government-produced “Grandma’s Pantry” exhibits in 500 of its stores, encouraging people to stock up on Hawaiian Punch, Campbell’s soup, Tang, boxes of cornflakes, and candy bars.

Yet as the 1950s unfolded, it became clear that buying a few extra cans of food at the grocery store wasn’t going to feed the entire country sufficiently. In urban areas, high-rises, and many southern states where homes lacked basements, there would need to be larger government-run shelters. People couldn’t be expected to bring their own supplies and food; everything they would need had to be ready and waiting inside a shelter when nuclear war arrived. The Eisenhower administration embarked on the quest to develop the perfect “Doomsday food.” The requirements were stark: America’s Armageddon ration needed to be nutritious, cheap, easy to eat, shelf-stable, and reproducible at mass scale. Taste, visual appeal, quality, packaging, and all the other attributes that normally come with designing a successful, mass-produced consumer good would be discarded in favor of the simplest food the government could design.

That coldly logical approach, combined with an extensive 1958 study by the Department of Agriculture and the Department of Health, Education, and Welfare, led the government toward a single commodity as the foundation for its plan to feed a nation: The “parched wheat form known as Bulgur,” one of the simplest ingredients known to man. The main ingredient in dishes like tabbouleh, kibbeh, and pilafs, bulgur is nutty, nutritious, high-fiber, and supremely safe. “Bulgur was selected for this investigation because it is processed from a basic agricultural commodity, whole-grain wheat, which is plentiful in the U.S., low in cost, highly palatable, and reportedly very stable,” one government report explained.

That last quality stood out in particular, because whatever the government stockpiled would need to hold up for years inside fallout shelters, awaiting the apocalypse. “Indeed a long shelf life may well be the single most important criterion for choosing bulgur in a stockpiling program,” the government reported. As part of its research, the USDA eventually landed on crackers as the best medium for bulgur-wheat rations in a bunker scenario; after 52 months of storage it reported merely a “discernible but inconsequential decrease” in flavor.

“This is one of the oldest and most proven forms of food known to man,” Paul Visher, deputy assistant secretary of defense for civil defense, explained to Congress as he presented a plan to mass-produce the crackers. “It has been the subsistence ration for many portions of the earth for thousands of years. Its shelf life has been established by being edible after 3,000 years in an Egyptian pyramid.” After millions of dollars and years of research, it turned out that after a nuclear apocalypse, the remnants of the American civilization would survive by chowing down on whole-wheat crackers. The government dubbed its creation the “All-Purpose Survival Cracker.”

New York Governor Nelson Rockefeller, perhaps the nation’s leading civil-defense enthusiast, bragged that a day’s worth of crackers cost just 37 cents per person — an economical solution to feeding an entire nation following nuclear war. A new problem emerged, though: There wasn’t enough capacity to turn the necessary three million bushels of bulgur wheat into the 150 million pounds of crackers that the government originally believed it needed; at the time, nearly all of the government’s surplus bulgur went through a single plant at the Fisher Flour Mill in Seattle, and it couldn’t possibly handle the volume the nation now required to secure itself against nuclear war. On December 21, 1961, the Pentagon convened the nation’s cereal companies to discuss the best way to quickly ramp up manufacturing of the chosen biscuit recipe.

“Additional suppliers must be found to quickly stock shelters,” the Pentagon said. Within five months, it had more than $4 million in contracts ready with Long Island’s Sunshine Biscuits, Ohio’s Kroger Company, and Richmond’s Southern Biscuit Company. Nabisco and the United Biscuit Company of America (now Keebler) also joined in the effort, and in the end, industry met the challenge: The bulgur “survival crackers” were manufactured in truly mass quantities — ultimately more than 20 billion crackers were produced by the end of the program in 1964 — and then sealed in airtight tins that varied in size depending on the manufacturer, often over five pounds and holding about 400 crackers. The tins were rushed across the nation to fallout shelters, caves, and mountain bunkers where Americans might ride out nuclear war.

Plans called for shelters to stock 10,000 calories of food per person, which would have worked out to a little over 700 calories per person, per day over the expected two-week stay underground. Each government-run shelter was also to be stocked with 21-inch-tall fiberboard drums, lined with plastic, that would start out as water storage — containing just 3.5 gallons of drinking water per person for the entire duration of the confinement — and then, once empty, be converted into toilets. Since there was little else to do in a shelter, the government literature encouraged serving six small single-cracker “meals” each day of precisely 125 calories. The cracker diet would also include stockpiled tins of mouth-soluble “carbohydrate supplements,” i.e., suckable yellow and red hard candy. As one official explained, “Although this may seem somewhat austere, nutrition experts consider it adequate and in accord with minimal survival concept.” That’s a bureaucratic way of saying that the crackers would provide the equivalent of a Doomsday starvation ration — you’d still be hungry, you’d still lose weight, but you wouldn’t starve to death. Herman Kahn, a Cold War strategist, glibly assessed, “Well, you’re sipping a drink, munching on something tasteless, and it’s dark and crowded — a Greenwich Village nightclub.”

The government expected that survivors would be able to emerge from shelters to search for food and water after only a couple of weeks. “The contamination of food and water we think is one of the lesser problems, not one of the major problems of the post-attack environment,” Steuart Pittman, John F. Kennedy’s civil defense head, told one military audience. “If the [radioactive] particles got into the food, you could wash it out. It would be possible to harvest the crops in the field after a rain or two.” Based on research conducted around the sites of dozens of nuclear tests in the Nevada desert, planners estimated that few crops would be lost entirely in the first year after an attack and that by the following year, most agriculture could return to normal. Of course, according to the government planners, the lingering gamma radiation would limit the amount of time that workers could safely till the fields, but, according to the actuarial tables, enough Americans would die in the nuclear attack that even short workdays would suffice to feed the living.

Officials in each region studied native food stores and agriculture supplies, then estimated the difference between a nuclear war in the springtime and one in autumn. In the southeastern United States, for instance, officials estimated that if an attack occurred in the fall, agriculture harvesting would be 95 percent complete and that survivors in states like Alabama or Georgia would then be able to forage a pound of food a day — catching fish or game, or eating roots or berries.

Nebraska, meanwhile, went one step further and tested out a fallout shelter for livestock — the Roberts Dairy Company built an underground bunker capable of holding 200 cows and ran a two-week test with 35 cows and one lucky bull, named Aristocrat, all overseen by two student cowhands. The cows appeared to barely notice they were living underground.

Next door in Kansas, officials calculated they could probably provide two million pounds of food after an attack, and that if survivors reduced consumption to an “austerity diet” of 2,000 calories, the state’s food stocks could last nearly two months. Besides the official stocks, Kansas’s wildlife could help too: Its forests, plains, and waters contained, officials believed, 11 million “man-days” of food — the amount of food needed to feed an adult for one day — in rabbit meat, 10 million man-days of wild birds, five million man-days of edible fish, and nearly 20 million man-days of meat in residential pets. After an attack, officials also planned to confiscate household vitamins for the good of the general population and ration carefully the state’s 28-day supply of coffee. Everything would be fine.

As it turned out, though, Rockefeller’s enthusiasm notwithstanding, not even the federal government would stand behind its chosen cracker. In February 1962, when the Navy set out to test how people would survive on fallout shelter rations, it sequestered 100 sailors for two weeks in a fallout shelter on the grounds of the Bethesda Naval Hospital outside Washington, D.C., and even then it refused to feed the sailors survival crackers alone, supplementing the meals with different types of soup, peanut butter, jellies, and coffee.

Perhaps no one should have been surprised, given that the crackers’ major flaw was immediately apparent to the reporters who were offered samples: “Wafers... were about 2 inches square and ¼ inch thick and resembled small pieces of wallboard,” wrote one reporter. “They tasted like wheat and crumbled easily.” The biscuits, the New York Times wrote, “may be destined in the era of possible nuclear warfare for the notoriety that World War II gave to Spam, the crushed meat.” (They were not.) Whether the crackers were preferable to starving seemed debatable. Timothy J. Cooney, who ran civil defense for New York City, had an entirely different suggestion, at any rate: “Get into a building, whether it had a shelter sign or not on it, that had an A&P in it,” he told the Times.

All-Purpose Survival Cracker tins were tested for freshness annually by, of all government entities, the U.S. Army Veterinary Corps, and they appeared to hold up over the decade following their introduction. In the early 1970s, the crackers were still fresh enough that, as Cold War fears receded and the fallout shelters seemed increasingly obsolete, international aid groups turned to the stockpiled tins for natural-disaster and famine relief around the world. CARE International exported millions of biscuits to countries like Niger and Chad. “They saved a lot of lives here,” Jack Soldate, CARE’s Niger director, said at the time.

The government warmly embraced the idea of converting Cold War rations to disaster relief, believing that it had discovered a great way to recycle and rotate the stores of crackers. In 1974, officials estimated there were still nearly 150,000 tons of the crackers stocked in fallout shelters out of the nearly 165,000 tons that had been originally baked. “Last week, we opened up a can and ate some,” explained the D.C. civil defense director, brushing aside concerns about their freshness. One Pentagon official, appearing not to understand the original purpose of the biscuits, reported that the rations dispatched for disaster victims weren’t meant to be a complete diet on their own. “You wouldn’t want to eat them for a couple of weeks with nothing else,” he said. “They’re good with a little bit of cheese on them with a martini on the side.”

Later that year, the U.S. exhumed 20 tons of crackers — hidden in an old streetcar tunnel under Dupont Circle in Washington, D.C., that had been used since the Cuban Missile Crisis to store civil defense supplies — and shipped them to Bangladesh to feed survivors of a monsoon there. Other cracker caches were dispatched to Guatemala to aid victims of a devastating 1976 earthquake. The recipients of the disaster food reported developing what one newspaper described as “severe gastric disturbances” after ingesting the biscuits. As those reports trickled back to the U.S., officials across the country wondered just what they’d stocked away for a nuclear apocalypse. In mid-1976, E. Erie Jones, the Illinois state emergency coordinator, convened a group in his shag-carpeted office in Springfield for a taste test; it didn’t even start well. The mere smell from the newly opened tin caused coughing fits. He took a single bite, grimaced, then canceled the rest of the experiment. In reporting the taste test gone wrong, the Chicago Tribune declared that the “Survival biscuits [would be] better as weapons” than food if a war did unfold.

That fall, after hearing similar reports from around the country, the federal government recommended officials discontinue the use of the millions of stored biscuits. By 1978, the federal government’s civil preparedness office issued what amounted to a nationwide recall of any crackers that remained in shelter stockpiles. “As a result of recent laboratory and other tests, a high probability exists that all of the cereal-based rations stored have become rancid,” the circular said. “It is recommended that cereal-based rations no longer be considered as shelter supplies and should be destroyed or disposed of.” It was mostly a moot point: By the end of the 1970s, nearly all of the nation’s fallout shelters had been long forgotten. The era of civil defense efforts had passed.

In 1979, New York City abandoned its efforts to give away the remaining supplies socked away inside its 10,800 fallout shelters and began hiring contractors to transport the stockpiles to landfills for $38 a ton. Many of the unused and rancid crackers were ground up and mixed with other bakery waste to become high-energy chicken feed.

Some of the tins stayed hidden, though. During a routine inspection of the foundation of the Brooklyn Bridge in March 2006, workers stumbled upon a sealed vault that held an enormous stockpile of Cold War fallout shelter supplies. There were medical kits, paper blankets, 50 large drums of drinking water, and boxes upon boxes of “Survival Biscuits.” All told, Joseph Vaccaro, a transportation department worker, calculated they’d uncovered around 352,000 crackers. Iris Weinshall, the city’s transportation commissioner at the time, was brave enough to taste one of the forgotten All-Purpose Survival Crackers. “It tasted like cardboard, but with a nasty backbite that stayed in your mouth for hours,” she told the New York Times. “I cannot think of eating a saltine now without that taste coming up.”

Given the vast quantities of Survival Crackers produced, it’s perhaps no surprise that they continue to pop up now and again — they’re regularly for sale on eBay and, even today, people stumble upon untouched fallout shelters. The remnants of this forgotten, scary time have given rise to their own oddly compelling subgenre of YouTube videos and survivalist experiments: the taste test. In 2015, a Boston city employee tasted a cracker he’d discovered, long forgotten, inside a municipal office basement. Another Massachusetts family decided to crack open a tin as part of a weekend wedding celebration, reporting that when they did, “The oppressively musty smell that permeated the most beautiful porch in Worcester was not a happy one.” A Canadian historian who tried one as part of his Cold War research found himself similarly unimpressed, commenting: “They basically taste like rancid oil.”

“If I was starving, I’d look around for bugs to eat instead,” Tribeca printer Dikko Faust said after sampling one as part of a New York party that included “gourmet condiments” like pâté, caviar, capers, and a pepper-infused Spanish goat cheese. Others at the event had more robust stomachs. Sociology professor Emily Horowitz explained after she wolfed down several crackers, “I’m a vegan. I’m used to things tasting horrible.”

Nothing has ever replaced the Doomsday cracker. The government never again embarked on any sort of mass stockpile to ready the nation for a large-scale catastrophe. Today FEMA, in responding to natural disasters like the hurricanes in Puerto Rico, relies on Meals Ready to Eat, canned goods, and, apparently, a lot of Skittles.

In fact, broadly speaking, there’s no plan in place today to protect the civilian population in the event of war or a massive natural disaster. Instead, the government’s hopes and ambitions have shrunk to just protecting a small number of high-level officials in its own mountain bunkers scattered around the country.

Today, inside those facilities in places like Mount Weather in Virginia or Raven Rock in Pennsylvania — both hollowed-out mountains filled by small cities made up of free-standing buildings — stocks of military MREs have replaced the Cold War biscuits. The bunkers’ own cafeterias would also help feed the thousands of high-ranking officials who would be evacuated in the event of an emergency; at NORAD, the Pentagon’s air-defense headquarters inside Cheyenne Mountain, Colorado — the nuclear-hardened bunker site made famous by the 1980s movie WarGames with Matthew Broderick — there’s even a Subway franchise deep inside the mountain that feeds the bunker’s regular workforce. At least some lucky air force personnel would continue to eat $5 foot-longs after the end of the world.

Those of us left outside the bunkers will be left to fend for ourselves. In the modern era of “Doomsday Prepping,” survivalists trade recipes for their favorite style of hardtack biscuits, and, among a certain Silicon Valley set, enthusiasts have embraced the idea of Soylent “Ready to Drink Meals.” One of the major manufacturers of today’s “survival foods” says that its business is up 40 percent in the last four months, while Costco sells “disaster-preparedness” products on its website — offering “Emergency Food by the Pallet,” including 36,000 servings of food for $5,999.99. (When Eater critic Ryan Sutton sampled some apocalypse foodstuffs back in 2011, he seemed inclined to choose death over freeze-dried meat.)

Given today’s headlines, such “prepping” is no longer the sole province of wacky survivalists. It seems clear, in fact, that world tensions are forcing these anxieties to leak into the general public; as Bloomberg Businessweek recently proclaimed, “It’s boom time for the end times.” Just remember, when you’re packing for Doomsday, don’t forget the cheese or the martinis.

Garrett M. Graff is a magazine journalist and historian, and the author of Raven Rock: The Story of the U.S. Government’s Secret Plan to Save Itself—While the Rest of Us Die (Simon & Schuster, 2017), from which this piece has been adapted and expanded.

Fact checked by Samantha Schuyler

Copy edited by Rachel P. Kreiter

Thanks to Bill Geerhart at CONELRAD.com