Many people know that work on nuclear weapons enabled the development of the first electronic computers. But it’s no less true that the humble refrigerator, in a roundabout way, enabled the development of the first atom bomb.

While reading the newspaper one morning in 1926, Albert Einstein nearly choked on his eggs. An entire family in Berlin, including several children, had suffocated a few nights before when a seal on their refrigerator broke and toxic gas flooded their apartment. Anguished, the forty-seven-year-old physicist called up a young friend of his, the inventor and scientist Leo Szilard. “There must be a better way,” Einstein pleaded.

Szilard, a stocky man of twenty-eight, had first impressed Einstein six years earlier by proving him wrong on a certain scientific point. (That didn’t happen often.) Szilard also had a knack for turning esoteric ideas into useful gadgets. In later years he became a sort of Thomas Alva Edison of high-energy physics, sketching out the first electron microscope and particle accelerator; he and Einstein had bonded in part over their love of such mechanical devices. (Although a theorist and somewhat flighty, Einstein came from a family of tinkerers—his uncle Jakob and father Hermann had invented new types of arc lamps and electricity meters—and he’d worked in the Swiss patent office for seven years.) So when Einstein called Szilard that morning, the two men agreed to collaborate and build a better, safer refrigerator.

Excerpted from CAESAR’S LAST BREATH by Sam Kean (Little, Brown).

This wasn’t as odd as it might sound: in the previous half century, refrigeration had become serious science. The study of thermodynamics and heat had led to the concept of absolute zero—the coldest possible temperature—and several labs around the world were racing to reach the bottom of the thermometer. Some of the best science revolved around attempts to liquefy certain gases: nitrogen, oxygen, hydrogen, methane, carbon monoxide, and nitric oxide. Throughout the 1800s this sextet—the so-called permanent gases—had resisted all efforts to liquefy them. This stubbornness had led some scientists to declare that these six gases could never be liquefied, that they somehow stood apart from the rest of matter. Other scientists said baloney—that powerful new cooling methods would eventually condense them. In particular, the latter group pinned their hopes on a clever, cyclical cooling process that involved removing heat from substances in several stages.

Stage one involved filling a chamber with a gas that was easy to liquefy. Call it A. Scientists first compressed A with a piston, then cooled the compression chamber with an external jacket of cold water. As soon as A had chilled down, a valve opened. This dropped the pressure on A and allowed it to expand into a larger volume. The key point is that expanding into a larger volume takes energy, takes work. (It’s similar to how a litter of puppies, if locked in a broom closet, would suddenly expend a lot more energy if you opened the door and let them run free inside the house.) And in this situation, the only energy A could draw on to expand and spread was its own internal store of heat energy. Depleting that store inevitably cooled A down even more, and it eventually condensed into a liquid at around –100°F.

Now came the clever part. The next stage involved a chamber of gas B, which was tougher to liquefy. Scientists once again compressed B with a piston to start. But for the cooling jacket this time, instead of cold water they ran liquid A through the jacket. This dropped gas B’s temperature to –100°F. Opening a valve then caused B to expand, which forced B to deplete its internal store of heat energy. Its temperature plunged to around –180°F, whereupon it also liquefied.