Depending on your point of view, the Manhattan Project — the collaboration between the US, UK, and Canada that created the first atomic bombs — is one of the greatest or worst scientific endeavors of our time. You could say that the Manhattan Project is what ultimately ended World War II, and thus in the long run saved millions of lives as it allowed the world to get on with rebuilding itself. Alternatively, you could just as easily blame the Manhattan Project for the decades-long misery of the Iron Curtain and Cold War. Whatever your worldview, though, I think we can all agree that the physicists, engineers, and chemists who worked on the Project were consummate geniuses and paragons of professionalism. Except they weren’t.

Following the atomic bombings of Hiroshima and Nagasaki, scientists at Los Alamos — the Project’s primary research facility — were still carrying out experiments on a subcritical mass of plutonium, with only the thin blade of a standard flathead screwdriver (held by a scientist!) preventing the plutonium from going supercritical and irradiating everyone in the room. Suffice it to say, eventually the screwdriver slipped — and the scientist holding it took a lethal burst of neutron radiation.

The Atomic Age

As you may know, two atomic bombs were dropped on Japan. Little Boy, a very simple gun-type atomic bomb in which one lump of uranium-235 is fired into another lump of uranium-235, was dropped on Hiroshima on August 6, 1945. Fat Man, a more complex design that imploded a sphere or core of plutonium, was dropped on Nagasaki three days later, on August 9. Later that day, Emperor Hirohito began the process that would eventually result in Japan’s surrender and the end of World War II.

What you may not know is that the Allies actually produced three plutonium cores, at a cost of around $500 million each (most of the Manhattan Project’s funding was spent on producing the fissile uranium and plutonium fuel). The first core was used in the Trinity test, to see if the complex implosion technique was actually feasible. (The Trinity test, on July 16, 1945, is usually considered the beginning of the Atomic Age.) The second core was detonated over Nagasaki and killed tens of thousands of people. The third core… well, the third core was going to be dropped on Japan, but the country’s surrender got in the way. Instead it was kept at Los Alamos for further investigation, whereupon it promptly killed two scientists and became known as the demon core.

The Demon Core

When an atom undergoes fission, it usually splits into two smaller atoms, along with a few leftover neutrons that are emitted as waste. These waste neutrons can then hit nearby atoms and cause them to fission in turn. When a fission-type nuclear bomb explodes, what’s actually happening is that the uranium or plutonium fuel is going supercritical. This essentially means that each fission releases enough neutrons to trigger, on average, more than one further fission — so the chain reaction doesn’t just sustain itself, it grows exponentially. This requires a certain mass of fissile material in a suitable shape and density (i.e. the material’s critical mass). One of the Manhattan Project’s core areas of research was discovering the exact controlled conditions that could take a normal, radioactive lump of uranium or plutonium and make it supercritical — thus creating an atom bomb.
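The whole subcritical/critical/supercritical distinction boils down to one number: the average count of new fissions each fission causes (physicists call it the effective multiplication factor, k). A toy Python sketch (purely illustrative — real criticality calculations involve geometry, density, and neutron transport, and the function name here is made up) shows how brutally that arithmetic scales:

```python
# Toy model of a fission chain reaction. Each fission causes, on
# average, k further fissions; we track the expected number of
# fissions in each successive "generation" of the chain.
#   k < 1: subcritical  (the chain fizzles out)
#   k = 1: critical     (self-sustaining, like a reactor)
#   k > 1: supercritical (exponential growth, like a bomb)

def fissions_per_generation(k: float, generations: int, start: int = 1) -> list[float]:
    """Expected fissions in each generation, starting from `start` fissions."""
    counts = [float(start)]
    for _ in range(generations - 1):
        counts.append(counts[-1] * k)
    return counts

# Subcritical: each generation is smaller than the last.
print(fissions_per_generation(0.5, 5))  # [1.0, 0.5, 0.25, 0.125, 0.0625]

# Supercritical: each generation doubles. Fission generations are
# roughly 10 nanoseconds apart in a bomb core, so this blows up fast.
print(fissions_per_generation(2.0, 5))  # [1.0, 2.0, 4.0, 8.0, 16.0]
```

The point of the beryllium-reflector experiments described below was to find out exactly how close a given core was to the k = 1 boundary.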

While you would think that such research into supercriticality would be carried out with the chemists and physicists safely behind half a mile of rock and lead, with long mechanical armatures manipulating the fissile material, our boys at Los Alamos were a little more… er… blasé. To discover the critical mass of the plutonium cores that would be used in the Trinity test and Fat Man bombs, Los Alamos researcher Louis Slotin devised a procedure that would later be called (by Richard Feynman, no less) “tickling the dragon’s tail.” This technique involved Slotin — often while wearing blue jeans and cowboy boots, apparently — lowering a beryllium hemisphere over the plutonium core. Beryllium is a neutron reflector: if it sits close enough to the core, it bounces enough escaping neutrons back into the plutonium to trigger supercriticality. As Slotin lowered the hemisphere (gripping it by a thumb hole in the top), all that prevented it from completely covering the core was the blade of a flathead screwdriver.

He tickled the dragon’s tail almost a dozen times before the screwdriver finally slipped — on May 21, 1946 — causing the plutonium core to go supercritical and emit a massive burst of neutron radiation. Slotin reported a flash of blue light and a wave of heat across his skin; within about half a second he flipped the beryllium reflector onto the floor, ending the chain reaction. It was too late, though: He had received around 1,000 rad of radiation, and died nine days later of acute radiation syndrome.

Alvin Graves, who was watching over Slotin’s shoulder, survived the incident but developed a host of neurological and vision problems — and died 19 years later of a heart attack, possibly due to the radiation frying his heart. Because Slotin’s body absorbed most of the neutron burst (what a hero), no one else died immediately, but at least two other personnel in the room died over the next 30 years due to radiation-related complications (cancer, anemia).

Unsurprisingly, the protocol for criticality experiments was changed immediately after the incident, with personnel kept a quarter-mile away and remote-control machines taking the place of Slotin’s screwdriver.
