Definitively tying one of these older explosions to a mass extinction would be next to impossible, because the radioactive atoms that serve as supernova tracers would have long been lost to time. But that’s not the case with more recent explosions. Their relics persist, both in lunar soil and on the seafloor.

“You can use sea sediments as a telescope to learn about the supernova, and conduct supernova archaeology,” says Brian Fields, an astrophysicist at the University of Illinois who was one of the first people to propose doing this. “You’re digging into the earth to look at the cosmic past.”

A trio of new studies takes supernova archaeology into new territory, tracing different sets of stellar shrapnel to their sources.

When looking for dead star remains, scientists use a radioactive isotope called iron-60, which supernovas churn out in vast quantities (Earthly sources produce only one-tenth as much). Astrophysicists first located it in ocean-floor rocks in 1999; now a team led by Anton Wallner at the Australian National University in Canberra has found it at four separate locations, and in sediments stretching further back in time. That global distribution suggests the iron not only comes from recent supernovas, but from many of them.

Wallner’s team found two distinct peaks in iron-60, one between 1.5 and 3.2 million years ago, the other between 6.5 and 8.7 million years ago. They used a particle accelerator to meticulously count single atoms of iron-60—“basically finding needles in haystacks,” says Fields.
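Why does iron-60 work as a clock over these timescales at all? Its half-life is roughly 2.6 million years (a published value; earlier estimates were lower), long enough that a measurable fraction of atoms from both peaks survives today, short enough that any iron-60 found cannot be primordial. A back-of-the-envelope sketch, using rough mid-peak ages rather than the team’s actual measurements:

```python
# Half-life of iron-60 in millions of years; roughly 2.6 Myr per
# published measurements (an illustrative figure, not from the studies above).
HALF_LIFE_MYR = 2.6

def surviving_fraction(age_myr: float) -> float:
    """Fraction of an original iron-60 deposit still undecayed after age_myr."""
    return 0.5 ** (age_myr / HALF_LIFE_MYR)

# Rough midpoints of the two peaks reported by Wallner's team:
for age in (2.3, 7.5):
    print(f"{age} Myr old deposit: {surviving_fraction(age):.0%} of the iron-60 remains")
```

Even the older deposit retains on the order of a tenth of its original iron-60, which is why single-atom counting in a particle accelerator can still pick it out.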

Many of these atoms landed on the moon, too, according to a separate study published this week. Samples from two Apollo missions contain an order of magnitude more iron-60 than cosmic rays alone could have deposited. It presumably came from the same supernovas that left the deposits on Earth.

Meanwhile, a third team used modeling to refine the stellar excavation. Astrophysicist Dieter Breitschwerdt of the Berlin Institute of Technology and his colleagues found evidence for two distinct stellar cataclysms within the younger of the iron-60 peaks. The researchers used statistical analysis to estimate the stars’ sizes, drawing on figures such as the stellar population of the local neighborhood and the brightnesses of surviving stars. The more distant explosion came from a star about 8.8 times the mass of the sun, which blew up about 1.5 million years ago; the closer ex-star was about 9.2 solar masses and exploded 2.3 million years ago.

According to Breitschwerdt’s calculations, which took several years of work, the stardust would have taken about 100,000 years to shower Earth. Around the same time, Earth experienced a sharp decline in global temperatures and the onset of the Pleistocene ice ages. The cause of these climate changes is still debated, but some anthropologists argue that the shift contributed to the evolution of human ancestors. The creep of glaciers was linked to a great drying across Africa, which turned forested ecosystems into arid grasslands. Some argue this could have spurred human ancestors to walk upright rather than climb. There is evidence that the diets of early Homo species changed, incorporating more meat to ensure a diverse food supply. It’s even possible that hominid brains grew larger to help them figure out where to find food.