Image credit: Greg Stewart/SLAC National Accelerator Lab

Nearby Stellar Explosions Bathed Earth in Radioactive Debris

This bombardment of debris may have even changed Earth’s climate.

An international team of scientists has found evidence of radioactive debris at the bottom of Earth’s largest oceans. Where did it come from? A series of massive supernovae some 326 light-years away, dating back to between 3.2 and 1.7 million years ago, which is relatively recent in astronomical terms.

The scientists discovered iron-60, a radioactive isotope produced in the cores of large stars and in supernovae (exploding stars), in sediment and crust samples taken from the Pacific, Atlantic, and Indian Oceans.

Supernovae are huge explosions that occur when massive stars run out of nuclear fuel and collapse in on themselves, blasting heavy elements and radioactive isotopes across space. According to the results of the new study, published in the journal Nature, a rapid series of these explosions occurred around three million years ago, showering Earth in radioactive debris.


“We were very surprised that there was debris clearly spread across 1.5 million years,” said Dr. Anton Wallner, a nuclear physicist at the ANU Research School of Physics and Engineering, in a press release. “It suggests there were a series of supernovae, one after another.”

This bombardment of debris may have even changed Earth’s climate. “It’s an interesting coincidence that they correspond with when the Earth cooled and moved from the Pliocene into the Pleistocene period,” Wallner continued. What’s more, the team also found evidence of iron-60 from an older supernova eight million years ago, coinciding with global faunal changes in the late Miocene. Coincidence?

The researchers are not quite sure how nearby supernovae could change the planet’s climate or affect life, since the radiation would have been too weak to cause direct biological damage or trigger mass extinctions. However, some scientists have hypothesized that supernovae could influence our planet by increasing cloud cover or depleting the ozone layer, which could explain some of the changes on Earth around those same times.

To detect iron-60, the team needed extremely sensitive techniques to identify the atoms, because “[i]ron-60 from space is a million-billion times less abundant than the iron [iron-56] that exists naturally on Earth,” said Wallner in the release. The team collected interstellar dust from 120 ocean-floor samples spanning the past eleven million years, then separated the tiny traces of interstellar iron-60 from terrestrial iron isotopes using the Heavy-Ion Accelerator at ANU. They found that these tiny traces occurred all over the globe, but only between 8.7-6.5 and 3.2-1.7 million years ago.

“We don’t have any concrete evidence that any one event is tied to a supernova,” University of Kansas astronomer Adrian Melott, who wasn’t involved in the study, told Maddie Stone of Gizmodo. “But the odds are, one or more are.”

According to the researchers, if the ageing star cluster was around 326 light-years away, as they estimated, the explosions would have been visible from Earth. Although they would have appeared small in the sky, each would have been about as bright as the moon. Wow.
