Over the course of Earth's 4.6-billion-year history, our planet has repeatedly experienced the sudden annihilation of flora and fauna. The fossil record testifies to five truly catastrophic "mass extinctions," during which large fractions of all species were wiped out in ruthless ticks of the geological clock. The demise of the dinosaurs 65 million years ago is the most famous such upheaval, even if it wasn't the most devastating.
Climate change, solar activity, volcanism, impacts, and other calamities have been cited as possible causes for these sporadic culls. But astronomers have proposed more distant killers: supernovae. For a brief time, a supernova--the cataclysmic explosion that ends the life of a star--can outshine an entire galaxy. Imagine compressing all the light from the Andromeda Galaxy's hundreds of billions of stars into a star-size dot and placing it just a few light-years from us. When you do, it's easy to see why a nearby supernova might spell certain doom for our biosphere.
Hard empirical evidence suggests that supernovae have left their mark on Earth in the recent geological past. The most recent such event may have occurred only 2.8 million years ago, when early human ancestors such as Australopithecus roamed Africa.
Ticking Time Bombs
Stars are essentially time bombs. Inside a stellar core, vast pressures and temperatures allow light atomic nuclei to fuse into heavier ones. The energy released by these thermonuclear reactions generates an outward pressure that counterbalances the star's tendency to collapse under its own weight. But stars have finite lifetimes. Once the nuclear fires go out, the internal pressure turns off. Death is near.
A lightweight star dies relatively sedately, puffing off its envelope to form a planetary nebula. But a star born with about eight or more solar masses suffers a catastrophic fate. Fusion ceases once the star has built up an iron core; the core then collapses and rebounds in milliseconds. Shock waves and neutrinos propagate out from the core, ripping apart the star's outer layers and flinging them into space to produce the energetic blast we witness as a Type Ib, Type Ic, or Type II supernova.
In contrast, a Type Ia supernova occurs in a binary system, where a white dwarf gradually snatches matter from a nearby companion star. When the white dwarf nears 1.4 solar masses--the famous Chandrasekhar limit beyond which the star cannot resist the pull of its own gravity--runaway thermonuclear reactions blow it apart.
As a Type II supernova shock wave plows through the star's envelope, nuclear reactions synthesize many different elements. Most are stable, but some undergo radioactive decay. These unstable nuclei spontaneously emit subatomic particles, changing into isotopes of the same element or into different elements altogether. A supernova unleashes a shell of radiation and radioactive particles at velocities up to 10% that of light. The blast wave spreads and decelerates as it plows into rarefied interstellar gas. But if the expanding shell encounters a planet, bits of debris can wind up on the surface. Either the material is deposited directly, or it interacts with atoms in the planet's atmosphere, causing them to fracture and emit other byproducts (in a process called spallation) that also find their way to the ground.
Theoretically, a nearby supernova should leave behind a fingerprint in the form of a fine layer of radioactive debris. The key to finding this cosmic powder burn is the fact that radioactive isotopes (radioisotopes) occur on Earth only in trace amounts. The radioisotope aluminum-26, for example, which is a predicted byproduct of a Type II supernova, has a half-life of around 720,000 years. Earth is much older than this, so very few aluminum-26 atoms from the planet's formation remain in the crust--most have long since decayed. The same is true of other radioisotopes. But if a supernova went off nearby and deposited radioisotopes on Earth in the past few million years, they will not yet have decayed completely, so some will still be detectable in geological strata.
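The arithmetic behind this argument is a one-liner: the fraction of a radioisotope surviving after a time t is 0.5 raised to the power of t divided by the half-life. A quick sketch (the half-lives and dates are those quoted above; the function itself is just an illustration, not from any cited study):

```python
def fraction_remaining(t_years, half_life_years):
    """Fraction of the original atoms left after t_years of decay."""
    return 0.5 ** (t_years / half_life_years)

# Aluminum-26 (half-life ~720,000 years) made when Earth formed 4.6 billion
# years ago: about 6,400 half-lives have passed, so essentially none survives
# (the result underflows to 0.0 in floating point).
print(fraction_remaining(4.6e9, 7.2e5))

# The same isotope deposited by a supernova 2.8 million years ago:
# roughly 4 half-lives, so several percent of it is still around.
print(fraction_remaining(2.8e6, 7.2e5))
```

That contrast, essentially zero primordial atoms versus a still-measurable recent deposit, is what makes these isotopes such clean supernova markers.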
Digging Up Evidence
The first possible evidence for such a spike was reported in 1987, when a group led by Grant Raisbeck (Center for Nuclear Spectroscopy and Mass Spectroscopy, France) announced the discovery of traces of the radioisotope beryllium-10 in ice-core samples. The deposits were laid down at two epochs, approximately 35,000 and 60,000 years ago. Other researchers have since confirmed this anomaly at various locations around the world. Raisbeck's group raised the possibility that the beryllium anomaly came from a supernova 60 to 130 light-years away.
Scientists calculate the distance by estimating the amount of radioactive debris in a supernova shell and how it is diluted as the remnant expands. But there are many uncertainties in the distance calculation: How much of a particular isotope is produced? How uniformly does it spread as the shell expands? How efficiently is this material captured by Earth's atmosphere? How uniformly is it deposited on the surface? Sixty to 130 light-years is probably not close enough for a supernova to cause significant damage, but it's certainly near enough to leave its radioactive calling card.
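Uncertainties aside, the core of the distance estimate is simple inverse-square dilution: the ejected atoms spread over an ever-larger sphere, so the surface density arriving at Earth falls off as the square of the distance. A toy version (the yield and measured deposit below are placeholders, not figures from any study, and capture efficiency is idealized at 100%):

```python
import math

LIGHT_YEAR_M = 9.46e15  # meters per light-year

def deposited_per_m2(atoms_ejected, distance_ly):
    """Atoms per square meter at Earth, assuming the ejecta spread
    uniformly over a sphere of radius = distance."""
    r = distance_ly * LIGHT_YEAR_M
    return atoms_ejected / (4.0 * math.pi * r**2)

def distance_ly_from_deposit(atoms_ejected, atoms_per_m2):
    """Invert the relation to estimate the supernova's distance."""
    r = math.sqrt(atoms_ejected / (4.0 * math.pi * atoms_per_m2))
    return r / LIGHT_YEAR_M

# Round trip with made-up numbers:
sigma = deposited_per_m2(1e50, 100.0)
print(distance_ly_from_deposit(1e50, sigma))  # recovers ~100 light-years
```

Each uncertainty listed above (yield, uniformity of spreading, atmospheric capture, deposition) enters this relation as a multiplicative correction, which is why published distance estimates span such wide ranges.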
In 1996 a group led by John Ellis of the European Organization for Nuclear Research (CERN) in Switzerland predicted several other radioisotopes that supernovae should deposit. Sure enough, in 1999 a group headed by Klaus Knie (Technical University of Munich, Germany) discovered one of those isotopes buried in deep-sea Pacific Ocean crust. Using an accelerator laboratory in Munich, the team found iron-60 (which has a half-life of about 1.5 million years) at a whopping concentration of around 100 times the background level. This result is consistent with a supernova within the last 5 million years at a distance of 100 light-years.
In 2004 Knie and his collaborators announced the discovery of yet another iron-60 concentration in Pacific crust, but in a different location, showing that the enhancement is widespread. These new data are of even higher quality, and Knie's group was able to pin down the deposit to a single layer with an age of 2.8 ± 0.4 million years.
"This spectacular new result gives hard experimental evidence supporting the idea that nearby supernovae really do occur," says astronomer Brian Fields (University of Illinois). "Not only that, but supernova debris can be recovered geologically and studied in the laboratory. The iron-60 discovery thus marks the birth of supernova archaeology."
The 2.8-million-year date corresponds to the Pliocene epoch in Earth's history. Antarctica and the North Pole were both covered in ice, and the climate was generally comparable to the modern one. The continents were about 70 kilometers (45 miles) from their present locations. Mastodons roamed North America, while early hominids--members of the biological family that includes humans and apes--thrived. Human ancestors such as Australopithecus africanus would have witnessed the supernova. With its estimated distance of between 50 and 400 light-years, it would have been a spectacular naked-eye object, easily visible during the day and casting shadows at night.
Interestingly, the iron-60 results are consistent with circumstantial astronomical evidence for a recent, nearby supernova: the Local Bubble. Astronomers have detected that the solar system and nearby stars lie within a 300-light-year-wide cavity in the interstellar medium (ISM). The density of hydrogen gas in this bubble is only one-tenth that of the surrounding ISM. Astronomers think the Local Bubble was created when one or more supernovae went off nearby and pushed aside the ISM, leaving a comparative void. The blasts must have occurred within the last few million years, or else the cavity would have closed up by now.
Astronomers have quietly discussed the possible damage caused by a nearby supernova since the late 1960s. As first pointed out by Malvin Ruderman (Columbia University), the worst damage is likely to come from the partial or substantial destruction of Earth's ozone layer. In a 1974 Science paper, Ruderman calculated the possible influences of nearby supernovae on atmospheric ozone and on our planet's biosphere. He concluded that supernovae have probably almost totally destroyed Earth's atmospheric ozone several times over the past 545 million years, with serious consequences for plant and animal life.
Ruderman may have been pessimistic in his calculations, in light of new results. But he was on the right track. Supernovae produce copious gamma rays and high-energy particles, or cosmic rays. Type Ia supernovae generate more gamma rays, whereas Type II events eject more particles. This radiation bombards Earth's atmosphere, where it shreds nitrogen and oxygen molecules to produce the oxides NO and NO₂. These compounds participate in chain reactions that consume ozone (O₃) and atomic oxygen (O).
The ozone layer offers our only protection from harmful solar ultraviolet light, radiation that attacks cells and leads to skin cancer and worse. Some plankton species are particularly at risk from increased ultraviolet exposure. At the bottom of the ocean food chain, plankton sustain the fish and other animals that feed on them, which in turn sustain still larger creatures. With the loss of plankton, the food chain collapses. Plants are at risk too, threatening the food chain on land.
In a 2003 Astrophysical Journal paper, Neil Gehrels (NASA/Goddard Space Flight Center) and his colleagues calculated the ozone depletion caused by both gamma rays and cosmic rays as a function of distance to a Type II supernova. Using the latest atmospheric chemistry codes from ozone-hole modeling, they found that such a blast within 25 light-years would destroy perhaps half of our planet's ozone layer.
But Gehrels points out that Type Ia events are potentially more dangerous. Though they are only one-fifth as common, accelerate fewer cosmic rays, and persist roughly one-sixth as long in their gamma-ray-bright phase, they are nevertheless more than 10 times as luminous in gamma rays as Type IIs. Gehrels says a Type Ia supernova at 25 light-years would destroy significantly more ozone than a Type II at the same distance. Even accounting for the lower occurrence of Type Ia events, Gehrels thinks there is a slightly higher chance of ozone destruction from Type Ia supernovae. "But there are large uncertainties in the numbers," he admits.
Another team, headed by Brian Thomas (Washburn University, Kansas), has also analyzed the potential damage from a Type II blast, but this time situated at 100 light-years and only including the effects of gamma rays. The study shows that even at this larger distance, ozone depletion will be as high as 15% in some places and 7% averaged globally, resulting in a 7% increase in ultraviolet radiation for a period of several months at equatorial latitudes. While the authors do not consider such increases to be significant on a global scale, they note that the resulting ultraviolet boost would be lethal to some marine microorganisms. Moreover, "these results are a lower limit," says Thomas, "since we do not include the effects of cosmic rays."
With these ozone-depletion calculations in mind, if the iron-60 layer was indeed deposited by a supernova roughly 2.8 million years ago at a distance of 50 to 400 light-years, the damage to Earth was probably minimal unless the supernova was at the closer end of that range or was a Type Ia event. Perhaps there were localized extinctions and genetic mutations, but no mass extinction appears in the fossil record at that time. There is, however, evidence for a cooling trend about 3 million years ago, which may have been triggered by the supernova's cosmic rays. This cooling led to a more arid climate in Africa, which probably influenced hominid evolution.
Fortunately, the chances of a destructive supernova happening soon are vanishingly small. Supernovae are rare in our galaxy. The current rate is estimated to be about 1.5 per century (S&T: May 2006, page 16), but none has been witnessed with certainty in the Milky Way since 1604. (It's possible that English astronomer John Flamsteed saw the Cassiopeia A supernova in 1680, noting a "new star" near the blast's estimated position.) This adds up to something like 70 million stellar explosions in our galaxy over our planet's lifetime. But given the vast size of the Milky Way--100,000 light-years across--very few events occur near the Sun.
The estimates vary, but according to Gehrels, the number could be as few as 1.5 supernovae per billion years within a radius of 25 light-years, the distance he calculated for 50% ozone depletion. Statistically, we might expect around five supernovae within that range since the origin of life on Earth some 3.8 billion years ago. On the other hand, the number expected within a larger circle--the 100-light-year radius considered by Thomas's team, for example--rises to around 70 or 80 since life took root.
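The arithmetic behind these estimates is simply rate times elapsed time. A back-of-the-envelope sketch using the figures quoted above (the variable names are ours):

```python
# Gehrels's estimate: ~1.5 supernovae per billion years within 25 light-years.
rate_per_year_within_25_ly = 1.5e-9
years_since_life_began = 3.8e9  # origin of life on Earth

expected_within_25_ly = rate_per_year_within_25_ly * years_since_life_began
print(f"{expected_within_25_ly:.1f} supernovae expected within 25 ly")  # ~5.7
```

The quoted count of 70 or 80 events within 100 light-years over the same span corresponds, by the same arithmetic, to a rate roughly 13 times higher inside the larger radius.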
Astronomers know of no potential supernova close enough to damage our biosphere severely in the near future. The nearest Type II progenitor is Betelgeuse--but it's safely tucked away at about 400 to 450 light-years. Of greater concern is the Type Ia progenitor system HR 8210, or IK Pegasi. The white dwarf in this binary is close to the Chandrasekhar limit and is separated from an elderly companion by only about 40% of the Earth-Sun distance. Making matters worse, the companion is about to expand into a red giant and dump some of its atmosphere onto the white dwarf, pushing it over 1.4 solar masses and igniting a Type Ia supernova. IK Peg is only 150 light-years away. Given that the gamma rays from Type Ia events are more harmful to our ozone than those from Type II supernovae, that is not a very comfortable distance. Both Betelgeuse and IK Peg could go off anytime soon. But "soon" in astronomy can mean today or in a million years. We can probably rest easy--for now.
Former astronomer MARK A. GARLICK is a writer and illustrator specializing in astronomy. His latest book, The Illustrated Atlas of the Universe, was published in 2006. To explore more of his artwork, visit www.markgarlick.com and www.space-art.co.uk.
The Creative Side of Supernovae
The term "supernova" conjures up images of devastated planets and civilizations. But we owe our very existence and technological prowess to these stellar cataclysms. For example, supernovae synthesized and dispersed many of the heavy elements (such as oxygen, calcium, phosphorus, and silicon) that built the planets and made life possible on Earth. Supernovae also created the elements heavier than iron (such as nickel, lead, and uranium) that human society exploits for machines and energy.
But perhaps most important for us, a supernova shock wave may have triggered the collapse of the gas cloud that gave rise to our solar system. The evidence for this controversial scenario comes from an excess of several isotopes in meteorites, rocks that date back 4.6 billion years to the birth of the Sun and planets. These isotopes include magnesium-26, which comes from the radioactive decay of aluminum-26, and nickel-60, a decay product of iron-60. Given that most stars form in clusters, and that many clusters originally contain massive stars destined to go supernova, it's no surprise that our solar system's formation, or at least its early evolution, may have been profoundly influenced by one or more nearby supernovae (S&T: October 2004, page 24).
Senior editor ROBERT NAEYE often wonders how many civilizations in our Milky Way Galaxy have been wiped out by supernovae.
Garlick, Mark A. "The supernova menace: a nearby supernova can put on a spectacular light show, but it can also devastate Earth's atmosphere. How close is too close?" Sky & Telescope Mar. 2007: 26+. Academic OneFile. Web. 7 Nov. 2009.