While on a spacewalk, if you plucked a guitar's strings, you would, of course, hear nothing—sound waves cannot travel through space's near-vacuum. Although soundless to our ears, however, the cosmic "void" is actually a cacophony of zipping particles and light waves. When atomic nuclei and radiation strike scientific instruments, those impacts can be construed, in effect, as the "sounds" of space. We can listen in on the universe's hidden melodies.



That is because the sequence or intensity of detected particles can be converted into vibrations that fall within our human hearing range. Scientists have long transformed inaudible data into audible information in this manner—take, for instance, the "beep...beep...beep" of a heart-rate monitor.
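This kind of conversion can be sketched in a few lines of code. The snippet below is a minimal parameter-mapping sonification using only Python's standard library: each data value becomes a short sine tone whose pitch scales with the value. The frequency range, tone length and file name are arbitrary illustrative choices, not any mission's actual scheme.

```python
import math
import struct
import wave

SAMPLE_RATE = 44100  # audio samples per second

def sonify(values, duration=0.2, f_lo=220.0, f_hi=880.0):
    """Map each data value to a short sine tone whose pitch rises
    with the value (a simple parameter-mapping sonification)."""
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0          # avoid dividing by zero on flat data
    samples = []
    for v in values:
        freq = f_lo + (v - lo) / span * (f_hi - f_lo)
        for i in range(int(SAMPLE_RATE * duration)):
            samples.append(math.sin(2 * math.pi * freq * i / SAMPLE_RATE))
    return samples

def write_wav(path, samples):
    """Write the samples as a mono 16-bit WAV file."""
    with wave.open(path, "wb") as w:
        w.setnchannels(1)
        w.setsampwidth(2)            # 16-bit samples
        w.setframerate(SAMPLE_RATE)
        w.writeframes(b"".join(
            struct.pack("<h", int(s * 32767)) for s in samples))

# A rising-and-falling data series becomes a rising-and-falling melody.
write_wav("sonified.wav", sonify([1, 3, 2, 5, 8]))
```

Real sonification pipelines refine this basic mapping—smoothing transitions, choosing musical scales, assigning instruments—but the core idea is the same: data in, audio out.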



DEEP-SPACE SONATA: A conversion to audible sound of one of the most powerful explosions on record in the universe, the gamma-ray burst GRB 080916C. The number of notes played represents the gamma rays received by the Fermi Gamma-ray Space Telescope. The accompanying sounds correspond to the probability of the rays emanating from the burst itself, with the lowest-likelihood rays played by a harp, medium-likelihood by a cello and the highest-probability by a piano.

Many kinds of astronomical data have received this "sonification" treatment, ranging from solar wind particles streaming off our sun to gamma rays blasting across billions of light-years. These audible conversions have aided public outreach and allowed the vision-impaired to experience the universe's proceedings.

But hearing data, it turns out, also can open new scientific frontiers. That's thanks to the remarkable human ability to parse sound for patterns and meaning. "The auditory system is the best pattern-recognition device that we know of," says Bruce Walker, a professor of psychology and director of the Georgia Institute of Technology’s Sonification Lab. "If you're looking through a data set and trying to understand what's going on, it's often easier and more efficient to listen to the sound of it rather than looking at a screen or a printed version."



A paper published in The Astrophysical Journal in 2012 relied on such an approach. The finding—that different charge states of carbon ions spewed by the sun can reveal differences in solar atmospheric temperatures—sprang from tuning in to audibilized data. "I was listening to some raw solar data and I heard this underlying hum," says paper co-author Robert Alexander, a sonification specialist with the Solar and Heliospheric Research Group at the University of Michigan.

SUNNY ANTHEM: The audio that Alexander was listening to when he noticed an underlying "hum." The sound file corresponds to audibilized data of charged atoms, including carbon, within the solar wind from 1998 to 2010 recorded by a spectrometer onboard NASA's Advanced Composition Explorer spacecraft. The images of the rotating sun come from NASA's Solar Dynamics Observatory spacecraft. Credit: Robert Alexander/NASA/University of Michigan Solar and Heliospheric Research Group

The frequency of the hum, 137.5 hertz, corresponded to a period of around 27 days in the compressed data set. That's how long it takes the sun to complete a rotation. The duration implied a link to a surface or atmospheric feature, such as a region belching out a particular kind of solar wind, brought back around to face Earth periodically. "It gave me an inkling that this data could potentially be important," Alexander says. The ratios of carbon ions proved a reliable indicator of which of two types, "fast" or "slow," the solar wind belongs to. That in turn speaks to the temperatures of the wind's source regions in the solar atmosphere.
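The link between the hum's pitch and the sun's rotation is a matter of time compression. A 137.5-hertz tone repeats every 1/137.5 of a second; if each repeat corresponds to one 27-day rotation in the original data, the playback must be sped up by a factor of a few hundred million. This quick calculation assumes a round 27.0-day rotation period, so the result is approximate:

```python
# If the 137.5 Hz hum in the audio corresponds to one ~27-day solar
# rotation per cycle, the implied time-compression factor is the
# rotation period (in seconds) times the playback frequency.

hum_hz = 137.5                  # pitch Alexander heard in the audio
rotation_s = 27 * 86400         # ~27-day solar rotation, in seconds
compression = rotation_s * hum_hz
print(f"compression factor ≈ {compression:.3g}")  # compression factor ≈ 3.21e+08
```

In other words, roughly three hundred million seconds of real solar wind pass for every second of audio—which is what makes a 27-day periodicity audible as a hum in the first place.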

NASA, noting the innovative results, awarded Alexander a fellowship to further explore sonification applications, and new papers are pending. "This technique can be applied to an extremely wide array of data," Alexander says. "If you think about music as the 'universal language,' then you can think of audio as a universal platform for scientific inquiry. You can take data from any number of different sources, convert them into an audio file and then push 'play.'"

Over its history, space sonification has yielded a number of other novel insights. Back in World War I, primitive radio equipment picked up the phenomenon known as "whistlers." Decades later these spooky signals were tied to lightning strikes. Lightning sets off an electromagnetic wave in the soup of ionized gas, or plasma, surrounding Earth. The higher frequencies generated during a strike travel faster in this milieu, arriving at a receiver before the lower-sounding frequencies and producing a descending, whistlelike tone. Explaining how whistlers worked helped advance the understanding of plasma physics in Earth's radiation belts.
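The descending tone can be modeled with the classic Eckersley approximation, in which a whistler component's travel time falls off as one over the square root of its frequency. The sketch below uses an illustrative dispersion constant; real values depend on the plasma along the magnetic field line the wave follows.

```python
import math

def arrival_time(f_hz, dispersion=50.0):
    """Eckersley's law for whistlers: travel time t = D / sqrt(f),
    where D (in s*Hz**0.5) encodes the plasma along the path.
    D = 50 here is an illustrative value, not a measurement."""
    return dispersion / math.sqrt(f_hz)

for f in (8000, 4000, 2000, 1000):
    print(f"{f:>5} Hz arrives after {arrival_time(f):.2f} s")
# Higher frequencies arrive sooner, so a receiver hears the tone
# sweep downward: the "whistler."
```

The same square-root spreading is what lets researchers work backward from a recorded whistler's shape to the density of the plasma it traversed.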

SOUND FROM BEYOND: A "whistler," a radio emission from lightning that has traveled along a magnetic field line in space and reflected back from Earth's opposite hemisphere to a receiver on the ground. Credit: NASA/Donald Gurnett/The University of Iowa

As satellites began to ply the heavens, their data produced other enigmatic, yet ultimately revealing noises. Instruments on the Voyager 1 spacecraft, for instance, detected whistlers out at Jupiter, providing the first indirect evidence of Jovian lightning. Voyager 1 also captured the "sound," akin to a sonic boom, when it encountered a so-called bow shock in front of Jupiter. A bow shock is the shock wave formed where the giant planet's magnetic field hinders the supersonic solar wind, similar to the curved flow of water in front of a moving ship.

JOVIAN NOTES: A sonification captured by Voyager 1's plasma wave instrument as the spacecraft crosses the bow shock at the edge of Jupiter's magnetosphere. Credit: NASA/Donald Gurnett/The University of Iowa

Don Gurnett, a physicist at The University of Iowa and principal investigator for the Voyager 1 plasma wave instrument, has hung onto those recordings he found intriguing. "I've been collecting these things almost since the dawn of the space age, since 1962," Gurnett says. "I've got a cardboard box of cassette tapes here on the shelf in my office." (Most of the examples have been posted online at http://www-pw.physics.uiowa.edu/space-audio/sounds/.)

Other researchers have taken similar data and given it a digital makeover, translating it into sounds played by virtual musical instruments. These compositions can engage the public in a way that goes beyond words and images alone, according to Marty Quinn, a sonification researcher at the University of New Hampshire.

Quinn has composed sonifications since the early 1990s, with projects ranging from Martian polar ice caps to the ribbon of energy at the solar system's edge. His latest effort: CRaTER Live, a streaming Internet radio station, based on particles impacting the Lunar Reconnaissance Orbiter's Cosmic Ray Telescope for the Effects of Radiation (CRaTER) instrument. The key, pitch and selection of certain musical instruments (for example, piano, steel drum and guitar) in the spontaneous composition reflect the intensity of radiation bombarding the space probe.
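A mapping of that general shape can be sketched in a few lines. The bin edges, instrument names and note choices below are hypothetical stand-ins to show the idea of intensity-driven instrumentation, not Quinn's actual CRaTER Live scheme:

```python
# Hypothetical CRaTER-style mapping: radiation intensity picks an
# instrument, and the value selects a note from a pentatonic scale
# (so any combination of notes sounds consonant). All thresholds and
# instrument assignments here are illustrative.

PENTATONIC = [0, 2, 4, 7, 9]   # semitone offsets of a major pentatonic scale

def map_count(counts_per_s):
    if counts_per_s < 10:
        instrument = "piano"
    elif counts_per_s < 100:
        instrument = "steel drum"
    else:
        instrument = "guitar"
    degree = PENTATONIC[counts_per_s % len(PENTATONIC)]
    midi_note = 60 + degree     # offsets from middle C (MIDI note 60)
    return instrument, midi_note

print(map_count(7))    # → ('piano', 64)
print(map_count(250))  # → ('guitar', 60)
```

Restricting notes to a musical scale is a common sonification design choice: it keeps a continuous stream of data listenable for hours, which matters for a live radio station.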

LIVE LUNAR “MUSIC”: A scientific device onboard the Lunar Reconnaissance Orbiter records radiation spewed by the sun as it floods the vicinity of the moon. The radiation intensity is converted into musical instrument sounds. Credit: UNH CRaTER Space Science Team.

Schoolteachers have told Quinn how much students have enjoyed hearing his work. Cognition is augmented, he says, when we add auditory elements. "I feel like I've enhanced my perception," Quinn says. "Instead of just seeing graphs or visualizations, I can learn more by hearing the data as music...and listening to music is something we love to do anyway."

