The first photograph of a black hole, seen around the world this April, was not a photograph in any conventional sense. Computer algorithms stitched together data from seven radio telescope observatories as far-flung as Chile, Hawaii, Arizona, Mexico, and Spain and translated those data into shades of colored pixels to bring the dark center and hazy photon ring into view. No light can escape a black hole; it is, by definition, unseeable. What the world saw, then, was a color-enhanced computer representation (1).

The first image of a black hole, released in April, shows the supermassive black hole at the center of the massive galaxy Messier 87, located 55 million light-years from Earth. Computer algorithms stitched the image together from data gathered by seven telescope observatories, and producing it required extensive deliberation among the researchers involved. Image credit: Wikimedia Commons/Event Horizon Telescope.

“It’s hard for people to understand that what we’re seeing is not the thing—it’s a representation of the data,” says Felice Frankel, a research scientist in the chemical engineering department at the Massachusetts Institute of Technology in Cambridge and an award-winning science photographer.

But if the image is a representation, then how truthful is that picture to reality, and how much artistic license do the creators take? If a person somehow floated at the maw of a black hole, would she see the same fuzzy, orange donut scientists published this spring? Whenever the unseeable is seen, whether a black hole, a nebula in the reaches of deep space, or even the contents of our own cells, we’re left to wonder how closely reality matches the image.

To be clear, the researchers who generate these images use real and accurate data based on real phenomena. The final products are rooted in reality, not the imaginations of their authors. But the researchers who create these images do make interpretive decisions, often in the interest of clearly communicating the data to the public and to other researchers. They aim to stay true to the object even if sometimes unfaithful to the human eye.

Take, for instance, the fiery orange photon ring around the black hole. “As to why it’s orange instead of purple or red, this is in many ways an arbitrary choice,” says Michael Johnson, one of the project’s imaging coordinators. “There is no sense in which the black hole’s emission is orange.” Johnson, an astrophysicist at the Harvard–Smithsonian Center for Astrophysics, also in Cambridge, acknowledges that the color is not what the eye would perceive. “But we felt this particular color map did a good job of highlighting the different features in the image without masking or losing anything.”
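Johnson’s point, that the orange is a chosen mapping from brightness data rather than an observed color, can be sketched in a few lines of code. The ramp below is a toy black-to-orange-to-white colormap invented for illustration; it is not the Event Horizon Telescope’s actual color map:

```python
def apply_colormap(intensity, lo=0.0, hi=1.0):
    """Map a dimensionless radio-brightness value to an RGB triple.

    A toy black -> orange -> white ramp, loosely in the spirit of the
    "hot"-style maps often used for intensity data. The specific color
    stops are illustrative assumptions, not the EHT's actual colormap.
    """
    # Normalize the measured intensity into [0, 1].
    t = max(0.0, min(1.0, (intensity - lo) / (hi - lo)))
    if t < 0.5:
        # Lower half of the range: fade from black up to orange.
        s = t / 0.5
        return (s * 1.0, s * 0.55, 0.0)
    # Upper half: fade from orange up to white.
    s = (t - 0.5) / 0.5
    return (1.0, 0.55 + s * 0.45, s * 1.0)
```

Swapping this ramp for a purple or red one would change every pixel’s hue without altering the underlying measurements, which is exactly why Johnson calls the choice arbitrary.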

The black hole image isn’t the only one that pairs real data with a dash of interpretation. The Hubble Space Telescope’s dramatic photographs of nebulae are also colored, to distinguish glowing gases that are too dim to see through a backyard telescope or with the naked eye. And technologies used to visualize microscopic viruses, proteins, and even individual atoms interpret and extrapolate data. Although it’s all done with an eye toward scientific truth, such images sometimes require a touch of artistry.

First Sight

“Visualization is not the only way to establish the classification of objects, but it is one of our great tools,” declares Peter Galison, a historian of physics and physicist at Harvard University and a member of the Event Horizon Telescope Collaboration that was responsible for the recent image.

Looking to history, Galison begins his story of visualization in 18th-century Europe with Johann Wolfgang von Goethe, a behemoth of the German intellectual tradition, who classified objects such as plants by downplaying their individual variation to illustrate their essential, ideal forms. This kind of archetypal thinking survived into the 19th century, Galison explains, even as mechanical objectivity, the use of instruments to find a common view of nature while minimizing the human fingerprint on observation, took hold in science (2). An 18th-century illustrator might visualize a skeleton by drawing it ideally, Galison says, without cracked ribs or other flaws. That changed in the 19th century, as researchers increasingly relied on tracings, rubbings, and eventually photographs to visualize nature objectively rather than ideally. By the 20th century, the tide had shifted once again, as researchers realized that complex machines sometimes require trained, expert interpreters of their images, such as the radiologists who read ultrasound images, X-rays, and computed tomography scans.

The last three centuries each left their mark on the recent black hole image, according to Galison. Creating it required mechanical objectivity, expert judgment, and, ultimately, a distillation of several image-making strategies into a final product that captured elements common to all the images.
“In a sense,” Galison says, “it was a culmination of the whole history of scientific visualization.”

To make the recent image, radio telescopes around the world swung in unison toward a point in the night sky suspected to hold a supermassive black hole some 55 million light-years away, in the galaxy Messier 87 (3). The telescopes recorded radio waves for multiple days, detecting them like faint ripples at the edge of a pond caused by some distant splashing object, Johnson says. The telescopes amassed 3.5 petabytes of data on these ripples, filling racks of hard drives too massive to transfer online; the drives had to be shipped to two central locations, where supercomputers streamlined the data. A separate team then led a second stage of data processing. Only then could researchers sketch the source of the radio waves: the black hole.

To complete the image without bias, the imaging team of roughly 40 people, including Galison and Johnson, split into four groups, each working independently and secretively. Each group applied its collective expert judgment, using different computer algorithms to make its own image of the black hole. In July 2018, the groups compared their images, which were reassuringly similar: rings surrounding a central dark patch. Finally, the teams distilled their images into one picture communicating the essence of the black hole to the public and scientific community, the cosmic donut seen around the world. What the public saw was a snapshot from a single day, an average of three images, each created using different software and algorithms, “emphasizing the things that are common to them all,” Galison says.

This first-ever picture of a black hole is just a start. It’s blurry, meaning it doesn’t tell the whole story of what’s out there, Johnson says, noting that sharper images could reveal filaments of hot, glowing material dragged around and into the hole or magnetic fields twisted into the jet of Messier 87.
“We wanted the best of what our algorithms could do, but that didn’t overreach,” he explains. It may not be what an astronaut would see, but this picture still captures the scientific reality of a black hole to the best of modern astronomers’ abilities. So even as more research builds on this first picture, Johnson is confident of the image’s rigor. “I don’t think we’ll ever take another image that disproves an element of this one,” he says.
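The step of averaging independent reconstructions so that only shared features survive can be sketched very simply. This is a toy pixel-wise mean, not the collaboration’s actual pipeline, which is far more sophisticated:

```python
def average_images(images):
    """Pixel-wise mean of equally sized grayscale images.

    Each image is a list of rows of brightness values. Averaging
    independent reconstructions emphasizes features common to all of
    them, while features unique to one reconstruction are diluted.
    A toy illustration only; the real EHT pipeline is far more involved.
    """
    if not images:
        raise ValueError("need at least one image")
    rows, cols = len(images[0]), len(images[0][0])
    n = len(images)
    return [
        [sum(img[r][c] for img in images) / n for c in range(cols)]
        for r in range(rows)
    ]
```

A bright ring present in all three reconstructions stays bright in the mean, while an artifact appearing in only one is reduced to a third of its strength, which is the sense in which the published image emphasizes “the things that are common to them all.”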

Artistic License

In another famous image from the reaches of space, towering columns of brownish-red gas and dust are framed in a turquoise halo. This image, known as the Eagle Nebula’s “Pillars of Creation,” is the Hubble Space Telescope’s most famous photograph, adorning many laboratory walls. The stunning colors and composition have undeniable artistry, inspiring viewers to ask whether space is actually that vibrant.

Ray Villard hates that question. He’s heard it many times during his more than 30 years as Hubble’s news chief, based at the Space Telescope Science Institute in Baltimore, MD. “What we strive to do, very seriously, is to capture the essence of an object,” Villard says. Nebulae glow in a variety of discrete colors based on their gases, and Hubble’s photographs match these wavelengths. “We’re not just making up colors,” Villard says sternly.

To capture the Eagle Nebula with its turquoises and reds, Hubble’s cameras focused their lenses and clicked like a consumer camera. But unlike a consumer camera, which has red, green, and blue color filters on its internal sensor, Hubble’s sensor operates with one color filter per picture. Because the light is not subdivided into discrete red, green, and blue pixels, this approach allows for higher resolution than a conventional camera achieves, explains the telescope’s recently retired senior science visuals developer, Zoltan Levay. Hubble’s cameras have about two dozen filters, each restricting photographs to wavelengths of light in the visible, near-infrared, or ultraviolet spectra. To record oxygen gas in the Eagle Nebula, for example, light captured by the camera was limited to the blue wavelength of glowing oxygen atoms.

A multicolored photo such as “Pillars of Creation” is a layer cake of many individual photographs, Levay explains. The glowing gases in that photograph (blue oxygen, green nitrogen, and red sulfur) offer some opportunity for artistic license, first and foremost to help viewers understand the image (Fig. 1).
Out in space, hydrogen and sulfur gases glow in similar shades of red. To tell them apart in “Pillars of Creation,” Levay rendered sulfur normally but hydrogen in green, although Hubble originally photographed it through a red-wavelength filter. The gases do look a bit more pink in color-film photographs taken by conventional cameras mounted on backyard telescopes, but the nebula wouldn’t necessarily look redder up close in space. In fact, it would be hard to see any colors at all, Villard explains, because the glowing gas would be dispersed across the sky.

Fig. 1. Hubble captured each of these black and white “Pillars of Creation” images using a different wavelength filter, corresponding to blue oxygen (Lower Left), red sulfur (Upper Left), and red hydrogen and nitrogen light (Upper Right). Computers later added the appropriate color to each image. The final color image (Lower Right) is composed of several layers of these monochrome images, with hydrogen and nitrogen rendered in green. Image credit: NASA, ESA, and STScI.

Hubble’s visuals team can also adjust image contrast, Villard says, to showcase the wide span of brightness levels the telescope detects, a dynamic range broader than the eye’s. Bright brights and dark darks add a sense of drama to scenes of deep space, says Levay, adding that he and others on the project were inspired by nature photographers such as Ansel Adams, whose landscape portraits play with crisp whites, deep blacks, and middle grays to illuminate the awe-inspiring faces of towering mountains. “The universe is an extension of nature,” Levay says, “so why not use the same techniques as people like Ansel Adams and others, to produce the most visually appealing images we can from these dramatic scenes?”
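The layer-cake approach Levay describes, with each monochrome filter exposure assigned a display color and then stacked, can be sketched as follows. The channel assignments are illustrative assumptions in the spirit of the article’s description, not Hubble’s actual processing pipeline:

```python
def composite(oxygen, sulfur, hydrogen):
    """Combine three monochrome filter exposures into one RGB image.

    Each argument is a list of rows of brightness values in [0, 1],
    taken through a different narrow-band filter. Following the scheme
    described above, sulfur is shown in red, oxygen in blue, and
    hydrogen (photographed through a red-wavelength filter) is remapped
    to green so it can be told apart from sulfur. Illustrative only.
    """
    rows, cols = len(oxygen), len(oxygen[0])
    rgb = []
    for r in range(rows):
        row = []
        for c in range(cols):
            red = min(1.0, sulfur[r][c])      # sulfur filter -> red channel
            green = min(1.0, hydrogen[r][c])  # hydrogen filter -> green (remapped)
            blue = min(1.0, oxygen[r][c])     # oxygen filter -> blue channel
            row.append((red, green, blue))
        rgb.append(row)
    return rgb
```

The key point is that each output hue traces back to a real measurement through a specific filter; the artistic license lies only in which display color each filter is assigned.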

Zooming In

Capturing images of the very small entails some of the same challenges and choices as capturing the incredibly large. Not unlike imaging the black hole and “Pillars of Creation,” a technique called atomic force microscopy is among several that demand a touch of artistry to colorize images of molecules (Fig. 2).

Fig. 2. The golden colors of these atomic force microscopy images highlight the peaks and valleys of giant Mimivirus particles. Low magnification (A) shows a defibered Mimivirus, revealing starfish-shaped features. Median magnification (B) shows partially digested Mimivirus, with a particle in the upper left fully digested. High magnification (C) clearly reveals a starfish-shaped feature. A defibered particle treated with proteinase K (D) shows the arm of a starfish-shaped feature. Reprinted from ref. 4, which is licensed under CC BY 4.0.

The technique uses a laser directed at a mirror on the tip of a cantilever that glides over the contours of a material, creating a topographic map of its surface, “like you’d use your hands to read a keyboard,” explains Frank Kusiak, former West Coast regional leader of the Nanoscale Informal Science Education Network, a 12-year National Science Foundation-funded project that aimed to increase public awareness of nanoscale science. As the cantilever tip rises and falls over the tiny ridges and valleys of the surface, the laser’s reflection off the mirror moves, and photodetectors track its position. The technique can resolve height differences down to 1 angstrom, the scale of individual atoms.

Atomic force microscopy has a variety of applications that require visualization at the molecular level, notes Amrita Banerjee, a research associate at the Cornell NanoScale Facility in Ithaca, NY.
They include tracking the behaviors of proteins and drugs and confirming the size and structure of nanoscale products after fabrication.

The images that atomic force microscopy yields are typically shades of yellow, although they can be any hue. The color is “strictly artificial,” says molecular biologist and biochemist Alexander McPherson, now emeritus at the University of California, Irvine. That’s because the technique measures distances, not visible light, he says. The coloration is meant to help the viewer see the contrast between light-shaded high peaks and dark-shaded deep valleys. Hence, the technique doesn’t produce quite what we would actually see: the colors don’t reflect the different biological properties of the viruses and cells or their ability to absorb and reflect visible light. But just as in the black hole image and “Pillars of Creation,” atomic force microscopy uses color to capture the essence of the information, which in this case is height data, McPherson says.

Atomic force microscopy is just one of many techniques that capture the very small with a dose of interpretation. X-ray crystallography records the pattern of radiation diffracting off crystallized proteins, nucleic acids, and other molecules. Crystallography images have no color because X-rays are beyond the visible spectrum. McPherson and other researchers typically color the atoms in each image according to their charge, he says, with red for negative, blue for positive, and white for neutral.

Color is not inherent to any of these techniques, but its addition makes images more informative and appealing. Even so, Frankel thinks that researchers have a responsibility to make their tinkering clear. “I forever have been talking about the issue of enhancing images, and how far can we go,” she says, alluding to a theme of her 2018 book, Picturing Science and Engineering.
She thinks viewers should know about changes to an image because color enhancements made for the purposes of communicating, for example, carry information about the choices behind them. “For me it’s all about informing the reader to what has been done to an image,” Frankel explains, emphasizing the importance of transparency.

Although awe-inspired viewers might eagerly wonder what a given phenomenon “really” looks like, some make the case that this is the wrong question entirely. Whether tiny molecules or massive celestial bodies, many parts of the universe exist on scales that are fundamentally foreign to the human experience. “There are many places in principle we couldn’t go, like the interior of the sun or a black hole, or to visualize particle physics at a scale below that of the nucleus,” Galison says. “I think restricting ourselves to trying to imagine what it would look like to us becomes increasingly artificial.”