An ordinary camera and an extraordinary technology create billion-pixel images that allow viewers to virtually fly deep into a landscape and explore nature in stunning detail.

When NASA’s twin Mars rovers began sending detailed pictures to Earth in January 2004, Randy Sargent, a computer scientist working on visualizations of those images, was enthralled by the sense of actually exploring Martian terrain. Onboard each rover, a camera known as the Pancam swiveled and tilted on command from NASA scientists. Sargent and his colleagues stitched the individual exposures into stunning digital panoramas of the Red Planet’s landscape. Scientists at the Jet Propulsion Laboratory in Pasadena, California, could interact with the images on their computer screens, zoom in on fine details, hypothesize about what they were seeing, and pick the rovers’ next destinations. “The pan had so much resolution, it felt like peering through a little hole in the wall into another world,” recalls Sargent’s manager, robotics group leader Illah Nourbakhsh, who was then at NASA Ames Research Center at Moffett Field, California, on sabbatical from Carnegie Mellon University in Pittsburgh, Pennsylvania. “What stunned us was this feeling of presence, which a simple picture that is not interactive doesn’t give you.”

That experience led directly to a technology that has become a powerful tool for teaching and public engagement with science and the natural world. Scientists are also using it for projects as diverse as analyzing Middle Eastern petroglyphs, monitoring an urban forest, archiving a museum insect collection, studying a collapsed honeybee colony, keeping tabs on glaciers, examining erosion in a jaguar reserve, and viewing Galápagos fish clustered into a bait ball.

Soon after the Martian panorama renderings, Nourbakhsh challenged his team to think creatively about “blue sky” projects they could tackle. Aware of the intense reverence astronauts felt as they gazed at Earth from space, Sargent proposed bringing that kind of experience down to Earth by building affordable equipment anybody could use to create explorable images. Nourbakhsh immediately recognized the idea’s potential for changing the relationship between viewer and image. “An explorable image is a disruptive shift away from the static image you just glance at, because now you have the power of exploration,” he says. “That sets people up with a different mindset because they decide where to zoom, where to go, what structures and details to see. And it’s not virtual, it’s not a video game. It’s real.”

Sargent developed a prototype for what is now the GigaPan system. Users punch numbers into a keypad on a robotic mount for a digital camera, specifying how expansive they want their panorama to be. A microprocessor calculates the size and number of exposures needed for the pan and moves the camera accordingly, while a small robotic finger pushes the shutter button for each exposure. Software then stitches the exposures together into a panorama with a resolution 1,000 times that of HDTV. The largest GigaPan has 100 gigapixels.
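The planning step the mount’s microprocessor performs amounts to simple angular arithmetic: divide the requested panorama by the lens’s field of view, keeping some overlap between neighboring frames so the stitching software can align them. The sketch below is a minimal illustration of that idea, not GigaPan’s actual firmware; the function name, the 30% default overlap, and the example lens angles are assumptions.

```python
import math

def plan_gigapan(pan_deg, tilt_deg, hfov_deg, vfov_deg, overlap=0.3):
    """Estimate the grid of exposures needed to cover a panorama spanning
    pan_deg (horizontal) by tilt_deg (vertical), with a lens whose field
    of view is hfov_deg x vfov_deg, keeping `overlap` fractional overlap
    between neighboring frames for stitching."""
    step_h = hfov_deg * (1 - overlap)  # horizontal angle advanced per column
    step_v = vfov_deg * (1 - overlap)  # vertical angle advanced per row
    # One frame covers the field of view; each extra row/column adds one step.
    cols = max(1, math.ceil((pan_deg - hfov_deg) / step_h) + 1)
    rows = max(1, math.ceil((tilt_deg - vfov_deg) / step_v) + 1)
    return rows, cols, rows * cols

# Example: a 180 x 60 degree vista shot through a telephoto lens with a
# 6.5 x 4.9 degree view (hypothetical numbers) needs hundreds of frames.
rows, cols, total = plan_gigapan(180, 60, 6.5, 4.9)
```

The frame count grows quickly as the lens narrows, which is why a single GigaPan capture can take more than an hour, as in the 1,104-photo Costa Rica panorama below.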

The final image contains more data than most personal computers can handle, so Nourbakhsh and his team developed a massive server system and website, www.gigapan.org, for storing and accessing GigaPans. When viewers zoom in on an area of an image, they seem to fly into the image itself. The result is an immersive, interactive experience that can reveal surprising details—an ant on a leaf in a forest, or a hummingbird sipping nectar from a flower in a backyard. It’s like viewing nature through a huge magnifying glass.
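The article does not describe gigapan.org’s internals, but a standard way to let a browser zoom smoothly into an image too large for a personal computer is a power-of-two tile pyramid: the image is repeatedly halved and each level is cut into small fixed-size tiles, so the viewer fetches only the tiles currently on screen at the current zoom. The sketch below, assuming 256-pixel tiles, illustrates that general scheme; it is not the site’s actual code.

```python
import math

def pyramid_levels(width_px, height_px, tile=256):
    """Number of zoom levels in a power-of-two tile pyramid: the image
    is halved until the whole panorama fits in a single tile."""
    return max(1, math.ceil(math.log2(max(width_px, height_px) / tile)) + 1)

def total_tiles(width_px, height_px, tile=256):
    """Count the tiles stored across all levels of the pyramid."""
    tiles = 0
    w, h = width_px, height_px
    while True:
        tiles += math.ceil(w / tile) * math.ceil(h / tile)
        if w <= tile and h <= tile:
            break  # coarsest level reached: one tile shows the full image
        w, h = max(1, w // 2), max(1, h // 2)
    return tiles

# A hypothetical 4-gigapixel panorama (80,000 x 50,000 pixels) needs on the
# order of a hundred thousand small tiles, but any one screenful touches
# only a handful of them.
n_levels = pyramid_levels(80000, 50000)
n_tiles = total_tiles(80000, 50000)
```

Because zooming in simply swaps in tiles from a finer level, the viewer delivers the “flying into the image” effect without ever loading the full gigapixel file.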

—Karen A. Frenkel

From: “Panning for Science” by Karen A. Frenkel. Science 330:748.

Reprinted with permission from AAAS.

GigaPan in Conservation Science

Biologist Alex Smith of the University of Guelph in Ontario uses GigaPan images as digital field notes to record habitat details as he studies insects on the slopes of Costa Rican volcanoes. Explore this image online at http://www.gigapan.org/gigapans/72097/

Image size: 4.556 gigapixels

Capture time: 1 hour, 26 minutes

Number of photos stitched: 1,104

Photo by M. Alex Smith

COLONY COLLAPSE

Entomologist Dennis vanEngelsdorp studies colony-collapse disorder in honeybee populations. This diseased frame, photographed at a quarantined apiary in Pennsylvania, serves as an interactive tool for teaching bee biology and disease identification. Explore this image online at http://www.gigapan.org/gigapans/27538/

Photo by Michael Andree and Dennis vanEngelsdorp

PANNING FOR FISH

Jason Buchheim, a marine biologist and inventor, has taken the GigaPan concept underwater. He has invented an iPhone application to precisely track his camera’s position as he snaps multiple frames that he can later assemble into wraparound panoramas, such as this one of salema fish in the Galápagos. He’s also developed a 3-D stereo viewer for GigaPans, www.3d-360.com. Explore this image at http://www.gigapan.org/gigapans/34815/

Photo by Jason Buchheim

JAGUAR CONSERVATION

Craig Miller at Defenders of Wildlife is using GigaPan to monitor jaguar-habitat restoration efforts in the U.S.-Mexican border region. This image shows one of several study sites where Miller zooms in on vegetation changes and erosion hotspots following the removal of livestock from the area. Explore this image at http://www.gigapan.org/gigapans/35185/

Photo by Craig Miller

NANO GIGAPANNING

Jay Longson of NASA’s Ames Research Center has rigged the GigaPan system to a scanning electron microscope. This image of an ant holding a fly is composed of 288 photos taken at 400X magnification. Explore this image at http://www.gigapan.org/gigapans/28295/

Photo by Jay Longson and Molly Gibson