This picture, angled southwest-northeast, captures a huge swath of the state of Colorado: almost everything between Montrose and Fort Morgan. A tour of some highlights:

In the south of the image, you can see Black Canyon of the Gunnison, a gorge so deep that its bottom receives only 33 minutes of sunlight per day.

The resort town of Crested Butte hides in the middle:

As do Leadville and its airport*:

And in the image’s far-north corner, nearly all of metropolitan Denver can be seen:

As can Denver International Airport:

This image was taken by an unusual method. WorldView-3 turned around and looked across the surface of Earth to take it, a technique I’ve heard described as “looking through its legs.” DigitalGlobe has previously used this technique to photograph Mount Fuji from 1,500 miles away. And earlier this year, it also took a much-lauded oblique image of Nepal soon after the devastating series of earthquakes that struck that country. (I wrote about that photo—and the larger humanitarian effort that used it—in May.)

These are beautiful and useful images. But it is important to remember that these Earth pictures are used not only for humanitarianism and art, but also for intelligence-gathering and war. Such themes have long been intertwined when it comes to pictures of our common home. The first photograph of Earth from space, after all, was taken of a region not far from Colorado—and it was captured by a re-purposed V-2 missile.

I asked Kevin Bullock, a product specialist at DigitalGlobe, to tell me more about how this photo was made. With some editing for brevity, he told me:

Before I worked at DigitalGlobe, I thought all satellites just sort of hung around up in space, either communications satellites or stationary ones, just sitting there. Like the ISS, just on its orbit.

And when I joined DigitalGlobe, they said, oh no we actually have space-vehicle operators who fly our satellites 24/7. Meaning, our satellites are very maneuverable and they’re always moving around up in space. They’re on a fixed orbit, but they’re always looking in different directions.

It’s kind of like the Mars rover, where we’ll send a command to a satellite and 10 minutes later it will complete the operation, but we can never see it completing the operation. There’s this high level of trust that we have all the systems and everything’s communicating. It’s not like a remote-control airplane, where if you tell it to roll or land, you can actually see it land. We’re communicating with it, and then sometime later, it’s actually doing it way up in space. We have no knowledge of it doing it other than little bits of metadata it’s sending back to us, saying like “yeah, I hear you, I’m doing it,” that kind of thing.

What our satellites are doing is they’re actually taking a scan of the Earth. We have an array of detectors. People sometimes call it a push broom. We’re sweeping across the Earth’s surface, taking a scan of the surface of the Earth—it’s not a snapshot. It’s not a frame.
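Bullock's push-broom scan can be sketched in a few lines of purely illustrative Python. Here `read_detector_line` is a hypothetical stand-in for the actual detector readout:

```python
# Toy sketch of a push-broom scan: a 1-D detector array records one
# image line per instant, and the full image is the stack of lines.
# `read_detector_line` is a made-up stand-in for the real sensor.

DETECTORS = 8  # real arrays have tens of thousands of detectors

def read_detector_line(t):
    """Pretend detector readout at time step t (fake data)."""
    return [(t * DETECTORS + d) % 256 for d in range(DETECTORS)]

def push_broom_scan(n_lines):
    """Sweep along-track, stacking one line per time step."""
    return [read_detector_line(t) for t in range(n_lines)]

image = push_broom_scan(4)  # a 4 x 8 "image", built line by line
```

The key point is that each line is captured at a different moment, so any motion between readouts smears across the assembled image—there is no single instant when the whole frame exists.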

That sounds simple enough, but when you think of the satellite orbiting at 17,000 miles per hour, 400 miles up in space, and think of the fact that the Earth's surface is rotating at roughly 1,000 miles per hour, the Earth is actually moving underneath our scan. We have to be super precise in all of our measurements to turn the image into something useful; otherwise it's junk.
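Those speeds can be sanity-checked with some back-of-the-envelope arithmetic of my own (assuming a spherical Earth, Colorado's latitude of about 39 degrees north, and a hypothetical ten-second scan):

```python
import math

# Back-of-the-envelope: how far does the ground move during a scan?
# Assumptions (mine, not DigitalGlobe's): spherical Earth, sidereal
# day, Colorado at ~39 degrees N, a hypothetical 10-second scan.
EARTH_RADIUS_KM = 6378.0
SIDEREAL_DAY_S = 86164.0
LATITUDE_DEG = 39.0
SCAN_SECONDS = 10.0

equator_speed = 2 * math.pi * EARTH_RADIUS_KM / SIDEREAL_DAY_S  # ~0.465 km/s
surface_speed = equator_speed * math.cos(math.radians(LATITUDE_DEG))
drift_km = surface_speed * SCAN_SECONDS

print(f"Surface speed at 39N: {surface_speed:.3f} km/s")
print(f"Ground drift over a 10-s scan: {drift_km:.2f} km")
```

A few kilometers of drift is thousands of WorldView-3's 31-centimeter pixels, which is why the pointing and timing measurements have to be so precise.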

Onboard our satellite, we have GPS, and we have star trackers: two cameras pointed back at the stars. The stars make great positioning data points; we look at them and we can tell where our satellite is. Then we have an I.M.U., an inertial measurement unit, that measures how the satellite's moving; and then we have these massive gyroscopes, which use angular momentum to point the satellite.
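The sensors Bullock lists divide the labor: the gyros and IMU track fast motion, while the star trackers supply absolute fixes. A toy, one-axis Python sketch of that idea, with invented numbers (a real spacecraft fuses these sensors with full three-axis Kalman filtering):

```python
# Toy one-axis attitude estimate: integrate a biased gyro rate and
# periodically snap to an absolute star-tracker fix. All numbers
# here are made up for illustration.

TRUE_RATE = 0.001    # rad/s, the satellite's actual slew rate
GYRO_BIAS = 0.0002   # rad/s, uncorrected drift in the gyro reading
DT = 0.1             # s, integration step
FIX_EVERY = 300      # steps between star-tracker fixes (30 s)

true_angle = 0.0
estimate = 0.0
for step in range(1, 1001):                    # 100 s of flight
    true_angle += TRUE_RATE * DT
    estimate += (TRUE_RATE + GYRO_BIAS) * DT   # gyro alone drifts away
    if step % FIX_EVERY == 0:
        estimate = true_angle                  # absolute fix from the stars

open_loop_drift = GYRO_BIAS * 1000 * DT        # drift with no star fixes
error = abs(estimate - true_angle)
print(f"Error with star fixes: {error:.4f} rad (vs {open_loop_drift:.4f} without)")
```

The gyro integration drifts steadily, but each star-tracker fix resets the accumulated error, keeping the pointing estimate bounded between fixes.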

We were over the Indian Ocean, looking back at Nepal. That image helped with humanitarian response.

This Colorado image is such a high oblique. If you were sitting in Colorado, and were able to see our satellite, it was eight degrees off the horizon. Which is really low, right? When the sun gets that low, it starts looking different and turning different colors. And we can't actually program that into our satellite the usual way, because the optics are so much different from the typical operation. We actually program the satellite to look at stars that are behind the field of view and behind the Earth, so to speak. So we're looking at stars that aren't actually visible from the satellite's position; the Earth gets in the way, and that's how we capture the image.
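That eight-degree figure can be turned into distances with basic spherical-Earth trigonometry. This is my own calculation, using WorldView-3's published orbital altitude of roughly 617 kilometers:

```python
import math

# How far away is a satellite sitting 8 degrees above your horizon?
# Spherical-Earth geometry; WorldView-3 orbits about 617 km up.
R = 6371.0   # km, mean Earth radius
H = 617.0    # km, orbital altitude
ELEV = math.radians(8.0)

# Slant range from observer to satellite (law of cosines in the
# Earth-center / observer / satellite triangle, solved for range).
slant_km = math.sqrt((R + H) ** 2 - (R * math.cos(ELEV)) ** 2) - R * math.sin(ELEV)

# Earth-center angle between observer and sub-satellite point, and
# the corresponding distance along the surface.
beta = math.asin(R * math.cos(ELEV) / (R + H))  # angle at the satellite
psi = math.pi / 2 - ELEV - beta                 # angle at Earth's center
ground_km = R * psi

print(f"Slant range:     {slant_km:.0f} km (~{slant_km * 0.6214:.0f} miles)")
print(f"Ground distance: {ground_km:.0f} km (~{ground_km * 0.6214:.0f} miles)")
```

By this estimate the satellite was roughly 2,100 kilometers, about 1,300 miles, away along the slant path when it imaged Colorado—the same order of distance as the Mount Fuji shot.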

We were trying to look at something that was not physically visible from where the satellite was, but that’s how we, not tricked, but programmed our satellite to collect the image of Colorado.