But where is “there,” and what does it look like?

“There” is nowadays likely to be increasingly large, powerful, energy-intensive, always-on and essentially out-of-sight data centers. These centers run enormously scaled software applications with millions of users. To appreciate the scope of this phenomenon, and its crushing demands on storage capacity, let me sketch just the iceberg’s tip of one average individual’s digital presence: my own. I have photos on Flickr (which is owned by Yahoo, so they reside in a Yahoo data center, probably the one in Wenatchee, Wash.); the Wikipedia entry about me dwells on a database in Tampa, Fla.; the video on YouTube of a talk I delivered at Google’s headquarters might dwell in any one of Google’s data centers, from The Dalles in Oregon to Lenoir, N.C.; my LinkedIn profile most likely sits in an Equinix-run data center in Elk Grove Village, Ill.; and my blog lives at Modwest’s headquarters in Missoula, Mont. If one of these sites happened to be down, I might have Twittered a complaint, my tweet paying a virtual visit to (most likely) NTT America’s data center in Sterling, Va. And in each of these cases, there would be at least one mirror data center somewhere else — the built-environment equivalent of an external hard drive, backing things up.

Small wonder that this vast, dispersed network of interdependent data systems has lately come to be referred to by an appropriately atmospheric — and vaporous — metaphor: the cloud. Trying to chart the cloud’s geography can be daunting, a task that is further complicated by security concerns. “It’s like ‘Fight Club,’ ” says Rich Miller, whose Web site, Data Center Knowledge, tracks the industry. “The first rule of data centers is: Don’t talk about data centers.”

Yet as data centers increasingly become the nerve centers of business and society — even the storehouses of our fleeting cultural memory (that dancing cockatoo on YouTube!) — the demand for bigger and better ones increases: there is a growing need to produce the most computing power per square foot at the lowest possible cost in energy and resources. All of which is bringing a new level of attention, and challenges, to a once rather hidden phenomenon. Call it the architecture of search: the tens of thousands of square feet of machinery, humming away 24/7, 365 days a year — often built on, say, a former bean field — that lie behind your Internet queries.

INSIDE THE CLOUD

Microsoft’s data center in Tukwila, Wash., sits amid a nondescript sprawl of beige boxlike buildings. As I pulled up to it in a Prius with Michael Manos, who was then Microsoft’s general manager of data-center services, he observed that while “most people wouldn’t be able to tell this wasn’t just a giant warehouse,” an experienced eye could discern revelatory details. “You would notice the plethora of cameras,” he said. “You could follow the power lines.” He gestured to a series of fluted silver pipes along one wall. “Those are chimney stacks, which probably tells you there’s generators behind each of those stacks.” The generators, like the huge banks of U.P.S. (uninterruptible power supply) batteries, ward against surges and power failures to ensure that the data center always runs smoothly.

[Photo: Known for its bean and spearmint fields, Quincy, Wash., is also home to rows of servers in a 500,000-square-foot data center that Microsoft built in 2006. Credit: Simon Norfolk/NB Pictures, for The New York Times]

After submitting to biometric hand scans in the lobby and passing through a sensor-laden multidoor mantrap, Manos and I entered a bright, white room filled with librarylike rows of hulking, black racks of servers — the dedicated hardware that drives the Internet. The Tukwila data center happens to be one of the global homes of Microsoft’s Xbox Live: within those humming machines exists my imagined city of ether. Like most data centers, Tukwila comprises a sprawling array of servers, load balancers, routers, firewalls, tape-backup libraries and database machines, all resting on a raised floor of removable white tiles, beneath which run neatly arrayed bundles of power cabling. To help keep servers cool, Tukwila, like most data centers, has a system of what are known as hot and cold aisles: cold air that seeps from perforated tiles in front is sucked through the servers by fans, expelled into the space between the backs of the racks and then ventilated from above. The collective din suggests what it must be like to stick your head in a Dyson Airblade hand dryer.