Anytime you pay a visit to Mozilla.org, the home page of the open source Mozilla project, you're landing on a page served by a brand new type of machine: a SeaMicro SM10000 server, a fat box about the size of an air conditioner that houses 64 Xeon processors.

Mozilla has two SM10000s in its brand new Santa Clara data center, which went online in February. One of them powers Mozilla.org, and the second powers Mozilla's add-on website. It's part of a strategy to move lighter web-serving loads onto low-power systems, freeing up the bigger hard drive capacity, memory, and CPU power of the organization's blade servers so they can run databases. "Our thinking was take a web server, turn it into a database server, and take the web server and move it into SeaMicro," says Matthew Zeier, Mozilla's director of IT operations.

Until chipmaker AMD shelled out $334 million to buy it a few months ago, SeaMicro was one of those hot Silicon Valley startups that got some buzz for its innovative low-power server designs. But it's also, in a way, part of the Mozilla community. Mozilla signed up to be a guinea pig, testing one of SeaMicro's early servers. That machine, based on a lower-power Intel chip called Atom, didn't quite work out for the work Mozilla wanted to do. But SeaMicro not only gave the Mozilla team a free upgrade; it also let them keep the old Atom machine, which sits idle in a corner of Mozilla's data center for the time being.

Mozilla is one of the best-known brands in the open source world. Its Firefox browser runs on about a quarter of the world's PCs, but three months ago, the group crossed another important threshold: it became big enough to build its own data center.

Mozilla's new $3 million Santa Clara, California, facility – a shared floor in a large building run by wholesale data center provider Vantage – is tiny compared to the monster facilities run by the cloud giants. But it's also a community effort. After all, this is Mozilla – an organization that wants outside volunteers to work as its system administrators.

The data center world is full of secrets. Google, Apple, Amazon, and others consider many of their data center techniques a serious proprietary advantage and, accordingly, they keep them under wraps.

A year ago, Mozilla's Zeier started to think about consolidating his three Bay Area data centers into one large operation. They were starting to burn enough power that Zeier thought it might be time to move from a retail data center to a wholesale one. In a retail data center, you plug your servers into a pre-built facility. In a wholesale data center, you're given a big empty room with a handful of 250-kilowatt power hookups. The rest is up to you.

To figure out how to build its wholesale data center, Mozilla reached out to friends in the industry – people like Internet Software Consortium founder Paul Vixie and data center guru Dave Ohara, who connected them with other data center experts. Ohara and Vixie couldn't be reached for comment.

"It's only recently that there's been a larger interest in more of this collaboration and community good approach to this," says Derek Moore, Mozilla's manager of global data center operations. "We're still finding each other."

They got invited to tour some data centers. With some companies, though, there was more of a stony silence. "Through this whole process of discovery where we were reaching out to the community, we also ran into those people. We realized how many people didn't want to talk about it," Moore says. "There were people who simply wouldn't share any information at all."

Power is the single largest operational expense for data center operators, and companies such as Facebook, Netflix, and Rackspace have banded together, forming the Open Compute Project, to share power-saving techniques and server designs, in hopes that they can cut down on the competitive advantage wielded by the Googles and Amazons of the world. The Open Compute Project will host its third public get-together on Wednesday, as members try to share information and design secrets so that organizations like Mozilla can save more on power.

For a small player like Mozilla, joining the Open Compute Project and opening up its data center is the natural thing to do. So Mozilla was happy to have Wired come by and take pictures of what was going on.

"The ethos of Mozilla and how we started was all community based," says Zeier. "There's very little that we do that's behind closed doors. So I don't think doing a data center and talking about it was a weird thing."

The new data center – built right next to a Silicon Valley Power substation (always a reliable source of power) – went online in February. According to Zeier, it's designed for flexibility and power. Right now, Mozilla uses about a half-megawatt of power, but by year's end, as Mozilla adds more and more servers, its power consumption will double, he says.

The data center is mostly conventional, but it has a few quirks. There are the SeaMicro servers. The overhead beams are painted Firefox orange. And there's a rack of Mac Mini computers – consumer systems that you almost never see in a data center. Mozilla runs about 500 of these $600 PCs worldwide as part of its browser test-bed.

Zeier likes the inexpensive Mac Minis because they have the processing strength and features typical of PCs. They're not pumped-up server machines being tamed to do a PC's job. That makes them great for testing, but also a setup and maintenance nightmare. Each Mac Mini has a single power supply and a single network connection, and they're not designed to be slid in and out of server racks. "Once it's racked, you really can't move it," says Erica Muxlow, a project manager at Mozilla. And trickier still: the Minis can't be configured remotely. "They have to be onsite with a crash cart connecting directly to the machines to configure the OS," she says.

Clearly, getting the new data center operational has been a lot of work for Mozilla's small team, but now that the hardest stuff is behind them, Zeier makes it sound almost easy. "This isn't secret stuff. We've built data centers for 50 years," he says. "There's power, there's cooling, and there's walls."