The price of photovoltaic hardware has dropped so dramatically in recent years that, according to some projections, a well-sited panel may become competitive with fossil fuels before the decade is out. To reach that point, generally put at a cost below $2 per Watt, prices will have to continue their steep decline. During our visit to IBM's Watson research center, we talked to two people who are working on ways to drive the cost down, but they are taking radically different approaches.

The panels that most people are familiar with use silicon as the semiconductor. That has a few advantages, like cheap raw materials and reasonably high efficiency. But manufacturing the panels remains expensive, and there aren't obvious ways of squeezing large gains in efficiency out of standard silicon. So IBM is exploring two approaches that move beyond the standard silicon panel: thin films and concentrating photovoltaics.

Thin is in

We talked with David Mitzi, who manages the thin film project. These materials currently tend to be less efficient than silicon-based devices, but they have a large advantage: they can be much less expensive to manufacture. One key to this difference is that the boundaries between crystals in thin-film materials don't pose a barrier to the charge carriers (electrons and holes) generated by incoming light. While high-performance silicon cells require a manufacturing technique that produces a single large crystal, it's possible to use polycrystalline forms of thin film materials.

That opens the door to solution-based manufacturing: a dissolved form of the photovoltaic material is simply layered on top of a metal base, and the crystals precipitate out of the solution. The resulting materials can be flexible and foldable.

Unfortunately, at this point, they're still expensive. One common thin film material is called CIGS, for copper-indium-gallium-selenide. Two of those elements, gallium and indium, remain pricey, which raises the cost of the thin films as a whole. IBM is working with collaborators to develop CZTS, in which zinc and tin replace the indium and gallium. These elements are significantly cheaper, which should drop the cost of the material as a whole. Unfortunately, the thin films made with CZTS aren't yet efficient enough to be commercially appealing. CIGS films are currently at 20 percent efficiency in the lab (meaning 20 percent of the energy in the incident sunlight gets converted to electricity), and the ones on the market are around 13 percent. In contrast, CZTS had been stuck under eight percent.
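To get a feel for what those efficiency figures mean in practice, here's a back-of-envelope sketch. It assumes the standard test irradiance of 1,000 W/m² (an assumption for illustration; real-world insolation varies with site and weather); only the efficiency percentages come from the numbers above.

```python
# Rough module output at the efficiencies quoted above, assuming the
# standard test irradiance of 1000 W/m^2 (illustrative assumption).

IRRADIANCE_W_PER_M2 = 1000.0  # standard test condition

def output_watts(efficiency: float, area_m2: float) -> float:
    """Electrical output of a panel of the given area and efficiency."""
    return IRRADIANCE_W_PER_M2 * area_m2 * efficiency

for name, eff in [("CIGS (lab)", 0.20), ("CIGS (market)", 0.13), ("CZTS", 0.08)]:
    print(f"{name}: {output_watts(eff, 1.0):.0f} W per square meter")
```

The gap is stark: each square meter of a lab-grade CIGS film would deliver roughly two and a half times the power of a CZTS film stuck at eight percent.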

Mitzi told Ars that there are teams within IBM that can do modeling to estimate the maximum efficiency of various forms of CZTS. His group tries to figure out what manufacturing techniques can get them closer to that maximal efficiency. When we talked to him, his team had already had a paper accepted for publication (since released) that announced the first CZTS thin film that hit 11 percent efficiency. He estimates that they'll need to get to about 15 percent before commercialization, so they are about halfway there.

Doing more with less

An alternate approach to dropping the cost per Watt involves what's called concentrating photovoltaics. There are some forms of photovoltaic chips that have a far higher efficiency than silicon, but are too expensive for use in standard, flat-panel arrays. To make these economical, you have to send them as much light as possible—the hardware that concentrates the sunlight on the chip is what gives the technology its name.

Bob Sandstrom, who provided a tour of some of the test equipment IBM has set up outside of Watson, said that it's possible to buy triple-junction photovoltaic devices from a division of Boeing called Spectrolab that can exceed 36 percent efficiency. But this hardware is significantly more expensive than the same-sized silicon cell.

To get a better return out of these cells, it's possible to use lenses to send these devices more photons. Doing that creates separate problems: the lenses can be expensive, and a good lens can send temperatures on the chip up to hundreds of degrees Celsius.

IBM first got interested in the chip-cooling problem when it developed a gallium-indium alloy that stays liquid across a broad range of temperatures, and found the alloy could serve as an efficient thermal coupling between a chip and a heat exchanger. Although the first thought was to use it for cooling computer processors, the people in IBM's Smarter Planet group decided it could also be used to draw the heat off one of Spectrolab's chips. To develop the system, IBM partnered with the King Abdulaziz City for Science and Technology (KACST) in Saudi Arabia, which was interested in increasing its use of renewable power, including the possibility of using it to run a desalination plant.

Sandstrom showed off what he called the first-generation design: a collection of large aluminum cones, each with a Fresnel lens across the wide end to focus the sunlight on the chip, concentrating it to 1,600 suns. At the back was a bulky heatsink that looked like it had been pulled from a server made in the Pentium 4 days. A set of these cones was positioned on a large tracker that keeps them directed toward the Sun.
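That "1,600 suns" figure is just geometry: it's the ratio of the lens aperture area to the cell area. A minimal sketch, where the lens and cell dimensions are hypothetical values chosen to reproduce the 1,600× concentration quoted above:

```python
# Geometric concentration ratio: aperture (lens) area divided by cell area.
# The 1,600-suns target comes from the article; the specific lens and cell
# areas below are hypothetical, chosen only to reproduce that ratio.

ONE_SUN_W_PER_M2 = 1000.0  # nominal direct irradiance of "one sun"

def concentration_ratio(lens_area_m2: float, cell_area_m2: float) -> float:
    """How many 'suns' the optics deliver onto the cell."""
    return lens_area_m2 / cell_area_m2

def flux_on_cell(ratio: float) -> float:
    """Power flux on the cell, in W/m^2, at the given concentration."""
    return ratio * ONE_SUN_W_PER_M2

ratio = concentration_ratio(lens_area_m2=0.16, cell_area_m2=0.0001)
print(f"concentration: {ratio:.0f} suns")
print(f"flux on cell: {flux_on_cell(ratio):,.0f} W/m^2")
```

At 1,600 suns, the cell sees on the order of 1.6 MW/m², which is why cooling dominates the engineering: nearly two-thirds of that energy arrives as heat rather than electricity.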

The whole thing was an impressive feat of engineering, but it was also costly.

Nearby, however, was a second-generation device. The cones were gone, replaced by a rectangular aluminum box with just a few simple fins on the back to serve as a heat sink (the box itself also radiates heat). Instead of a single, large lens, the box had a sheet of material with rows of individual lenses imprinted on it. Each of these fed a small glass cone that focused the light even further. The liquid metal cooling system that got the whole project started was also gone (though Sandstrom wasn't at liberty to tell us what replaced it).

Aside from the design changes, the intervening years had also made the inverters that convert the output to alternating current much more efficient. The net result of all of this? It's approaching $2/Watt (but it's not quite there yet).

Context is key

Why would one company work on two different technologies? This isn't a case of increasing the odds that one of the avenues of research will work out. Instead, the two different types of photovoltaic devices should work in very different contexts.

The concentrator technology requires direct sunlight, and lots of it; that's why it needs the expensive tracker hardware and a pointing system that includes GPS data to keep it facing the Sun. It also means that it works best in places with very few clouds, like desert environments. That's convenient, since the heat in those environments tends to drop the efficiency of regular silicon cells; in contrast, the constant cooling required by the concentrating system means that it should run at a nearly constant efficiency.

In fact, the good fit between their system and desert environments has the IBM engineers considering ways they might use some of their system's waste heat to assist in the desalination process.

But there will always be plenty of places without a constant supply of direct sunlight. If prices on solar panels drop enough, though, even those locations can make sense as sites for solar power. By pushing to make thin film solar cheaper, IBM may help ensure that it has a technology appropriate for these sites as well.