Gadget buyers today can purchase PCs, cell phones and MP3 players with significantly more memory than their predecessors for just a few dollars more than they paid a few years ago. To wit: an 80-gigabyte iPod sells today for roughly the same price (around $250) as a model with just 30 gigabytes of memory did two years ago. But consumers are in for a rude awakening if technology makers fail to find a way to shrink memory components enough to continue packing more of them into ever-tinier gadgets.

Computers and other electronic devices use different types of memory to perform different functions. PCs use a hard disk drive, for example, to store large amounts of data for the long term, and random-access memory (RAM)—a form of "solid state" memory—to hold the data a processor needs to reach quickly and repeatedly. To keep memory capacity from being sacrificed as devices shrink, device makers are eyeing an experimental approach called "universal memory," which not only takes up less space but is also faster than the RAM now available. Currently only a handful of companies are investing in the budding technology, but that could change as new forms of universal memory emerge, most notably IBM's "racetrack" memory and Nantero, Inc.'s nano RAM (NRAM).

These join slightly more mature—yet still unproven—universal memories such as magnetoresistive RAM (MRAM), which uses magnetic polarization to store information permanently on a device's microprocessor, and "phase change" memory, which stores data by heating a glassy substance called chalcogenide so that its atoms rearrange into distinct states.

Each type of solid-state memory (which includes static RAM, dynamic RAM and flash) has its benefits and drawbacks. "Because of cost, it's often impractical to have all three, so the designer needs to make hard choices and sacrifice something (such as speed, battery life, et cetera) and go with only one or two out of the three memory types," says Greg Schmergel, Nantero's co-founder, president and chief executive officer.

SRAM, often used for cache memory in microprocessors, can rapidly read and write data without drawing much power, but it cannot hold as much data as DRAM. DRAM stores more, but it is slower than SRAM and requires more power, making it impractical for portable devices running on batteries. Flash's main advantage is that it retains information (such as an address book in a cell phone) even when the device is powered down, but it is not as fast or as durable as its counterparts: heavy use eventually wears it out and causes data loss.
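The trade-offs described above can be summarized in a rough comparison. The rankings below are a qualitative simplification of the article's claims (with flash ranked densest per dollar based on the price figures cited later), not measured specifications:

```python
# Qualitative comparison of the three solid-state memory types (1 = best).
# Rankings loosely paraphrase the article; they are not benchmark data.
memory_types = {
    "SRAM":  {"speed_rank": 1, "capacity_rank": 3, "nonvolatile": False},
    "DRAM":  {"speed_rank": 2, "capacity_rank": 2, "nonvolatile": False},
    "flash": {"speed_rank": 3, "capacity_rank": 1, "nonvolatile": True},
}

def best_for(attribute):
    """Return the memory type with the best (lowest) ranking for an attribute."""
    return min(memory_types, key=lambda m: memory_types[m][attribute])

print(best_for("speed_rank"))   # SRAM (fastest, used for processor caches)
print([m for m, props in memory_types.items() if props["nonvolatile"]])
# Only flash keeps its data when the power is off.
```

A "universal" memory would, in effect, collapse this table into a single row: one technology that ranks first on every attribute.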

To create NRAM, Woburn, Mass.–based Nantero places billions of carbon nanotubes on a silicon chip to store data. When an electrostatic force is applied, the carbon nanotubes move up and down to represent the 0s and 1s of data. The company has demonstrated, Schmergel says, that NRAM's approach has the speed and capacity to surpass other memory types and that it can be manufactured in existing chip-making facilities (called fabs). The question is whether major chipmakers such as Intel, Micron or Samsung can be persuaded that NRAM can be efficiently and reliably manufactured in high volume.
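As a loose mental model (not Nantero's actual circuit design), each NRAM cell can be pictured as a nanotube whose mechanical position encodes one bit and stays put without power:

```python
class NRAMCell:
    """Toy model of one NRAM bit: a nanotube sits in one of two positions.
    Purely illustrative; real cells use electrostatic actuation and
    electrical sensing whose details are beyond the article's scope."""

    def __init__(self):
        self.down = False  # nanotube's resting position encodes 0

    def write(self, bit):
        # An applied electrostatic force moves the tube up or down.
        self.down = bool(bit)

    def read(self):
        # The position persists with no power applied, so NRAM is
        # nonvolatile like flash.
        return 1 if self.down else 0
```

The key property this sketch captures is that the state is mechanical rather than a stored electrical charge, which is why the bit survives power-down.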

Researchers at IBM Almaden Research Center in San Jose, Calif., meanwhile, are placing their bets on racetrack memory, which stores data in a magnetic pattern on thousands of nanowires—each 1,000 times thinner than a human hair—arranged on a silicon chip. Pulses of electrical current use the spin of electrons to move the entire data pattern along the wire as though on a racetrack (hence the name). Using the electron's spin, rather than its charge, may allow for devices that consume much less energy, says Stuart Parkin, an IBM Fellow and manager of Almaden's magnetoelectronics group. IBM projects that it could, within the next decade, enable a handheld device such as an mp3 player to store about 500,000 songs or 3,500 movies—100 times more than is possible today—without increasing the device's cost or decreasing battery life.

Chipmakers are reluctant to use a new, unproven type of memory on their microprocessors that might raise the price tag of their products. "RAM is placed on the silicon chips used in electronic devices, and the companies that make these chips have it down to a science," says Jim Handy, a director with Objective Analysis, a semiconductor market-research firm based in Los Gatos, Calif. "Every year, they put more memory on smaller chips while keeping their prices relatively steady. The whole idea with memory is the cheapest wins."

However, as chipmakers such as Intel continually shrink their chips to serve as the brains of ever smaller devices, they need new approaches to memory that combine the best assets of solid-state and hard-disk memory while keeping costs comparable, if not lower.

"Consumers expect that, for the same price they paid last year for a cell phone, they can get a new one this year with more memory," says Handy, who adds that the role of universal memory is to allow this trend to continue. Flash memory is the cheapest, currently costing about $3 per gigabyte (a price that's expected to drop 40 percent by next year). Similarly, DRAM, which costs about $9 per gigabyte, will likely cost about $6 per gigabyte next year. SRAM goes for about $100 per gigabyte and does not change much in price from year to year.
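Those figures imply concrete year-over-year numbers. As a quick sanity check on the arithmetic (the percentages and prices are the article's projections, not new data):

```python
# Per-gigabyte prices quoted in the article, plus its projected changes.
flash_now = 3.00                     # flash: about $3/GB today
flash_next = flash_now * (1 - 0.40)  # projected 40 percent drop next year
dram_now, dram_next = 9.00, 6.00     # DRAM: the article's two figures
sram_now = 100.00                    # SRAM: roughly stable year to year

dram_drop = 100 * (dram_now - dram_next) / dram_now
print(f"Flash next year: ${flash_next:.2f}/GB")  # $1.80/GB
print(f"DRAM drop: {dram_drop:.0f}%")            # about a 33% decline
```

In other words, the $6 DRAM projection amounts to roughly a one-third price cut, while flash's 40 percent drop would bring it under $2 per gigabyte, keeping it the cheapest of the three by a wide margin.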

It's hard to say how much universal memory will cost in any of its potential incarnations. "Price projections for new memory technologies are not usually worth the paper they are printed on before those technologies have reached volume production," Handy says. Costs could go down once the technology catches on and is in full-scale production. In addition, manufacturers would be able to make smaller computer chips (that use less silicon) if universal memory is more compact than flash, DRAM or SRAM. That would mean smaller, more powerful electronics for many years to come.