Technologists have long known about silicon's limitations as the principal ingredient in microchips; they've simply been very good at devising workarounds. Still, as chipmakers continue to shrink computer circuitry to cram ever more components into smaller and smaller areas, those diminutive circuits eventually come face-to-face with the cold, hard laws of physics.

Put simply, they start to leak current, making them lousy at retaining digital information.

Even Gordon Moore, who has repeatedly predicted an end to his own law, has expressed amazement that companies like Intel have kept finding temporary solutions to these very real physical limitations for so long.

Well, that's about to change, according to Suman Datta, a researcher at Pennsylvania State University. During a talk at Britain's University of Leeds on Friday, Datta gave a hard deadline for silicon's remaining usefulness, saying that silicon chips have no more than four years of further miniaturization left. After that point, chipmakers will need to look elsewhere for their electronic-circuitry needs.

One potential replacement is carbon nanotubes. These tubes of pure carbon, which are about the width of a typical protein molecule, also happen to conduct electricity. As such, some researchers have proposed using them as tiny molecular-scale wires for making electronic circuitry. Unfortunately, they also cost about $500 a gram at the moment.

The other solution to be proposed at the Leeds physics conference in the coming week is superconductors, materials that conduct electricity with zero electrical resistance. Researchers will reportedly demonstrate new ways to harness quantum bits, or qubits, to boost computing power. To date, there remain very practical difficulties in building a quantum computer, and such machines have so far been used only to solve trivial problems.

Furthermore, even if we do switch conducting materials, Moore says some very fundamental limits on microelectronics will remain that we can't mess with: namely, the speed of light and the atomic nature of matter.

Looks like we'll have to revert to old-fashioned magic to overcome those hurdles.