Robots, mathematics, physics, climate simulations: All of these things rely on computers and computing. Computers, though, are quickly reaching their limits; they simply aren't fast enough, and they can't pack enough information into ever-smaller spaces.

Moore's Law has long been treated as a hard fact of computing. It states that the number of transistors on a chip, and with it computing power, doubles roughly every 18 months to two years. Yet that pace may not hold for much longer.

Two limits on current computers are energy and communication. The power consumption issue comes from the fact that the energy used by existing circuit technology doesn't shrink in proportion with shrinking physical dimensions. The speed at which computers can send information is likewise constrained by the limits of physics.

Communication is crucial. Compute nodes have to be able to talk to each other, and supercomputers simply can't be scaled up indefinitely without taking communication into consideration. New algorithms can partly relieve the issue, but eventually you're going to hit a barrier.

And that's exactly where optics come in. Optical circuits, alongside quantum computing, could speed up communication. Conventionally, photons are used only to deliver information, racing along fiber-optic cables; bringing more of these light-based components onto the chip itself could, in theory, speed up communication further.

This is what researchers are currently working toward. In 2013, scientists created a transistor switched by light. They created a cloud of chilled caesium atoms suspended between two mirrors. The transistor is set "on" by default and allows a beam of light to sail through the cloud, but sending a single "gate" photon through turns the switch off.

Yet this isn't the latest advance. A team of researchers has also built an array of light detectors sensitive enough to register the arrival of individual light particles, or photons, and mounted the array on a silicon optical chip. This could help move traditional computing toward quantum computing.

So what is the difference between normal computing and quantum computing? Quantum computing makes use of quantum-mechanical phenomena, such as superposition and entanglement, to perform operations on data; it harnesses the behavior of atoms and photons directly. It's a bit more complicated than that, of course, but in theory it could be far faster at certain calculations.

The main difference between the two forms of computing comes down to superposition. In classical computing, a bit is either on or off. A qubit has these same two states, yet it can also exist in both the on and off states simultaneously.

In theory, describing the state of n qubits requires an exponential amount of information — 2^n values — while the same number of classical bits holds only n values. This capacity could, for certain problems, allow far more information to be processed at once in the same physical size.
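The exponential scaling above can be made concrete with a toy state-vector simulation. This is a minimal sketch for illustration (the `hadamard` function and variable names are hypothetical, not from any quantum library): three qubits require 2^3 = 8 complex amplitudes, and applying a Hadamard gate to each qubit puts the register into an equal superposition of all eight basis states.

```python
import math

def hadamard(state, target):
    """Apply a Hadamard gate to qubit `target` of a state vector."""
    h = 1 / math.sqrt(2)
    new = state[:]
    for i in range(len(state)):
        if not (i >> target) & 1:      # pair up basis states differing in `target`
            j = i | (1 << target)
            a, b = state[i], state[j]
            new[i] = h * (a + b)
            new[j] = h * (a - b)
    return new

n = 3                                  # three qubits...
state = [0.0] * (2 ** n)               # ...need 2**3 = 8 amplitudes
state[0] = 1.0                         # start in the all-zeros state |000>

for q in range(n):                     # Hadamard on every qubit
    state = hadamard(state, q)

# All 8 basis states now have equal probability 1/8
probs = [abs(a) ** 2 for a in state]
print(len(state), [round(p, 3) for p in probs])
```

Note that the classical simulator must track all 2^n amplitudes explicitly, which is exactly why simulating large quantum systems on ordinary hardware becomes intractable as n grows.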

Today, researchers are using light as an information carrier in computing and calculating. Light, though, does come with challenges.

"It doesn't slow down, it's always moving, and that makes it very hard to use as a carrier of information," said Sergio Cantu, a second-year PhD student in physics at MIT, in an interview with MIT News. "How do you imprint information on something that you can't pin down?"

Experts in this field are using a technique called electromagnetically induced transparency to help with this issue, which allows them to slow the speed at which light propagates. With this method, together with better and more precisely controlled lasers, researchers are working toward ultra-fast computing.
