Two researchers at MIT, working with an IBM foundry, have come very close to producing a monolithic optoelectronic computer chip: a chip that integrates both silicon electronics and optical interconnects.

This is primarily significant because of MIT’s use of a monolithic process, one in which all of a chip’s features are fashioned on a single die. Every modern chip is fabricated this way, but optical equipment currently is not. Today’s optoelectronic solutions pair a silicon chip with bulky off-chip optical devices, such as lasers, detectors, and modulators. These devices consume large amounts of electricity and are far too large to fit in a laptop or desktop.

As a result, telecommunications is the only sector that really makes use of optical networks. MIT’s thinking is that if monolithic optoelectronic chips can be produced, with lasers, waveguides, photodetectors, and modulators all on the same piece of silicon, companies like Intel and TSMC will be much more likely to pursue optoelectronic solutions. So far, MIT has produced a monolithic optoelectronic chip with on-die photodetectors, ring resonators, and waveguides, but the researchers haven’t yet succeeded at etching channels under the waveguides, which is necessary to prevent light leakage.

Vladimir Stojanovic, one of the researchers working on the project, admits that existing processes would have to be changed a little for monolithic optoelectronic chips to become a reality. He also says, however, that it could be easier to add optical components to chips built from the bottom up, which covers most “3D” designs, including Intel’s 22nm FinFET chips.

In the next few years, then, we are likely to see the first generation of optoelectronic chips. These early chips would have off-chip lasers, meaning they would likely be used for off-die interconnects, such as between your CPU and RAM. On-chip lasers are a little further away, but when they finally roll around we’ll be looking at multi-core chips that use light instead of electricity to communicate.

The main advantage of using light is a huge reduction in power consumption, and potentially a vast increase in bandwidth: in general, pushing more bandwidth over an electrical connection requires more power. Computers already account for a sizable share of global electricity use, and optical interconnects could go a long way toward keeping power consumption down in the long term. The same multiplexing techniques used to carry terabits per second over fiber optic networks would apply to computer chips, too.
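To see why multiplexing matters, a quick back-of-the-envelope sketch helps: with wavelength-division multiplexing (WDM), aggregate bandwidth is just the number of wavelength channels times the per-channel data rate. The channel count and data rate below are illustrative figures typical of fiber links, not numbers from the MIT work.

```python
# Illustrative WDM arithmetic; the 64-channel, 25 Gbps figures are
# assumptions typical of fiber links, not from the MIT chip itself.

def wdm_aggregate_bandwidth(channels: int, gbps_per_channel: float) -> float:
    """Aggregate bandwidth (in Gbps) of a wavelength-division-multiplexed link."""
    return channels * gbps_per_channel

# 64 wavelengths at 25 Gbps each, all sharing one waveguide:
total = wdm_aggregate_bandwidth(64, 25)
print(f"{total} Gbps = {total / 1000} Tbps")  # 1600.0 Gbps = 1.6 Tbps
```

The key point is that an electrical trace carries one signal, while a single on-chip waveguide could carry many wavelengths at once, so bandwidth scales with channel count rather than with wire count.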

Finally, MIT speculates that it should be possible to create chips where a single core can communicate with every other core simultaneously using lasers. Imagine a 100-core chip where each core has its own laser: instead of shuffling data to and from memory or cache, each core could simply fire data directly at another core. Not only would this save power, but it would be incredibly fast.
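A rough way to see the speed advantage: on a conventional electrical mesh interconnect, a message hops from core to core to reach its destination, while a direct optical link is always a single hop. The sketch below is a hypothetical comparison only; the 10×10 mesh topology and 100-core count are illustrative assumptions, not details of the MIT design.

```python
# Hypothetical comparison: average hop count between cores on a 10x10
# electrical mesh (Manhattan routing) vs. one hop for a direct
# core-to-core optical link. Topology and core count are assumptions.

def mean_mesh_hops(n: int) -> float:
    """Average hops between two distinct cores on an n x n mesh,
    using Manhattan distance (x hops plus y hops)."""
    cores = [(x, y) for x in range(n) for y in range(n)]
    total = sum(abs(ax - bx) + abs(ay - by)
                for ax, ay in cores for bx, by in cores)
    # Divide by the number of ordered pairs of distinct cores.
    return total / (len(cores) * (len(cores) - 1))

print(mean_mesh_hops(10))  # ~6.67 hops on average, vs. 1 optical hop
```

On this illustrative mesh the average message crosses more than six intermediate links, each costing latency and switching power; a per-core laser would collapse every one of those paths to a single hop.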

Read more at MIT