Silicon photonics has emerged as an area of such far-reaching potential that its challenges and benefits tend to be clouded in generalities. Using light as the primary medium does indeed promise to alter fields including biological and chemical sensing, navigation, radio frequency sensing, and communications, but for our purposes here, the potential within large-scale computing is of greater interest. On that point, take a look at the proceedings of a recent technical conference on photonic devices and note the range of problem areas, not to mention potential applications. And these just skim the topical surface.

On the computing interface and integration fronts, IBM has produced new chips that integrate optical and electrical components on the same die, HPE Labs has developed free-space optical interconnects, and Intel, with its long-standing ambition to replace copper with photonics in future datacenters, is rolling out silicon photonics switching this year. All three of these vendors, along with others, have pushed funds toward research and development and made solid cases for the performance potential. But despite all of these (and many other) efforts, why does this technology, with its promise of upending datacenter efficiency trends, still sit at the fringes? And to what extent is any of it ready for primetime?

We shouldn’t be holding our breath for photonic devices to make it to market and into large systems within the next year, or even the next few years, says David Calhoun, a PhD fellow at Columbia University’s Lightwave Research Laboratory who focuses on the integration of photonic devices with larger systems. There are some examples of such devices on the horizon for high performance computing (HPC), but for them to become ubiquitous there is quite a leap: perhaps ten years before they are a common element of extreme-scale systems, he says.

What’s interesting here is that the roadblocks are very much rooted in the manufacturability of such devices. To reach the economies of scale needed to bring photonic devices to market at a reasonable price, key technological barriers must be broken, and there must be enough incentive from the market to make photonic devices something key vendors invest in. In other words, there is something of a chicken-and-egg problem. Without the ability to prototype and implement silicon photonics-based devices inside of systems, manufacturability efforts stagnate, and nothing progresses toward photonic devices that can be created and tested at scale. The manufacturing process itself is far from simple, even for companies that are already producing transistors and chips.

Calhoun is one of several participants in the federally funded Integrated Photonics Institute for Manufacturing Innovation, which is helping to push the research, development, and ultimate production of photonic devices for the datacenter, military, communications, and other markets. Although disparate development efforts are happening around the world, particularly at the component level, there are not yet any bold, comprehensive strategies to integrate photonic devices into large systems at the scale, cost, and reliability required.

Much of this boils down to the fact that the field itself is scattered, and not just in terms of application areas. The way to tackle the photonics integration problem itself is not clear, or more accurately, it has not been settled on. Just as there are camps that believe silicon photonics will change supercomputing in the next decade (evenly balanced by camps that believe it will go nowhere), the research effort itself is split. This makes the challenge and opportunity for Calhoun’s group at the Institute even greater, but it doesn’t help us settle the question of where silicon photonics stands in 2016, not to mention five or even ten years from now.

“One of the biggest challenges, especially for photonics in computing, is bridging the gap between the novel functionality of these devices and the real need from the application side,” Calhoun says. “Data has a particular profile on a computing system and depending on the application, the way it moves through the network can change. So we’re spending a lot of effort in electrical networks looking for a happy medium—no one network can solve all problems at this point, electrically, with silicon photonics, or with some hybrid combination of those.”

There are three ways researchers and early developers of photonics for computing systems are looking at the integration problem: rapid prototyping, 3D integration, and a co-existence model in which photonics sit alongside electronic components on a single die. Of these, chipmakers like Intel and IBM are interested in the last, for obvious reasons, but also because they can manufacture such devices at scale using existing technologies. For the same reasons, there is also interest in the 3D or stacked approach, in which the photonics, electrical components, and compute layer are piled onto one another, kept separate but interacting.

Although these are all separate modes of research, they are interdependent, which complicates things. One cannot develop a hybrid approach to integration without first understanding and being able to implement each of the constituent approaches, at least not at this stage. And rapid prototyping on its own is not reliable and robust enough to suffice at manufacturing scale.

Some technologies already exist in the market and are being used in datacenters and HPC where there are applications for photonics, for instance, connecting one side of a datacenter to the other. Companies like Luxtera make small form-factor pluggable devices that provide an optical connection. But if we think beyond those use cases to powering large-scale supercomputers with such devices, they have to not only work, but be manufacturable, a problem Calhoun’s research center is seeking to address.

“We have significant plans to make these things widely available and provide a good basis for people in the data communications realm to say, here is a networking or computing problem, we examine and understand the problem, present a photonic architecture from the ground up and a manufactured product with full interfacing to implement it with all the engineering steps in between.” In essence, the government funding is pushing a new industry in this regard, starting with one of the biggest problems (interfaces and integration) and following it through to the manufacturing challenges.

At that intermediate step of problem solving are some tough physics and computer science challenges. “For instance, from a computing perspective, these devices tend to have a lot of input and output characteristics, so being able to get all the right data on and off the chip that holds the architecture is but one challenge.” There are already many fast prototyping solutions that tackle this problem, “but pretty good isn’t good enough when it comes to making something manufacturable at scale. We need it to be excellent—so there’s still a little bit of a revolution with the technology to be had, which we’re working on.”