With the launch of its Haswell architecture in 2013, Intel (NASDAQ:INTC) introduced a then-new branding scheme for the graphics technology embedded in its personal computer processors.

The standard, relatively low-performance graphics processors would be marketed as Intel HD Graphics. A step up from Intel HD Graphics would be a technology marketed as Intel Iris graphics. And, at the very highest end of the product stack, Intel had a technology called Iris Pro.

Intel's reasoning for building multiple tiers of graphics technology was simple: it believed it could charge more for chips with more sophisticated graphics technology, thereby boosting its processor average selling prices and, ultimately, its profits.

Intel's Iris graphics, which was primarily found in Intel's low-power notebook processors, saw a reasonable degree of success as major system vendors like Apple (NASDAQ:AAPL), Dell, and others adopted it.

The company's Iris Pro graphics, on the other hand, was found on higher-performance notebook (and, in some cases, desktop) parts.

Iris Pro, unfortunately, didn't see much success. Apple adopted the first-generation Iris Pro parts in its 15-inch MacBook Pro and today uses Intel's second-generation Iris Pro parts in its 21.5-inch iMac. However, the Mac maker abandoned Iris Pro with the 15-inch MacBook Pro it launched back in October, opting instead to pair a discrete graphics processor with an Intel chip that includes just Intel HD Graphics.

I expect that the next 21.5-inch iMac will also feature a discrete graphics processor paired with an Intel processor that includes only HD Graphics.

Iris Pro appears dead, as the company did not release an Iris Pro product as part of its recently launched seventh-generation Core processor line. In this article, I'd like to go over how Intel could potentially resurrect Iris Pro.

Why was Iris Pro cancelled?

Intel never "officially" explained why its Iris Pro product line was canned, but the bottom line is this: Intel builds chips that its customers (the major personal computer vendors) want to buy. If there were significant demand from Intel's major customers for Iris Pro parts, then Intel would surely have continued to build them.

So, Intel stopped building Iris Pro because it didn't see demand for the products.

This leads us to the question of why customers didn't find value in Iris Pro. That one's easy to explain, too. With Iris Pro, Intel added a lot more graphics resources to its chips, which in turn led to a ballooning in chip size. Larger chips are inherently harder and more expensive to produce than smaller ones.

Not only were the Iris Pro chips themselves much more expensive to build than their non-Iris Pro counterparts, but the use of Iris Pro also necessitated the inclusion of an on-package memory technology known as eDRAM (embedded DRAM). This further added to cost.

Higher cost isn't, by itself, a problem; the alternative to a larger Iris Pro chip with embedded memory is a stand-alone graphics processor, which also adds significant cost.

The problem was that Iris Pro added meaningful cost while its performance and power efficiency were rather disappointing. It simply made more sense for computer makers to use relatively low-end stand-alone graphics processors than to use Iris Pro.

How does Intel fix these problems?

The first step Intel needs to take to fix the issues that sank Iris Pro is simply to build a better graphics architecture. I expect Intel's graphics architectures to continue improving over time, and perhaps within the next couple of generations they will become robust enough to realize the original vision behind Iris Pro (i.e., replacing low-end discrete graphics processors).

Beyond that, though, there's still the issue of manufacturing cost.

Intel recently talked about how it plans to chop up its future personal computer processors into discrete blocks, rather than try to integrate everything into a monolithic piece of silicon.

This so-called "disaggregation" doesn't fundamentally change how much area a high-performance graphics block takes up (so there's still a cost adder). However, since the graphics portion would be separated from the rest of the chip, the yield (and therefore cost) issues associated with a large monolithic silicon die would largely go away.

Foolish takeaway

I have no idea if Intel is currently planning to resurrect its Iris Pro graphics technology in future chips. Leaks suggest, at the very least, that Iris Pro won't be coming back in the company's eighth-generation Core processors.

It'll be interesting to see if Intel changes course with its ninth-generation Core processors or beyond. If the company can develop more compelling and more cost-effective solutions than its prior attempts at Iris Pro, then system vendors might be interested in buying such chips in the future.

However, if Intel can't fundamentally tackle the challenges that plagued the previous Iris Pro parts, then Iris Pro will remain in the graveyard of failed Intel product lines.