June 6, 2005, seemed to be a triumphant moment for Intel. The chipmaker was already dominating the market for processors that powered Windows-based PCs. Then Steve Jobs took the stage at Apple's Worldwide Developers Conference to announce that he was switching the main Windows alternative, Macintosh computers, to Intel chips as well. The announcement cemented Intel's status as the leading company of the PC era.

There was just one problem: The PC era was about to end. Apple was already working on the iPhone, which would usher in the modern smartphone era. Intel turned down an opportunity to provide the processor for the iPhone, believing that Apple was unlikely to sell enough of them to justify the development costs.

Oops.

On Tuesday, Intel announced that it was laying off 12,000 employees, 11 percent of its workforce, the latest sign of the company's struggle to adapt to the post-PC world. Intel still isn't a significant player in the mobile market — iPhones, iPads, and Android-based phones and tablets mostly use chips based on a competing standard called ARM.

Intel is still making solid profits; it just announced a $2 billion profit for the first quarter of 2016. But its growth has stalled, and Wall Street is getting worried about its future.

Obviously, Intel made a mistake by missing out on the iPhone business. Intel's error in judgment is a classic example of what business guru Clay Christensen calls "disruptive innovation." The term disruption has become so overused in the technology world that it's sometimes treated as a joke. But Christensen gave it a more precise meaning that fits Intel's situation perfectly: a cheap, simple, and less profitable technology that gradually erodes the market for a more established technology.

Intel is just the latest in a long line of companies that have failed to effectively deal with this kind of disruptive threat.

Smartphones are based on a different chip standard than PCs

Intel invented a chip standard called x86 that was chosen for the IBM PC in 1981 and became the standard for Windows-based PCs generally. As the PC market soared in the 1980s and 1990s, Intel grew with it.

The key to success in the PC business was performance. Chips with more computing power could run more complex applications, complete tasks more quickly, and run more applications at the same time. During the 1990s, Intel and its rivals raced to increase their chips' megahertz ratings, a measure of how many millions of clock cycles the chips could execute each second.

One thing these early chipmakers didn't care about was power consumption. Higher-performance chips often consumed more energy, but this didn't matter because most PCs were desktop models plugged into the wall. Even laptops had large batteries and could be plugged in most of the time.

But this became a problem in the late 2000s, when the market began to shift to smartphones and tablets. These devices had smaller batteries (to keep the weight down), and users wanted to use them all day on a single charge. Existing x86 chips were a poor fit for these new applications.

Instead, smartphone makers turned to a standard called ARM. Created by a once-obscure British company, it was designed from the ground up for low-power mobile uses. In the mid-2000s, ARM chips weren't nearly as powerful as high-end chips from Intel, but they consumed a lot less power, which was important for smartphones from Apple and BlackBerry.

Even better, the ARM architecture is designed for customization. ARM licenses its design to other companies such as Qualcomm and Samsung, which make the actual chips. That provides flexibility that allows smartphone makers to combine a number of different functions on a single chip. And packing a bunch of functions — like data storage and image processing — onto one chip helps to keep power consumption down.

Today, ARM chips totally dominate the mobile device business. iPhones and iPads run on Apple-designed chips such as the A9 (and predecessors like the A8 and A7) that are based on the ARM platform and manufactured by chipmakers like Samsung and TSMC. Most Android-based phones run on ARM-based chips from Samsung, Qualcomm, and other ARM chipmakers.

The mobile revolution is leaving Intel behind

Intel had not just one but two opportunities to become a major player in the mobile chip market. One was the opportunity to bid on Apple's iPhone business. The other was XScale, an ARM-based chip business Intel owned until selling it for $600 million in 2006.

Intel sold XScale because it wanted to double down on the x86 architecture that had made it so successful. Intel was working on a low-power version of x86 chips called Atom, and it believed that selling ARM chips would signal a lack of commitment to the Atom platform.

But Atom chips didn't gain much traction. Intel has made a lot of progress improving the power efficiency of its Atom chips, but ARM-based chipmakers are experts at building low-power chips, having focused on that task for more than a decade, so they had a big head start. And at this point, ARM has a huge share of the market. That gives ARM chipmakers all of the advantages that come with being a dominant platform: more engineers, better software.

Intel's decline is a classic story of disruptive innovation

On one level, you can say that Intel just got unlucky and backed the wrong horse. The chipmaker could have tried harder to win Apple's iPhone contract, and it could have bet on its XScale ARM subsidiary instead of trying to create Atom processors. But it chose not to.

But on a deeper level, it's not surprising that Intel took the path it did, and Christensen's theory of disruptive innovation explains why.

Intel's basic problem was that the mobile chip market didn't seem profitable enough to be worth the trouble. Intel had built a sophisticated business around the PC chip. Its employees were experts at building, selling, distributing, and supporting PC chips. This was a lucrative business — often Intel could charge several hundred dollars for its high-end chips — and the company was organized around the assumption that each chip sale would generate significant revenue and profits.

Mobile chips were different. In some cases, an entire mobile device could cost less than the price of a high-end Intel processor. With many companies selling ARM chips, prices were low and profit margins were slim. It would have been a struggle for Intel to slim down enough to turn a profit in this market.

And in any event, Intel was making plenty of money selling high-end PC chips. There didn't seem to be much reason to fight for a market where the profits looked so thin.

What this analysis missed, of course, was that the mobile market would eventually become vastly larger than the PC market. ARM-based chipmakers might make a much smaller profit per chip, but the market was destined to grow to many billions of chips per year. Even a small profit per chip multiplied by billions of chips could add up to a big opportunity.
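The arithmetic here is simple but easy to underestimate. Using hypothetical round numbers (these are assumptions for illustration, not Intel's actual figures), a back-of-the-envelope comparison might look like this:

```python
# Hypothetical round numbers purely for illustration -- not actual industry figures.
pc_profit_per_chip = 100       # dollars of profit on a high-end PC processor (assumed)
pc_chips_per_year = 300e6      # rough scale of annual PC chip sales (assumed)

mobile_profit_per_chip = 2     # slim margin on a commodity ARM-based chip (assumed)
mobile_chips_per_year = 20e9   # tens of billions of ARM chips shipped per year (assumed)

pc_total = pc_profit_per_chip * pc_chips_per_year
mobile_total = mobile_profit_per_chip * mobile_chips_per_year

print(f"PC market profit pool:     ${pc_total / 1e9:.0f} billion")
print(f"Mobile market profit pool: ${mobile_total / 1e9:.0f} billion")
```

Even with a 50-to-1 difference in profit per chip, the sheer volume of the mobile market makes its total profit pool comparable to, or larger than, the PC market's.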

Meanwhile, Intel had to worry that jumping wholeheartedly into low-power mobile chips would undermine demand for its more lucrative desktop chips. What if companies started buying Intel's cheap mobile chips and putting them in laptops? That could hurt Intel's bottom line more than the added mobile revenue would help it.

Obviously, Intel's leadership now recognizes the mistake. The company is so far behind that it will be a struggle to gain a foothold in the new market. And as cheap mobile chips get more powerful, we can expect more companies to put them into low-end laptop and desktop computers, eroding demand for Intel's more expensive and power-hungry chips.

Chipmakers are doing to Intel what Intel once did to Digital Equipment Corporation

Ironically, Intel is now suffering the same fate that it inflicted on an earlier generation of computing innovators three decades ago. In the 1980s, there was a thriving community of "minicomputer" makers led by Digital Equipment Corporation (DEC).

These washing machine–size minicomputers were only "mini" compared to the room-size mainframe computers that preceded them, and they cost tens of thousands of dollars.

Early PCs based on Intel chips were referred to as microcomputers, and companies like DEC dismissed them as toys. They did this for exactly the same reasons Intel dismissed the mobile market — selling a $2,000 PC was a lot less profitable than selling a $50,000 minicomputer, and DEC didn't expect PCs to be a big enough market to be worth the effort.

Of course, that turned out to be totally wrong. The PC market turned out to be vastly larger than the minicomputer market, just as the mobile market is now much larger than the PC market. But by the time this became clear, it was too late. DEC and most of its peers were forced out of business by the end of the 1990s.