The conclusion of our two-part series on AMD. Part one covered AMD's attempts to transform itself from a second-source supplier of Intel designs into a chipmaking powerhouse in its own right.

Athlon 64, and AMD’s competitive peak

Overall, the Opteron's K8 architecture was similar to the K7's, but with two key differences. The first was that the CPU incorporated the system's memory controller into the chip itself, which greatly reduced memory latency (albeit at the cost of some flexibility; new CPUs had to be introduced to take advantage of things like dual-channel memory and faster memory types like DDR2). This showed that AMD saw the benefits of incorporating more capability into the CPU itself, an instinct that would inform the later purchase of GPU maker ATI Technologies.

The K8's second, and biggest, benefit for servers was its 64-bit extensions. The extensions enabled AMD's chips to run 64-bit operating systems that could address more than 4GB of memory at a time, but they didn't sacrifice compatibility or speed when running then-standard 32-bit operating systems and applications. These extensions would go on to become the industry standard, beating out Intel's alternate 64-bit Itanium architecture—Intel even licensed the AMD64 extensions for its own compatible x86-64 implementation. (Itanium, by contrast, could run x86 code only with an immense performance penalty.)
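That 4GB ceiling is simple arithmetic: a 32-bit address can distinguish only 2^32 distinct bytes. A quick illustration in Python (purely to show the math; the 48-bit figure reflects the virtual-address subset that early AMD64 implementations actually wired up, not the full 64-bit space):

```python
# A 32-bit pointer can name 2**32 distinct byte addresses.
bytes_32 = 2 ** 32
print(bytes_32)            # 4294967296 bytes
print(bytes_32 // 2**30)   # 4 -- i.e., 4GiB, the 32-bit ceiling

# AMD64 widened addresses; even the 48-bit subset used by early
# implementations dwarfs the 32-bit space.
bytes_48 = 2 ** 48
print(bytes_48 // 2**40)   # 256 -- i.e., 256TiB of virtual address space
```

The point of the long-mode design was that this wider address space came alongside, not instead of, the existing 32-bit modes, which is why 32-bit software kept running at full speed.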

The K8 architecture was successful on the desktop in the form of the Athlon 64 lineup, but it was the Opteron server variants that brought AMD real success in the high-margin market. By the time Intel introduced dual-core Xeons based on the company's Core architecture in September of 2006, AMD had snapped up an estimated 25 percent of the server market. AMD continued to iterate successfully on K8 for a few years, tweaking the architecture, upgrading its manufacturing processes, and even helping to usher in the multicore era of computing with the Athlon 64 X2.

Despite these technical successes, AMD's financial situation had become precarious. Processor unit sales were falling, and margins on most chips dropped quickly after 2000. AMD also struggled with excess inventory; in the second half of 2002, AMD actually had "to limit shipments and to accept receipt of product returns from certain customers," it announced, because its chips weren't selling fast enough. The company posted a net loss of $61 million in 2001, $1.3 billion in 2002, and $274 million in 2003.

What was sucking away the company's money? It was those darned fabs, just as Raza had feared. In the company's 2001 10-K, AMD estimated, "construction and facilitation costs of Dresden Fab 30 will be approximately $2.3 billion when the facility is fully equipped by the end of 2003." There was also a $410 million investment in AMD Saxony, the joint venture and wholly owned subsidiary that managed the Dresden fab.

By the following year, AMD upped its estimated costs to fund Dresden to $2.5 billion and added that by the end of 2001, it had invested $1.8 billion. The estimated costs continued to rise, as per the 2003 10-K: "We currently estimate that the construction and facilitation costs of Fab 30 will be $2.6 billion when it is fully equipped by the end of 2005. As of December 29, 2002, we had invested $2.1 billion in AMD Saxony." That same year, AMD plowed ahead with a new Dresden fab ("Fab 36"), investing $440 million into it by the end of the year.

Funding these huge investments depended on AMD's ability to sell chips, and selling chips was easier while AMD held a competitive edge over Intel. Unluckily for AMD, Intel didn't take this challenge lying down.

Intel resurgent

AMD's high point was, in most respects, one of Intel's lowest. "Clearly [AMD] had a very competitive product in Opteron in particular," Intel spokesperson Bill Calder told Ars, "and there was a lot of consternation inside of Intel and a lot of work going around trying to correct the problem and trying to counter not only in the market but in the press. At the time, there was quite a bit of focus on the competitive threat from AMD, but it was also very much a rallying call inside of Intel."

Even as AMD was beating Intel soundly with the Opteron server parts, the AMD64 extensions, and the Athlon desktop parts, Intel was sowing the seeds of what would become one of the company's most resounding successes: the Core architecture. By 2003, it was becoming clear that the NetBurst architecture powering the Pentium 4 wasn't scaling the way Intel had planned—the company had hoped to push the chips' clock speeds all the way up to 10GHz, but even at 4GHz, the Pentium 4's heat and power consumption were causing reliability problems. The same heat and power issues also made NetBurst ill-suited for the growing laptop segment. Rather than modify the Pentium 4's architecture to work better in laptops, the company went back to the drawing board and assigned a small team in Israel to a project known as Banias. This chip would go on to become the Pentium M, the basis of Intel's successful Centrino marketing push (Centrino bundled a Pentium M processor, an Intel chipset, and Intel 802.11b and 802.11g wireless adapters).

Pentium M didn't start from scratch; instead, it went back to Intel's Pentium III architecture and modified it to increase performance and efficiency. Pentium M also refined power-saving technologies like SpeedStep, which dynamically adjusted the CPU's clock speed and voltage depending on how heavily the chip was being used.

The CPU was such a success for Intel in laptops that, when the NetBurst architecture's time was up, the company set about adapting the Pentium M's architecture for desktops and servers as well. It ramped up the Pentium M's clock speed, added 64-bit extensions (licensed, of course, from AMD), and added a second CPU core, which provided the basic ingredients for the Core 2 Duo. (The original Core Duo and Core Solo were sold only in laptops and lacked 64-bit extensions; Core 2 Duo was this architecture's first foray into non-mobile form factors.)

This Core architecture accomplished several important goals: it gave Intel a fast, power-efficient 64-bit Xeon in the server market to stem Opteron's tide; it took back the symbolically important performance crown in the desktop market; and it was much more power-efficient than AMD's laptop chips right at the time when laptops began to outsell desktops for the first time. (AMD's power consumption in laptops became competitive only recently with 2011's Llano and 2012's Trinity parts.)

The Core architecture hit AMD where it hurt, but the biggest damage to AMD's long-term health came from Intel's execution strategy. Beginning around the same time, Intel moved to a system of smaller but aggressively timed processor updates that it called "tick-tock."

Every year, Intel would introduce a new processor lineup—the "ticks" would gently tweak a CPU architecture and move it to a smaller, lower-power manufacturing process, while the "tocks" would remain on the established manufacturing process and introduce more drastic architectural changes. This system limits the risk that a new process or architecture will run into significant problems during the manufacturing stage, and new processor iterations can be introduced so quickly that a competitor with a superior architecture won't necessarily be able to stay on top for years, as AMD did with K8.

No single Intel architecture, Core included, has left AMD behind all on its own, but Core 2 kicked off a relentless string of well-executed Intel CPUs. While AMD's CPUs continued to improve, the company was over time shut out of the high-end market once more and forced to compete mainly on price, mirroring its early struggles. It also didn't help that, just as Intel was churning out its best products in years, AMD was trying to swallow another company whole.