AMD is finally turning the big 5-0, and what a rollercoaster of a half century it’s been for the Silicon Valley startup. You may not know it from the company before you today, but it’s not been an easy journey for ol’ AMD. Rather, it’s been a perilous venture complete with highs, lows, and various lawsuits that have left this underdog teetering on the edge of collapse more than once in its history.

AMD’s staff now numbers in the tens of thousands, spread across more than 23 countries. It holds offices, R&D labs, and even a couple of manufacturing facilities across Europe; in the US, including its headquarters in Austin, Texas and Santa Clara, California; South America; across Asia; and all the way Down Under. But it all started in Silicon Valley, just a short hop away from the present day headquarters of both Intel and Nvidia.

From the very beginning AMD has been a company willing to dive headfirst into the ‘next big thing’, risking it all for a chance at the big bucks. Nowadays that attitude has it taking on Intel with AMD Ryzen, and soon it could – for the first time in a very long time – rule the market with its AMD Ryzen 3000 processors. But while AMD may boast a market cap of some $30bn nowadays, it is still, as it always has been, an underdog.

And here we are at the red team’s semi-centennial. To celebrate the company’s successes, and wildly fascinating story, we’ve put together a history of AMD, from its founding in 1969 until today, to show just what it takes to make it big in semiconductors and take on the giant that is Intel at its own game.



Image courtesy of Christian Bassow, CC-BY-SA-4.0

Second source years (1969 – 1985)

AMD was incorporated on May 1, 1969 by Jerry Sanders along with a few loyal pals from Fairchild Semiconductor. Sanders had decided to go it alone, and, whether he would ever admit it or not, walk in the footsteps of Robert Noyce and Gordon Moore. The well-known semiconductor high-flyers left Fairchild to found Intel just one year prior to Sanders’ own solo escapade.

“Sanders wore expensive suits, drove two Rolls-Royces, owned a home in Beverly Hills, and worked out at the same gym as Sylvester Stallone” – Hector Ruiz

Sanders is a fascinating guy. Hector Ruiz (AMD CEO, 2002 – 2008) would later describe AMD’s founder in his grandiose 2013 book Slingshot: AMD’s Fight to Free an Industry from the Ruthless Grip of Intel (Greenleaf Book Group Press) as “six foot two and fit, with a shock of cotton-white hair and a perfectly manicured beard. Sanders wore expensive suits, drove two Rolls-Royces, owned a home in Beverly Hills, and worked out at the same gym as Sylvester Stallone.”

But for all his eccentricities and ego, he was diabolically committed to chasing down business, and it wouldn’t be long before AMD had corralled work as a second source supplier for Fairchild and a few other important semiconductor manufacturers. It also began producing its own logic and RAM chips.

It wasn’t until Intel launched the first commercial microprocessor, the 4004 in 1971, that AMD would start travelling down the road that would lead it to where it is today. Intel followed up on its initial successes with the Intel 8080 in 1974, and that’s when AMD got to work reverse-engineering the company’s efforts, lopping off the top of an Intel 8080 and figuring out what made it tick. One year later it had produced the Am9080, an unlicensed clone of Intel’s microprocessor.


With pressure from customers for second source suppliers – companies that are able to manufacture copies of first-party designs – Intel turned to AMD for help. The two entered a cross-licensing agreement, and AMD racked up some sweet business off the back of Intel’s success. Intel would soon create the Intel 8086 after all, and that chip, with its fabled instruction set, would become seminal to both companies’ futures.

By 1981, IBM was building the IBM Personal Computer, and it wanted Intel inside (that particular marketing slogan wouldn’t actually appear until 1991). But there was one condition: IBM would only sign on if Intel lined up a whole bunch of second source manufacturers. AMD was a prime candidate – along with six other companies that would also net the business. The two chipmakers entered a 10-year licensing agreement beginning in 1982, promising each other the right to sell products based on the other’s designs.

Intel could now sell to IBM, whose PC would sell so fast that the New York Times described its success as having “surprised many people, including IBM itself.” And AMD was able to pounce on a whole lotta business, too.



Image courtesy of Cosmic73, CC-BY-SA-4.0

Taking on Intel at its own game (1984 – 1996)

In 1984, the silicon love affair ended. Intel was reluctant to hand over any more of its secrets to AMD, and the companies would be dragged into a colossal tiff that still resonates between the two today.

By the mid ’80s, the integrated circuit market was getting tougher, and Intel decided to withhold the code for its 3rd generation x86 processor, the 80386, from AMD in an attempt to maintain market dominance.

That move wouldn’t hold off AMD forever, but it did throw a rather considerable spanner in the works. To keep orders flowing, the red team (or actually green team, at the time) needed to reverse engineer every single Intel chip from the 386 onwards, and it wouldn’t be until 1991 that AMD managed to clone the 386, a chip Intel had launched back in 1985. Similarly, it would take AMD until 1993 to create its clone of the 486 Intel had launched in 1989.

“They will free us from the tyranny of the commodity marketplace” – AMD spokesperson

AMD would also work on its own chips during this time, with Sanders committing the company to the so-called Liberty Chip program. In what would seem an impossible task today, AMD pledged to create a new chip every single week during 1986.

“We believe that when the upturn comes, the demand will not be for old chips but for new products and new applications,” an AMD spokesperson said to the Chicago Tribune back in 1985. The spokesperson would later add the program would free AMD “from the tyranny of the commodity marketplace.”

AMD’s clone, the Am386, sold a hefty number of units before the end of its first year, and subsequent cloned chips performed admirably. However, the process was long-winded, Intel’s chips were growing increasingly complex, and realistically the approach was impossible to sustain as a business model going forward.

AMD would eventually win its legal dispute with Intel over the aforementioned code in 1996, but after that it was to go cold turkey. No more Intel blueprints and no more Intel second source money.



Image courtesy of Fritzchens Fritz, CC0-1.0

Going it alone (1996 – 2003)

With no more Intel code AMD needed to get moving on its own R&D, and it wouldn’t be long before it launched the K5. This was AMD’s first in-house x86 processor, built from the ground up by its engineers, and directly rivaling Intel’s Pentium processors. The ‘K’ stood for Kryptonite, which, as you may have guessed, was a nod to taking down a seemingly unstoppable and superhuman Intel.

AMD would also acquire NexGen at this time, bringing in a small team of CPU boffins to create future chips for the company. AMD rebranded the team’s in-house design and reworked it for its own fabs, producing the AMD K6 – a chip capable of competing with the Pentium and even slotting into motherboards designed for Intel processors.

Hector Ruiz, who would soon replace Sanders as CEO, joined the company in 2000, leaving a cushy exec role at Motorola. He would later remark in his book that when he joined AMD it was financially worse for wear.

“The Computation Products Group was bringing in revenue but losing money,” Ruiz says in his book. “The Memory Group, on the other hand, brought in less total revenue but contributed disproportionately to earnings; the business was, in fact, floating the company. That meant that AMD—known as a chipmaker—was actually making up for massive losses in its microprocessor business on the success of its Flash memory business.”

But Ruiz was still confident he could turn the company around into a profitable venture that could, against the odds, take on Intel. For one thing, Ruiz wasn’t entirely convinced Intel ever wanted AMD out of the chipmaking game to begin with.

“We had never seriously considered the prospect of bankruptcy; in a perverse way, Intel needed us,” Ruiz says. “Intel would never allow AMD to go out of business, we reasoned. If we did, there would be no way to hide its size and the fact that it was a de facto monopoly. Intel would prefer to keep us as a weak competitor, always in the ring but with a bloody nose.”

After success with the K7 in 1999 – the first chip to bear the Athlon name – and the subsequent Athlon XP, AMD would take the market by storm.



Image courtesy of Smial, CC-BY-SA-2.0-DE

High-performance innovation and AMD64 (2003 – 2006)

With the K8 launch in 2003, AMD was riding high. This processor came equipped with the company’s own 64-bit extension to x86 (AMD64, x86-64) and integrated the memory controller into the chip itself. The move proved so successful that the Athlon 64, as the consumer chips would then be called, started to rule the processor market. Its Opteron server chips would also challenge Intel’s finest.

This instruction set proved so valuable that Intel ended up licensing it from AMD. And thanks to AMD’s Open Platform Management Architecture, which let third-parties develop chipsets for its processors, even Nvidia was able to get in on the Athlon’s success, building its own chipset for use with AMD’s silicon.

AMD would follow up this success with the first dual-core Opteron in 2005. During this time AMD’s chips were the de facto standard for server performance, and major OEMs were starting to take notice. Dell, which held a massive share of the PC market, started buying up chips for its servers (even if AMD did have to sell them at a bargain price to make the sale). It followed this up with the dual-core Athlon 64 X2 in the client space, and these chips would go head-to-head with Intel’s Pentium D and Pentium Extreme Edition.

But there was something afoot at AMD. While the company was successful, it needed serious sales volume to cover its costs. If chip sales started crashing, it would be left financially crippled, unable to pay off the costs of its fabs. Foreboding…

While all this was going on, AMD’s burgeoning flash memory business was spun off into Spansion, a subsidiary co-owned with Fujitsu that would, in 2005, be divested to help AMD focus on its microprocessor business.

2005 was also the year that AMD launched a lawsuit against its rival, Intel. AMD claimed Intel was offering monetary rewards to manufacturers that snubbed AMD and its processors – specifically Japanese PC manufacturers in this case. The suit wasn’t settled until 2009, when Intel finally agreed to pay AMD $1.25bn and settle all disputes. A drop in the bucket for Intel.

Intel strikes back and AMD’s vibrant future (2006 – 2007)

But Intel wouldn’t stay down for long. Its NetBurst architecture was a huge failure, but efforts by a small team building on its mobile-focused Pentium M design led to the Core architecture, and something a little more threatening: the Core 2 Duo.


The architecture would later find huge success in the desktop and server world, packaging AMD’s 64-bit extensions, high clock speeds, and more cores. Intel would dominate the market once again with its latest chips, adopting the Tick-Tock development cycle to crush any resistance from AMD along the way.

Now there was some serious pressure on AMD to deliver. Remember, it wasn’t all that long ago that the company was floated entirely by a consistent stream of sales that, were it to dry up all of a sudden, would leave AMD’s finance department in a rather sticky situation. So what did AMD do as Intel came back at it in the CPU market? It spent $5.4bn on graphics experts ATI.

ATI was how AMD’s execs saw the future of the company: graphics. Intel could package up everything a customer wanted – graphics, processor, and memory – and AMD felt that Intel’s integrated graphics were its weak spot. Its Achilles’ heel, according to Ruiz.


The deal netted AMD the ability to compete with Intel on integrated graphics, plus the foundations of what would become Radeon and AMD’s semi-custom division. ATI was one of the most renowned graphics chip and chipset manufacturers going. The Canadian company had also recently acquired ArtX, the creator of the graphics chips inside both the GameCube and Wii.

But ATI wasn’t always AMD’s first pick and, rumour has it, a deal to merge with Nvidia only fell through because Nvidia’s CEO, Jen-Hsun Huang, insisted he become CEO of the merged company once the deal was finalised. Since AMD couldn’t afford to buy Nvidia outright and boot Huang from the driver’s seat, the deal quickly dissipated.

But the deal did go through with ATI. Unfortunately it coincided with underperforming AMD processors, and the company started to crumble under the financial pressure. Even if AMD’s CEO believed ATI’s sales were enough to weather the storm, it wasn’t looking too good for the company.

Debt and GlobalFoundries (2007 – 2011)

“‘Now hang on a second,’ I said slowly. To me, Abu Dhabi’s reaction to our initial proposal seemed to be opening a door. ‘Why not sell them all our factories?’” – Hector Ruiz, 2007

AMD had invested a lot of money in fabs – Sanders had demanded it during his tenure as CEO. But by the late 2000s, Ruiz and his cohorts had another plan in mind: ditch vertical integration, divest AMD’s fabs, and drop their significant, overwhelming financial burden. After all, ATI had ditched manufacturing and outsourced production, and it had all worked out; going fabless meant AMD could stay ahead of the game without pouring any more money into its factories.

“The survival of AMD was at stake” – Hector Ruiz

One billion dollars was certainly going to help out the company in its time of need, and it may have just found a buyer in Abu Dhabi investment company Mubadala Development.

The deal had reportedly come about through a chance connection. AMD sponsored the Ferrari Formula 1 team, as it still does to this day, and, funnily enough, the Crown Prince of Abu Dhabi had also just bought up a 5% share in the car manufacturer. A couple of calls through mutual contacts – reportedly Piero Ferrari, son of Enzo Ferrari – and AMD was meeting with VPs from Mubadala to talk through the pitch to buy up all of AMD’s fabs. The deal was critical to the company – as Ruiz later put it, “the survival of AMD was at stake.”

The deal would be announced in 2008, immediately shifting $1.2bn in debt off AMD’s shoulders and onto the new company. Mubadala would end up with a 19.3% stake in AMD as a result of the newly-issued shares, and the Advanced Technology Investment Company (ATIC), another Abu Dhabi state fund, would pay AMD $700 million for a stake in The Foundry Company. Fitting name, right?

Fun fact: the deal would be announced just moments after the beginning of the 2008 financial crisis.

AMD left that deal with a financial burden lifted and the capability to fight another day. But there were stipulations for AMD’s execs. Doug Grose would leave his role as AMD’s VP of manufacturing operations to run The Foundry Company, and Hector Ruiz would relinquish his position to chair it. Dirk Meyer would continue on as AMD’s CEO and president, as he had been since 2008.

The Foundry Company would soon change its name to GlobalFoundries.

While all this was happening, ATI was busy developing the TeraScale GPU architecture, complete with a unified shader model. Originally intended for the Xbox 360, it would also end up being mainlined into ATI’s Radeon graphics cards. In 2009, AMD merged its GPU and CPU divisions into one, and shortly after, in 2010, it killed off the ATI brand and replaced it with AMD.



Image courtesy of Ilya Plekhanov, CC-BY-SA-4.0

The Bulldozer years (2011 – 2017)

AMD decided to totally reset its CPU design process with Bulldozer in 2011. A fitting name for an architecture that flattens everything before it and begins again. However, the new design wasn’t quite the fresh start the company may have hoped for.

From Bulldozer, to Piledriver, and on to Excavator, AMD struggled to gain any traction outside of the entry-level market. The architecture was arguably ahead of its time in many ways, shifting focus away from raw speed and onto core count – a concept both AMD and Intel embrace today.

The market just wasn’t ready, and due to Bulldozer’s shared module design, its multicore chops came at a price. Paired cores shared critical resources, and instructions per clock (IPC) remained stagnant while Intel ran away with performance.

The company’s financial situation reached such dire straits that it ended up selling its own HQ in Austin, Texas and then leasing it back for an immediate influx of much-needed cash.

But there was some salvation for AMD in the form of its APUs. The Bobcat architecture would find success later in life as it was moulded into the Jaguar architecture. This would be used within the PlayStation 4, PS4 Pro, Xbox One, and Xbox One X. AMD dominates the console market with its semi-custom unit. And you know who to thank for all that? Lisa Su.

Super excited to expand our partnership with @Sony on their next-generation @PlayStation console powered by a custom chip with @AMDRyzen Zen2 and @Radeon Navi architecture! 😀 https://t.co/EvdIrMNLiV — Lisa Su (@LisaSu) April 16, 2019

Dr. Lisa Su joined AMD in 2012 – the same year AMD created its semi-custom division – taking on the role of senior VP and GM of global business units, then COO, before swiftly moving into the role of president and CEO in 2014. Su was adamant that AMD needed to push beyond the PC market and into the embedded and console business.

And no, Lisa Su isn’t Jen-Hsun Huang’s niece, or long-lost cousin, or any relation whatsoever. Su dismissed all that as “not true” last year.

The Enterprise, Embedded, and Semi-custom segment accounted for $433 million of AMD’s revenue in Q4 2018, nearly half of the Computing and Graphics segment at $986 million – with a little help from EPYC, of course.

Graphics had sat somewhat tentatively under AMD’s umbrella since the ATI acquisition, during which time the company released the Graphics Core Next (GCN) architecture. But this period saw AMD carve its graphics division out once again as the Radeon Technologies Group (RTG), headed by Raja Koduri. RTG operated autonomously from AMD’s computing business and would go on to produce Polaris in 2016 and Vega in 2017. Since then, it has unofficially fallen back into line with the rest of AMD, so it seems the red team still hasn’t made up its mind about the department’s self-governance.

Zen, Zen 2, and AMD’s future (2017 – present)

And so we arrive at today… well, nearly. Let’s quickly dive back to 2012, when Jim Keller and his team started work on the Zen architecture. AMD set out to reset its processor development once again with Zen, starting fresh after many iterations of Bulldozer.

And suffice it to say, it was a huge success. AMD surpassed its initial IPC goals with Zen, and the move to 14nm meant AMD Ryzen CPUs could compete head-to-head with Intel once again, while Intel was bogged down in 10nm process node development hell. Chipzilla is still wrestling with its upcoming process node.

In a roundabout way, and over a decade later, Ruiz’s belief that going fabless would offer AMD flexibility in chip design and a competitive advantage proved true.

EPYC is also reigniting the race for server supremacy, following in the footsteps of Opteron all those years ago. While Intel’s market dominance remains mighty, AMD has managed to sway a few major OEMs to adopt EPYC in some capacity. And Intel’s CPU shortage sure helped AMD along.

And after years of fighting, lawsuits, and antitrust battles, AMD finally managed to patch things up with Intel, teaming up to create Kaby Lake G with both Intel CPU silicon and AMD Vega GPU tech. And they lived happily ever…

Nah. The companies are still making snide remarks and poking fun at one another in marketing materials, and Intel never misses a chance to poach AMD’s long-standing employees. But at least the days of major billion-dollar lawsuits are over… for now.

Tensions are, if anything, rising between the two companies. With AMD Ryzen 3000 CPUs on the march the red team will finally have process node supremacy over its rival. These Ryzen 3rd Gen chips will be the first desktop processors built on the 7nm process node with the ‘revolutionary chiplet design’ of the AMD Zen 2 architecture.

And Intel is planning on pushing into the discrete GPU game with Raja Koduri’s help, potentially posing a threat to both AMD and Nvidia’s gaming and professional cards. But RTG is hoping AMD Navi later this year will keep the company right in the thick of it.


“For a half century, AMD pushed the boundaries of what’s possible in high-performance computing to create new experiences and possibilities for hundreds of millions of people,” says John Taylor, chief marketing officer at AMD. “We celebrate this moment with our fans around the world who inspire us to push forward in that spirit for another fifty years to 2069 and beyond.”

And after all that, AMD is still standing. Sure, it has often tied its own shoelaces together, and has been its own worst enemy at times, but it’s also shown its talents in adroitly adapting to the moment at hand, skirting financial pressure and, against all odds, proving itself more than just a thorn in Intel’s side.

AMD is one company you love to see do well. After all, everybody loves an underdog story. But there’s more to it than that. AMD has been pushing Intel and Nvidia, two companies that enjoy the IC industry’s oligarchic rule, to create better processors, make better graphics chips, and stay competitive. Because they know, as AMD too believes, that the red team will always be ready and waiting to pounce the moment they show weakness. Bring on the next 50.