He called it the "HD Era."

On March 9, 2005, the gaming press crowded into a ballroom in the Moscone Center in San Francisco to witness the birth of a new era of game machines. Sony had dominated the industry for the last decade, and even goliath Microsoft had been no match for PlayStation 2. But the original Xbox was just the foot in the door, and the company's gaming evangelist J Allard was about to take the stage to talk next-generation – only three and a half years after the release date of that first Xbox.

The last great leap in gaming, Allard said, was the transition from 2-D to 3-D. What would happen next, he said, would be no less momentous: The transition to high-definition graphics, or the HD Era. This seems obvious in retrospect, but what Allard was saying at the time was that everyone would have to go out and buy an expensive new TV if they wanted to truly enjoy the next Xbox. At the time, a tiny 23-inch Samsung HD set cost over $1,000.

(Another indication of how long ago this was: Unlike most game industry keynote speeches, Allard's was not immediately uploaded to YouTube – because YouTube wouldn't launch until one month later. The gaming website IGN uploaded the video in tiny low-res chunks.)

But even though the speech was themed around the rapidly approaching transition to high-def televisions, Allard didn't spend a lot of time talking about graphics. What he emphasized were the radical new features of the next Xbox. Online multiplayer gaming was about to transition from an experimental add-on feature to a key component of the console gaming experience, and the next-gen Xbox would be built around the idea of delivering a consistent, system-level experience. In rapid fire, Allard laid out the major features of the new Xbox: "gamercards" that represented your online profile, system alerts, custom soundtracks, microtransactions.

To help usher the game developer crowd into the HD Era, Allard concluded his speech by giving out hundreds of those Samsung 23-inch HD sets to roughly a third of the crowd, via lottery.

The next step in Microsoft's unorthodox plan for kickstarting the next generation of game machines was not, as everyone assumed it would be, a gala press briefing at the upcoming Electronic Entertainment Expo later that spring. Instead, Microsoft went straight to consumers with an MTV infomercial hosted by Elijah Wood and featuring The Killers and the guys from Pimp My Ride. You didn't come away from the special with much actual information about the console, now officially revealed as Xbox 360, although Microsoft did introduce a few features, like what it felt was the most striking element of the console's design, the green "ring of light" around the power button. Allard was introduced as "Lord of the Ring."

The 2005 E3 Expo was the stage for the big showdown, as Microsoft, Sony and Nintendo would all be talking about next-gen. If Nintendo and Sony thought it premature, well, their hands had been forced by Microsoft, which was gearing up to actually launch its box that year. So while Microsoft was actually ready to let consumers get hands-on with the 360 and its games, Sony and Nintendo were trying to distract everyone with promises, vague proclamations and as much smoke and mirrors as they could muster.

Sony's game group, still led by "father of the PlayStation" Ken Kutaragi, showed PowerPoint slide after PowerPoint slide filled with numbers upon numbers upon numbers, talking up the theoretical power behind the Cell processor that would power PlayStation 3, then pulled out a mockup of the PS3 box and promised it would be available in the spring of 2006.

Sony's response to the early launch of Xbox 360 was typical of the company's hubristic attitude: "The next generation doesn't start until we say it does," Kaz Hirai, then the head of SCEA, told CNN.

Nintendo, by contrast, gave the first indication that it was getting out of the graphics horse race: It, too, showed a mockup of the box for the game system it was calling the Nintendo Revolution, but CEO Satoru Iwata pulled it out of his jacket pocket. The box, he said, was the size of three DVD cases stacked vertically.

Sony got lots of applause when it showed off pre-rendered target videos that supposedly provided a rough estimate of what PlayStation 3 games would look like in high-def. But the biggest pop for Nintendo was when it announced that Revolution would be a "virtual console" that could download and play games from the company's older platforms.

No matter which console attracted you more, it was all still a distraction, as hollow as the mockup consoles that Ken Kutaragi and Satoru Iwata posed for photos with: an attempt to persuade consumers to hold off on buying Xbox 360 when it hit the market early.

Maybe the most interesting thing about that E3 was the dog that didn't bark: Nintendo showed Revolution, but not its controller. Nintendo had a reputation for introducing innovations in each of its game controllers that would eventually become industry standards: the D-pad, triggers, analog sticks. The Wavebird gamepad for its then-current GameCube had turned wireless controllers from a novelty item that didn't work reliably into a must-have feature, one which Sony and Microsoft were borrowing for their new consoles.

"Nintendo is always trying to be on the forefront of control innovations, like the analog stick, rumble or wireless. As soon as these are available, our competitors snatch them up," Nintendo's game design chief Shigeru Miyamoto told me that year. "Because the user interface is going to drive the Revolution software design, that's what's going to make our software stand out. Nobody else is going to be able to do what we do with next-generation game software. So, I can't reveal anything. It's under wraps because it's the big gun."

One of the first set of images of prototype Wii Remotes, released by Nintendo in late 2005. Image courtesy Nintendo

Just before Microsoft launched the 360, Miyamoto pulled out the big gun. It didn't happen on a stage, but in a series of private demonstrations at the Hotel New Otani just before that year's Tokyo Game Show. I don't remember the first time I played an Xbox 360 or a PlayStation 3, but I will never forget the first time I tried a Wii controller. We walked into the room and these things that looked like television remote controls were arrayed on a shelf in a rainbow of colors. Miyamoto picked one up, pointed it at the TV and used it to shoot a target. My stomach did a little flip; I couldn't believe what I was seeing and couldn't understand how it was being done.

Iwata formally announced the Revolution controller concept at the Tokyo Game Show later that week, although Nintendo didn't let anyone besides that small group of media actually play or see the controllers in action. (And even though Nintendo kept it under wraps as long as possible, Sony did rip off the motion-sensing idea for PS3.)

Shortly after Tokyo Game Show was over, it was time to launch the Xbox 360. Microsoft apparently wasn't happy with its earlier decision to put a hard drive into every single Xbox, and so for the 360 it gave consumers two choices: a $399 package that included the wireless controller and a 20 GB hard drive, and a $299 package with a wired controller and no internal storage at all. Although it was understandable that Microsoft would be gun-shy about the costs of hard drives, it made a rookie move here because it fragmented its user base. It didn't matter that the 360 supported a hard drive or that the majority of early adopters went for the more expensive SKU, because developers had to program their games on the assumption that the hard drive wouldn't be there.

E3 2006 was more momentous than 2005's show. Microsoft already had its hardware on the market, and it was selling well. The spring of 2006 had arrived and there was still no sign of PlayStation 3. And apart from a few more tiny details, we hadn't heard anything else about Nintendo's new console, except that Nintendo had changed its name from Revolution to Wii, which people hated almost universally. (I defended the choice.)

YouTube was not yet up and running when Microsoft unveiled the Xbox 360. By the time Sony had its fateful press conference in 2006, the video sharing site was fully operational – much to Sony's chagrin. A crudely remixed video of the conference's lowlights – terrible-looking games and an exorbitant $599 price – was stickier than any of the traditional media coverage that came out of the press conference. The idea that the PlayStation 3 was shaping up to be a huge mess was taking hold.

Nintendo, too, was fighting off the idea that it was in an inescapable mess. GameCube had sold poorly compared to PlayStation 2, and even the original Xbox was outselling it. The conventional wisdom was that Nintendo should get out of the console wars by "going third party" and putting its games on PlayStation and Xbox. Nintendo was going to get out of the console wars, but in a different way: Its low-powered, low-cost, family-friendly hardware would be aimed at those who didn't really play videogames.

At E3, it finally showed off retail games, as opposed to tech demos, for Wii. Nintendo enclosed its entire booth – what I remember hearing at the show was that the booth was a giant Faraday cage, designed to keep out radio-frequency interference and allow the hundred or so Wii Remote controllers inside the booth to operate without problems. What the closed-off booth meant was this: Every morning, when the doors to E3 opened, a mad rush of people stormed in, running at top speed through the aisles, straight past the PlayStation 3 area, back to Nintendo's booth to get in line to play Wii, where they found a line of people already there because exhibitors from other booths had walked over and gotten in line before the doors opened.

The launches, later that year, of PS3 and Wii could not have gone more differently. Thousands lined up on day one to get the PlayStation 3, as Sony, facing chip yield issues with the Cell, had cut the launch quantities of PS3 by 75 percent. But even with severely restricted supply, PlayStation 3s were languishing on shelves by February and sales were sluggish in the U.S. and Japan alike. Ken Kutaragi became an insane quote machine, calling the $600 box "too cheap" and suggesting that players "work more hours" to be able to afford one.

It was the other way around for Wii – launch day was quieter, and you could walk in and buy a console. But buyers kept coming day after day, as word spread among the casual players Nintendo was targeting that Wii Sports was a lot of fun. And Nintendo wouldn't be able to keep Wii in stock reliably for another few years.

The next generation had launched, technically, but not entirely: Nobody was fully ready for what consumers would start to demand, and everything was about to change.

This is what it used to look like when you turned on your Xbox 360. Image: Microsoft

Going all the way back to the Atari 2600, the box we started the generation with was not the one we finished with. Advancing technologies meant that as a console generation went on, hardware makers could redesign the box to be smaller and cheaper. But typically, this would happen towards the end of a console's life cycle, usually when a new machine was either on shelves or almost there, aimed at late adopters that wanted a really cheap gaming machine. Those who bought a Super Nintendo in 1991 didn't buy the mini version that Nintendo released in 1997 (if they had even heard of it to begin with).

This generation was different. Microsoft and Sony started altering their boxes right out of the gate. Sony pulled the PlayStation 3's $499, 20 GB hard drive variation off shelves in early 2007, a few months after launch. Then it took the PlayStation 2's Emotion Engine chip, which had been included on the PS3 motherboard to provide perfect backward compatibility, out of the design before launching PS3 in Europe, meaning that only the U.S. and Japan got the PS3 as Sony originally pitched it. The version launched in Europe and later in the U.S. still ran most PS2 games, but with software glitches that were sometimes serious. Sony eventually removed even that functionality.

Microsoft, too, kept tweaking the Xbox 360's design. It eventually added a few hundred megabytes of on-board flash memory to the low-end, hard drive-less model so that everyone could download games. It introduced the "Elite" model of the system with an HDMI port. Barely a year or two after launch, the PlayStation 3 and Xbox 360 that you could go buy in stores had very different feature sets than the boxes purchased on day one.

As this was going on, Microsoft's vaunted "green ring of light" had turned into the Red Ring of Death. The Xbox 360 had design flaws that led to a phenomenally large percentage of units bricking themselves and dying, which was helpfully cued to the user by the green lights around the power button turning red to indicate a "general hardware failure." Microsoft insisted that the Xbox 360's failure rate was within the industry-standard 3 to 5 percent, but the anecdotal evidence (to wit: everybody knew five people whose 360s had red-ringed) didn't match up. What's more, Microsoft did a terrible job of getting those units repaired in a timely fashion, and often charged customers exorbitant repair fees for consoles that were out of warranty. This ended in a billion-dollar warranty extension and a massive apology.

By 2010, the 360 was five years old – older than the original Xbox had been when Microsoft announced its successor – and gamers were looking for the next generation. But that wouldn't happen for a long time. In 2009, Sony introduced a slimmer, cheaper PlayStation 3, and Microsoft followed suit with a redesigned 360 in 2010. And both companies would redesign their machines again before the generation was over!

Since the Wii was already profitable and incredibly popular, Nintendo had no need to refresh the box. This resulted, however, in Wii being perilously deficient in certain areas. Despite E3 attendees' enthusiasm for the Virtual Console download service, Nintendo must not have expected users to purchase very many games on the service, because the console's pitiful half a gig of memory filled up quickly and then there was literally nowhere else to put the data – you could copy games to an SD card but not run them from one. Nintendo never upgraded the hardware to add more memory, but it did eventually come up with a software kluge that allowed you to run games off the SD card (accomplished by copying the data temporarily to the internal memory).

The rise of downloadable games and microtransactions actually presented similar problems for both of Nintendo's competitors as the generation wore on. When the consoles were released, downloadable games were a novelty – the vast majority of game dollars were spent on $50-60 discs, and this could be supplemented with bite-sized game experiences available by download with electronic payment. "Xbox Live Arcade," Microsoft's name for its download service, was a literal moniker – the first titles were ports of ancient coin-op games like Frogger. To make sure that hard drive-less users could download any game they wanted, Microsoft forced all Xbox Live Arcade games to be less than 50 megabytes in size.

Meanwhile, Sony was putting up all kinds of stuff as downloadable titles, from the flying game Warhawk to Gran Turismo HD Concept. There was still a divide where the best stuff was on retail discs, but the distinction was becoming a little more arbitrary, especially on the Sony side. Microsoft kept having to raise the size limit for Xbox Live Arcade titles as the generation went on.

As J Allard had predicted, Xbox 360 games were using "microtransactions," getting players to pay a few bucks each for pieces of extra game content that extended the range of features in their disc-based titles. This was brand new territory, and game developers didn't really know how much money players would actually pay for content. So we got things like the infamous "horse armor" incident, in which owners of The Elder Scrolls IV: Oblivion were asked to pay 200 Microsoft Points, or $2.50, to buy cosmetic bling for their in-game steed. Nobody was forced to pay, of course, but if this was an indicator of the value-for-your-money tradeoff that these microtransactions were going to carry, was it not possible that developers would just start holding back content we actually wanted and putting it behind an expensive paywall, which we would only discover after we bought the $60 game in the first place?
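Part of what made Microsoft Points so irritating was that they obscured real-world prices. As a quick sketch of the arithmetic – assuming the U.S. exchange rate of 80 points per dollar, which varied by region before the system was retired – the conversion looks like this:

```python
# Historical U.S. rate: 80 Microsoft Points per dollar (regional rates differed).
POINTS_PER_DOLLAR = 80

def points_to_dollars(points: int) -> float:
    """Convert a Microsoft Points price tag to U.S. dollars."""
    return points / POINTS_PER_DOLLAR

# The horse armor's 200-point price works out to $2.50:
print(points_to_dollars(200))  # 2.5
```

The deliberately awkward 80:1 ratio meant a price like "200 points" never mapped to a round dollar figure, making it harder for players to gauge what they were actually spending.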

"Imagine players slapping down $.99 to buy a one-of-a-kind, fully tricked-out racing car to be the envy of their buddies," Microsoft wrote in a 2005 press release. In Forza Motorsport 5 for the Xbox One, you can slap down cash to buy a fully tricked-out car, but it costs ninety-nine dollars.

No other console generation had been like this. You went to the store, bought your machine, went home and that was it. It worked, the games worked, everything just worked. The HD Era was all about growing pains. Maybe Microsoft and Sony pushed too hard, too fast, to take gaming kicking and screaming into high definition. They ended up with expensive boxes and hardware failure for their troubles, and a resurgent Nintendo that outsold both of them combined month on month with the standard-definition Wii.

Sony, above all, took it on the chin. PlayStation 3's low sales and the difficulty of programming games for the Cell led to a bad first few years. The implosion of PS3 also led to the end of PlayStation father Ken Kutaragi's career, as well as numerous other executive shuffles. Sony had to transform from the powerful industry leader into a scrappy underdog, and until it truly understood that its position had changed, it had no hope of making a turnaround with PS3.

And yet the boxes they'd built were future-proofed, to a point. Both Microsoft and Sony had the ability to update the systems' firmware. For Microsoft, this meant totally ditching the "blades" interface that J Allard had shown off in 2005 and radically remaking the Xbox dashboard into something more user-friendly – twice. Sony stuck with its inefficient Cross Media Bar interface through the life of the PS3, but constantly upgraded the experience – importantly, retrofitting the UI so that the menu could be opened up while a game was running. If you dug up a launch-day 360 that had never been updated and turned it on, you'd barely recognize what you saw.

The strange paradox of this past generation of consoles is that for all of the problems, all of the hardware failures and flubbed launches and redesigns, the three companies have still sold more consoles so far this generation than last generation. Combined, PlayStation 2, GameCube and Xbox sold under 200 million units. So far – so far! – PS3, Xbox 360 and Wii are over 260 million. It wasn't an even race – Wii roared out in front of everyone at first, then slowed down significantly after about four years. Xbox did well, especially after the Slim and Kinect launched. Sony just had to keep reducing the price of the machine and cranking out better and better games, and now – on a worldwide basis, since it's more popular in Japan and Europe – PS3 has drawn roughly even with Xbox.

So, what lessons can we take away as we bid farewell to this console generation?

Expect the unexpected. Nobody saw the Wii coming, and anyone who tells you they did is a liar. Some people imagined that Nintendo might do okay for itself and vastly improve upon the GameCube's performance, but the unprecedented success it enjoyed showed that Sony and Microsoft were not the last word in gaming. PlayStation 4 might take off like a rocket this time, or Xbox One, or neither, or both, due to some unforeseen combination of factors. Or another company might still ride in with the Wii of this generation, whatever that ends up being.

Disappointed in the console? Voice your concerns and be specific. Microsoft and Sony have the ability to fix any software problem with the devices that you already own. We should expect, as with last generation, huge changes to the user interface and feature sets of Xbox One, PS4 and Wii U as the consoles' life cycles go on. If stuff doesn't work as well as you'd like now, don't just write the machine off as terrible, but tell the company what you want to see.

You may just have to get used to some things that you don't like. As we begin the generation, we're seeing Xbox games integrate free-to-play mechanics even though the games are not actually free-to-play in many cases. We complained about microtransactions and mandatory game installations, but those didn't go away – quite the contrary, they became nearly universal.

Expect this console generation to go on for a long time... maybe even forever. You may have been unhappy that it took this long for new consoles, but Microsoft and Sony were probably ecstatic that they were working on the same platforms for eight and seven years, respectively. Getting to address a growing, growing, growing installed base without having to start back at zero must have been fun while it lasted. (And note that publishers are still, for the most part, releasing their big games across the two console generations.)

If last generation lasted eight years, how long will this one go? What will change? Will Xbox still be tied to the TV eight years from now, or will you play your Xbox One games on a Surface tablet? Will Sony run out of money? Will Nintendo get Wii U turned around as Sony did with PS3? Will Apple cannonball into the pool?

One thing's for sure: If the last generation was that insane, this one's going to be even crazier.