In the never-ending war between PC and console gamers, one of the PC side's favorite points is the fact that console hardware stays frustratingly static for years at a time, while PC users can upgrade everything from the RAM to the graphics card as technology improves. Thus, by the end of a given console generation (and sometimes earlier), a price-competitive PC will almost always be able to outclass the performance of its aging console competition.

This is true, as far as it goes. But as any console owner can tell you, unchanging hardware does not mean unchanging graphical performance over the life of a console. On the contrary, as time goes on, developers are often able to extract more from a console's limited architecture than anyone thought possible when the system launched.

In the early days, new processors and memory chips in the game cartridges themselves contributed to this evolution. More recently, it's become a function of developers having the time and experience to wring every last ounce of power from an architecture they know intimately.

As we take a nostalgic look back at how this intra-generational advancement has played out in the past, keep in mind that the same process will more than likely play out in the current console generation as well. In a few years, we'll look back on even the impressive launch titles on the Xbox One and PS4 and wonder how we ever tolerated such low-quality images.