For decades, a desktop PC purchased for $2000 would be selling for $500 inside of two years. The combination of Moore’s law and Dennard scaling drove microprocessor performance rapidly skyward, and companies like Microsoft and Adobe pumped out new products to consume CPU cycles nearly as fast as Intel and AMD could boost them. That era ended in 2005, when Dennard scaling broke down, but the arrival of multi-core CPUs, clever scheduling on Intel’s part, and sheer inertia kept the idea current.

Here’s an example of just how dead the old obsolescence model truly is. In 1996, top-end systems were built around Intel’s Pentium 166 with a 512K off-die L2 cache, 16-32MB of EDO RAM, video cards with 2MB of VRAM, and 5400 RPM HDDs limited to ATA-33. Four years later, 1GHz CPUs with full-speed on-die cache, 128MB of PC-100 SDRAM, 7200 RPM, ATA-66 HDDs, and 16-32 MB of video RAM were selling for under $3000.

In four years, CPU clock speed increased roughly sixfold. Actual performance gains were even higher, thanks to further improvements in CPU efficiency. In 1996, high-performing EDO RAM offered up to 264MB/s of bandwidth; by 2000, PC100 had hit 800MB/s. AGP slots weren’t available in 1996; four years later, they were an essential component of an enthusiast PC. USB controllers had gone from dodgy, barely functional schlock that locked up the mouse every time you launched a program to reasonably reliable. 32MB of RAM was great in 1996; by 2000 you wanted at least 64MB for acceptable performance. By looking at the highest-grade systems you could buy, we avoid the question of whether certain features were merely priced out of a given segment. A top-end PC in 2000 had access to performance and features that simply weren’t available in 1996 at any price.
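Those bandwidth figures can be sanity-checked with a little back-of-the-envelope arithmetic. This is a sketch of my own (the function name is illustrative, not from any library), assuming a synchronous DRAM bus that completes one full-width transfer per clock:

```python
def peak_bandwidth_mb_s(effective_clock_mhz, bus_width_bytes):
    """Theoretical peak: one full-width transfer per effective clock cycle."""
    return effective_clock_mhz * bus_width_bytes

# PC100 SDRAM: 100 MHz clock on a 64-bit (8-byte) bus
print(peak_bandwidth_mb_s(100, 8))  # -> 800 (MB/s), matching the figure above

# EDO's quoted 264 MB/s falls well short of a naive 66 MHz x 8 B = 528 MB/s
# because asynchronous EDO DRAM cannot complete a transfer on every bus cycle.
```

The SDRAM number falls straight out of the formula; EDO’s shortfall against its own bus clock is a big part of why the jump to synchronous memory mattered.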

In the late 90s and early 2000s, upgrades were driven as much by the bugs you were fleeing as by the performance you were chasing. Early plug-and-play hardware was a joke. ISA required manual configuration (occasionally by jumper) and was hamstrung by the bus’s 8MHz clock speed. One of the first enthusiast motherboards I ever owned offered ISA overclocking; my 3COM Etherlink III card (pictured above) ran noticeably faster if I pushed the ISA bus up to a blistering 12MHz.

Now, compare an early Core i7 system (Nehalem) against what’s shipping today (Ivy Bridge). Clock speeds are up a bit, and there are cheaper, lower-end options available, but the Core i7-920 that launched at 2.67GHz with four cores and eight threads is still absolutely capable of powering the latest games and applications. RAM bandwidth is roughly comparable if you assume the 2008 system used three channels of DDR3-1066 while a newer system deploys dual-channel DDR3-1600. SATA 6G is superfluous unless you own an SSD, and USB 3 is available by add-in card. Notably, the add-in cards don’t suck this time around. As for total RAM, the 6GB (three-stick) kits that debuted with Nehalem in 2008 are still more than enough for 2012. Thanks to the surge in phone/tablet sales, companies like Microsoft are busy rearchitecting their software to use less RAM, or at least holding memory requirements steady between generations.
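That “roughly comparable” claim checks out on paper. A quick sketch of my own (helper name made up; assumes the JEDEC convention that each DDR3 channel moves one 64-bit word per transfer):

```python
def ddr3_channel_mb_s(transfer_rate_mt_s, bus_width_bytes=8):
    """Peak per-channel rate: DDR3 moves one 64-bit (8-byte) word per transfer."""
    return transfer_rate_mt_s * bus_width_bytes

nehalem_2008 = 3 * ddr3_channel_mb_s(1066)  # triple-channel DDR3-1066
ivy_2012 = 2 * ddr3_channel_mb_s(1600)      # dual-channel DDR3-1600
print(nehalem_2008, ivy_2012)  # -> 25584 25600 (MB/s): near-identical peaks
```

Four years apart, the two configurations land within a fraction of a percent of each other in theoretical peak bandwidth, which is exactly the stagnation the article is describing.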

The situation is somewhat different in laptops, where trimming power consumption and slowly increasing battery life have a direct impact on the components you can put inside a system. Even here, however, the trend has shifted. Apple’s MacBook Pro with Retina Display has been rightly criticized for its non-replaceable battery, proprietary storage, and soldered-in RAM. What’s striking about the situation is that the issue is one of replacing broken parts, not upgrading old ones.

Are there people who need more than the 8GB of RAM that the lower-tier MacBook Pro w/RD offers? Yes. But if you’re one of those people, chances are you know it already. There’s no operating system or general software suite coming down the pipe that’s going to push the 8GB limit, nothing in the works that will unexpectedly turn your hard drive (assuming you still own a hard drive) into a chittering swarm of crickets.

The desktop’s evolution is effectively over, and the laptop isn’t far behind. The latter is being driven by advances in 3D graphics and GPGPU capability. These abilities are then backported to the desktop space, where they keep things moving along, at least a bit. Tablets and phones are now driving evolution in computing, and that’s not necessarily a bad thing. If you create content, program, or design, chances are that you do it on a desktop or laptop. Long lifespans, stable designs, and strong relative performance are essential to those roles. The fact that the hardware is dirt cheap helps to bridge the digital divide — cell phones and tablets may be at the forefront of what’s hip, but beige boxes with wired Ethernet still have a vital role to play when it comes to pushing internet access out to the poorer segments of society.

Do I miss the days when Nvidia driver releases and chipset launches could boost performance by 20-30% across a huge range of applications? Yes. But would I trade them for the data-destroying sound card conflicts, substandard driver support, and days when Windows would BSOD if you crossed your eyes at it? Not really. And I like the fact that the computer I built for my parents in 2008 is still “blazing fast” with the addition of an SSD and a bit more RAM, as opposed to needing an all-new system with a fresh OS installation. These days, a laptop replacement is more likely to be accident-related than upgrade-driven, and desktops should have a useful lifespan of 6-8 years. It’s not as exciting, but it’s arguably more useful and certainly far more economical.

Read: 30 years of personal computers, and 4004 to Sandy Bridge: A walk down CPU memory lane