No one should take this amateur analysis seriously. Did he analyze the socket and CPU under elevated temperature conditions? Did he artificially age the socket and CPU to introduce oxide, sulfide, or nitride contaminants at the socket/CPU interface? Did he run the test for years? Did he factor in the potential impact of socket or CPU warp over time or from manufacturing variations, which might cause some pins to lose contact or have reduced contact? Did he look at how much extra power was wasted by the increased heat generated by reducing the number of pins? Did he consider the effect of accelerated reaction rates due to pin heating on long-term reliability?



And let's crunch the numbers: the revised LGA 1151 adds 18 more power pins (going from 128 to 146).

The video demonstrates that each pin has a resistance of a little more than 0.045 ohms (a 0.23V drop at 5A).

That makes the aggregate power-pin resistance for 1151 v1 (0.35 milliohms) about 14% higher than for 1151 v2 (0.31 milliohms).

Doesn't sound like much, BUT those Coffee Lake processors are pulling 138A through those pins.

The extra power pins reduce the resulting voltage drop across the pins (not to mention the drop across the wiring connected to those pins) from 48mV to 42mV, with a corresponding decrease in power dissipated just by the pins of almost 1 watt. And that's without overclocking. Sure, it's only 1W, but it's 1W at a place that's hard to get extra cooling to -- what's the thermal conductivity of the material they make sockets out of, anyway?
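The arithmetic above is easy to check. A quick sketch, using the per-pin resistance, pin counts, and 138A current figure from this thread (the ~0.045 ohm per-pin value is taken from the video's measurement, everything else is parallel-resistor math):

```python
# Back-of-envelope check of the socket numbers quoted above.
# Assumptions (from the thread, not measured here):
#   - per-pin resistance: ~0.045 ohm (the ~0.23 V drop at 5 A from the video)
#   - power pin counts: 128 (LGA 1151 v1) vs 146 (LGA 1151 v2)
#   - CPU current draw: 138 A

R_PIN = 0.045        # ohms per pin (assumed from the video's measurement)
CURRENT = 138.0      # amps
PINS_V1, PINS_V2 = 128, 146

def socket_stats(n_pins, i=CURRENT, r_pin=R_PIN):
    """Identical pins in parallel: aggregate R, voltage drop, and power."""
    r_total = r_pin / n_pins   # N equal resistors in parallel -> R/N
    v_drop = i * r_total       # V = I * R
    power = i * v_drop         # P = I * V = I^2 * R
    return r_total, v_drop, power

r1, v1, p1 = socket_stats(PINS_V1)
r2, v2, p2 = socket_stats(PINS_V2)

print(f"v1: {r1*1e3:.2f} mOhm, {v1*1e3:.1f} mV drop, {p1:.2f} W in the pins")
print(f"v2: {r2*1e3:.2f} mOhm, {v2*1e3:.1f} mV drop, {p2:.2f} W in the pins")
print(f"aggregate resistance ratio: {r1/r2:.3f}")
print(f"extra pins save about {p1 - p2:.2f} W dissipated at the socket")
```

This reproduces the figures in the thread: roughly 0.35 vs 0.31 milliohms aggregate (a 14% difference, which is just 146/128), a 48mV vs 42mV drop at 138A, and a bit under 1W less heat dissipated in the pin array.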



So it's one thing to say "there's no need for these extra pins if you're just going to run your system for a week."

But it's an entirely different matter when your customers expect the socket-CPU interface not to be an issue for three years.

Or when your customers expect to be able to feed 150A to an overclocked CPU without the socket melting.