Intel launched its Kaby-G CPU with AMD graphics, proving that its internal GPU is as broken as SemiAccurate has been claiming. We haven’t been fans of the architecture for a while, but today the company has also confirmed that its 10nm process is broken too.

The company released the long-rumored Kaby-G CPU/MCM/GPU. It is such a momentous occasion for the company that it didn’t even merit an entry on their press page, just a smarmy ‘blog’ coupled with denied preferential briefings to those that toe the line. Take a Kaby Lake CPU, add an AMD GPU, slap a stack of HBM2 on the side, and call it a win. It isn’t, unless you are talking about AMD. Why? It’s technical.

Non-HBM GPU vs Kaby-G

So what is the problem? First, Intel touts this as a big win for EMIB, a silicon interposer strip put into the package. This saves money over the full interposer AMD has used but still adds significant expense to the package. Unlike what Intel says, it is not cheap, not even close, only cheap*ER* than a full interposer, and only if the package substrate with EMIB yields well. Intel will probably say it is incredible, best yields ever, but like their processes of late, take it with a big grain of salt until it is independently verified. Ironically the CPU-to-GPU connection isn’t an EMIB-based trace, it is just PCIe pulled off the die and routed through the normal substrate.

The package with stiffening ring

The first problem comes from heat. Intel is claiming that Kaby-G will enable thinner, smaller laptops. It will meet half of those claims vs a discrete GPU, with consequences. The questionable half is the thinner part; SemiAccurate can see the smaller side, Kaby-G will reduce board area, but thinner is a stretch. If you can already put a similar discrete GPU in a package no thicker than the Kaby-non-G CPU, then Kaby-G won’t save height, just board area. Intel for some reason doesn’t explore this line of thought in its official releases.

Lighter is also a distant possibility, but you are talking very low single-digit grams if that, countered by other factors. How much does <5cm^2 of ultra-thin PCB weigh again? Add to that the much larger metal stiffening ring around the whole package and it is unlikely to be a weight win at all; last time SemiAccurate checked, dense metal weighed more than light organic substrate per unit area. Strangely, Intel did not go into any detail about weight in its releases either.
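
For scale, here is a hedged back-of-envelope version of that weight argument. Every number below, the saved board area, PCB thickness, ring dimensions, and material densities, is an assumption picked for illustration, not a figure disclosed by Intel or any OEM:

```python
# Back-of-envelope: does removing ~5 cm^2 of thin PCB outweigh adding a
# metal stiffening ring? All inputs are illustrative assumptions.

PCB_AREA_CM2 = 5.0         # assumed board area saved by the MCM
PCB_THICKNESS_CM = 0.08    # ~0.8 mm ultra-thin laptop PCB (assumed)
FR4_DENSITY_G_CM3 = 1.9    # typical FR-4 laminate density

pcb_saved_g = PCB_AREA_CM2 * PCB_THICKNESS_CM * FR4_DENSITY_G_CM3

RING_AREA_CM2 = 2.0        # assumed footprint of the stiffening ring
RING_THICKNESS_CM = 0.1    # assumed ring thickness (~1 mm)
STEEL_DENSITY_G_CM3 = 7.8  # typical steel density

ring_added_g = RING_AREA_CM2 * RING_THICKNESS_CM * STEEL_DENSITY_G_CM3

print(f"PCB saved: ~{pcb_saved_g:.2f} g, ring added: ~{ring_added_g:.2f} g")
```

Under these assumed numbers the ring alone adds roughly twice the weight the removed PCB saves, which is the point: the grams are noise either way.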

Then there is the heat problem. Take a look at the Intel-supplied ‘win’ picture for Kaby-G. It takes a large board area and shrinks it down to a significantly smaller but undisclosed footprint. We won’t quibble about Intel stacking the area deck by comparing against an AMD GPU with GDDR5 instead of an AMD GPU with HBM2, which takes up ~1/2 the area; even then it would be an area win for Kaby-G.

Back to heat though: take the two hottest parts of a modern laptop and put them in an area significantly less than half of what it was. You might recall that SemiAccurate called Ultrabooks “shiny things for the stupid” because on the engineering side they were simply a bad idea for many reasons, heat being one of them. Intel can’t cool its own CPUs in this form factor and instead lies to the press, who repeat that lie until it becomes truth. If you can’t cool a 15W CPU in a given form factor, does capping the average TDP at 9W in firmware somehow allow it to run at the same clocks and ignore physics? Most sites repeat this line of BS because they are aware Intel cuts out sites that don’t play ball with its messaging.
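
The physics the PR glosses over can be sketched with a standard first-order model: CPU dynamic power scales roughly with frequency times voltage squared, and voltage tracks frequency, so sustained power goes roughly as f^3. The wattages below are illustrative assumptions, not measured Kaby figures:

```python
# First-order model: power ~ f^3, so a firmware power cap forces
# lower sustained clocks. Inputs are illustrative assumptions.

BASE_POWER_W = 15.0    # nominal TDP the cooling can't actually handle
CAPPED_POWER_W = 9.0   # firmware average-TDP cap from the text

# power ~ f^3  =>  f_capped / f_base = (P_capped / P_base) ** (1/3)
freq_scale = (CAPPED_POWER_W / BASE_POWER_W) ** (1.0 / 3.0)

print(f"Sustained clocks drop to ~{freq_scale:.0%} of the uncapped value")
```

Even this rough model says a 9W cap cuts sustained clocks to roughly 84% of the 15W figure; the marketing claim of “same clocks” only survives for short bursts before the cap bites.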

So if you can’t cool a CPU in a given area, and a GPU puts out even more wattage than the CPU, is reducing their combined area to, oh let’s call it 1/3 of what it was by eyeballing the supplied pictures, going to make things better? See why Intel didn’t allow real press questions on this one? Add in heat soak from one unit to the other and you have a right mess. That said, you will be able to spot the Kaby-G devices quite easily, just look for the glowing orange spot on the bottom. You can also identify their owners: look for the bandage through the singed hole in their jeans. Skinny jeans, of course; no one with half a brain would buy this over an AMD APU.
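
The area argument above can be put in rough numbers. Assuming, purely for illustration, a 15W CPU plus a 35W-class GPU spread over ~30cm^2 of board that shrinks to a third of that area:

```python
# Power density before and after the area shrink. All figures are
# assumptions for illustration; Intel disclosed no Kaby-G specs.

cpu_tdp_w = 15.0     # typical U-series CPU TDP (assumed)
gpu_tdp_w = 35.0     # assumed discrete-class GPU power
old_area_cm2 = 30.0  # assumed combined CPU + GPU + memory board area
new_area_cm2 = old_area_cm2 / 3  # the ~1/3 area eyeballed above

old_density = (cpu_tdp_w + gpu_tdp_w) / old_area_cm2
new_density = (cpu_tdp_w + gpu_tdp_w) / new_area_cm2

print(f"{old_density:.2f} -> {new_density:.2f} W/cm^2 "
      f"({new_density / old_density:.1f}x)")
```

Same 50W of heat, one-third the area, triple the power density, and that is before heat soak between the two dies is counted.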

This also might explain why Intel isn’t revealing a single spec about this CPU. No clocks, no shader counts, no areas, no nothing. If they did, well, you would see why coupling two hot devices in a small area is a really bad idea, PR messaging notwithstanding. In our experience PR wins on the internet and physics wins in the real world, so now you can make an informed decision on your next laptop.

Area also bites in another place: sockets. It doesn’t take a keen eye to realize that the Kaby-G socket is not a standard Intel socket, not even close. This means laptop only, something Intel readily admits, but it also likely cuts them out of the desktop market. According to SemiAccurate’s OEM sources, Kaby-G is going to be an extremely low-volume halo product. This is doable in modern laptops because most have bespoke board designs; desktops are not in the same boat. The cost overheads will likely never justify a Kaby-G desktop even if Intel sorely wants it, bags of MDF not-a-bribe notwithstanding. Then there is the slight problem of losing 8 or 16 PCIe lanes from the meager few Kaby has to offer; anyone see the upside in a gaming board with half a slot free, best case?

That brings us to the most interesting bit: what OEMs have been telling SemiAccurate about Intel’s pitch for Kaby-G.

Note: The following is for professional and student level subscribers.

Disclosures: Charlie Demerjian and Stone Arch Networking Services, Inc. have no consulting relationships, investment relationships, or hold any investment positions with any of the companies mentioned in this report.