zlobby We really don't know if RDNA was made with mobile in mind, or at least I don't. Porting RDNA to mobile may not bring many benefits if it wasn't tailored for mobile platforms in the first place.

R0H1T Yeah, I don't remember seeing any major reviews publish their numbers with LPDDR4X @ 4266 MHz; the vast majority of reviews you're seeing have 2666 or 3200 MHz regular DDR4 & yet they smash every other Intel IGP out there & nearly match or beat the MX250 ~ in that sense there's still plenty of performance gains to be had. Remember, at CES we didn't have final retail versions of laptops nor drivers fine-tuned to make the IGP shine. I'd say (IGP) Vega is still king of the hill for about a year or so!

Wait, what? A GPU architecture is a GPU architecture, and AMD builds all of theirs to be modular and scalable (which, frankly, all GPU architectures are to some extent due to the parallel nature of the workload). The only criterion for being "built for mobile" or not is efficiency, where RDNA clobbers GCN - a 40 CU 5700 XT at ~220W matches or beats a 60 CU Radeon VII at ~275W on the same node, after all, and that's the least efficient implementation of RDNA. AMD has specifically said that RDNA is their GPU architecture (singular) for the coming decade, so GCN is going the way of the dodo in all markets - it's just that iGPUs generally lag a bit architecturally (combining multiple architectures on one die makes it more of a challenge to have everything line up properly, leading to delays). Of course they also have CDNA for compute accelerators, but those aren't technically GPUs. Of course RDNA dGPUs all have GDDR6, which is a significant advantage compared to any laptop or desktop platform, but the advantage isn't any bigger than in the DDR3/GDDR5 era - and arguably current/upcoming LPDDR4X designs are much more usable than anything previous. I would be very surprised if next-gen APUs didn't use some version of RDNA, as there is absolutely no reason for them not to implement it at this point.

Have there been any proper reviews of the U-series at all? I've only seen leaked ones (which I trust to a reasonable extent, particularly that Notebookcheck Lenovo leak, though we don't know all the details of the configurations for those laptops), and otherwise there are H-series reviews with DDR4-3200 like the Asus G14. And as AnandTech has shown, that implementation roughly sits between the desktop 3200G and 3400G with DDR4-2933 (slightly closer to the 3400G on average), and soundly beats the 3500U (15W Picasso Vega 8) with DDR4-2400. Of course this is a 35W chip in a chassis with significant cooling capacity, so 15W versions might perform worse, but they might make up for that or even come out ahead if they have LPDDR4X - all iGPUs are starved for bandwidth, after all. Also, at least according to that Notebookcheck leak, the (possibly 25W-configured) 4800U with LPDDR4X consistently beats the MX 250 and 330, while lagging about 5% behind the MX 350.

But as this news post says, it's possible that Tiger Lake Xe iGPUs can match it - though frankly I doubt that given Intel's driver track record. They have often managed to get close to AMD iGPUs with Iris Plus SKUs in synthetics like 3DMark, yet have consistently lagged far, far behind in real-world gaming. I expect a push for better and more frequently updated drivers with Xe, but it'll take time to get them out of the gutter. And by then, RDNA APUs will be here.
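
To put some rough numbers behind the "iGPUs are starved for bandwidth" point: here's a minimal back-of-the-envelope sketch (assuming the usual 128-bit total memory width for both dual-channel DDR4 and the 4x32-bit LPDDR4X configurations these laptops ship with; the figures are illustrative, not taken from any review).

```python
# Peak theoretical memory bandwidth: transfer rate (MT/s) x bus width (bytes).
# Assumes a 128-bit total bus for both dual-channel DDR4 and 4x32-bit LPDDR4X.

def bandwidth_gbps(mt_per_s: int, bus_bits: int) -> float:
    """Peak theoretical bandwidth in GB/s."""
    return mt_per_s * (bus_bits / 8) / 1000

configs = {
    "DDR4-2666 dual-channel":  (2666, 128),
    "DDR4-3200 dual-channel":  (3200, 128),
    "LPDDR4X-4266 (4x32-bit)": (4266, 128),
}

for name, (rate, bits) in configs.items():
    print(f"{name}: ~{bandwidth_gbps(rate, bits):.1f} GB/s")

# Roughly 42.7, 51.2 and 68.3 GB/s respectively - about a 33% jump from
# DDR4-3200 to LPDDR4X-4266, which is why a bandwidth-starved iGPU can
# gain so much from faster memory alone.
```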