People calling the CPU 'weak' are so far off the mark that they'd miss an open goal in a one-legged football-kicking contest. The tech analysis has been horribly flawed, and it's very apparent that on the Internet, the vast majority of sites have always aimed for the lowest common denominator - and even lower, if they feel they can get away with it. The Wii U has to render content on the GamePad - sometimes a second, entirely separate display's worth - AND keep that content in tune and in time with the main screen, all off the same processor, because unlike the Vita, the GamePad is NOT a specialist console with its own silicon. Yet it does this more smoothly than the PS3 or PS4 manage with the Vita, which IS a specialist console with its own tech. Amid all the negativity and hot air, you'd have thought that SOMEBODY, SOMEWHERE - a 'professional games journalist', perhaps, or a professional 'tech-literate' member of the gaming community - would have raised this point. When you do, you soon realise that some of the claims about the Wii U specs just don't add up. But I guess that's too much to ask, let alone expect!!

The teardowns on the Internet are horribly flawed, too. I wouldn't give credence to anything you've read on the subject, especially on the NeoGAFs, Beyond3Ds and EuroGamers (a click-hungry trollbait site) of this world. The power draw doesn't mean the CPU and GPU are weak. Why? For one, they've been compared with PC desktop parts, and consoles are a different proposition. Secondly, energy efficiency: an A-rated washing machine, TV or just about any electrical appliance from 2008, 2009 or 2010 wouldn't necessarily earn the same rating in 2012, because increasingly efficient products have reached the shelves since then. You can also take something from that period and make it perform better. The same is surely true of CPUs and GPUs. The truth is that Nintendo's spec sheet told you all you need to know - it's a 'customised GPU', and the system has a 'memory-intense design', so they didn't just pick off-the-shelf parts and whip them together into a box.

As for the CPU, some have suggested it's a POWER7 derivative. "But it has '3 cores!!'" - Hahaha, and... no. Only P7 can support eDRAM with that architecture, and it's very questionable to call it 3-core purely from die-watching, which is all those sites have done. It's more likely to be 6-core, with 3 modules of 2 cores each. That would make sense, as P7 exists in 4, 6 and 8-core forms, and 3 x 2 = 6. I should probably point out that these aren't my words, but from what I've read, I could subscribe to much of it. Oh, and P7 was created with energy efficiency in mind. The most important word, however, was always 'derivative', while a lot of people concentrated on the 'P7' part. There were suggestions going round that it was 6-core, too, but they were largely ignored, written off or dismissed. I don't know all the answers, and I won't pretend that I do, but that explanation makes sense to me, and I guess I'm 'crazy' enough to believe it until I hear something else from the horse's mouth.
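To put the efficiency point in plain numbers, here's a minimal sketch. Every figure in it is made up purely for illustration - these are NOT real Wii U (or any chip's) measurements. The point is just the arithmetic: effective performance is power draw multiplied by performance-per-watt, so a part that sips less power can still come out ahead if its architecture is more efficient.

```python
# Illustrative arithmetic only: performance = power draw (W) x efficiency (GFLOPS/W).
# All numbers below are hypothetical, chosen to show the shape of the argument.

def performance(power_watts, gflops_per_watt):
    """Effective throughput implied by a chip's draw and its efficiency."""
    return power_watts * gflops_per_watt

older_chip = performance(power_watts=90, gflops_per_watt=2.0)   # 180.0
newer_chip = performance(power_watts=35, gflops_per_watt=6.0)   # 210.0

# The newer part draws far less from the wall yet comes out ahead,
# which is why a wattage reading alone tells you little about capability.
print(newer_chip > older_chip)  # True
```

That's the whole objection to the teardown logic in one line: reading a low number off a power meter only tells you the left-hand factor, not the product.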

There were some early criticisms during development, but none of the launch games were made on the final kits. If you remember, the Wii U version of Battlefield 3 (a Frostbite 2 game) was cancelled in 2011. Medal Of Honour: Warfighter (another Frostbite 2 game) was pencilled in for a Wii U release, then quietly cancelled. I would put it to you that EA's comments about the engine's performance relate to events in 2011, and that they hadn't revisited it since - which is critically important, because the Wii U kits had seen significant advancements since then. They were, however, able to get FIFA 13 (version 12.5, with GamePad features) and Mass Effect 3 (non-Frostbite games, but EA titles nonetheless) onto the platform. Also note that Criterion (EA's own studio, by the way) had very positive things to say about the system and about making games for it - their approach was different. Let us remember that CryEngine 3 (no slouch of an engine) 'runs beautifully' on it, too. Furthermore, Iwata said that final kits weren't available until the middle of last year. Judging the system's capabilities on late and/or unoptimised port jobs that weren't made on final kits is dumb - nobody ever did the same to the PS3 when it often had lesser ports than the X360, such as Lost Planet, and nobody ever said it of the X360 with COD2. Finally, Digital Foundry are not gods, and they've been called out on their douchebaggery many times. I do wish the gaming community wouldn't take their every word as gospel truth.