Since before it even hit stores, much speculation has swirled around the graphical power of the Wii U, by which we mean how many polygons, shaders and effects it can push all at once. The launch window line-up includes exclusives that look gorgeous, others with modest visuals, as well as ports and multi-platform releases of varying standards. There's been guesswork, contradictory developer comments, and claims to have actually deciphered CPU and GPU clock speeds.
Courtesy of some keen members of the NeoGAF community and chip analysis specialists Chipworks, it seems the mysteries of the GPU — such a key part of the Wii U infrastructure, as argued by Nintendo itself — have been clarified. These ultra-magnified images drill down into the chips themselves, allowing analysts to assess the "die shot", identify the transistors present and calculate the GPU's capabilities.
Thankfully, the team over at Digital Foundry — a trustworthy and reliable source on matters such as these — has interpreted the results. The full article goes into great detail, acknowledging remaining unknowns and suggesting that in time development teams will squeeze more out of the system, as is the case with every platform. The core facts are clear, though, with a level of detail now known about the Wii U's infrastructure that's on a par with the known specifications of the upcoming successors from Microsoft and Sony. Extracts are below.
The final GPU is indeed a close match to the 4650/4670, albeit with a deficit in the number of texture-mapping units and a lower clock speed - 550MHz. AMD's RV770 hardware is well documented so with these numbers we can now, categorically, finally rule out any next-gen pretensions for the Wii U - the GCN hardware in Durango and Orbis is in a completely different league. However, the 16 TMUs at 550MHz and texture cache improvements found in RV770 do elevate the capabilities of this hardware beyond the Xenos GPU in the Xbox 360 - 1.5 times the raw shader power sounds about right. 1080p resolution is 2.25x that of 720p, so bearing in mind the inclusion of just eight ROPs, it's highly unlikely that we'll be seeing any complex 3D titles running at 1080p.
All of which may lead some to wonder quite why many of the Wii U ports disappoint - especially Black Ops 2, which appears to have been derived from the Xbox 360 version, running more slowly even at the same 880x720 sub-HD resolution. The answer comes from a mixture of known and unknown variables.
The obvious suspect would be the Wii U's 1.2GHz CPU, a tri-core piece of hardware re-architected from the Wii's Broadway chip, in turn a tweaked, overclocked version of the GameCube's Gekko processor. In many of our Wii U Face-Offs we've seen substantial performance dips on CPU-specific tasks. However, there are still plenty of unknowns to factor in too - specifically the bandwidth levels from the main RAM and the exact nature of the GPU's interface to its 32MB of onboard eDRAM. While the general capabilities of the Wii U hardware are now beyond doubt, discussion will continue about how the principal processing elements and the memory are interfaced together, and Nintendo's platform-exclusive titles should give us some indication of what this core is capable of when developers are targeting it directly.
...While there's still room for plenty of debate about the Wii U hardware, the core fundamentals are now in place and effectively we have something approaching a full spec. It took an extraordinary effort to get this far and you may be wondering quite why it took a reverse engineering specialist using ultra-magnification photography to get this information, when we already know the equivalent data for Durango and Orbis. The answer is fairly straightforward - leaks tend to derive from development kit and SDK documentation and, as we understand it, this crucial information simply wasn't available in Nintendo's papers, with developers essentially left to their own devices to figure out the performance level of the hardware.
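The resolution figures quoted in the extracts can be checked with some quick pixel arithmetic; this is a minimal sketch of that sanity check, not part of Digital Foundry's analysis:

```python
def pixels(width, height):
    """Total pixel count for a given resolution."""
    return width * height

p720 = pixels(1280, 720)    # standard 720p framebuffer
p1080 = pixels(1920, 1080)  # standard 1080p framebuffer
sub_hd = pixels(880, 720)   # Black Ops 2's reported sub-HD framebuffer

# 1080p pushes 2.25x the pixels of 720p
print(p1080 / p720)   # 2.25

# 880x720 is roughly two-thirds the pixel load of full 720p
print(sub_hd / p720)  # 0.6875
```

That 2.25x jump, combined with only eight ROPs, is why Digital Foundry doubts complex 3D titles will run at 1080p; conversely, the sub-HD Black Ops 2 framebuffer shows the early ports were already trimming pixel load, pointing the finger at other bottlenecks.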
The key points, it seems, are that the Wii U GPU is stronger than its Xbox 360 contemporary, for example, but that performance issues in some early ports possibly reflect CPU restrictions and some unknowns — something that has been stated before. Vitally, the Digital Foundry team states that the "GCN (Graphics Core Next) hardware in Durango (Microsoft) and Orbis (Sony) is in a completely different league"; on a technical level this isn't necessarily a surprise, but it is important to state. Satoru Iwata has argued that the gulf to the next level of graphical fidelity will be less pronounced than that between Wii and its HD rivals, and Nintendo gamers have become familiar with enjoying games with weaker graphical capabilities than those seen on other systems.
Perhaps the most revealing aspect is in the final paragraph, with the suggestion that the secrecy around the Wii U's specifications meant developers were "left to their own devices" to figure out the console's capabilities — secrecy that has spared Nintendo the leaks that have revealed so much about the rival systems on the way.
So, what do you think of these comments and latest revelations of the Wii U's graphical capabilities? A gallery of the related images is below.