
Rumour: The Recent Wii U System Update Has Boosted Clock Speeds

Posted by Damien McFerran

CPU and GPU both got a bump, according to an uncredited source

Everyone loves a good rumour, right? Well this one is a doozy, so grab as much salt as you can before reading.

According to a poster on TV Tropes — who doesn't give a source — the recent Wii U system update has upped the clock speed of both the CPU and GPU:

The CPU and GPU are built on the same package.

CPU: IBM PowerPC 7xx-based tri-core processor "Espresso" clocked at 1.24 GHz before the 3.0.0 update, 3.24 GHz after the 3.0.0 update. This is an evolution of the Broadway chip used in the Wii, is 64-bit and uses Power6 technology. While IBM has said that Nintendo has licensed the Power7 tech from IBM, Nintendo is not using it for the Wii U, explaining its backwards compatibility.

GPU: AMD Radeon High Definition processor codenamed "Latte" with an eDRAM cache built onto the die clocked at 550 MHz before the 3.0.0 update, 800 MHz after the 3.0.0 update.

Those figures are — for want of a better word — insane. Pushing the clock speed of the CPU from 1.24 GHz to 3.24 GHz would be an incredible feat, and as a result many people are already pouring bucketfuls of scorn over this rumour.

However, there's no smoke without fire, and there's still a chance that Nintendo has upped the speed a little via the update — the increase in system performance since the update could be directly related to such a boost.

As soon as more concrete information hits the web we'll let you know, but for the time being, this just seems too fantastical to be true. What are your thoughts?



User Comments (102)



Smitherenez said:

Wow, that is a large improvement! But still, we don't know if it is true. I do wonder how much faster it can go, then, since we are getting another update this summer.



ThomasBW84 said:

I think Nintendo may have boosted the clock speed to help with menus and, as a result, the odd game may also run better, which is great. Would it have gone up this far? Not a chance!

I reckon the rumour's true, but the figures are a load of old poop



Einherjar said:

I did notice that games like Orochi 3, which had minor FPS problems (short, 1-2 second break-ins), now run much smoother. Like @ThomasBW84 said, the numbers seem ridiculous, but I guess the overall performance was increased due to better memory handling.



SilentHunter382 said:

There is not a chance that the update was able to overclock it that much. Even 2 GHz would be a feat in itself.



ThomasBW84 said:

Still, the core story is probably true, and it's nice that Nintendo's made the move. I imagine they have teams monitoring performance of systems worldwide — oh yes, they're watching — and figuring out how much juice could be increased without risks of overheating, malfunctions etc. I'm guessing, as I'm no tech-head, but it's good for us and, possibly, devs, if Nintendo loosens the leash on the Wii U's assets.

If I'm talking a load of rubbish, then apologies!



dizzy_boy said:

Well, we'll see how much things improve in the summer update before I believe that.



Nintenjoe64 said:

I don't know why everyone thinks it's not possible for the clock speed to be changed so dramatically. Nintendo probably didn't make a console with such an under/over-clockable processor, but if a firmware update can change the multipliers on the CPU from within the BIOS, then there is no reason this improvement couldn't happen. That said, I think Digital Foundry would have figured out the architecture if that were possible.

I thought the speed-up seemed more like the OS had been cleaned up rather than running on better hardware. Can't rule it out though; it would be an excellent anti-piracy measure to not give would-be hackers enough power out of the box to run future games.



ajcismo said:

Doubt that's all true. But my U is running everything much quicker since that last update. Except the load times on Lego City.



TreesenHauser said:

I've noticed a couple games have run a LOT better after the update, with Darksiders 2 and ZombiU being the biggest improvements. This news wouldn't surprise me at all.



AyeHaley said:

I noticed some games like Lego City load way faster in game. I wonder how big the change actually is. But I'm already happy.



8thGenConsoles said:

There's a reason why rumors get started. This particular rumor didn't start just to lie to us Wii U owners; there is some truth in it, or maybe it's 100% true. Shigeru Miyamoto (or was it Satoru Iwata?) did say that they are focused on improving the Wii U's performance right now, so it wouldn't surprise me if it's true. Let's just wait and see.



ETLN said:

I don't think it's true. I would imagine the system would draw more power if the CPU and GPU clock rates were that much higher. I just measured mine with a watt metre and mine's still drawing around 32 watts; 34 was the highest I saw it go. I tested it on the Wii U system menu and the Rayman Legends Challenge App.



hms said:

I unplugged mine while updating, and it's now magically clocked @ 7GHz, also Lego City renders now @ true HD 1080p60 and Wii U Chat @ 4K resolution? It's like Christmas again =]



XCWarrior said:

That is one heck of a rumor. Would love if someone could confirm or deny it. The system certainly runs quicker, but not sure about THAT much different.



Zyph said:

Most games developed before this won't really benefit, I believe. They were built against stock performance, assuming the developers capped the game to a specific clock rate. It's much like RAM: if game X was made to use only a 1.2 GHz CPU, then even if it were overclocked to 3.0 GHz it won't make any difference, the same way a game built to use 1 GB of memory will only ever use that amount. Unless devs issue an update to adapt their games to the clock changes, we obviously won't see existing games perform better than usual.
To be clear, "overclocking" the system is too risky even for Nintendo. I reckon that 3.24 GHz is the native clock rate (if this rumor is ever true) and they just released an update to lift the cap.
If this rumor is actually true and games like Orochi 3 Hyper still have those occasional frame drops, then most likely they need to patch the game to match the improved clock rates. Or maybe the changes only apply to the system menu and OS, not to games.
Most hardware makers usually under-clock the CPU/GPU to lessen power consumption, just like they did on the 3DS, whose PICA200 GPU was natively clocked at 400 MHz but which Nintendo opted to clock down to 200 MHz or less.



Pod said:

Seeing as the Wii U was released rather early after having its hardware finalized, I wouldn't put it past Nintendo to deliberately underclock its processor at launch, to have early adopters help with stress tests.
When the systems held up just fine, they could have "clocked up" the speed again in the already planned update.

This could all explain quite a few things, such as Nintendo claiming the system is more powerful than most developers thought, and many many developers moving their releases out of the launch window.

Of course it sure would be nice if any of the official developers would throw us a bone on this, which they of course can't because of Nintendo's strict-as-ever non disclosure agreements.



GiftedGimp said:

The amount of RAM allocated to the system/games and the CPU & GPU clock speeds can be altered with firmware updates. Along with optimizing the OS, that's the only way extra performance can be found. The supposed increase in CPU is doubtful; maybe it's been upped to just under the 2 GHz range. Think of the power usage/heat generation side of things and the Wii U's compact design.
Since this performance update is being done in two stages, and the second part is due in the summer, it would make sense that Nintendo are altering clock speeds but are testing to figure out the best balance between extra performance, heat generated, power consumption, and reliability.
Nintendo aren't going to say exactly how they are optimizing system performance, but maybe after the summer update developers will be given information on new clock speeds. It's possible some devs have information on intended speeds now but are tied to NDAs; maybe upping the 'spec', so to speak, is part of Nintendo's answer to getting those devs who think the CPU is too slow on board.
But maybe it's all rubbish, and any/all optimizations are to do with the system's OS.



Pod said:


That's pretty interesting thinking: that they would keep requiring the system to be updated from scratch upon purchase, so that any potential weaknesses in the launch firmware would only give hackers access to games from the first half year.

Ultimately, I think hackers would also find a way to activate the overclocking.

The most prominent Wii hacker has openly stated he doesn't want to bother with the Wii U, though, seeing as it doesn't offer enough new features for a closed platform to be worth hacking when homebrew developers are free to make anything they wish on their Android devices.


It's not entirely the same thing. While most games are indeed built around set clock limitations, in many cases they will allow their game to draw as many frames as it can manage, up to a certain roof (usually set to 30 or 60fps). In all the cases where slowdown is experienced in Black Ops 2, Batman, and Lego City, simply raising clock speeds on CPU and GPU might actually prevent some of this from happening.

I'd be very curious to hear owners of these games report on whether this seems to be the case.



snax007 said:

I remember rumors about the same when the original Wii launched. Pure fanboy rubbish not worth publishing on a serious game site.



TheAdrock said:

Smells fishy.
How about an update that boosts the number of developers who want to make games for it?
How about rebranding (renaming) so that it's not the same name as the last-gen console from 8 years ago? A child can see that the public identifies it as merely a Wii upgrade.
How about supporting more than one gamepad?



Haxonberik said:

Ok, I don't know who made up those figures, but they're just way out of place. And this isn't even the final performance update.



Moshugan said:

This is just such a load of crapola.
The only possible way would be that they had underclocked the components heavily, of which there is little evidence.



Jayvir said:

Yeah, let's publish something that someone somewhere said with no mention of a source!



Kifa said:

I'd like to remind everyone that "overclocking" updates have already happened in the past. Namely, the PSP originally had its clocks capped at 222 MHz. Later on, Sony released a firmware update that allowed both CPUs to run at 333 MHz, no problem.

Thing is, games written for 222 MHz still used that speed (unless hacked, of course), so they didn't benefit from that bump. It could be the same with the Wii U: software may be able to select CPU operating speeds, or even the number of active cores, so we may not see any performance gains in existing titles. But that's just theory; we have no real evidence...

That said, boosting the CPU by 2 GHz on the humble cooling solution the Wii U has is highly unrealistic. Underclocking is one thing, but even at standard voltages the temps will go up from the clock increase alone.
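Kifa's point about heat and power can be made concrete with the usual first-order model for dynamic CMOS power, P ≈ C·V²·f. The sketch below is purely illustrative back-of-the-envelope arithmetic; the helper function and the assumed CPU wattage share are mine, not measured Wii U figures:

```python
# Back-of-the-envelope dynamic power scaling: P_dyn ≈ C * V^2 * f.
# All numbers here are illustrative assumptions, not measured Wii U figures.

def dynamic_power_ratio(f_old_ghz, f_new_ghz, v_old=1.0, v_new=1.0):
    """Ratio of dynamic power after a clock change (same capacitance C)."""
    return (f_new_ghz / f_old_ghz) * (v_new / v_old) ** 2

# The rumoured frequency-only bump, voltage unchanged:
ratio = dynamic_power_ratio(1.24, 3.24)
print(f"{ratio:.2f}x dynamic power")  # ≈ 2.61x from frequency alone

# If the CPU were, say, 10 W of a ~33 W total system draw (assumed share),
# the rumoured bump alone would add roughly:
cpu_watts = 10.0
print(f"extra draw ≈ {cpu_watts * (ratio - 1):.1f} W")
```

Even ignoring the voltage increase such a jump would normally require, the measured wall draw staying in the low 30s of watts is hard to square with the rumoured figures.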



Selene said:

I highly doubt this rumour is true. But maybe Nintendo's freed more resources by optimizing the OS, thereby making everything run faster and smoother?



element187 said:

Must be some supercomputer that can more than double its performance without any extra cooling.

The rumor is complete crap. The GPU getting a small boost might be possible, but that wouldn't cause the OS to run any smoother (the OS is CPU dependent)... If the rumor was started by some Nintendo fan, they aren't doing Nintendo any favors by making all Nintendo fans look like desperate loons... As a Nintendo fan for life, I'm embarrassed.

Come on, just accept it: the Wii U is a great little package, it doesn't need to compete with Sony/Microsoft on a spec level.



thelink said:

I have an alternate theory.

A lot of modern CPUs have power saving features that keep the frequency/multiplier as low as possible to run basic functionalities. For example on Android, the frequency while you are browsing the OS remains very low to save power, and only increases as the system needs it.

I think it is likely the 1.24GHz figure was always wrong. It was a best guess at best, and a complete oversight of how modern CPUs mitigate their power draw. I think it is much more likely that, instead of an overclock, Nintendo disabled the power-saving feature in the Wii U OS to allow the CPU to always run at full speed. This would help with all of the loading issues, since the CPU would not have to continuously spike and then lower its frequency. I believe this is possible even if there was a slight OC. Anyone who has overclocked a CPU before knows how unstable your OC can get if you do not disable power-saving functions.
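thelink's theory is essentially how "ondemand"-style frequency governors behave on PCs and phones. A toy simulation (purely illustrative: the step values and one-step-per-tick ramp rule are invented for the sketch, and nothing here reflects how the Wii U firmware actually works) shows why a sudden load spike runs below full speed for its first few ticks unless power saving is disabled:

```python
# Toy model of an "ondemand"-style frequency governor (illustrative only).
# A burst of work ramps the clock up one step per tick, so the first ticks
# of a load spike run below full speed -- the latency thelink describes.

STEPS = [1.24, 1.74, 2.24, 2.74, 3.24]  # hypothetical GHz steps

def run(load, governor=True):
    """Return the clock (GHz) used on each tick for a 0/1 load trace."""
    level = 0
    clocks = []
    for busy in load:
        if not governor:
            clocks.append(STEPS[-1])  # power saving disabled: always max
            continue
        if busy and level < len(STEPS) - 1:
            level += 1                # ramp up one step per busy tick
        elif not busy and level > 0:
            level -= 1                # ramp back down when idle
        clocks.append(STEPS[level])
    return clocks

spike = [1, 1, 1, 1, 1]
print(run(spike))                    # ramps: [1.74, 2.24, 2.74, 3.24, 3.24]
print(run(spike, governor=False))    # flat:  [3.24, 3.24, 3.24, 3.24, 3.24]
```

With the governor active, short bursty loads (like menu navigation or level loads) spend much of their time below the top clock, which is consistent with loading feeling faster after such a feature is turned off.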



SteveW said:

720 IS HD but there are also 3D 1080 games on the Wii U that run just fine, Skylanders Giants for example...



shinesprite said:

At launch, people did say that the console's power supply was designed to handle a load many times higher than what the system was drawing.

. . . so, while Nintendo could have boosted the clock speeds, I'm highly skeptical that they would boost it that much in a single update.



Dpullam said:

I highly doubt they boosted the clock speed that much. It would be pretty cool if they did though.



wober2 said:

Regardless of the speeds, I wish accessing Miiverse and the eShop from within games was more stable. The update seemed to cause problems with this.



Faruko said:

This CAN be true, for a number of reasons

1) The CPU/GPU might have been underclocked for X reasons, and they just beefed it up to its normal/OC speeds, which I think might be the best reason, since I can hardly believe a CPU would run at 1.24GHz in this day and age... now 3GHz? That's more like it for a console (I say for a console, since I have my i5 running at 4.5GHz stable, pushing 5GHz if I put some cooling on it).

2) Power-saving features: like somebody said, the numbers could have been measured while a power-saving feature was active, which would mean the CPU/GPU figures were wrong (I hardly think this is true, but hey, it could be...)

3) We don't know exactly how the CPU/GPU is built; it may be a powerhouse when it comes to "OC".



Pod said:

A lot of people are citing the cooling for a reason that this is bogus.

This is not actually necessarily a problem.

If the CPU was clocked low on purpose, the fans in the system might've been fully capable of cooling the processors running at 3GHz the whole time, and simply had their rotation speed upped by the update as well.

It's not like the processor would get three times as hot in the first place though, that's not how it works.



JSuede said:

@ThomasBW84 Yeah... IF they gave the system a bump, there is no way it was to over 3GHz. That would definitely increase the power draw, and might even need a different architecture for the CPU to begin with.

The system still uses ~33W, so a bump to 3GHz did not happen. It doesn't make sense that the output of the CPU and GPU could increase without increasing the power draw... unless Nintendo has discovered magic.

This was debunked pretty quickly in the GPU die shot thread on NeoGAF. There's no massive bump, as cool as it would be. Power draw is the same throughout. Sorry folks... nothing to see here.



Morph said:

Although it's probably crap, it'd be funny if this were true. What would all these 3rd parties use as excuses for not getting their engines running on the Wii U then?



Pod said:


The NeoGAF shots do not rule out that the system really COULD have been running at 3GHz during games the whole time, like thelink suggested.

The PS3 Superslim runs at about 80W, and the Wii U is a CONSIDERABLY more power-saving architecture, so I certainly wouldn't rule out that it was running a tri-core processor at 3GHz while still staying below 40W.



Alphack3r said:

Either we've horribly miscalculated the initial speed, there /was/ a (modest) increase, or it's all just a steaming pile of you-know-what xD

Really brings new meaning to "cool story bro"



MadAdam81 said:

Games that are 720p on the Wii U are just quick ports from the 360 version. The Wii U can handle 1080p, but the 360 only can in certain situations.



GiftedGimp said:

@JSuede NeoGAF is the last place to go to check anything to do with Nintendo. I'm dubious of the increased figures, but wouldn't be surprised if there has been some increase.



QuickSilver88 said:

The GPU bump would be easily believable, in that it is an AMD-made GPU, and those of us using AMD APU integrated chips on the PC side know they are easily overclockable even with stock cooling. The A8 series are stock 600MHz and easily go to 800MHz, and in overclock testing people get them to 960MHz no problem. So that is easily believable. The CPU side could only be explained if the processor was designed for higher speeds and was intentionally being throttled for power saving. Otherwise pushing the multiplier up a third would be the max, so getting to 1.8GHz would be a real push. With all that said, we do know manufacturers throttle and sometimes later remove the cap, or hackers do, as evidenced with the PSP... The proof will be in the pudding: if devs start to produce better results and warm to the system, then it will leak from more reliable sources.



MitchVogel said:

Hmm, so that's why this was codenamed project "Cafe". I always wondered why they'd picked that name when this console was just getting started.



Moshugan said:

@Pod [Underclocking] ''...Of which there is actually substantial evidence.''
Source? It's common to underclock, but not by this much! It does not make sense.
Either this is completely made up or the info we got from the chips was wrong in the first place.



Widdowson91 said:

@ThomasBW84 Have you even considered the fact that the initial clock speeds were affected by the OS? We all know the OS was rushed, so maybe it was taking up more of the CPU. Now that it has been updated, it is as it should have been.



Cheaptrick said:

What Nintendo's advertising department needs right now is rumors like this hyping up the Wii U's power. The Wii U is getting hit left and right as not that powerful a game console, and that has had some effect on gamers. The Wii U needs something like this, even if it's just rumors.



JSuede said:

@Pod @GiftedGimp The GPU die shot thread has been completely devoid of trolling or misinformation; unless the entire thread gets derailed, anyone coming in spouting off BS gets torn to shreds... so I'll take it. It's not the thread itself I'm referring to but the people in it, who know how to look at a die shot and figure things out about it. They pretty much know their stuff.

I agree that there could have been some kind of power bump, but nothing as significant as the rumor suggests; that would just be crazy. I get that the PS3 slim runs at 80W... but increase the clock speed of the CPU almost 3x and the GPU by 300MHz and there would at least be some difference in draw. The Wii U is insanely efficient, but it isn't magical.



Shworange said:

Stgyhvbk gdffinnnly djjkkfgbn dd. Ftgkll gruffrejbnkg hg bjhj jvfni fghjikn! Yuhbn njk ghojjj hesttthko stu juuunmm. Tdfhhhh ovgh fghjikn uhhjjjjjjj tytttce vbjon jifh efgy gdf gikmnnhink hhki.

There. Understand that? No? Now you know how I feel.

Ps. It's amazing how much of the above gibberish that I typed was continually highlighted by my iPhone for auto correct. No autocorrect, hg doesn't mean ghost, and fkhghjj doesn't mean gingham!



AVahne said:

Seems fake to me. Though I would love to see the Wii U get such a massive boost in clock speed, this sounds waaaaay too good to be true.
Unless of course, Nintendo asked Anna and her "sisters" to write a magical overclock tome.



kevkeepsplaying said:

I can see them trying out for that in the summer update, but right now...? It doesn't look like it. I'm sure they've increased those parts of the system, but not by that much. Not yet. I'm sure they're working on it.



SCAR said:

Guys, the Xbox 360 and PS3 can run at up to 3.2 and 4.0 GHz respectively on some functions they perform.
This wouldn't be hard to believe at all. Once they update it in summer, the clockrate will probably rise again.
I honestly had already thought they pushed the console harder by pushing more computation. I haven't really noticed any software running better, but the OS has definitely picked up.
If the Wii U isn't even loading games as fast as it can in game, that means the Wii U isn't even trying yet.



Characterror4D1 said:

We have to keep in mind that Nintendo never released the official specs to the hardware. So even the rumored 1.24 GHz needs to be taken with a grain of salt.
Until the day - which might NEVER come - that Nintendo officially releases the specs of the hardware, everything about it is just a rumor.



gloom said:

I have noticed improvements in speeds in almost everything... except Lego City. Why does that game take so long to load???



tanookisuit said:

I'd buy this story as plausible, hell even likely, as far as the GPU goes, but nearly a 3x speed boost to the CPU is comical. Sure, I bet it got a boost, if the new performance levels people keep reporting are real, but not that much over the top. And even if it could be done, what about the amount of heat that would give off? Could the sandwiched little system handle that?



NinGamer85 said:

@gloom At least for the city it's quite a bit to load. There are no loading times no matter how far you drive; it only loads when you progress in the story. It's bad though, like Skyrim on Xbox.
@SCAR392 I think maybe the Spring update was for the OS and bringing the system up to speed; maybe the summer update will focus on software performance. I don't know about the numbers, because none were official to begin with, but it's not impossible. My CPU on my PC clocks down when I'm not gaming.
If there's truth to this it does make sense. Nintendo knew they didn't have a strong launch, but they did get a decent start (around 4 million) and it needed to be released before the nextbox or PS4. No matter how much testing is done by Nintendo, consumers give the system the real test. I'm guessing it passed, as they are now bringing it up to speed. The entire design of the hardware is very new for consoles; even PCs just got GPGPUs a couple of years ago. It's very capable and developers are learning... except DICE, who I'm thinking EA has paid to not even try. Oh well, EA makes crap anyway, and they will hurt from not making games on Nintendo systems.



9th_Sage said:

Marcan has weighed in on Twitter and said it's all BS. He would know better than most (and is certainly more credible than 'random anon on TVtropes').



TheHeroOfLegend said:

If this is true (which I want it to be), then in the next couple of months we could hear from DICE on Twitter saying: "Oh, our Frostbite engine actually works on Wii U, so we'll bring it to Wii U!" Lol



SanderEvers said:

@ngamer155 That only happens if EA allows it. Which will never happen.

And it might be true that it's now running at its intended clock speeds (as shown in the article) and that it was underclocked because the firmware wasn't able to handle them.

@gloom: Lego City is a big open-world type of game, which has to load all its graphics from disc. Even if the CPU & GPU were at supercomputer speeds, you would still have to load the game from an extremely slow spinning disc. This is why PS3/PC games are usually completely installed on the hard drive.

And yes, hard drives are ALSO very slow compared to a CPU. You'll need an SSD or faster to make it really load fast.

Playing Lego City Undercover from the hard drive is indeed faster than from disc, but the hard drive is also limited by its USB 2.0 connection. Why Nintendo never put USB 3.0 in the Wii U we'll never know.

But I'll laugh if, in the end, the Wii U outpowers the PS4 / Xbox 720 with the following system update.



Neram said:

All I know is that would be awesome if the CPU and GPU were codenamed Espresso and Latte.



Jukilum said:

As far as I'm aware, we never got the full picture from the die shots. Wasn't there around 30% of the die that no one could figure out the purpose of?



mikeyman64 said:

@hms Wow, I really hope you didn't just cause a bunch of ignorant kids to brick their Wii Us... not cool.

Oh, and define "HD" for us. I hate to get into a petty argument, but I find it annoying when people diss a game for not being 1080p. Technically, the term "high definition" relates to anything beyond 480 lines. Some would even call 480 with progressive scan a type of high definition. 720p and up are all considered "HD", and few can tell the difference between them due to the immense number of pixels and identical aspect ratios.



TreonsRealm said:

I could honestly see Nintendo releasing the Wii U underclocked through code limits at launch and then releasing the limits through the patches; just think about how much of a big deal they have made about the updates (they even repeatedly featured them in Nintendo Directs). Bear in mind that almost all of their hardware from the NES up has featured upgrade abilities of some kind, whether through chips on the carts, hardware peripheral upgrades or firmware updates, so artificially limiting the Wii U could have been part of their strategy, considering press negativity pre-launch and not wanting to show their full hand to the competition. One thing that never ceases to amaze me is that critics of the console don't consider the fact that the machine is often rendering games for 2 independent screens at the same time, through a wireless connection no less. In some cases, such as Call of Duty and Sonic Racing, it is doing this for 2 separate players simultaneously. It may just be me, but that seems a bit intensive for the unconfirmed specs that hackers have supposedly discovered (a 1.24 GHz processor?).

As for the improvements to existing software, I can only confirm that for Batman AC. I got the game in March and logged over 100 hours on it prior to the update (this was the first game I played in the Arkham series). I am also a PC (Linux) gamer, so I took notice of the framerate issues the game displayed (I still enjoyed the hell out of it). I can say that there was a noticeable improvement to the framerate post-update. Indoor areas showed an improvement, but the largest improvement seems to be in the city. Populated areas and gliding used to be rather choppy, but everything is noticeably smoother (most technical reviews pegged the outdoor areas at a framerate around the low to mid 20s, where they now seem to be closer to 30, with occasional dips into the high 20s when performing boost glides and extreme maneuvers).

I'm on the fence about the clock speed increase until there is actual confirmation, but I could see Nintendo doing something like this. At the very least, the system update seems to help some software, which could come from optimization of the operating system code. Either way, I'm thrilled with the system and happy to see Nintendo improving the console.



GiftedGimp said:

At E3 2011, Ghost Recon & Darksiders were shown for the Wii U, but because the dev kits were underclocked, the versions shown were running on Xbox 360.
With a fair few reports also suggesting developers had to work on underclocked hardware into 2012, Nintendo definitely have history.
System updates to improve system performance have to do it with either the CPU, GPU, or RAM allocation, and in a smaller part OS optimisation. Since the last update, in every game I have played the initial splash screen has been and gone much quicker, and load times have been reduced; even LCU, which still has the longest load time, shows something's definitely been altered in some way.
It's possible they have done something to the CPU/GPU, though I doubt the scale stated (although it would be great). But I actually wonder if, since it's a two-stage system performance improvement update, the first one was for OS/general system optimisation and reserved RAM re-allocation (letting games use some of that 1GB reserved for system operation), and the second part is for CPU/GPU un-underclocking or some overclocking, whatever the case may be, which could take a bit more time to get to an optimum balance of performance, heat generation, power draw, and system stability.
Especially if Nintendo are being cautious with the Wii U. Maybe, due to the compact design and potential problems caused by heat generation and possible extra power requirements, they don't want to risk reliability problems like those seen in the early days of the 360.

Afterthought... could this rumour be quoting Nintendo's target figures for CPU/GPU output? It's likely that if any actual performance increase like this rumour suggests is true, at least some developers would have been informed of at least the target Nintendo are trying to reach, possibly with updates going out for the dev kits, and with those developers bound by non-disclosure terms.



P-Gamer-C said:

If Sony or Microsoft were rumoured to overclock by 2 FRIKKEN GHZ I wouldn't believe it, much less Nintendo.



GiftedGimp said:

@TreonsRealm "One thing that never ceases to amaze me is that critics of the console don't consider the fact that the machine is often rendering games for 2 independent screens at the same time, through a wireless connection no less. In some cases, such as Call of Duty and Sonic Racing, it is doing this for 2 separate players simultaneously. It may just be me, but that seems a bit intensive for the unconfirmed specs that hackers have supposedly discovered (a 1.24 GHz processor?)."

I've never thought that this was the true running speed of the CPU; if that figure is true, I still believe it's the idle speed, for just the reason you stated.



Traxx said:

I'll never understand why Nintendo underpowered the CPU that much. If you want to provide new, engaging game experiences you need CPU horsepower. All that GPU emphasis on the Wii U is just for the shiny things, which are normally the things Nintendo fanboys are quick to shun.



StarDust4Ever said:

LOL. My PC rig uses an octo-core AMD processor, stock clocked at 3.6Ghz, turbo up to 4.2Ghz. Performance was limited to keep the prcessor within it's thermal design limit, but in fact all 8 CPUs can and will run stable at 4.2Ghz. I set the default clock rate to 4.2Ghz, disabled turbo-clocking and APM, and boom, I get a 4.2Ghz machine that can run stable at max load on all cores at 4.2Ghz. It pulled around 90 watts idling; 300 watts under the stress test, meaning my 125W processor was exceeding 210 watts, but that doesn't matter because I have a big-arse heat sink on it. Also I've got thermally controlled fans so it's whisper quiet on idle but sounds like the Indy-500 when the CPU is maxed out. However, there's no giant heat sink on the Wii-U processor, so overclocking it would likely create an overheating situation similar to the heat problem that caused the early RROD deaths on most of the model 1 Xbox360s. Thanks but no thanks, if an overclocked processor leads to eventual failure of the console. And yeah, unless they somehow figure out some way to attach a 3x3 inch heatsink brick to the Wii-U (which would likely involve cutting a rectangular hole on top of the console to make room for the massive heat-sink), I'm calling BS on the 3.24 Ghz figure. Greater than 2x speed on any stock-clocked processor is likely BS in my book.

The Wii U may have been slightly underclocked to maintain a certain thermal design rating, but exceeding the thermal design limit without a sufficient cooling system is a no-no. That's why all mobile tech (DS, iPad, etc.) is inherently underclocked. MS and Sony may have pulled this overclocking crap (which is why the launch models of the PS3/360 both had heat issues that weren't resolved until fab tech gave them a smaller die process, allowing the production of low-power "slim" models), but Nintendo knows better than to sacrifice durability for performance.



3dbrains said:

I can confirm with 100% certainty that this is not true. The Wii U still draws between 28 W and 33 W of power, depending on whether you're on the Wii U menu, playing a game, browsing online, etc. This means the clock has not been increased, as that would use more power.
I just checked.



ballistic90 said:


That doesn't actually indicate much of anything. CPU overclocking doesn't always result in higher power draw from the device. It's just common with PCs.



Sun said:

Traxx said: 'Never will understand why Nintendo underpowered the CPU that much'

Agreed. I cannot understand it either. Even with a nice GPU, the CPU (they are on the same multi-chip module) is still far too weak. I don't believe the rumour, but I would love AnandTech or somebody to check the new speeds anyway.



Pod said:

Nah, that's simply because it's constantly receiving a lot of wireless data that it has to decompress and display at 60fps on a 6" colour display, and because Nintendo went a little cheap with the battery.

I'm saying there's a certain possibility that the processor has been running at the rumoured 3 GHz all along.
The images we're seeing the machine produce would seem to suggest that.

It's just that the interface between games had previously been locked at a lower clock speed (the rumoured 1.3 GHz) to save power if people left the system on to download content, or were only using the simpler apps.

Having to switch speeds would explain long load times and a slow interface.

There's a substantial waiting time when asking the machine to go into "Wii hardware" mode as well.



JSuede said:

No. That is not a possibility at all. The cooling solution in the Wii U is not big enough to deal with the kind of heat that would be put out at those clock speeds. It would run hot and be quite loud... if the heat sink didn't melt entirely. It would also mean that the developers who have said the CPU is slow have no business coding games, if they think a 3 GHz CPU is slow.

If it were changing speeds on the fly depending on what the system was doing, the power draw from the wall would change drastically too. The system idles at ~30-32 W and gets to ~34-37 W while playing a taxing game from a disc. Raising the clock speed to 2.5-3x what it was before would push the power draw into the 45 W range.

CPU clock speed has next to nothing to do with the images produced on screen... that's what the GPU is for. Sure, the CPU handles some things, but a higher clock speed isn't going to make the graphics prettier by itself, especially since the logic that produces better visuals is handled by the GPU, barring some failure by the developers. Dynasty Warriors wouldn't have the framerate issues it does from all the AI on screen if the CPU were faster, or if they had coded it for the GPU instead.

The CPU does not, never has, and never will run at a clock speed of 3 GHz.
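The power-draw argument above can be sketched with the standard CMOS dynamic-power model, where power scales roughly linearly with frequency and quadratically with voltage. The CPU share and "rest of system" figures below are illustrative assumptions, not measured Wii U numbers:

```python
# Rough sketch of the power-draw argument, using the standard CMOS
# dynamic-power model (P proportional to C * V^2 * f). Every number
# here is an illustrative assumption, not a measured Wii U figure.

def scaled_power(base_watts, freq_ratio, volt_ratio=1.0):
    """Dynamic power scales linearly with frequency, quadratically with voltage."""
    return base_watts * freq_ratio * volt_ratio ** 2

cpu_base_watts = 5.0      # assumed CPU share of the ~35 W gaming draw
rest_of_system = 30.0     # GPU, RAM, disc drive, wireless, etc. (assumed)

freq_ratio = 3.24 / 1.24  # the rumoured clock bump, roughly 2.6x

# Even with no voltage increase at all, the wall draw would jump well
# past the measured ~34-37 W gaming figure:
total = scaled_power(cpu_base_watts, freq_ratio) + rest_of_system
print(round(total, 1))    # ~43 W, before any voltage bump
```

In practice a 2.6x clock jump would also need a higher core voltage, and the quadratic voltage term makes the picture much worse, which is the intuition behind the "45 W range" estimate above.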



D2Dahaka said:

I highly doubt it's been overclocked that much. There would be a lot of malfunctioning and melting Wii Us. That being said, any performance enhancement is welcome.



unrandomsam said:

@SilentHunter382 It might be able to, but only on a single core (disabling the other two). The 360 is the same: it can do a higher clock speed using a single core, but not with all cores active. (That helps devs who suck at programming, which seems to be most of those who coded for the 360/PS3.)

If that is the case, then only badly coded games will benefit.
