While it's easy to get tangled up in technicalities when it comes to the graphical prowess of Wii U, one thing is 99% certain: the successors to Xbox 360 and PS3 will be more powerful from a graphical standpoint. Whether or not Wii U matches or surpasses the existing HD systems in this respect — it will take time for its full potential to be seen — doesn't change the fact that, once again, Nintendo is prioritising its concept over pushing unprecedented polygon counts.
It was a gamble that paid off with Wii, in terms of the system out-selling its rivals during its lifespan, though issues of third-party support and securing multi-platform blockbusters could arise once again. Will future titles designed for Microsoft and Sony's new systems be scalable enough to work well on Wii U? Will the difference be too substantial for developers to think it worthwhile? Or will there be a perception, once again, that gamers who love triple-A titles don't want them on a Nintendo system? Satoru Iwata told investors earlier this year that the graphical difference between Wii U and its rivals won't be as severe as it was in the case of Wii, but it's a potential issue nevertheless.
It's possible, however, that Nintendo is once again reading the trends and catering to shifts in the games industry. The painful budgets that often accompany major HD projects are well known, and one opinion is that the days of the triple-A franchise are numbered, making these bombastic affairs less common. Speaking at the Gamercamp festival in Toronto, Splinter Cell: Blacklist director Patrick Redding had this to say about shifting priorities in game development.
The market as a whole is going to undergo a critical shift in priorities, a shift away from the absolute primacy of graphics and production values and content creation toward systemic depth. This trend is going to trigger a reality check for developers like me who work on established franchises with a large succession of sequels, and it's also going to be a call-to-arms for smaller game creators, including a number of people who are sitting in this room, I hope.
If there is a trend away from a focus on visual fidelity, that would undoubtedly be a positive for Wii U. Redding went on to explain that "lower-case aaa" games could focus on other ideas, like asynchronous multiplayer and more open-ended games. The former suits Wii U and its GamePad perfectly, though systemically open games that encourage the gamer to determine events and the story — Minecraft was cited as an example — could be reliant on the little-known Nintendo Network being robust and reliable.
Do you think Redding may be right, or will demand for a high number of triple-A blockbusters keep that side of the development industry strong? If you agree with Redding, do you think Wii U will potentially benefit, or will gamers still want the graphical powerhouses expected from Microsoft and Sony? Let us know your thoughts in the comments below.
[source gamesindustry.biz]
Comments (21)
I think that the current model of blockbuster games just isn't sustainable. I remember reading a while ago that if EA didn't sell at least 5 million copies of Dead Space 3, they would consider it a failure.
Think about that for a second. 5 MILLION copies.
The constant pressure to aim for more lifelike graphics, better AI, 40+ hour games and the like is just causing game development to become too expensive. While I don't see AAA games dying out completely - there will always be franchises that will sell like crazy - I think developers may begin looking elsewhere for their sales.
Developing several middle-budget games that are solid sellers is certainly a viable way of doing business, and arguably better than selling one huge-budget one.
Many game developers have folded because they've bet everything on their next game, accumulated lots of debt and really needed that next big game to sell well in order to avoid going bankrupt. All it takes is a little bad luck, or a game that isn't as good as you think it is, for sales to tank and your company to be unable to pay back its loans.
In contrast, if games are being released on a regular basis, and they don't cost so much to produce, it's not the end of the world if one doesn't sell that well. Hopefully the next game that comes out in a few months will be well-received, and you won't be stuck in debt for years until you can get another game out the door.
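To put some rough, entirely made-up numbers on that risk argument (none of these figures are real publisher data, just a back-of-envelope sketch):

```python
# Hypothetical comparison: one $100M blockbuster vs. five $20M mid-budget
# games, for the same total spend. All figures are invented for illustration.
BLOCKBUSTER_BUDGET = 100_000_000
MID_BUDGET = 20_000_000
NET_PER_COPY = 20  # assumed net revenue to the publisher per copy sold

# Copies each project must sell just to break even:
print(BLOCKBUSTER_BUDGET // NET_PER_COPY)  # 5,000,000 copies on a single bet
print(MID_BUDGET // NET_PER_COPY)          # 1,000,000 copies each, five independent bets
```

Same total outlay, but the blockbuster has to hit its 5 million on one roll of the dice, while one mid-budget flop out of five is survivable.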
Sorry, but how on earth will the PS4/720 successfully pull off an even bigger upgrade in graphics? Or has everyone forgotten the industry-wide disaster that resulted from the PS3 and Xbox 360 prematurely pushing HD entertainment?
Developers and consumers won't be able to afford to take advantage of expensive technology that is still in its infancy, especially with the current financial state afflicting the industry. Native full HD is something the current gen is still missing, and it's something practically all gaming households will be able to fully appreciate at the start of this era, unlike forcing consumers to upgrade to some other $800+ TV technology that's still in its infancy, along with buying an absurd $600 "power-house" console. I'd say graphics-wise, next-gen consoles need to make it a priority to make native full HD a universal standard, not outdo themselves once again.
If the definition of a successful triple-A franchise is consistently utilising costly new technology and increasing development budgets to a ludicrous limit for the sake of saying it's "advanced", then the alleged "top-end" of this industry is going to see a huge collapse within itself.
I'm with Redding. Skyward Sword and both Galaxy games kick Uncharted's a**, to give just one of many possible examples. More Limbo and less CoD.
@Wheels2050
Great comment.
It's essentially the "don't put all your eggs in one basket" sort of idea.
@GameLord08: I don't think it's out of the question - building yourself a $500 PC will get you incredible graphics, so I'm sure it's feasible for MS or Sony to custom-build a system that they can sell for a reasonable price.
I think the main problem is on the game development side. We're really getting to the point of diminishing returns in terms of graphics. For example, someone might decide that an open-world game needs some birds flying past in the background. Not birds that can be interacted with, just birds for the visual effect. That means you have to pay someone to sit down and model/draw the birds, someone to animate them, and someone to write the code that makes flocks of birds randomly fly by. It's not a huge effort, and it's something probably 10% of players will notice, but it's something you're paying people to do instead of fixing bugs, coming up with new gameplay ideas, or something else that's arguably more important to the player.
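Just to show how small that bird system really is (and that it's still real code someone has to write, test and maintain), here's a made-up minimal sketch in Python - every class, name and number in it is invented for illustration:

```python
import random

class Flock:
    """A purely decorative flock: visible to the player, never interactive."""
    def __init__(self, x, y, speed, size):
        self.x, self.y = x, y      # position in world units
        self.speed = speed         # horizontal drift in units per second
        self.size = size           # number of birds drawn (visual only)

    def update(self, dt):
        self.x += self.speed * dt  # drift across the sky each frame

class AmbientBirdSystem:
    """Occasionally spawns flocks at one edge of the world, culls them at the other."""
    WORLD_WIDTH = 1000.0

    def __init__(self, mean_interval=20.0):
        self.flocks = []
        self.mean_interval = mean_interval  # average seconds between spawns
        self.timer = random.expovariate(1.0 / mean_interval)

    def update(self, dt):
        self.timer -= dt
        if self.timer <= 0:  # time to send another flock across the sky
            self.flocks.append(Flock(
                x=-50.0,                         # start just off the left edge
                y=random.uniform(300.0, 500.0),  # random height in the sky
                speed=random.uniform(30.0, 60.0),
                size=random.randint(3, 9),
            ))
            self.timer = random.expovariate(1.0 / self.mean_interval)
        for flock in self.flocks:
            flock.update(dt)
        # drop flocks that have flown past the right edge
        self.flocks = [f for f in self.flocks if f.x < self.WORLD_WIDTH + 50.0]
```

Thirty-odd lines, plus the art and animation on top - and every hour spent on it is an hour not spent on bugs or gameplay.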
@WesCash: Exactly. It still doesn't guarantee success, though - if you look at the story of Looking Glass Studios (of "Thief" and "System Shock" fame), they were releasing games almost yearly (and different genres of games, not just 1 series - those were the days!) but still went under. Unfortunately, the model of releasing several smaller games - even if it's better than 1 huge game every 4 or 5 years - can compound your problems if you have a string of games that fail to make a profit. Looking Glass hung in there for a while, but they had several commercial failures and, just like that, one of the world's best developers was no more.
@Wheels2050: Indeed, that may be a possibility - but I think hardware costs play out differently for PCs than for consoles. With PCs, you only upgrade the key components on an established infrastructure, unlike games consoles, where you start development from scratch. That's why PCs always seem light years ahead at such a sustainable cost; you can't get that same huge technical leap on consoles without suffering financially - it takes quite a lot more research and development.
Although, I agree that the issue lies more in game development nowadays. Graphics are becoming too superficial an improvement and too much of a focus for developers, and the core quality and ingenuity of games are starting to suffer as a result. Graphics are certainly a substantial part of a game, but when they start taking priority over the experience itself, they become too much of a tactical diversion to cover up increasingly mediocre, underdeveloped or repetitive games.
We need more open-minded developers willing to fully realise new gameplay ideas on consoles, and that means development must remain open to developers of all classes in terms of cost, ease and fluidity. And that means less focus on technical upgrades, and more focus on opening up consoles to control innovation and accessibility.
"It was a gamble that paid off with Wii, in terms of the system out-selling its rivals during its lifespan" Not entirely though as it only outsold its rivals for the first half of its lifespan, for the second half of its lifespan its rivals outsold it. Partly due to the graphical disparity, it turned out to be the hare and tortoise scenario that many predicted (and who were smugly mocked by Wii fanboys for suggesting it at the time...)
But anyway, I think it's really nice to see low-budget "small" games getting so much attention these days and being seen as legitimate giants of the industry, rather than just as quaint retro novelties with the AAA blockbusters being seen as "real games". It would be a shame to see development costs and hunger for ultra-realistic graphics spiral so much out of control that only the mega-rich could make games anymore.
As long as there's a version with better graphics the vast majority of people will always ask 'emselves what the game would be like with 'em. The question is how much more they're able or willing to pay for it.
Personally I was fine with the graphics in most Wii games (especially SMG, Metroid Prime 3: Corruption, No More Heroes and parts of NSMBW). I really liked the price point of these games.
Wii U games will have to do more than give me better graphics to make me spend more than I'm used to atm. NSMBU seems to do it right with its special modes. I'd be willing to pay €10 more for it than I paid for NSMBW.
@GameLord08: Absolutely, I (deliberately) omitted R&D costs and the like, which drive up the cost to the manufacturer.
However, I don't see a good reason why consoles couldn't be made from off-the-shelf PC components. If, say, Sony partnered with AMD, they'd have ready access to CPUs and GPUs. A custom OS could make the most of the chips (as, like you say, consoles are required to do a lot less than a PC, so the OS could strip out a lot of extraneous memory usage etc.) and wring a lot of power out of them.
Having said that, I'm by no means an expert on console hardware, but it seems to me that console development is so closely tied to PC development these days that it would make sense to use off-the-shelf parts (perhaps with minor customisations) to avoid a big hit in R&D costs, which would allow for cheaper consoles (or more profit for the manufacturers).
@Wheels2050
Agreed, that is exactly what happened with Kingdoms of Amalur. It is a great game, but it didn't sell well enough for its huge budget, and the development teams were forced to shut down because of it.
@Wheels2050: That's actually an ingenious proposition. It'd effectively improve the performance and capabilities of consoles while keeping them significantly more cost-effective. The threshold for games consoles could be set even higher if manufacturers adopted a strategy like that. In addition, it could reduce the costs and trials of console development overall - building consoles from bespoke technology might stop being a requirement. Manufacturers could establish a basic infrastructure using standard off-the-shelf parts and upgrade it every cycle - off-the-shelf PC parts will have seen another drastic upgrade within the 5-6 year gap. Plus, as you say, it'd be easily customisable, which is another good incentive aside from costs.
Besides this, I'm not quite sure, but I think something similar to this is what Nintendo is said to have done with the Wii U. It uses a completely different infrastructure from the Wii as its base, but incorporates some of the Wii's old hardware. From what we know, the Wii U uses an IBM processor and an AMD graphics chip, but I'm still not too sure whether or not this is the case.
@bulby1994: Whoa, really? I got to review Kingdoms of Amalur around the time it was about to see release; the development teams seemed like really nice folk and it was a great game, especially in multiplayer. Sad to hear that.
Yeah, it's been apparent for a while that there needs to be a major change in gaming because of the absurd cost of some of these games and the fall of so many gaming companies. I mean, there's nothing necessarily wrong with super expensive HD-riffic games, but some of these developers and publishers need to take a step back from their latest super expensive HD-riffic game and look at the direction the industry is pretty obviously going. Because if they don't, at least a few of them won't be around in a decade.
Wii U will no doubt outsell the Nextbox and PS4. Those systems will be pricey, as will their games. There will be a lack of games for at least the first two years due to the higher investment and time needed to create software. As much as I'm waiting for the PS4, I fear it'll be another "Vita situation", where the system is highly priced and there's a lack of games on it due to a lack of developers interested in creating them. Wii U, on the other hand, will be an affordable system, and it'll have a huge amount of games by the time the 720/PS4 launches... The problem with this system is, of course, its performance. Wii U is not much of a next-gen machine in terms of power.
@GameLord08 I totally agree with you there. The threshold of graphics on a home system is being met due to the extreme costs of developing games such as CoD etc. I recently read an article from Crytek, who said they have the technology to move graphics to another level, but financially it's not viable for a few more years at least - it would make the next-gen Xbox or PlayStation at least 3x the cost it is now. The public just wouldn't accept a console in the region of £500-£600. Nintendo have been smart releasing their console now with HD capability, and the franchises that could come to the console will almost guarantee it outsells its competitors again. How many people grew up with games such as Metroid, Zelda, Mario, Pokémon and Donkey Kong? All these franchises in full HD could be amazing.
More people need to think like this guy.
I do think that the number of AAA titles will be reduced, although that isn't necessarily a bad thing, as creativity often stems from smaller-budget titles. I also believe that the Wii U will be able to compete with the next Xbox/PS4, because game development costs are already high for current HD systems, so publishers won't want to increase them further just for slightly improved graphics that only a minority of people will be able to appreciate.
Maybe another market crash is in order. Gaming is becoming an expensive hobby to maintain; maybe another dip would help push gaming back towards innovation instead of franchises.
Of course, I could point out the irony that he's helping turn Splinter Cell into the most stereotypical modern day AAA title possible, but he seems to be kind of aware of that so whatever.
God I hope so, I've been waiting for this shift for a long time.
Top end video games have gotten so repetitive and predictable, purely because no one can afford to take risks anymore.
If he's talking about his own game as well, I'm totally with this guy; if not, he's blind.
@DarkCoolEdge
YES! I played Limbo last week and was stunned by its presentation. From a graphical perspective, people should concentrate on making their games look distinctive instead of constantly trying to mirror real life and cinema. Although, to be fair, I did play a demo of Uncharted recently and would love to play more.
I honestly cannot see how the next gen can make games look significantly more real than FFXIII anyway.