I mean really—how much better can graphics get? The main issue going forward isn’t how lifelike a game looks, but how fun or engaging it actually is. The problem facing Sony and Microsoft isn’t that their games look bad—it’s that they’ve reached the top of the mountain, and there’s nowhere higher to climb. You can only polish realism so far before it stops feeling exciting.
That’s why I think Nintendo might quietly take the lead in the next console generation. The PS5 and Xbox Series systems have already shown players what “maxed-out” graphics look like, and many have grown bored with beautiful shells that lack something inside—the spark of fun, surprise, or creativity. Meanwhile, Nintendo’s always been behind in raw power, but that disadvantage has kept its focus where it matters: gameplay and imagination.
To borrow an analogy, if you’ve lived on bologna sandwiches and hot dogs (Wii, Wii U, Switch 1) and suddenly get a mid-tier meal—say Golden Corral steak and mashed potatoes (Switch 2)—it tastes incredible. Sure, it’s not filet mignon or lobster risotto (PS5), but to someone used to simpler fare, that first real steak is magic.
I’m not talking about console life spans so much as the natural growth curve. It takes three to five years for developers to truly master a new system. By 2030, Switch 2 titles could reach late-PS4-tier visuals like Spider-Man or Wukong, and for Nintendo fans, that’ll feel amazing. Even knowing finer dining exists, we’ll savor that mid-tier steak because it’s seasoned with fun.
Of course, Nintendo will eventually hit the same wall that Sony and Microsoft are hitting now. Maybe ten years from now the Switch 3 will start out matching low-end PS5 quality and, by its twilight years, reach the high end. But when the Switch 4 arrives twenty years from now, I’ll probably be asking the same thing I’m asking today: Really, how much better can these games look? That’s when we’ll remember that what keeps us playing isn’t how real it looks—it’s how it makes us feel.
And from an outsider’s perspective, I already see Nintendo starting to walk the same path Sony, Microsoft, and big developers like EA and Bethesda took years ago: rising budgets, longer waits between tent-pole titles, and higher prices. Metroid Prime 4 has been nearly seven years in the making; Animal Crossing and 3D Mario fans have waited five to eight. Even Donkey Kong is approaching a decade gap. And now Nintendo’s joined the $70-base-game club—finally first at something in the console wars.
So yes, Nintendo’s catching up—but maybe not in the way we hoped.
Once we Nintendo gamers finally reach the mountaintop where Sony and Microsoft stand, will we really be satisfied with a gorgeous wrapper around a barely-above-boring product?
Nintendo doesn't have to catch up to either PlayStation or Xbox, since Nintendo has been leading in the console space for the past several years, and its path is set for the next several years with the Switch 2. Instead, Nintendo will continue to thrive as the Fisher-Price of video games while PlayStation fans decide whether they want to spend $1,000 on a PS6 a couple of years from now, and Xbox fans decide whether they want to spend $1,200 on their next home console.
Nintendo isn't out here throwing money on the fire, chasing the live service trend as Sony has been; it'll be hilarious when Marathon is dead on arrival like Concord was. Meanwhile, Microsoft leadership is asking the Xbox division for +30% profit margins...or else. I'd say their next generation of console will be their last.
Switch Physical Collection - 1,536 games (as of December 14th, 2025)
Switch 2 Physical Collection - 4 games (as of December 8th, 2025)
I mean, I don't disagree, but you've put a more dramatic spin on my own opinions, which are honestly more extreme (among other things, I'd be fine if nearly zero games looked better than Mario Kart 8, and I think Pokemon is wasting its time on modern 3D graphics and game design when its games looked better on the DS). This is too dramatic to be worth saying on an internet forum of all places, but too boringly obvious to be interesting enough for its own thread.
Honestly, I'm kind of relieved that the new Fire Emblem and Mario Tennis just looked like enhanced versions of Switch games, instead of the teams being morons who waste a year making textures look 2% better or whatever. It makes me think Nintendo will be fine.
There's nothing to worry about. Despite the many changes Nintendo has gone through over the years, one thing has remained consistent, and that's Gunpei Yokoi's philosophy of 'lateral thinking with withered technology'.
Think of the Game Boy with its colourless screen but plentiful battery life; the DS had subpar graphics compared with the PSP but offered two screens; and the Switch 2 uses optical sensors, which have been around for eons, for mouse controls.
Point is, graphics haven't been a priority for Nintendo consoles for nearly two decades now and even longer for handhelds.
The Switch 2 was an exception, as the form factor stayed mostly the same, and even then, it's not like it's competing with the PS5 and Xbox Series; far from it. And as successful as the 'Switch family of systems' has been, you just know Nintendo's R&D team is constantly throwing things at the wall to see what sticks, and the 'Switch 3' may well be a totally different beast, but that's a separate conversation. But for me, it's that unique playfulness that keeps Nintendo going regardless of what the rest of the industry is doing.
Maybe I'm missing the point, but I'm just not worried about this for Nintendo. I don't see them focusing on photorealistic graphics as a priority for most first party games.
That focus seems misplaced to my mind. I'm sure it markets well, but it balloons development timelines like nothing else for extremely diminishing returns.
For me, Nintendo offers the filet mignon, not Sony. Breath of the Wild is a better game than anything on the PS5, and it has nothing to do with graphical fidelity.
I've supp'd for months on naught but broth of bean
I’m not trying to be a downer or, as someone above put it, a “Ninten-doomerist.” I’m not speaking ill of Nintendo, Sony, or Microsoft; I’m describing the logical trend we’re seeing play out in the console gaming environment.
My point is just that, as hardware and chipsets keep improving, Nintendo will eventually reach the same pinnacle of graphics, frame-rate, and processing power that the PS5 and Xbox have now — whether that takes 10, 20, or 30 years.
Picture this: it’s 2045. Your kid or grandkid has a Switch 3. They love it — it looks great, plays great, classic Nintendo fun. Then one June morning a Nintendo Direct drops, revealing the Switch 4. The graphics? Just a tiny bit better. The price? $1000-plus.
That future gamer might ask the same question I’m putting forth here, and that some may already be asking about the PS6 today: is the minor improvement worth the huge price tag?
That’s not doomer talk; it’s just wondering what happens when “better” finally stops being enough.
What happens when the consumer isn’t willing to pay $1,000–$1,500 for maybe a 10% improvement at most?
In my humble opinion, this seems like a lot of conjecture based purely on assumptions that have little to no evidence to support them. The fact of the matter is we don't know what the next 20 years will look like. Think back to what we had 20 years ago tech-wise: no iPhone, no hybrid consoles, not even the Wii existed yet. There are so many factors (technological, economic, political, societal, etc.) that can radically shift the direction of not just Nintendo, but the entire industry. I'm not sure why you assume Nintendo is going to ride the Switch through to a fourth iteration, nor do I understand why you believe Nintendo would break from its core philosophy of deeply fun and engaging video games.
All this to say, I don't think you should stress about Nintendo going stale or overpriced in 20 years, especially when there isn't even much evidence to say Nintendo would become crummy by 2045. Gaming is supposed to be a fun hobby that helps us relieve stress, not give us more.
"well it appears I am upside down. what ever will I do?"
@AppleroseGrace The problem here (from my view, anyway) is that the point you're making essentially just boils down to another "Greedtendo is getting more powerful tech, which means things will be more stagnant now" take in the most melodramatic form possible. There are some legitimate arguments to be made here, especially in regards to redefining the standard for quality graphics, but if you're gonna frame them like this, then you can't really expect folks to take your point entirely seriously. I know I don't.
I can kinda see the point with the PS5. Unlike previous generations, games haven't really caught up to take full advantage of the technology that's already available on that console. If you look at early PS1 games compared to late PS1 games, they seem a world apart - and the same is true for PS2 and PS3.
When it comes to graphics though, PS4 to PS5 was probably the least significant generational leap ever for PlayStation. I haven't played many PS5 games that feel substantially different from the best looking PS4 games. I've heard people refer to the PS4 and PS5 together as one big super generation, and it's kinda right... PS5 is basically a PS4 Pro Ultra. The most significant upgrade was storage - moving to the newest SSD technology over the PS4's HDD basically eliminated loading screens for PS5 games. That and maybe being able to play games in 4k at a reasonable framerate (but uhh, I still don't have a 4k TV so)
When it comes to the Switch however, there was certainly plenty of room to grow performance wise. The Switch 2 gave us that, providing a true generational leap and giving developers a lot more breathing room.
Past that, who knows what the future holds? If they stick with the portable thing, I think there's still ample opportunity for mobile tech to get much more powerful and more efficient. Just look at what Apple has been doing with their M-series silicon.
What does this mean for the games? For Nintendo, I think we've already seen a taste of that. DK Bananza wouldn't have been possible on Switch 1 hardware. I think Nintendo will stay creative and continue to use the hardware to the full extent they can, like they always have (more or less).
Honestly, I feel like there's a non-zero chance Nintendo's the one that finally figures out how to get casuals to buy VR by making a version of it that's reasonable and cheap and has significant enough exclusives.
But they would absolutely only go in that direction after everyone else has given up on it (which outside of Oculus it kinda feels like they have).
It's honestly gonna feel weird if Nintendo makes an iterative system for the 2nd time in a row. That's literally never happened, unless you count...the Game Boy Color sitting between the original Game Boy and the Advance.
When you look back at the history of gaming, one pattern becomes impossible to ignore: the more powerful our consoles become, the smaller the visible leaps in what we actually experience. Economists call this the law of diminishing returns—each new investment produces smaller and smaller gains. The same principle applies to video game hardware. Every new console generation still costs more to design, build, and develop for, but the improvements in how a game feels or looks are shrinking to the point that only the most dedicated observers notice.
In the beginning, each generation was nothing short of revolutionary. The Atari 2600 arrived in the United States in 1977 and practically invented home gaming. Eight years later, the NES launched in October 1985 and redefined what video games could be with colorful sprites, smooth scrolling, and memorable characters. The jump to the Super Nintendo in 1991 and the Sega Genesis in 1989 brought 16-bit processing, rich colors, and even early 3D illusions like Mode 7. Players didn’t need side-by-side comparisons to see the difference—one glance at Super Mario World versus Super Mario Bros. said it all.
Then came the true 3D revolution. In 1996, the Nintendo 64 made analog control and freely moving cameras feel magical. A few years later, the Sega Dreamcast (1999) and PlayStation 2 (2000) pushed that 3D world into the realm of cinematic storytelling, while the GameCube and original Xbox (both 2001) made real-time lighting and texture detail commonplace. These machines still produced excitement that you could feel immediately, but for the first time the budgets began to balloon. Developers were now spending exponentially more to chase each new layer of realism.
By the mid-2000s, the HD generation arrived. The Xbox 360 (2005) and PlayStation 3 (2006) introduced high-definition graphics, complex physics, and movie-quality cutscenes. Games looked incredible compared with the PS2 era, but the trade-off was enormous: longer development cycles, larger teams, and rising retail prices. Nintendo’s Wii, released the same year as the PS3, quietly opted out of that arms race, betting instead on motion controls and a lower price. Ironically, it became the best-selling console of the generation—not because it looked better, but because it felt different. That marked the moment where raw power stopped guaranteeing success.
From late 2012 onward, the graphical curve began to flatten completely. The PlayStation 4, Xbox One, and Wii U all delivered smoother frame rates, higher resolutions, and better lighting, yet the visible difference between a PS4 game and its eventual PS5 upgrade is often imperceptible to the average player. The Nintendo Switch, released in 2017, didn’t try to compete at all—it succeeded by prioritizing portability and creativity over fidelity. We now live in an era where studios spend hundreds of millions of dollars for an improvement of maybe ten or fifteen percent in how a game looks or performs.
Today’s consoles—the PlayStation 5 and Xbox Series X/S—are technological marvels capable of 4K, 120 frames per second, and instantaneous loading. But how often does that translate into new experiences? Most players describe games as “beautiful” rather than “groundbreaking.” The Switch 2, released in mid-2025, achieves what once seemed impossible: PS4-quality visuals on a handheld. That’s a triumph for Nintendo—but it also means they’re entering the same territory Sony and Microsoft have occupied for years, where each jump in hardware power delivers smaller emotional payoff for greater cost.
If you charted this progress like an economic graph, the early decades of gaming would be a steep upward climb. Every generation—from Atari to NES to SNES to N64—was a revolution. Starting with the PS2 and Dreamcast era, the slope began to level. By the HD years of the PS3 and Xbox 360, it flattened further. Now, in the age of PS5 and the coming Switch 2, we’re nearly horizontal. The returns haven’t stopped; they’ve just shrunk to the point of invisibility. The miracle that once defined each new system has become a whisper.
Nintendo’s long-time strength has been avoiding this trap. Instead of chasing the sharpest graphics, they’ve always prioritized lateral thinking with withered technology—finding new ways to play with existing hardware. But as their machines grow closer to Sony and Microsoft in sheer capability, they too will face the same law of diminishing returns. The challenge ahead isn’t how realistic Mario’s hat looks under ray-traced sunlight—it’s how to keep the experience of discovery and joy feeling fresh when visual perfection has already been reached.
In short, the first 30 years of gaming were a climb up the mountain of possibility. The last 20 have been a slow walk across the summit, where the view changes only slightly no matter how far we travel. The next great revolution in gaming won’t come from more teraflops or frame rates—it’ll come from imagination, design, and the willingness to surprise players again. That’s where Nintendo has always excelled, and where the entire industry must return once the graphics arms race finally runs out of oxygen.
And now I'll explain how I tie the economic law of diminishing returns to home video game console history.
When you follow the history of video game hardware, one thing becomes clear: every leap in graphics, processing power, and technical sophistication has delivered smaller and smaller perceptible improvements to the player while costing more to make and to buy. Economists call this the law of diminishing returns—each new investment yields a smaller reward. The same principle perfectly describes the modern console market.
In the early days, every new system felt like a miracle, and each upgrade came with a reasonable price tag. The Atari 2600, launched in the U.S. on September 11, 1977, for $199 (roughly $1,000 when adjusted for inflation), introduced home gaming to the masses. The NES followed on October 18, 1985, for $199 and transformed the industry from simple beeps and blocks to colorful scrolling worlds full of personality. When the Super Nintendo released on August 23, 1991, for $199, and the Sega Genesis on August 14, 1989, for $189, the jump to 16-bit graphics brought richer color, smoother animation, and music that felt alive. Each new console was visibly better than the last; players didn’t need a side-by-side comparison to see the revolution happening on their screens.
By the mid-1990s, 3D arrived, and the excitement reached its peak. The Nintendo 64 launched on September 29, 1996, for $199, while the PlayStation 1 debuted a year earlier on September 9, 1995, for $299. Sega’s last great console, the Dreamcast, appeared on September 9, 1999, at $199. The difference between 16-bit sprites and true 3D worlds was night and day—new genres were born overnight. These were still affordable revolutions; the price-to-experience ratio felt fair.
Then came the early-2000s boom, when gaming started to mirror Hollywood in scope and expense. The PlayStation 2 released on October 26, 2000, for $299, the GameCube on November 18, 2001, for $199, and Microsoft’s first Xbox on November 15, 2001, for $299. Players saw real-time lighting, smoother textures, and cinematic storytelling—but the budgets behind them began to swell dramatically. The leaps were still visible, but for the first time developers were spending exponentially more to achieve them.
The HD generation—Xbox 360 (November 22, 2005, $299–$399), PlayStation 3 (November 17, 2006, $499–$599), and Nintendo Wii (November 19, 2006, $249)—ushered in gorgeous high-definition worlds, but also the first signs of creative strain. Development costs exploded; teams doubled and tripled in size. Sony and Microsoft pushed power and realism, while Nintendo won the generation by moving sideways instead of forward, betting on motion controls and approachability. The Wii proved that innovation could still trump horsepower.
From 2013 to 2017, the incremental era began. The PlayStation 4 launched on November 15, 2013, for $399, the Xbox One a week later for $499, and the Wii U had already arrived in 2012 for $299. Graphical improvements existed—higher frame rates, sharper lighting—but the leap no longer felt magical. Four years later, Nintendo released the hybrid Switch on March 3, 2017, for $299, prioritizing portability and creativity over sheer power. It succeeded because it felt new, not because it looked new.
Today’s machines—the PlayStation 5 (November 12, 2020, $499), the Xbox Series X (November 10, 2020, $499), the smaller Series S ($299), and the Switch 2 (mid-2025, $449)—represent the technological peak of the home console model. They deliver 4K resolution, 120-frame performance, ray tracing, and lightning-fast load times. Yet to most players, the difference between PS4 and PS5 gameplay is almost imperceptible without a technical breakdown. Studios now spend hundreds of millions of dollars and five or more years of work to produce an improvement most consumers can only describe as “smoother.”
And waiting just over the horizon, industry analysts expect the PlayStation 6, likely arriving around 2026 or 2027, to carry an MSRP of roughly $999—a thousand dollars for a console that will, by all projections, offer perhaps a 10–20 percent improvement in visual fidelity and performance. That number perfectly illustrates the law of diminishing returns: a staggering rise in cost for a shrinking sense of wonder.
Over nearly fifty years, the price of entry into each new generation has crept from two hundred dollars to a thousand, while the perceptible improvement has shrunk from revolutionary to marginal. In the 1980s and 1990s, a new console meant a whole new way to play; today it usually means prettier reflections and faster load screens. Nintendo has long avoided that trap by focusing on how we play instead of how real it looks, but as its hardware approaches the same power levels as Sony and Microsoft, even Nintendo will eventually face the same ceiling.
The lesson is simple: technology can only impress us for so long before our eyes and our imaginations catch up. The first thirty years of gaming were a rocket climb toward possibility. The last twenty have been a slow walk across the summit, where each step costs more but changes less. The next true revolution won’t come from more teraflops or frame rates—it’ll come from ideas. That’s the realm Nintendo has always ruled best, and the one the entire industry will need to rediscover once the hardware arms race finally stops giving us more for our money.
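If you want to sanity-check the price side of that argument, here is a rough Python sketch of my own. The nominal prices are the ones quoted above; the inflation multipliers to 2025 dollars are rounded assumptions of mine, so treat the output as ballpark figures, not official numbers.

```python
# Rough, illustrative comparison of nominal console launch prices versus
# approximate 2025-dollar equivalents. The multipliers below are rounded
# assumptions, not official CPI figures.

LAUNCHES = [
    # (console, U.S. launch year, nominal launch price in dollars)
    ("Atari 2600",     1977, 199),
    ("NES",            1985, 199),
    ("Super Nintendo", 1991, 199),
    ("Nintendo 64",    1996, 199),
    ("PlayStation 2",  2000, 299),
    ("Xbox 360",       2005, 399),
    ("PlayStation 4",  2013, 399),
    ("Switch",         2017, 299),
    ("PlayStation 5",  2020, 499),
    ("Switch 2",       2025, 449),
]

# Approximate cumulative inflation multipliers to 2025 dollars (assumed, rounded).
TO_2025_DOLLARS = {
    1977: 5.2, 1985: 2.9, 1991: 2.3, 1996: 2.0, 2000: 1.85,
    2005: 1.6, 2013: 1.35, 2017: 1.3, 2020: 1.25, 2025: 1.0,
}

for name, year, nominal in LAUNCHES:
    adjusted = nominal * TO_2025_DOLLARS[year]
    print(f"{name:<15} {year}  ${nominal:>3} nominal  ~${adjusted:,.0f} in 2025 dollars")
```

Run it and you can see what I noted with the Atari 2600: in real terms, the earliest consoles weren't cheap either. The difference is that a 1977 or 1985 buyer got a revolution for that money, while a $999 PS6 would be buying a refinement.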
I struggle to understand how this is problematic for Nintendo. You literally acknowledge that Nintendo rules the realm of ideas, and as such, they will weather this crisis by just continuing to do what they have always done: prioritize innovation over graphics.
"well it appears I am upside down. what ever will I do?"
@AppleroseGrace "Nintendo has long avoided that trap by focusing on how we play instead of how real it looks, but as its hardware approaches the same power levels as Sony and Microsoft, even Nintendo will eventually face the same ceiling."
Except they haven't, and they probably still won't, mostly because even with the new, more powerful hardware, their current titles have prioritized strong art direction over technical fidelity. MKWorld, DK Bananza, and Kirby Air Riders aren't these fancy RTX-riddled titles, because the teams behind those games recognize that the tools don't exist just to make stuff look shiny and overpolished.
Strongly agreed. I think we're past the point where graphics are a reasonable basis for upgrading hardware; the graphics feel like they're doing nothing at this point. (And there are good reasons why diminishing returns apply to graphics: when you look at improvements in areas such as pixel count, polygon count, colors, shading, etc., it all gets less noticeable as it goes on.)
My opinion is that the focus should shift towards improving controls over graphics. They should be experimenting with new control schemes and technologies that expand the range of actions available to players. VR, AR, dual screens: those feel like they offer far more potential to innovate gaming than continuing to push resolution and frame rate further and further.
@N00BiSH AFAIK it seems to be a problem with chasing higher and higher resolution. Higher-resolution art takes more and more time to create, and that seems to be extending development time the most. Development times have certainly gotten longer, Nintendo's admitted as much, and game output seems to have decreased further during the Switch 2 generation, with multiple series taking 5-8 years between new entries. Nintendo may be lagging behind, but they're still progressing in this direction and it's becoming unsustainable. What's going to happen, say, three generations from now, when they're developing 16K graphics that take 10-12 years of development time? When does it end? Short of a technology emerging that somehow makes graphical development easier and less time-consuming, there has to be a breaking point here.
Development times have certainly gotten longer, Nintendo's admitted as much, and game output seems to have decreased further during the Switch 2 generation, with multiple series taking 5-8 years between new entries.
How much of that time is actually being taken solely to work on the visuals? How do we know it's the chase for graphics that's causing the longer development periods, and not meticulous refinement of mechanics and game design, which has always been at the forefront of Nintendo's modus operandi? I won't deny that visual development for games is always going to take time, but acting like Nintendo titles are taking longer mainly because they're pushing for higher resolutions and frame rates is just not an accurate assessment at all.