[....] I know all these technical details because I did a course on game development.
Not meaning to be rude but I don't really need to read the comments on the next page or what you said before this. A course on game development doesn't mean you're across the technical details.
Hell, I'm a developer, I have a CS degree. Most of the people I work with have a CS degree. A CS degree covers the actual deep technical stuff that may or may not be included under the umbrella of "game developer". The day-to-day work that I and most of my colleagues do is whiteboards-and-arrows, hour-long-discussions type stuff
But there are people I work with who have no idea about hardware. They wouldn't have a clue. I mean they know the fundamentals obviously and they know how software works to a depth well beyond the average person on the street. But ask them to go laptop shopping and all they could tell you was "bigger number better". They don't need to know, all they need to know is how to put the logic puzzle pieces together to meet the requirement and, hopefully, in a way that's efficient and easy to follow
And that's with people doing the equivalent of the work that exists on the "hard" side of game development. There's also a "soft" side of game development which is where most game development actually lives. All of which is even more abstracted away from the hardware itself than things like doing net-code or doing the background manipulation stuff they did in the NES/GB days
Saying "I know the technical details, I did a course on game development" is basically like saying "I know the efficacy of vaccines, I'm an occupational therapist". I mean you might, and frankly probably do if that's your general area of interest and something you are likely to dive into. But if you do, your knowledge is mostly incidental
..... also just generally, you shouldn't really throw around qualifications in a discussion like this unless it's actually relevant. I know I throw around mine occasionally but I hope it's only when it actually makes sense. Like when people were talking about the state of the newest Pokemon games or when people discuss things like the "update model" games have these days. My job I think gives me a different perspective on those things. But when we're talking about things well outside the scope? Things I may be super into but which I have no particular insight other than "I'm generally interested in this stuff"? I don't bring it up and I shouldn't bring it up
This hardware speculation game is DEFINITELY one of those things outside of the scope of my work and degree. If I make sense at all on these threads when discussing hardware it's because I like to read about hardware in my spare time. At most my qualifications/work could maybe give me a very slight sliver of insight into how they might want to handle the transition from a software perspective. i.e. the ideal is something like this: BotW on old hardware:
Game: Hey hardware, BotW here, what are you?
Hardware: We're docked and connected to a 1080p screen
Game: Cool. Can you give me 720p 30fps?
Hardware: Sure, 720p 30fps it is
BotW on new hardware:
Game: Hey hardware, BotW here, what are you?
Hardware: BotW? Ok. We're docked and connected to a 1080p screen
Game: Can you give me 720p 30fps?
Hardware: BotW? Let me see. Let's do 1080p and we'll push that to 1440p with DLSS. We have VRR, and at 1440p we can do 20-120Hz, so let's cap you at 120Hz and see how we go
TotK on old hardware:
Game: Hey hardware, what are you?
Hardware: We're docked and connected to a 1080p screen
Game: Cool. Can you give me 720p 30fps?
Hardware: Sure, 720p 30fps it is
TotK on new hardware:
Game: Hey hardware, TotK here, what are you?
Hardware: TotK? Ok. We're docked and have a 4K display with HDR. We do have VRR with a range of 20-120Hz and we can do that at 1080p, 1440p or 4K
Game: Ok. Can we render at 1080p and push it to 1440p with DLSS? Also we can do some HDR. Since we have VRR let's just see how we go with that, I think we can hit around 30fps
Hardware: Sounds good, let's do that
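The dialogue above can be sketched as code. This is purely illustrative: nothing about Nintendo's actual OS/driver interface is known, so every name here (the profile table, `negotiate`, the field names) is made up. The idea is just that the new hardware could ship a curated table of per-title overrides, and any title not in the table gets the stock answer it got on old hardware.

```python
# Hypothetical sketch of per-title render-profile negotiation on
# backward-compatible hardware. All identifiers are invented for illustration.

DEFAULT = {"render_res": (1280, 720), "target_fps": 30, "dlss": False}

# Curated overrides the new hardware knows about for specific titles.
OVERRIDES = {
    "BotW": {"render_res": (1920, 1080), "target_fps": 120, "dlss": True},
    "TotK": {"render_res": (1920, 1080), "target_fps": 30, "dlss": True},
}

def negotiate(title: str, new_hardware: bool) -> dict:
    """Return the render profile the 'hardware' hands back to the game."""
    if new_hardware and title in OVERRIDES:
        return OVERRIDES[title]
    # Old hardware, or a title with no override, gets the stock profile.
    return DEFAULT

print(negotiate("BotW", new_hardware=False))  # stock 720p/30 profile
print(negotiate("BotW", new_hardware=True))   # boosted per-title profile
```

The point of the sketch is that the game's side of the conversation never has to change; the hardware side answers differently per title.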
Which is why I'm team "Switch Pro" and really hate the concept of a "traditional successor"
@Cotillion
No you wouldn't, but Nintendo likes to keep things easy for consumers. I don't see it happening. Maybe, but I'm not expecting it.
Even the Steam Deck doesn't have an extra NVMe slot, and it's twice as large and a PC. I think room is precious, and there's just not enough room in a small form factor. Certainly not for full 2280 sticks. It would have to be 2230, which are the most expensive and hardest to find: far less common than microSD, and not enough exist to sustain 100+ million consumers like the 2280s. And I think most common consumers and kids aren't familiar with where to buy them or how to install them. That's changing, with the PS5 implementing it, but Nintendo typically sticks with existing mainstream tech until it gets cheaper and enough consumers are familiar with it.
@Giancarlothomaz the only information we have about the Switch successor is its chipset; we don't know if Nintendo is going to continue the hybrid concept of the Switch with its successor
You're not listening. You're hearing me, but you're not listening. It's going in one ear and out the other. We went through this already in the last discussion. And I really hate repeating myself.
We DO know it's hybrid BECAUSE of the chip. The chip is a MOBILE CHIP. You only use a MOBILE CHIP if you're making another PORTABLE HYBRID SYSTEM.
Please, let this be the last time I have to say that.
An OLED display is still expensive to use on a portable console... it's going to raise production costs and the price Nintendo charges for it substantially. Do you really want to pay $500 for a portable console?
Everything you just said... is wrong.
I won't waste my time explaining it. The onus is on you to educate yourself. But I will make it easy and drop a link.
edit
On second thought, I don't want there to be any misunderstanding on this, so I'll just cite the article
“The new Switch’s 7-inch OLED display from Samsung Display Co. costs an additional $3 to $5 per unit”
Ya. I'm "really worried" about the system "costing $500" because of that extra $3 spent on the screen /s
The reason it's hard to have conversations with you (aside from the whole not listening thing), and I say this with all due respect, is you don't have a clue what you're talking about but you think you know everything.
Psalms 22:16 (1,000 yrs before Christ)
They pierced My hands and feet
Isaiah 53:5 (700 yrs before Christ)
He was pierced for our transgressions
Zachariah 12:10 (500 yrs before Christ)
They will look on Me whom they pierced
@JaxonH I play on low light levels because of my eyes, and on low brightness levels part of the SWOLED screen turns a tint of green on dark colors.
I got an OLED this summer and the first one I got was faulty: one half of the screen was brighter than the other. I exchanged that OLED for a new one, which was otherwise fine, but the green tint annoyed me (and my eyes), so I exchanged that one for a V2 and I've had minimal problems since.
I wrote the above rant without reading the two pages that followed it, so I have a few things to add. Again, nothing to do with my qualifications or anything; this is all just me generally knowing crap
OLED I think is a given, OLED screens are not a huge cost anymore, especially not if they're going to land around the price point of the Switch OLED. I think what happens at launch is that we get a price cut on the OLED and the regular Switch. Then relatively soon after launch they phase out the Switch OLED from sale. I don't see the need for the Switch OLED to exist on shelves if there's a more premium SKU, so the more premium SKU needs to be better in every aspect including screen. In the medium to long term I'd think we'd have an OLED (new SoC), a mini-OLED (new SoC but maybe pared back) and a mini (same as current mini)
For storage? I think the distinction we're making between SSDs and eMMC is a tad too pedantic. The line between the two is super fuzzy. I mean fundamentally what we care about is performance and capacity. You can get some fast and relatively high capacity eMMC and some pretty slow, janky and low capacity SSDs. My gut feeling says the cost has dropped to the point where "faster than SATA" speeds are viable. I'm swinging for the fences and saying non-user replaceable, 128GB, NVMe, PCIe Gen 4 but nowhere near pushing the limits of it, no DRAM Cache. The microSD card slot I don't think is going away, certainly not being replaced by an upgradable m.2 SSD. It could work but that'd be... very not-Nintendo
The SoC itself I think others have dug into it far more than I have. Whether or not those deeper diggings get them closer to the truth I'm not entirely sure. But what we can be relatively confident of is that it's Tegra, it's Ampere and it's 8 cores. The lowest spec SoC in that family is basically identical to the existing Switch on the GPU side except it has Tensor cores for DLSS (ish), a significantly lower TDP and a bit more CPU horsepower. If you go a couple of steps up to something with an 8 core CPU? You get a similar TDP to the current Switch but just under 4x the raw power and 4x the memory bandwidth. Plus other cool stuff like HDMI 2.1
I expect it'll be somewhere between those two, probably more like the higher powered of those two examples but underclocked. So somewhere around 2-4x the current Switch, maybe 3-4x the memory bandwidth. Remembering this is a custom SoC so it's not going to be either of the above, the above is just what can be done with Tegra and what products actually exist already. With that said the significantly lower TDP SoC I mentioned above? Something like that could be interesting for a future "Switch Nano" or something. Original 3DS sized, GBA-SP style Switch anyone?
More broadly I think a lot of what will make new hardware different will be the transition from HDMI 1.4 -> HDMI 2.1. There are a lot of goodies we'll get from that which aren't directly related to "more power go brr". We're talking VRR, HDR more generally including Dolby Vision/HDR10+, and 120Hz at 4K (up from 30Hz). Do I expect it to take full advantage of all of those things? Well no, obviously. But it will take advantage of some of them, and its ability to do so will be what differentiates it from the current Switch
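The "4K at 30Hz vs 120Hz" point is just a bandwidth budget. A back-of-envelope check (this deliberately ignores blanking intervals, chroma subsampling and DSC compression, so real link requirements are somewhat higher, but the rough ordering holds):

```python
# Rough uncompressed video bandwidth: width x height x refresh x bits-per-pixel.
# 24 bits/pixel = 8-bit RGB. Blanking/overhead is ignored, so these are floors.

def raw_gbps(width: int, height: int, fps: int, bits_per_pixel: int = 24) -> float:
    return width * height * fps * bits_per_pixel / 1e9

print(f"4K @ 30Hz:  {raw_gbps(3840, 2160, 30):.1f} Gbps")   # ~6 Gbps
print(f"4K @ 120Hz: {raw_gbps(3840, 2160, 120):.1f} Gbps")  # ~24 Gbps

# HDMI 1.4 carries roughly 8 Gbps of video data; HDMI 2.1 goes up to 48 Gbps.
# So 4K fits at 30Hz on the former, while 4K120 needs the latter.
```

The nominal link-rate figures (8-ish and 48 Gbps) are the commonly quoted ones for HDMI 1.4 and 2.1; treat them as ballpark, not spec-lawyer numbers.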
@JaxonH it only cost Nintendo about $10 more to use an OLED display on the Switch OLED (which is surprising information, considering the PS Vita used an OLED display and was a very expensive console).
@Kermit1Pineapple
Ya I remember that, but given the vast majority of ppl don't experience that issue I'm inclined to think you're just really unlucky or it's where you bought it from. That's not really an inherent "feature" of OLED as mine is not like that and neither is anyone else's here that I'm aware of. But then again, most ppl don't play on ultra low brightness either so you could just be an outlier use case.
@Giancarlothomaz
OLED prices have come down drastically in the last few years. As more and more devices switch to OLED and it becomes industry norm, economies of scale make it cheaper to produce than legacy technology such as LCD. Hence why Nintendo made the OLED Switch in the first place- because prices had dropped so much it was an easy and cheap way to offer a higher quality model for consumers.
@skywake
Agree with most of what you've written, but I don't think they'll phase out the OLED. In fact, I think they'll phase out the non-OLED and simply drop price on OLED making it the standard model.
But, I could be wrong. They could do exactly as you predict. Who knows.
@Giancarlothomaz
OLED screens have become a lot cheaper, especially smaller sized screens. The thing about making screens, and really any product of that type, is that if there's a defect you have to throw out the entire thing. Smaller sizes mean defects have a smaller impact on yield. E.g. if you're making 80" TVs, a 50% defect rate per square metre means ~75% of your panels have a defect. That same defect rate for a 5" screen? It's under 1%. That defect rate is more or less where the cost comes from. (edit: this is also why all smartwatches have OLED screens now, all but the most budget phones have OLED screens and OLED TVs ~55" are cost competitive... but 65"+ has a huge premium and 80"+ is basically LCD only)
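For anyone who wants to see where those percentages come from, here's the yield model behind them, as I understand the post's arithmetic. Assumption: a "50% defect rate per square metre" means each square metre of panel independently survives with probability 0.5, so a panel of area A m² survives with probability 0.5^A (the post's "75%" figure rounds an 80" panel up to about 2 m²):

```python
import math

def panel_area_m2(diagonal_inches: float, aspect=(16, 9)) -> float:
    """Area in square metres of a 16:9 panel with the given diagonal."""
    d = diagonal_inches * 0.0254          # inches -> metres
    w, h = aspect
    scale = d / math.hypot(w, h)          # metres per aspect-ratio unit
    return (w * scale) * (h * scale)

for inches in (80, 5):
    area = panel_area_m2(inches)
    defect_prob = 1 - 0.5 ** area         # chance the panel has >=1 defect
    print(f'{inches}" panel: {area:.3f} m^2, ~{defect_prob:.1%} defective')
```

An 80" panel comes out around 1.76 m², giving roughly a 70% defect chance at that rate, while a 5" panel is well under 1%, which is the whole point about small panels and yield.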
Even without knowing the actual cost of OLED panels (I assume what @JaxonH said is about right): when the Vita launched they weren't really selling OLED TVs to consumers at any significant volume. The yields simply weren't high enough for it to make sense. They did say they would start selling them in 2012 but that didn't really happen; the expected price at the time was ~$12,000AU for a 55". 2015 was around when they started to actually sell them, and at that point it was ~$5,500AU for a 55". You can get a 55" for ~$1,900AU now. Which basically tells us that the defect rate has dropped dramatically
Which is why you can't bring the Vita up in a discussion about the cost of OLED in 2022. It's an entirely different landscape now. From what I understand we're actually getting close to the point where OLED will start to actually become cheaper than LCD due to the lower complexity and reduced materials required. Especially at smaller sizes
@JaxonH, I think in the medium to long term you're probably right that there are no LCD models. But I think in the short term the OLED SKU doesn't make a whole lot of sense when there is a more premium SKU. Of course this is like meta-speculation, but my thinking is:
Current: OLED, OG, Mini
2023: Pro (with OLED), OG, Mini
2024: Pro (with OLED), Mini, Mini OLED (revised SoC)
2025: Pro (with OLED), Mini OLED (revised SoC), Nano OLED (revised SoC)
Another piece of the puzzle on the hardware front is Dying Light 2. Cloud version was announced in September 2021 for release in February 2022 and it got delayed but not cancelled. Maybe a native version for more powerful Switch hardware is happening? Also Square Enix said not to discount the possibility of native ports of KH in the future.
It may be that the new hardware opens up a situation where the base Switch gets cloud versions while the more powerful Switch gets native ports.
@Grumblevolcano Would Nintendo have told any developers outside their own about new hardware yet? I understand they didn’t send out dev kits for the Switch until only months before it launched? It could be that the developers of Dying Light 2 decided that the sales in cloud games on the Switch wouldn’t be high enough to justify the expense.
NEW WEBSITE LAUNCHED! Regular opinion articles, retro game reviews and impression pieces on new games! ENGAGE VG: EngageVG.com
@skywake I don’t know anything about hardware, so I just have a question about the Tegra Ampere chip. Is that a totally different chipset to the T239 that others have mentioned? I’m sorry if it’s a stupid question, I’m just trying to figure things out as there are different suggestions about what the next hardware is going to use.
I’m enjoying Chained Echoes, and still remain in awe of it at times when I consider it is a one-man game for the most part, but blimey is it hard. I saw some reviews warning of difficulty spikes, and maybe I’ll find those spikes, but so far even the baseline feels really, really tough. This is kinda like an SMT game where even regular enemies can absolutely crush your team if you aren’t perfect in your strategy, plan and execution. Thankfully they include accessibility options which retune some of this stuff; I think the game would border on unplayable for me without these options.
@FragRed
I know you didn't direct your question to me, but I think I can bring some clarification. Let me just paste from a source to give you all the info:
"Nvidia Ampere is a microarchitecture for Nvidia’s 3000-Series generation of graphics cards. This means you’ll find it in GPUs such as the RTX 3090, RTX 3080, RTX 3070, RTX 3060.
For Nvidia’s consumer graphics cards, Ampere uses a 8nm process node from Samsung, which is a noticeable upgrade on the 12nm process node found in the preceding Nvidia Turing generation – having a smaller process node generally results in a higher performance.
The Nvidia Ampere generation was the first to offer support for GDDR6X memory, while also introducing third-generation Tensor Cores and second-generation ray tracing cores.
Ampere not only supports ray tracing, but DLSS too. This is Nvidia’s Deep Learning Super Sampling, using artificial intelligence and upscaling techniques to boost the frame-rate performance of any supported game with minimal compromises to the graphics quality."
Of course, each year Nvidia releases new GPUs, and along with them comes new architecture. The 4000 series has moved on to Lovelace. But that's PC for you- constantly evolving each year.
As for the T239 chip:
"Tegra T239 is expected to be a customised version of Tegra T234, which NVIDIA codenamed Orin. Currently, Tegra T239's codename is unknown, although kopite7kimi noted that 'Black Knight' is actually the codename's codename.
If the Tegra T239 is a customised Tegra T234, it should be based on NVIDIA's Ampere architecture. Reputedly, the Tegra T239 will contain 2,048 CUDA cores, up from 256 shader cores in the Maxwell-based Tegra X1. Nominally, the Tegra T239 should deliver 4 TFLOPs of performance too, a 10x increase on the Switch's current 0.4 TFLOP output."
Keep in mind, that statement at the end is not accounting for the fact it would also be downclocked similar to current Switch, to conserve battery. So I wouldn't expect a 10x increase. But 4-5x? Absolutely.
Here's what was said recently concerning the T239:
"An Nvidia employee has confirmed the existence of the Tegra239 chip, which has been rumored since 2021 as the SoC being developed for the Nintendo Switch 2.
According to the Nvidia employee, Nvidia has added support for the Tegra239 SoC, the likely SoC for Nintendo Switch 2, which has eight cores in a single cluster. In addition, the SoC manufacturer has moved num_clusters to soc data in order to avoid always over allocating memory for four clusters.
The claim made by reliable NVIDIA leaker kopite7kimi that Nvidia will use a modified version of its T234 Orin chip for the next-generation Switch is further supported by this revelation.
As of this information, the following details can be inferred about the Nintendo Switch 2 console:"
T239 SoC
8-core CPU
Ampere-based GPU that may incorporate some Lovelace features
The 2nd generation Nintendo Switch graphics API contains references to DLSS 2.2 and raytracing support
@JaxonH Ray tracing on a hybrid console? Very unlikely to happen, considering Sony and the Xbox Series X/S are having trouble using this technology; it's too memory intensive for a console that is going to have graphical/technical power equivalent to a PS4.
@Giancarlothomaz
Again, we've had this discussion just a few days ago.
It has the theoretical capability to do raytracing: Ampere pairs the Tensor Cores used for DLSS with dedicated RT Cores for raytracing. That doesn't mean it's actually going to be used.
On Nvidia GPUs, hardware-accelerated raytracing requires dedicated RT Cores (just as DLSS requires Tensor Cores). If you don't have that hardware, hardware raytracing is impossible on Nvidia GPUs.
The current Nintendo Switch has neither RT Cores nor Tensor Cores. Therefore, raytracing is impossible on it.
The T239 chip for Switch 2 is based on the Ampere architecture, which has both Tensor Cores and RT Cores. Therefore, it is theoretically possible to do raytracing with it, unlike the current Switch.
That doesn't mean it will do raytracing, though. Having the theoretical capability to do something doesn't mean it's practically realistic. Raytracing requires a lot of power, which is precious in a hybrid device. So I would not expect games on Switch 2 to have raytracing. But that doesn't change the fact the hardware is capable of it.
So why include this hardware if raytracing isn't going to be used? Because the Tensor Cores aren't only good for that: they're also required for DLSS upscaling, and that is what they will be used for.
But that doesn't mean no game will ever use them to implement basic raytracing. It's possible some basic raytraced reflections at 1080p combined with DLSS could happen in a few games. I seriously doubt we'll ever see raytraced lighting and shadows though. But it could happen. The point is, the chip is capable of it, irrespective of whether there's enough power to use in a practical sense.
Nobody is saying the Switch 2 will have games that utilize raytracing. It probably won't. But it is at least theoretically possible, because the chip meets all the requirements necessary to implement it. Personally, I think it's a waste of resources on a hybrid device: it's one thing to implement on a powerful console where you can pull hundreds of watts without consequence, and another thing entirely on a device where every watt counts. But again, there are different kinds of raytracing (lighting, shadows, reflections, etc.) and some types draw more power than others. So it's always possible we see at least a few games that implement some basic form of it in a less demanding game when docked. But I wouldn't expect it.
Ray tracing has been around for ages; I remember it being a big thing on the Amiga in the late 80s / early 90s. But according to Wikipedia, it has, conceptually at least, been around for a little longer: the 16th century!
8k has been around for even longer, it says here it existed in the early 12th century.
Topic: The Nintendo Switch Thread
Posts 64,941 to 64,960 of 69,785