
Topic: SD to HD to 4K to 8K diminishing returns...?

Posts 21 to 40 of 40

JaxonH

@skywake
And what, exactly, did he state that was incorrect? Seems he knows quite a bit. That's not some advertisement or a person with a bias or agenda, so what motive is there to give inaccurate information?

Or do you just not want to accept its implications?

Did you actually watch the video? I found it very informative. Of course, if you say it's inaccurate, I'd love to have clear-cut examples with evidence to the contrary, because I don't want to believe something that's not true. By the same token, I'm not going to reject an informative video just because someone makes a blanket claim that everything they say is wrong.

Edited on by JaxonH

All have sinned and fall short of Gods glory. Wages of sin is death. Romans

God so loved the world He sent His only Son- whoever believes on Him has eternal life. Unless you believe, you will die in your sins. Whoever believes, rivers of living water flow within them. John

Octane

@skywake Oh yeah, that's true. Comparing my HDR screen to my non-HDR one, there's a clear difference. But I've also seen it next to the high-end TVs in the store, and there's a noticeable difference there as well. But that's also the price you pay for it, I guess.


NEStalgia

The big thing that really matters is viewing distance and pixel pitch. A very low-resolution display at 200" viewed from ten blocks away looks as good as a Retina iPad up close. Ironically, "4K" displays have much, much more value on a 6-12" tablet than they do on a TV, because a tablet is a few inches from your eyes... the larger pixels of a TV can be individually seen at that distance, whereas the dot pitch of a dense display with tiny pixels renders the pixels indistinguishable and it looks more like paper. Walk up close to a TV like a store display and you'll very easily see the difference. Step back to a normal viewing position and whether you can really see any difference depends on your eyes, your distance, and the display size.
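Pixel pitch is easy to put rough numbers on, too. A quick sketch (the panel sizes are just picked for illustration):

import math

def ppi(h_px, v_px, diagonal_in):
    # pixels per inch from a panel's resolution and diagonal size
    return math.hypot(h_px, v_px) / diagonal_in

# example panels, purely for illustration
print(f'4K 55" TV:     {ppi(3840, 2160, 55):.0f} ppi')   # ~80 ppi
print(f'4K 10" tablet: {ppi(3840, 2160, 10):.0f} ppi')   # ~441 ppi
print(f'1080p 55" TV:  {ppi(1920, 1080, 55):.0f} ppi')   # ~40 ppi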

For gaming, though, 4K rendering matters much more than the display outputting the resolution, because of the texture detail and anti-aliasing involved. Downscaling even from 2K to 1080p with a video scaling algorithm gives a much more natural output than rendering natively at 1080p. I buy >1080p displays for laptops/tablets/phones because at 3" from my eye it matters. I'm fine with my 1080p displays/projectors... as a near-sighted individual sitting at a normal distance I'd be hard pressed to tell a difference.
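You can see the idea with even the crudest downscale. Real scalers use smarter filters, but averaging blocks of a higher-resolution render already smooths edges in a way native 1080p rendering can't. A rough numpy sketch (the frame here is just random data standing in for a 2160p render):

import numpy as np

def box_downscale(frame, factor=2):
    # average factor x factor blocks of an H x W x 3 frame; averaging several
    # rendered samples per output pixel is what softens jaggies (supersampling)
    h, w, c = frame.shape
    h, w = h - h % factor, w - w % factor
    blocks = frame[:h, :w].reshape(h // factor, factor, w // factor, factor, c)
    return blocks.mean(axis=(1, 3))

render_4k = np.random.rand(2160, 3840, 3)   # stand-in for a 2160p rendered frame
frame_1080 = box_downscale(render_4k)
print(frame_1080.shape)                     # (1080, 1920, 3)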

NEStalgia

skywake

@JaxonH
As I said, I watched it about a week ago and it was pretty average. But just to humour you I'll scrub through the entire thing again and highlight a few of the things he got wrong.

1. BluRay vs HD-DVD wasn't a battle between PlayStation and Xbox, it was a battle between Sony and Toshiba. It's a bit strange to highlight the way HD-DVD was included on the 360 as "stupid". What would've been the smarter play here, and for whom? The 360 benefited from the exclusion by being cheaper than the PS3 at the time. HD-DVD needed exposure, so it made sense for Toshiba to effectively beat Sony to the market. Yes, Blu-Ray won in the end thanks largely to the PS3... but that's also why it was a smart move by Toshiba and Microsoft.

2. What he says about the Soap Opera Effect is completely wrong. The term is called the Soap Opera Effect because of its origins: cheap soap operas. If you made content for TV you were more likely to film it at the TV's refresh rate, while higher-budget films were made for projectors where 24fps was the standard. So through psychology we have been trained to think of 60fps video as "cheap". Which it usually was, because 60fps means a lower resolution at the same bitrate and, for filmed content, less exposure time per frame. None of these are factors for games... in fact...

3. Motion blur? Yes, it's a thing, but he got it backwards. There's natural motion blur in captured content, which means you can run at fairly low framerates and it won't look choppy. With rendered content every single frame is crisp, so low-framerate content looks choppy. This is one of the reasons why higher framerates matter more for games. Low-framerate rendered content is, if anything, the one kind of content that doesn't look smooth and natural. Real life doesn't stutter.

3. The naming convention for 4K isn't "dumb", it just comes from a different place. It's literally borrowing the terminology from digital cinema, because before "UHD" it was known as the resolution used for digital cinema. By comparison, 2K never really took off as a digital cinema standard, and as such that class of resolution was named after the hand-cam conventions instead. You could get miniDV cameras that shot at 720p or 1080i; in 2004, what else was the early adopter putting on their brand new HDTV? BluRay and HD-DVD weren't even out yet and most films were still shot on 35mm.

4. The claim that "anything filmed before last year will never be in true 4K" is complete BS. As in really, really wrong on so many levels. See my previous comment.

5. I honestly don't know what his point about bitrate is at the end of the video. Really, at this point he kinda lost the plot; this is around where I'd stopped watching the first time. It's very strange for him to simultaneously argue that 4K doesn't matter because you "can't see the pixels" and also argue that the 80-120Mbps bitrate of UHD BluRay means it's all for nothing.

6. Steam users are not the highest of high-end enthusiasts. The Steam hardware survey literally collects data from every machine that Steam is installed on. Also, I don't see what <1% of machines running Steam having 4K displays has to do with whether or not affordable graphics cards are close to rendering games at 4K. There are also other ways to take advantage of extra power without going to 4K: higher framerates again, but also things like supersampling. In any case, it's a strange connection to make.

Edited on by skywake

Some playlists: Top All Time Songs, Top Last Year
"Don't stir the pot" is a nice way of saying "they're too dumb to reason with"

JaxonH

@skywake
1 That seems like kind of a petty nitpick. The battle played out between the PS3 and 360. And it was a stupid implementation, since you had to buy the HD-DVD drive separately. If that's not stupid, I don't know what is. Blu-ray was included in every PS3.

2 I don't think he's wrong at all about the soap opera effect. It doesn't look weird because of how we're "trained". Nobody is trained by watching 60fps; most non-gamers don't deal with 60fps anyway. He's on point with that analysis.

3 Whether you agree or disagree that the naming convention is dumb, which is completely subjective opinion, that's not even remotely grounds to claim the video has no merit.

4 Perhaps using absolutes is inappropriate, but it's very true most movies are remastered into 4K, not native.

5 Don't know, don't think it matters

6 I disagree. Not all Steam users are high-end enthusiasts, but it's well known most high-end enthusiasts game on PC, specifically Steam.

So yeah, I don't really see any definitive evidence to suggest what's discussed isn't valid. Not from the points listed, anyway.

All have sinned and fall short of Gods glory. Wages of sin is death. Romans

God so loved the world He sent His only Son- whoever believes on Him has eternal life. Unless you believe, you will die in your sins. Whoever believes, rivers of living water flow within them. John

NEStalgia

@subpopz But that's totally laame (and stuff.) I mean all their friends (like) have (like) 4k TVs to play (like) Fortnite (and stuff) on!

NEStalgia

skywake

JaxonH wrote:

4 Perhaps using absolutes is inappropriate, but it's very true most movies are remastered into 4K, not native.

No, really, this is the main reason why I dislike this video so much. If you are going to talk about 4K being a stupid marketing ploy you have to at least get this bit right. The vast, vast, vast majority of movies have been filmed either on 35mm film or its equivalent, i.e. 4K+ digital cinema cameras. If you walk into a shop and buy a UHD BluRay you are buying the content at as close to its native resolution as you have ever been able to. With few exceptions these are not upscaled 2K films.

This is quite literally the entire point of the push to 4K. If you can't get that right you shouldn't be making a video about it pretending to be an expert on the subject.

JaxonH wrote:

5 Don't know, don't think it matters

Huh? The dude's entire point is that 4K doesn't matter because we can't see that resolution. That, along with some unrelated nonsense half-arguing that 60fps is stupid. Then at the end he effectively says that UHD BluRay is a waste of time anyway because it only has a bitrate of 80-120Mbps, whereas "real" 4K would be closer to 750Mbps. So why bother? Let's watch videos at 8Mbps on Netflix...

huh?
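Just to put some rough numbers on it: the sketch below assumes 10-bit 4:2:0 at 24fps, and the delivered bitrates are the ballpark figures from above, but it shows how heavily compressed any delivered format is compared to the raw signal.

def raw_bitrate_mbps(w, h, fps, bits_per_sample=10, samples_per_pixel=1.5):
    # uncompressed bitrate in Mbps; 1.5 samples per pixel = 4:2:0 chroma subsampling
    return w * h * samples_per_pixel * bits_per_sample * fps / 1e6

raw = raw_bitrate_mbps(3840, 2160, 24)              # ~2986 Mbps uncompressed
for name, mbps in [("UHD BluRay", 100), ("8Mbps streaming", 8)]:   # ballpark delivered bitrates
    print(f"{name}: roughly {raw / mbps:.0f}:1 compression")       # ~30:1 and ~373:1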

I don't know, I think it kinda matters in terms of the credibility of his argument. To me it kinda feels like he's just quoting sections of wikipedia that work with his particular pre-conceived views on the matter. Then he loses track of what he's actually saying and starts arguing with his own script. The act appears to have fooled a few people but as someone who is a bit of a nerd on this topic? To me he seems like a crazy dude on the corner yelling at a lamp post.

Some playlists: Top All Time Songs, Top Last Year
"Don't stir the pot" is a nice way of saying "they're too dumb to reason with"

HobbitGamer

skywake wrote:

To me he seems like a crazy dude on the corner yelling at a lamp post.

Listen here, that lamp post needed to hear things that should have been said a long time ago. Inconsiderate luminescent pole...grumble grumble

#MudStrongs

Switch Friend Code: SW-7842-2075-5515 | My Nintendo: HobbitGamr | Nintendo Network ID: HobbitGamr

skywake

Just to make a point, here are some of the highest-grossing movies of all time and what kind of camera they were shot on.

Avatar - 100% CG, can be re-rendered at 4K
Titanic - 35mm
Star Wars: The Force Awakens - 35mm
Avengers: Infinity War - 6K digital
Jurassic World - a mix of 35mm and 65mm film
The Avengers - 35mm and a mix of digital sources
Furious 7 - Some 35mm, some 6K digital
Frozen - 100% CG, can be re-rendered at 4K
Incredibles 2 - 100% CG, can be re-rendered at 4K

Honestly, aside from some specific action shots where they might use a GoPro or another more portable camera, most films are shot either on film or at 4K and above. The only exception, again, is a brief period in the early 2000s when digital cameras were new. At that point, yes, there were some who went out of their way to film at 1080p.

Pretty much every movie you've watched was at a resolution similar to 4K one way or another at some point in the production line. It was then scaled down and sold to you at 480p or 1080p on discs. The limitation was the disc format and your TV, not the source.

Some playlists: Top All Time Songs, Top Last Year
"Don't stir the pot" is a nice way of saying "they're too dumb to reason with"

NEStalgia

All the debate about original capture is kind of a red herring. Original captures in photography have little to do with the final print/display; they're about headroom for editing, cropping, etc., without affecting the final output format. In an era of 16+MP digital SLR cameras, you're generally printing using about 3MP at most for a print up to 20x30. The headroom helps for edits and crops on the cutting-room floor in film. I dare any TV to try to beat fine art fibre for resolution. Sure, if you're shooting billboards you're going to need a bit more resolution (MF or LF in still cameras, 4K or 6K in film for big cinema screens), but of course "resolution" on a projected surface loses quite a lot of meaning as well... the optics in front of the emitter will matter a lot more than the actual content resolution. On a home TV? Fluff most of the time. Useful for digital signage and, of course, small screens for text interaction simulating paper.


HobbitGamer

I'm happy with HGTV

#MudStrongs

Switch Friend Code: SW-7842-2075-5515 | My Nintendo: HobbitGamr | Nintendo Network ID: HobbitGamr

skywake

@NEStalgia
Yes, you would typically print at ~300dpi, which is about the same pixel density as the Switch. Which means for traditional print sizes you're closer to 1080p than 4K. So yes, for the everyday consumer a lot of those extra pixels are only useful for cropping and zooming. It's not a coincidence that the first post I made in this thread was about digital cameras. As I said there, once we hit 8MP (and arguably 3-5MP) pixels stopped mattering.
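To put some rough numbers on that (the print sizes below are just common examples):

def print_megapixels(width_in, height_in, dpi=300):
    # pixels needed to print a given size at a given dots per inch
    return width_in * dpi * height_in * dpi / 1e6

for w, h in [(6, 4), (7, 5), (10, 8)]:                  # example print sizes
    print(f'{w}x{h}" print @ 300dpi: {print_megapixels(w, h):.1f} MP')

print(f"1080p display: {1920 * 1080 / 1e6:.1f} MP")     # ~2.1 MP
print(f"4K display:    {3840 * 2160 / 1e6:.1f} MP")     # ~8.3 MP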

However we aren't talking about printing out stills of movies onto A4 sized sheets of paper. We're talking about the 55"+ panels we all have in our houses. Also the argument isn't about the merits of 5MP vs 20MP, it's about the merits of raising the default resolution cap on content from 2MP to 8MP. If the discussion was about going to 8K in a world already dominated by 4K content and displays I'd be on the same page. But it isn't.

Also, yes, it's true that just because a film was captured at that resolution doesn't mean it was edited at that resolution, especially for movies from that period of recent history I mentioned earlier. The 4K workflow for digital video editing was certainly out of reach until fairly recently, so it's not necessarily as easy as just pushing the re-export button. So again, what you are saying here is technically true to a degree.

However the implication before was that they were captured at 1080 and as such "UHD Remasters" must be snake oil. The video in particular repeated this nonsense claiming that "everything" produced before last year could never be seen at 4K. And this simply isn't true because pretty much everything for cinema has always been captured on either 4K+ or 4K+ equivalent film. Also while editing at 4K is "new" it isn't a "last year" thing.

I'd also again add that while 4K is the headline thing it's not the only thing. Arguably things like colour depth and bitrate for content and things in panels like deeper blacks and brighter highlights matter more. And yes, the fact that some newer panels have higher and variable refresh rates also matters. Dismissing UHD as a fad and only talking about resolution? Well not only are you wrong about the resolution but you're even more wrong by ignoring the other more important advances that are riding in on its coat-tails.
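Colour depth is a good example of how much those "other" advances matter. A rough illustration (the brightness range below is just an arbitrary dark-sky gradient, not a measured figure):

import numpy as np

def distinct_levels(bit_depth, lo=0.2, hi=0.3):
    # how many distinct codes a signal of this bit depth has across a narrow
    # brightness range, e.g. a slow gradient in a dark sky (range is illustrative)
    codes = np.round(np.linspace(lo, hi, 4096) * (2 ** bit_depth - 1))
    return len(np.unique(codes))

print("8-bit steps: ", distinct_levels(8))    # ~26, visible banding
print("10-bit steps:", distinct_levels(10))   # ~103, much smoother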

Edited on by skywake

Some playlists: Top All Time Songs, Top Last Year
"Don't stir the pot" is a nice way of saying "they're too dumb to reason with"

KaiserGX

I guess that goes for other technology as well. We're reaching a point where it's possible to support better stuff, but it might not be practical, or can't be supported, unless there are bigger leaps. I'm fine with 1080p. Another thing people don't seem to be mentioning is how much more power it takes to run this stuff and the huge file sizes that come with it. So yes, I'm good with the Switch being 1080p docked and 720p in handheld.

✉ Youtube: http://www.youtube.com/kaisergx
✉ Twitch: http://www.twitch.tv/kaisergx
✉ Twitter: https://twitter.com/kaisergx

Switch Friend Code: SW-3625-8025-1230 | My Nintendo: KaiserGX | Nintendo Network ID: KaiserGX

skywake

A few other things I want to add to this discussion before it disappears into nowhere land again. Firstly, while I have been pushing back against the people suggesting that 4K is snake oil, there are definitely diminishing returns. As much as I might say the ceiling of resolution doesn't stop at 1080p, at the same time I don't really care that much. I'm not actively avoiding it like I did with 3D, but I'm also not exactly going out of my way to make the jump to 4K. It matters... but 1080p is good enough.

And the truth is that all of the big companies know this. It's why UHD has these other technologies coming along for the ride like HDR, FreeSync, Dolby Atmos and H.265. Nvidia sees the writing on the wall for the resolution race and is starting to push ray tracing for more realistic lighting effects. So while the next generation of consoles might be all about 4K, it's going to be about 4K because 4K is relatively easy to do. I think it'll be a long time, if ever, before they're pushing 8K because, frankly, ray tracing is just as hard to render and more impressive.

Put simply, 4K matters, but even now the movement is away from a pure push for resolution. I think we'll see consoles pushing for stuff like that before we see one that's advertising 8K.

Edited on by skywake

Some playlists: Top All Time Songs, Top Last Year
"Don't stir the pot" is a nice way of saying "they're too dumb to reason with"

Trajan

I read somewhere that to see the difference from 1080 to 4k you would need to be like 2 inches from a 55"?

Disclaimer: until I got my Switch I had a 27" CRT. I even tried making the Switch work with it, but the TV couldn't change the aspect ratio, so I got a 32" 1080p.

I just like games. I probably won't get a 4K TV for a long time; I don't see the need. I only play my Switch and SNES Classic on it.

One thing I hate is pixels make old consoles look horrible. And lmao at people saying that's the retro look. CRTs didn't have pixels.

My CRT is in the closet. Eventually I'll have room for it.

Edit: the Switch isn't even powerful enough to render most games in 1080p as is.

As far as cameras, the big jump to me seemed to be once we got north of 10MP. Photo people say MP doesn't really matter now. Just like "bits".

Edited on by Trajan

Sakurai: Which is why I think we should forget about console wars and focus on what’s really important: enjoying the games themselves.

"If we did this (mobile games), Nintendo would cease to be Nintendo." - Iwata

6ch6ris6

1080p 60fps is the sweet spot in every regard: resources, visuals, cost, etc.

Ryzen 5 2600
2x8GB DDR4 RAM 3000mhz
GTX 1060 6GB

skywake

Trajan wrote:

I read somewhere that to see the difference from 1080 to 4k you would need to be like 2 inches from a 55"?

Assuming you don't have garbage vision it's actually closer to about 2m for a 55" set. But there is something to this. Here's a graph showing the point at which 1080p and 4K make a difference against the optimal distance you should be sitting from your TV.
[graph: 1080p and 4K visibility thresholds plotted against panel size and viewing distance]
Note that 1080p makes sense for pretty much any sized panel for any kind of content. Comparatively, 4K only really makes a difference at "optimal distances" for panels 60" or bigger, and even then only for a cinema experience.
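If anyone wants the maths behind that graph, it mostly comes down to pixel pitch against the roughly one-arcminute resolution of 20/20 vision. A rough sketch (the one-arcminute figure is the usual acuity assumption):

import math

ARCMIN = math.radians(1 / 60)   # ~angular resolution of 20/20 vision (assumption)

def max_useful_distance_m(diagonal_in, h_px, v_px):
    # farthest distance (metres) at which a 20/20 eye can still resolve the
    # panel's pixel pitch; further back than this, extra pixels are invisible
    width_in = diagonal_in * h_px / math.hypot(h_px, v_px)
    pitch_m = (width_in / h_px) * 0.0254
    return pitch_m / ARCMIN

print(f'55" 1080p: {max_useful_distance_m(55, 1920, 1080):.1f} m')   # ~2.2 m
print(f'55" 4K:    {max_useful_distance_m(55, 3840, 2160):.1f} m')   # ~1.1 m

So sit further than about 2.2m from a 55" panel and, in pure resolution terms, 1080p and 4K are the same picture.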

Trajan wrote:

One thing I hate is pixels make old consoles look horrible. And lmao at people saying that's the retro look. CRTs didn't have pixels.

CRTs have pixels. What they lack is enough size to make them noticeable at reasonable distances from the display. There's also the fact that they are a lot more rounded and that a composite signal really softens the image.

Trajan wrote:

the Switch isn't even powerful enough to render most games in 1080p as is.

And the Wii was only capable of rendering games at 480p. Just because one piece of hardware can't support the full capabilities of a display doesn't mean the display has no value. As I've said in this thread already, there is plenty of 4K content out there. There's more 4K content now at a lower cost and with less risk than there was HD content when we transitioned to 1080p.

Trajan wrote:

As far as cameras, the big jump to me seemed to be once we got north of 10MP. Photo people say MP doesn't really matter now.

As someone who got into digital photography pretty early on, I'd say the magic number was about 5MP. Once cameras started to hit 5MP, the quality of the lens and the size of the sensor became more important. Of course, it's worth pointing out that we're talking displays here, not cameras. So you need to remember that:
1. 1080p is only 2MP
2. Pretty much every bit of content you watch is filmed with a high end camera with a quality lens
3. Anything that's CG, including games, has every pixel rendered

Some playlists: Top All Time Songs, Top Last Year
"Don't stir the pot" is a nice way of saying "they're too dumb to reason with"

erv

@skywake interesting. Is that average distance to notice, or distance to position yourself for an ideal experience?

Oh and I agree about the camera parallels. I remember shooting models with a "super high end 4mp digital camera!" in art school back when those were quite rare lol, that was all they talked about.

Switch code: SW-0397-5211-6428
PlayStation: genetic-eternal

Nintendo Network ID: genet1c

skywake

erv wrote:

@skywake interesting. Is that average distance to notice, or distance to position yourself for an ideal experience?

The two "resolution" lines are the distance at which someone with 20/20 vision can theoretically see a difference. Basically if you are bellow the line then you can see the difference, if you are above that line then you can't. Of course there's a bit more to newer panels then simply resolution but when we are talking resolution this is the graph that matters.

The other two lines are recommended standards for viewing distance. The movie one is the THX standard, which pretty much any movie theatre will be built around. The other line is the TV equivalent as defined by the Society of Motion Picture & Television Engineers. For these two lines you ideally want to be sitting right on the line: too far away and the screen isn't filling enough of your field of view, too close and you're missing what's happening at the edges of the screen.
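If anyone wants to play with the numbers, those recommendations basically boil down to a target viewing angle. The angles below (40° for THX, 30° for SMPTE) are the commonly quoted figures rather than anything read off the graph itself:

import math

def seating_distance_m(diagonal_in, viewing_angle_deg, aspect=16 / 9):
    # distance at which a 16:9 screen of this diagonal fills the target
    # horizontal viewing angle
    width_m = diagonal_in * 0.0254 * aspect / math.hypot(aspect, 1)
    return (width_m / 2) / math.tan(math.radians(viewing_angle_deg) / 2)

# commonly quoted target angles (assumption, not from the graph above)
for angle, name in [(40, "THX (cinema)"), (30, "SMPTE (TV)")]:
    print(f'{name}: 55" screen at ~{seating_distance_m(55, angle):.1f} m')   # ~1.7 m / ~2.3 m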

Some playlists: Top All Time Songs, Top Last Year
"Don't stir the pot" is a nice way of saying "they're too dumb to reason with"

This topic has been archived, no further posts can be added.