Mark Romero 2 Posted October 3, 2018

Alister Chapman made a comment in an interview that broadcasters will more likely be using Rec 2020 sooner than they will be broadcasting in 4K, since Rec 2020 doesn't require more bandwidth than Rec 709, while 4K requires a lot more bandwidth than HD. (By broadcasting I mean ALL forms of broadcasting, whether over the air, internet, 5G... whatever.)

I know they aren't mutually exclusive or anything like that; many 4K TVs sold today are already HDR capable. But does that make sense to you guys? Do you think we will see wider adoption of HDR over SDR sooner than we will see adoption of 4K over 1080p???

I know Tony Northrup said something like they only shoot their videos in 4K now because they wanted to make them "future proof." My understanding is they still shoot in SDR gamma and gamut, though... Then again, maybe I am misunderstanding the science behind all this...
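For a rough sense of why the bandwidth argument holds, here is a back-of-envelope sketch in Python. It assumes uncompressed 4:2:0 video and ignores codec efficiency entirely, so the absolute numbers are not realistic broadcast bitrates; only the ratios matter.

```python
# Back-of-envelope, uncompressed bits per frame at 4:2:0 chroma
# subsampling (1.5 samples per pixel on average).
def mbit_per_frame(width, height, bit_depth):
    return width * height * 1.5 * bit_depth / 1e6

hd_sdr  = mbit_per_frame(1920, 1080, 8)    # ~24.9 Mbit
hd_hdr  = mbit_per_frame(1920, 1080, 10)   # ~31.1 Mbit
uhd_sdr = mbit_per_frame(3840, 2160, 8)    # ~99.5 Mbit

print(f"10-bit 1080p vs 8-bit 1080p: {hd_hdr / hd_sdr:.2f}x")   # 1.25x
print(f"4K vs 1080p at same bit depth: {uhd_sdr / hd_sdr:.2f}x")  # 4.00x
```

Going to 10-bit adds roughly 25% before compression, while quadrupling the pixel count quadruples the raw data, which is the asymmetry the Chapman comment points at.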
AlexTrinder96 Posted October 3, 2018

One thing I've noticed when watching Premier League football in 4K (on Sky Q) is that the highlights in the scene always look really close to clipping, and yet on the HD channel they look fine! I know Sky is releasing an update adding HDR in early 2019; that should kickstart it! I do agree with Alister that HDR doesn't need to be paired with 4K! Interesting topic!
KnightsFan Posted October 3, 2018

Please correct me if I'm wrong, but I thought Rec.2020 only specifies 4K and 8K resolutions, using standard dynamic range, not HDR. Perusing Wikipedia, I'm finding:

Rec.709: standard dynamic range, standard color gamut, 1080p, 8 or 10 bit
Rec.2020: standard dynamic range, wide color gamut, 2160p/4320p, 10 or 12 bit
Rec.2100: high dynamic range, wide color gamut, 1080p/2160p/4320p, 10 or 12 bit

So perhaps Alister Chapman was referring to Rec.2100? (Not trying to be pedantic, just making sure I understand the alphabet soup here!)

Back on topic, I think 4K is easier to market for whatever reason, so we will see mass adoption of 4K before HDR. The public seems to "understand" 4K better than they do HDR. Moreover, we're all agreed on what 4K is, whereas HDR is still in a kind of format war from what I can see, between HLG and PQ.
webrunner5 Posted October 3, 2018

Rec. 2020 requires 10-bit or better. A lot of these new cameras are not doing 10-bit at all, let alone 10-bit in 1080p; the ones that have it only offer 10-bit in 4K. Sure, you can use an external recorder, but then you are back to a big, bulky rig.
Mark Romero 2 Posted October 3, 2018 (Author)

1 hour ago, KnightsFan said: So perhaps Alister Chapman was referring to Rec.2100? ...

You are probably more right than I am on this. He might have been referring to Rec. 2100 (it was an interview with the guy from ProAV, but I can't find the clip now). But either way, I'm pretty sure he was saying that HDR has a better chance of being readily adopted than 4K because the need for bandwidth is less.

Agree that 4K might be an easier sell than HDR, and I think there are competing "versions" of HDR too, so that might be difficult as well.

Here is kind of an interesting article geared toward consumers about 4K and HDR. I can't vouch that it is technically correct, but it is kind of an interesting read. https://www.businessinsider.com/4k-tv-hdr-whats-the-difference-2016-8#why-hdr-will-make-your-4k-tv-worth-it-4

So either way we just kind of have to wait and see. Maybe I am just more prone to FUD than most people, but I do wonder now whether it is smart to invest LONG TERM in a system that might not have HDR and WCG...

1 hour ago, webrunner5 said: Rec. 2020 requires 10-bit or better. ...

They can do internal 4K 10-bit but not 1080p 10-bit?!?!? I think I need to put my credit card back in my wallet and go lie down somewhere for a while...
KnightsFan Posted October 3, 2018

I think the difference may be that for 4K, commercials can zoom in and say, "if you had a 4K screen, you would see THIS much detail!" and that demonstration works pretty well, even on an HD screen. HDR is literally something you can't display on your current screen, so the marketing is like, "well, we can't show you what it is unless you buy the hardware." It's way too abstract unless you either see it yourself or have some prior knowledge about how displays work.

The hurdle that I see with HDR is that Rec 709 and sRGB are so entrenched, not just for pro/semi-pro web and TV broadcast, but for desktops, documents, video games, and everything else we see on screens. Scaling an HD image (whether it's a film or Windows Explorer) to a 4K screen is simple. I'm not sure how easy it is to coordinate all the moving parts for the switch to HDR. For example, I've got some old photos from ten years ago. If I get an HDR/WCG monitor, will those photos display properly? I don't know if they even have the necessary metadata to identify them as sRGB. Will my video games from the early 2000s look correct? How about my DVD collection? It seems like a bigger mess for backwards and/or forwards compatibility to go to HDR, compared to 4K.
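The metadata worry is easy to demonstrate. The minimal sketch below uses Pillow to check a photo for an embedded ICC profile; the file name is hypothetical. Untagged files carry no colour space information at all, so software can only assume sRGB by convention.

```python
# Check whether an image file carries colour space metadata (Pillow).
# "old_photo.jpg" is a hypothetical file path.
from PIL import Image

img = Image.open("old_photo.jpg")
icc = img.info.get("icc_profile")  # embedded ICC profile bytes, if any

if icc is None:
    # Nothing in the file identifies its colour space; viewers and
    # browsers fall back to assuming sRGB by convention.
    print("No ICC profile - software will assume sRGB")
else:
    print(f"Embedded ICC profile found ({len(icc)} bytes)")
```

An HDR/WCG display pipeline has to make exactly this kind of guess for every legacy asset, which is the backwards-compatibility mess described above.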
androidlad Posted October 4, 2018

BT.2020 is for now only used as a container colour space; only a handful of RGB laser projectors can fully cover the gamut. Current UHD Blu-ray discs are encoded in the P3 D65 colour space using BT.2020 primaries.
jonpais Posted October 4, 2018

I've got no opinion either way on the original topic, since all I watch are the occasional YouTube video and Netflix. However, I can say that watching HDR on my LG OLED, HDR images look like they have more detail than SDR images. It can be very beautiful to see, and it is not the same as UHD vs HD, which I have trouble telling apart anyway. That highly detailed appearance is the result of the much higher local contrast HDR offers (lack of blooming probably enhances this effect too).

Unfortunately for filmmakers, we won't be seeing affordable true HDR monitors for a while, even though the tech appears to be there. And VESA's loose guidelines have allowed manufacturers to falsely claim every model on the showroom floor is HDR, just as I predicted they would. The few HDR monitors that are out there are for gamers, not content creators.
KnightsFan Posted October 4, 2018

41 minutes ago, androidlad said: Current UHD Blu-ray discs are encoded in the P3 D65 colour space using BT.2020 primaries.

To be clear, that's just because the content itself was mastered in P3. The P3-based image data needs a transformation to look correct when it is displayed in the BT.2020 space. So it's not "encoded" in P3; it's encoded in BT.2020, it just doesn't use the parts of the BT.2020 gamut that fall outside the P3 gamut. Right?
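For the curious, the container relationship described here can be made concrete. The numpy sketch below derives the 3x3 matrix that maps linear P3 D65 RGB into linear BT.2020 RGB, using the published chromaticity coordinates; it is a minimal illustration, not production colour management code.

```python
# Derive the matrix that maps linear P3 D65 RGB into linear BT.2020 RGB.
import numpy as np

def rgb_to_xyz(primaries, white):
    # Columns are the XYZ directions of the R, G, B primaries,
    # scaled so that RGB = (1, 1, 1) lands on the white point.
    m = np.array([[x / y, 1.0, (1.0 - x - y) / y] for x, y in primaries]).T
    xw, yw = white
    w = np.array([xw / yw, 1.0, (1.0 - xw - yw) / yw])
    return m * np.linalg.solve(m, w)

P3_D65 = [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)]
BT2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]
D65    = (0.3127, 0.3290)

# P3 -> XYZ -> BT.2020. This applies to *linear* light, so PQ or HLG
# transfer functions must be decoded before using the matrix.
p3_to_2020 = np.linalg.inv(rgb_to_xyz(BT2020, D65)) @ rgb_to_xyz(P3, D65)
print(np.round(p3_to_2020, 4))
```

Because every P3 colour sits inside the BT.2020 gamut, the resulting RGB values all stay non-negative: the disc simply never exercises the outer regions of the container.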
webrunner5 Posted October 4, 2018

My son and I were at Best Buy a few days ago and they had a few OLED HDR TVs there, mostly Sony and LG, and they were definitely a standout compared to the "normal" TVs there. But the cheapest one that was really worth having was a Sony 55" and it was over $2,000. Most of the better ones were over $3,000, though they were pretty big, like 60 to 72". They were damn impressive, I will say that. There was no doubt you were looking at something pretty special.
thebrothersthre3 Posted October 4, 2018

4K is definitely more marketable, but I think it will just become the standard at some point. You don't actually need a 10-bit camera to get a good HDR image, though; most cameras can do 10 stops of dynamic range in 8-bit these days.
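A quick back-of-envelope calculation shows why the standards still call for 10-bit even when sensors capture 10+ stops. It assumes, for simplicity, a log curve that spreads stops evenly across the full code range (real curves don't, and broadcast legal range shrinks it further):

```python
# Simplified: code values available per stop if a log curve allocates
# them evenly, ignoring legal-range restrictions.
def codes_per_stop(bit_depth, stops):
    return (2 ** bit_depth) / stops

print(codes_per_stop(8, 10))   # ~26 code values per stop in 8-bit
print(codes_per_stop(10, 10))  # ~102 code values per stop in 10-bit
```

Roughly 25 code values per stop is where banding tends to appear once footage is graded, which is part of why Rec. 2100 specifies 10-bit as the minimum even though 8-bit cameras can capture the dynamic range itself.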