kye Posted April 4, 2018
I've been playing with Magic Lantern RAW and looking at different bit-depths, and after all the conversation in the BMPCC thread (and the thousands of 8-bit vs 10-bit videos) I got the impression that bit-depth was something quite important. Here's the thing: I can't tell the difference between 10-bit, 12-bit and 14-bit. I recorded three ML RAW files at identical resolutions and settings; only the bit-depth varied. I converted them in MLV App to Cinema DNG (lossless) and pulled them into Resolve. Resolve says one is 10-bit, one is 12-bit and one is 14-bit, so I think the RAW development is correct. I just can't see a difference. I tried adding a huge amount of contrast (the curve was basically vertical) and I couldn't see bands in the waveform. I don't know what software to use to check the number of unique colours (I have Photoshop, maybe that will help?). Am I doing something wrong? Does a well-lit scene just not show the benefits? Is 10-bit enough? Am I blind to the benefits (it's possible)? I've attached one of the frames for reference. Thanks!
M02-1700_000000.dng
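One option for the unique-colour check might be a short script rather than Photoshop. A rough sketch, assuming the frame has been exported from MLV App or Resolve as a 16-bit TIFF and that numpy and imageio are installed; the filename is just a placeholder:

```python
# Count unique colours in an exported frame (a sketch, not a calibrated test).
# Reading the .dng directly would need a raw decoder such as rawpy instead.
import numpy as np
import imageio.v3 as iio

img = iio.imread("M02-1700_000000.tif")         # H x W x 3, uint16 for a 16-bit TIFF
pixels = img.reshape(-1, img.shape[-1])[:, :3]  # one row per pixel, drop alpha if present
unique = np.unique(pixels, axis=0).shape[0]
print(f"{unique} unique colours in {pixels.shape[0]} pixels")
```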
Deadcode Posted April 4, 2018
ML RAW is uncompressed, so it's hard to break even an 8-bit raw file. You cannot see a difference between 12-bit and 14-bit because the last 2 bits most likely contain noise. You can see a difference between 10-bit and 12-bit if you lift the shadows an extreme amount; shadow banding (caused by noise) can occur with 10-bit files, but with such heavy lifting the footage is unusable regardless of bit depth. You will not see any difference between the 10/12/14-bit (raw) files in everyday scenarios. Why didn't you ask this question in the topic where it belongs?
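A quick way to convince yourself that the noise in the lowest bits really does hide the extra precision is a toy simulation. This is a minimal numpy sketch, not ML's actual pipeline; the 16-DN quantisation step and the read-noise level are assumptions:

```python
# Quantise a dark 14-bit gradient to 10 bits with and without noise, then compare
# both to the true gradient after averaging over many rows (as the eye averages pixels).
import numpy as np

rng = np.random.default_rng(0)
true_ramp = np.linspace(100, 400, 1000)            # dark values on a 14-bit (0..16383) scale
frame = np.tile(true_ramp, (500, 1))               # 500 rows of the same gradient

def quantise_to_10bit(x):
    return np.round(x / 16) * 16                   # drop the lowest 4 of 14 bits

banded   = quantise_to_10bit(frame)                                  # no noise before quantising
dithered = quantise_to_10bit(frame + rng.normal(0, 8, frame.shape))  # ~8 DN read noise (assumed)

err_banded   = np.abs(banded.mean(axis=0) - true_ramp).max()
err_dithered = np.abs(dithered.mean(axis=0) - true_ramp).max()
print(f"max error vs true gradient: banded {err_banded:.1f} DN, dithered {err_dithered:.1f} DN")
# The noiseless version steps in hard 16-DN bands; with noise the averages track the
# original gradient closely, which is why the files look the same in normal use.
```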
Papiskokuji Posted April 4, 2018
As Deadcode said, it has more to do with raw than with bits. I know you pushed the image hard, so it doesn't apply to you kye, and having a 10-bit file helps with grading no matter what, but I always find those 8-bit vs 10-bit SOOC comparisons saying « I can't see a difference » annoying when you know almost NOBODY has a 10-bit display. 10-bit doesn't make much of a difference for internet delivery anyway, but there is a difference, especially coupled with a good codec like ProRes (codec is even more important than bits to me!).
webrunner5 Posted April 4, 2018
5 hours ago, Papiskokuji said: As Deadcode said, it has more to do with raw than with bits. […]
Out of likes. Well stated. Most of us are WAY behind current tech with our displays. Me included.
HockeyFan12 Posted April 4, 2018
6 hours ago, Papiskokuji said: As Deadcode said, it has more to do with raw than with bits. […]
Codec is WAY more important, but the whole bit-depth thing is kind of a mess.

The human eye is estimated to see about 10 million colors, and most people can't flawlessly pass color tests online, even though most decent 8-bit (or 6-bit FRC) monitors can display well over ten million colors: 8-bit color is 16.7 million colors, more than 10 million. And remember that sRGB/rec709 is a tiny colorspace compared with what the human eye can see anyway, so 16.7 million colors fit into a smaller space should be plenty, overkill even. But also remember that digital gamuts are triangle shaped and the human eye's gamut is a blob, so fitting the whole blob inside the triangle requires overshooting tremendously on the chromaticities, resulting in many of the colors in digital gamuts being imaginary colors... so the whole "8-bit is bad" thing needs a lot of caveats in the first place.

I haven't tried 10-bit raw from the 5D, but I suspect that in certain circumstances (100 ISO, just above the noise floor) 10-bit will have visibly higher-contrast noise than 14-bit after grading, though the difference will only be apparent if you A/B the exact same frame. That's my guess: something VERY subtle but not truly invisible, though possibly effectively invisible. It's possible there could be banding too, but the 5D III sensor is quite noisy.

The science behind it is so complicated I gave up trying to understand it. The more I learned, the more I realized I didn't understand anything at all. First you're dealing with the thickness of the Bayer filter array and how that dictates how wide the gamut is. Then you're dealing with the noise floor and quantization error, and how noise works as dithering, but there's also read noise that can have patterns, which don't dither properly. Then you're dealing with linear raw data being transformed with a certain algorithm to a given display or grading gamma, as well as being translated to a given gamut (rec709, rec2020, etc.), how wide that gamut is relative to the human eye, and how much of the color in it is imaginary. Then there's what bit depth you need to fit that transformed data (less than you started with, but how much less depends on a lot of variables). Then you introduce more dithering from noise, or more banding from noise reduction, then compression artifacts working both as noise reduction and to increase banding via macroblocking, then sharpening and other processing, then... it goes on and on, to the display and to the eye, and of course that's only for a still image. Macroblocking and banding aren't always visible in motion, even if they are in a still, depending on the temporal banding and whether the codec is intra-frame or inter-frame.

It's possible everyone who's proselytizing about this understands it far better than I do (I don't understand it well at all, I admit). But I frequently read gross misunderstandings of bit depth and color space online, so I doubt every armchair engineer is also a real one. (That said, there are some real engineers online; I just don't understand everything they write, since I'm not among them.) I know just enough about this to know I don't know anything about this.

From the armchair engineers we do have some useful heuristics (overexposed flat log gamma at 8 bits, heavily compressed, will probably look bad; raw will probably look good), but even those aren't hard and fast rules, not even close to it. All you can do beyond that is your own tests.

Even inexpensive monitors these days can display close to 100% NTSC. They should be good enough for most of us until HDR catches on, and when it does, bit depth will matter a lot more.
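For reference, the raw arithmetic behind those colour counts (a trivial sketch; it says nothing about gamut or about which of those colours are actually distinguishable):

```python
# Representable colours for a 3-channel (RGB) pipeline at a given bits-per-channel.
# 8-bit already exceeds the ~10 million colours the eye is commonly estimated to
# distinguish; the catch is how those colours are spread across the gamut and gamma.
for bits in (6, 8, 10, 12, 14):
    levels = 2 ** bits
    print(f"{bits}-bit: {levels} levels per channel, {levels ** 3:,} colours")
# e.g. 6-bit: 262,144 colours; 8-bit: 16,777,216; 10-bit: 1,073,741,824
```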
sam Posted April 4, 2018
very well put. what's a typical reference monitor setup on the sets you frequent though?
HockeyFan12 Posted April 5, 2018
22 minutes ago, sam said: very well put. what's a typical reference monitor setup on the sets you frequent though?
I don't spend much time on set. I see some Panasonic monitors for directors. I think DPs rely on the Alexa viewfinder, which is pretty good, or maybe a SmallHD. I don't know about high-end sets. I suspect iPads are getting more common for directors and producers, but obviously not for DPs. In post, a Flanders or a calibrated plasma seems sufficient for anything except HDR. I don't pay attention that carefully to brands I can't afford. :/
IronFilm Posted April 5, 2018
24-bit depth or bust! Oh wait... sorry, are we not talking about audio?!
IronFilm Posted April 5, 2018
4 hours ago, sam said: very well put. what's a typical reference monitor setup on the sets you frequent though?
Flanders Scientific / Sony / Panasonic / etc. are some of the common ones I see on sets with budgets. On low/no-budget shoots I see almost "anything" used! Quite commonly just a TV or PC monitor.
[embedded video] This, btw, is the trailer of the film which was shot during the previous vlog from just before. It was shot on a lowly Canon Rebel DSLR, seriously.
[photo] A BTS shot of the director (it was her first ever film) and the camera (oh, and you can see my blimp in the shot too! ha).
[photo] An indoors pic (you can tell from the HDMI cord running out that the director is sitting somewhere to the right of the frame, monitoring the shot).
[photo] Me again, back when I was still using my Tascam DR680.
IronFilm Posted April 5, 2018
Dennis Hinsburg is a smart cookie, so I'd take note of what he says, and he just started a thread here about his monitors: http://www.dvxuser.com/V6/showthread.php?359829-Impressed-with-Dell-UP2716D-vs-my-Flanders-Scientific
kye Posted April 5, 2018
Thanks @Deadcode @Papiskokuji @HockeyFan12, that totally makes sense. I'd forgotten that it was RAW vs compressed codecs. @Deadcode I asked it separately as I thought the answer would go into what scenes benefited from extra bit depth or that it was me being blind. ML RAW just happened to be how I got the files, and I only included it in case I was screwing something up and not really getting the extra bits! This basically answers my question, which ultimately was about what bit-depth I should be shooting. This is especially relevant with ML because if I lower the bit depth I can raise the resolution, which I thought I would have to trade off against the gains of the extra bit-depth.
10 hours ago, HockeyFan12 said: I don't spend much time on set. I see some Panasonic monitors for directors. […]
On another forum I saw a post talking about getting new client monitors while things were on sale - they said they needed about half-a-dozen of them, and I just about choked when I read their budget was $8000 ..... each! I think I only recognised every second word in the rest of the thread - between brand names and specifications etc. It's another world!
Deadcode Posted April 5, 2018
56 minutes ago, kye said: @Deadcode I asked it separately as I thought the answer would go into what scenes benefited from extra bit depth or that it was me being blind. […]
ML RAW is important here because of the way it was implemented. When you record 10-bit raw, the camera truncates the last 4 bits of the 14-bit signal while saving the footage to the CF card. Afterwards, when you use MLV Dump to import or transform your files to DNG, the app restores the missing bits in the DNG files. So technically you work with 14-bit files in Resolve even if the footage was recorded in 10-bit, where the least significant 4 bits are noise (probably MLV Dump just adds "0000" to the end).
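In pseudo-code, that round trip would look something like the sketch below. This is assumed behaviour based on the description above, not the actual ML or mlv_dump source:

```python
# Record: keep only the top 10 of 14 bits; develop: pad the dropped bits back with zeros.
def record_10bit(sample_14bit: int) -> int:
    return sample_14bit >> 4          # low 4 bits (mostly noise) are discarded on the card

def restore_for_dng(sample_10bit: int) -> int:
    return sample_10bit << 4          # append "0000", back to the 14-bit range for the DNG

original = 0b10110111001101           # a 14-bit sensor value (11725)
print(restore_for_dng(record_10bit(original)))   # 11712: same value with the low 4 bits zeroed
```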
sam Posted April 5, 2018
7 hours ago, IronFilm said: Dennis Hinsburg is a smart cookie, so I'd take note of what he says, and he just started a thread here about his monitors: http://www.dvxuser.com/V6/showthread.php?359829-Impressed-with-Dell-UP2716D-vs-my-Flanders-Scientific
If you read that, he's using it as a GUI monitor only. FSI with a proper signal path is what Dennis uses, for those that don't click the link.
IronFilm Posted April 5, 2018
2 hours ago, kye said: On another forum I saw a post talking about getting new client monitors while things were on sale - they said they needed about half-a-dozen of them, and I just about choked when I read their budget was $8000 ..... each! I think I only recognised every second word in the rest of the thread - between brand names and specifications etc. It's another world!
Which forum was that? Yeah, in some worlds $8K/monitor is not unreasonable at all.
kye Posted April 5, 2018
11 minutes ago, IronFilm said: Which forum was that? Yeah, in some worlds $8K/monitor is not unreasonable at all.
IIRC it was lowepost.com but I just had a quick look and couldn't find it. If that isn't it then maybe LGG?
IronFilm Posted April 6, 2018
13 hours ago, kye said: lowepost.com
Ah cool, I discovered a new website! Thanks
kye Posted April 6, 2018
1 hour ago, IronFilm said: Ah cool, I discovered a new website! Thanks
Not sure if you're on the CML? https://cinematography.net I've only dipped my toes in the water a little there, but it seems to have lots of people who casually talk about RAW workflows for Red and Arri cameras. A very different world than the one I live in!
kidzrevil Posted April 7, 2018
I only shoot 14-bit. The 2-4 bits you chop off have a massive amount of data in them. The most I'll do is convert the 14-bit file to a 12-bit CDNG.
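For scale, the arithmetic on those chopped bits: 14-bit raw has 2^14 = 16,384 code values per channel, 12-bit has 4,096 and 10-bit has 1,024, so going from 14 to 10 bits collapses every 16 adjacent code values into one. Whether that lost precision is picture information or just noise depends on where the sensor's noise floor sits, as discussed earlier in the thread.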
Mark Romero 2 Posted April 7, 2018
3 hours ago, kidzrevil said: I only shoot 14-bit. The 2-4 bits you chop off have a massive amount of data in them. The most I'll do is convert the 14-bit file to a 12-bit CDNG.
What are you shooting with??? For some crazy reason I thought you shot Sony 8-bit cameras, but I guess I am wrong...
kidzrevil Posted April 7, 2018
@Mark Romero 2 yeah, I shoot with the Canon 5D Mark III as well. I use the 14-bit raw hack.