Prove me wrong... 10bit is a load of B****cks


Andrew Reid
  • Administrators

I just remembered this from a while ago on the Sony a7s iii...

https://www.eoshd.com/news/sony-a7s-iii-10bit-image-quality-vs-same-camera-in-8bit-with-surprising-results/

I think it's still relevant now.

Can someone prove me wrong and actually show with some examples that are not TOO extreme in terms of a silly grade that you wouldn't use in real life?


Depends on the specific flavour of codec. The first 10bit codec on the S1 was the only 10bit option before the V-Log update: 75 Mbps 4:2:0 H.265 for HLG at UHD resolution. That mode provided nice colors and dynamic range, but it fell apart into macroblocking Tetris on homogeneously colored surfaces like a blue sky. The 8bit 100 Mbps H.264 in Rec.709 outperformed it big time in that regard. The later 10bit 150 Mbps 4:2:2 mode ran circles around both.
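To put those bitrates in perspective, here is a rough bits-per-pixel sketch for the three modes. This is only an illustration: I'm assuming UHD at 25p (frame rates aren't stated above), and H.265 compresses more efficiently per bit than H.264, so the numbers aren't directly comparable across codecs.

```python
# Rough average coded bits available per pixel per frame for the
# S1 modes mentioned above. Assumes UHD (3840x2160) at 25 fps;
# the frame rate is my assumption, not stated in the post.

def bits_per_pixel(bitrate_mbps, width, height, fps):
    """Average coded bits per pixel per frame at a given bitrate."""
    return bitrate_mbps * 1e6 / (width * height * fps)

modes = {
    "10bit 4:2:0 H.265 HLG (75 Mbps)":  75,
    "8bit 4:2:0 H.264 709 (100 Mbps)": 100,
    "10bit 4:2:2 H.265 (150 Mbps)":    150,
}

for name, mbps in modes.items():
    print(f"{name}: {bits_per_pixel(mbps, 3840, 2160, 25):.2f} bits/pixel")
```

The point being that the first HLG mode had both more data to describe per sample (10bit) and fewer bits per pixel to do it with, which is consistent with it macroblocking first.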

The 8bit 4K mode on the A7S II was easily bettered by a GX80 in terms of color, tonality, and lack of artefacts in Rec.709.

The 10bit 4:4:4 HD on the Canon C300 II produced the worst artefact I've ever seen in low light: it completely mushed away a person's head in a few of the frames. Crazy and hilarious. UHD has problems too; Cine 4K seems best in my experience. Some people prefer to record externally on the Mark II.

ProRes 4444 on the URSA 4.6K couldn't rescue IR-polluted footage.

A good 8bit codec is no slouch, whereas certain 10bit codecs can exhibit serious flaws.


  • Administrators

Some 10bit codecs are worse than the best 8bit stuff, you're right.

MJPEG on the 1D C is still a thick, chonky image... in 8bit.

The compression quality, macroblocking, noise reduction and DSP are all more important than whether it is 10bit or not, in my view!


I think Star Wars Episode II was shot in 8bit on a three-chip 2/3" CCD HD camera (the Sony HDW-F900). The 10bit codecs and pipeline on the S5, S1 and S1H are pretty impressive and would not make me yearn for 8bit. Still, I have been impressed by Rec.709 glory in 8bit 4:2:0 color on the tiny Lumix LX10 when using 4K. It's a camera that needs to be graded, though; the out-of-camera colors were a shock to me when I saw them for the first time. I would love 4K 10bit 150 Mbps HLG in a successor though, that's for sure.

I still enjoy browsing 1D C videos. :)

18 minutes ago, Andrew Reid said:

MJPEG on the 1D C is still a thick, chonky image... in 8bit.

 


6 hours ago, Andrew Reid said:

I just remembered this from a while ago on the Sony a7s iii...

https://www.eoshd.com/news/sony-a7s-iii-10bit-image-quality-vs-same-camera-in-8bit-with-surprising-results/

I think it's still relevant now.

Can someone prove me wrong and actually show with some examples that are not TOO extreme in terms of a silly grade that you wouldn't use in real life?

IMO 10bit and 8bit are identical in perfect lighting conditions. Even when color grading, the two look identical to me, since few people are going to grade to such an extreme that the difference matters.

The key words in my statement above, though, are "perfect lighting conditions". Where 10bit shines (when paired with a good codec) and far outperforms 8bit is when the lighting conditions are not so perfect.

For event work I do not usually have time to get the exposure or white balance perfect. At events, people couldn't care less about mixed lighting, or the fact that the sun just went behind a cloud while they are posing, or that I just came from a brightly lit area into heavy shade; they just want their five seconds in front of the camera, then to get on with their day.

What I have found in those scenarios is that 10bit combined with LOG is far more forgiving than 8bit. 8bit falls apart very quickly if you start changing the WB or tint, or bring up the mids or shadows: it starts adding color shifts and banding, the highlight rolloff gets harsh, and skin tones are the first to suffer; whereas modern 10bit feels almost as pliable as raw footage.

I used to think 10bit was overkill until I went from the 8bit GH5 to the 10bit S5. The latitude I had to fix marginal footage was eye-opening. Also surprising to me: I then went from 10bit to raw out of the C70 and R5, and raw was not some magical step up. Raw seemed only slightly more pliable than 10bit, but with massive file sizes; if anything, I might even say raw is the actual load of b***** strictly from a color grading/exposure-fixing standpoint. Obviously raw comes with other benefits, especially on certain Canon cameras depending on the sensor, but I would say the difference between the raw I have seen and 10bit is smaller than the difference between 8bit and 10bit when you need to fix a marginal clip.
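The latitude point can be roughed out with a toy example. The sketch below quantizes a smooth dark gradient to 8 and 10 bits, applies a ~2-stop exposure push, and counts how many distinct tonal steps survive; fewer steps means more visible banding. This is purely illustrative: real footage also involves log curves, noise, and codec compression, which all change the picture considerably.

```python
# Toy illustration of grading latitude vs bit depth (my own sketch,
# not a measurement from any camera). Quantize a dark quarter of the
# signal range to N bits, push it ~2 stops (x4), and count the
# distinct code values that remain.

def quantize(x, bits):
    """Round a 0..1 value to the nearest N-bit code value."""
    levels = (1 << bits) - 1
    return round(x * levels) / levels

def surviving_levels(bits, samples=100_000):
    """Distinct tonal steps left in the darkest quarter after a 4x push."""
    ramp = [i * 0.25 / (samples - 1) for i in range(samples)]
    pushed = {min(quantize(v, bits) * 4.0, 1.0) for v in ramp}
    return len(pushed)

for bits in (8, 10):
    print(f"{bits}-bit: {surviving_levels(bits)} distinct levels after the push")
```

In this toy model the 10bit signal keeps roughly four times as many steps in the pushed shadows, which lines up with 8bit banding appearing first when you lift the mids or shadows.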

 

6 hours ago, Andrew Reid said:

Can someone prove me wrong and actually show with some examples that are not TOO extreme in terms of a silly grade that you wouldn't use in real life?

I don't do charts and graphs, let alone have time to shoot the exact same scene in both 8bit and 10bit, but I do have over 15 years of shooting events and working in the fashion industry, where they are very picky about their skin tones, and I can tell you from firsthand experience that I would never want to go back to 8bit.

One thing I do wonder about sometimes is whether 10bit 4:2:0 vs 10bit 4:2:2 matters. I questioned that difference a lot when my previous editing workstation could not smoothly edit 4:2:2 out of the Canons, since NVIDIA cannot hardware-accelerate H.265 4:2:2 decoding; but these days, after upgrading to a Quick Sync-capable CPU, I don't notice or care anymore.
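For anyone wondering what the 4:2:0 vs 4:2:2 difference means in raw numbers, here's a quick sketch of chroma sample counts per frame. This only shows stored chroma resolution; it says nothing about how visible the difference is after upscaling and compression.

```python
# Chroma sample counts per frame for common subsampling schemes.
# 4:2:2 halves chroma horizontally; 4:2:0 halves it both ways.

def chroma_samples(width, height, scheme):
    """Total Cb+Cr samples stored per frame for a subsampling scheme."""
    factors = {"4:4:4": (1, 1), "4:2:2": (2, 1), "4:2:0": (2, 2)}
    h_div, v_div = factors[scheme]
    # Two chroma planes (Cb and Cr), each subsampled by h_div x v_div.
    return 2 * (width // h_div) * (height // v_div)

w, h = 3840, 2160  # UHD
for scheme in ("4:4:4", "4:2:2", "4:2:0"):
    share = chroma_samples(w, h, scheme) / chroma_samples(w, h, "4:4:4")
    print(f"{scheme}: {chroma_samples(w, h, scheme):,} chroma samples ({share:.0%} of 4:4:4)")
```

So 4:2:2 stores twice the chroma of 4:2:0, which mostly shows up on fine colored edges and when keying, rather than in ordinary viewing.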


Oh, this is a good chance to test some lenses I just got. I haven't done many format comparisons on my S5 yet.

As you show in your blog post, there's a decent difference between 10 and 8 bit at equivalent total bitrate, in the form of color splotchiness/posterization. To me it's pretty visible under normal viewing conditions (which, in my case, means a fairly nice 4K PC screen with controlled ambient lighting). I would consider your example shots of the sky to show a significant difference in overall effect. That said, I wouldn't be upset using 8 bit, but since there's no downside to 10 bit, I can't imagine picking 8 bit instead, no matter how small the difference.

Something to keep in mind is that very few of us ever see anything as actual 10 bit. On Windows, even Resolve only outputs 10 bit with a Blackmagic card hooked up to a dedicated monitor, so even if your screen is 10 bit, you need to do some homework to figure out which software will output that kind of signal. And obviously most image formats are 8 bit, so as soon as you take a screenshot and put it on the web as a comparison, it's all 8 bit. There is of course HEIF in 10 bit, and then there are flavors of PNG and TIFF in 16 bit.
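The screenshot point can be shown directly: converting 10-bit code values to 8-bit collapses every four neighboring levels into one, so a subtle 10-bit gradient loses most of its steps. A minimal sketch in pure integer math (ignoring dithering, which good converters use to hide exactly this):

```python
# When a 10-bit frame (code values 0-1023) is saved as a standard
# 8-bit PNG screenshot, values are reduced to 0-255. The simplest
# conversion is a right-shift by 2; good converters dither instead.

def to_8bit(code10):
    return code10 >> 2  # truncate the two least significant bits

sky = list(range(520, 536))            # a gentle 16-step 10-bit gradient
converted = [to_8bit(v) for v in sky]  # the same gradient after screenshot

print("10-bit steps:", len(set(sky)))        # 16
print("8-bit steps: ", len(set(converted)))  # 4
```

So any 8-bit web comparison can only show differences the encoder baked in (splotches, macroblocks), not the extra tonal resolution itself.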


On the S5, the difference between 8 and 10 bit is very obvious. I shot a scene twice, once in 8 bit at 100 Mbps (93 Mbps actual), and once in 10 bit at 75 Mbps (66 Mbps actual). Both clips have a color space transformation to Rec.709, but no other adjustments. The amount of color noise and weird color in 8 bit is very high, and it is almost nonexistent in 10 bit. Viewed at 100% scale, in motion, in my normal viewing conditions, it's apparent that "something is off", particularly in skin tones (I did tests facing the camera too, which I'm not posting). To me, this is part of the "thick" color that people talk about: pure tones without posterized splotches of color noise, which gets worse with saturation.

Now clearly this is something to do with the encoder, since the output images that I am posting here are both 8 bit PNGs. So I'm not going to make universal statements, but I will say that on every camera that I have ever used, I get nearly the same result.

[attached image: 10bit.png]

[attached image: 8bit.png]

 

You can see the difference everywhere in the image, but particularly notice the evenness of the skin on my arm, the black of the t-shirt, the plain green wall, and the depth of the brown in the darker parts of the piano's wood. Is this enough of a difference for average audience members to notice, even subconsciously? Probably not. However, I do notice it consciously, and considering the 10 bit file looks better AND has a 25% smaller file size, I would never choose 8 bit on this camera.

[attached image: arm.png]


48 minutes ago, maxJ4380 said:

You had a lot of nice things to say about the codecs and other stuff in the "Panasonic GH5 Review and exclusive first look at Version 2.0 firmware" on EOSHD.com.

Has anything changed? Because I'm almost sold on a GH5.

 

I would go for a GH5 Mark II. I have both, and the Mark II has the latest hardware and firmware.


Actually, 8 bit cams have a distinct advantage. Using the original C300 as an example, its default Wide DR and EOS Standard picture profiles are very usable. Canon has to offer its best to make buyers happy with its 8 bit codec. I am very happy with these two profiles; they're good enough for almost all of my needs.


11 hours ago, zlfan said:

Actually, 8 bit cams have a distinct advantage. Using the original C300 as an example, its default Wide DR and EOS Standard picture profiles are very usable. Canon has to offer its best to make buyers happy with its 8 bit codec. I am very happy with these two profiles; they're good enough for almost all of my needs.

Feel free to post examples to back up what you say.


On 10/7/2024 at 9:54 PM, KnightsFan said:

Feel free to post examples to back up what you say.

Take what I said as you like.

My strategy now is to use Canon Cinema line cams for events and street work, i.e., people- and face-related scenarios. I don't use the handle or the XLR/monitor unit, so the ergonomics are nice for my needs. I use Sony cams like the F3, FS100, FS700, F5 and FS7 for landscape. Sorry, Sony fanboys; it is what it is. The AF100 is difficult to use; I can only use it for events, and it is not as good as the Canon cams for skin tones, though it is good for pattern-related situations. The original C300 sometimes has aliasing and moiré artefacts. I use my REDs and Magic Lantern cams for heavy post.


For 8 bit cams, it's better to use the in-camera picture profiles. Even good 8 bit cams like the original C300 cannot take much post with their 8 bit 4:2:2 codec. If you like post, then choose other suitable cams like Blackmagic, RED, Magic Lantern, etc.


I assume the old Canon 5D Mark II was 8-bit, right? It was used to shoot this (which was actually a clothing advert masquerading as a short film; the trailer was released and the other segments are available on Vimeo, but no full-length film was ever released):

 

