Andrew Reid (Administrator, Author) Posted April 4, 2014
I don't think that video is relevant. On DSLRs with 1080p, the sensor gives a weak signal to the image processor. It is a signal that is heavily chopped and sliced before it even gets to the encoder and is turned into 8bit. On the GH4 the sensor gives a very strong signal to the image processor: a 1:1 pixel readout, debayered to 10bit 4:2:2. That is a lot of data for the encoder to work with, and a LOT more data than 1080p, around 4x more. So working with 4x more data in your grade is going to be a bit different to working with the usual 8bit 1080p from DSLRs. That it is still 8bit is almost beside the point: it is still 4x the data of 1080p, from a stronger sensor signal, and the whole internal imaging pipeline in the GH4 is 10bit 4:2:2. When you pack 4x the data into a 1080p file using your computer, you are throwing a lot of processing power at it. I can already see for myself how well the 4K data grades and how nice the 2K looks when oversampled from 4x the usual amount of data. I must admit I do not understand the maths that well. I am not a mathematician but a filmmaker, so others may or may not be right on the maths. In the end I don't care; I have a pair of eyes and that is what counts :)
sunyata Posted April 4, 2014
Hey Andrew, basically if you take a grayscale gradient and you capture it, resample it, whatever, from 10bit to 8bit, you wind up with 256 shades of gray (wow, just thought of a great title for a book), and if you push the crap out of that you will see banding, no matter what the resolution is. With a camera there is Bayer pattern grain, plus algorithms and lots of interesting IP, that can make it look much better and conceal some of the obvious artifacts, but you're still stuck with the math: 256 colors per channel at 8bit. And lower quality than that once you factor in chroma subsampling.
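To make the banding point concrete, here is a minimal NumPy sketch (my own illustration, not anything from the camera files): quantise a smooth ramp to 8-bit, push it hard the way a heavy grade would, and only a fraction of the 256 code values are left covering the visible range.

```python
import numpy as np

# A smooth ramp quantised to 8-bit: only 256 possible code values remain.
ramp = np.linspace(0.0, 1.0, 4096)        # the "ideal" gradient
ramp_8bit = np.round(ramp * 255) / 255    # what an 8-bit file can actually store

# Push the shadows 4x, as an aggressive grade would.
graded = np.clip(ramp_8bit * 4.0, 0.0, 1.0)

# Only the codes below 0.25 survive the push, so the visible range is now
# covered by ~64 levels instead of 256 - the gaps between them are banding.
print("levels covering the pushed range:", len(np.unique(graded[graded < 1.0])))
```

Resolution does not enter into it, which is sunyata's point: the same 64 levels would band at 1080p or at 4K, until you start averaging neighbouring pixels, which is where the downscaling argument below comes in.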
Thomas Worth Posted April 4, 2014
Also, I don't know if this happened during transcoding or not (which is why you need to post the original files from the camera, please), but these are clearly upscaling artifacts.
sunyata Posted April 4, 2014
The files for me read as 4:2:2 10bit, but I thought it might be a Linux thing. Anyway, I've seen enough to know the footage looks great. Unless something unexpected comes out at NAB, I'm getting one... but will need an HDMI capture solution.
Andrew Reid (Administrator, Author) Posted April 4, 2014
That's on the original files, Thomas. A 4:2:0 artefact, I think. This is the kind of thing you avoid when:
A - you record via the 10bit 4:2:2 HDMI output, or
B - you downsample to 2K ProRes 4444
Andrew Reid (Administrator, Author) Posted April 4, 2014
sunyata wrote: "Basically if you take a grayscale gradient and you capture, resample, whatever, from 10bit to 8bit, you wind up with 256 shades of gray (wow, just thought of a great title for a book), and if you push the crap out of that, you will see banding, no matter what the resolution is."
In my experience, bit depth is more important than most of the other factors.
sunyata wrote: "But with a camera, there is Bayer pattern grain, algorithms and lots of interesting IP going on that can make it look much better, but you're still stuck with the math: 256 colors per channel at 8bit."
Indeed, which is why when you downscale 4K to 2K and pack it into a file which can handle greater precision, you get smoother gradients with finer steps in between, and they respond better when you push them around in the grade.
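A rough sketch of the mechanism being argued over here, assuming (as is usually true of real footage) that neighbouring 4K pixels differ slightly through noise, dither or fine detail: averaging each 2x2 block in floating point lands on values between the original 8-bit codes, so a 2K result stored at higher precision really does have finer steps than the 8-bit 4K source. The frame size and noise level below are invented for illustration.

```python
import numpy as np

# A made-up 8-bit "4K" luma plane: a gentle gradient plus a little sensor
# noise, quantised to the 8-bit codes the camera file would contain.
h, w = 2160, 3840
gradient = np.tile(np.linspace(0.2, 0.3, w), (h, 1))
src_8bit = np.clip(np.round((gradient + np.random.normal(0, 0.004, (h, w))) * 255), 0, 255)

# "Downscale to 2K": average each 2x2 block in floating point and keep the
# extra precision instead of re-quantising back to 8-bit.
downscaled = src_8bit.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

# The averages land on quarter-steps between the 8-bit codes, so the 2K
# image has roughly 4x as many distinct levels across the same gradient.
print("distinct levels in the 4K 8-bit source:", len(np.unique(src_8bit)))
print("distinct levels after the 2K float downscale:", len(np.unique(downscaled)))
```

Whether that extra precision survives into the delivered file then depends on the container and on which planes actually have four samples to average, which is exactly the disagreement that follows.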
sunyata Posted April 4, 2014
Okay, I'm not going to belabor the issue (too late).. it's an awesome camera. Thanks for the footage!
Thomas Worth Posted April 4, 2014
Andrew wrote: "That's on the original files, Thomas. A 4:2:0 artefact, I think. [...]"
Let's test this to be absolutely certain. If it is indeed an artifact of the color sampling, then the artifact will not be present in the Y channel. This can be verified by transcoding the original with 5DtoRGB using the "None" setting for the decoding matrix, which will show Y, Cb and Cr as R, G and B in the output file. Any way you can post the originals? I'd like to do some experimenting over here. :)
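The same check can be done by hand if you can get a frame out as raw planar YUV (how 5DtoRGB handles it internally is its own business; this is just an equivalent inspection, and the filename is a placeholder, with the frame size taken from the C4K files discussed later in the thread):

```python
import numpy as np

# One raw planar 4:2:0 frame: a full-resolution Y plane followed by
# quarter-resolution Cb and Cr planes. "frame_4k.yuv" is a placeholder.
W, H = 4096, 2160
raw = np.fromfile("frame_4k.yuv", dtype=np.uint8, count=W * H * 3 // 2)

y  = raw[:W * H].reshape(H, W)                              # full-res luma
cb = raw[W * H:W * H + W * H // 4].reshape(H // 2, W // 2)  # quarter-res chroma
cr = raw[W * H + W * H // 4:].reshape(H // 2, W // 2)

# Inspect each plane separately (save them as grayscale images, or crop the
# suspect region). Blockiness that appears in cb/cr but not in the same
# region of y points to the 4:2:0 chroma sampling; blockiness in y as well
# points to scaling or compression instead.
for name, plane in (("Y", y), ("Cb", cb), ("Cr", cr)):
    print(name, plane.shape, int(plane.min()), int(plane.max()))
```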
Andrew Reid (Administrator, Author) Posted April 4, 2014
In private, as it's a pre-production model. Thomas, can you email me? I'll respond later as it is 3am here now :)
jcs Posted April 4, 2014
Andrew wrote: "Indeed, which is why when you downscale 4K to 2K and pack it into a file which can handle greater precision, you get smoother gradients with finer steps in between, and they respond better when you push them around in the grade."
Sorry Andrew, the 4K 420 => 2K 444 math gives 8.67 bits per pixel (not 10 bits), which doesn't support any significant extra color depth. Here's how you can prove it to yourself: grade something in 4K that starts to break up due to limited color depth, then downsample to 2K. See any improvement? 2K 444 at 8.67 bits is still very nice at this price point. Since the camera supports 10-bit output, perhaps a future firmware upgrade will support 10-bit H.264 (supported by the spec and used in Sony's XAVC). It might not happen for a while due to upline cameras such as the S35 Varicam, but it should be possible.
Andrew Reid (Administrator, Author) Posted April 4, 2014
"Grade something in 4K that starts to break up" is the wrong way to think about it. The math is backed up by David Newman, the CTO at CineForm.
Thomas Worth Posted April 4, 2014
jcs wrote: "Sorry Andrew, the 4K 420 => 2K 444 math gives 8.67 bits per pixel (not 10 bits), which doesn't support any significant extra color depth. [...]"
The 10 bit figure is achieved by summing the values of four 8 bit pixels, which automatically downsamples to 1/4 the resolution as a result. This requires special image processing designed for this exact purpose, and it is most likely not being done by Compressor, etc.
jcs Posted April 4, 2014
Let's review the math. 10 bits is 2 more bits than 8, and 2^2 = 4, so adding four 8-bit values together gives us a 10-bit result. If we add up the four values and then divide by 4, averaging them, we still keep the extra information in the fraction. So if we average the values together in floating point, we've achieved the same effect. This is effectively what an NLE does when rescaling, so we don't need any special extra processing in an NLE that works in 32-bit float. 4:2:0 AVCHD (H.264) is YUV. If we scale 4K YUV 4:2:0 to 2K 4:4:4 YUV, only Y is at full resolution, so only Y gets the benefit of the four-value summing and the additional fractional bit depth. Luminance is more important than chrominance, so that's not so bad. Best case, we have 10 bits of information in Y and 8 bits each for U and V, which is (10+8+8)/3 = 8.67 bits per pixel. If the NLE does the scaling after converting YUV to RGB, it's still 8.67 bits of information per pixel at best (no new information is added during the transformation). This is why we won't see a significant improvement in color depth. Here's a challenge: create an example that shows the "10-bit" effect is significant (I agree it's there, but at 8.67 actual bits it will be hard to see in most cases).
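For anyone skimming the thread, jcs's 8.67 figure is just this accounting written out; it is only the arithmetic from his post, not a measurement of the actual GH4 files.

```python
# Averaging four 8-bit samples in floating point lands on a quarter-step
# grid, i.e. roughly two extra bits of precision, but only for a plane that
# actually has four source samples per output pixel.
y_bits = 8 + 2      # Y is full resolution in 4:2:0, so a 4K -> 2K scale gains

# Cb and Cr in 4:2:0 are already stored at half resolution in each axis, so
# after a 4K -> 2K downscale there is only one chroma sample per output
# pixel and nothing extra to average together.
cb_bits = 8
cr_bits = 8

print((y_bits + cb_bits + cr_bits) / 3)   # 8.666..., the "8.67 bits/pixel" figure
```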
sunyata Posted April 4, 2014
Andrew, is what you posted yesterday on the copy site directly from the camera? Thanks
jcs Posted April 4, 2014
The 4K ProRes files are 10-bit 4:2:2 after transcoding from the original 8-bit 4:2:0. However, there's still only 8-bit 4:2:0 worth of color information inside those 10-bit 4:2:2 files.
sunyata Posted April 4, 2014
Nevermind, you're sleeping.
Thomas Worth Posted April 4, 2014
jcs wrote: "The 4K ProRes files are 10-bit 4:2:2 after transcoding from the original 8-bit 4:2:0. However, there's still only 8-bit 4:2:0 worth of color information inside those 10-bit 4:2:2 files."
That's correct, which is why the results so far aren't showing any benefit. Testing should be put on hold until we have the camera originals and can process them specifically for ~10 bit output.
sunyata Posted April 4, 2014
jcs wrote: "The ProRes files are 10-bit 422 after transcoding from the original 8-bit 420. However, there's only (at best) 8.67-bits of color information in those 10-bit files."
Right, I follow. The ffprobe on Andrew's files:
Video: prores (apcs / 0x73637061), yuv422p10le, 4096x2160, 339440 kb/s, SAR 4096:4096 DAR 256:135, 24 fps, 24 tbr, 24 tbn, 24 tbc
Pascal Garnier Posted April 4, 2014
Anyone have an idea if the following workflow would degrade the footage: first use 5DtoRGB to convert the 4K footage to 2K ProRes 444, then start editing and grading. This workflow would probably be the easiest way for people who are outputting 2K anyway and are using a less recent editing bay.
Thomas Worth Posted April 4, 2014
Pascal Garnier wrote: "Anyone have an idea if the following workflow would degrade the footage: first use 5DtoRGB to convert the 4K footage to 2K ProRes 444, then start editing and grading. [...]"
5DtoRGB isn't set up to do this at the moment, but I'm looking into adding this capability.