Charlie Posted April 5, 2017 Hey all, I'm shooting a new short film on Saturday and it will be shot on the GH5. After looking around the web, I have decided to stick to 8-bit. The reason is that my current PC monitor and graphics card do not support 10-bit colour, and it is also unlikely that the places that will screen the film will have 10-bit systems either. Furthermore, the current data rate on the GH5 is not all that high, so based on all of this it seems pointless to shoot in 10-bit. Unless you have a 10-bit system, you cannot see the difference anyway, right? What do you all think?
zerocool22 Posted April 5, 2017 Hey, I think you should shoot in 10bit as you will not run into as much banding + you have more grading options. Cheers
Charlie Posted April 5, 2017 2 minutes ago, zerocool22 said: Hey, I think you should shoot in 10bit as you will not run into as much banding + you have more grading options. Cheers Is there a significant increase in file size, do you know?
zerocool22 Posted April 5, 2017 I would suspect so, but I do not have a GH5 to check. Check the bitrates in camera and compare.
Geoff CB Posted April 5, 2017 12 minutes ago, Charlie said: Hey all, I'm shooting a new short film on Saturday and it will be shot on the GH5... Shoot in 10-bit, you will have more leeway in grading like zerocool22 mentions. You will need to use proxies to edit, however.
Charlie Posted April 5, 2017 1 minute ago, Geoff CB said: Shoot in 10-bit, you will have more leeway in grading like zerocool22 mentions. You will need to use proxies to edit, however. For some of the slow mo shots I will have to shoot 8-bit 1080p. Are there any problems mixing 10-bit and 8-bit files on the same timeline in Premiere?
Geoff CB Posted April 5, 2017 Just now, Charlie said: For some of the slow mo shots I will have to shoot 8-bit 1080p. Are there any problems mixing 10-bit and 8-bit files on the same timeline in Premiere? Besides a slight image quality drop on the slow mo, I don't think so.
Charlie Posted April 5, 2017 9 minutes ago, Geoff CB said: Besides a slight image quality drop on the slow mo, I don't think so. Thanks. If I am going to have to shoot some 1080p, I am going to stick with UHD 4K, as it is the same aspect ratio; Cinema 4K obviously is not. Variable frame rate is not available in UHD 4K in 10-bit, though!
TheRenaissanceMan Posted April 5, 2017 It is better to shoot and process in the highest bit depth you can, then export to 8-bit or whatever you're delivering to at the end. That'll guarantee the best possible image quality for all your delivery formats.
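The advice above can be sketched numerically: quantizing to 8 bits after every grading step throws away tonal levels that a high-bit-depth working space keeps until the final export. A minimal NumPy illustration (the darken-then-brighten "grade" is a made-up stand-in for a real correction, not any specific tool's pipeline):

```python
import numpy as np

# A 0..1 ramp standing in for a smooth gradient in a frame.
ramp = np.linspace(0.0, 1.0, 4096)

def grade(x):    # darken by one stop
    return np.clip(x * 0.5, 0, 1)

def ungrade(x):  # brighten back by one stop
    return np.clip(x * 2.0, 0, 1)

# Low-bit-depth pipeline: re-quantize to 8 bits after every step.
q8 = np.round(ramp * 255) / 255
q8 = np.round(grade(q8) * 255) / 255
q8 = np.round(ungrade(q8) * 255) / 255

# High-bit-depth pipeline: stay in float, quantize to 8 bits once at export.
hi = np.round(ungrade(grade(ramp)) * 255) / 255

print(len(np.unique(q8)), len(np.unique(hi)))
```

The per-step 8-bit pipeline ends up with roughly half the distinct levels of the pipeline that processes in float and quantizes once at the end, which is exactly the "process high, deliver 8-bit" argument.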
Charlie Posted April 5, 2017 4 minutes ago, TheRenaissanceMan said: It is better to shoot and process in the highest bit depth you can, then export to 8-bit or whatever you're delivering to at the end. That'll guarantee the best possible image quality for all your delivery formats. Thanks a lot - the GH5 is a big step up for me and a big learning curve, but I have some great projects this year so I've gotta wrap my head around it!
dbp Posted April 5, 2017 Even on an 8-bit monitor, you will see the differences in gradients and transitions between colors. There have been tests out there that demonstrate this with the GH5, no pixel peeping required.
Charlie Posted April 5, 2017 Ok, the consensus seems to be to stick with 10-bit, so that's what I'm gonna do! Thanks for the replies, and as soon as the film is finished I'll post a link to it!
Orangenz Posted April 6, 2017 8 hours ago, Charlie said: Is there a significant increase in file size, do you know? The recorded bit rate is the same, 150Mbps, so the files won't be any bigger. Some of the editors need an update to work smoothly with the new codecs, but that seems a separate issue from 10-bit vs 8-bit.
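Since file size follows bit rate rather than bit depth, the point above checks out with quick arithmetic (assuming the quoted 150 Mbps constant video rate and ignoring audio and container overhead; the ten-minute runtime is just an example):

```python
# Approximate recording size from a constant video bit rate.
bitrate_mbps = 150   # same figure quoted for the 8-bit and 10-bit modes
minutes = 10         # e.g. a ten-minute short

# Megabits per second -> megabytes per second, times total seconds.
megabytes = bitrate_mbps / 8 * 60 * minutes
print(megabytes)     # 11250.0 MB, i.e. ~11.25 GB in either mode
```

At the same bit rate, the 10-bit file simply spends its bits differently; it does not get larger.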
HockeyFan12 Posted April 6, 2017 If you're shooting in V-Log, definitely shoot 10-bit. 8-bit is (generally) enough for (small gamut) display, but if you plan to grade or do any VFX, 10-bit capture can make a big difference because you'll be stretching the image before it's displayed. If you don't plan to grade or color correct at all, 8-bit is fine, but that means limiting yourself to not even fixing the exposure. Given the option, I'd choose 10-bit every time (and I wouldn't choose 4K or RAW most of the time, so in terms of priority it's pretty high). 8-bit is the minimum that works as a display standard, but the capture standard must be bigger. Film has 14+ stops of dynamic range; most images use 8-10 of them at most. Capture and display are different things. I hear the GH5 is amazing! Hope it goes well either way. If you're not shooting log, either should be good, but I would still shoot 10-bit. FWIW, I work on 8-bit displays often, but 99% of the time I render in 16-bit, and the majority of the footage I work with is 10-bit or higher. The display is your limiting factor only after grading... in the grade, you will see far more detail in the 10-bit image as soon as you make an adjustment or introduce a transform LUT. Even on an 8-bit display.
Shirozina Posted April 6, 2017 Any links to tests that show 10-bit vs 8-bit at this low bitrate? I'd be very wary of shooting a flat log profile at such a low bitrate, as problems with banding will be compression artefacts and not bit depth artefacts.
HockeyFan12 Posted April 6, 2017 12 hours ago, HockeyFan12 said: If you're shooting in V Log definitely shoot 10bit... I hear the GH5 is amazing! Someone who tried a pre-production model posted here and said it was easily the best camera under $10k. Wow!