kye Posted August 15, 2022

When I got my iPhone 12 mini I didn't really look at the camera specs, but I've since discovered it can do 10-bit video and have been examining its performance. What I've found is that there seems to be potential for high-DR video that isn't being realised. I've only really researched the iPhone 12, but I think what I've found applies to all modern smartphones. Here's what I've found so far.

This video does a test between various modes on the iPhone 12 camera:

That video showed the 10-bit mode to have more subtle colours, but it lacked some of the tests I was interested in, such as dynamic range, so I did my own. My phone actually has five options rather than four, and I shot a very unscientific test in the backyard, but of a high-DR scene. I should have done it using manual settings, but I didn't, and I was comparing the default app to Filmic Pro too, and the default app doesn't have any controls, so the test was always going to be a compromise. Here are some stills from my test. I did some matching in post to compensate for levels but didn't correct WB.

Same image with the levels pushed right up to reveal the noise floor (the levels adjustment was applied to all clips identically):

Adjusted down to look at clipping levels:

And with a ridiculous curve to try to break the image and reveal any banding in the sky:

One thing I noticed was that, according to Resolve, the file from the default camera app was a 10-bit file. The iPhone marketing indicates that the app will automagically switch to whatever mode is best, so I assume it either uses 10-bit all the time or saw that I was pointing it at a high-DR scene and switched to it. I haven't followed that up, but it's worth noting. It's also worth noting that the bitrate from the default app might not be the same as the Filmic Pro one, as I had set Filmic Pro to its maximum setting.

What I took away from this (and from playing with the phone in very high-DR situations where there was huge clipping) was that the 10-bit mode has the same, or very similar, DR to the Dolby Vision and "Video HDR" modes, and that the 8-bit mode has less DR than the 10-bit (I assume the 8-bit just discards the two most-significant bits, the same way JPG images clip earlier than RAW images on digital cameras). Incidentally, I couldn't find anywhere what this "Video HDR" mode actually does - it's not mentioned online anywhere I could find.

So what's the problem? Well, where is the high dynamic range? I mean, where is the multiple-exposures high-DR? Imagine this: a smartphone can pull images off the sensor at least 120 or 240 times per second. Why not just bracket two of those frames and merge them together? Motion is a problem, but if you're talking about camera shake then you have OIS to smooth that out, and we're talking a 1/120s or 1/240s delay - that's minuscule. And if you're talking about subject movement, then that's well under the delay of any temporal noise reduction mechanism, which operates at as much as a 1/24s delay (10 times the potential delay) and exists in many high-end cameras.

Why do I care? Think about this. A smartphone has something like, what, 10 stops of DR (I'm being pessimistic here). If we ignore the bottom two stops due to noise, and we overlap a stop for a smooth transition between exposures, that would still give us video with 17 stops of DR (two 10-stop exposures, minus the two noisy bottom stops and one stop of overlap). SEVENTEEN! Hot damn would that be amazing!!
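To make the bracketing idea concrete, here's a minimal sketch of that kind of two-frame merge - Python/NumPy, with function and parameter names I've made up for illustration, and definitely not anything Apple actually ships. The brighter exposure supplies the shadows and midtones, and the darker exposure (grabbed a sensor readout or two later, several stops down) takes over as the brighter one approaches clipping:

```python
import numpy as np

def merge_bracketed(bright, dark, stops_apart=4.0, clip_start=0.85):
    """Merge two linear-light frames of the same scene into one HDR frame.

    bright: float32 array, the longer/brighter exposure (clean shadows/mids)
    dark:   float32 array, the shorter/darker exposure (unclipped highlights),
            captured `stops_apart` stops below the bright frame
    Both are assumed to be linearised to 0..1 and already motion-aligned.
    """
    gain = 2.0 ** stops_apart          # scale the dark frame up to match the bright frame
    dark_scaled = dark * gain

    # Weighting: trust the bright frame until it nears clipping (clip_start),
    # then fade over to the noisier but unclipped dark frame.
    w = np.clip((bright - clip_start) / (1.0 - clip_start), 0.0, 1.0)

    return (1.0 - w) * bright + w * dark_scaled
```

The merged frame covers roughly `stops_apart` more stops than either input; it would still need a log or HDR transfer curve squeezed onto it before encoding, which is what the next part is about.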
Now, don't get me wrong, 17 stops is a huge issue in colour grading - Rec.709 only has something like 5 stops, so trying to compress all that into an SDR output would be very difficult. But imagine if all they did was emulate the exposure curve of film, with some stops in the middle that are relatively linear and then a rolloff into the highlights and shadows. Even if they only exposed the darker exposure one or two stops darker (and thus created a 1-2 stop shadow rolloff), that would mean they could have an extended highlight rolloff like film. If the app gave you control over this contrast then you could dial in a higher-contrast look, or a lower-contrast look with more stops in the linear range. Dial the contrast right down and you'd have a flat image that could be graded nicely (a rough curve along those lines is sketched below).

The people who shoot with their phones are, if anything, more likely to be shooting outdoors in high-DR situations. Is this part of a wider trend of not spacing out DR bracketing? From what I can see, there's no multi-exposure HDR going on at all, or if there is, it's not bracketing the two images 7 stops apart. I've noticed that most dual-native-ISO cameras only have their native ISOs a few stops apart - 3, maybe 4. These are 12-stop cameras that do this - WHY? The Sigma fp is a notable exception, with its native ISOs 5 stops apart. Smartphones have a different set of strengths and weaknesses than normal cameras; this is a potential strength and a serious potential advantage - why isn't it being utilised?
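Here's the sort of curve I mean - a rough Python/NumPy sketch with invented parameters, not Apple's or Dolby's actual processing. It's roughly linear in stops through the middle, rolls off softly into both the highlights and the shadows, and has a single contrast control that trades slope around middle grey for how many stops fit into the output:

```python
import numpy as np

def film_like_curve(linear, contrast=1.0, range_stops=8.0, mid_grey=0.18):
    """Squeeze a wide scene-linear range into a 0..1 display signal.

    The centre of the curve is roughly linear in stops (slope set by
    `contrast` / `range_stops`), and tanh gives a gentle rolloff into the
    highlights and shadows instead of a hard clip.  Middle grey lands at
    0.5 here just to keep the sketch simple.
    """
    stops = np.log2(np.maximum(linear, 1e-6) / mid_grey)   # exposure relative to middle grey
    return 0.5 + 0.5 * np.tanh(2.0 * contrast * stops / range_stops)
```

Lower `contrast` (or raise `range_stops`) and you get the flat, gradeable image I described; raise it and you get a punchier look with fewer stops in the near-linear range.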
Andrew Reid Posted August 15, 2022

Excellent info. It will be interesting to compare Mcpro24fps on Android to the iPhone as well, and to compare the different recording modes in Mcpro24fps itself vs Filmic Pro on the same device. I am also writing a guide to Motion Cam raw video, which will be out soon.
Phil A Posted August 15, 2022

I still have an iPhone 11 Pro Max, but my main gripe is how over-sharpened and brittle the image looks. I don't mind shooting with super deep DoF, but it's just not a "thick" image like I get from a mirrorless camera (which are also often artificially sharpened). I watched some reviews of ProRes on YT when the 13 came out, but I didn't see a huge difference (I think many people pretended the codec would make a difference, when it's actually more the processing before it). I wish we could deactivate sharpening in Filmic Pro even further, like you can on Android. So personally, I'd rather have a less processed image that's smoother and more organic than more dynamic range at the cost of it looking hyper-digital.
webrunner5 Posted August 15, 2022

You need to go with an Android phone to get the best results. Very few of the "aftermarket" apps or programs work on an iPhone. Raw is really the way to go if you want a lot of control. But even some Android phones don't work well - Sony phones and some Samsung phones are not the thing to have. Some older 150-dollar ones work surprisingly well, but the later models are best.
kye Posted August 16, 2022

21 hours ago, Phil A said: I still have an iPhone 11 Pro Max, but my main gripe is how over-sharpened and brittle the image looks. I don't mind shooting with super deep DoF, but it's just not a "thick" image like I get from a mirrorless camera (which are also often artificially sharpened). I watched some reviews of ProRes on YT when the 13 came out, but I didn't see a huge difference (I think many people pretended the codec would make a difference, when it's actually more the processing before it). I wish we could deactivate sharpening in Filmic Pro even further, like you can on Android. So personally, I'd rather have a less processed image that's smoother and more organic than more dynamic range at the cost of it looking hyper-digital.

I also find the brittleness of the image (britality? is that a word? it feels appropriate!) to be disappointing. The RAW video from Android phones doesn't look brittle at all, and I don't know of anything about a smaller sensor that would necessitate that look. I'd be curious to see some test shots of the iPhone 13 h264 vs h265 vs ProRes for myself rather than through a YT video, but I completely agree that it's probably not the codec itself but rather the processing that happens beforehand.

One thing that @mercer mentioned was that it could be the quality of the compression done in the device. We've likely all seen that various cameras create compressed files of identical resolution/bitrate/bit-depth that vary drastically in quality. Anyone who is unaware can purchase a cheap 1080p camera from eBay and witness image quality that is almost a crime against videography, despite the same specs on paper. The C100 was notable for the opposite - it had (IIRC) ~25Mbps 1080p that was better than the 4K of a lot of competing cameras.

21 hours ago, webrunner5 said: You need to go with an Android phone to get the best results. Very few of the "aftermarket" apps or programs work on an iPhone. Raw is really the way to go if you want a lot of control. But even some Android phones don't work well - Sony phones and some Samsung phones are not the thing to have. Some older 150-dollar ones work surprisingly well, but the later models are best.

I'm still planning to experiment further. I didn't notice any quality differences from the iPhone 12 default camera app in the test I posted above, but I have also shot clips with the default app that looked rubbish, so I don't think I've really stress-tested the various codecs. Right now I'm torn between a few different projects, but I plan to put more effort into this one.
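As a side note, for anyone wanting to check what the various apps and modes actually record (bit depth, codec, bitrate, transfer curve) rather than trusting Resolve or a YouTube re-encode, something like the following works. It's just Python shelling out to ffprobe, so it assumes ffmpeg is installed, and the filename is a placeholder:

```python
import json
import subprocess

def video_stream_info(path):
    """Return codec, pixel format, bitrate etc. of the first video stream,
    using ffprobe's JSON output (requires ffmpeg/ffprobe on the PATH)."""
    cmd = [
        "ffprobe", "-v", "error",
        "-select_streams", "v:0",
        "-show_entries", "stream=codec_name,pix_fmt,bit_rate,width,height,color_transfer",
        "-of", "json",
        path,
    ]
    out = subprocess.run(cmd, capture_output=True, text=True, check=True).stdout
    return json.loads(out)["streams"][0]

# A pix_fmt ending in "10le" (e.g. yuv420p10le) indicates a 10-bit file, and a
# color_transfer of "arib-std-b67" (HLG) or "smpte2084" (PQ) points to an HDR mode.
# print(video_stream_info("IMG_0123.MOV"))   # placeholder filename
```

That would also be an easy way to confirm whether the default app really did write a 10-bit file, as Resolve suggested.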