KnightsFan Posted December 15, 2018

9 minutes ago, sanveer said:
Whoa. That was a long and wide compilation. Maybe I'll check it tomorrow sometime. Maybe if you added the date to the tests it would help one understand whether they were conducted around the same time, under different profiles, or over different periods of time when their methodologies and conclusions varied.

The article date is in the leftmost column. I think they often reuse old test numbers (e.g. on the Fuji page, they include existing numbers from Sony for comparison, rather than re-testing the Sony).

15 minutes ago, sanveer said:
I read one of the tests which showed the A7s or A7s as having 14 stops. That couldn't possibly have been the improvement from downressing it from 4k to 1080p.

I didn't find any Sony cameras where they claimed 14 stops, but I would not be surprised if they did in some very old articles. In this article (https://***URL not allowed***/lab-review-sony-a5100-video-dynamic-range-power/) from 2014, they mention a lot of changes to their testing in various updates. The a5100 was "updated" from 13 to 10.5 stops. If you find any, I'll add them to the list anyway.

At the bottom of this 2014 article (https://***URL not allowed***/dynamic-range-sony-a7s-vs-arri-amira-canon-c300-5d-mark-iii-1dc-panasonic-gh4/) it says: "Note: We have at one point in 2014 updated our dynamic range evaluation scale to better represent usable dynamic range among all tested cameras. This does not affect the relation of usable dynamic range between cameras." So I would be cautious comparing numbers from pre-2014 with modern numbers.
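The downressing question in the quote above has a simple statistical core: averaging neighboring pixels does reduce noise, but a 4K-to-1080p downscale only buys roughly one stop of shadow SNR, nowhere near enough to explain a 14-stop figure. A toy simulation (all numbers made up; assumes independent Gaussian read noise on a flat patch):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy flat-gray patch with additive Gaussian read noise (arbitrary units).
signal = 100.0
noise_sigma = 10.0
patch = signal + rng.normal(0.0, noise_sigma, size=(400, 400))

# 2x downscale (the 4K -> 1080p case) by averaging each 2x2 pixel block.
patch_down = patch.reshape(200, 2, 200, 2).mean(axis=(1, 3))

snr_full = signal / patch.std()
snr_down = signal / patch_down.std()

# Averaging 4 independent samples halves the noise std, so SNR roughly
# doubles: about +1 stop in the shadows, not several stops.
print(round(snr_down / snr_full, 2))  # roughly 2.0
```

Real sensor noise isn't fully independent between neighbors (debayering correlates it), so in practice the gain is usually a bit under one stop.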
thebrothersthre3 Posted December 15, 2018

2 hours ago, KnightsFan said:
I didn't find any Sony cameras where they claimed 14 stops, but I would not be surprised if they did in some very old articles. [...] So I would be cautious comparing numbers from pre-2014 with modern numbers.

Yeah, they did here, lol. Not the Cinema5D website, but they use one of their charts: https://nofilmschool.com/2014/07/sony-a7s-dynamic-range-arri-alexa-amira
KnightsFan Posted December 15, 2018

2 minutes ago, thebrothersthre3 said:
Yeah, they did here, lol. Not the Cinema5D website, but they use one of their charts: https://nofilmschool.com/2014/07/sony-a7s-dynamic-range-arri-alexa-amira

Nice find! Interestingly, I did look at the article NoFilmSchool links to in my original search, but that chart is nowhere to be seen. That is actually the 2014 article that mentions that they have changed their testing methods.

NoFilmSchool quoting C5D: "Here we tested usable dynamic range of the given cameras. With 14.1 stops the usable dynamic range of the A7S comes surprisingly close to the Arri Amira with its legendary Alexa sensor"

C5D (probably edited): "We tested usable dynamic range of the given cameras. With 12 stops the usable dynamic range of the A7S comes surprisingly close to the Arri Amira (13.1 stops) with its legendary Alexa sensor"
thebrothersthre3 Posted December 16, 2018

Interesting. So the Arri Alexa has 14 stops while the Amira has 13? I wonder how RED cameras do with dynamic range. I know RED claims 18 stops with some of their cameras.
HockeyFan12 Posted December 16, 2018

2 hours ago, thebrothersthre3 said:
Interesting. So the Arri Alexa has 14 stops while the Amira has 13? I wonder how RED cameras do with dynamic range. I know RED claims 18 stops with some of their cameras.

In my experience, RED is very generous with its ratings. The MX was rated very high, but in practice I remember it had about one stop less DR than the C300 (which is a more recent sensor design, so it makes sense) when I used it, which in turn had two to three stops less DR than the Alexa. Then the Dragon was noisier in the shadows than the MX but had more highlight detail, still trailing the Alexa by a lot. That was with the original OLPF; I think they switched it up later. For the time it was pretty good, but today's mirrorless cameras have more DR than the MX ever did. The Dragon looks great exposed to the left, though. Good tonality.

Recently, RED has gotten a lot better. My friends who've used the Gemini think it's just great. Super clean, good resolution, great DR, too.

I've found CML does really good tests that correlate closely with real-world use: https://cinematography.net/CineRant/2018/07/30/personal-comments-on-the-2018-cml-camera-evaluations/ They give the Gemini half a stop less than the Alexa, which is not bad. They also post Vimeo links, where you can see skin tones, etc. Venice looks awesome.

Not sure about the Alexa and Amira being any different. Same sensor design, and both seem to me leagues beyond anything else I've used. Not just the best DR but the best tonality, texture, and color in the highlights. I remember that the original Alexa had worse performance than the Alexa Mini (pretty subtle, but it was there), and Arri confirmed they made little tweaks that push the newer models to 15+ stops. But that should favor the Amira, if anything. In my experience the Amira is just as good as the Mini: 15+ stops.
I think Cinema5D changed their testing methodology, so their results are inconsistent, and they've always seemed pretty careless to me.
thebrothersthre3 Posted December 16, 2018

16 minutes ago, HockeyFan12 said:
I've found CML does really good tests that correlate closely with real-world use: https://cinematography.net/CineRant/2018/07/30/personal-comments-on-the-2018-cml-camera-evaluations/ [...]

That link is pretty interesting. The X-H1 holds up well considering its price and size.
IronFilm Posted December 17, 2018

On 12/16/2018 at 7:18 AM, sanveer said:
Maybe if you added the date to the tests it would help one understand whether they were conducted around the same time, under different profiles, or over different periods of time when their methodologies and conclusions varied.

I wouldn't be too surprised if, after the criticism of C5D's early tests, they tightened up their game. But as the comments on even their recent tests show, they're testing cameras without really understanding the camera itself and without setting it up to get the most out of it. They shouldn't be testing them without getting intensive input from owner-ops of that particular camera first, ideally on the shoot itself.

On 12/16/2018 at 2:42 PM, HockeyFan12 said:
I think Cinema5D changed their testing methodology so their results are inconsistent, and they've always seemed pretty careless to me.

This. And this.

On 12/16/2018 at 2:42 PM, HockeyFan12 said:
I've found CML does really good tests that correlate closely with real-world use: https://cinematography.net/CineRant/2018/07/30/personal-comments-on-the-2018-cml-camera-evaluations/

Yup, I'd rate the CML conclusions as waaaay more useful than C5D's.
sanveer Posted December 17, 2018

1 hour ago, IronFilm said:
But as the comments on even their recent tests show, they're testing cameras without really understanding the camera itself and without setting it up to get the most out of it.

True. I remember reading comments saying that they aren't shooting at the recommended ISOs for some cameras. There are many such comments, and many of them merely state what the manufacturers have recommended, which seems both fair and reasonable.
TheRenaissanceMan Posted December 18, 2018

C5D also doesn't understand how to use a Xyla chart properly. You expose so the brightest chip is just BARELY clipping, then count down from there. Every C5D test I've ever seen clips more than one chip, and I can never figure out why.
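The counting procedure described above can be sketched in a few lines. Everything here is hypothetical: the chip readings, the 10-bit clip point, and the SNR cutoff for "usable" are all made-up numbers, and real labs pick their own noise threshold; the point is only the expose-to-the-first-clip, count-down logic:

```python
# Toy sketch of reading stops off a Xyla-style chart, where each chip is
# one stop dimmer than the last. Chip values are hypothetical.

CLIP = 1023      # 10-bit full scale (assumption)
MIN_SNR = 2.0    # "usable" cutoff; real tests choose their own threshold

# (mean code value, noise std) per chip, brightest first -- made-up data
chips = [(1023, 1.0), (900, 4.0), (450, 3.0), (225, 2.5), (112, 2.0),
         (56, 1.8), (28, 1.6), (14, 1.5), (7, 1.4), (3.5, 1.4), (1.7, 1.4)]

def usable_stops(chips, clip=CLIP, min_snr=MIN_SNR):
    stops = 0
    for mean, sigma in chips:
        if mean >= clip:          # clipped chip: ideally only the first one
            continue
        if mean / sigma < min_snr:
            break                 # hit the noise floor; stop counting
        stops += 1
    return stops

print(usable_stops(chips))  # 9 with these made-up numbers
```

With this scheme you can see why clipping more than one chip inflates nothing and costs a stop per extra clipped chip: every chip skipped at the top is one fewer countable step before the noise floor.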