
kye

Everything posted by kye

  1. DJI Pocket 3?

    iPhone, because it has different focal lengths. But that would apply to anything - if I could choose an iPhone or an Alexa or an FX3 or...., but the other cameras could only have one prime lens, I'd still choose the iPhone.
  2. DJI Pocket 3?

    It would take a lot of footage for me to be able to see through the grading that was applied to each video, and I haven't deep-dived enough to do that. However, both seem capable of creating decent images if they are pointed at something interesting and treated well in post. Probably the difference is that Apple Log has a fully-supported colour management profile, which enables it to fit into a workflow that includes professional tools, the ability to accurately perform WB and exposure changes in post, and alignment with various other treatments to create the intended outcome. In terms of which is 'better', the differences in context completely overwhelm any differences in the image.
  3. DJI Pocket 3?

    If you have good taste and know how to process the image in post, there are many trade-offs that can be made. For example, 4K 10-bit 422 could be matched by 6K 8-bit 420 if the camera were used with the right ISO settings to generate enough noise in the file that the downsample to 4K in post would re-create the intermediate values that are inaccessible to 8-bit. Art does this with stippling. Soon, AI will be able to resurrect your SD MiniDV tapes into 3D 8K 444 glory with only minimal error. The ever-deepening navel-gazing will create the ouroboros, assuming it hasn't already.
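The noise-plus-downsample trade described above is just dithering, and it's easy to demonstrate numerically. A minimal sketch, with a made-up code value and pixel count: quantise a level that 8-bit can't represent, with and without roughly one code value of noise added first, then average a block of pixels the way a downsample would.

```python
import random

random.seed(42)

TRUE_LEVEL = 100.4   # a level that 8-bit quantisation cannot represent directly
N = 4096             # pixels averaged together by the downsample (hypothetical)

# Without noise: every pixel quantises to the same code value,
# so averaging recovers nothing.
clean = [round(TRUE_LEVEL) for _ in range(N)]
clean_avg = sum(clean) / N

# With ~1 code value of noise (dither) added before quantisation,
# the rounding errors decorrelate and average back out.
dithered = [round(TRUE_LEVEL + random.uniform(-0.5, 0.5)) for _ in range(N)]
dither_avg = sum(dithered) / N

print(f"clean average:    {clean_avg:.3f}")   # stuck at exactly 100.000
print(f"dithered average: {dither_avg:.3f}")  # close to the true 100.4
```

The clean average is stuck at 100 forever, while the dithered average lands within a few hundredths of 100.4 - which is the mechanism by which noisy 8-bit can stand in for quieter 10-bit after a downsample.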
  4. DJI Pocket 3?

    Which image?
  5. DJI Pocket 3?

    Absolutely. It's like people have forgotten what images in the cinema actually looked like, or that they were shot on 35mm film. I say that because people who apply a "filmic" or "cinematic" look seem to apply a film emulation at about 284% of what is realistic. This is a scan of (IIRC) Kodak 200T (source):

    The video above is:
    - too sharp
    - too heavy a split-tone
    - very heavy-handed diffusion
    - ridiculous halation
    - etc

    It's like they got a film emulation plugin and put some sliders to 0% emulation, and others to 350%. In colour, "if it looks good then it is good" definitely applies, but it doesn't seem to have a look of its own, it's just got a bad film emulation on it.
  6. Unfortunately, you can't buy skill with money. You earn it with time, curiosity, and humility.
  7. Yeah, that's terrible, and not easy to correct in post at all. I am wondering when the camera manufacturers and/or post people will get their act together and start addressing these distortions - if they profile a lens, then it should be 99% fixable, either in-camera via processing, or in post using gyro and IBIS + OIS alignment data, or simply a more sophisticated stabilisation algorithm than a 2D crop of the final image. It doesn't even have to be perfect: an 80-90% reduction in the flappiness would be - well - 5-10x better. I mean, if a GoPro can do it for one lens, essentially perfectly, then it can't be beyond a multi-thousand-dollar professional camera body with a native lens.
  8. It's a big "it depends", based on lots of factors:
     - Some scenes are more difficult technically to capture than others
     - Some scenes are more difficult aesthetically to reproduce than others
     - Some differences can be compensated for in post easily (e.g. small WB differences, skin tone hue rotations, etc)
     - Some differences can't be compensated for (e.g. skin tone smoothing, quantisation issues like an 8-bit log codec, lots of non-linear processing)
     - The stronger the grade you're going to put on it, the less it matters
     - The more skilled you are in post, the less it matters
     - The more powerful the tools you use in post, the less it matters
     - The better the camera's colour profiles are, the less it matters
     - The less picky your audience is, the less it matters
     - The less saturated the final image, the less it matters
     - Etc

     The problem with discussing it is that on the open internet, the only two opinions anyone seems to understand are "it is the only thing that matters" and "it doesn't matter at all", and those who dare to look in the middle ground can't tell which of the strange things in the test images belong to which camp - easy, difficult, or impossible to fix.
  9. As a fan of colour grading, I'm the first one to promote the idea that the footage SOOC is like a film negative - it's yet to be developed in post. However, while there are some things you can adjust in post to improve the camera's colour science, like WB or hue shifts (which people get triggered about all the time), you can't (without AI) make the footage higher quality. The richness of 5D ML 14-bit RAW can't be created out of the 8-bit 709 images from my GX85 (believe me - I've been trying for years!). BUT BUT BUT BUT BUT.... without lighting this is a BS pointless discussion. (and without anything interesting to point the camera at, lighting is polishing a turd) .....((and without a story to tell, glorious compositions won't even keep me awake)).....
  10. I just watched the OM-1 vs G9ii IBIS-only comparison, and although the OM-1 looks better stabilised than the G9ii the majority of the time, there are places where it's the other way around, so goodness knows what the different settings were. However, I've experienced the difference between IBIS only and Dual IS on both my GH5 and GX85, and the difference is night and day - potentially enough to make the G9ii the clear winner. If you're doubtful, think about the difference between having no stabilisation at all and adding OIS - that's the kind of difference it is. The G9ii might also have the extra mode the GH5 has where it removes all motion, which is enormously more stabilising than the normal IBIS mode, so that would be my preference in this situation. In terms of him doing the test and not telling us because he doesn't want to embarrass anyone, well, you might be interested to know there's lots of things about aliens that the government hasn't been telling us either...
  11. Just looked through the video - he didn't compare Oly IBIS to Panny Dual IS, so you can't judge from the video how good each one is. The reason this is relevant is that Olympus puts IBIS in their cameras but doesn't put OIS in their lenses, so a native Oly system only has IBIS, but a native Panny system has Dual IS.
  12. Was the Panny using IBIS and OIS in its Dual IS mode?
  13. Every time I see a blind camera test, I am looking at the colours. It's not something I decided to do, or have to focus on, it's what I naturally find myself looking at. Especially skin tones. In all the blind tests I've done properly - watching blind, taking notes, then scoring and ranking the cameras before looking at the results to see what was what - I correctly chose things in descending order of price. There was one test where I put the Alexa lower than cheaper cameras, but it had a strange green tint that the other cameras didn't have, so I suspect something was wrong with the test. The colour and texture of skin tones is about 40% of the grade, I think. Drastically under-rated, and barely discussed with any depth.
  14. It was like you were reading my mind as I was reading. The other thing that people often forget is that film is a creative medium. I know it's heresy and I'll flame myself after posting this, but it matters because if you're zooming while filming, then your choice of compositions - and even how you move the camera - will be made on the basis of what you see in the frame. People forget that the tech influences the operator, and the operator controls the camera and also influences the crew and even the cast (if they're grumpy or happy enough), and those things are far more relevant to the actual end product than any technical whatever that might exist as a counter-argument.
  15. Wow.. "I was there" wasn't a reply I expected! What else can you tell us about the shoot? Anything that looks that crazy on film has got to have some stories that go with it!
  16. That doesn't look good to me at all... It had the blur bursts, warping, also towards the end wasn't stabilised well with sort of jumpy and floaty motion. Getting it right in-camera still matters for now, until AI can sort out this issue too.
  17. A FF box-cam would make a lot of sense as a crash-cam, for drones, or for mounting in odd locations etc. It would have FF, good DR, good low-light, thermal management fans, and potentially the ARRI LogC curve / ProRes RAW in it. L-mount primes can be quite compact, and the camera body itself would probably be smaller than the GH7 as well!
  18. Yeah, now they've got the license, they might be able to release it for other cameras perhaps. One thing that comes to mind is that GH7 has IBIS, which isn't so good if you want a crash cam or something where there will be significant vibration (which was the purpose of the GH5S) but all Panny cameras now have it, don't they?
  19. Another GH7 + LogC and ARRI discussion.. TLDR; Brady used the combo on two narrative projects, they matched up nicely in post (but not perfectly), and he sees it as a great option for a B-cam to an Alexa if you didn't have the budget for two Alexas. There's some side-by-sides in the video, including graded to match and also not.
  20. No knowledge with that camera, but if the focus selection AI just randomly changes modes on you, it sounds like there might be a problem. As a standard troubleshooting step, maybe try resetting the camera to default settings, and upgrade to the latest firmware for your S5 and each of your lenses too.
  21. I'm continually amazed at how much energy goes into pixel peeping at 4K or higher, and yet the high-end productions deliberately blur and un-sharpen the footage. Compare that video with the reference footage from real projects and the difference is obvious. The visibility of 1080p vs 4K vs 6K is a debate that won't ever die, but the visibility of 6K 4:4:4 vs 6K 4:2:0 will be absolutely zero once it's been put onto a 4K timeline, un-sharpened to an aesthetically pleasing amount, exported and uploaded to a streaming service, and then heavily processed and brutally compressed before streaming to the end user. Unless people are literally doing green-screen work or VFX, putting money into lens sharpness or 6K+ resolutions is just paying to look less high-end, not more.
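To put a rough number on that "absolutely zero" claim, here's a toy 1-D sketch (all values hypothetical, and real 4:2:0 is 2-D): take a smooth row of high-res chroma samples, subsample it 4:2:0-style (keep every second value, duplicate on playback), then box-filter both versions down 2:1 the way a scale-to-timeline does, and compare.

```python
# 1-D stand-in for one row of chroma samples at "6K" resolution (made-up ramp)
full = [50 + i * 0.1 for i in range(64)]

# 4:2:0-style subsampling: keep every second chroma sample,
# duplicate it on reconstruction (nearest-neighbour upsample)
subsampled = []
for i in range(0, 64, 2):
    subsampled += [full[i], full[i]]

def downscale(row):
    # naive 2:1 box filter - the averaging a downscale to the timeline performs
    return [(row[i] + row[i + 1]) / 2 for i in range(0, len(row), 2)]

a = downscale(full)        # full-chroma source, downscaled
b = downscale(subsampled)  # subsampled source, downscaled
max_err = max(abs(x - y) for x, y in zip(a, b))
print(f"max chroma error after downscale: {max_err:.3f} code values")
```

On this smooth ramp the residual difference is a small fraction of one code value - below the quantisation step, i.e. invisible. Real footage has chroma edges where the error is larger, which is why green-screen and VFX work is the stated exception.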
  22. I am wondering if sharpness might have something to do with it for me. As a reference, what do you think about this?
  23. Same impression for me during the opening few scenes of Trigger Warning, which is new on Netflix. The VFX shots were very budget, but the footage had a real video look to it. It was shot on real equipment, so I'm also not sure what it was. Also, the trailer on YT looks much better than the early parts of the film. I had noticed this "ingredient X" appearing on random things previously, and tried to ignore it because once you learn to see something you can't un-see it, and I was pretty sure that my preferred cameras wouldn't look good in this regard, but now I've seen it to the degree that I can't ignore it, so I've reluctantly started investigating.
  24. For artists... it could win many Oscars and Academy Awards. For technicians... well, just keep reading this thread to find out.
  25. See - the question was wrong from the start!