Everything posted by JulioD

  1. 24p is outdated

     It’s my view that 24 fps is the Goldilocks temporal resolution for listening to a story. Realism? Give me a break. What’s realistic about a Death Star? Human beings are storytellers. Religion is stories. Sitting around a fire telling stories, making sense of the world, is what we do that makes us human. Representation, not realism.
  2. 24p is outdated

     Nah, it looks like crap. What’s dated is that this keeps coming up as an issue that needs to change, that only dinosaurs shoot 24 and we’re all out of touch, blah blah. This argument is decades old now, and guess what? The audience knows. So far it hasn’t worked in a cinema. Many have tried. Many have spent a lot of money. AUDIENCES don’t want it for DRAMA. You may delude yourself that your YouTube channel is empirical evidence of HFR take-up, but let me know when I can go watch a MOVIE of your YouTube channel in a CINEMA and I’ll let you know if it’s filmic. If HFR really, really, truly was better, the audience would know. We have had HFR for DECADES with gaming, a whole generation that SHOULD prefer it, but they freakin’ don’t. There’s nothing at all stopping you making a killing with your HFR on YouTube right now. Make a movie for YouTube and let us know how it goes.
  3. I’m not sure what you mean? I’ve come to the conclusion that more resolution in this case makes the camera more transparent. You don’t see the tech, you just get a more subtle, beautiful result. I’ve had makeup artists freak out, and it actually has the opposite effect: it looks beautiful in close-ups. I’ve shot Oscar-winning older ladies with it, and it actually helps rather than hinders. If a sensor is a grid of photosites, having more density means you see and feel the grid less. Super sampling is real (rough noise math sketched after this list). Every time someone makes this kind of comment I have to ask: have you actually used it and shot with it? Because mostly they haven’t. Maybe you have? It’s not just about it being 12K. That’s like… a side effect. The sensor has some serious color magic. I wish it did ProRes, and I have had to buy an OLPF. Like any camera it has its downsides. But it’s hard to beat for image beauty and color separation.
  4. Yeah unless you really need ProRes, the 12K is better supported. I had a G2 for a long time but they left so much stuff off. Can't even do dual card.
  5. I shoot the 12K a lot, and the funny thing is that you can shoot 1 hour of 12K for about 1TB (quick conversion to a sustained data rate sketched after this list). Think about that. That's the same as shooting an hour of ARRIRAW at 4.6K. BRAW is an amazing codec in that regard, and the damn files play back on very old machines no problem.
  6. We’re going around in circles. It’s not just the weight difference, which is by the way 40% heavier. It’s how it balances (if it even would) on the freakin custom Ronin rig they wanted to use with the specific lenses they chose. I dunno why you want to keep pretending those aren’t real and valid reasons not to use your precious FX6 or Alexa Mini. Maybe if you’re a DP with Garth one day you can convince him to make a different choice.
  7. I've said it before. The camera was chosen because it was part of the director's on-set process. You're all trying to say how you would have done it with an Alexa. But you didn't. It couldn't have been done. Not with the lenses they chose, the way they wanted to light, and the METHODOLOGY they wanted to shoot with, using a hybrid gimbal handheld rig. No way are you handholding an Alexa Mini LF on a gimbal for 30 minutes. Maybe with a rig, sure, I hear you say, but then it's DIFFERENT because the camera is a different height; it's not going into the same positions you can reach with a handheld shoulder rig. I'm surprised that creatives like you all don't understand this simple idea... The director chose a certain way to work. This was the only tool that could work the way the director wanted. You keep saying a different camera could have been used when it was not really feasible. "The FX3 was chosen as an extension of the methodology director Gareth Edwards was interested in fostering on the film. In an effort to embrace an immersive, authentic filmmaking approach, and inspired by Gareth's approach in making his first feature film Monsters, we sought out a very small and lightweight camera that still provided a robust image for post-production and visual effects purposes, and that could be paired with a Ronin RS2 gimbal to be operated for extended takes and with massive flexibility and freedom to move around a location and react spontaneously to what the actors were doing, or to something happening just outside of frame Gareth would catch out of the corner of his eye. One of the unique things about this film is that Gareth operated the camera himself in order to be able to react in real time to spontaneous occurrences on set and would often shoot 30-minute extended takes with the actors, going over actions a few times and discovering different angles and approaches to playing and covering a scene in the moment with them, like a kind of dance."
  8. That’s not your choice, my guy. They wanted to shoot with 135-format anamorphic lenses. It’s like saying The Dark Knight could have shot S35 and didn’t need all that IMAX.
  9. No. Full-frame anamorphic on a unique inverted handheld prosumer gimbal? Not going to happen. None of those cameras do it. The full frame 6K wasn’t dropped till just now. An Alexa Mini LF can be made small, but they aren’t THAT small, you still have to power them with some giant umbilical cord, and it would never balance with full-frame anamorphics on that gimbal. Stop saying that it’s the same when it is nowhere near the same.
  10. The problem is that the question is wrong. It’s not a blockbuster. The cost of production was more indie; I’ve seen numbers like 7 million. The rest of the headline budget was spent in post. This is a very specific scenario with a particular methodology of a particular director. The real question is: could they have gotten the same results using a more conventional approach? I think the answer is no.
  11. Err... well, maybe it was the director with the original DP? And Oren signed up for it as a condition of employment. I’m not sure why you have such a problem with the choice; you keep saying it should have been something else. There are so many reasons why they made the choice. You might not agree with why, but THEY made the choice, and you don’t know all the thinking that went into it.
  12. The director Garth Davis is well known for operating on his own films. The decision was likely made by, you know... the director. And his long-time collaborator Greig Fraser ACS ASC, who knows a bit about making nice pictures. Greig had to leave to do Dune, and so Oren inherited the choices already made. Likely he would have known them before committing to replace Greig.
  13. And I think they’re fine with that because they seem to be selling tens of thousands of cameras so they don’t really need to care about wider adoption.
  14. Don’t shoot the messenger. I’m just talking about the way BMD see it. They are known for how good their color science is, and now workflow, especially on their own cameras. I mean, most of the best-looking images shot on other cameras are still going through Resolve for final color. They made a great codec that works on even the oldest computers; BRAW 12K plays back faster than 4K ProRes, even on a Mac. Don’t kid yourself that 4K or 2K delivery means you only need 4K acquisition, because right now you can shoot 12K BRAW for a SIMILAR file size as ProRes 4K (rough GB-per-hour comparison sketched after this list). So why bother with a legacy codec? If you are BMD, why would you bother supporting a legacy codec anymore? It’s not just KINDA supported; the only place it’s KINDA supported is FCP. In everything else BRAW works, especially the big boys like AVID. And until recently the ONLY way to shoot ProRes RAW internally was on a Nikon. I’m not sure that problem has been solved yet either; you’re currently FORCED to use an external recorder to record PRR. And from what I understand, PRR won’t easily support the higher resolutions like 12K because the file size is then massive. It can’t scale up. And before you say “I don’t need 12K”: again, with BRAW it can be a similar file size to 4K ProRes, BUT you get the super-sampling advantages. I’m not saying that’s what everyone actually wants, but think about it from BM’s perspective. They made an awesome raw(ish) codec that plays back on any computer, often faster than 4K ProRes, in most editorial and finishing systems, and that can handle and scale to higher resolutions with no real file-size cost. It could be implemented by other manufacturers if they wished, but they’d be happy just using it for their own ecosystem anyway. I think the main reason people aren’t using BRAW more on other cameras is that they don’t like the recorders that Blackmagic make, not that they don’t want BRAW.
  15. No, I don’t think so. Above 4K res the file sizes alone become huge. No, BRAW is a widely implemented standard; why continue with an older, inferior codec? From BMD’s perspective, that is. Not my opinion, just channeling how Blackmagic see it… I mean, H.264 is a widely supported codec too, and we’ve only just seen it introduced for the first time; I’m assuming we’re only seeing H.264 because of future cloud workflows. ProRes RAW is pretty popular and they don’t support that in Resolve either… see where this is going?
  16. No, Apple don’t take a fee. They are very strict about compliance, though, and each model has to go through an arduous process to be “approved”, which can take a long time. This has been a barrier because it requires the manufacturer to open up their code and manufacturing to Apple for an unspecified amount of time, for a maybe, maybe not. You can’t even announce a camera as having ProRes support unless it’s been approved. Understandably, a lot of these companies can’t just sit around for 6 months on a finished model waiting for Apple to say yes or no before they announce. I believe that this is strategic. BRAW is their codec now, it’s widely supported, and there are even decent ways to work with it in FCP. BM don’t see the point in supporting ProRes anymore; BRAW is better with lower file sizes. We users feel differently, of course, but I would be amazed if any more BM cameras have ProRes. I don’t think that since the 12K they’ve even introduced a camera with ProRes, have they?
  17. Most cinema film projectors are three-bladed, so 72 Hz actually. Also, the sweep is not only short, but ALL of the frame is being exposed for almost all of the same time interval. On a rolling-shutter CMOS sensor the lines are read out one by one, so there is a TEMPORAL (time) offset from the top of the frame to the bottom. That doesn’t happen on a film camera, even though the shutter is “rolling” across the image. So it’s a confusion of terminology: the shutter in a film camera also appears to ROLL across the frame, but what’s happening at an exposure level is GLOBAL, whereas on a CMOS rolling-shutter sensor the exposure is line by line within the exposure interval. (Quick numbers for both sketched after this list.)
  18. It’s not just weight. It’s about balance: where can you put the center of gravity? (There’s a toy calculation of this after the list.) The lenses make it very, very front heavy. You then have to push the camera way back to balance, and you can start hitting the physical dimensions of the rig if the camera body is too long to slide back far enough to bring the center of gravity over the balance point. I know some of the crew, and it’s all about the director, who was operating (not the DP), using the shoulder-mounted Ronin as a handheld/gimbal hybrid.
  19. Likely because even a slightly heavier camera would have precluded them ever balancing the lenses they were using on the gimbal they were using FULL TIME. Have you seen the way the camera was operated?
  20. @kye I think you missed the point. Just because you can go over or under doesn’t mean it will grade the same way. It’s not linear, especially once you get into a few stops over or under (rough gain numbers sketched after this list). If you want shot-to-shot consistency in grading, maintaining the same exposure level helps a great deal. Of course you can recover a stop over or a stop under, but you’re going to have to park on it and grade it to match. Even in variable outdoor lighting it’s not too difficult to float your exposure using ND so that it stays very similar. Notice I say alter your exposure to MAINTAIN the established exposure level/ratio. That’s all I’m saying. We seem to get very bogged down in individual shots and grades, but chasing exposure shot to shot makes it so much harder to grade later on. ETTR is a classic dead end in this regard, in my view, for MOTION work. It causes your exposure to roller-coaster, and while each individual shot may well look OK in itself, you quickly find it can be impossible to make them all look good TOGETHER on a timeline. ETTR works great for photographers. Not so great for cinematographers.
  21. I dunno. In this thread that’s been the complaint, no? One rule I took on from when I started way back shooting on film was to shoot the SAME exposure for each shot in a scene. So don’t go changing it shot by shot; once you’ve lit it the way you like, just set the exposure and don’t effin touch it again. I often have colourists tell me how pleasant my work is to grade because it’s consistent. Changes in post then tend to be the same across shots.
  22. What you’re doing, though, is grading using a LUT as a shortcut to a look you’re trying to achieve. The point being, there is no magic transform that just works without further tweaking. I bet, too, that if you had to grade footage I shot and lit, it would react differently to the way you work, because you’re used to grading your own material.
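
A few back-of-the-envelope sketches for the technical claims above, in Python. Any number not taken from a post is an assumption and is flagged as such.

Re: post 3’s super-sampling point, a minimal illustration (mine, not from the post) of why packing more photosites behind each delivered pixel hides the grid: averaging N independent noisy samples cuts random noise by roughly √N, so a 3×3 block average (a crude stand-in for a 12K-to-4K downsample, ignoring demosaicing and the codec) drops noise by about 3x.

    import numpy as np

    # Averaging a 3x3 block of noisy photosites per output pixel cuts random
    # noise by roughly sqrt(9) = 3x. This ignores demosaicing, read-noise
    # structure, sharpening and the codec - it only shows the averaging effect.
    rng = np.random.default_rng(0)
    flat_grey, noise_sigma = 0.5, 0.05
    native = flat_grey + rng.normal(0.0, noise_sigma, size=(1200, 1200))  # high-res patch
    downsampled = native.reshape(400, 3, 400, 3).mean(axis=(1, 3))        # 3x3 block average

    print(f"native noise     : {native.std():.4f}")       # ~0.050
    print(f"downsampled noise: {downsampled.std():.4f}")  # ~0.017, about 3x lower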
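
Re: post 5’s “1 hour of 12K for about 1TB”: that works out to roughly 280 MB/s sustained. The 1 TB/hour figure is the post’s; the rest is arithmetic (decimal units, 1 TB = 1,000,000 MB).

    # Turn a per-hour storage figure into a sustained data rate.
    def sustained_mb_per_s(tb_per_hour: float) -> float:
        return tb_per_hour * 1_000_000 / 3600   # decimal units: 1 TB = 1,000,000 MB

    print(f"{sustained_mb_per_s(1.0):.0f} MB/s")   # ~278 MB/s for 1 TB/hour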
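
Re: post 14’s “similar file size to 4K ProRes” claim, a small GB-per-hour calculator. The two data rates below are placeholders invented for illustration, not published specs; check the vendor’s data-rate tables for your actual codec, compression ratio, resolution and frame rate. The point is only that recorded size tracks data rate, not pixel count as such.

    # GB recorded per hour for a given sustained data rate in MB/s.
    def gb_per_hour(mb_per_s: float) -> float:
        return mb_per_s * 3600 / 1000

    assumed_rates = {
        "hypothetical heavily compressed 12K raw": 250,   # MB/s (placeholder)
        "hypothetical 4K intermediate codec": 120,        # MB/s (placeholder)
    }
    for label, rate in assumed_rates.items():
        print(f"{label}: ~{gb_per_hour(rate):.0f} GB/hour")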
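
Re: post 17, two quick numbers: the 72 Hz flash rate (24 fps × a three-bladed shutter, as the post says) and the line-by-line temporal offset of a rolling-shutter readout. The readout time and line count are made-up example values; real sensors vary widely.

    # 1) Flash rate of a three-bladed projector shutter at 24 fps.
    fps, blades = 24, 3
    print(f"projector flash rate: {fps * blades} Hz")   # 72 Hz

    # 2) Rolling-shutter skew: each line starts (and ends) its exposure a little
    #    later than the line above, so top and bottom of frame sample different
    #    moments in time. 16 ms readout over 2160 lines is an illustrative assumption.
    readout_ms, lines = 16.0, 2160
    per_line_us = readout_ms * 1000.0 / lines
    print(f"per-line offset: {per_line_us:.1f} us, top-to-bottom skew: {readout_ms:.0f} ms")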
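
Re: post 18, the balance point of a camera-plus-lens package is just the mass-weighted average of the part positions, which is why a heavy front-set anamorphic forces the body a long way back on the plate. A toy calculation with invented masses and distances (nothing here is a real spec):

    # Combined centre of gravity of a camera body plus a front-heavy lens.
    # Positions are cm in front of (+) or behind (-) the gimbal plate's balance
    # point. Every number is invented purely to show the lever-arm effect.
    def combined_cog(parts):
        total_mass = sum(mass for mass, _ in parts)
        return sum(mass * pos for mass, pos in parts) / total_mass

    body = (0.7, -5.0)    # hypothetical 0.7 kg body sitting 5 cm behind the plate
    lens = (2.0, 12.0)    # hypothetical 2.0 kg anamorphic reaching 12 cm out front
    print(f"package CoG: {combined_cog([body, lens]):+.1f} cm from the plate")  # ~+7.6 cm
    # To balance, the whole package has to slide back by about that much - which
    # is exactly what a longer or heavier body can stop you doing on a small gimbal.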
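
Re: post 20, why chasing exposure shot to shot hurts in the grade: each stop of offset is a 2x change in linear light, so the correction gain needed to bring a shot back to the established level (and the noise/highlight trade-offs that ride along with it) grows exponentially with the offset.

    # Linear gain needed in the grade to pull a shot back to the established
    # exposure level. One stop = a factor of two in linear light.
    def correction_gain(stops_off_target: float) -> float:
        return 2.0 ** (-stops_off_target)   # +1 stop over needs 0.5x, -2 under needs 4x

    for stops in (-2, -1, 0, 1, 2):
        print(f"shot {stops:+d} stops off -> {correction_gain(stops):.2f}x gain to match")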