Everything posted by Sean Cunningham
-
Some news on BMPCC - Bloom clip & blooming artifacts
Sean Cunningham replied to Axel's topic in Cameras
You're doing it wrong. That's the unfortunate reality here. I don't use those tools to make proxies, but I can easily see at least part of the logical flaw in your workflow. You're exporting or linking between two packages while expecting the second to recognize and interpret proxies from the first. You need to re-link to the original footage in your editor (PPro), moving from offline to online status, before moving on to your finishing app (AE). Then you live-link over or use the PPro project file import option in AE.

Don't get ahead of yourself. Think through the process. Regardless of the fact that Adobe wrote both PPro and AE, they're two completely different programs. Any idea that they do or can function as one giant app, each knowing and understanding what's happening in the other, is a complete illusion. Adobe has never been that careful with either the "big picture" or with sweating the details (maybe CC will eventually help).

edit: oh, and there are plenty of other techniques for re-linking from proxy to online footage, if you've screwed up, by exploiting a given app's desire to find and re-link footage after an XML import, PPro-to-AE import, etc. Naming the proxies the same, in an identical file-and-folder structure away from the location of the online footage, is not only smart from a data management point of view but also for the offline-to-online swap. All you have to do is move the master proxy folder, drop it into a subfolder, disconnect the drive it's housed on, etc., moving it away from where any project or XML file thinks it should live. The app will want to know where the footage is, so you point it to the online footage and, voila. Similarly, in the case of exotic formats like .dng or .r3d, or even battling the old 3-GOP "digital rain" shenanigans in CS6 by juggling a start with .mts and a finish in .mov, it's even easier: a simple matter of search-and-replace in an XML file. In fact, there are all sorts of user-created re-linking hells that can be made manageable by manipulating the XML and exploiting a given application's desire to make sense of what you're trying to accomplish.

edit2: but none of this replaces the need to completely test your pipeline and ensure that you actually understand the process you're undertaking before starting a job. All these tools tend to try to anticipate the user failing to prepare, but the best course of action is to not fail to prepare.
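Coming back to the XML search-and-replace idea, here's a minimal sketch of how it could look, assuming an exported FCP-style XML out of PPro. The volume names, file name and the .mts -> .mov swap are placeholders rather than anyone's actual setup; match them to however your proxies are really named and stored.

```python
# Minimal sketch: swap proxy references for online media in an exported
# project XML by plain search-and-replace. All names here are hypothetical;
# adjust the patterns to match your own proxy naming scheme.
from pathlib import Path

def relink_xml(xml_path, replacements):
    """Rewrite media references in an exported project XML and save a copy."""
    text = Path(xml_path).read_text(encoding="utf-8")
    for old, new in replacements.items():
        text = text.replace(old, new)
    out_path = Path(xml_path).with_name(Path(xml_path).stem + "_online.xml")
    out_path.write_text(text, encoding="utf-8")
    return out_path

if __name__ == "__main__":
    # Hypothetical example: proxies lived on a PROXY volume as .mts,
    # the online/transcoded masters live on an ONLINE volume as .mov
    # (the .mts start / .mov finish scenario described above).
    relink_xml("edit_v03.xml", {
        "file://localhost/Volumes/PROXY/": "file://localhost/Volumes/ONLINE/",
        ".mts": ".mov",
    })
```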
-
Iscomorphot 16, Sankors and other non Isco 36 glass
Sean Cunningham replied to wytshark's topic in Cameras
All you have to do is watch the trailer (that, and have a working knowledge of basic camera crews and the mechanics involved in shots with both actor and camera movement) to see that Barlow is completely wrong. His "use your eyes" only works if your eyes know what they're looking at.

The film is full of tracking shots, many of which are z-space tracking shots. In the first few moments of the trailer is one of them, with the camera dollying forward, focused on a monitor on the wall, looking at the back of an actor who is out of focus. The first AC would have been adjusting focus the entire time, keeping the plane of the monitor in focus. In concert with the movement and the seemingly fixed point of focus, you don't get cues that focus is changing. And that's why the 1st AC is the operator's best friend and the guy (or girl) who makes them look good, who's likely saved their job more than once. The 1st AC has one of the hardest jobs on the set, under the most pressure, and not being able to see or detect their work is the mark of the best. Their work isn't hidden in editing.

Likewise, there are shots like Affleck coming through glass doors towards camera while the camera is dollying opposite his movement, towards him. The focus puller was working the entire time for that shot, and for all the other z-space movement shots throughout the film: in crowds, through the market, etc. Meetings at the CIA, Steadicam tracking shots in medium and close-up, not Barry Lyndon's restrained, period compositions based on paintings. This isn't a film that lives in the wide. It's a lot of cramped spaces with movement. Affleck tends to work close-in, more intimate, and isn't doing scenes played out in tableau.

Barlow was either playing at authority with totally naive eyes or it's an act. Regardless, the only reason I got into it (focus pulling) was because the OP brought it up as part of their concerns over the different models available and that concern was downplayed. -
Some news on BMPCC - Bloom clip & blooming artifacts
Sean Cunningham replied to Axel's topic in Cameras
For DNG: transcode to proxies. It's that simple. Get a log-and-capture utility that manages the process for you if you're not going to do it yourself or don't have a slave, er, assistant to do the grunt work for you. Edit with the proxies and then conform. It's really not a big deal. We've been working in a very narrow period of history where this was not simply standard operating procedure.

I'll grant that not all content warrants raw, sure. But given a camera that does both raw and ProRes, it would be hard to make myself go with a half measure in the event the content actually warrants more than just shooting with an h.264 solution like the GH2.
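For what a scripted proxy pass might look like, here's a rough sketch that mirrors the camera folder structure and writes identically named ProRes proxies, so the later conform is a straight re-link. The paths are hypothetical, and for CinemaDNG sequences specifically a dedicated tool (Resolve, etc.) is the more usual route; ffmpeg is shown only because its flags are widely known.

```python
# Sketch of a batch proxy pass: mirror the camera folder structure and write
# ProRes Proxy .movs with matching names so the conform is a simple re-link.
# Source/destination paths are hypothetical.
import subprocess
from pathlib import Path

SOURCE = Path("/Volumes/ONLINE/day01")   # hypothetical online camera media
PROXIES = Path("/Volumes/PROXY/day01")   # hypothetical mirrored proxy tree

for clip in SOURCE.rglob("*.mov"):
    out = PROXIES / clip.relative_to(SOURCE)
    out.parent.mkdir(parents=True, exist_ok=True)
    subprocess.run([
        "ffmpeg", "-i", str(clip),
        "-c:v", "prores_ks", "-profile:v", "0",   # profile 0 = ProRes Proxy
        "-vf", "scale=1280:-2",                   # smaller frame for cutting
        "-c:a", "pcm_s16le",                      # uncompressed audio in the .mov
        str(out)
    ], check=True)
```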
-
Iscomorphot 16, Sankors and other non Isco 36 glass
Sean Cunningham replied to wytshark's topic in Cameras
Hah-hah. Wow. Wow. edit: and actually, I have a feeling you're being intentionally daft here. I'm not exactly sure why. It doesn't make it any less stupid. -
Iscomorphot 16, Sankors and other non Isco 36 glass
Sean Cunningham replied to wytshark's topic in Cameras
And here you prove complete ignorance of basic filmmaking. Jesus Christ. You have no idea what you're talking about. There are countless focus pulls in ARGO, as in almost any film with a real camera crew. In fact, there are no fewer than EIGHT credits in that film for individuals whose primary job is PULLING FOCUS. -
Some news on BMPCC - Bloom clip & blooming artifacts
Sean Cunningham replied to Axel's topic in Cameras
Yes. In fact I'm working on a project right now that's R3D based. Have you ever worked with scanned VistaVision? Super 35mm? You have a very narrow idea of what "pain in the ass" is because you've been spoiled by fairly new, entirely compromised, mostly irrelevant experience with consumer formats and methodologies which have copious amounts of ass-wiping built into them, I'm guessing.

The problem is this: you're doing it wrong. That, or you have unrealistic expectations. If you're editing with raw, you're doing it wrong, most of the time. R3D is kinda cool though, because even without fancy schmancy hardware its built-in proxy technology allows it to be edited in realtime on less-than-state-of-the-art hardware. I'm playing back realtime RED edits off a FW800 G-Drive that's not even a striped array. From a Premiere timeline. Booya.

Folks who have been working in digital non-linear since its inception will instantly recognize the need to shift back to the proxy->lock->conform model and they won't really lose sleep over it. People who are getting messed up over raw are in over their heads. It's a problem largely of their own making.

edit: that's written too harshly. Sorry, I'm in a bad mood.
-
Iscomorphot 16, Sankors and other non Isco 36 glass
Sean Cunningham replied to wytshark's topic in Cameras
Dude, seriously... -
Some news on BMPCC - Bloom clip & blooming artifacts
Sean Cunningham replied to Axel's topic in Cameras
Nah.
-
Iscomorphot 16, Sankors and other non Isco 36 glass
Sean Cunningham replied to wytshark's topic in Cameras
It's not about doing 70's style kung fu zip-zooms, it's about following focus. Cinema lenses typically have a longer focus throw than stills photography lenses. If you don't have a single-focus or focus-through system, you're cutting yourself off from being able to do all kinds of shots with moving talent. It's like having to write a paper and being told you can't use adverbs or pronouns or any word with more than three syllables. It's doable, but it's far from ideal, especially when you have a choice.

Actors will be able to move only within the plane of focus. Or not move. Forget about dolly or tracking shots that aren't also very restrictive. Prepare to eat a lot of blown takes. Less experienced actors tend to over-animate, because they're not acutely aware of how well small moves read on camera, or they don't hit their marks. A good 1st AC can save what would otherwise be a lost take by judging how far off the mark talent is and adjusting focus accordingly.

It's not a matter of obsession, it's one of necessity for certain kinds of content. I really couldn't care less what someone did in their short film or music video to somehow "prove" that this wasn't necessary. "No follow focus" isn't a "dogma" that I can get behind; I would sooner just shoot spherical. Dual-focus anamorphics that force you to use telephoto lenses work great for demos and test videos, though. -
Raw video on the Canon 7D - Super 35mm raw for under $1000
Sean Cunningham replied to Andrew Reid's topic in Cameras
Ask Nyquist. -
Raw video on the Canon 7D - Super 35mm raw for under $1000
Sean Cunningham replied to Andrew Reid's topic in Cameras
If it's a pain to mix raw digital cameras, imagine what filmmakers shooting on film go through routinely: mixing different stocks shot with different cameras, film cameras mixed with digital cameras, etc. -
Some news on BMPCC - Bloom clip & blooming artifacts
Sean Cunningham replied to Axel's topic in Cameras
Why is easy, at least to guess: it's shooting to SD cards. Even hot-rodded MP4 shooters are limited to only a few expensive cards, and they're only shooting at ~20% of the bandwidth (using high quality All-Intra patches) needed for uncompressed raw shooting at 1080p. The lossless CinemaDNG of the BMPC is reported as only about 1.5:1 compression on the same amount of data. Where are the cards that will handle that? Is there more than one? Is it bigger than 64GB? I think they miscalculated by hitching their wagon to that storage format, unless they know something we don't that hasn't materialized yet.
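A quick back-of-the-envelope check of those numbers, assuming 1080p, 12-bit raw at 24fps (my assumptions, not published specs):

```python
# Back-of-envelope data-rate check. Frame size, bit depth and frame rate are
# assumed figures for a 1080p raw recording.
width, height, bit_depth, fps = 1920, 1080, 12, 24

uncompressed = width * height * bit_depth / 8 * fps   # bytes per second
lossless = uncompressed / 1.5                          # ~1.5:1 lossless, per the post

print(f"uncompressed raw: {uncompressed / 1e6:.0f} MB/s")   # roughly 75 MB/s
print(f"~1.5:1 lossless:  {lossless / 1e6:.0f} MB/s")       # roughly 50 MB/s
# Sustained writes in that range were, at the time, the territory of only the
# fastest (and priciest) SD cards, which is the point being made above.
```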
-
Some news on BMPCC - Bloom clip & blooming artifacts
Sean Cunningham replied to Axel's topic in Cameras
With film you get a somewhat similar effect at the emulsion layer. When it's in the emulsion, from what I recall, the effect is fairly uniform across color records (unlike what even Panavision lenses do around highlights); it's just happening at a much finer scale. The photosites on the sensors of the BMCC and BMPC are larger than those of a 5D, so the effect is being magnified here, since there's no oversampling.
-
Raw video on the Canon 7D - Super 35mm raw for under $1000
Sean Cunningham replied to Andrew Reid's topic in Cameras
Folks have already been intercutting compressed DSLR footage with film and uncompressed digital for years, so two raw cameras will, with care, be easier to intercut between. The crop factor issue can be made practically invisible with lens choice and exposure (i.e. Nokton/Noktor lens + heavy ND). Crop factor has "tells" the same as anamorphic has "tells". Most people only look for the most easily recognizable effects with both types of photography, and in both cases it's academic to fool these people.

Were someone shooting Terrence Malick style, wide angle, deep focus, the difference between a scene shot on a 5DIII and a BMCC becomes one of relative focal length, and given that focal length is variable anyway in any film, the issue of shooting 135-size or Super-16 size is not a big deal. A more relevant issue will be sticking to exposures that don't emphasize the 5D's lack of reach into highlights and the BMCC's lack of reach into shadows.

Still, these issues aren't actually relevant beyond pixel fuckers in places like this. It's a non-issue for audiences who already aren't aware of such subtleties. Otherwise they might be wondering why Sony 4K projectors make footage shot on their high-end cameras look like video.
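For anyone who wants to put numbers on the "relative focal length" point, here's a rough field-of-view equivalence sketch. The crop factors are my assumptions (full frame taken as 1.0, the BMCC taken as roughly 2.3), so treat the output as ballpark only.

```python
# Rough field-of-view equivalence between formats. Crop factors are assumed:
# ~1.0 for the 5D's full frame, ~2.3 for the BMCC's sensor.
CROP_5D, CROP_BMCC = 1.0, 2.3

def equivalent_focal(focal_mm, from_crop, to_crop):
    """Focal length on the target format giving roughly the same field of view."""
    return focal_mm * from_crop / to_crop

# e.g. a 35mm lens on the 5D frames about like a ~15mm lens on the BMCC
print(round(equivalent_focal(35, CROP_5D, CROP_BMCC), 1))   # ~15.2
```
-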
So it seems mobile has overtaken camera for 4K evolution
Sean Cunningham replied to ntblowz's topic in Cameras
It's Sony. They have a longstanding history of doing this sort of thing. I've seen it at least as far back as the DV revolution, preventing their prosumer line from competing with their industrial and professional gear. I'm sure they did some of the same protectionism going back into analog gear with formats like ED Beta and Hi8. The phone proves beyond a shadow of a doubt that spatial resolution is easy and cheap and there to dazzle consumers, and not much else. I'll get the popcorn; let's see some of these films shot on 4K smartphones instead of quality 1080p cameras. I double dog dare them. -
Until you have an internet connection that's at least USB 2.0 speed, cloud backup for anything but compressed files isn't viable. In other words, it's pretty useless for the really important stuff (aside from project files). I shot footage last week that was edited down to about 3 minutes. It was 25GB of Moon Trial 7 takes shot over about an hour. Uploading that to a cloud would be ridiculous. Likewise, I have multiple copies of the ProRes HQ master for my feature, which clocks in at a little over 100GB for the MOS stream. Uploading that is unviable, not only because of typical upstream internet speeds but because the error correction needed would make the available bandwidth even slower. Days or a week of non-stop uploading... I just don't see it. Worse yet for the 1+TB of raw "camera negative".

It does make for a convenient method of random-access small file transfer between parties over large distances. We finished our last feature using DropBox as the link between my brother doing editorial in LA and me doing color + effects in TX, sharing XML and other project files, with me passing back proxies and nothing bigger than 2GB or so. It worked because we had identical arrays containing the raw footage. It wouldn't have worked otherwise.

The limitations of upstream bandwidth are overcome with networks like BitTorrent, but that's dependent on the upstream source being split across as many individuals as possible, and that just isn't possible in the context we're talking about here.
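To put a number on it, here's a rough upload-time calculation. The upstream speed is an assumption (plenty of connections are slower), and real-world overhead would only make it worse:

```python
# How long a 100GB master would take to push up a typical residential
# connection. The upstream figure is an assumption; protocol overhead and
# error correction would stretch it further.
size_gb = 100
upstream_mbps = 5          # assumed upstream speed in megabits per second

seconds = size_gb * 8 * 1000 / upstream_mbps   # GB -> megabits -> seconds
print(f"{seconds / 3600:.0f} hours ({seconds / 86400:.1f} days)")   # ~44 h, ~1.9 days
```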
-
where to store/backup all my massive raw video data on a budget
Sean Cunningham replied to black's topic in Cameras
Tape? No. -
Raw video on the Canon 7D - Super 35mm raw for under $1000
Sean Cunningham replied to Andrew Reid's topic in Cameras
What? None of those accessories are any more necessary for the BMCC than for an EOS shooting raw. Functionally they're almost identical, and post production is essentially the same: they both require grading. Starting from a baseline provided by your software's raw reader is not indicative of your EOS footage needing less or different grading. The DNG reader is applying a LUT that you can then tweak, but let's be real clear here: all raw, if viewed without a LUT, looks pretty much the same. You are either looking through a LUT or you're looking at a terrible image (assuming you don't have a high dynamic range monitor that costs as much as a German sports sedan).

Anyone wanting an equivalent style of setup, with a base correction and exposure that they can just tweak or leave as-is, can simply tech-pass their raw BMCC footage with something like Film Convert. Or they can apply one of several film LUTs that have been made available for free out there. Don't confuse the efforts of software coders to create a novice-friendly, fool-proof workflow for EOS users with a competing workflow that makes the exact opposite assumption about its users.
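As an illustration of the "tech pass with a film LUT" idea, here's a sketch that bakes a 3D LUT into a review copy with ffmpeg's lut3d filter. The clip and .cube file names are hypothetical, and Film Convert or Resolve would be the usual interactive route; this just shows the principle.

```python
# Sketch: bake a base/"tech pass" LUT into a viewing copy of flat raw-derived
# footage using ffmpeg's lut3d filter. File names are hypothetical.
import subprocess

subprocess.run([
    "ffmpeg", "-i", "bmcc_clip_flat.mov",
    "-vf", "lut3d=film_lut.cube",              # any 3D LUT in .cube format
    "-c:v", "prores_ks", "-profile:v", "2",    # ProRes 422 for review copies
    "bmcc_clip_graded.mov"
], check=True)
```
-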
Once it's wide enough, yeah, if you're at that kind of depth, it should be really difficult to tell the difference. The cues just aren't there. I'm a fan of the look achieved with purposely shallow DOF, even the Tony Scott style of shooting a wide master on a 250mm from a quarter mile away, but seeing Malick's "tone poems" I can't deny the jaw dropping gorgeosity of the natural world, natural light and wide-open depth to capture as much of it as possible.
-
where to store/backup all my massive raw video data on a budget
Sean Cunningham replied to black's topic in Cameras
I'm not ever likely to trust Western Digital drives for archival. For a few years, half of those I purchased to do back-ups failed, and I stopped buying them altogether. CDs aren't long-term storage either, not unless they're kept someplace where the conditions prevent the various layers and glue and such from breaking down. High quality or not, they deteriorate over time. Plus, they're just too darned small; I've got single Photoshop files that won't fit on a CD. They and DVD-R are just too slow as well. Burners/players are prone to failure over time too, and can have a nasty habit, on their way to the big landfill in the sky, of writing discs that have issues being read later on. I've got discs from a SuperDrive that barely lasted two years that are unreadable on some new, functional drives but not on others. -
The listing of anamorphic is an error, or it might be for an isolated part of the film, because this is, like Tree of Life, a mixed-media film. But the primary process used is spherical 35mm, with some sequences shot in 65mm. Deep focus, wide primes and shooting at optimal times of the day are part of his dogma for the last two films. He uses Steadicam, but a lot of the BTS shots of the cameraman that I saw had him in an Easyrig or equivalent, which isn't actually a stabilizer, so that could account for glitches in the movement. He used a lot of locals. Locations were scouted, particularly interiors, for how natural light played throughout the day. It's an available- and practical-light film; they didn't use movie lights. http://www.theasc.com/ac_magazine/April2013/TotheWonder/page1.php ...they occasionally used some bounce, but in every scene they placed actors relative to natural light or practical lights, and it's just that beautiful.
-
where to store/backup all my massive raw video data on a budget
Sean Cunningham replied to black's topic in Cameras
I've got four full-size ProMax towers with $16,000 worth (in 2002 dollars) of 10K RPM SCSI-160 platters that I'm half tempted to try to spin up, now that I know the two DualMax RAIDs are still functional. I just need to buy some earplugs first, lol. What's sad is I've got a single half-height drive quietly spinning on my desk, bought for $79, with the same amount of space as what I get from the ProMax array. It doesn't have quite the same throughput, but it's really, really close. All these fast SCSI systems were for the Cinewave HD system I built in 2002 (which also still works). Uncompressed 4:2:2 used to be hard. -
where to store/backup all my massive raw video data on a budget
Sean Cunningham replied to black's topic in Cameras
My first and only guess would be Linux. Linux tends to be the best and most common OS choice for embedded, purpose-built device scenarios. They've just self-contained all of the points of failure you have with a dedicated PC + array. By making the host PC part of the arrangement they did remove the software + hardware compatibility problems you get with a media-only solution, but you could very well be in a position where the media and data are good while the integral host PC has failed at any of the plethora of potential points where they do fail. The saving grace with new drives, and with continually moving your data forward for fear of losing it (not only to failure but to planned obsolescence), is that they get faster and smaller and cheaper, requiring fewer of the new to eventually landfill the old. It's just a dreadful, repetitive and wasteful process. -
where to store/backup all my massive raw video data on a budget
Sean Cunningham replied to black's topic in Cameras
I just fired up a couple of HUGE SYSTEMS HugeMediaVault DualMAX (SCSI-160) arrays a couple of weeks ago that likely hadn't been powered on since around 2004 or 2005 at the latest. That was one of those "fingers crossed" kind of moments, plus hoping I remembered the proper cabling and terminator setup, plus hoping that the SCSI-ID dials hadn't been fiddled with in all these years and two cross-country trips.

Back-up is an enormous problem. Hard drives are the only cost-effective solution that is also fast enough, whether for the individual or for facilities generating tons of data. Tape technologies haven't kept up, and they've always been a bad solution anyway, both hardware- and software-wise. At one time "experts" expected optical discs to be viable. LOL. There is no truly long-term backup solution for digital data. This is one of the biggest issues for long-term film archival, and one the MPAA might have truly screwed the pooch on when that d-bag Jack Valenti was in charge. Nothing has been invented that has a chance of outlasting archival film, properly stored.

edit: I can't imagine the hell it would be trying to find a working Exabyte drive, or Metrum, or the crummy software that was used to back data up back then. Getting ahold of an old machine with a SCSI or SCSI-II interface and the right version of an OS. That stuff was awful. ZIP drives didn't last a year because the drives themselves were crud and they'd just start corrupting any disk you put in them. Is Iomega still around? Oh look, they're Lenovo now. I know never to buy any of their stuff, ever. -
That all depends on your patch. One would think that noise performance would be totally based on the chip, and that the various patches would only apply an overall enhancement or de-emphasis of fixed performance, but that doesn't seem to be the case. ISO 320 is good on Moon Trial 3 (an All-Intra patch) but maybe not with other patches, it seems, like Flowmotion (a GOP-3 patch). Perhaps it's yet another GOP-related issue that has more to do with what is interpreting and transcoding the MTS for display than with absolute noise performance. Not all AVCHD readers are created equal, as many of us have learned the hard way, after much hair pulling and gnashing of teeth. Oh, and sorry, Mirrorkisser, I use SMOOTH exclusively.