Everything posted by kye
-
We all hear about Panasonic "needing" AF in order to stay in business, but my question is: how many people are saying that because they personally aren't buying Panasonic because of the AF, and how many are saying it because everyone else is saying it?

I used to write electronic music in the 90s and 00s, and anyone who knows that scene knows there's a synthesiser called the Roland TB-303. It was famous for being highly sought-after and worth a fortune second-hand. Everyone knew that units were selling for over $1000 used (basically 2.5x the retail price), but here's the thing... no-one could actually provide evidence that any were selling for that amount. There was article after article about how Roland should re-release it, lots of companies made clones, it was a whole thing. After years of this being the talk in magazines and stores and forums, someone from Roland did some digging and traced the stories back to a rumour that one had sold for more than $1000. That rumour was so outlandish that it "went viral", and before you knew it there were half-a-dozen of these things listed for sale at that price and no-one would accept anything less. They couldn't locate a single sale at even remotely that price. Worldwide.

Am I saying that no-one needs AF? Of course not. What I am saying is that I have witnessed a "fact" known worldwide become truth based on almost nothing at all, except people repeating what everyone else was saying. It was like a virus that needs people to copy itself, like those viral emails claiming everyone's telephone number was going to be made public. I'd appreciate it if basically everyone shut up about it except for the people who actually would buy Panasonic but don't because of the AF. Otherwise we'll end up with Panasonic going bankrupt for no good reason, and we'll end up with cameras that have 50K resolution but still don't look as good as the 2012 Alexa.

You and I are the noisy videographers on here, and I think we're actually in the minority with how we shoot. When I compare the patronage of these forums with other film-making / camera places like FB groups and YT channels and Discord groups etc, it is really freaking obvious that the discussions here do not even vaguely represent what most people are talking about in other places.

I used to do websites and did the website of a (stills) wedding photographer, one of the highest-charging and highest-reputation ones in the city. It was a referral after he shot a wedding I was best-man at, so I'd experienced his services too. His work was beautiful, but his base package (without prints) cost many times the average cost of a whole wedding, so he wasn't a cheap-and-cheerful option - he was in the top tier. He took the wedding parties to the same few locations and took the same shots with the same compositions, over and over and over again. He gave me a selection of his best shots to choose from for the website and I had to go through them super carefully to make sure I didn't put multiple shots from the same wedding in there (doubles are a sign you don't have many clients), but it was difficult as the shots all looked the same. There were even couples and wedding parties with the same clothing and flower designs - weddings are almost as cookie-cutter as actual cookies! He typically shot the formal shots and had a second shooter taking spontaneous shots, which were lovely and very unique, so it was a combination of both.
It might be worth getting someone who also shoots (if they have the skills in post), as they might be able to either understand why you can't nail every shot perfectly, or give you some shooting tips to improve what's happening in-camera.
-
Most compressed codecs will have a lower bitrate when there aren't huge amounts of movement, so I'd suggest a stress test to max out the codec. One I have done before is to take half-a-dozen stills and put them on a 60p timeline at a single frame each, point the camera at your monitor so that the whole frame is moving, turn all the lights out, set the computer to loop that timeline, and hit record in a 24/25/30p mode. This ensures that no two frames will be remotely similar, which should stress the codec and push the camera to record the maximum bitrate for that codec. I've seen bitrate increases of up to 50% doing this, compared to other test footage. I've tried pointing the camera at trees moving in the wind, and this laptop test stresses the codec more - plus it's available even if it isn't windy and you don't have any trees nearby 🙂
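If you want to put numbers on it rather than eyeballing file sizes, here's a minimal sketch (assuming ffprobe from FFmpeg is installed; the filenames are just examples) that reports the average bitrate of each test clip so the static-scene and stress-test recordings can be compared:

```python
# Report the average bitrate of each recorded test clip so a "static scene"
# recording can be compared against the stress-test recording.
# Assumes ffprobe (part of FFmpeg) is on the PATH; filenames are hypothetical.
import subprocess

clips = ["static_scene.mov", "stress_test.mov"]  # example filenames

for clip in clips:
    out = subprocess.run(
        ["ffprobe", "-v", "error",
         "-show_entries", "format=bit_rate",
         "-of", "default=noprint_wrappers=1:nokey=1",
         clip],
        capture_output=True, text=True, check=True,
    )
    # format=bit_rate is reported in bits per second (may be absent for
    # unusual containers, in which case this conversion would need a guard).
    mbps = int(out.stdout.strip()) / 1_000_000
    print(f"{clip}: {mbps:.1f} Mb/s average")
```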
-
I'd suggest paying for some 1:1 coaching from a colourist, or someone who is very good at colour. I'd start by asking them to demonstrate how well they can grade a shot that has some issues - eg, exposure, WB, or mixed colour-temperature lighting. If footage is 100% properly shot, anyone can just put on a LUT, apply some contrast, and it will look great. The test is when things aren't great, and in weddings you'd have those issues from time to time, I'd imagine.

Have we forgotten the people who shoot narrative? They mostly don't care about AF, don't have long takes, and seriously care about the quality of the image. This forum has a few very vocal videographers (myself included) but that doesn't mean no-one is out there shooting shorts or whatever. Noam Kroll shot a short on a single can of 16mm negative: https://noamkroll.com/shooting-a-no-budget-short-film-in-6-hours-on-ultra-16mm-film/ for a shooting ratio of something like 5:1. That's extreme, but even his original plan of 10:1 means you can do an entire short on one or two cards. A huge proportion of the people I see in film-making groups online are shooting narrative, which is the perfect situation for a Panasonic camera. Besides, the GH6 is getting record-to-USB in a firmware update anyway, so that makes things easier.
-
Yeah, I keep saying this but no-one wants to hear it. We had the OG BMPCC that shot internal RAW, had 13 stops of DR, and was under $1000, a decade ago. What do we have now? Lots of cameras that barely improve on that spec, except in resolution. At the time there was the OG BMPCC (the camera we had and could afford) and the Alexa (the image we all wanted). The Alexa had barely more pixels, but staggeringly better quality pixels. What did the manufacturers do? Basically zero improvement in pixel quality, but now we have 16 times as many of them. *eyeroll* BUT.... say that around here and the "progress is progress" people just shout you down. Andrew was right about us getting what we deserve.... and what we got was manufacturers who bamboozled people with BS, and everyone swallowed it. Even after Yedlin demonstrated that no-one can even see more than 1080p in most viewing situations.
-
The setup I have been pondering is to pair my MBP with an external display built for gaming, or a secondary laptop display. Andrew reviewed one on the blog recently, but I can't find the link so maybe it's gone. I use Resolve, which works brilliantly with the BM display devices like the Ultrastudio 3G and gives a clean HDMI out of the monitor window, similar to how you have configured your second monitor, only this bypasses the OS colour management and gives a completely controlled, calibrated feed from Resolve - plus it's 10-bit, which I suspect also helps. I believe they also work with other NLEs, so that might be worth exploring? The one I have is powered directly from the laptop via Thunderbolt and gives 10-bit HDMI and SDI up to 1080p60, but there's also the (more expensive!) 4K Mini that does 4K: https://www.blackmagicdesign.com/products/ultrastudio/techspecs/W-DLUS-11

So I was thinking that I could pair my Ultrastudio with a USB-powered HDMI-in monitor to create a second display completely powered by the MBP that is also great for editing and is nicely calibrated. On-camera monitors are probably too small (5-7 inches or so) and field monitors are probably too expensive, but I think there are consumer ones that also accept HDMI, so I might go in that direction. Combined with a MediaLight calibrated globe and a desk lamp, I could probably have a dual-screen calibrated laptop setup in a lighting-controlled, neutral grading environment. Combine that with my Speed Editor and maybe a colour grading surface, and it would be a full editing setup that easily fits in a suitcase.
-
I suspect that @webrunner5 is right that as long as they're working they're probably fine. They're essentially a digital camera with a single pixel! I have the i1Display Pro, and the i1 series seems to be the one with lots of support, so it's worth checking what software you'll use it with and what support those packages have.

I calibrated my MBP monitor fine with my i1Display Pro. I have only calibrated to rec709 though - were you trying to calibrate to a wider colour space? I've tried doing that before on my previous device (a Datacolor Spyder - best to avoid these) and couldn't get a proper calibration even though my monitor claims almost 100% of Adobe RGB. I understand that one might have a better calibration report than the other, especially if one has a wider gamut capability, but once properly calibrated they should look basically the same. Unless one of them is really underperforming and not really up to getting a proper calibration?

Ah - I just realised that both my MBP display and my Dell panel are wide-gamut monitors. Perhaps this enables a very good calibration against rec709? If that's true, it might be a consideration for @Rhood - getting a cheaper wide-gamut monitor, as it might produce a better calibration result.
-
If anything is different from a calibrated monitor then it doesn't look "better", it looks "inaccurate". My Dell panel was also factory calibrated, but oh boy did it change when I calibrated it. This might lead you to question how good calibration really is, but when I calibrated my MBP display and my external display I could move an image between the two and the colours would look very similar indeed. The main difference was that the MBP display changes quite significantly when you change the angle of the screen, so getting a perfect visual match on brightness / contrast / saturation is impossible.

Think about it this way... if I made a custom LUT and applied it to your laptop display, you'd want me to remove it, wouldn't you? Even if you were doing videos on planes/trains/automobiles, the fact that it's not controlled lighting doesn't mean you're happy to colour your videos through my LUT. That's what having an uncalibrated display is like - an unknown LUT applied permanently to that display.

Besides, it's actually very easy to arrange a controlled lighting environment. The only thing you need is relatively low-level, neutral-temperature ambient lighting. For that I recommend MediaLight bulbs: https://www.biaslighting.com/en-au/products/medialight-mk2-dimmable-e26-110v-a19-bulb They fit into a standard light socket and are completely neutral in temp/tint. Put one of these into a lamp, ensure that it's shining on a relatively neutral surface, close the blinds, and you're fine.

I agree that for rec709 most displays should be fine, if they're calibrated. One colourist mentioned that after calibration the image on their consumer-grade GUI monitor was very close to their super-high-end reference display, so unless you're doing mission-critical work then (for rec709) you should be fine with a calibrated monitor.
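If you want to quantify "inaccurate" rather than eyeballing it, colour error is usually expressed as Delta E in CIELAB space; the simplest version (CIE76) is just the Euclidean distance between the measured and reference Lab values. A minimal sketch, with made-up measurement numbers purely for illustration:

```python
# CIE76 Delta E: Euclidean distance between two colours in CIELAB space.
# A Delta E of roughly 1 is around the threshold of a just-noticeable
# difference; uncalibrated consumer displays often measure well above that.
import math

def delta_e_76(lab1, lab2):
    """lab1 and lab2 are (L*, a*, b*) tuples."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(lab1, lab2)))

# Hypothetical values: what the probe measured vs the rec709 target patch.
measured = (52.3, 4.1, -2.7)
reference = (50.0, 0.0, 0.0)

print(f"Delta E (CIE76): {delta_e_76(measured, reference):.2f}")
```

Calibration software effectively does this over a whole set of test patches and then builds a correction so the errors shrink.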
-
I have a sneaking suspicion that when it comes to looking at the colour of an image, the professional colourists aren't ignorant about anything. Even the tiniest details are discussed like they're obvious, and when the time calls for it, they can often just look at an image and tell you what was done to it in grading.
-
Yeah, and actually the absolute DR doesn't really matter either - what's important is having the same people test the other cameras in the same way so that the relative comparisons are valid. I read an interesting comment about the Alexa Classic on the CineD test for it, which basically amounted to it testing at about 14 stops of DR, but that was RAW, and with NR it could get another few stops above that. Interestingly, a lot of the other cameras they test aren't RAW and so already have the NR (and smoothing from compression) applied - so if you're comparing an h264 camera with an Alexa then potentially the Alexa might effectively have 17 stops in comparison, but when comparing it to a RAW-shooting camera then 14 is the relevant number. That struck me as another whole variable on top of their numbers that they didn't seem to talk about much (I've read lots of their lab tests when I reviewed camera DR and put together my spreadsheet for it), and one I don't know that much about either. Fun stuff. I should really go read the technical manual for the software they use so I understand the charts better.
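For what it's worth, the arithmetic behind those stop counts is just a log ratio: DR in stops is log2 of the clipping level divided by the noise floor, so any NR that shrinks the noise floor raises the measured number. A toy calculation (not CineD's actual methodology, and the noise figures are invented) to show why RAW-plus-NR and baked-in-NR results aren't directly comparable:

```python
# Toy dynamic-range arithmetic: stops = log2(clipping level / noise floor).
# Not any lab's real methodology - just showing how noise reduction
# shifts the measured number.
import math

def dr_stops(full_scale, noise_floor):
    return math.log2(full_scale / noise_floor)

full_scale = 1.0           # normalised clipping point
raw_noise = 1.0 / 2 ** 14  # hypothetical noise floor ~14 stops down

print(f"RAW, no NR:          {dr_stops(full_scale, raw_noise):.1f} stops")
# NR that cuts the noise floor by 4x adds about two stops on paper,
# which is what a camera with baked-in NR effectively ships with.
print(f"RAW + NR (4x lower): {dr_stops(full_scale, raw_noise / 4):.1f} stops")
```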
-
Cool - calibration is the key. In terms of monitors beyond that, I'm not really sure how to judge a good monitor vs a bad one. Obviously if you need a wider colour space than 709 then you'll want the monitor to have that coverage, and if you're working in an environment where you want to look at the monitor from different angles (eg, if you have multiple people reviewing your work at the same time - like director, cinematographer, producer, etc) then colour accuracy over angle-of-view will matter. Maybe the bit-depth matters too? I use a Blackmagic Ultrastudio Monitor 3G to get a 10-bit image out of Resolve to my reference monitor without going through the OS colour profiles, so maybe the 10-bit helps? I read a bunch of monitor reviews some time ago but I got the impression they were the same as most camera reviewers - people who didn't know much (except how to speak like they're an expert) just repeating specs and media-release BS back at you.
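On the bit-depth question, the arithmetic at least is straightforward: 8-bit gives 256 code values per channel and 10-bit gives 1024, so the 10-bit path has four times finer steps across the same gradient - it's mainly about avoiding banding rather than "more colour". A quick illustration:

```python
# Code values per channel and step size for 8-bit vs 10-bit video,
# illustrating why a 10-bit monitoring path shows smoother gradients.
for bits in (8, 10):
    levels = 2 ** bits
    step = 100.0 / (levels - 1)  # step size as a % of the full signal range
    print(f"{bits}-bit: {levels} levels per channel, "
          f"each step = {step:.3f}% of the range")
```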
-
AUS only issue - that sounds about right!
-
If you're paying retail, $100K isn't a high end system... it's a very nice turntable, without a preamp!
-
"If it looks good, it is good" - common mantra from professional colourists.
-
Well, we all know how optimistic those manufacturers' dynamic range specs are, but yes, Sony are up there in DR.
-
I suggest this model of calibrator: https://www.bhphotovideo.com/c/product/1506566-REG/x_rite_eodisstu_i1display_studio.html Then buy whatever you can afford with the rest of your budget. Using a monitor without calibrating it is like having a two-year-old adjust all the manufacturer's secret monitor parameters and then using it as a colour reference for your work.
-
An average monitor that's been calibrated will be better than the most expensive monitor available that hasn't been calibrated. There is no such thing as an "accurate" monitor without calibration. Buy a calibration device and then go looking for a monitor. When you do this you can calibrate your laptop screen as well.
-
Everyone here should be very, very happy that they're not into high-end audio instead of cameras. I used to be, and high-end audio is broadly similar, except:
- a setup is 8 products, not two
- add a zero to the price
- you can't listen to it over the internet - imagine if you were listening to a radio show about cameras instead....
-
I really think that editing, and analysing editing, is the best way to improve everything else. People talk about "shooting for the edit" and that's a great strategy if you know what the edit is going to be, but I suspect most people don't. The edit is really where the rubber meets the road - you can have all the great shots and nice sound you want, but if it doesn't work in the edit then it isn't good, period. So by deepening your understanding of the edit you're really deepening your understanding of what "good" really is. I'm noticing so much about cinematography and it's really making me think about shooting differently too.

I'm much less advanced in my understanding of sound design than I am of cinematography, but I'm really starting to notice things there as well. Apart from the ad-breaks, where obviously every element in the edit stops and resets, there aren't many neat pauses; the structures of these edits are more like a continuous stream where threads are interwoven, so visual themes, voice-overs, music, sound design (eg, ambience), and other elements are all overlapping. It gives a great feeling of carrying the viewer along on the journey, with a really rich experience, and it also provides a great tone to break from if you really need to.

In one episode there's a scene in a restaurant where the host is interviewing some people, and all of a sudden some gangsters drive a stolen car almost into the restaurant, get out with guns, and the place instantly goes to chaos with everyone crouching down - but the cameras keep rolling. In that section they cut to just raw clips with basic editing, which contrasted starkly with the rich multi-layered production just before, and further heightened the feeling that everything had changed. It's great to see how they're supporting the whole aesthetic theme of each episode with all the layers and techniques - a true alignment of all departments pushing in the same direction.
-
I've been persevering with analysing editing, and the more I do it the more I recommend it to anyone who wants to improve their skills. One piece of advice I would give, which seems kind of painful, is to cut up the video yourself manually. I cut up a show using the auto-magical tool, set it to be pessimistic (so it misses some cuts rather than creating false cuts on strong action), and then made a pass to manually add in the rest of the cuts it missed. That manual pass made me examine, frame by frame, some of the faster, more complex parts of the edit.

I'm cutting up travel shows with stylistic edits, and I have been noticing all kinds of tiny details in the editing. Some things I've found include:
- digital punch-ins for one or two frames before / after an edit to add a zoom effect, or a glitch effect if the punch-ins were offset in X and Y from each other
- jump-cuts removing maybe 2-4 frames to add pace and style to b-roll shots with movement in faster montages
- barn door / sliding door effects on edits
- 'burst edits' where there is a longer shot, 1 black frame, a 2-frame b-roll shot, 1 black frame, another 2-frame b-roll shot, 1 black frame, then a longer shot
- whip pan transitions
- wipe transitions where a blurry object moves rapidly across the screen to obscure the first shot and reveal the second
- whip pan transitions with a single frame of crazy motion-blur between the two shots - the single frame wasn't at all similar to the others so it kind of creates a flash effect
- fades... the fade-out / fade-in combination is used (noticing when it's used is interesting), but fade-out / hard cut in from black was another combination I saw - an interesting aesthetic
- colour grading variations - sometimes the colour was graded one way and sometimes another - seeing this choice of different looks was very interesting

This might sound super-trendy but it wasn't from a travel YT video - this was from a major TV travel show episode that won multiple awards for editing. There are probably other things in there I haven't noticed yet, but there are cuts I've spent whole minutes just rolling back and forth over, looking at and thinking about.

In addition to seeing things at the micro level with individual cuts, I'm noticing things at the structural level, where the relationship of scenes within the final edit is laid out. If you're marking up the edit yourself then the relative structure becomes visually obvious, and the act of doing this makes you think about what a scene really is. Do the shots that get you to and from a location count as part of the scene? What is a "location" - some "scenes" might move between rooms or buildings and others might just stay put. The analysis forces you to notice these things. Sometimes they might have one scene that moves around and another that doesn't - trying to understand why is very useful.
It is having several effects on me:
- It is making me realise how shallow the level of skill is on things like wedding videos (despite them being perfect candidates for many of these innovative techniques, which are multi-layered and very stylised)
- It is making me realise how little is actually required to get a nice edit - some of the videos I've really enjoyed watching have been very, very simple: just a simple structure and just music, with no other sounds
- It is showing me how much you can get away with - knowing what is in an edit frame-by-frame and then re-watching it and not seeing what you know is there (or removing a single frame, re-watching, and not noticing a difference) really shows you what is perceptible and what isn't**
- It is adding more and more tools to my editing toolkit
- I'm realising the importance of sound design
- I'm learning so much by watching the cinematography too

** yes, I have checked that my editing setup is playing all the frames (I've actually examined the latency and jitter on my setup by filming my laptop screen and external monitor with my iPhone at 240fps, and verified that the image timing is identical on both screens and that each frame is visible for a similar amount of time, ie ~10 frames at 240fps per frame of 24p).

I cannot recommend this type of analysis highly enough. We watch so much content, and yet so much of the artistry simply isn't apparent until we go deep and really look, frame-by-frame, at what the masters are doing. Pick something done by the people at the pinnacle of the craft, go deep on it, and see what you find. You won't regret it - I find it's actually more entertaining than just watching TV or a movie, so it's not a chore at all, but like opening a window to new ideas you wish you knew earlier.
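For anyone wanting to try the auto-detection first pass described above, here's a minimal sketch of how it could be scripted (assuming the PySceneDetect library; the filename and threshold are illustrative - raising the threshold makes the detector more "pessimistic", so it misses some cuts rather than inventing them on strong action):

```python
# First-pass cut detection with PySceneDetect, tuned conservatively so
# fast action doesn't create false cuts; missed cuts get added by hand.
# Assumes `pip install scenedetect[opencv]`; filename and threshold are examples.
from scenedetect import detect, ContentDetector

scenes = detect("travel_show_episode.mp4", ContentDetector(threshold=35.0))

for i, (start, end) in enumerate(scenes, 1):
    print(f"Cut {i:03d}: {start.get_timecode()} -> {end.get_timecode()} "
          f"({end.get_frames() - start.get_frames()} frames)")
```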
-
I wouldn't think that the 50p readout would be different from the normal readout - both RAW outputs are at full sensor resolution with no processing, so there's no downsampling or anything going on. I prefer to shoot Prores on this as it does a great job of removing moire etc (the BM engineers are better at it than I am!) but it is still prone to it with sharp lenses and fine detail.

I'd suggest that the only reason to change settings in camera is to change the frame rate; otherwise you can just set it and then simply adjust aperture / focus / ND shot-to-shot, so it's a very straightforward experience, actually. If you're shooting Prores rather than RAW then in theory you should be adjusting WB too, but if you're shooting outside there's a good argument for just setting 5600K - when things are warmer or cooler in reality they will show up like that in the files, and that's appropriate because that's actually how it was. If you're using artificial lights then just set for those and forget.

IIRC I read somewhere that the WB latitude of Prores HQ was as good as the RAW, so that's probably something else worth a test to confirm, but certainly my experience changing WB in post was just lovely. I made some quite strong changes, as I was shooting at and around sunset and dealing with some quite strong WB shifts, and it was all very straightforward and a pleasant experience.
-
I think you missed the point. The GH5S was a non-IBIS camera designed for use in externally stabilised situations (ie, gimbals, tripods, steadicams, vehicle mounts, etc) where the IBIS mechanism would cause problems (eg, the sensor wobbling after a bump) and a fixed sensor is the preferred design. If you rely on IBIS (as I do), the GH5ii or GH6 are the upgrades to the GH5. The GH5S was the start of a parallel line of fixed-sensor cameras that happened to come out with some improved specs, happened to come out around when a GH5 replacement would have been nice, happened to have a similar body, and happened to have a similar name.

Saying "imagine how successful Panasonic would have been if they just made a GH5S with IBIS a few years ago" is basically saying that you wanted a hypothetical camera that would have been popular. So on that note.... imagine how successful Panasonic would have been if they just made a GH6 with 18 stops of DR and internal RAW in 2014!
-
I found that grading the BMMCC images was completely different to anything I'd experienced before. I suspect it's because it's RAW, so the challenge is to build a grade from scratch rather than having the camera do all sorts of things for you (whether you want it to or not!). This was also the case for the ML footage I took from my Canon 700D, and it would be the case for the EOS-M too. I'd suggest finding some reference footage you like and using it as a guide for getting things in the ballpark, then adjusting from there. You could even shoot a test clip in both RAW and Prores and use the Prores as a reference for what the camera is doing, and as a starting point for your grade - especially for NR and softness/sharpening.
-
I suppose it's really about how close you are to the edge of what the equipment is capable of. I've posted about the GH5's limitations and how the GH6 is better in many ways, and probably not worse in many, cost being a notable exception of course. Truth is that I'd be nervous about buying any camera other than a GH6. I've been around long enough to have heard that camera manufacturers fill them with all kinds of stupid stuff that can really bite you in the backside if you don't know about it. It's normally certain combinations of things - in certain modes certain features aren't available or don't work as well, etc. I heard semi-recently from somewhere that in full-auto a camera doesn't smoothly adjust the exposure level, but goes up or down in steps while recording. I was stunned to hear this, and it made me wonder what the hell else that camera (and others) does or doesn't do that is completely ridiculous.

Actually, the irony is that the shot in question would probably be unusable from a 5D2 or an NX (I don't know the Oly well enough to comment). The camera was on auto-ISO and auto-SS and the Voigtlander 17.5mm was fully open at F0.95, so the SS would have been 360 degrees, and there's enough noise in the shot for me to know the ISO was really being pushed. IIRC I also lowered the exposure (using exposure compensation) so that the background wasn't completely obscured, pushing the subject lower into the exposure range. Note that the BTS shot, which was taken with a modern iPhone, is out of focus. Think about how low-light the situation must have been for a phone not to be able to focus on anything in the frame!

This is one of the shots that shows how well the GH5 operates under pressure - many of my shots are right at the limits of the IBIS, SS, lens T-stop, and ISO performance, and right at the limits of my ability to grade the footage and bring out the best in it. I know that many of my shots are right at this limit because many are actually past that limit and I can't salvage them. That's kind of the unspoken undercurrent of this thread. Depending on how heavily you rely on the features of the GH5, there may literally not be a better camera in existence (except the GH6). Many brands offer better low-light, colour science, DR, etc, but IBIS is substantially worse on all the options except Olympus, who can match the IBIS but still don't offer every feature of the GH5, and definitely not of the GH6. Even the FF Panasonics have their own drawbacks: the S1 and S5 are cripple-hammered because they're not the S1H, and the S1H is large and expensive and still doesn't have all the features.

I hit the limits of the stabilisation of the GH5 quite often. Mostly it's due to being cold or having low blood sugar. I'm shooting with manual lenses of course, so I could get better stabilisation by using an OIS lens, but you'd lose a couple of stops of low-light - and guess when you often get cold... at night, when it's dark!

In reality, there could just as easily be a thread about who hasn't upgraded from the GH5 because it is still the best compromise for all the things you need in a camera. The image quality is actually the worst feature of the GH5 - the best features are all the ones that add up to you being able to get the shot in the first place.

I took the OG BMPCC out to shoot once, for a visit to a festival in a park, and that was enough to tell me it wasn't going to work for me.
The screen couldn't tilt, I couldn't see it in daylight (I could solve that with a monitor, but that would double the size of the camera, which defeated the point), the poor screen meant that I couldn't focus properly, etc. I spent so long getting each nice b-roll shot (and once graded they were genuinely lovely - the ones in focus, that is) that I literally lost track of where my wife went and had to skip part of the place to find her again. Most of the spontaneous moments had evaporated before I was able to get the focus and exposure dialled in. This is all completely appropriate and not the fault of the BMPCC - it IS a cinema camera after all! It's just that life doesn't wait for the camera to be ready.

Of course, the easier your shoots are, the less these things matter to you. If you hit record on your camera and know what's going to happen with the lighting and composition of the shot, even 20s from now, then the demands on the camera you choose are significantly less. Anyone who can control the action to wait for the camera has such a different experience of shooting that they practically live in a separate universe.
-
Me sitting in a row boat: a shot I got from where I was sitting: Obviously if you can light or expose better then you should, and how much control you have over the variables depends on what you're filming, but most of the time the most difficult shots to film are the ones where I have basically no control over what is going on. I regularly find myself right at the limits of what the GH5 can do, and I just find it annoying that the generic response is to just "film better". I asked the question on the colourist forums too and they basically had no advice except to say that shooting in nicer conditions and using a better camera were the only answers.

PS - if you're not aware, standing up on one side of a row boat is ill-advised unless you want to immediately transition into underwater photography.
-
You want people to shoot travel videos, adventure videos, home videos, weddings, outdoor sports, and everything else that happens outside, in a studio instead?