Everything posted by kye

  1. I'm no expert, as I only know Resolve, but from my understanding it's normally worth investing in things like shortcuts, as they'll pay off in the long term. In terms of learning the Premiere shortcuts vs customising them, I'd suggest reviewing the Premiere ones and seeing if they suit your workflow. The problem in editing that no-one talks about is workflow, and how some people's workflows are very, very different. Some examples:

    Do you add all your clips into a timeline and then make selects by removing the material that is no good (bad takes etc), or do you review the material from the bins and add the good stuff? Fundamentally different shortcuts, and even mindset.

    For larger non-linear projects, do you sort your selects into timelines (e.g. b-roll by location, interviews, events, etc), or do you do this filtering via tagging and metadata and not via timelines at all? My understanding is that Premiere and FCPX have much more power in their media management (tagging etc) than Resolve, which is why editors think Resolve isn't really "ready" for big projects yet.

    Do you build your final edit in a subtractive or an additive way? That is, do you pick all the good bits, pull them into a timeline, then cull, re-arrange and tighten in passes until you're done, or do you pull in only the absolute best bits and add and re-arrange things until you're done? With the former you'll spend a lot of time looking at clips that will eventually get cut (a clip can survive many passes and get cut at the last minute), but once you've cut something you're unlikely to ever look at it again. With the latter, you'll potentially be reviewing your entire collection of clips every time you want to find the next clip to add. If your media is well curated (metadata, tagging, etc) this can be an efficient process, but if not, you could get lost forever, as there's no guarantee you're making progress.

    When you're making an edit, do you want to move the timing of the edit point to suit the content of the clips, or do you want to change the content of the clips around a fixed edit point? If you were editing dialogue you'd do the first, but if you were editing to music you'd do the second - fundamentally different shortcuts and thinking.

    When you're editing, are you concentrating on the edit point, or the clip? If you're looking at the edit point, then Clip End refers to one clip and Clip Start to the next clip, so you're editing two clips at once. In a clip-focussed approach, Clip Start and Clip End refer to the same clip. I hit this issue in Resolve, as I edit in a clip-centric way and edit to music, but some of the Speed Editor controls (very handy ones, I might add) work in an edit-point-centric way, and there's no way to change this.

    etc...

    The other thing to realise is that there are many small tasks that must be done in an NLE, and different editors may have different ways of doing them. It's easy to compare shortcuts, but accomplishing the same outcome in one NLE might take a fraction of the time with one mindset rather than another. This is the hidden aspect of NLEs - they are designed for editing in a particular way, and may be less efficient when used in another way (or may not really support that other approach at all).
Obviously this is very personal to what you edit, how you edit, how large the projects are, etc, so the answer ultimately can only be determined by you. But don't only learn how Premiere can do what Resolve can do - try to learn what Premiere can do that Resolve can't.
  2. Yeah, I think you're right. I have a small 1080p action camera that would be pretty good, so that's my current best plan. I could either mount it to the scooter or maybe get a mount for my helmet, which would be stable and more likely to be remotely level too. One thing I really like about the HyperSmooth stabilisation on the latest action cameras is that when you mount them to a vehicle or something, they aren't locked to the vehicle, so you see the camera moving around in response to the terrain (smoothly, of course) - it's a nice effect, I think. Can you turn the stabilisation on your cameras down to a lower setting, perhaps? Not sure if that's something that's available?
  3. Just kidding... 🙂 My understanding is that Premiere is probably still a better editor than Resolve (although Resolve is closing the gap) and that the main issue with Premiere is that it crashes all the time. Save early, save often. In terms of colour though, we all love to think that the 5,984 controls that Resolve has are required for good colour, but it's not true - if you get the basics dialled in, then you can get great looking images with the tools that basically any NLE has.
  4. Stabilisation in post

    Yeah, I suspect that it's often under the threshold of what is perceptible. I also have a theory that this threshold is getting higher over time, as people slowly get used to cameras that expose with shutter speed (auto-SS). Your comment about compression from online platforms is an interesting one: YT in 4K has more resolving power than basically any affordable camera had a decade ago, so resolving power has actually gone through the roof, but people's perception has dulled more than enough to compensate. I've actually gone the other way in my work - I used to shoot quite dynamic shots and stabilise in post a lot, whereas now my shots are much more static and I basically don't stabilise in post at all. This forum used to be full of people talking about motion cadence, which, despite never really getting a good definition, was a pretty subtle effect at the best of times, and yet now people seem to be comfortable with the blur not matching the camera's movement, which I would imagine would be an effect at least one or two orders of magnitude more significant than motion cadence. I also find it amazing that people have adjusted to 4K being cinematic, when even now many cinemas are 2K, and every movie (apart from those on 70mm) basically had 2K resolution by the time you saw it in a theatre. How perception changes over time!
  5. Fuji X-H2S

    I'd be a bit careful about interpreting this type of information - I remember statements from the launch of the Alexa 35 that indicated otherwise. Most likely everyone was telling the truth (there's unlikely to be a scandal here!) but everyone was using carefully chosen words.
  6. Stabilisation in post

    If you have a 180-degree shutter and shake the camera, your images will have shake and motion blur. This will look normal because the blur will match the shake - if you shake / move left, the blur will be horizontal and the size of the blur will match the shake / motion in the shot. If you stabilise in post, you remove the shake but not the blur. If you stabilised in post completely, so that the shot had no shake, it would look like a tripod shot because the camera movement would be gone, but all the blur would remain - so a stationary shot would blur in random directions at random times for no conceivable reason. This is a test I did some time ago comparing OIS / IBIS vs EIS (stabilisation in post is a form of EIS). The shot at 25s on the right, "Digital Stabilisation Only", shows this motion blur without the associated camera shake. The IBIS + digital stabilisation combo was much better, and is essentially the same as OIS + digital stabilisation. The issue here is that people using IBIS or OIS often have all the stabilisation they need from that, so the gyro stabilisation is aimed at people who have neither. This "blur doesn't match shake" effect also happens in all action and 360 cameras when they shoot in low light and their auto-SS adjusts to shutter speeds that include blur (which is why I bought an action camera with OIS rather than EIS).
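    To put some numbers on this, here's a minimal sketch of the 180-degree shutter maths - the frame rate and shake figures are made-up examples, not from any particular camera:

```python
# Sketch: how shutter angle sets exposure time, and why removing shake
# in post doesn't remove the blur baked into each frame.
# (Illustrative numbers only.)

def shutter_time(fps: float, shutter_angle: float = 180.0) -> float:
    """Exposure time per frame: (angle / 360) of the frame interval."""
    return (shutter_angle / 360.0) / fps

def blur_px(shake_px_per_s: float, fps: float, angle: float = 180.0) -> float:
    """Approximate blur streak length from camera shake during the exposure."""
    return shake_px_per_s * shutter_time(fps, angle)

fps = 25.0
print(shutter_time(fps))   # 0.02 s -> 1/50s at 25fps with a 180-degree shutter

# A shake of 500 px/s smears a ~10 px streak into every frame:
print(blur_px(500, fps))   # 10.0 px

# Post stabilisation can zero the 500 px/s displacement *between* frames,
# but the 10 px streak *inside* each frame stays - hence the mismatch.
```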
  7. With Sony and BM now offering gyro stabilisation, what are your thoughts about stabilisation in post as it relates to the 180 degree shutter rule?
  8. Interesting tiny cameras. What are you using them for?
  9. No personal experience, but some years ago, when I was looking for portable backup options for the XC10, I found that some of the portable HDD backup units would back up from a USB drive, and I read that those allowed you to plug in a card reader and back up from that. As I never confirmed it personally, don't take my word for it, but it might be a way to side-step the issue of which portable units have CF card readers and which don't.
  10. Shooting Open Gate

    When it comes to TikTok, the overwhelming factor in how I create and consume content is called "opportunity cost". Yes, open gate can be very useful if you're shooting for aspect ratios less wide than 16:9. Noam Kroll is a fan of alternative aspect ratios:

    https://noamkroll.com/aspect-ratios-in-filmmaking-are-officially-no-longer-standardized-the-creative-possibilities-are-endless/
    https://noamkroll.com/the-magic-of-the-1-661-aspect-ratio-how-i-plan-to-use-it-on-my-feature-film/
    https://noamkroll.com/playing-against-filmmaking-trends-on-our-feature-with-arri-alexa-classic-2k-prores-hq-43-aspect-ratio/

    That's just a few - he talks about it much more on his blog if anyone is curious.
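    As a rough illustration of why open gate helps here, this sketch computes the largest crop an open-gate frame can give for a few target ratios. The 5760x4320 frame size is a hypothetical 4:3 example - substitute your own camera's open-gate resolution:

```python
# Sketch: maximum crop available from an open-gate frame for a target
# aspect ratio. Frame size is a hypothetical 4:3 example.

def max_crop(w: int, h: int, target: float) -> tuple[int, int]:
    """Largest region with the target aspect ratio (width / height)."""
    if target >= w / h:
        return w, int(w / target)   # target wider than source: keep full width
    return int(h * target), h       # target taller than source: keep full height

W, H = 5760, 4320  # hypothetical 4:3 open-gate frame
for name, ratio in [("16:9", 16 / 9), ("1.66:1", 1.66),
                    ("4:3", 4 / 3), ("9:16", 9 / 16)]:
    print(name, max_crop(W, H, ratio))
# 16:9   -> (5760, 3240)
# 1.66:1 -> (5760, 3469)
# 4:3    -> (5760, 4320)
# 9:16   -> (2430, 4320)  <- a vertical crop still taller than 4K
```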
  11. ML with Canon truly produces some special images. If you can get over the lower DR, poorer low light, and lack of stabilisation, then the images can be truly excellent. I found the grain from it on my 700D to be spectacular as well - very natural feeling. The other camera that seems to get forgotten is the 700D. When I last checked, it was at the cutting edge of ML development, as one of the most active developers had one, so it got every new feature basically immediately. The limitation of the 700D compared to the 5D was that the 700D only had ~1700 pixels across the sensor (IIRC that was every third pixel), and if you wanted more horizontal resolution than that, you'd have to crop pretty severely into the sensor. But 1.7K RAW upscaled to 1080p didn't look too bad, especially if you were going for a slightly more organic look. The image from the 5D is nicer of course, but I'm yet to see images from the EOS-M that the 700D couldn't do.
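    For what it's worth, the line-skipping arithmetic checks out - a quick sketch (the 5184 px width is the 700D's stills resolution; the every-third-pixel skip is as recalled above):

```python
# Sketch of the line-skipping maths: the 700D's sensor is 5184 px wide
# for stills, and reading every third pixel gives the ~1700 px RAW
# video width mentioned above.

sensor_width = 5184        # 700D stills width in pixels (18MP: 5184 x 3456)
skip = 3                   # read every third pixel (as recalled in the post)
raw_width = sensor_width // skip
print(raw_width)           # 1728 -> the "~1700 pixels across"
print(1920 / raw_width)    # ~1.11x upscale needed to reach 1080p width
```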
  12. @leslie @BTM_Pix I don't care about getting stabilised shots - I care about not shaking the camera literally to death, to the point that the OIS / IBIS mechanism stops working and the camera becomes unusable. The severity of the vibrations is unlike anything I can think of, except perhaps amusement park rides, and even then this would be a pretty harsh one.
  13. Shooting Open Gate

    Ha, an influencer making videos about how to be an influencer. He said "What do you struggle the most with as a content creator?" and I was thinking - obviously it's editing and sound design. Then he talks about cropping to post on TikTok. Umm, no. I mean, I get it. If you're an influencer, you want to shoot in 8K RAW so you can record everything, and if something wonderful happens you can crop and post on all the socials and make an edit where you crop from a wide to a mid to a long to a nasal close-up, etc. But I'm just thinking - at what point would concentrating on writing and ideas be more beneficial?
  14. My recent hobby is riding my electric scooter around the place for Little Camera Tests, and I've contemplated mounting a camera to the scooter to get some zoom-zoom shots, but also just getting a smartphone mount so I can have navigation or whatever open. The challenge is that the scooter has zero suspension, and footpaths / sidewalks are not even remotely smooth. The bumps ripple up the handlebars almost undamped and actually make my wrists sore from their severity. I'm forced to drop down kerbs etc occasionally too, so that's a massive jolt as well. 1) Is mounting a smartphone (iPhone) a reasonable idea? 2) What about a camera with an IBIS mechanism, like the GX85?
  15. Olympus OM-1

    Interesting. Olympus has the reputation for having superior stabilisation, and of course has PDAF, but has been let down by a lack of higher bitrates and other codec options (such as the open gate functions of the GH line) - maybe they'll rectify that on this new body? If they took some inspiration from the GH6 and implemented ProRes, it could be a serious camera for those in the MFT ecosystem.
  16. Olympus OM-1

    Is the E-M1X their flagship camera? For some reason, no-one seems to understand their range and how the model numbers work.
  17. It does depend on how good the compression algorithms in the camera are, but remember that action cameras and smartphones tend to be hand-held (increasing movement in the frame) and wider-angle with deep depth of field (so the whole frame has loads of detail to compress), as well as high-res / low-bitrate.
  18. Use however much you need, but be aware that how much you need can vary radically depending on what you're filming. 50Mbps is tonnes if you're filming a talking head with a blurry background, but point your camera at a tree while there's lots of wind, or during rain or snow, or at the ocean, or from a moving vehicle, and the 50Mbps you were loving before might make you cry. Also, if you're filming at higher frame rates and then conforming to normal speed to make things appear in slow motion, your bitrate will get stretched accordingly - 50Mbps becomes 25Mbps when viewed at 50% speed on a timeline, etc. You can't add bitrate in post!
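    A couple of quick sums to illustrate the two posts above (illustrative numbers only - real codecs vary enormously in efficiency):

```python
# Sketch: (1) bits per pixel per frame, a rough gauge of how hard a codec
# is working; (2) the effective bitrate after conforming high-frame-rate
# footage to a slower timeline.

def bits_per_pixel(mbps: float, w: int, h: int, fps: float) -> float:
    """Average bits the codec can spend on each pixel of each frame."""
    return mbps * 1_000_000 / (w * h * fps)

# The same 50 Mbps is spread much thinner over a 4K frame than 1080p:
print(bits_per_pixel(50, 1920, 1080, 25))  # ~0.96 bits/px
print(bits_per_pixel(50, 3840, 2160, 25))  # ~0.24 bits/px

def conformed_mbps(mbps: float, shot_fps: float, timeline_fps: float) -> float:
    """Each playback second now contains fewer recorded seconds of data."""
    return mbps * timeline_fps / shot_fps

print(conformed_mbps(50, 50, 25))  # 25.0 -> "50Mbps becomes 25Mbps at 50% speed"
```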
  19. Thanks! That's really interesting - I wouldn't have run into that yet, as my current Custom modes are all based on P mode, and I normally use it with manual lenses, so I have control over the aperture and focus, with the camera handling auto-SS and auto-ISO.
  20. Lenses

    Thanks - I wasn't sure and hadn't gotten around to googling anything yet.
  21. These aren't from the same night as the above, but I'm guessing they were similar setups, taking images as a timelapse.
  22. @webrunner5 Nice shots! Getting sharp images of far-away objects is likely to be heavily dependent on the weather - I've spent a lot of time using my FD 70-200mm to shoot the sunset, and when I put it on a 1.4x teleconverter and use the 2x crop mode, all I see is heat shimmer. If I use the same setup to focus on a close subject, the image is crisp, so it's not the optics, it's the distance. The combination of 200mm with the 1.4x and the 2x crop gives an equivalent of 1120mm, which is all kinds of fun to get steady video with. This setup was using the 4K mode so that I could stabilise in post, because even here there was image shake - and it was coming from the ground through the limestone pillar!
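    The reach maths, as a sketch - it assumes an MFT body for the 2x sensor crop, which isn't stated above but is needed to arrive at the 1120mm figure:

```python
# Sketch of the equivalent focal length in the post: FD 200mm with a
# 1.4x teleconverter, a 2x in-camera crop mode, and (assumed) a Micro
# Four Thirds sensor with its 2x crop factor.

focal = 200          # mm, FD 70-200 at the long end
teleconverter = 1.4  # 1.4x teleconverter
crop_mode = 2.0      # in-camera 2x punch-in
sensor_crop = 2.0    # MFT crop factor (assumption: MFT body)

equivalent = focal * teleconverter * crop_mode * sensor_crop
print(equivalent)    # 1120.0 mm full-frame equivalent
```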
  23. If you're trying to get low-light performance, then the speed booster was never the right approach anyway. The speed booster is actually for something different - seeing the full image circle of a lens when the sensor is smaller than that image circle. IIRC @Andrew Reid found several (or more) lenses that covered FF even after being speed boosted, so sensor coverage wouldn't be the limit in many cases either. You'd want to see the full image circle if you were interested in a lens's imperfections - which, BTW, are mostly the source of lenses being 'cinematic', rather than the perfect modern lenses that are more sterile than a hospital and don't impart any emotion onto the images they create, leaving the film-maker to compensate for it elsewhere - a lost opportunity.
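    For anyone wondering what the numbers look like, here's a sketch of the usual focal reducer maths - the 0.71x ratio (the common one) and the APS-C crop factor are assumptions for illustration, not from the post:

```python
# Sketch: what a 0.71x focal reducer ("speed booster") does on a smaller
# sensor - it shrinks the image circle, widening the field of view and
# concentrating the same light onto less area, gaining about one stop.

import math

reducer = 0.71      # common focal reducer ratio (assumption)
crop_factor = 1.5   # e.g. an APS-C body (assumption)

def boosted(focal_mm: float, f_stop: float):
    eq_focal = focal_mm * reducer * crop_factor  # full-frame equivalent FOV
    eq_stop = f_stop * reducer                   # brighter effective f-stop
    stops_gained = 2 * math.log2(1 / reducer)    # light concentration, ~1 stop
    return eq_focal, eq_stop, stops_gained

print(boosted(50, 1.4))  # (53.25, ~0.99, ~0.99) - near-native FOV, ~1 stop faster
```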