
kye


Everything posted by kye

  1. Makes sense. There are a few channels I watch where they do talking head stuff, and all the B-Roll is stock footage or graphics / animations. There are even quite a lot of channels that don't have any filming at all and you don't know what the channel owner looks like because it's all motion graphics and voice-over.
  2. Just curious (I'm not in the market for this) but do you need a crew just to shoot with it, or is the crew really for setup & teardown? The reason I ask is that there's a bunch of YouTubers who make shows from a home studio setup where the camera, lighting, sound, and set don't move, and it's just about workflow efficiency and content. We're talking completely predictable exposure, WB, focus, and infinite power via wall outlets, etc. I know lots of those shows have multiple angles and that kind of stuff, but if you run a talking-head style channel with one angle and B-roll, maybe these old cinema classics will find home studios in suburbia to age gracefully in.
  3. Ooh, I don't know... if you wait long enough. Those VHS plugins were/are pretty popular in music videos! Having said that though, there is still something about ML RAW, even on a non-classic Canon body, that is delightful in a way that even the nicest non-RAW modes of any modern MILC just don't quite have. And the image from the Alexa really is just exquisite.
  4. From Tom's video? What about it don't you like? I could do without the skin tone smoothing myself, but the technique is pretty standard - I just don't think I'd posted a video showing how to do strong colouring and still keep reasonable skin tones.
  5. kye

    bmp4k adventures

    Thanks! I've shot a ton of sunsets, most aren't amazing, so it's just a numbers game. I think the first video was with the GH5, the second was the GoPro (waterproof!), and the third was probably my iPhone 8. You're right about shooting images and processing in post, as you get much better quality. Resolve is great for timelapses too because it treats image sequences as a single clip: it appears in the Media Pool as one clip, and you just drag it onto the timeline, edit, grade, and export. No messing around with special software. It's because some cameras shoot RAW as image sequences, so Resolve supports them natively and it's seamless. I'm quite fond of making timelapses with the GoPro because you just position it, hit record, then do whatever and basically forget about it. I just ordered a Sony X3000 as a replacement for my GoPro so will have to work out how that one works.
  6. Some may find this useful... Tom's videos are pitched at people less familiar with colour grading, but are thorough. TL;DR: The first node converts whatever format you shot (eg, log) into a more rec709 amount of contrast and saturation. The second node has a key for skin tones, and you can adjust mid-tone detail for a beauty effect if you like. The third node applies a LUT for a cool look. The fourth node is in parallel to the third node, and gets inputs from the second node of both the image and the key (which is inverted). The third and fourth nodes go into a Layer Mixer which goes to the output. What this does is basically put the skin tones on top of the LUT, and then by adjusting the Key Output you can adjust how transparent the skin is, blending it into the overall look (rough sketch of the blend logic below).
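This isn't Resolve's internals, just a minimal numpy sketch of the composite that the Layer Mixer ends up producing in that graph; the function name and the key_gain parameter (standing in for the Key Output control) are my own illustrative choices:

```python
import numpy as np

def blend_skin_over_lut(lut_graded, skin_graded, skin_key, key_gain=1.0):
    """Composite the keyed skin layer over the LUT-graded layer.

    lut_graded, skin_graded: float arrays (H, W, 3) in 0-1
    skin_key: float array (H, W) in 0-1, the skin-tone qualifier/matte
    key_gain: stands in for the Key Output level - lower it to make the
              skin layer more transparent so the LUT shows through.
    """
    alpha = np.clip(skin_key * key_gain, 0.0, 1.0)[..., None]
    return alpha * skin_graded + (1.0 - alpha) * lut_graded
```

Dialling key_gain down towards 0 fades the protected skin back into the LUT look, which is the "blending it into the overall look" step.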
  7. kye

    bmp4k adventures

    Good stuff - I look forward to seeing what you come up with. I shoot sunsets quite often, mostly time-lapse but not always; here are a random few: What are you thinking of shooting? Timelapses have their own challenges separate from normal-speed video. Some general thoughts are that unless it's very cloudy the DR will be more than your camera can handle, so you'll have to choose between blowing out the sun (and maybe the sky around it) or darkening the shadows a huge amount. The answer to this is to think about what the subject of the shot is and expose for that. In my case the subject is normally the sunset itself, so I just expose for that, letting the sun blow out but not the sky around it, and then it is what it is and I grade appropriately. [Edit: the best approach is to just do it every day, or whenever there are clouds in the sky but not on the horizon, and do it over and over again until there are no more mistakes to make and you start getting what you want.] And I totally understand about the Aussie sun - it's pretty harsh. On my travels I'm regularly surprised at how much less DR people in other countries require!
  8. kye

    Sports videography

    Went out to shoot this morning's game, new rig (as posted above) and new monopod (one of ebay's cheapest). I didn't have time to test the rig out before going to the game, and the verdict is.... The head on the new monopod is an example of you-get-what-you-pay-for, and overall it's slightly too long. Moving the mic forwards resolved the conflict with my forehead, but not the conflict with the brim of my hat. DOH!! New plan is to move the mic to the side to make it hat-compatible, and to modify the monopod to remove the head. This is my first attempt at putting the mic to the side: I've disassembled the monopod, but unfortunately the head didn't mount to the body via a 1/4-20 so I'll have to order a 1/4-20 and install one myself. I figure I'll be using it with the monopod basically vertical anyway as I'll be panning basically the whole time, so I won't need the head. Watch this space...
  9. So let's see if that's true, and if so, by how much.......
  10. If you look at the tests from the iPhonedo Hero 7 review showing the waveforms, you'll notice that different cameras expose slightly differently (where they put the darkest part of the histogram), have different DR (how stretched the histogram is), and also have different clipping points (where the highest values in the histogram are). How fast the lens is will mean that a camera can get a certain exposure level with a lower ISO, or a faster shutter speed at base ISO, but that will only matter in low light. The exposure levels aren't a factor of the lens; they are a factor of the camera's auto-exposure function and where it tries to expose. Action cameras are typically used outdoors in bright light, and considering they have a very wide angle, the auto-exposure algorithm will probably be tuned for when the sun is in frame and to just let it clip. If you use auto-exposure on a normal camera and put the sun in frame, they often darken the frame radically such that your actors are silhouettes so that they're not blowing out the majority of the sky, which is a different approach because they're cameras for different things. Ultimately what you want is something with the shadows raised up a little so that there's a bit more information there, you want the DR to be kind of squashed to get more in, and you want the clipping point to be as high as possible so you clip the least. You also then want the highest bit-depth (although they're all 8-bit because they're action cameras) and the highest bit-rate so you can push and pull the image around more without breaking it. In terms of PB's video, it will be very interesting but it's a pity he didn't get more cameras - the X3000 of course, but also the Yi 4K and RX0 II as well. I get it when an extreme sports person only looks at action cameras, but PB is a pro cinematographer who should cast the net a bit wider.
  11. Yeah, I watched that one a few times, I guess I'm more referencing that at some points in the video he talks about the three of them, but then at other points it's just the two. The Hero 7 video was great because the coverage was much more even between cameras. There are some aspects where I can't compare the X3000 and the Osmo directly, but have to compare the Osmo to the GoPro, and then in the GoPro review compare the GoPro to the X3000. But thanks
  12. You can! Step 1: apply the plugin Step 2: select the "Show Mask" box under Skin Mask Step 3: connect another node like this, then apply adjustments in the second node: As you can see it's not perfect, but this is probably a bad example and I didn't try at all to refine it, and there are lots of mask refinement adjustments in the Face Refinement plugin. Bonus Tip: you can do this with any node that has a "show mask" feature. I've done it with the EdgeDetect plugin, for example to soften only particularly harsh edges, etc. Lots of potential using this method to get highly stylised effects that no single plugin can do by itself.
  13. It would be nice, and it would be a long time coming too. Unfortunately I can't wait for that because I have a trip in mid-September that I need it for, so in a way I hope I don't buy an X3000 just before they release the updated model, but anyway. In case anyone isn't familiar with EIS vs OIS in low-light, here's why EIS is fundamentally flawed when the sun goes down... (link goes to the specific time in the video) Also, the X3000 seems to have more DR than the others and has the higher 100Mbps bitrate, so the files are less brittle in post.... (link to specific time in video) I'd love to see a direct comparison video of the Hero 7 (with current firmware) vs Osmo Action vs Sony X3000, but because the Sony is old it seems to no longer get included. My logic is that if the Osmo Action isn't clearly better IQ than the GoPro, and the X3000 has some serious advantages over the GoPro, then the X3000 is still a very strong contender. If you shoot low-light or want to use it more like a normal camera then it's still in first place, especially with things like a tripod mount, 3.5mm mic port, etc.
  14. kye

    Lenses

    FF lens tests with Geoff Boyle NSC Intro video explaining what they did and why: Main article: https://cmltests.net/CML-FF-Lens-Tests-2019.html Table of contents and links to individual lens test pages/videos: https://cmltests.net/CML-FF-Lens-Tests-2019-Individual-Lenses.html Table of colour shifts: https://cmltests.net/CML-FF-Lens-Tests-2019-CD.html Compilation video with all the lenses (the links above have links to separate videos too):
  15. kye

    film grain

    https://en.wikipedia.org/wiki/Double-loop_learning
  16. kye

    film grain

    Also, remember to adjust the amount of grain you're applying so that it looks good in your final delivery file. Compressing video really kills the grain, especially on YouTube or other online streaming services. I'd suggest making a test project with a series of shots with increasing strengths of grain, outputting it, uploading it, then looking at it and dialling it in from there.
  17. And whatever you don't crop you're downscaling, which improves the real resolution. 8K will be great from this perspective too. Think of it - you'll be able to get real 4K!
  18. I agree with @fuzzynormal in that while there are differences, they aren't huge, and if you really were shooting with the 'worst' of this bunch, the files would be good enough to push and pull around in post. The email thread on CML was titled "Budget camera skin tone shootout" because at the level those guys are operating at, these are all budget cameras(!). In that sense, the fact that none of them are really that bad speaks more to their ability to select good quality cameras than anything about the camera market, although the idea that there are no more bad cameras is certainly pretty solid. I've played around with skin tones enough to know that with a couple of very simple adjustments you can balance them across pink/yellow, and with a single Hue vs Hue curve you can even compress the hue variation, pushing pinks towards yellow and yellows towards pink to get more consistency and keep things in the sweet spot (rough sketch of the idea below), so even if you're looking at a very saturated and contrasty rec709 output from a cheap and excitable DSLR there are simple solutions to make a serious improvement to the skin tones. On a personal note I find tests such as this quite reassuring, because I picked G as the best, with the other ones I liked also rating highly in the poll. In a sense it confirms that my perception is pretty good, considering I agree with a bunch of professional cinematographers. I may not be great at colour grading, but at least I know that I like what others like, and if I keep practicing I'll get better, because if I do something right I'll know it when I see it.
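For anyone who prefers to see the Hue vs Hue idea as maths rather than a curve, here's a minimal sketch in plain Python/numpy (not Resolve; the centre, width and strength values are made-up illustrations): hues inside a skin-tone range get pulled towards a single target hue, which is all a Hue vs Hue curve with one gentle bump is doing.

```python
import numpy as np

def compress_skin_hues(hue_deg, centre=25.0, width=30.0, strength=0.5):
    """Pull hues near the skin-tone centre towards it, like a single
    Hue vs Hue bump.

    hue_deg:  array of hue values in degrees (0-360)
    centre:   target skin hue (illustrative value, roughly orange)
    width:    half-width of the hue range that gets affected
    strength: 0 = no change, 1 = everything in range lands on centre
    """
    # signed distance from the centre hue, wrapped to [-180, 180)
    delta = (hue_deg - centre + 180.0) % 360.0 - 180.0
    # falloff: full effect at the centre, fading to zero at +/- width
    falloff = np.clip(1.0 - np.abs(delta) / width, 0.0, 1.0)
    return (hue_deg - strength * falloff * delta) % 360.0
```

Pinks (above the centre) get pulled down and yellows (below it) get pulled up, so the spread of skin hues narrows without touching the rest of the image.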
  19. This video was shared on the CML list and a poll was conducted too. The poll has closed now (I just saw this today) but it's still interesting. The 8 cameras are (in no particular order): Arri Amira, Red Raven, Ursa Mini Pro (G1), Sony FS7, Panasonic EVA1, Panasonic GH5s, Kinefinity Terra 4K, and Canon C200. Here's the video - the answers and poll results are in the comments.
  20. kye

    bmp4k adventures

    Gives you an excuse to sit around and drink coffee like everyone seems to do in lifestyle advertisements!!
  21. I disagree. In the case of the gyro stabilisation, what you want is for the distinctive pixels in one frame to be as close as possible to their location in the last frame. It's a frame-matching problem. In image-analysis based stabilisation, the computer looks for distinctive pixels in each frame, works out the movement, then works out how to change frame 2 to better match frame 1, knowing the overall direction of movement from frame 1 to frame 1000. This method is about matching frames by analysing the frames directly (rough sketch of that matching step below). In gyro stabilisation, the computer wants to match frames to other frames, but does so by taking frame 2 and modifying it according to a completely different source of data. That data is very useful and does a good job, but it cannot do a perfect job because it is one step removed. This method is about matching frames by not looking at frames but by looking at something else. I think people are seeing that the GoPro and Osmo Action are better at stabilising than the software and thinking that's because it's a better method. I would suggest that it's not a better method but one that has had far more investment in the technology. The entire action camera industry now hinges on making the footage as stable as possible. It's a stabilisation arms race of sorts, and without going to 8K it's the only thing that is different between GoPro models. Conversely, there is no arms race with post-production stabilisation. The people who specialise in it are specialists, and the people that do serious post-production workflows and invest in them are people that don't record shaky footage in the first place, or if they do then they buy gimbals and IBIS and OIS and tripods etc. The action camera market has taken gyro stabilisation a long way further than the alternative. It's an investment thing, not an image stabilisation thing. Plus, think about how well IBIS works on a 16mm lens - pretty darn well. Think about how well it works on a 400mm lens - less well. Stabilisation in post is just fine at 1000mm. Interesting idea. My understanding of OIS is that there are gyros built in, so any camera with IBIS or an OIS-connected lens may have the hardware required. It probably wouldn't be set up to analyse that data and record it; it might simply be used to move the sensor around. I said "it's only 2 seconds" because I meant that 2 seconds isn't much future visibility for the stabilisation to work with. A 2-second delay on the viewfinder would be almost unusable. You quoted me out of context
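To make the "frame matching" point concrete, this is roughly what the image-analysis approach does for each pair of frames - a hedged OpenCV sketch, not what Resolve or any particular stabiliser actually runs internally, and the function name and parameter values are just illustrative. A real stabiliser also smooths the camera path over the whole clip rather than pinning every frame to the previous one; this only shows the matching step.

```python
import cv2

def match_frame(prev_gray, curr_gray, curr_frame):
    """Estimate and remove the motion between two consecutive frames."""
    # 1. Find distinctive pixels (corners) in the previous frame
    pts_prev = cv2.goodFeaturesToTrack(prev_gray, maxCorners=400,
                                       qualityLevel=0.01, minDistance=10)
    # 2. Track those points into the current frame (optical flow)
    pts_curr, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray,
                                                   pts_prev, None)
    good_prev = pts_prev[status.flatten() == 1]
    good_curr = pts_curr[status.flatten() == 1]
    # 3. Fit a similarity transform (translation, rotation, scale)
    m, _ = cv2.estimateAffinePartial2D(good_curr, good_prev)
    # 4. Warp the current frame so it lines up with the previous one
    h, w = curr_gray.shape
    return cv2.warpAffine(curr_frame, m, (w, h))
```

Gyro-based stabilisation replaces steps 1-3 with the sensor data, which is the "one step removed" part of the argument above.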
  22. @Anaconda_ you raise good points. The cameras are looking into the future, but as you say it's only a couple of seconds, whereas Resolve (for example) can see the whole clip before it decides anything. They do use gyroscopes (GoPro bought one of those companies, I think?) but that's kind of a cheat because instead of analysing the image they just blindly follow what the gyro tells them to do, whereas if you have smart enough stabilisation software then it should be able to overcome that. The gyroscope is a hack to reduce the CPU required; it's not a fundamentally better method.
  23. Another thing that no one is talking about is that EIS can be done in post-production, but OIS can't. Yes it's great to have stabilised footage coming straight from the camera, but stabilisation in post has the advantage because the camera can't predict the future but plugins or Resolve sure can. I think I'll end up with the X3000 and stabilisation in post. The HDR would be great to have though. I would have thought that anyone on this forum would be evaluating a camera with the whole production and post-production workflow in mind. When they go 8K they will have 6K MEGA-AWESOME-NOT-MOVING mode and you can de-fish and downscale to nice 4K imagery from there.
  24. Cool job on the left-hand DIY. I have a much more basic approach in my rig where I'm partly holding the rig but still able to focus.
  25. Or you get one, mount it to the front of your existing setup, set your 14-30 a bit narrower and, assuming you can easily match the two in post, you've got two angles to choose from.