Everything posted by kye
-
@thephoenix that is absolutely fantastic! I retime things quite a bit, and optical flow is great but doesn't do a good job when there's movement in the background, so this new SpeedWarp mode is really welcome. The video the guy uses is actually a really good test and shows the weaknesses in both approaches. DR just keeps getting better and better!!
-
Nice stills. I've heard people have a lot of love for that lens, and it sure would be a convenient one to carry around for simple shoots. You'd better hurry up and finish that film, or pretty soon we'll have all watched it frame by frame in this thread!
-
I believe that what is being referred to is "super whites", where the output signal is above the white point by default but can be brought back because the data is in the file. The XC10 does this. I suspect that if you're recording in ProRes then the conversion that happens may clip those super whites and they wouldn't be recoverable. I could be wrong, but I've seen super whites myself, and this clipping behaviour is very similar to the way that still images are processed with RAW / JPG files.
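As a loose illustration only (not how any particular ProRes converter actually behaves), here's the clipping idea in a few lines of Python, assuming 8-bit video levels where legal white is 235 and values 236-254 are the "super whites":

```python
# Sketch of why "super whites" survive in some files but not others.
# In 8-bit video-range encoding, black sits at code 16 and white at 235,
# but values 236-254 can still be stored in the file as headroom.
# A conversion that clamps to legal range throws that headroom away.
import numpy as np

# Hypothetical signal containing values above the white point (236-254)
signal = np.array([16, 128, 235, 240, 250], dtype=np.uint8)

kept = signal.copy()                # pipeline that preserves super whites
clipped = np.clip(signal, 16, 235)  # pipeline that clamps to legal range

print(kept)     # [ 16 128 235 240 250] - recoverable by pulling levels down
print(clipped)  # [ 16 128 235 235 235] - headroom permanently gone
```

Once clamped, no grading operation can tell 240 and 250 apart from 235, which is exactly the RAW/JPG analogy above.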
-
This is more popular than you'd imagine, although different people see this to varying degrees too. I recently noticed a shot with a low shutter angle in Peaky Blinders which is a very high production-value show, so although I have no idea what caused them to do that, it was even present there. I'm not sure how much this will help you though, as changing light conditions will still necessitate either AWB, or worse, WB corrections in post.
-
Actually, yes - I really liked this too - I forgot to comment on it. The strange ringing you get from loud sounds was done perfectly - "sound as headspace" was nailed completely!
-
I have an idea. If you can get a sample unit to me then we can work together to work out how to connect it to a live stream of stock market prices....
-
Awesome stuff! Lots of respect from me too. For just shooting it with what you had, for finishing it, and for uploading it. Only those who have done this will appreciate what an achievement that is!! In terms of my feedback, I really liked the different coloured lighting setups / grades. I didn't really understand the 322, but it seems to fit the whole other-worldly theme, and there's lots of cool stuff I don't get, so it's probably just me. Again, well done!
-
I watched that, and I also watched the one he referenced from the other channel (because I'm on Mac), and I noticed that they had different approaches to managing the database. Process completed and 16b1 works for me. MBP 2016 13-inch running macOS 10.13.6.
-
It is good, although the footage is so mushy. In a sense an action camera is an over-capture device, because you're probably going to crop into it in post, through re-framing, stabilisation, de-fishing, or a combination thereof.

I read an article from the guy who shot the girls-snorkelling section of the Hero3 demo video, which was fascinating. It took them a day to understand that the camera needed crap-tonnes of light and needed to be set to mostly manual (which was limited for this model). The next day they sorted out which profiles were best, the limits of the slow-motion mode, and how to grade things. The third day was a test shoot to work on framing and operator technique and refine the grading, then the last two days were shooting a crap-load and hoping to get a shot where the look, framing, and waves were all good and there happened to be exactly the right lighting. It was a shoot where they were stretching every single variable to the absolute maximum to shoot the demo video. The reality is that there's no way in hell the average person can make anything like the demo videos, and most pros would have difficulty too. I wish I could find it again (I've looked unsuccessfully several times) because it shows what it takes to truly get the best out of a camera.

I think better sensors in newer models help; getting enough light into these things is a huge factor in good image quality. The above was shot in ProTune, but I'm not sure there's much advantage over the standard mode TBH (at least for this model). I did a series of tests with different modes last night in controlled conditions, and setting it to the non-ProTune mode, where it grades and sharpens the image in-camera, seemed to produce a slightly nicer image than I could manage in Resolve with a decent degree of trying.
I'm thinking of the X3000 because it's got nice 4K and because the OIS eliminates movement during the exposure, which helps with RS wobble as well as in lower light. I actually think the form factor is well suited if you have a mic on it. The GoPro is strange because it's wide and shallow, yet any directional mic will be narrow and deep, so a Rode VideoMicro-style microphone would probably work much better with the X3000 than a GoPro. I've also been tossing up a 360 camera, but by the time you try to crop in, the picture falls apart. IIRC a 28mm lens is something like a 90-degree horizontal angle of view. If your 360 camera is 4K and you crop to a 90-degree angle then you're getting under 720p resolution and throwing away over 90% of the bitrate, so a 100Mbps 4K file becomes roughly a 7Mbps 960 x 540 one. 6K resolution isn't that much better, and 7Mbps is not a good look! I also need underwater, and for the price of having to point the camera, I think the extra IQ is worth it. When we get 20K 360 cameras then we'll be set, but until then...
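The crop maths above can be sketched in a few lines. Assumptions: a "4K" 360 file is 3840 x 1920 equirectangular, the crop is a 16:9 frame covering a 90-degree horizontal view, and bitrate scales roughly with the fraction of source pixels kept (a simplification - real codecs don't distribute bits evenly):

```python
# Rough estimate of effective resolution and bitrate when cropping a
# 90-degree 16:9 view out of a 3840x1920 equirectangular 360 file.
def crop_estimate(src_w=3840, src_h=1920, src_mbps=100, h_fov_deg=90):
    crop_w = src_w * h_fov_deg / 360            # pixels spanning the horizontal FOV
    crop_h = crop_w * 9 / 16                    # force a 16:9 output frame
    frac = (crop_w * crop_h) / (src_w * src_h)  # fraction of source pixels kept
    return crop_w, crop_h, src_mbps * frac, (1 - frac) * 100

w, h, mbps, wasted = crop_estimate()
print(f"{w:.0f} x {h:.0f} at ~{mbps:.1f} Mbps ({wasted:.0f}% of the bitrate thrown away)")
```

That works out to 960 x 540 at around 7 Mbps, i.e. sub-720p and roughly 93% of the file spent on pixels you never see.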
-
I think I read somewhere that the first computers with storage were programmed to store all a user's information in a database, but they had to move to a file system because it made things easier to exchange with other people, a use they hadn't fully realised would be so important. I can understand there's a push to make mobile operating systems more like computers, but unfortunately there's also a push to make computers more like tablets, which is an awful step backwards for lots of users. To be honest, I'm pretty dismayed that modern computers can calculate zillions of instructions per second, and graphics cards can render millions of polygons per second, and we use them to display the same interface that was invented in 1973 by Xerox PARC with the Alto. Tablet operating systems are the first real innovation since then, and they made functionality simpler, but worse for doing actual work. It's shameful really.
-
Filmed this yesterday: Some thoughts:
- It's both a camera test and a real video - if you're going to go through the whole process then why not get a nice memento out of it
- I've realised that I'm comfortable with the point-my-camera-at-things style of film-making, but my videos are about my family so should also include me, so this is a test to try and get better at that
- This was shot with my positively geriatric GoPro Hero3 and was kind of a test at using it as a selfie camera
- It's shaky as hell, and if you apply strong stabilisation in post it becomes wobble-vision. I plan to replace it with the Sony action camera, which has OIS
- It was convenient, small, and easy to get a variety of shots with, so I think the form-factor is good
- 2.7K mode isn't great; the Sony in 4K would be much improved
-
I think I might, after watching that video and working out it'll be months before we see the full release. Plus I shot a short video yesterday and that was quite fun, so I have a bit more energy for editing again, I think.
-
Oh, interesting. How do you.... oh, hang on. I look forward with anticipation to the official launch of your new YT channel... CatRidesMan
-
...and the people using Canon for video is a drop in the ocean of Canon photographers. ......and the people using Canon for stills is a drop in the ocean of photographers. ..........and the photography market is a drop in the ocean of the electronics market. ..............and......
-
@UncleBobsPhotography @Shirozina I think it depends on what level of lighting variety there will be between shots, the tolerance the OP is willing to accept in the final edit, and the level of time, skill, and software capability available for colour matching in post. I've spent a lot of time trying to match different cameras in post (made more difficult than this situation because my cameras also had different colour science and were pretty bad quality cameras) and I've found that it can be really difficult to get acceptable matching, even if you half know what you're doing in grading. Obviously ND filters vary far less than entirely different cameras, but the differences I couldn't overcome were so large that they'd be completely unusable in most commercial settings, so for the OP the difference between NDs might be practically relevant. The fact they're asking about how to do a test, and that their approach is detailed and seems sound, would also indicate they might be creating a higher quality product where there is less room for variations. I could be wrong of course - we all obsess over tiny little details that don't much matter in the grand scheme of things (like cameras, colour science, etc!)
-
Throwing stones is good. I can't comment on the P4K, but I'm in m43 land with my GH5 and I think your summary is about right. FF lenses adapt really well for the longer focal lengths, but getting fast/wide/both lenses is the challenge. In case you're not familiar with the options:

Voigtlander make an excellent series of f0.95 lenses. These come in 10, 17.5, 25, and 42.5mm lengths, which are the equivalents of FF 20mm/1.9, 35mm/1.9, 50mm/1.9, and 85mm/1.9. They're gorgeous to use, and the aperture ring can be adjusted to de-click. I only have experience with the 17.5mm one, which is soft at 0.95 but is almost fully crisp at f2.8 (FF f5.6 equivalent), and for some reason has a strange colour shift at 0.95 that cleans up by f1.4.

I use the SLR Magic 8mm f4 (equivalent to a 16mm f8), which is optically good but not great, and ergonomically a pig because it's designed for drones, not a human user. There is a Laowa 7.5mm f2 which is more expensive, but seems to be good optically, is designed for human use, and is popular with vloggers due to the 15mm-equivalent FOV. The Sigma 18-35 f1.8 on a Metabones adapter is popular and optically and ergonomically nice too, although it's heavier than the m43 lenses.

In a sense, I think your criticisms aren't so much that the P4K image isn't great, but rather that the competition is actually really good. I think we're spending more and more time being overly critical and nit-picky, or making genuine and practically relevant criticisms, about cameras that output a level of quality at a fraction of what equivalent cameras used to cost.
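The equivalence figures above follow from the m43 crop factor of 2x applied to both focal length (field of view) and f-stop (depth of field); a quick sketch:

```python
# Full-frame equivalence for Micro Four Thirds lenses (2x crop factor):
# field of view and depth of field both scale by the crop factor.
CROP = 2.0

def ff_equivalent(focal_mm, f_stop, crop=CROP):
    """Return (FF-equivalent focal length, FF-equivalent f-stop)."""
    return focal_mm * crop, f_stop * crop

# The Voigtlander f0.95 line-up mentioned above
for focal in (10, 17.5, 25, 42.5):
    eq_focal, eq_f = ff_equivalent(focal, 0.95)
    print(f"{focal}mm f/0.95 on m43 -> {eq_focal:g}mm f/{eq_f:g} FF equivalent")
```

Note this equivalence covers framing and depth of field only; light-gathering per unit area (exposure) is still set by the actual f0.95.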
-
I was just thinking about how long this might take. Some googling revealed that v15 was released in April, had 8 beta versions, and was fully released in August. Version 14 beta 1 was released in April and came out of beta in September, so 4 or 5 months for each of the last two releases. Both of those had entirely new products spliced into Resolve, whereas the Cut page is a different view on existing features, with some extra smarts in the background, so it should be far less of a programming challenge. I have no idea how many beta versions there will be or how long it will take, but I'd suggest that although it's probably less than the 4-5 months of previous versions, BM aren't afraid to take their time and release lots of beta versions before considering it ready for 'real use'. I say 'real use' because I think there are lots of seriously high-end people using it for big-budget projects who would wait for the final release and wouldn't even think about using a beta version. Of course, PP seems to be buggy as hell and that's got its fair share of VIP users, so who knows. I am wondering if I should just back up my database and install it. The Cut page is seriously tempting.
-
It's probably worth mentioning that a tablet can be really good if you're filming yourself and you use the tablet to control the camera. Some cameras have excellent remote control apps or browser-based apps with touch-to-focus and all the aperture/SS/ISO/etc settings.
-
I don't, but IIRC my sister was using it when she was a Script Supervisor, for all her notes and photos for each location/setup/shot/take. I don't know if there was a particular app she was using or if it was just notes and the camera, or mixed with paper notes too, but it worked really well for that.

I worked out early on that tablets are for consuming content and laptops / computers are for creating content. The fact that the iPad doesn't actually multi-task is a complete deal-breaker for me. On a computer, if you have a few applications running then they're all running, and if you change away from one and then back again you'd expect it to be the same as when you left it. On the iPad, apps actually just make a note of where they were, then when you go back to them they re-create where they think you were, and mostly get it wrong. If you have bad internet you'll realise that the web browser actually has to re-load the page when you go back to it (too bad for you if the internet is now gone), or they forget where you were in a video or page or whatever, and you have to navigate back to where you were....
-
Yeah. Plus that insert edit which doesn't need the play head to be frame perfect. I suspect there are dozens of little tweaks in here that will all save a lot of time (and probably frustration too!)

I have done a bunch of efficiency / process improvement projects in my day job, and I suspect they might have videoed a bunch of professional editors working and then analysed every keystroke and mouse movement to classify what they're doing and how much time each thing adds up to over an entire editing session. Then you can see that they spend X% of the time zooming, Y% of the time doing whatever else, etc, and work out how to reduce or eliminate those things completely. If they have taken that approach (which seems logical) then the zooming and the insert edit will be two entries in a long list of many many small improvements they could have made. People who design user interfaces sometimes look at how far apart the buttons are (for example, if you're moving the mouse from one side of the screen to the other and back a lot) and will rearrange the buttons so there's less mouse movement.

When Toyota starts up a car assembly plant, the line moves slowly at first; they work out which steps are taking the longest, then improve them until every step has a bit of spare time, then speed the line up a little, then do it all again. Just by making these little changes you can make an assembly line go two or three times as fast as it was originally, because all the little changes really add up. Which is absolutely great stuff for us, because we're the ones who benefit.
-
Great overview video about why the Cut page is so much faster than the Edit page: I haven't downloaded v16 yet (I don't do beta versions) but this makes me really excited about cutting together an initial assembly. An interesting point from the video is that he mentions you might be cutting together 1000 clips in a day (1000 clips at 2s each is 33 minutes of video, so that seems reasonable). I did some maths on this, and if you spend 5 seconds zooming in and out to make and fine-tune the beginning and end of each clip, which seems about right from my editing experience in Resolve, that works out to 83 minutes of zooming! The Cut page shows three different zoom levels simultaneously and you don't need to zoom at all, so that's a huge time-saver just in zooming. There are a bunch of other things they've fixed too that are a bit of a PITA when doing this in the Edit page, so it looks like it will be so much easier. I am really excited about this!
-
Aram K commenting on whether there is a big difference between having an external reference display with BM hardware vs the GUI display. (TL;DR - not really, as long as you have a good quality monitor and calibrate it.) I'm not a huge fan of Aram as his level of technical knowledge isn't the best, but in terms of being able to see colour and comment on it, I think his impressions are probably useful, and considering I've never compared the two it's a useful opinion.
-
I'm no expert, but assuming you custom-WB and set exposure every time you change anything, that should be a good approach. Changing the aperture is more likely to test the lens rather than the ND, I would have thought? Still, it's useful for testing before using that lens for your film. I know of lenses that change colour at different apertures, so this is a thing. If you're going to go to the trouble of testing one lens then test each of your lenses, preferably wide open and at something neutral like f5.6 or f8.0. Considering this is a test, you should also ensure you've got a steady, high-CRI light source; even natural light can be subject to subtle variation between shots, or even between doing the WB and hitting record. I think if you go back and forward between non-ND and ND and look at the vectorscope and waveform then that should tell you everything you need to know. In a sense, as long as the ND isn't too far off neutral, colour accuracy kind of doesn't matter as long as you're consistent throughout the film (by using the same ND). Colour grading often skews colours, for example cooling the shadows and warming the highlights, and it's more about how the final look matches the emotional tone of your film. You can go pretty far from neutral and still be fine - think about The Matrix or any Soderbergh film, or music videos where coloured lights heavily skew skin tones. Good luck!
-
lol. I'm not sure about that exact situation, but it sure will be interesting times. I saw an interesting interview where the guy said that the first stage of robots/AI taking over is automation and that's been happening throughout the industrial revolution and may be one of the critical drivers behind the rise of nationalism around the world as the people that used to have good factory jobs now don't have jobs, or they have shitty unskilled jobs cleaning the machine that paints the cars instead of painting the cars themselves. There are going to be lots of people saying "I'm glad...." or "I wish.....". We're going to need some form of universal income because otherwise there won't be enough work to go around. Independent artists making a living on YT and Patreon and Etsy are certainly coming up in leaps-and-bounds, but it's nowhere near enough to counter the disappearing middle-class we used to have. In Australia the economy has had decreasing total hours worked for many decades, but it's been partially obscured by taking children out of the labour force and also the 40-hour week, but that trend can't continue indefinitely without resulting in mass unemployment.
-
I agree. Funnily enough, I also have a Computer Science degree. Robots and AI are gradually making things faster / easier - e.g. NLEs now automatically scan for faces = work you don't have to do = work you don't have to charge for = less total industry value = fewer people can afford to survive in the industry. AI won't stop doing this; it'll be gradual but it will be relentless.

The trick is understanding what can be automated and what can't. IIRC it's about repeatability and predictability. Lawyers, for example, are quite automatable because most of what they do is know the law and basically regurgitate the relevant bits. Yes, there is judgement and lateral thinking involved, but lawyers are going to be hit very hard when you can buy a magic box that knows all the law and you can just talk to it and have it quote precedent. Film-makers will have the total value of the industry eroded by AI doing rough-cuts and things like that, but I think there will always be room at the top for the most creative. Not necessarily the 'best', but certainly the least predictable, or those with a style that is least popular, so the robots will learn it last. It will take a good long time before the robots are actually operating the camera and sound equipment though; robots aren't very coordinated or agile in the physical world, and teaching them the etiquette of how to be in the face of the couple while being the least disruptive will be quite a challenge.

The short-term strategy is to get good, so that you're still able to stay in the industry as people are gradually squeezed out of it. And pretty much "quality is the answer" is a good strategy regardless of whatever the hell happens, so it's not like you're going to spend all this time getting great at your craft only for it to turn out to be a bad strategy.