Everything posted by KnightsFan

  1. That depends on your software and hardware. Resolve's performance with XT3 footage is significantly better than Premiere's performance with GH5 footage on my computer. That is my opinion as well, though to be fair, my comparison comes from personally using an XT3 vs. editing projects that other people shot on a GH5.
  2. I am a huge fan of the XT3. You mentioned the lack of IBIS is a downside. Could you afford a used gimbal on top of the XT3? Or would warp stabilizer work well enough for your purposes? It would be sketchy for walking shots (though to be fair, so would IBIS), but warp stabilizer can do wonders for a relatively steady handheld shot.
  3. That's fascinating, I'll have to try it out to see whether I can replicate his results. I'm not sure yet how mismatched chroma resolution scaling could produce the luma artifacts that he points to as evidence: "If you take a look around high contrast edges in a 4:2:0 encoded image you will see noticeable chroma artefacts, often appearing as a lighter or darker halo around the edge of objects." (emphasis added). Anyway, a potentially easier option seems to be to edit in 4k (with proxies if needed), then do the YUV downscale after exporting; a sketch of the resolution math is below.
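    Here's a rough NumPy sketch of the resolution math behind that downscale. Random arrays stand in for decoded YUV planes, and a crude box filter stands in for a real scaling kernel, so this only illustrates why the plane sizes line up, not how a good scaler works:

        import numpy as np

        def box_downscale_2x(plane):
            # Average each 2x2 block -- a crude stand-in for a real scaling kernel.
            h, w = plane.shape
            return plane.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

        # In 4k 4:2:0, luma is full resolution but chroma is half resolution
        # in each dimension, i.e. the chroma planes are already 1920x1080.
        luma_4k = np.random.rand(2160, 3840)
        cb = np.random.rand(1080, 1920)
        cr = np.random.rand(1080, 1920)

        # Downscale only the luma. The chroma planes already match 1080p, so
        # the result has full chroma resolution at 1080p -- effectively 4:4:4.
        luma_1080 = box_downscale_2x(luma_4k)
        assert luma_1080.shape == cb.shape == cr.shape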
  4. Unfortunately, I don't believe Resolve has V-Log L, only V-Log. I'm not sure how different the two are, to be honest; I've never worked with V-Log L myself. Generally, a color space transform (CST) is the best and most accurate way to translate between color spaces. However, if you want to keep your old workflow, you can always use LUTs in Resolve the same way as in Premiere.

    Whether you use a LUT or a CST, you can apply it in different places. In the color tab, you can right-click a clip and set it there, or you can set it inside a node. I prefer to do it in a node, because then it is much clearer what the order of operations is. I can never keep the order of operations straight, but with nodes there's a handy graph with arrows that makes it all simple!

    Resolve doesn't have adjustment layers, but you can add clips to groups. Right-click a clip (or clips) in the color tab and Add into a New Group. Now you have separate node graphs for group pre-clip and group post-clip, which, as the names imply, are applied before or after the clip's own grade. So you could add all your V-Log shots into a group, put the CST/LUT (or whatever you want) in the group pre-clip section, and all those clips will start from that baseline. (There is also a timeline node graph, which applies to everything in the timeline after all the other graphs.)

    I've read that it's best to do white balance adjustments in linear space, so my node graph usually goes:
    1. CST plugin to linear gamma (V-Log to Linear, in this case)
    2. White balance
    3. CST plugin from linear to Rec.709 color and gamma
    4. Whatever else
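    A toy example of why the white balance sits between the two CSTs. A plain power gamma stands in for the real transfer functions (V-Log is a log curve and Rec.709 is piecewise), and the numbers are made up, so this only shows the order of operations, not the actual math Resolve does:

        import numpy as np

        GAMMA = 2.4  # stand-in; real V-Log / Rec.709 curves are not pure powers

        def to_linear(encoded):
            return encoded ** GAMMA                        # step 1: CST to linear

        def to_display(linear):
            return np.clip(linear, 0, 1) ** (1 / GAMMA)    # step 3: CST back out

        pixel = np.array([0.80, 0.70, 0.50])  # encoded RGB, slightly warm
        gains = np.array([0.90, 1.00, 1.30])  # step 2: per-channel white balance

        balanced = to_display(to_linear(pixel) * gains)  # gains applied in linear
        naive = np.clip(pixel * gains, 0, 1)             # gains applied to the encoded signal

        print(balanced)  # roughly [0.766, 0.700, 0.558]
        print(naive)     # [0.72, 0.70, 0.65] -- a visibly different result

    In linear light, white balance is just a per-channel multiply, the same thing the sensor would have done with different illumination, which is the usual argument for doing it there.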
  5. @mirekti Apparently production was on break because of Chinese New Year, and they sold out of existing stock. The Z Cam people say it will be back in stock soon. The E2 has been in and out of stock a number of times at B&H.
  6. Wait, does Resolve support F-Log as an input space now?
  7. For adapting! And JVC had that neat feature where you put an M43 lens on and it just uses an M43 crop of the sensor, or you could put an S35 lens on and use the whole thing. Very flexible. To be honest, though, I'd go for any mount that you can get an EF lens onto. The only real downside to higher MP is more rolling shutter, if the processor isn't upgraded as well. High MP with downscaling is great for low light. I'm sure all near-future Z Cam products will have 4k60 ProRes in addition to H.265. I don't think they'll have 4k120 though; they seemed to imply that was a unique feature of the E2. It will be interesting to see if Canon makes a decent cinema camera. Their latest photo cameras have been disappointing for video, to say the least. It would be quite forgivable in my eyes if they came out with a good low budget RF video camera as a complement to the R and RP.
  8. Hopefully, yeah. I don't see why low light would necessarily be worse, if they do a full readout and have a fair comparison (4k vs 4k). I find the XT3 has good enough low light for me, so as long as it isn't much worse than that I'd be happy. I wonder what mount they would go with for S35. I'd love to see MFT a la that JVC camera, or if they surprised everyone and threw in with the L mount lot. 3k would be nice, but my gut says more like 4k. Fingers crossed though!
  9. My prediction is 8k S35 in a slightly larger body than the E2.
  10. While we are talking about audio, it would be cool to have a video recorder that can function as a USB audio interface, thereby recording sync sound from an external mixer without introducing any loss or level issues.
  11. Yeah, exactly. It will take a few years, but we'll get there, and Atomos doesn't want to be caught without a business model when that time comes. Just a few years ago, we used Atomos devices just to bypass 24 Mbps IPB compression. Cameras didn't output 10 bit even through HDMI. I think this time next year, and certainly by the year after that, we'll have a number of decent options that shoot ProRes and/or RAW, like we already have with the P4K, or the Z Cam E2. Just recently we got our first 10 bit full frame hybrid. Now, a couple of cameras have even abandoned SD in favor of faster and more reliable storage options, eliminating yet another barrier to ProRes and Raw. All these point to the external recorder business slowing--not dying completely, but slowing considerably over the next couple of years.
  12. I wonder if Atomos is seeing the end of mainstream external recorders in the coming years. For a while, Atomos had a niche market as the best option for people on a budget to record higher quality footage. Back around 2014, I had discussions about external recorders, and my position was that they were a bad long term investment, since I thought it was only a matter of time before internal recording was higher quality. Now we've got several photo/video hybrids and budget video cameras that shoot 10 bit, and even a couple that do 422, essentially removing the benefit of an external recorder as far as quality goes. I know there are still benefits to external recorders and will be for the near future, but as internal codecs and formats continue to improve, I predict that fewer people will buy external recorders. I think it's smart for Atomos to both pursue higher quality with ProRes Raw, and to also expand into dedicated monitors for the market that is satisfied with improved internal recording. As for this product, it has some cool features. I'm sure it's a fantastic screen, and the Analysis tool looks amazing. I would prefer more physical buttons and less reliance on the touchscreen, though.
  13. I've done a number of codec tests, some of which I posted on EOSHD. With H.264, you can easily find a situation where an IPB encode performs as well as, or better than, All-I at 5x the bitrate. Static shots, for example. As you move the camera more and more, the IPB advantage goes away. (A rough way to run that comparison yourself is sketched below.)

    I do not buy this. In my tests, I have found that increasing bit depth can increase quality without increasing the bitrate. The more information the encoder has to work with, the more efficiently it can decide what to keep, what to throw out, and what to fudge. The result: you actually need less data to get a better image when using 10 bit. This is true whether the source is 8 or 10 bit.

    But to answer your real question: I think that, controlling for all other encoder settings, your knee-jerk reaction is correct. In your examples, though, the All-I vs. IPB difference isn't being controlled, and it will be the biggest factor depending on the amount of motion in the scene.

    Yeah, I get the impression that the publicity for the 10 bit 422 paid update made some people miss the fact that, even without the update, the S1 shoots 10 bit 420 internally out of the box.
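    For anyone who wants to try this kind of test, here's roughly how I'd set it up by driving ffmpeg from Python. The file names are hypothetical, libx264 with a forced GOP size stands in for whatever encoder your camera uses, and it assumes ffmpeg is installed:

        import subprocess

        SRC = "source_clip.mov"  # hypothetical source; try a static and a moving shot

        def encode(out, gop, bitrate):
            # gop=1 makes every frame an I-frame (All-I); a long GOP allows IPB.
            subprocess.run(
                ["ffmpeg", "-y", "-i", SRC, "-c:v", "libx264",
                 "-g", str(gop), "-b:v", bitrate, out],
                check=True,
            )

        def psnr_report(enc):
            # ffmpeg's psnr filter prints average PSNR vs. the reference on stderr.
            result = subprocess.run(
                ["ffmpeg", "-i", enc, "-i", SRC, "-lavfi", "psnr", "-f", "null", "-"],
                capture_output=True, text=True,
            )
            return [line for line in result.stderr.splitlines() if "PSNR" in line]

        encode("all_i.mp4", gop=1, bitrate="50M")
        encode("ipb.mp4", gop=250, bitrate="10M")  # one fifth the bitrate
        print(psnr_report("all_i.mp4"))
        print(psnr_report("ipb.mp4"))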
  14. Bane voice

    I heard way back when the movie came out that Bane's voice was 100% dubbed. However, I would definitely record audio on set, even if you plan to dub it all. First of all, you can give your actor nice recordings to listen to when dubbing. Second, you will probably want to process the dubbed audio to actually sound like it came from the real space, and the original recordings will be an excellent reference tool. Third, in the event that your actor is not available to dub and you can't find someone else suitable, you have a backup plan. I might not go all out and be super picky about the sound. Don't redo a perfect take because a car two miles away backfired during a line, but do put some minimal effort into getting usable audio.
  15. A lot of skills will overlap between software, so even if you switch later it won't all be wasted time. I should mention that I was referring to standalone Fusion. I would not recommend using the Fusion tab in Resolve for serious composites: I have had a lot of crashes and terrible performance compared to standalone Fusion 9. I am sure it will improve, but at the moment I see it as more of a beta feature than a solid tool.
  16. I find Resolve and Fusion to be better than Premiere and After Effects. While Premiere does have some nicer editing features, I get spotty reliability and significantly worse performance from it than from Resolve. Node-based compositing is easier for complex comps, and overall I find Fusion's design to be more consistent than After Effects' (Fusion has a timeline as well). Coming from traditional video editing software, Fusion is less intuitive, but it is very nice to use. However, Blackmagic software is GPU-heavy, and your MX150 might not be sufficient for Resolve. Fortunately, Resolve and Fusion are both free, so you can try them risk free. And if you do settle on Blackmagic software, you can upgrade your GPU for the cost of a year's Adobe subscription.
  17. I have this condition where I can't leave well enough alone... That's a really good idea for down the line when I've got a mostly-finished product! It's still early days. I looked into a lot of those systems when you mentioned it before. I checked out Movie Slate Pro and a handful of other apps. My app is primarily for personal use/just a fun project that will do a LOT of work for me in post. My idea is quite different, with a little bit of overlap of course. Thanks for all the advice!
  18. Bluetooth transmitter and receiver. They are dirt cheap, and I've already got a couple of receivers lying around. Range isn't a huge issue; the only reason I'm doing it wirelessly is so it doesn't throw a gimbal off balance.

    The app stores metadata, and I use TC to match that metadata with video and audio clips. As long as the latency is within a reasonable time (1 or 2 seconds) and the clock doesn't drift by a drastic margin (30% faster or slower), it works for my purpose; see the matching sketch below. In my tests so far, the system works, except it has always been wired since I don't have a transmitter. Scene, angle, and take are stored in user bits for now. It's an early prototype (I am the app developer). Eventually there will be more, so it can be linked to storyboards and camera info for a more complete system. There is a companion desktop application that sorts everything in post.

    I suppose there is technically no reason I am using LTC per se, except that it's an existing standard that I can piggyback off of. It lets me test a lot of my custom code against existing LTC for debugging, without wondering which end is failing. Which brings me back to the wireless problem: I hope to borrow real transmitters on my next shoot when I take this system for a test drive, but if that falls through, Bluetooth could be a $20 solution.

    Edit: And of course Bluetooth is built into phones, so eventually I will make a single-device solution. The only problem is that my (cheap) phone can only connect to one device at a time.
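    The matching step itself is simple. Here's a minimal sketch of the idea, assuming clip start timecodes have already been decoded to seconds; the names, values, and the 2-second tolerance are placeholders, and clock drift isn't modeled:

        from dataclasses import dataclass

        @dataclass
        class TakeRecord:
            tc_seconds: float  # decoded LTC time when the take was logged
            scene: str
            angle: str
            take: int

        def match_clips(clips, records, tolerance=2.0):
            # Pair each clip with the nearest record, accepting it only if
            # the offset fits the 1-2 second latency budget mentioned above.
            pairs = {}
            for name, start in clips.items():
                best = min(records, key=lambda r: abs(r.tc_seconds - start))
                if abs(best.tc_seconds - start) <= tolerance:
                    pairs[name] = best
            return pairs

        clips = {"A001.mov": 3601.5, "A002.mov": 3725.0}  # hypothetical start TCs
        records = [TakeRecord(3600.8, "12", "A", 3), TakeRecord(3724.1, "12", "B", 1)]
        print(match_clips(clips, records))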
  19. Overall, it looks excellent. I'm impressed that someone had this idea and made it into a competitive stabilizer at a competitive price. But I do have a few concerns:
    1. There seems to be lag between panning with your arms and the camera panning.
    2. The footage looks like it has less of a level horizon than I get with my Glidecam (and I'm not an expert with a Glidecam). Maybe the problem is that you can only hold it in front, meaning you have to sidestep for X-axis movement (e.g. the dog shot at 3:41).
    3. 3D printed parts... that will have to change if they make a bigger version for heavier cameras.
  20. Why not? Latency, unreliability, or what? So that was actually another question: are the BNC connectors LTC compatible? As in, will the F4 sync to LTC generated from a smartphone app and then properly send it out over BNC? I read that it would not work. The TC is being generated in a smartphone app along with other metadata; in fact, I am more interested in the metadata than in having perfect sync, so I can't generate TC in the F4 itself. Alternatively, it could be sent from the phone to both the F4 and the camera, but that would still require a wireless link in roughly the same place.
  21. Someone should tell Picasso that his art does not accurately reflect real people. Art doesn't exist in a vacuum; it is always created and viewed in reference to previous work. Over time, film has deviated from being completely in reference to the real world, and has developed its own conventions, symbols, and nuances. New films are viewed in reference to other films, not the real world. In the same way, narrative storytelling, sound design, musical scores, and all other aspects of filmmaking are built on previous films, not just real life. You seem to be implying that color should be accurate to real life unless the artist specifically wants it to be different, and that using color in line with expected film convention is the work of a "hack." I think this is false. Would you consider Max Richter's album Vivaldi Recomposed to be creative?
  22. @IronFilm Great info about the car rig. Speaking of Bluetooth... have you ever used Bluetooth to transmit LTC to the camera? I'm trying to send LTC from the Sub Out of an F4 to the camera.
  23. Better how? You mentioned it's easier to balance; is it easier to keep balanced while in use, and on which axes? How easy is it to pan, tilt, or do canted angles? It looks like a very interesting concept with a lot of potential. Yeah, it's way too little payload for me, too.