Everything posted by KnightsFan

  1. Focusing breaks down into two parts: first, deciding what to focus on, and second, actually focusing on that object. The first task should be done by a person in most cases. There is no reason to do the second manually. The problem with the MF vs. AF debate is that we don't split those parts into distinct tools. Correct me if I'm wrong, but there is currently nothing out there that uses the camera's AF system while allowing complete manual control over which part of the image the camera uses DPAF to focus on. You can tap to focus on a touchscreen in some cases, but that's unusable in most situations where you really need a focus puller, for example Steadicam shots. It's pretty easy to imagine a better system, where a focus puller makes all the artistic decisions but the actual precision work of moving the lens elements is automatic; something like the sketch below.
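    A minimal sketch of that split, with an entirely hypothetical camera API (to my knowledge, no shipping camera or SDK exposes anything like this): the operator supplies the creative input, the AF system does the servo work.

    ```python
    # Hypothetical sketch of "human decides, camera executes" focus pulling.
    # The camera API below is invented for illustration; no real SDK exposes
    # phase-detect defocus or direct lens-motor control like this.

    class FakeCamera:
        """Stand-in for a hypothetical camera backend."""
        def phase_detect_defocus(self, x, y):
            return 0.0          # pretend the chosen point is already in focus
        def drive_focus_motor(self, error):
            pass                # a real camera would servo the lens here

    class RemoteFocusPuller:
        def __init__(self, camera):
            self.camera = camera
            self.target = (0.5, 0.5)    # normalized (x, y) point in the frame

        def on_operator_input(self, x, y):
            # Artistic decision: the focus puller chooses WHAT to focus on,
            # e.g. from a wireless monitor showing the live feed.
            self.target = (x, y)

        def tick(self):
            # Precision work: the camera's AF measures defocus at the chosen
            # point and drives the lens motor. No hunting, no missed marks.
            error = self.camera.phase_detect_defocus(*self.target)
            self.camera.drive_focus_motor(error)

    puller = RemoteFocusPuller(FakeCamera())
    puller.on_operator_input(0.3, 0.6)   # operator taps the actor's face
    puller.tick()                        # camera does the mechanical part
    ```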
  2. ProRes RAW from the Z6 has to be one of four things: 1) it is a 4K crop (likely); 2) it is 4K full frame using line skipping or binning, probably producing bad artifacts; 3) it is 6K full frame; or 4) it is oversampled, in which case it really should not be called raw.
  3. I was talking about ProRes RAW over HDMI. You can't oversample raw that way.
  4. I meant making a LUT in DisplayCAL so that the calibration is used in Resolve. AFAIK, the only system-wide calibration that software like DisplayCAL can do is adjust the VCGT, which is gamma and doesn't affect color. So if you calibrate your monitor, you have to follow the steps on that wiki page I linked to earlier. However, I have since learned that on Mac, Resolve actually uses the system's ICC profile, and thus you wouldn't have to go through those extra steps. I'm not completely sure, and probably a lot of my confusion came from reading contradictory information without realizing that the OSes behave differently. It's quite a big difference with the LUT and without, very noticeable on my system. The "1D LUT" he mentions is just a gamma adjustment, while the 3D LUT provides the same color adjustments as the ICC profile does for programs that use system color management (see the toy example below).
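    A toy illustration of that distinction (this is not DisplayCAL's or Resolve's actual code): a 1D LUT maps each channel through its own curve, so it can fix gamma but never hue or saturation; a 3D LUT maps the full RGB triplet, so each output channel can depend on all three inputs.

    ```python
    import numpy as np

    def apply_1d_lut(rgb, curve):
        # Each channel is mapped independently through the same curve, so R
        # can never influence G or B (tone/gamma changes only, VCGT-style).
        idx = np.clip((rgb * (len(curve) - 1)).round().astype(int), 0, len(curve) - 1)
        return curve[idx]

    def apply_3d_lut(rgb, cube):
        # The output is looked up by the full (R, G, B) triplet, so hue and
        # saturation errors can be corrected too. Nearest-neighbor here for
        # brevity; real software interpolates (e.g. trilinearly).
        n = cube.shape[0]
        i, j, k = np.clip((rgb * (n - 1)).round().astype(int), 0, n - 1)
        return cube[i, j, k]

    # Demo: a gamma-only 1D curve vs. an identity 3D cube.
    curve = np.linspace(0, 1, 256) ** (1 / 2.2)    # simple gamma lift
    grid = np.linspace(0, 1, 17)
    cube = np.stack(np.meshgrid(grid, grid, grid, indexing="ij"), axis=-1)

    pixel = np.array([0.25, 0.50, 0.75])
    print(apply_1d_lut(pixel, curve))   # tones change, channels stay independent
    print(apply_3d_lut(pixel, cube))    # identity cube returns ~[0.25 0.5 0.75]
    ```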
  5. There are two sections in that wiki, one for using it with a Decklink monitor, and one for using it with a desktop monitor. It seems pretty clear in this conversation with DisplayCAL's creator, which I just found today and which seems to strongly confirm what I was doing, that you need both the ICC profile and the LUT: https://hub.displaycal.net/forums/topic/resolve-lut-for-gui-viewer-profile-loader-both-needed/
  6. @Geoff CB As far as I can tell, Resolve on Windows doesn't use the ICC profile that DisplayCAL (or other calibration software) generates, so you need to create a 3D LUT to have the calibration applied to Resolve's viewers. See: https://hub.displaycal.net/wiki/3d-lut-creation-workflow-for-resolve/ However, some of the information I found about it is very confusing and contradictory.
  7. @Volumetrik I use DisplayCAL with a Spyder as well. Do you happen to use Resolve? Do you know the process for creating a color correction LUT from DisplayCAL to use in Resolve? I am not particularly confident that I did it correctly.
  8. @DBounce You're right, they do seem a little more professional than some of the other similar concepts we've seen. A 3D printed body would have been bad advertising, even if it's a perfectly legitimate way to make prototypes.
  9. Distance doesn't matter. A camera needs to be able to operate as a unified, battery-powered unit, with no power or signal cables required, at least for my uses.
  10. @Mokara You would need something like a NUC if you want it to have manageable ergonomics, i.e. tripod or handheld use without tethering cables.
  11. @Mokara That's certainly something I would do if I had excess money. It would be a fun project. With a NUC you might even be able to power it off the new 24V B-mount battery standard that Arri is making.
  12. Seems like it. I'll watch this with interest, more as a concept than something I would buy, since it may be rather expensive. We've seen a couple of attempts like this in recent years: the Axiom project is the prominent one, featuring a modular concept and a global shutter; the Craft camera (which never amounted to more than a couple of 3D renders for marketing) also championed a modular design and a global shutter; and the Fran, which was a Windows computer with a global shutter sensor.

    The nice thing about a camera like this is that, with a powerful enough computer, there are really no limits to what can be done. I've mentioned it before, but with a setup like this you could, for example, use a Zoom F4 as a USB audio interface and record synced, multitrack audio directly into the video files (see the sketch below). With a sufficiently open API you would have a lot of control over how files are recorded. They seem to be using ffmpeg, and basically said that only the hardware speed limits the codecs and formats you can use. Depending on how open their software is, you could potentially do a lot of "assistant editor"-type file management jobs on set as you shoot as well. I really hope they expose the software as a proper OS that you can use as a normal desktop computer if you want to.

    My pipe dream here is an all-in-one pre-to-post computer. Write a script, storyboard, shoot, and edit all on the same device, with software that keeps it organized from start to finish. How awesome would it be to have a PIP or overlay between the live view from the camera and the storyboards? Hit record, and the camera knows which shot you're doing based on the storyboard, and records video and "external" audio directly into the same file. At the end of the day you plug it into an eGPU and you can edit with almost the full power of a decent consumer PC. I can think of so many ways to streamline my productions if there were an ergonomic way to attach a Windows or Linux PC to my camera on set.
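    Just to make the multitrack idea concrete, here's a rough sketch of what the recording process could look like with ffmpeg on a Linux back end. Everything here is an assumption for illustration: the device names, channel count, and codec choices are placeholders, not anything the developers have shown.

    ```python
    import subprocess

    # Hypothetical sketch: mux the camera's video and multitrack USB audio
    # from a Zoom F4 into a single file with ffmpeg. Device names
    # ("/dev/video0", "hw:CARD=F4"), channel count, and codecs are assumed.
    cmd = [
        "ffmpeg",
        "-f", "v4l2", "-i", "/dev/video0",     # sensor exposed as a V4L2 device
        "-f", "alsa", "-channels", "6",        # F4 as a class-compliant interface
        "-i", "hw:CARD=F4",
        "-map", "0:v", "-map", "1:a",          # video + all audio tracks, one file
        "-c:v", "libx264", "-crf", "18",       # whatever codec the hardware sustains
        "-c:a", "pcm_s24le",                   # uncompressed 24-bit multitrack audio
        "take_001.mov",
    ]
    subprocess.run(cmd, check=True)
    ```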
  13. To take a step back, what the MFT proponents here want is an APS-C sensor they can adapt or speed boost (quick math below). MFT and Sony E were the only two real mirrorless mounts until recently, and Sony E was out because of patents. I don't think many people would genuinely want MFT over, say, the RF, L, or Z mounts if the legalities worked out and the corresponding adapters and speed boosters existed, which they don't--yet. I would personally love to see more companies join the L mount alliance; a standard mount across different brands would benefit consumers. Another option would be for JVC to use an EF mount with an optional focal reducer fitted inside the mount, like what Luca does for Blackmagic cameras. There would be a couple of disappointed people who really wanted to use FD or other short-flange vintage glass, but overall that's a good solution, since EF is still widely popular. JVC already makes fixed-lens cameras, so they have the capability/partners to make their own focal reducer. And of course a FF sensor with EF is an option as well, if it doesn't balloon the cost. But hey, if Z Cam can make a $5k FF cinema camera, maybe JVC can? Just brainstorming viable solutions that don't include MFT. But none of them are likely to happen, so MFT really is the most likely mount for JVC to choose to allow speed boosting an APS-C sensor.
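    For anyone fuzzy on what speed boosting actually buys you, here's the back-of-envelope math, assuming a nominal 1.5x APS-C/Super 35 crop and the common 0.71x reducer ratio (both approximate):

    ```python
    # Rough arithmetic for a focal reducer on an APS-C sensor.
    aps_c_crop = 1.5            # approximate APS-C / Super 35 crop factor
    reducer_ratio = 0.71        # common focal reducer magnification

    effective_crop = aps_c_crop * reducer_ratio
    light_gain = 1 / reducer_ratio ** 2     # image circle squeezed onto sensor

    print(f"effective crop: {effective_crop:.2f}x")   # ~1.07x, close to full frame
    print(f"light gain: {light_gain:.2f}x")           # ~2x, about one extra stop
    ```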
  14. I agree with @webrunner5, it's too dark and doesn't look good. Kind of like that Game of Thrones episode we had a topic about recently. I'm also watching on a color calibrated monitor; I even turned the lights off to give them an extra benefit. Subjectively, I think it was poorly done, and the fact that they had an extensive, professional team and used such amazing gear makes me sad.

    The thing about darkness is you always need a point of reference. Like that image @Anaconda_ posted from 12 Years a Slave: the lantern blows out, showing just how dark it really is in the environment, and you get that wonderful, insane low-light glimmer on the eye from the lantern. You still see the characters standing out against the background, giving your eyes a focal point. What the darkness does do is give you outlines that your imagination works with. The almost pitch-black background closes in around the protagonist. The two faces show expressions, but your mind puts in the details.

    A lot of the frames from the Fuji video were just normal images, but dark. Instead of using darkness to occlude something that your mind fills in, it's just an ordinary shot, darker. Like the two-shot at 1:22: it's not just dark, it's flat. No highlights, no eyes, no backlighting; really just a completely ordinary two-shot where you can't quite see it comfortably. And again at 2:28, the guy with the bat: no contrast to pull the character out of the scene, or to give your eyes a place to go. There's no sense of the lighting drawing your eye to a person in a dark world; it's just an ordinary shot of a guy in a garage that's underexposed.
  15. That's true, but on the other hand, it would be trivial for them to filter hobbyists' responses out of the database as they peruse the results.
  16. @Kisaha I'd like to see a shorter body for sure. But 99% of my shots are on a tripod or glidecam, so the handheld-friendly grip and form factor of a C100/XC15 is not that important to me. If it has a handgrip, I'd like it to be removable. AFAIK the XC15 grip is fixed, and has the battery inside? I haven't actually used one. A big, heavy grip to one side makes it that much more difficult to balance on a glidecam. I did like the ability to strip down the C100 somewhat, though it was still a little top-heavy in my opinion; a lower center of gravity is easier to balance. And though I haven't used a C100 much, and perhaps there is a way to customize the buttons, one thing I hated was that I had to press the ISO button and THEN use the wheel to change the value. I like DSLR-style dedicated wheels for each function, so that you can change ISO or shutter speed in one press instead of two. I'm not big into anamorphic either, but it is nice to have different aspect ratios in general. It's always nice to frame and record in the native project format; I've been requested to frame for 4:3, and even 1:1 (for social media).
  17. Yeah, basically X-T3 image quality and recording formats inside a GY-LS300 body would be ideal. Adding ProRes in addition to HEVC would make it very flexible and scalable.
  18. I was split 50/50. I found it very easy to guess which was which, especially on the faces, but the Fuji was a little overexposed for my taste. Exactly.
  19. I grew up shooting with film cameras as a young child, before my family ever bought a digital camera. I used old Kodak disposable cameras a lot, and occasionally an old Voigtlander Bessamatic. I still use the Voigtlander 50mm for special shots these days. I've recently shot a tiny bit on 16mm with a Bolex. I guess for me I always try to be sparing with shooting, both photos and video, so film isn't much different. I'd rather rehearse 5 times and shoot one take. Shooting on film never feels any different to me, to be honest.
  20. So if it is FF, is it 6K, or does it line-skip? I assumed it must crop for 4K.
  21. Ah, ok. I was thinking of the S6 with the preorder discount for some reason. $2500 for the S6 vs. $2700 for the Z6 + Ninja, though it seems the Z6 is $200 off, so it is roughly the same, give or take. Certainly cheaper than either of Z Cam's full-frame cameras.
  22. Do we know yet whether it is FF or a 4K crop?
  23. Not if you include the cost of the recorder. But they are pretty close, depending on which media you use or whether you find used prices. I am heartbroken! But they said they are looking into an interchangeable mount, and that it's possible a third party could make an MFT option, so there is still hope! When I saw the price drop I was so excited, but if you can't speed boost to FF, it makes more financial sense to me to get the original E2 and speed boost that.
  24. @kye Very important! All I mean is that if you use software that is labelled as incomplete, you have no right to complain about bugs; either report them to the devs or wait for the actual release. (Of course, discussing bugs and crashes with the community is valuable as well. I just get annoyed with people smugly saying the Resolve beta isn't ready for production use... of course it isn't.)
  25. Have you reported your problems to Blackmagic? It defeats the purpose of a beta if you don't report the issues. It's not finished software, and you shouldn't use a beta for anything other than looking for issues, and certainly not for anything important.