Everything posted by KnightsFan

  1. I agree! If you want truly open, the Octopus cinema camera is promising. The Axiom project was promising, but doesn't seem to be going anywhere fast--though I wouldn't be surprised if the Octopus camera is benefiting from the R&D that Axiom did and made open source. Z Cam cameras are really quite open. There aren't hacks per se, but they don't need any imo. In addition to just having a lot of options out of the box, they have an open API for controlling the camera and really are going out of their way to make their products compatible with third party accessories. Both of these are strictly cinema cameras. It would be really nice if some manufacturers of hybrids took the same approach. The closest we ever got to an EOS app store was the Samsung Galaxy NX, which is close to just a simple Android device shaped like a DSLR. I haven't used one, but it honestly looks like what I'd expect the future of hybrids to be: massive back screen, good ergonomics, and designed for connectivity.
  2. I've been planning to create a chest mount for a small monitor (actually, phone) that rests just under eye level. I think having it permanently on my eye would be eye-straining and make it hard to walk around which is why I'm planning a chest mount. The nice thing about the E2 is that you can use a cellphone for wireless monitoring out of the box, so there's no extra gadgets required. Wireless latency is noticeable, but no worse than the HDMI latency on half the cameras I've used. So it might even find use when I use a glidecam.
  3. Yeah, definitely not the camera for great in-camera audio control. I always record externally so it's not an issue for me. You can get an adapter cable for direct(ish) XLR, but you'd have to figure out how to cable manage that--I don't know if I'd want heavy XLR connectors hanging off that little Lemo port. Or you could get a small mixer of some sort, which is really more a band-aid than anything else. I'd love to see some actual numbers on the rolling shutter on the S6 and F6. If I get time I'll do some measurements on the E2 and see if they match C5D's numbers.
  4. I have to agree with @Kisaha, that looks terrible. It doesn't even look like high dynamic range at all--it's an overcast day, a decent Rec 709 camera would capture that range.
  5. You can, but what I'm saying is every sensor currently being used in any camera we discuss here uses linear ADCs that double bits for every doubling of light. Theoretical ADCs aside, for every real camera on the market, including the S1H, dynamic range measured at full resolution will never be substantially greater than the bit depth of the ADC. Well it's exactly because of what @Lux Shots is saying: bit depth has more to do with gradations than dynamic range (unless you're talking about 1:1 linear ADCs). 10 bits is plenty to store all the usable data from any sensor, when encoded logarithmically. There's really no point to storing a 12-bit image if you sensibly map values into 10-bit.
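The log-encoding point above can be sketched numerically. This is a toy pure-log curve, not any real camera's transfer function; the 12-stop span and the mapping are assumptions for illustration. The idea is that a log encode spends roughly the same number of 10-bit code values on every stop of exposure:

```python
import math

# Hypothetical numbers: a 12-bit linear ADC (codes 0..4095) mapped into
# 10-bit log code values (0..1023). Toy pure-log curve, for illustration only.
ADC_MAX = 4095
OUT_MAX = 1023
STOPS = 12  # assume the linear signal spans ~12 stops above code value 1

def lin_to_log(code):
    """Map a linear ADC code to a 10-bit log value: roughly equal
    code-value spacing per stop of exposure."""
    if code < 1:
        return 0
    stops_up = math.log2(code)  # stops above code value 1
    return round(stops_up / STOPS * OUT_MAX)

# One stop of exposure difference lands ~85 code values apart,
# whether it's the top stop or a shadow stop:
print(lin_to_log(4095) - lin_to_log(2048))  # top stop
print(lin_to_log(64) - lin_to_log(32))      # a shadow stop
```

So the highlights, where a linear 12-bit encode would burn 2048 codes on the top stop alone, get compressed, and the shadows get far more codes than a linear 10-bit encode would give them.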
  6. Maybe? I think members here just aren't that interested in the E2. The Facebook group, though, is extraordinarily awesome, especially a few years ago when it was very small. It's recently grown to the point of having a lot of repeat questions.
  7. You're right about what bits are and what dynamic range is, but virtually every modern sensor has linear ADCs, so bits equate to dynamic range stops because of the linear relationship. I don't know of any sensors that don't behave this way and suspect that none exist. (That's strictly dealing with the sensor ADC, not any encoding afterwards, which is hardly ever strictly linear.)
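A minimal sketch of the linear-ADC argument, with hypothetical numbers: each stop less light halves the code value, so an N-bit linear ADC runs out of distinguishable codes after roughly N halvings.

```python
# With a linear ADC, one stop less light halves the code value, so a
# 12-bit ADC (codes 0..4095) can only represent about 12 halvings
# before hitting code value 1. Numbers here are illustrative.
bits = 12
full_scale = 2 ** bits - 1  # 4095

code = full_scale
stops = 0
while code >= 2:
    code //= 2  # one stop less light
    stops += 1
print(stops)  # → 11 halvings, i.e. roughly the bit depth
```

That's why full-resolution dynamic range measurements track the ADC bit depth so closely, regardless of how the data is encoded afterwards.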
  8. USB 3.2 ?

     It's a well known fact that DaVinci Resolve uses a maximum of 639kB! The USB standards are getting very confusing. I have faith that eventually we'll get to a one-connector-for-everything utopia, but in the last few years USB has gotten so much more convoluted. I wish they would at least require clear markings about which cables and ports supply what. The USB C cable that comes with the Oculus Quest only supports USB 2.0, so you need to buy an identical-looking third party cable to use Oculus Link at USB 3.0 speeds--which is now called USB 3.2 Gen 1 apparently. Just a mess.
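For reference, the renames being complained about here, laid out as a small table (the dict is just for illustration; speeds are the USB-IF signaling rates):

```python
# The same signaling speed has carried multiple names over the years:
# USB 3.0 = USB 3.1 Gen 1 = USB 3.2 Gen 1, and so on.
usb_speed_gbps = {
    "USB 2.0": 0.48,
    "USB 3.0 / 3.1 Gen 1 / 3.2 Gen 1": 5,
    "USB 3.1 Gen 2 / 3.2 Gen 2": 10,
    "USB 3.2 Gen 2x2": 20,
}

for name, speed in usb_speed_gbps.items():
    print(f"{name}: {speed} Gbps")
```

So a "USB 3.2 Gen 1" cable is the same 5 Gbps link that was sold as USB 3.0 a decade earlier, which is exactly why identical-looking cables behave differently.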
  9. The new firmware (v0.95) enables ProRes RAW over HDMI to Atomos recorders on all E2 models. This includes the $800 E2c. This means that the E2 series now has not one, but two different Raw formats. I'm sort of surprised the Z Cam is still barely talked about here. It checks a lot of boxes that people complain about with the P4K. The entire series now has H.264, H.265, ProRes, ZRAW, and ProRes RAW recording capabilities, plus incredible frame rate options at full sensor width on the E2, multiple crop options and aspect ratios, low rolling shutter, long battery life, and incredible build quality. I recently got one, though probably won't have any projects to use it on for a few months.
  10. Great post! I think design, UI, and ergonomics are an under-reported element of our work. As every NX1 owner will say, it is such an enjoyable camera to use. Not just the chunky grip and sensible button placement, but the straightforward, simple menus deserve a shoutout. As a general use photo camera nothing comes close. Though I haven't used them, I think the Z6/Z7 are noteworthy for actually including a locator pin hole on the bottom--so simple, but so helpful. On the other side, the Z Cam E2 has some unique design choices that stand out. Lack of a grip makes gimbal balancing easier, and the presence of two 1/4-20 threads makes it super stable. Unlike larger pro cameras, it's tiny and much more easily managed for a single person crew. Then there's the phenomenal Z Cam app: wired/wireless monitoring and control from an iPhone. I noticed that! Definitely a nice touch.
  11. The Fuji XT3 is marginally smaller and lighter, but I do believe that you will find smaller lenses for MFT than X mount in general--maybe someone can correct me on that though. A GoPro is nice also because they are designed to be mounted on stuff like bikes. If you use a DSLM, make sure it can't rotate at all on the single 1/4-20 thread--vibrations from a bike could easily knock it loose.
  12. Z cam E2 + all the E2 flagships
  13. On some shoots, that is almost precisely what happens. I've worked on projects where they shoot a scene on expired 16mm because of its feel rather than accuracy, or where we chose a profile that gave us the look we wanted while being completely inaccurate. And yes, I've also been on shoots where they strive for 100% accuracy. Many times those projects sucked, because the people in charge were more concerned with technicalities and test charts than they were with evaluating what worked for their vision. I don't mean to say that those kinds of projects will always be bad. I'm sure you have your workflow and that it works for you--as do a lot of people. That's just been my experience on micro budget projects. There is a huge difference between understanding the color that you are capturing, and always striving for real-world accuracy in 100% of situations. Just ask any animator or CG artist, who uses the same colors and tools but has no real-world reference anyway.
  14. Well this is not true. Many movies with great cinematography, including virtually everything shot on film, have their style chosen from the moment the film starts rolling, and that often determines their choice of film stock or LUT. If you are shooting anything other than generic footage, you must know your end result and how the scene will end up in order to light it. Looks aren't chosen in post. I'm sure on some productions this is the approach, but it's certainly not universal, not in cinema. What kind of productions are you saying use the approach of always starting with "accurate color" and grading from there?
  15. This looks very cool! I've been eyeing a Deity Connect system for my next project, which still looks like a better value for my uses than this new system. But this has really got some great features.
  16. Image quality-wise, I bet it will be good enough for Netflix, but I suspect that approval will not happen. If there is a recording time limit that would hurt its chances. Also it might not write all the metadata that Netflix requires--I haven't heard anything about timecode support for example. That might be something Canon wants to reserve for the C line.
  17. The 14 stops that C5D measured on the Alexa was at 2k resolution, and presumably the C300 III number of 12.8 would be closer to ~13.5 if downscaled to 2k. So it seems very close in terms of SNR.
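The downscaling estimate above can be made concrete with an idealized model: if noise is uncorrelated between pixels, averaging k pixels together improves SNR by sqrt(k), which is 0.5 * log2(k) stops of measurable dynamic range. This is a best-case sketch, not a claim about any specific camera's scaler:

```python
import math

def ideal_dr_gain_stops(src_pixels, dst_pixels):
    """Ideal DR gain from downscaling, assuming uncorrelated noise
    that averages down by the square root of the pixel ratio."""
    k = src_pixels / dst_pixels
    return 0.5 * math.log2(k)

# DCI 4K → 2K combines ~4 pixels into 1:
gain = ideal_dr_gain_stops(4096 * 2160, 2048 * 1080)
print(round(gain, 2))  # → 1.0 stop at best
```

Real downscales gain less than the ideal (noise isn't perfectly uncorrelated, and scalers aren't plain averages), which is consistent with 12.8 stops at 4K landing around ~13.5 rather than 13.8 at 2K.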
  18. When I started making movies with my friends in middle school and all we had was a camcorder with no external mic at all, we would re-record our lines with the actual camera, just holding it close to our faces usually with the lens cap on. We'd have hours of "footage" just for ADR, lol.
  19. What about with HEVC? I'm curious to know whether HEVC is decoding on your GPU or CPU. Edit: Though if HEVC runs flawlessly whether or not GPU decoding is checked, then it's likely just decoding on your CPU.
  20. @Trankilstef are you on mac or windows? what is your gpu and cpu usage like when playing back? Also is your timeline 4k or 2k?
  21. @gt3rs I opened the HEVC test file in Resolve 16.2 studio on a laptop with an i7 9750H and a 2070. With Intel QuickSync turned on, the video file showed up as offline media and could not be played. With QuickSync turned off, I could play the file. I placed it on a 25 fps DCI 4K timeline and played it back. It stuttered at 10 fps, using 100% of my CPU and 1% of my GPU, likely using software decoding. How does that compare with your results, @Trankilstef? I heard you got smooth playback somehow. Do you know if it was using hardware decoding, and if so, on your cpu or gpu?
  22. My experience with Amazon Prime Video Direct has not been great. It honestly feels more like a beta at this point, though to be fair, it has improved its interface in the past few months. It says they review titles for publication in 2-4 days, but it's more like 4+ weeks, and their system for telling you about issues is frustrating and poorly implemented. In the end you will end up with something like 5 cents per hour watched if you release for Amazon Prime users to watch for free. So you can do the math to figure out how much content and how many views you need to make a profit.
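Doing that math as a back-of-envelope sketch (the $2,000 budget is hypothetical, and the per-hour rate quoted above varies by territory and program):

```python
# Hypothetical break-even: at roughly $0.05 per hour streamed, how many
# hours must be watched to recoup a micro-budget production?
rate_per_hour = 0.05   # USD per hour watched, assumed from the rate above
budget = 2000          # USD, hypothetical budget

hours_needed = budget / rate_per_hour
print(hours_needed)  # → 40000.0 hours watched just to break even
```

For a 90-minute feature that's tens of thousands of complete views, which is why the free-to-Prime tier rarely pencils out on its own.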
  23. I have resolve studio, and access to a 1080 and a 2070. If you send me a file and instructions on what to look for I can run tests.
  24. You're the one who has a political slogan as their profile pic (sorry, couldn't resist, no hard feelings!) If we want to talk cameras, there are a bunch of projects recently posted in the screening room that have fewer than 5 replies, we could all go give our feedback.
  25. Nice! I think the first shot was great. The mysterious references to past events made me want to find out what the mystery was. I think that after that first shot, I would have liked to get a little more story and for it to reveal a little more about who died, what they were doing that made them "stupid," etc. I agree with the comments above, it was a little too obvious the character was wearing a mic, especially when the rope was rubbing against the jacket. And then one yell was badly distorted. One option for better sound is to use a boom mic and strategically mix that with the lav, for example using the boom exclusively for those effects. If you don't have a boom mic but do have the time, then you can even just go back and re-record some of those sounds with the mic placed somewhere to better capture those effects. But great job for a first narrative short with a limited production time/resources! I'd love to see whatever you make next.