Everything posted by KnightsFan

  1. You could be right. I'm still hoping for 10-bit HEVC--even if that spec sheet had a typo and it's only 4:2:0 internally.
  2. @Oliver Daniel Yeah, I saw that at PDN earlier this morning. That S1 price is competitive, considering it is the first full-frame camera with 4K 60p and has that sweet pixel-shift mode. As has been mentioned, lenses seem scarce at the moment, but I'm sure that will change. It will be interesting to see what format the 10-bit 4:2:2 comes in, and the price of the upgrade. Surely they won't have internal ProRes? That would be insane. My guess is still HEVC.
  3. No reason why Nikon shouldn't be first, but when Atomos first announced Raw over HDMI, I predicted that Panasonic would be the first to take advantage of it. My second guess would have been that Blackmagic would make a Raw video camera that takes good enough photos to technically be called a hybrid.
  4. I agree with pretty much everything you said, but I especially agree with this. I have no doubt Fairlight and Fusion will continue to improve. A fully integrated, single post-production application would be absolutely heavenly for quick-turnaround and light work. It will be interesting to see if they can make it flexible and stable enough over the next few versions. That sound software you screenshotted looks like the kind of intuitive UI that I think Fairlight lacks. One glance and I understand how it works. But yeah, it would get cluttered very quickly with large projects.
  5. Yeah, but the one built into Resolve lacks a lot of features and it's really slow. It hangs and freezes doing simple comps that run fine in the standalone version--at least that's my experience so far. There are more features in standalone Fusion than in the Resolve built-in version, and even more in the paid version, some of which are looking really tempting--specifically the tracking features. Also, I did some 4096x4096 comps in Fusion recently, and since the free version maxes out at UHD I actually had to export as four 2K images and then stitch them in ffmpeg (sketch below). It's an easy workaround, but it's kind of annoying. Huh, interesting discussion. So far I've found fewer bugs in Resolve than in Premiere, but I do avoid the "bleeding edge" features like Fairlight and Fusion. Back when Fairlight was first integrated, I had one issue crop up in the rendered audio file, but that was a few versions ago. Hopefully that's fixed. Still, not being able to reliably hear what you're working on makes it annoying. True. But it's all so big and blocky. If I pop open my effects tab it takes up a quarter of my screen, so I have to close it back down just to see what I'm doing. I'm constantly finding myself adjusting the windows. I believe you still can't freely undock or rearrange panels--only the color scopes--which makes it really hard to take advantage of dual screens. Re: the rest of your post... I agree, and one reason I like Reaper so much is that it's scalable. If I want to record a sound, I open Reaper and hit record. If I want to EQ a single clip, it's a couple of clicks. If I need to mix a surreal narrative project in both 2.0 and 5.1, with crazy effects, dialog and ambience piped through a vocoder, and all kinds of crazy stuff, it scales really well. In Fairlight you've got to dive through menus to set up sends; the UI honestly feels very uncreative and inefficient. It just doesn't want to do things that it wasn't consciously designed to do by the developers.
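     A rough sketch of what I mean by stitching in ffmpeg--this assumes the four quadrants were exported as 2048x2048 PNGs, and the filenames are just placeholders:

     ```python
     # Sketch: stitch four 2048x2048 quadrant renders into one 4096x4096 frame.
     # Filenames are placeholders; assumes ffmpeg is on the PATH.
     import subprocess

     tiles = ["q_tl.png", "q_tr.png", "q_bl.png", "q_br.png"]  # top-left, top-right, bottom-left, bottom-right

     cmd = ["ffmpeg"]
     for t in tiles:
         cmd += ["-i", t]
     cmd += [
         "-filter_complex",
         "[0:v][1:v]hstack=inputs=2[top];"
         "[2:v][3:v]hstack=inputs=2[bottom];"
         "[top][bottom]vstack=inputs=2[out]",
         "-map", "[out]",
         "-frames:v", "1",
         "stitched_4096.png",
     ]
     subprocess.run(cmd, check=True)
     ```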
  6. It says the thread doesn't exist... Yeah, I use Resolve as my primary editor and of course for color correction. I got the full version of that, too, mainly for noise reduction. Certainly worth the price just for editing and color. At this point I'm strongly considering getting a Fusion license when we hit post on my next project. Blackmagic really makes phenomenal software. However, I tried using Fairlight a few times and really disliked it. I regularly have issues with the audio cutting out while editing, or popping, or suddenly getting really quiet for a few seconds, so at this point I don't trust Fairlight for real use. The only time I've ever used it outside of testing was to add some compression to an audio track for a rough-cut render. Just that once. I suppose, like many DAWs, Fairlight probably gets a lot of mileage out of plugins like the RX pack. I think Reaper has a better approach in that regard, though: there is no "built-in" EQ or dynamics, it's all plugins. Not only does it come with a phenomenal library, it's just as easy to use a third-party EQ as it is to use Reaper's EQ. Fairlight seems to have built-in stuff just sitting there taking up screen space even if you don't need it. Instead of memorizing lots of little functions, once you understand the broad design philosophy in Reaper you can figure out the rest intuitively, which was the opposite of my experience with Fairlight.
  7. To be fair, though, you are supposed to pay for Reaper after the trial period. It doesn't lock you out, but I discourage taking advantage of that. I am happy to pay $60 if only to show my support for non-intrusive software--and it also happens to be a killer program that I use daily. @kye tongue-in-cheek aside, Reaper is significantly better than Fairlight in my experience. It has a cleaner interface that is much friendlier to limited screen sizes and dual-monitor setups, in my opinion. Also, Resolve still doesn't support 44.1 kHz exports, I think? Usually when I do a music video the audio file is 44.1 kHz, and not only does Resolve/Fairlight force a conversion, it's a bad-sounding conversion (one workaround sketched below).
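     For the music-video case specifically, my workaround is to export the picture from Resolve and then mux the untouched 44.1 kHz master back in afterwards, so Resolve's resampler never touches it. A rough sketch (filenames are placeholders; assumes ffmpeg):

     ```python
     # Sketch: replace the exported file's (converted) audio with the original 44.1 kHz master.
     # Filenames are placeholders; assumes ffmpeg is on the PATH.
     import subprocess

     subprocess.run([
         "ffmpeg",
         "-i", "resolve_export.mov",    # picture from Resolve; its resampled audio is dropped
         "-i", "song_master_44k1.wav",  # original 44.1 kHz audio
         "-map", "0:v:0", "-map", "1:a:0",
         "-c:v", "copy",                # keep the video stream untouched
         "-c:a", "pcm_s24le",           # or aac/whatever the delivery spec wants
         "music_video_final.mov",
     ], check=True)
     ```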
  8. Have you tried plugging the camera into the monitor? Do you get an image?
  9. Totally agree. I'd rather see the E2 as a polished product before they announce a GS version. But hey, maybe the industrial market will snatch them up? True. But... last weekend I went back to some of my REALLY old projects. (The .wmv files no longer play audio in VLC, so I wanted to re-export them in a modern codec. Success, by the way--sketch below.) They were shot on an old photo camera: 640x480, 15 fps. But it was a CCD with intraframe MJPEG in 4:2:2! The motion was night and day compared to modern cameras. Really makes me want a GS camera.
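     The re-export itself was nothing fancy--basically one command per file, along these lines (filenames are placeholders; assumes ffmpeg):

     ```python
     # Sketch: transcode an old .wmv into H.264/AAC in an MP4 so it plays anywhere modern.
     # Filenames are placeholders; assumes ffmpeg is on the PATH.
     import subprocess

     subprocess.run([
         "ffmpeg",
         "-i", "old_project.wmv",
         "-c:v", "libx264", "-crf", "18", "-preset", "slow",  # near-transparent for SD material
         "-pix_fmt", "yuv420p",                               # broadest player compatibility
         "-c:a", "aac", "-b:a", "192k",
         "old_project_h264.mp4",
     ], check=True)
     ```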
  10. In one of their videos, a Blackmagic representative said they want other camera manufacturers to use it in their products, but I'd have to look for that video. I posted it in one of the BM Raw topics here a few months ago. However, I think it's extremely unlikely that Z Cam will put it in the E2. I think they'd have mentioned it on their FB group if that was at all an option.
  11. Yup, sounds like simple compression artifacts to me. Downscaling might make it subjectively more pleasant to look at, but won't recover any details that have been mushed. The only real solution would be to use a higher bitrate or better codec to begin with. Raw of course solves every problem, but is usually overkill. High bitrate intraframe codecs are ideal for fast pans, but even using a higher bitrate interframe codec would improve the image. However, a better acquisition format doesn't fully solve the problem. If you distribute on a streaming service, then it will be compressed into mush, whether you shot in Raw or not. The delivery format is often the limiting factor these days.
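     If you ever want to see the difference for yourself from a clean source (raw frames or a lightly compressed master), encoding the same clip long-GOP vs. all-intra at the same bitrate makes it obvious on fast pans. A rough sketch, with placeholder filenames:

     ```python
     # Sketch: encode one clean master two ways at the same bitrate to compare motion handling.
     # Filenames are placeholders; assumes ffmpeg is on the PATH.
     import subprocess

     SRC = "clean_master.mov"

     # Long-GOP (interframe) H.264
     subprocess.run(["ffmpeg", "-i", SRC, "-c:v", "libx264", "-b:v", "50M",
                     "-an", "long_gop.mp4"], check=True)

     # All-intra H.264: -g 1 makes every frame a keyframe
     subprocess.run(["ffmpeg", "-i", SRC, "-c:v", "libx264", "-b:v", "50M", "-g", "1",
                     "-an", "all_intra.mp4"], check=True)
     ```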
  12. Downscaling is essentially adding a blur to your image. It might help reduce compression artifacts--if that is indeed what you are seeing, that was just my first guess without having seen the image itself. You might get better mileage out of a directional blur rather than downscaling.
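     If you do go the downscale route, use a decent scaler so the only softening is the softening you asked for. A rough sketch, placeholder filenames:

     ```python
     # Sketch: downscale with a high-quality scaler (Lanczos) to make artifacts less visible.
     # This only hides the damage--it cannot recover detail. Filenames are placeholders.
     import subprocess

     subprocess.run([
         "ffmpeg",
         "-i", "artifact_heavy_4k.mp4",
         "-vf", "scale=1920:-2:flags=lanczos",  # -2 keeps the height an even number
         "-c:v", "libx264", "-crf", "16",
         "-c:a", "copy",
         "downscaled_1080p.mp4",
     ], check=True)
     ```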
  13. Does the monitor have an HDMI input? Is the camera running the latest firmware, or Magic Lantern? If both of those are a yes, it will probably work. The only issue would be if somehow the monitor doesn't support any of the frame rates or resolutions the camera outputs. We would need to look at your specific model to know for sure.
  14. When using a constant-bitrate interframe codec, any movement in the frame will cause quality degradation. The more motion, the more quality loss. Shooting raw will eliminate such issues, and an intraframe codec would also eliminate those motion artifacts.
  15. That's a fair assessment of whether it is worth the money to you. In fact, I passed on the E2 because I will be using an X-T3 on my next project. But that's a similar argument to saying the A7R III isn't worth more than the A7 III because you only need 42 megapixels for specialty purposes. And as @androidlad points out, there are other benefits. If you want ProRes on an X-T3 you are looking at another $500 for an external recorder anyway. I completely see your point, though, and agree that if you don't need the extra features, or if you want those beautiful 26 MP photos, the X-T3 is the better camera for you. The truth is, everything is now a product in development. Most major software companies are switching to rolling updates and away from major versioning. Tesla rolls out software updates for their cars; ten years ago it would have been unthinkable to have to update your car! It's just how the world works now. The good news is Z Cam listens to their buyers and is willing to shape their product to fit evolving needs.
  16. That's true. It still seems like a reasonable price. How much do you have to pay for a reliable platform that shoots 4K 120 in 10-bit? How much for a camera that supports all that in a synced array of cameras? I'm not sure it's just for tinkerers and beta testers. LG used E2s for their massive OLED Falls display, for example. Again, maybe those aren't the features you need, but it does not seem overpriced at all to me.
  17. No more than you would need for most other cameras. It uses cheap batteries, and you can use an iPhone as a wired or wireless monitor; most other cameras need something like a Teradek to do that. A side handle is pretty cheap, and you can even use it left-handed. Most people considering an E2 would get an external monitor and cage for any camera they bought, which would put them at the exact same accessory cost. Name a cheaper camera--with or without accessories--with interchangeable lenses and 4K 120 in 10-bit. Name one that also has wireless monitoring for cheaper. You could spend $1k on accessories and it would still be the cheapest in its class. The GH5S costs more brand new than this does with a side handle and a cheap monitor. Maybe none of those features speak to your needs, and that's fair if it is not the best value for you. But the price overall seems good for what you get.
  18. I think it's still not in the official firmware yet; you still have to install it manually. But yeah, it's official and all. They are developing their own flavor of raw, last I heard.
  19. For sure! Even when shooting raw it is worthwhile to visualize the shot as it will be graded, even if it is just metadata. I guess I was mainly talking within the framework of Hess's argument.
  20. Any setting below 1250, and the data recorded is the same. Changing ISO from 100-1000 only changes the curve that determines where "middle grey" is mapped within the range from noise floor to saturation. In other words, from ISO 100-1000, iris, shutter speed, NDs, and lighting are the only ways to change the data that is recorded. Adjusting ISO within that gain range doesn't actually adjust "exposure" anyway. (Of course, I'm assuming we're talking about Raw or Log on the P4K.) That's one way to do it. I think Hess's video is most useful for exposing with a light meter, because "high ISO for bright" and "low ISO for dark" is 100% relative to properly exposing for middle grey. If you are at ISO 400 and your grey card is one stop under, Hess's method implies adding TWO stops of light and then changing to ISO 200, which is essentially the same as adding your two stops of light, staying at ISO 400, and then dropping down a stop in post (quick numbers below). Your method chooses an ISO based on the ratio of overexposure latitude vs. underexposure latitude, and then you adjust iris/shutter/ND/lights for exposure, because those are the only things that actually affect the data recorded (apart from the 400 vs. 3200 gain switch).
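     Quick back-of-the-envelope in stops, just to show those two routes land in the same place (pure arithmetic, nothing camera-specific; the numbers match the ISO 400 example above):

     ```python
     # Sketch: compare the "Hess" route vs. the "fix it in post" route, working in stops (log2 units).
     grey_card = -1    # grey card starts one stop under middle grey at ISO 400
     grey_card += 2    # add two stops of light on set -> only light changes the recorded data

     route_a = grey_card - 1   # Route A (Hess): ISO 400 -> 200 is one stop less gain on the same data
     route_b = grey_card - 1   # Route B: keep ISO 400, pull one stop down in post instead

     print(route_a, route_b)   # both 0: middle grey lands in the same place either way
     ```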
  21. This is a great video. Very clear, concise, and informative, as are the few other videos by Hess that I've watched. I think it would be much better if, instead of giving a single ISO value, the camera just displayed that DR bar graph that simply shows where middle grey is measured, and the number of stops over and under that are retained. It's a shame we're stuck using ISO ratings to describe exposure, even after technology has moved beyond film stocks.
  22. ...and Nikon goes from "buy Nikon lenses--you can adapt them to any camera!" to "buy Nikon cameras--you can use any lenses!"
  23. That's what I'm saying! Either Andrew's info is wrong, or I am misreading it, or it is completely nuts.
  24. That's possible, but Andrew's post implies that he at least thinks it will be full frame. Unless I'm really reading this wrong?
  25. You mean the Z5 won't be full frame, or that the video mode will be cropped?