Everything posted by KnightsFan

  1. Are the reviews from people who have used both and think they handle the same? How much experience do these people have? The Flycam could be perfect; I don't have firsthand experience. But just because they have a similar number of stars doesn't mean anything for quality. If you could get your hands on both and see for yourself, that would be ideal. I'd love to hear back if you do, because it's always nice to see budget brands increasing their quality.
  2. Image Signal Processor: basically, the processing the camera applies to the image. By applying noise reduction, an ISP lowers the noise floor and thus increases dynamic range, usually at the expense of resolution. But yeah, your original statement is correct: there is no inherent reason why raw would improve dynamic range on the same camera. If their log format already captures the entire dynamic range of the ADC, then raw would likely bring no dynamic range improvement. We'd have to test in the real world to find out (like in the video above, which is unfortunately in German).
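To make the noise-floor point concrete, here's a rough sketch (my own illustration, not from the post) of how averaging frames -- a crude stand-in for the NR an ISP does -- raises measured dynamic range in stops:

```python
# DR in stops is log2(full_scale / noise_floor), so halving the noise
# floor gains one stop. The 0.01 RMS read noise is an example number.
import math
import random

random.seed(0)
FULL_SCALE = 1.0

def noise_floor(samples):
    """RMS of a zero-mean noise signal."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

# Simulated read noise on a black frame.
single = [random.gauss(0, 0.01) for _ in range(10000)]

# Averaging N frames cuts RMS noise by sqrt(N).
N = 4
averaged = [sum(random.gauss(0, 0.01) for _ in range(N)) / N
            for _ in range(10000)]

dr_before = math.log2(FULL_SCALE / noise_floor(single))
dr_after = math.log2(FULL_SCALE / noise_floor(averaged))
print(f"DR before NR: {dr_before:.1f} stops, after: {dr_after:.1f} stops")
```

With these example numbers, averaging 4 frames halves the RMS noise floor, which is worth about one extra stop of measured DR.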
  3. I always recommend a mechanical gimbal (Steadicam style) over an electronic one if you have the time and space to use it. It takes some time to learn, but you have a lot more control in the long run. In addition, the motion produced is much more fluid and natural. I find that most electronic gimbals have sterile movement, especially on pans. The only shots I like better with electronic gimbals are static shots where you really want it as still as possible and can't use a tripod.

     Another benefit of the mechanical gimbal is that it's a dead simple, durable hunk of metal. You can tear it down into small pieces for storage, and you don't have to worry about firmware updates, battery life, getting it wet, or making sure it doesn't flop around and break itself when powered off.

     In my opinion, the gimbal is much more important than the vest/arm. You can get by without a vest for most shots; it's just heavy and tiring, even with a tiny camera. If you are using a wide lens, slight up and down motion is a lot harder to notice than even minor changes in angle, and inertia keeps bobbing to a minimum anyway. Of course, having a vest/arm is better than not having one, but personally I'd invest more in the gimbal.

     I have a Glidecam HD 4000. The only other mechanical gimbal I've used extensively is the Glidecam 4000 Pro (the lower end model). The difference is night and day: the HD is easy to balance and just "works," whereas the Pro was NEVER stable no matter what I did. The general consensus on the internet is that cheaper gimbals (mostly from India) such as Flycam are not as good as Glidecams--I don't have personal experience, but I'd get hands-on experience comparing one to a Glidecam HD before buying. Especially since I got my HD4000 used for $150, it can be very affordable for high quality.
  4. No one is sure of anything, any info about actual camera specs is pure speculation for now. Sony might not have made final decisions for all we know.
  5. You won't get scopes, but you can get a 24" HD TV or computer monitor for a couple of dollars at a thrift store. I think if you want semi-accurate image analysis tools, you'll have a hard time finding something cheap.
  6. Ah man, I should have gone with my gut! I was thinking GH5 the entire time, right up until that final boat scene. In my experience, Panasonic tends to have colder tones on everything except skin, which ends up more orange; it's especially apparent in the scene around 6:40 and the shot at 2:13. I told a director once that Panasonic white balance was always too warm and too cold at the same time lol. But those boat shots look really good, so I thought it must be something like an XT3. I don't have any experience with MFT lenses; all I could really tell was that they were very modern, based on the lack of CA and the overall sharpness. As for the gear used, was it just what you had available, or was there a special reason you paired this kit with this story?
  7. No problem! I enjoyed watching it. And for the gear... The thickness of the color and richness of the shadow tones was very good. The blue jacket at 9:55 for example popped out. I thought the last scene on the boat looked a lot like some more stylized film stocks. I thought the boat shots stood out as the best in terms of color. If I had to guess I'd say XT3 with Sigma Art lenses. But whatever you used it was consistent and did the job well.
  8. Not necessarily--it sounds like a scam. The thing is, Amazon would refund a scammed customer, so the customer wouldn't take the hit beyond the hassle. Maybe this guy ran the scam once before, and the scammed person posted about it but didn't report it; they didn't lose any money, so they wouldn't be out for blood. I once found a great deal on Amazon for a monitor. The seller had perfect feedback on a hundred previous sales. It was a scam, and I got my money back, but reading the reviews showed that it was carefully crafted: the guy sold a hundred small items to build good feedback, then sold a few hundred expensive color grading monitors and skipped town with tens of thousands of dollars. They had years of good reputation before pulling a lucrative con. I wouldn't bite.
  9. That was really well done! I liked the unassuming and consistent pace, and the overall production value was quite good. I appreciate how the conflict is introduced in the very first scene but is implied and veiled; the movie hides the conflict the way Paul is burying his own emotions. I do think, though, that for the runtime there is too much buildup and not enough payoff. Part of that is that the moment of confrontation between the two men is the weakest part: there is no leadup to Paul falling, just a shot of a foot slipping out of nowhere, and then there is no visceral "punch" when Paul hits the ground. And then suddenly Emilie's there. I think that's what harms the payoff for me: there's no connection between Paul and Emilie, and then suddenly there is, with no story element to explain the resolution (unless I missed something, which is possible).
  10. LutCalc generates LUTs based on the difference between the color spaces you select in the dropdowns: you specify an "in" and an "out," and it generates a LUT that transforms from one to the other. All you need to know is what gamma and gamut you are coming from (in your case, F-Log gamma and F-Log gamut) and what you want to put it into. There are several Arri options; I'm not sure which one you want. You enter this information on the left, where it has the Rec (in) and Out dropdowns. If you want a straight mathematical conversion, you can uncheck customization. You also probably want to leave the camera as generic and stop correction at 0. Then on the right you can set up the LUT file's options: you'd probably want a 3D LUT, 65x65x65, in .cube format. I would use the largest range and leave it unclipped. I'm not an expert--I've just used it once or twice in the past--so play around with it. There are some how-to guides and info on the main page as well that might be interesting: https://cameramanben.github.io/LUTCalc/
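For reference, the .cube files LutCalc exports are plain text. Here's a tiny sketch of the format using a hypothetical identity LUT (output = input), with 3 points per axis instead of the 65 suggested above, just to keep it short:

```python
# Identity 3D LUT in .cube text format. A real conversion LUT would
# have transformed values instead of the pass-through ones here.
SIZE = 3

def identity_cube(size):
    lines = [f"LUT_3D_SIZE {size}"]
    # .cube convention: red varies fastest, then green, then blue.
    for b in range(size):
        for g in range(size):
            for r in range(size):
                lines.append(" ".join(f"{c / (size - 1):.6f}"
                                      for c in (r, g, b)))
    return "\n".join(lines)

cube_text = identity_cube(SIZE)
print(cube_text.splitlines()[0])  # header, followed by size**3 RGB rows
```

A 65x65x65 LUT is the same layout with 274,625 rows, which is why these files get large.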
  11. You can use LUTcalc to make one https://cameramanben.github.io/LUTCalc/LUTCalc/index.html If you use Resolve, you can use a CST node, which should have slightly better results than a LUT. They supposedly added F-Log support in version 16.1 (I'm still on 16.0 so I can't confirm firsthand).
  12. I thought I'd share this crop factor comparison tool I made, which now includes anamorphic options after reading some discussions in the anamorphic topic. It's a simple simulation made with Unity and plays in the browser. http://gobuildstuff.com/CropFactorApp/ Let me know any ways I can update it to make it more useful. I know the UI isn't great--I slapped the original app together a few years ago for fun--but if you mess around with the controls it should make sense eventually.
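The math the tool simulates is mostly field-of-view geometry. Here's a rough sketch (sensor widths and the 1.33x squeeze are my own example numbers, not taken from the app):

```python
# Horizontal FOV from sensor width and focal length; an anamorphic
# squeeze widens the effective capture width by the squeeze factor.
import math

def h_fov_deg(sensor_width_mm, focal_mm, squeeze=1.0):
    return math.degrees(
        2 * math.atan(sensor_width_mm * squeeze / (2 * focal_mm)))

FF_WIDTH, MFT_WIDTH = 36.0, 17.3   # full frame vs Micro Four Thirds
crop = FF_WIDTH / MFT_WIDTH        # ~2.08x crop factor

# A 25mm on MFT frames roughly like a 52mm on full frame:
print(f"crop factor: {crop:.2f}")
print(f"25mm MFT FOV: {h_fov_deg(MFT_WIDTH, 25):.1f} deg")
print(f"52mm FF  FOV: {h_fov_deg(FF_WIDTH, 52):.1f} deg")
print(f"25mm MFT + 1.33x anamorphic: {h_fov_deg(MFT_WIDTH, 25, 1.33):.1f} deg")
```

The simulation presumably does this in 3D, but the equivalence between focal lengths on different sensors reduces to matching these angles.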
  13. That was nice! It is a bit annoying that over half the run time is end credits, though.
  14. Here we go again! 10-bit, external timecode, external power, and low rolling shutter are the features I'm (still) looking for. Let's see how it goes this time around.
  15. Sounds like a nice weekend project, then. Here's the program so far: http://gobuildstuff.com/CropFactorApp/. It takes a really long time to load right now; I'll try to improve that as well.
  16. I made a little 3D simulation once that showed a comparison between different focal lengths on different sensor sizes. Maybe I should look into adding anamorphic and/or more specific camera sensor sizes? Would anyone be interested?
  17. This is definitely as far from real TV as we could make it. On the technical side, it's 24p with a 1.89:1 ratio. Story-wise, it kinda has a bit of everything.
  18. It's all done, we just need to upload. I think it's 10 episodes for season 2. Episode lengths range from about 5 to 20 minutes.
  19. I've done many projects that were edited in Premiere and finished in Resolve. What I do is use an XML to move the edit back and forth. If you've already started editing, export an XML from Resolve and hand it over with all the footage to your friend, who will import that XML into Premiere. He'll probably need to do some relinking, but usually once you find one missing file it'll automatically find the rest if you keep the folder structure intact. I find that keeping the same folder structure is the most important thing for a smooth collaboration. If you haven't started the edit, your friend can just start in Premiere. Once he finishes, he will export an XML, which you will import into Resolve. You'll likely have to fiddle with export and import settings on both ends, so I recommend trying the workflow before committing to it.

      Keep in mind that most effects will not transfer over. Simple fades usually work, but things like Warp Stabilizer or color correction won't. I have read that Blackmagic has made a free plugin so that BRAW can be read in Premiere. If that's true, you won't need to do any transcoding, which would make things easier.

      I would color grade after the edit is done: let him edit the SOOC footage, and color it once it gets back to you. If you want to start coloring straight away, one trick is to use remote grades in Resolve. If you haven't used them before, a remote grade basically attaches to the source file instead of the timeline clip, so you can start coloring clips right away, and those grades will link up with the files automatically once the XML is ready and imported.

      Since you mention a band, keep in mind that most music is 44.1 kHz, while Resolve operates strictly at 48 kHz and tends to produce unpleasant artifacts when resampling. When I edit music videos in Resolve, I do the sound separately and then mux audio and video using ffmpeg so that I can keep the native sample rate for the music track.
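That mux step can be a single ffmpeg invocation with stream copy. Here's a sketch that builds the command (filenames are placeholders of my own; -c copy leaves both streams untouched, so the music stays at its native 44.1 kHz):

```python
# Build an ffmpeg command that takes video from one file and audio
# from another, copying both streams without re-encoding.
video_in = "graded_from_resolve.mov"   # video-only export from Resolve
audio_in = "final_mix_44100.wav"       # mix done outside Resolve
output = "music_video_final.mov"

cmd = [
    "ffmpeg",
    "-i", video_in,
    "-i", audio_in,
    "-map", "0:v:0",   # video stream from the first input
    "-map", "1:a:0",   # audio stream from the second input
    "-c", "copy",      # stream copy: no re-encode, no resampling
    output,
]
print(" ".join(cmd))   # run with subprocess.run(cmd) if ffmpeg is installed
```

Because nothing is re-encoded, this step is nearly instant and introduces no generation loss.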
  20. Season 1 is 4 episodes and will be out on Amazon on Dec 20. Season 2 is considerably longer (and much better quality, we actually knew what we were doing by then!) and will be out soon after new years.
  21. Does it show up in Disk Utility at all? If so, does it give any information? I ran into issues like this a LONG time ago when I unplugged a drive from a Windows computer without ejecting it, and then Mac wouldn't show it automatically. I think in that case I could still see it in Disk Utility and it gave some information about the drive. Also, have you tried a different cable? I have a Micro HDMI cable that works everywhere, except it suddenly stopped working with my NX1 about a year ago. Do you have other USB devices plugged into your Mac that would be forcing it to only allocate USB 2.0 resources to the T5? I would expect it to still work, but it's another easy thing to try.
  22. What exactly happens when you plug the T5 into a Mac or PC? Does it show up in the device manager (or whatever the Mac equivalent is)? Any noises or alerts? What is the drive formatted as--exFAT, HFS, etc.?
  23. I've been saying for some time (including earlier in this topic) that the roadblock to mainstream VR is bulky equipment. Facebook just announced controller-free hand tracking for the Quest, coming this week. They're talking about the resolution of the recorded 360 video: mapping an image around you for a 360x180 panoramic stream really benefits from 8K, since you're only seeing a small portion of it at any time. What platform do you target in VR? My day job involves developing for the Quest, so we have pretty strict hardware limitations. I imagine developing for a traditional headset hooked up to a gaming PC gives you more room for better assets and rendering.
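Some rough arithmetic (my own example numbers) for why an 8K source helps when only a slice of the panorama is visible at once:

```python
# With ~90 degrees of a 360-degree equirectangular image in view,
# you only ever see about a quarter of the horizontal pixels.
EQUIRECT_WIDTH = 7680      # "8K" equirectangular frame width
HEADSET_HFOV = 90          # approximate horizontal FOV, degrees

pixels_per_degree = EQUIRECT_WIDTH / 360
visible_pixels = pixels_per_degree * HEADSET_HFOV
print(f"{pixels_per_degree:.1f} px/deg, ~{visible_pixels:.0f} px across the view")
```

So an 8K 360 stream only delivers roughly 1080p-class horizontal detail to the viewport, which is why 4K 360 video looks so soft in a headset.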
  24. I haven't seen any measurements of the DR of the 5D3 raw, but that seems about right based on my comparisons to the BM 2.5K. I tried that ISO mode a bit, but I didn't like it and never used it on a project; I can't remember exactly why.
  25. Magic Lantern shoots native 14-bit DNG image sequences and puts each sequence into a .MLV container. If you mount the container as a drive, you can literally copy/paste DNG frames out of it--you don't actually have to convert anything. For 12- and 10-bit, Magic Lantern truncates the low-order bits of each 14-bit word, which is why there is a DR penalty (or at least that was the case last time I used it, a few years ago). I can dig up some of my old footage if you can't find samples online.
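A quick sketch (my illustration, not Magic Lantern's actual code) of why truncating 14-bit words to 10 bits costs shadow detail:

```python
# Dropping the 4 low bits makes the smallest representable step
# 16x larger, so deep-shadow values that differed only in those
# bits collapse to the same code.
def truncate(sample_14bit, target_bits=10):
    shift = 14 - target_bits
    return sample_14bit >> shift

shadows = [16, 17, 30, 31]           # four distinct 14-bit codes
truncated = [truncate(s) for s in shadows]
print(truncated)                     # all collapse to the same value

step_10 = 1 << (14 - 10)             # quantization step after truncation
print(f"quantization step grows {step_10}x")
```

Each bit dropped raises the quantization floor, which is where the roughly per-bit DR penalty comes from.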