kye Posted July 30, 2020

@Towd The difference between 24.0p and 23.976 works out to one frame of drift roughly every 41.7 seconds. If you have your NLE set to "nearest frame", that means it won't jump until about 20.9 seconds into a clip (when the drift passes half a frame), which is probably fine for me and my edits. However, if your timeline is set to some kind of frame interpolation - say you're doing slow-motion shots that aren't a direct ratio (speed ramps, or using the algorithms to create new frames in between the source ones) - then a 23.976 clip will trigger those algorithms: the first frame will be the first frame of the clip, but every frame after that will be algorithmically generated as lying between two frames of the source clip. The way around that is to set the timeline/project to "nearest" and engage the algorithm only for specific shots, but that's a bit of a PITA.

I'm fine to change to NTSC mode, and I guess that's what I should do. Funny how you convinced me to shoot 24p and now I'm switching away from it again 🙂

I don't care about PAL or NTSC or whatever - I haven't been able to receive TV or radio in my house for over a decade; all content is data via the internet. Whenever people ask me about something on TV I normally reply "do they still have that?" just to be cheeky - to me it's all just data. These days the frame rate of a video has about as much connection to the frequency of the AC coming out of my power sockets as the bit depth of a JPEG has to the phases of the moon. I guess that if you're involved in the industry it's probably still a big deal, and maybe there's a bunch of hardware processing digital TV signals that is hard-coded or spec'd for a certain frame rate. I genuinely have no idea what refresh rates my TV runs at.
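To make the drift arithmetic above concrete, here's a minimal sketch (my own illustration, not from the thread) of where those numbers come from:

```python
# 23.976 is really 24000/1001, the NTSC-derived rate.
FPS_EXACT = 24.0
FPS_NTSC = 24000 / 1001            # ~23.976 fps

# Rate difference between the two streams.
diff = FPS_EXACT - FPS_NTSC        # ~0.024 frames per second

# Time for the streams to drift apart by one whole frame.
drift_period = 1 / diff            # ~41.7 seconds

# With "nearest frame" conforming, the first repeated/skipped frame
# appears once the drift exceeds half a frame.
first_jump = drift_period / 2      # ~20.9 seconds

print(f"one frame of drift every {drift_period:.1f} s, "
      f"first visible jump at ~{first_jump:.1f} s")
```

So a 23.976 clip conformed to a 24.0 timeline (or vice versa) plays cleanly for the first twenty seconds or so before the rounding flips.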
We have a smart 4K TV with Netflix and Amazon apps but no clue what the settings are - when we got it I went through the menus to turn off the auto-magic image "enhancement" destruction features. Before that we had a dumb FHD TV with a Roku media box running Netflix and other apps, and before that we used the Xbox with those apps. I always set the TV to native resolution but never paid much attention to the refresh rates. Considering that TVs are just computers now and they're all made in the same factories, I wonder if the content is all region-unspecific and the apps are written to handle whatever frame rate the content is in? TBH, that makes about as much sense as talking about "NTSC mode" does these days! Thanks for your help.

Back to the original question though... does this mean that the motion cadence question is now moot? Or just moot for the GH5? I have a BMMCC which can shoot uncompressed RAW, and can do a test with that if it will help.
Towd Posted July 31, 2020

Yeah, my point was just that while technically 23.976 is not 24.0 fps, for 99.9% of projects it's all the same thing, and only the technicians such as the DP and editor care. I can't speak for every production in NTSC land, but unless you are specifically targeting a DCP delivery, I don't know of anyone who shoots or edits at 24.0.

It is interesting what you bring up regarding YouTube and other streaming platforms being mostly frame-rate agnostic and just delivering whatever frame rate. I know I've watched PAL stuff on YouTube at 25fps. However, I'll still argue that NTSC 30fps and PAL 25fps are legacy formats from the broadcast days and not something a modern project should be using.

22 hours ago, kye said: Does this mean that the motion cadence question is now moot? Or just moot for the GH5? I have a BMMCC which can shoot uncompressed RAW, and can do a test with that if it will help.

Totally up to you. I still don't believe that "motion cadence" - in the sense that frames are recorded at slightly variable timings which can be perceived or measured in any meaningful way - is real. I do believe in jitter/strobing effects in high-contrast scenes, rolling shutter, and users setting bad shutter angles, which in turn lead to "motion cadence" issues. I think playback devices can contribute, and I can also see how the dynamic range of a camera could affect perceived jitter due to clipping or contrast issues.

That said, I'm still very interested to see if you can find the motion cadence unicorn. If you do run any kind of measurable test using an oscilloscope, pendulum, or other consistently measurable device, I'm happy to run it through SynthEyes if it helps verify results.
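One way such a pendulum/oscilloscope test could be turned into numbers (a sketch of my own, not anything Towd or kye specified): extract the per-frame presentation timestamps from the footage (e.g. with ffprobe) and look at the spread of the inter-frame intervals. The `timestamps` list here is hypothetical placeholder data; a real test would supply measured values.

```python
# Hypothetical per-frame timestamps in seconds for a ~23.976 fps clip
# (e.g. pulled from `ffprobe -show_frames`); replace with real data.
timestamps = [0.0, 0.0417, 0.0834, 0.1251, 0.1668, 0.2085]

# Inter-frame intervals.
intervals = [b - a for a, b in zip(timestamps, timestamps[1:])]

mean = sum(intervals) / len(intervals)

# Population standard deviation of the intervals: a simple proxy for
# frame-timing jitter. Perfectly regular capture gives ~0.
jitter = (sum((x - mean) ** 2 for x in intervals) / len(intervals)) ** 0.5

print(f"mean interval {mean * 1000:.2f} ms, jitter {jitter * 1000:.4f} ms")
```

If "motion cadence" in the variable-timing sense were real, this jitter figure would come out measurably non-zero for some cameras and not others; a flat result would support Towd's position.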