Leaderboard

Popular Content

Showing content with the highest reputation on 12/25/2023 in all areas

  1. In the real world (as always) things are more complicated. Getting things right in camera is normally considered good advice because it's assumed that the results will be higher quality than not getting it right in camera and then adjusting afterwards. It can also be good advice because, depending on the situation, it can take significant time / effort to process in post, there's a risk that the desired results can't be obtained, and by then a re-shoot might be very difficult.

    On the other hand, some situations will be made better by getting it wrong in camera, but in some way that provides an advantage. ETTR is getting it wrong in camera, and can improve the image once adjusted in post. There are other situations where this might be the case, depending on the circumstances. Your shot might have benefited from being exposed normally and then pulled down in post, assuming nothing was clipped. Your shot might also have benefited from being shot at a normal WB, and then having the red and green channels pulled down to make the image blue (ETTR, but only of those channels; a rough sketch of that kind of pull-down follows this item).

    Good old fashioned "movie magic" involves trickery from time to time, sometimes by a huge margin (e.g. shooting day-for-night), and sometimes getting it wrong in camera can be advantageous to aid in the illusions.
    2 points
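A minimal Python sketch of the "pull it down in post" ideas in the item above, assuming the footage has already been decoded to a linear float array; the one-stop pull-down and the per-channel gains are invented purely for illustration, not taken from the post.

```python
import numpy as np

# Hypothetical linear frame (values 0.0-1.0), e.g. one decoded raw/log frame.
frame = np.random.rand(1080, 1920, 3).astype(np.float32)

# ETTR-style correction: the shot was deliberately exposed brighter than normal
# in camera, so pull the linear values down by one stop (multiply by 0.5).
pulled_down = frame * 0.5

# The WB trick from the post: shoot at a normal white balance, then pull the red
# and green channels down in post so the image reads as blue. Gains are illustrative.
channel_gains = np.array([0.6, 0.8, 1.0], dtype=np.float32)  # R, G, B
cooled = np.clip(frame * channel_gains, 0.0, 1.0)
```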
  2. 1 point
  3. Jedi Master

    24p is outdated

    Depends on whether the sink device (the smart TV in this case) supports QMS (Quick Media Switching). The Netflix app does, but most streaming devices either don’t or don’t support it completely.
    1 point
  4. kye

    24p is outdated

    I was wondering if the conversation would get to discussing this. I was curious some time ago and did some testing and some math.

    In testing I can see the difference between 24p and 30p easily, on both a 60p display and a display that is set to the native frame rate. The difference is obvious and the look of 30p is quite distasteful to me, regardless of the display frame rate / refresh rate. 24p on a 60p display does indeed introduce jitter in the timing of the frames (where the frames displayed are "nearest" and not synthesised from multiple frames in the source material). When you go to higher display frame rates the jitter becomes less, and with 120p being an even multiple of 24p, the jitter of 24p will be eliminated or drastically reduced.

    In the math I did, I was surprised to see that capture frame rates are remarkably well preserved even when put through different frame-rate timelines / displays etc. Assuming I didn't screw up the logic, when watching 24p source material on a 30p display the timing is all over the place, but for whatever reason both 24p on a 24p display and 24p on a 30p display are still preferable to 30p for me.

    What becomes interesting is when we shoot 30p, put it on a 24p timeline, and then display it on a 30p display: apart from a doubled-up frame every so often (because there are only 24 frames per second to choose from), the 30p is completely resurrected! (There's a rough simulation of this nearest-frame logic after this item.)

    I have wondered whether the Netflix etc. apps on smart TVs actually change the frame rate based on the source material or whether they just run the TV at some fps and pick the nearest frame to display. I have been meaning to test my TV with my phone (recording the screen with 240fps slow motion and then reviewing the footage and counting the frames is pretty straightforward).

    TLDR:
    - 24p is far superior to 30p/60p regardless of display refresh rate (for me anyway)
    - When displays move to faster refresh rates the jitter from 24p sources will be reduced / eliminated
    - Frame rate conversions can involve interesting time-aliasing effects where the time-resolution of some frame rates can pass through almost completely intact
    1 point
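Below is a minimal Python sketch of the nearest-frame logic described in the post above. It is an illustration under assumptions, not a reconstruction of the poster's actual math: the conform helper, the integer frame rates, and the one-second durations are all made up for the example.

```python
from fractions import Fraction

def conform(src_frames, src_fps, dst_fps, seconds=1):
    """For each tick of a dst_fps timeline/display, show the source frame whose
    timestamp is nearest (simple nearest-frame selection, no blending or synthesis)."""
    out = []
    for k in range(dst_fps * seconds):
        t = Fraction(k, dst_fps)                        # time of this output tick
        i = min(round(t * src_fps), len(src_frames) - 1)
        out.append(src_frames[i])
    return out

# 24p source on a 60 Hz display: each frame is held for an uneven mix of
# 2 and 3 refreshes, which is the timing jitter discussed in the post.
print(conform(list(range(24)), 24, 60))

# 30p source -> 24p timeline -> 30p display: most frames come back out on or
# near their original 30p slots, with an occasional doubled frame standing in
# for a dropped neighbour (only 24 of the 30 frames survive the timeline).
timeline_24 = conform(list(range(30)), 30, 24)
print(conform(timeline_24, 24, 30))
```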
  5. Astonishment at the sheer sensation of a magical blue-hour moment, the dreaminess, a story possibly about to unfold or coming to an end. You got that right in (front of) the camera. 🙂 @PPNS
    1 point