
Leaderboard

Popular Content

Showing content with the highest reputation on 03/16/2025 in all areas

  1. The notion of "color grading" is largely a commercial construct, designed to create the job of "colorist" and to sell computer hardware and software. Movies made after the 1980s look consistently worse and worse, because the popular concept of "fixing it in post" has led a generation of filmmakers to disregard the importance of proper lighting, story, acting, and set design.

    Something went very wrong in cinema after the mid-90s, both socially and artistically. This degradation of quality in filmmaking coincided with three shifts in film aesthetics: 1. The move from celluloid to digital. 2. The move from capturing a look based in 'reality' to color-graded footage. 3. An odd obsession with increased resolution.

    With increased resolution, the decay of cinema became even more profound: when an actor's face is shot in close-up at 8K, we are seeing a level of surface detail on the human face that we would NEVER see in reality. So what is the 8K+ filmmaker actually capturing? Cinema is predicated on our 'suspension of disbelief'. To intentionally shoot a film that cannot be believed, because it does not represent 'reality' in a way we could possibly see it, is anti-cinema.

    The Rise of Anti-Cinema

    Through both malice and incompetence, cinema has decayed. Before it can be saved, we must acknowledge the extent of this sickness, and then take steps to remedy it. We need to rely less on software, and more on our eyes, on set. We need to embrace imperfection and return to capturing a plausible reality. We were better off when analog color timing was the only post-production option for "grading" footage. Cinema can be fantastical, magical, or extraordinary, but it should never be unbelievable. Let's return to honest, practical effects; proper lighting; and artistry in set design. It's time to stop color grading.

    25 Years of Madness

    Since the launch of the Sony F900, over 25 years ago, camera companies have been promising a digital replacement for analog 35mm film. For 25 years, they have been completely unable to deliver the 35mm analog look. Instead, filmmakers have been expected to mess around in computer software chasing an aesthetic that can rarely be achieved, and that the camera companies should have been providing as a default output. Why (given equivalent lighting, set, and actors) can no commercially available digital video camera shoot footage straight out of camera that properly emulates the Kodak 5247 and Kodak 5254 color-negative stocks? These stocks practically defined cinema as we knew it, but they do not exist as digital equivalents.

    We Got Scammed

    Why must young filmmakers wade about in a swamp of technical nonsense, graphics cards, manuals, color grading, and hardware, chasing the look that an off-the-shelf roll of 35mm stills film would have delivered instantly, for five dollars, in the 1960s, 70s, and 80s? Why can't these stocks be delivered straight off-camera? The camera industry has pushed responsibility for great video capture onto the "colorist". The colorist is a symptom of decay in the camera and film industry, necessary only because of the technical failings of camera manufacturers and their inability to simply deliver the replacement for Kodak stocks they promised over 25 years ago. The colorist is also a symptom of the decay in the excellence of artists on set.

    The Broken Promise of the Camera Industry

    We were promised film in a digital format. But instead, the camera industry redefined "film" as a sub-par version of itself. Then all the failings of this new medium were commercialized in a host of hardware and software to "repair" the damage done. Why is it so difficult for the digital-camera industry to care about creating an accurate version of the very medium it claimed to be replacing? The digital "Cinema" cameras of today have almost nothing to do with cinema as we knew it. This is nothing short of fraud.
    3 points
  2. Agree with what everyone has said about there being many poor-looking movies shot on film that we've forgotten about, many great-looking digital movies, and everything in between. I believe a better thesis would be, "movies looked better before smartphones were invented." A big reason that mainstream movies look bland is that they are no longer designed for a giant screen in a dark theater, or even for a big flatscreen in your living room. They are increasingly consumed on 6" screens in broad daylight (as well as in theaters and living rooms). Now, to go on a slight tangent, the same can be said of writing. Often when I talk to friends, they'll say, "oh yeah, I saw that movie. It was on Netflix in the background while I cleaned my house." To some degree, it's not that writers are worse, it's that modern writing is designed to be consumed at 50% attention with chunks missing. The percentage of the audience that watches every second at full attention is simply getting smaller. I don't believe that shift has anything to do with filmmaking technology.
    3 points
  3. I get that people love to dissect every little thing, but as someone working with some of the biggest media companies, I can tell you that a lot of these so-called issues don't even come up in real-world professional work. The S1R II delivers solid image quality, stabilization, and dynamic range. More than enough to create high-level content. I'm not saying don't analyze your tools, but at some point you have to ask: is this actually making you a better filmmaker, or just keeping you stuck in analysis paralysis? If your priority is making great content, you'll be just fine.
    3 points
  4. While I do like the look of film and enjoy movies from the 90's, you're laying it on pretty thick here and being a little overdramatic. There have been a lot of beautiful, cinematic films over the years that weren't shot on film, and I wholeheartedly disagree with your "hot take" on colorists. You have an opinion and that's great, but none of this is fact. Actually, it's rather naive, and the Alexa 35 in the right hands might be indistinguishable from film at this point. 1. At the end of the day, film is both an art form and a product. If the audience doesn't mind or can't tell the difference, the cheaper and more efficient format will win out over quality/nostalgia 90% of the time. 2. It might be important to remember that we all have a perspective based on what we were raised on and what we enjoyed growing up. What looks "good" to us can taint our objectivity, and it's important to separate nostalgia and familiarity from things like color, latitude, and grain. 3. For someone raised on the internet and on video games at 60p, your idea of film being superior is laughable. The new generation is determining things now. Vertical-format content delivered in bite-sized portions all day long. That's the norm. Streaming services competing over existing IP that has a built-in audience, and a race to the bottom.
    3 points
  5. PannySVHS

    Forum ideas

    @eatstoomuchjam @Clark Nikolai Let's get it rolling. :)
    1 point
  6. You can always use optimizations to make a timeline with NR and lots of color grading play in real time. For example: Top Menu -> Timeline -> Timeline playback resolution -> Half or even Quarter. In most cases Half, or 1080p from 4K, would be enough (a scripted variant of the same idea is sketched after this item).
    1 point
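Following on from the tip above, the same idea can also be approached from Resolve's built-in Python scripting API. The sketch below is an illustration under stated assumptions, not the menu toggle itself: since the "Timeline playback resolution" switch is a UI setting, the script instead drops the project's timeline resolution to 1080p for editing and restores it before delivery, which gives a similar real-time playback gain. DaVinciResolveScript, scriptapp, GetProjectManager, GetCurrentProject, and SetSetting are the scripting entry points that ship with Resolve; the specific setting keys used here are assumed from the scripting README.

```python
# A rough sketch only: lower the project timeline resolution for smoother
# playback with NR and heavy grades, then restore it before rendering.
# Assumes DaVinci Resolve is running and its Python scripting module is on the path.
import DaVinciResolveScript as dvr

resolve = dvr.scriptapp("Resolve")                        # attach to the running Resolve app
project = resolve.GetProjectManager().GetCurrentProject()

def set_timeline_resolution(width: int, height: int) -> bool:
    """Set the project timeline resolution (the API takes string values)."""
    return (project.SetSetting("timelineResolutionWidth", str(width))
            and project.SetSetting("timelineResolutionHeight", str(height)))

# Edit at 1080p instead of 4K for real-time playback...
set_timeline_resolution(1920, 1080)

# ...and switch back to full resolution before delivery, e.g.:
# set_timeline_resolution(3840, 2160)
```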
  7. The S5II had improvements over the original S5, but the lower level of detail with oversharpening was annoying, even in 6K V-Log and also with a LUT burnt in. It was especially noticeable in extracted stills when viewed on a big screen. If you check Connor McCaskill's latest S1RII YouTube clip, he has a download link to his footage. Had a play in Resolve, and damn, the S1RII IQ definitely looks better and different from what came off the S5II sensor. Grades pretty effortlessly too. The only downside with Z6III N-RAW, along with the big file sizes, is the NR. Don't know if the full M4 Max chip is enough to play the timeline in real time with NR, and it is expensive. Have a PC with a 5800X CPU and RTX 3080 GPU too, and editing is more stuttery with it than with the M3 Pro MacBook. The latest Nvidia 5xxx GPUs have 4:2:2 support now, but they are not cheap either, and I prefer the ease of using macOS. A new camera usually means you need to get a new computer for editing too, but with the S1RII that does not seem to be the case. Even an M1 Pro would be fine.
    1 point
  8. Exactly. The viewer does not give a damn how it was made. A film or documentary must stir feelings or make people think. If it does, who cares about the technical details? Only the insiders.
    1 point
  9. I disagree, I really think the video engine is worse, especially when using the standard profiles. In 4K the level of detail is low, yet those details are over-sharpened. In 5.9K/6K there is more detail, but the bad sharpening is annoying. Of course most clients don't see the difference, but then they can't really see the difference between 4K and 1080p either... So if I use high-resolution video modes, I want the best quality for cropping, for extracting stills, etc. If not, even an old GH2 is largely enough to create beautiful images. From my point of view, lowering image quality is never a great thing; I just hope the S1RII will be closer to the S1 and S5 than to the S5II.
    1 point
  10. This. Yeah, we can look back at older films and remark on how nice they look, but I can name just as many (and, frankly, probably more) that look like absolute dog shit visually. Ironically, some of those are my favorite films! But most of them weren't made to intentionally look that way; they just did because of budget limitations. There are a lot of really bad-looking films out there, though. In fact, I'd say most films from the film era aren't any more remarkable looking than what is filmed digitally today. There certainly are exceptions, which is why I do agree to a certain extent that it's unfortunate that almost everything has moved to digital, but I can't say that every film I watch today would look substantially better if it had been shot on film, especially the lower-budget ones. It's really easy to look back with rose-tinted glasses and say "everything looked better back when it was shot on film."

    I think the bigger issue with the move to digital is how disposable images have become in general. We all shoot thousands of pictures on our phones every year, but most we never look at again after taking them. In fact, most of the time we put little thought into taking them. Or at least I am guilty of that. They just sit on our phones, taking up digital space, waiting for the day when maybe we remember that we documented this moment or that moment. Whereas with film, or even video tape, aside from the camera itself, you were limited by how many pictures were left on the roll, how many rolls you could afford to buy, and then the cost of developing them. You also didn't get that immediate feedback of looking at a photo you just took to see how it turned out; instead you had to wait until it was developed. I remember going to sports events as a kid with my camera and only having two rolls of film. That was roughly what, 50-60 pictures total? I had to choose carefully what pictures I took, lest I run out of film and miss something extraordinary. I couldn't just waste pictures! Now I'll take 60 pictures in the span of 5 minutes with my phone!

    Taking photographs or moving images was a much more thoughtful experience in the film days. Today that doesn't really exist, because content is so disposable. Even if you are fortunate enough to create something that breaks through, something else rapidly comes along to take the viewer's attention away. With the rise of TikTok it has gotten even worse than it was during YouTube's peak. 15 seconds and then it's on to the next thing! Still, that isn't to say it's all bad. But it's not all good either.
    1 point
  11. I don't understand whether this is a provocative post or a serious one. Now, I am also somewhat ignorant of the topic, but some of the statements seem too 'exaggerated' to me to hold up. In general, digital has greatly democratised cinema and its art. But every art has its era and its crafts, and this constant 'it was better before' is largely pissing me off. It describes good old times that never existed. There were horse carriages and the farrier lobby, then came the railway and goodbye farriers. The truth is that today any artist can produce a film with cinematic quality at home and at negligible cost. Unless you miss the old Super 8 home movies. Anyway, back to the data. It's hard to talk about resolution as we understand it today in the digital world, but I doubt that a good film stock can't resolve 8K; quite the contrary. Here's a document for nerds: http://www.tmax100.com/photo/pdf/film.pdf But here there's a video that explains it more simply. Regarding the immediacy of film... are we talking cinema or home-made Super 8s? Because if you mention Clint Eastwood and Kodak Vision (Vision is also cited in the YT video above), then it's much more complex, with intermediate and print stocks: https://en.m.wikipedia.org/wiki/Film_stock Ah, and color grading existed on film stock too. It was called color timing: https://en.m.wikipedia.org/wiki/Color_grading Finally, am I saying that digital is better than film stock? NO, they are two different mediums.
    1 point
  12. I very much prefer the image of the OG S5 over the S5IIX, but people really make a mountain out of a molehill when it comes to the "worse" image. It's really not that bad. In the year I've had it, not one client or viewer has complained about the image; in fact, it has been the exact opposite! I feel like cameras have plateaued, so now people over-analyze and overstate every little thing. But virtually all of this stuff doesn't matter to the audience we are creating these images for. Anyone with any discernible talent will be able to take the S1RII and create compelling images with it. That bride is going to love the pictures you take, the corporate client is going to be ecstatic with the talking-head interviews you shoot, the MMA school is going to be psyched with the promotional video you film, etc. As long as it's in focus, the colors are okay, and it's framed well, these folks aren't going to care if it's a little noisier than the R5II or if the rolling shutter is slightly worse than the A7RV. I don't know how it is where you all live, but there are literally people making money using cheap Canon Rebel DSLRs and kit lenses in my area. I see friends post their wedding pictures, their kid's senior portraits, baby pictures, and all of that stuff on Facebook all of the time. Most of the time these photographers aren't even good at what they do, but people I know still go crazy over them and proudly post the photos they paid for on social media! These photographers still get paid work, not just because they are cheap (that certainly helps!) but because the average person's standards aren't all that high. That's not to say that we should lower our standards, just that we should remember the big picture (no pun intended) and stop worrying about the small things that aren't going to matter to 99.9% of our clients/audience.
    1 point
  13. Might be a bit of survivorship bias here. The older movies that were shot on film might seem to be of a nicer IQ standard, but those are the ones that are still acknowledged. As a dude who went to the local $1 'grindhouse' theater rather regularly as a kid, I assure you that the image quality of the forgettable films was often nothing remarkable. However, I will say that the darker, deeper, contrasty look that was in fashion among better cinematographers back then is something I miss. Less is more. Too much detail in a scene can be a detriment at times. All that dynamic range often is not needed. Spielberg's West Side Story looked remarkable and like shit simultaneously, imo.
    1 point
  14. This is a good piece about how it was used.
    1 point