
Leaderboard

Popular Content

Showing content with the highest reputation on 07/01/2024 in all areas

  1. Yeah, depends on what you really want to accomplish. I'm a doc guy, so my first thought is: capture it or it didn't happen. Without story, well, what are you looking at? At least that's my tack. If the resolution or skin tones are less than optimal, I'll cope with it later, but at least I got it.
    4 points
  2. Shooting with the S5II a few days ago with the 20-60mm, I noticed some weirdness from the IBIS in terms of distortion, and wondered whether it's interacting (badly) with the lens correction for barrel distortion. These are two consecutive frames with a tiny horizontal change which the IBIS is correcting, but the correction is causing quite a distortion on the edges. It's subtle, but it's very noticeable on the building on the right. However, when you do a difference action between the two frames, you can really see how it spreads the distortion from the centre (where it's barely anything) outwards.
    2 points
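The "difference action" between two consecutive frames can be sketched in a few lines of NumPy. This is a hypothetical illustration (array shapes and values are made up, not the poster's actual frames): the absolute per-pixel difference is near zero where the frames match and grows where stabilisation/correction has warped the image, which is how the distortion spreading from the centre outwards becomes visible.

```python
import numpy as np

def frame_difference(frame_a, frame_b):
    """Absolute per-pixel difference between two same-sized frames.

    Near-zero values mean the frames agree; larger values reveal where
    a correction (IBIS + distortion compensation) has shifted pixels.
    Cast to a signed type first so the subtraction can't wrap around.
    """
    a = frame_a.astype(np.int16)
    b = frame_b.astype(np.int16)
    return np.abs(a - b).astype(np.uint8)

# Toy example: identical in the "centre", shifted along one edge
a = np.zeros((4, 4), dtype=np.uint8)
b = a.copy()
b[:, -1] = 10          # only the right-edge column differs
diff = frame_difference(a, b)
print(diff[:, -1])     # edge column: [10 10 10 10]
print(diff[:, 0])      # unchanged column: [0 0 0 0]
```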
  3. Check it with ExifTool: you'll see the file has a tag called "Panasonic Semi-Pro Metadata Xml" (at least that's what I see on the S5IIX). It's actually an embedded XML document with an element called CaptureGamma, which reads "V-Log" when the clip was shot in V-Log.
    2 points
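If you want to script that check, you can dump the embedded XML with exiftool and read the CaptureGamma element. A minimal sketch, with two loud assumptions: the exiftool CLI tag ID (`-PanasonicSemiProMetadataXml`) is a guess derived from the displayed name (check `exiftool -s` on your own file), and the sample XML below is a simplified stand-in, not the real Panasonic schema:

```python
import subprocess
import xml.etree.ElementTree as ET

def capture_gamma(xml_text):
    """Return the text of the first CaptureGamma element, ignoring namespaces."""
    root = ET.fromstring(xml_text)
    for elem in root.iter():
        # Strip any '{namespace}' prefix before comparing the local tag name
        if elem.tag.rsplit('}', 1)[-1] == 'CaptureGamma':
            return elem.text
    return None

def capture_gamma_from_file(path):
    """Shell out to exiftool (must be installed); tag ID is an assumption."""
    out = subprocess.run(
        ['exiftool', '-b', '-PanasonicSemiProMetadataXml', path],
        capture_output=True, text=True, check=True).stdout
    return capture_gamma(out)

# Simplified stand-in for the embedded metadata:
sample = '<ClipContent><Video><CaptureGamma>V-Log</CaptureGamma></Video></ClipContent>'
print(capture_gamma(sample))  # V-Log
```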
  4. MrSMW

    Lumix and...Sony?

    And speaking of having the gear, wedding a couple of weeks ago, band turns up and sets up. I ask them if I was correct in observing they did not bring any lighting. At all. And it's a pretty dark room with near zero lighting and what there is is hardly 'disco' lighting. They confirmed I was indeed right and with these "overseas gigs we don't have enough room in the van for lighting". OK. You told the couple this? Well, no... Well luckily for me, I have 2 bigger plugin lights on tall stands and 2 smaller lights with 'disco light' settings and so for the duration of me being there, ie, the first set, there was lighting. After their first set, I cleared up and cleared out. What they did after that, I have no idea, but it probably wasn't good...
    2 points
  5. I went on vacation and didn’t bring my GH6; I used my iPhone 15 for photos and video instead. Even if I had brought the GH6, I would not have been prepared to shoot my friend’s kiteboarding lessons from a lawn chair under a palm tree on the beach with a 15x zoom that I didn’t think to bring. Could I maybe have captured it better with the right lens on the GH6? Sure! Did he care? Nope, he liked it so much he asked me to AirDrop the videos to his iPhone as we were driving to the airport. For me, getting the shot is as important as getting the perfect shot, and in the near future just getting the shot will lead to good-enough results with the help of AI. With less demand, it’s inevitable that lens prices will fall, until of course there comes a time when no one makes lenses anymore, ten years go by, and the prices of lenses start to go up and up…
    2 points
  6. Wasn’t really supposed to be an issue and also… skaters.
    2 points
  7. I think the color science that happens in camera is really important. I'm also a believer that the image nearly always needs to be "developed" in post, but that development is largely empowered or constrained by the color science out of camera. That being said, I think all the major camera companies have good color science these days. Any nuances between Canon, Sony, Lumix, or Nikon can easily be massaged out in post if the starting image is good.
    1 point
  8. It's a big "it depends", based on lots of factors:
    - Some scenes are more difficult technically to capture than others
    - Some scenes are more difficult aesthetically to reproduce than others
    - Some differences can be compensated for in post easily (e.g. small WB differences, skin tone hue rotations, etc)
    - Some differences can't be compensated for (e.g. skin tone smoothing, quantisation issues like an 8-bit log codec, lots of non-linear processing)
    - The stronger the grade you're going to put on it, the less it matters
    - The more skilled you are in post, the less it matters
    - The more powerful the tools you use in post, the less it matters
    - The better the camera's colour profiles are, the less it matters
    - The less picky your audience is, the less it matters
    - The less saturated the final image, the less it matters
    - Etc
    The problem with discussing it is that on the open internet, the only two opinions anyone seems to understand are "it is the only thing that matters" and "it doesn't matter at all", and those who dare to look in the middle ground can't tell which of the strange things in the test images belong to which camp - easy, difficult, or impossible to fix.
    1 point
  9. I use MediaInfo. It will do exactly what you need. It's been super useful for identifying random video clips from other videographers at my day job.
    1 point
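MediaInfo can also emit machine-readable output (`mediainfo --Output=JSON clip.mov`), which makes identifying batches of mystery clips scriptable. A minimal sketch of pulling the video track out of that output; the sample below is an abridged, assumed version of MediaInfo's JSON structure and field names, so verify against what your MediaInfo version actually prints:

```python
import json

def video_tracks(mediainfo_json):
    """Return the Video tracks from a MediaInfo JSON report string."""
    tracks = json.loads(mediainfo_json)['media']['track']
    return [t for t in tracks if t.get('@type') == 'Video']

# Abridged stand-in for `mediainfo --Output=JSON clip.mov` output:
sample = json.dumps({
    'media': {'@ref': 'clip.mov', 'track': [
        {'@type': 'General', 'Format': 'MPEG-4'},
        {'@type': 'Video', 'Format': 'HEVC', 'Width': '3840', 'Height': '2160'},
    ]}
})
for t in video_tracks(sample):
    print(t['Format'], t['Width'] + 'x' + t['Height'])  # HEVC 3840x2160
```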
  10. He also said the IBIS in Open Gate and 4K has the same performance, but in his GH7/G9II video he only tested an 8mm lens, and even my A7IV has not-so-bad IBIS at 16mm. Owning the GH6 and G9II, the difference in IBIS performance is obvious while walking with most lenses.
    1 point
  11. As a fan of colour grading, I'm the first one to promote the idea that footage SOOC is like a film negative - it's yet to be developed in post. However, while there are some things you can adjust in post to improve the camera's colour science, like WB or hue shifts (which people get triggered about all the time), you can't (without AI) make the footage higher quality. The richness of 5D ML 14-bit RAW can't be created out of the 8-bit 709 images from my GX85 (believe me - I've been trying for years!). BUT BUT BUT BUT BUT.... without lighting this is a BS pointless discussion. (and without anything interesting to point the camera at, lighting is polishing a turd) .....((and without a story to tell, glorious compositions won't even keep me awake)).....
    1 point
  12. A few Oly/OMDS lenses have OIS and support Sync-IS (their version of Dual IS), e.g. the 12-100mm F4 and some of the long and expensive telephoto lenses.
    1 point
  13. I just watched the OM-1 vs G9II IBIS-only comparison, and although the OM-1 looks better stabilised than the G9II the majority of the time, there are places where it's the other way around, so goodness knows what the different settings were. However, I've experienced the difference between IBIS only and Dual IS on both my GH5 and GX85, and the difference is night and day - potentially enough to make the G9II the clear winner. If you're doubtful, think about the difference between having no stabilisation at all and adding OIS - that's the kind of difference it is. The G9II might also have the extra mode the GH5 has where it removes all motion, which is enormously more stabilising than the normal IBIS mode, so that would be my preference in this situation. In terms of him doing the test and not telling us because he doesn't want to embarrass anyone, well, you might be interested to know there's lots of things about aliens that the government hasn't been telling us either...
    1 point
  14. Skin tones and color are inherently correlated. Different races have different skin colors and need different IRE levels to expose right, and the sensor and the color filter array also need to be optimized for these. Very complicated. In the future, I will not be surprised if some cameras are designed for white people, some for black people, some for Asian people. This is a huge market. I think in the film era, Kodak's Portra film stock was optimized for white people and Fuji's Provia was optimized for Asian people. Maybe there were film stocks optimized for black people, etc.
    1 point
  15. You are right, he did not compare the OM-1 to the G9II extensively. I suspect that he did the test privately but did not want to publish the results and hurt Panny sales. The OM-1 can pair with some lenses with OIS, like the 12-100mm F4 and 300mm F4.
    1 point
  16. Every time I see a blind camera test, I'm looking at the colours. It's not something I decided to do, or have to focus on; it's what I naturally find myself looking at. Especially skin tones. In all the blind tests I've done - where I've done them blind, taken notes, then scored and ranked the cameras before looking at the results to see what was what - I correctly chose things in descending order of price. There was one test where I put the Alexa lower than cheaper cameras, but it had a strange green tint that the other cameras didn't have, so I suspect something was wrong with the test. The colour and texture of skin tones is about 40% of the grade, I think. Drastically underrated, and barely discussed with any depth.
    1 point
  17. I did the same, but the opposite: I took a GH6 on holiday. Why? I care about the actual files I create. I don't want to look back on them and say "I wish I had taken my camera", and I don't see that feeling going away anytime soon either. Sure, phones are getting better, and in a pinch, why not? But when I can, I'll take the camera for its quality, better shooting experience and its authenticity (no AI crap going on that I have NO control over), not to mention I think I look like a douchebag when I use my phone.
    1 point
  18. I took a look at my 5D3 ML footage - 5.7K anamorphic or 3.5K crop mode, pixel-by-pixel, 14-bit lossless ML RAW. I cannot help smiling. The color is the best, and the resolution is good enough even by current standards. Only the F3 with a Zeiss lens can reach a similar level in terms of color. Color, color, color.
    1 point
  19. "Grabbin a camera, going out and having fun with your friends" (at 43min 22sec). Great watch. Reminds me a lot of my video tape days, makes me wonder about my 15-year-long break from video back then, makes me wonder about so many things. In 2015 I got back to video with hugely improved tech, and I've had my waves of momentum since then. I still have my old S-VHS camera. It shall be a reminder of old dreams, and of new dreams to go for. Best and cheers
    1 point
  20. PannySVHS

    Nikon buys Red?

    True. I would like Panasonic to resurrect the Varicam brand name and call their flagship DSLM something like the Varicam VH1: internal electronic vari-ND, SDI, a dual-gain sensor - a cinema camera the size of an S1H. It could also be used to take photographs. :) Signal processing as pure and raw as the OG S1H's. The DOP of the S1H Cannes film used a Sony Venice on his current Cannes 2024 entry, with the S1H as B-cam. So a traditionalist to the craft, but an open-minded one to the tech and art of cinematography.🙂
    1 point
  21. majoraxis

    Is ProRes RAW really RAW?

    I had an URSA Mini Pro 4.6K that shot CinemaDNG and BRAW (a firmware upgrade discontinued CinemaDNG). The BRAW was slightly less resolved, with a slight color shift compared to the CinemaDNG. Once color corrected, I could not tell which shots were which on YouTube. From the online comparisons between BRAW and ProRes RAW, it seems that ProRes RAW has a finer noise structure, which makes it slightly better looking IMHO. With BRAW at higher resolutions, like the URSA 12K shooting at 8K, I have no complaints unless I don't provide enough light; then I see the slight limitations of BRAW as compared to ProRes RAW.
    1 point
  22. Thpriest

    Lumix and...Sony?

    Shit! That’s a real bummer on a job! That’s when clients are glad they have hired a pro: there’s always a plan B or C, and more kit.
    1 point
  23. Canon G1X, which was their largest-sensor compact ever at the time... a Micro Four Thirds-size sensor. Example shot is from RAW, no AI...
    Canon G12, typical small sensor (1/1.7"), RAW, no edits.
    Same RAW from the G12, but with the Adobe Camera Raw AI optic corrections activated: can you tell it apart from the G1X any more? The difference is now virtually unnoticeable.
    1 point
  24. It didn't slow down much - have you seen what generative AI is capable of when you feed it a source image? The progress in just 2 years is huge. The processing power of generative AI is in the cloud, which makes it an app, so it is a very short leap to embed this in the default iPhone camera app or the default Samsung camera. The question is financial: who pays for the cloud processing, and how much? When do smartphone GPUs reach the threshold required to do this processing on the device, rather than it needing a subscription?
    2018-2023 is only 5 years of debate, and it saw 1" sensors become mainstream on flagship smartphones, often 100MP+, with a readout architecture fast enough for 6K RAW video with no crop. They don't need to go to APS-C or even MFT-size sensors in smartphones, because generative AI is here. They don't need triple-gain sensors when they have a full sensor readout at 240fps. The current multi-shot HDR algorithms work very well; the dynamic range of a years-old Huawei is close to a Fuji GFX 100 RAW file.
    There have been a lot of improvements to the optics too. Sony have a folded variable periscope zoom which is tiny and thin. Apple have a mass-market folded optic, a 135mm-equivalent prime lens in an iPhone, which would have been unthinkable 10 years ago. This is an incredible lens, capable of stunning results. The advancement in cinematic video is also very large in the latest models, and the ability to simulate any full frame lens is already possible with software, let alone generative AI.
    I disagree. If you feed a 1"-sensor Xiaomi 12S Ultra image from a few years ago into Adobe Camera Raw and apply the AI optical corrections, you basically have the look of full frame and can't tell it apart. The dynamic range is there, the resolution is excellent, and the main lens is fast and capable; the telephoto also. And you can shoot raw DNG. This overcomes anything and everything you don't like about the smartphone image processor for taste reasons. The DNGs benefit, like the HEIF files do, from the multi-sampling, quad-Bayer, multi-frame HDR, and contain a fantastic amount of image data - even on a relatively modestly priced Pixel 6 from 4 years ago. The more you shoot RAW, the more you realise that the hardware is really fucking good, and that the processing built into the low-level hardware and sensor output can be as natural as a mirrorless camera's. Then, once you have the colours to taste, you can apply AI optics on top in ACR.
    1 point
  25. There was this debate that smartphones would get better than ILCs by 2020-22 or so. Suddenly, smartphone sensor and computational photography/AI development slowed down, almost plateauing completely - partly the trade embargo on Huawei, and Samsung and Sony mysteriously abandoning the triple-gain sensors they had presented papers on. Also, there were some talks of improving optics on smartphones much beyond the plastic lens optics currently used by the majority of the market. Somehow all of it was abandoned, and smartphones never became equal or much closer to ILCs. While for dynamic range, colour accuracy, low light and a few other parameters they can be great, in real pixel-to-pixel detail and exposure latitude they aren't comparable. Until they improve computational photography/AI with the intention of replicating detail at pixel level comparable with ILCs, this seems to be something a bit in the future. While Topaz Labs and many others do it, including increasing resolution and even increasing bit depth, I wonder if it's actually as good as something shot on higher-resolution (above the 12MP on smartphones) ILCs (24MP and above) with good-quality glass. Smartphone photos, for now, only look good on small smartphone screens; on larger displays/monitors they look strange (semi-watercolour, pixelated).
    1 point
  26. Haha, that clinical eye of Kye! ;) Well, I think the question is more about what we can identify as usable/acceptable or not, isn't it? :-)
    1 point
  27. MrSMW

    Lumix and...Sony?

    Carnage on Shoot 01. The second Rode WG2 was DOA - at least I knew about this in advance… Within the space of 10 mins, I damaged my video workhorse Sigma 28-70 and totally destroyed my stills Tamron 28-75, plus an A7 baseplate. Managed to soldier on with the Sigma for video - I could have switched to a 20-60, but it's going to need to go in for repair ASAP. Just had to order a new Tamron, and then we'll see if this one can be fixed, as the lens came clean off the mount. Had to switch early to the 10-40, which is a bit wider than I'd have liked, but hey ho, nothing like a baptism of fire. The damage was all caused by the Cotton Carrier system, and the first thing when I get back will be switching back to my previous twin setup of harness plus Spider Holster. The harness system means you cannot drop a camera; the Spider Holster takes the weight off your shoulders/back and onto your hips, and stops the cameras swinging around as they tend to with a harness. Lessons learned 🤑 Otherwise, GREAT setup.
    0 points