Everything posted by Axel
-
@jonpais Regarding Wolfcrow: he doesn’t seem to have fully understood HDR. I like the phrase ‘old wine in a new bottle’ though, because you can judge wine by its vol% of alcohol (~ resolution) OR by more complex parameters. If one day we record the same scene with a contemporary DSLR and with a camera that fully covers all rec2020 requirements, we will agree that it’s new wine in new bottles.

@Vesku A deeper black actually doesn’t help to distinguish more shadow nuances; for that you’d need brighter dark greys and MUCH brighter highlights. On the recording side you have to avoid true black anyway, because right above a hypothetical ‘no signal’ stretches a zone of BAD signal, which you usually don’t record (ETTR).

@Kisaha Yes, it’s not affordable for me. Mapping any rec709 or 8-bit LOG video to 2020 already shows the limits in the waveform. And even though I have no HDR monitor yet, I can see how little dynamic range then remains for the midtones. The closest I have are some old raw recordings from my Pocket (12-bit), but all the artifacts (moiré, color fringing) are accentuated as well.
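For anyone who would rather see those "limits in the waveform" as numbers: a minimal sketch of my own (not taken from any grading tool), assuming linear light and the standard BT.709-to-BT.2020 primaries matrix from ITU-R BT.2087. A fully saturated rec709 red only reaches about 0.63 on the 2020 red axis, which is why a remapped 709 clip never fills the 2020 gamut.

import numpy as np

# BT.709 -> BT.2020 primaries conversion (ITU-R BT.2087), applied to linear RGB
M_709_TO_2020 = np.array([
    [0.6274, 0.3293, 0.0433],
    [0.0691, 0.9195, 0.0114],
    [0.0164, 0.0880, 0.8956],
])

rec709_red = np.array([1.0, 0.0, 0.0])   # fully saturated 709 red, linear light
print(M_709_TO_2020 @ rec709_red)        # -> [0.6274 0.0691 0.0164]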
-
That’s the correct path for XAVC. Either this is a total mystery, or you accidentally shot something in AVCHD; you might check that folder as well.
-
How do you access the card on the computer? Do you manually navigate down the folder tree, and if so, where do you expect to find them?
-
That’s reasonable. There are more reports about strange behaviour than usual. The main reason to upgrade would be H.265 support. I will wait for the next update with bugfixes and then check on fcp.co whether they have indeed been fixed (because my main concern is video with FCP). No one is forced to hurry. As for special software, it’s best to check the ML forum.
-
An "impossible" shot. Often meant to impress the audience. There is a danger in that, insofar as you (as one of my film school teachers eloquently put it) feel the presence of the author. That's unavoidable; it can be a bad thing, but not always. If you want the audience to experience the experience of an OOBE, your modus operandi needs to be established first. If you just want them to understand, oh, this is meant to be an OOBE, a sequence of crude tricks will do (showing the person flying up to the sky could be seen as crude, no matter how realistically, surrealistically or hyperrealistically it's done. Unless it's a dream). But for immersion? In 2017? Did you see Enter The Void? The hero smokes DMT and has visions. Everything is his POV. Then he is shot dead. He sees his dead body, and from then on the camera can (and does) move freely. The film isn't easy to watch for many, but for those who can, it's very immersive.

Sounds reasonable. The cut through the roof is difficult/dangerous. For something to feel continuous, it often needs to be interrupted.

You didn't say what kind of OOBE you want to show. People who were once clinically dead report strict POV. They can see themselves lying on the hospital bed, they see mundane details of the room, a pretty sober affair, even though they float in mid-air, which obviously is spontaneously accepted (or imagined?) as the normal way a disembodied mind/soul behaves. Elisabeth Kübler-Ross interviewed hundreds of them. They came back, so of course they were not really dead. Then there is the whole complex of OOBE through drugs, meditation, lucid dreams and so forth. Carlos Castaneda's The Teachings Of Don Juan also implies POV. Everybody occasionally dreams that he/she can fly. Does this count as OOBE? I don't know. What I do know is that these dreams are never strictly POV; no dream is. My own experience is that you think, this must be a dream, I can't fly! As if to prove you wrong, the dream shows you above the fields and houses. You don't want the dream to stop, because it's such a good feeling (you can imagine what Freud said about those dreams). The immersion then would be triggered by empathy more than by FX: how the actor radiates ecstasy, say.
-
@markr041 I personally thank you for pioneering. I can't monitor HDR right now, but it's good to know about the intricacies beforehand. You took the trouble and tested it, that's awesome! @maxotics Very interesting read. I absorb every word and read your posts several times in order not to miss anything. @jonpais You always find the right words and explain very well. What I like most about your postings are your excellent videos, which show your good taste and commitment to beauty. I am glad I started this thread. Invaluable information, great forum. Thumbs up for all.
-
I'd postpone any investment in editing/monitoring hardware until it gets a) cheaper and b) good enough. Even the 27" Dell HDR (~$2000) covers only ~70% of rec2020 and is deemed merely "HDR ready" by most reviewers. Imagine you're planning an ambitious short film to be shot next spring: do you need to buy an Ursa Mini (I wouldn't: too big and cumbersome)? Or at least a GH5? I am really worried. How BAD will rec_709 videos look in comparison?
-
A year has passed, and I came by a small town's consumer electronics shop, stuffed with the usual overpriced TV sets. The three biggest screens in the window front carry the HDR label, their prices reduced from 3000€ to 1800€.

HLG is another aspect that wasn't widely discussed a year ago. It can/should be 10-bit, but obviously not necessarily (see the Sony A7riii announcement). Allegedly, broadcasters will use this as their HDR standard because of the manageable bandwidth.

HDR10 (meaning 10-bit, a billion colors) is limited to 1000 nits. Netflix has adopted it, and so has the iTunes Store. DolbyVision demands 12-bit (68 billion colors) and has had experimental support from many companies. It supports playback on screens from 600 to 10000 nits, but right now it's unlikely to win the race, except for cinema distribution. Youtube currently supports PQ (= HDR10) and HLG in rec_2020 (P3, a slightly wider color space than rec_709 and currently the standard for DCPs, is not supported). Getting this right seems to be quite a hassle.

Apple announced HDR for their upcoming FCP 10.4 update. Since FCP already supported 2020, this probably means they will offer standardized sharing options. And although only a minor percentage of video enthusiasts/prosumers use FCP, this could be another nail in the SDR coffin.

Today, I would add another possible answer to the poll: ⎕ I'm already worried ...
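For anyone wondering where those color counts come from, it's just bit-depth arithmetic: three channels at 2^n levels each.

10-bit: 1024 levels per channel, 1024^3 ≈ 1.07 billion colors (HDR10)
12-bit: 4096 levels per channel, 4096^3 ≈ 68.7 billion colors (DolbyVision)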
-
Slowly but surely, HDR is being noticed by content consumers. I was in a shopping mall two weeks ago and they had a gigantic HDR TV, surrounded by SDR sets with the same video playing: partly night-sky timelapses (on the other screens the skies appeared grey, the stars faded), partly landscapes with sun (looking like flat LOG clips on the SDR TVs). The audience very clearly saw the difference. As Yedlin had *proven* in his resolution-myths-debunked video, they couldn't have told the difference between HD and UHD. But there is no agreed-upon standard, as I understand it. How will an 8-bit rec_709 video, no matter the resolution, look on a 3000-nit screen? Will today's solutions ("HDR ready") be considered foul compromises very soon (like the early HDV cameras that were little more than "SD plus" if you look at the images with today's eyes)? Or will the said foul compromises be accepted as just enough to make a visible difference, become the standard for a long time and allow, for instance, 8-bit 420 to survive another decade? I'd really like to know.
-
Yes, jonpais. I really don't know how yet, but processing for and delivering in a widely accepted HDR format (e.g. for Youtube uploads), whatever that takes, would change a lot. And the change is in the air, like Galadriel said. Surely I would buy any book or online training that can help me understand.
-
Nevertheless, we hardcore FCP X fanboys have been told that the little color board was enough for six years!!! We finally believed it ourselves; Stockholm syndrome. If color swatches and HSL qualifiers/curves were all (besides the rather exotic VR stuff), it'd be a lame upgrade this year. The majority had these things with CFP or Chromatic already. This really is pure speculation, but another feature was parenthetically mentioned, which was "HDR". There is no question that Apple is interested in pushing this technology and making it easier to produce. My guess is that the crude color board indeed isn't enough for HDR anymore. 8K RAW would then be the buzzword to impress audiences at the show, but HDR, which is much more about quality than pixel counts, would be the actual news.
-
Quote from Mac Rumors: But I guess it'd still be stupid not to use ProRes (even on December's $5000 iMac Pro) if HEVC is your acquisition codec. It's more interesting to export to it directly from the NLE for distribution. And unless you edit in iMovie, you'll have to wait another week for the new FCP X version:
-
Question mark, no image.
-
Yes, I see it exactly that way. I don't say it's bad storytelling or awkward or pointless, on the contrary. But if you try to count the minutes (or screenplay pages) this "film in film" takes and to fit it into any of the narrative structures that modern script gurus teach ... There is a similar, er, secondary storyline in the third season of Six Feet Under. I thought to myself, when will this bullshit stop? When is this going to make sense again? And only later, much later, did I realize: wow, this is what's happening to everybody all the time! Is it because you lose track, become a stranger to yourself? Or has there never been a track, a goal, a personal destiny in the first place? One of the two answers is wrong, a lie, sometimes a life-long lie. And both questions are equally disturbing. On the three-act dogma: apparently, a story doesn't need an end. Happily ever after seems to be a formal way to end a story, but isn't it really nothing but "who cares"?
-
Some thoughts about feature lengths in general. As I see it, the 2-hour, 3-5 act drama of classic cinema is dead (perhaps some day someone will successfully resurrect it). Short films and ultra-short films have proven to be more inventive in how they tell stories, influenced of course by all the information and emotion that needs to be packed into a short TV ad. See this as an example. It takes longer to describe it in words than the clip runs. And words are thin. Then there are the epic series, inspired by the classic TV miniseries. These need to be long. The suspense is driven by character development and the promise that there will be interesting conflicts between the characters. There almost can't be too many storylines. The usual plot-point recipes (of the kind that Syd Field once "found") don't apply. As an example, watch The Godfather I-III all in one evening. You will notice that the whole section where Corleone visits Sicily, falls in love and so forth actually leads nowhere if you isolate that part of the trilogy. But without it, the whole tragedy would have had much less impact. Let your characters be strong and interesting, and make the audience care about their fate.
-
We are all comparing apples with oranges. As a PS and AAE subscriber, I can test a new Premiere version for 7 days, and I did. Last year I made the double mistake of importing 4k XAVC using cmd + i, and the performance was ridiculously poor. This year I hastily tested just three things, and all seemed to work flawlessly:

> I imported a 20-minute timeline from FCP using XtoCC (a little tool that modifies the XML so that as many things as possible get translated; it's also frequently updated). It played back well.
> I tried the proxy workflow from a camera card, using ProRes Proxy as the proxy codec and activating proxies in the preferences. A breeze.
> I imported CDNG files from the Pocket using the Media Browser and right-clicking and choosing Import. They appeared instantly as regular clips. I don't know, though, whether they are actually read as RAW and whether Lumetri interprets them correctly.

I admit these are no exhaustive tests, but the first impressions (two afternoons spent with Premiere) somewhat corrected my view.

OPINION:
1. 30% of all complaints about performance and stability have to do with the configuration of my system and whether the software in question is optimized for it.
2. 30% have to do with the user not reading the manual.
3. 30% have to do with using original, highly compressed UHD media for editing.
4. 10% have to do with the software being crap.

For point 1, FCP seems to be the winner from the start, but Resolve has done some serious optimization for macOS. Point 3 is valid for all three competitors. Adobe always bragged about being able to work with native media. As a matter of fact, UHD needs four times the horsepower of HD, and imho it's simply silly to quadruple your system specs just to be able to keep up. These codecs are not meant to be edited, and proxies have historically been the pro solution for editing. Furthermore, it seems Premiere now has the best proxy workflow (transcoding runs in the background with AME, it's reasonably fast, and you can start editing in the original codec without being forced to toggle like in FCP or having to wait until the foreground process is done like in Resolve). FCP's strength is optimization through optimized media (ProRes), which not only runs unnoticed in the background but is also the fastest of them all. But just for the sake of proving point 3 to be universal, try deleting all generated media (or whatever it's called exactly) in the middle of a long UHD project, and you will see how much remains of the often-praised superior performance and stability. Now, for anybody seriously considering jumping ship: you'd better evaluate by checking these factors.
-
I suppose everybody here knows Gregory Crewdson (google images), whose photographs are really staged like major Hollywood productions.
-
More romantic, captures a mood. Edo shots look more staged, if you know what I mean. A matter of attitude towards the motif. Impressionistic vs. expressionistic.
-
Dramatic, expressionistic lighting, absolute depth of field, almost black & white. Our physiological night vision (scotopic vision) shares some of these characteristics, but not all. Although our irises are wide open, we don't see things out of focus, because the rod cells are not concentrated in the fovea but spread over the whole retina (like here). But we would see light sources in full color; it's like "Lum vs Sat" in Resolve. Nice, I prefer this over the modern "night for day" approach that modern sensors and their low-light capabilities provide. Thanks for sharing.
-
I like everything on B. The tomato on E doesn't look like a tomato at all; bluish cast. I generally don't like overly saturated colors, although I saw Speed Racer three times.
-
Standard? No post? Impressive. I must give Standard another try. What turns out not to be so neutral are red and green. Though it looks pleasing at first glance, you might notice (in your beautiful videos as well) that areas with these colors look too uniform. That means the naked eye would see less saturation and more luma detail there. You can test this by placing a plant and some red, pink and orange objects next to your monitor, recording them with correct WB and then comparing the image on the monitor directly with the colors before your eyes. You will see this even if your screen isn't perfectly calibrated. Unfortunately, this also means skin tones are simplified. That said, there definitely was an improvement if I compare my A6500 shots to old A7s shots.
-
People miss one thing here: the dawning of HDR. They say the display is capable of HDR (whatever that means). HDR metadata can be embedded in HEVC (check). The new Apple TV advertises "4K HDR" now (check). The 2017 iMacs deliver 800 nits (check). High Sierra will use HEVC as the new standard compressor (check). As I understand it, right now there is no real standard for HDR. Apple may have made the decision.
-
Thanks for sharing. I've only just read the headlines/titles, and I'm already inspired. Think I saw that on Netflix. Will check it. Your exposé has interesting implications. Dreams, uncontrollable impulses, privacy vs. society. Looks high-budget though.
-
This will be very profitable for BM. I don't know if it was very wise to cripple the performance of the free version (no Quick Sync support there), but I guess many will start to see the advantages. 300 bucks (once and for all, no subscription, no dongle) is a price point that won't scare off many. If they continue to improve everything and add features with the same energy they've shown in recent years, it will mean hard times for the competitors.
-
Yeah, but framing a talking head should not take as long as stumbling through the wilderness and slipping over glaciers whilst trying to find the right height and angle to shoot a landscape. I wouldn't sell my old-fashioned tripod.