Everything posted by HockeyFan12

  1. I'm not sure what you mean. Wasn't there just a DIY lighting thread? I didn't mean to brag about anything: attaching LED strips to velcro is pretty simple, and anyone so inclined could do it. Same with taping together four 2x2 Westcott lights into a 4x4. I'm just glad there are alternatives that don't require DIY. If I was bragging about anything, it's that a few DIY ideas I had are showing up in commercial products. But mostly I'm just glad to learn they'll be available for rental!
  2. I've never used FCPX, so I can't speak to that. It's been my experience working in offline edits (generally in Premiere; I assume the transcodes were done in Resolve by the DIT) that ProRes 422 Proxy footage is unusable for anything except preview. Fwiw, it's always been Alexa footage, sometimes with a rec709 LUT applied, sometimes in log. Perhaps the Alexa, because it's fairly noisy and flat, is particularly unsuited to that codec, but generally I find its image quality excellent. I can't speak as well to LT, but I don't remember it looking particularly good either. (I sometimes forget whether the proxies were 422 Proxy or 422 LT, since I think it varies by post house.)
This is more anecdotal, because I think Atomos' early recorders may have had a poor ProRes implementation, but generally I found the image quality from external recorders, compared with AVCHD out of the camera, to rank: 422 HQ > AVCHD > 422 > 422 LT. That's with the caveat that while AVCHD edged out 422 overall, when there was a lot of camera motion or a lot of moving foliage, AVCHD was significantly more susceptible to macroblocking. So for many people 422 would be better than AVCHD, depending on what they shoot. Below that, LT was clearly worse. I was very surprised by this result, since 422 is considered good enough for broadcast (some network shows used to, and still might, shoot 422 instead of 422 HQ to save space) and AVCHD isn't. To be fair, the difference was very small, whereas 422 HQ was a lot better than either. I don't know if that's due to Atomos having a poor implementation of the codec (I suspect it is), but generally I find the thinner flavors of ProRes to be quite poor and Proxy to be unusable for anything but... proxies.
Also, you can't tell much by bitrate alone. ProRes is an intraframe DCT (discrete cosine transform) codec and is less efficient than ALL-I h.264, which uses more sophisticated prediction and entropy coding, and that's only touching on the very basics of both codecs. ProRes is built for speed more than it's built for image quality. Again, just my experience. (If you want to compare the flavors yourself, see the sketch after this post.)
Of course it is all up to the client; I agree with you. A while back I worked on a few shows for cable (tier one cable, but still lower budget shows) that seemed to work with thinner ProRes or DNxHD variants than most people on this board would consider acceptable, and the raw footage had substantial macroblocking, whereas prime time network TV seems to be mostly 422 HQ, though standard 422 also seems to get use, or used to. (Most people on this board have wildly higher technical standards than prime time network TV and indie film, closer to Netflix or major studios, which is ironic since for a while streaming had the lowest quality delivery codecs, and it still might.) Still, I would put Proxy and LT both below even the tier one cable threshold, and to my eye they are far worse than AVCHD, which I think is pretty good. I don't know whether or not the BBC accepts LT, but imo they should not. I don't know if their standards are based purely on bitrate or also on subjective impressions, since 50Mbps MPEG-2 is much better than LT, as it's interframe, and I believe that's the lowest they'll accept (they won't accept AVCHD).
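A rough way to run that comparison yourself, assuming ffmpeg with its prores_ks encoder is installed; file names here are just placeholders:
    # Transcode the same master to each ProRes flavor
    # (prores_ks profiles: 0=Proxy, 1=LT, 2=422, 3=HQ).
    import subprocess

    PROFILES = {0: "proxy", 1: "lt", 2: "422", 3: "hq"}

    for profile, name in PROFILES.items():
        subprocess.run([
            "ffmpeg", "-y", "-i", "master.mov",
            "-c:v", "prores_ks", "-profile:v", str(profile),
            "-c:a", "copy",
            f"test_prores_{name}.mov",
        ], check=True)
    # Then A/B them against the h.264/AVCHD original on the same shots
    # (camera motion, foliage) where macroblocking shows up first.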
  3. Lots of cool looking stuff there! I was trying to build my own tiny flexible velcro LED lights, but it seems there are similar, better options here. I was also considering buying four 2x2s from Westcott when they were on sale for $500 apiece to make an 800W 4x4 to rent out. Looks like LiteGear already has an 8x8 version, even better! That's the book light of the future and possibly bright enough for day exteriors, too. It claims 1600W, and while I doubt it has the punch (beam angle) of a 960W MacTech (which gaffers tell me is about equivalent to a 6k HMI through diffusion), the extra wattage might make up for it. I would love to rent that for the right project. I'm sure the price is extreme, but the smaller LiteMats are excellent, so that could be useful even for day exteriors... maybe...
That's a shame that lightning is gone; that's one of the main reasons I bought the light, but I was probably going for more of a strobe effect anyway. To be fair, at that power output (guessing about 10W equivalent) you could just get a bog standard high CRI E26 bulb and flicker it with a switch or button on a dimming device. The brightness would not be that different, whereas "real" lightning is brighter than a 6k HMI, so it would never be useful for that except maybe in dimly lit interiors on an A7S or something. The real issue is the attack and decay envelopes being too hard with LEDs, though some seem to be worse than others... I'm experimenting with that now, actually. For all I know the "strobe" effect on the viola is just as harsh as standard lights, unless you can carefully tune the attack and decay (the sine wave effect looks good, but maybe too soft). Though I hope I'm surprised about the brightness.
And I assume the more of those you buy, the better. A fire gag constructed out of three or four of them could be brighter and more varied. I actually just found two more at $120 used, so I picked those up. Could be useful for fire gags, club light gags, etc., and there's not much else in this price range that is.
  4. Heh... it's even more complicated than that. Canon cameras trigger rec/stop on Atomos devices, which is what I'm using, but don't trigger rec/stop on Sound Devices recorders... I think? So I'd need to see if the Atomos device records the same timecode as the Canon, and then whether it passes that timecode along, with rec/stop handled separately on the MixPre3. I might just have to buy one and return it if I can't get it working.
  5. In my experience the image is much worse than h.264 variants, including AVCHD and Canon dSLR codecs, but the performance is excellent. It's a proxy codec, meant for offline edits, and I wouldn't use it for anything else. It's not as efficient as h.264 and the image quality is much worse; it's built for speed. Still, it's usually good enough to judge whether footage is in focus, so if you shoot with a built-in LUT, nail exposure, and don't expect to grade or otherwise manipulate the footage, you could get away with it for certain content.
  6. Hi, I noticed that the MixPre3 syncs timecode with the C100 via HDMI (which is amazing!). What I want to know is whether that would still work if I ran an external recorder in line with it: camera to external recorder to MixPre3. Thanks!
  7. I don't agree entirely on the semantics, but I don't understand all the details there anyway, to be fair. On the overall message, I agree.
  8. I'm not so sure I agree with that, either. I agree that 12 bit vs 14 bit raster video should be irrelevant... both are more than good enough if the range of values in the recording is compressed into that space. Even for HDR it's way more than you need. And I also agree that if 12 bit raw is cutting out two bits in the shadows rather than compressing the range of values into a smaller space, you're losing information permanently in the shadows, not just losing precision (which matters much less). Of course, the 5D has such noisy shadows (maybe 11.5 stops DR total) that it's possible 12 bit is still effectively identical to 14 bit, since those last two bits are entirely noise... but 10 bit would almost certainly mean losing actual shadow detail, probably cutting your dynamic range hard at 10 stops. (Unless the scene is overexposed or ETTR with low scene dynamic range.)
The rest I disagree with. With a bayer array, each photosite only represents a single R, G, or B value anyway, because it only has one color filter on it. With the 5D, it's a 14 bit grayscale value that's recorded for each pixel. And that 14 bit grayscale value isn't interpolated directly into three 8 bit values for R, G, and B; it can't be, because there's literally only one color represented there, either R, G, or B depending on the color filter. When that value is transformed into an RGB value in the final image, it's through interpolating the nearby pixels, which have different color filters, and while I'm not sure what the exact algorithm is, it's definitely drawing on multiple pixels, each with 14 bit precision in the case of the 5D (see the sketch after this post). So the loss of color detail there doesn't have to do with bit depth but rather with bayer interpolation being imprecise. You're losing resolution through bayer interpolation, but not bit depth. (Which is one reason single chip sensors don't really have "4:4:4" color....)
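To make the interpolation point concrete, here's a minimal bilinear demosaic sketch, assuming an RGGB pattern and using numpy/scipy. This is not Canon's actual algorithm (real demosaicing is far more sophisticated); it just shows that every output R, G, or B value is a weighted average of several full-precision neighboring photosites, so resolution, not bit depth, is what the interpolation costs you.
    import numpy as np
    from scipy.ndimage import convolve

    def bilinear_demosaic(mosaic):
        """mosaic: 2D float array of raw sensor values (e.g. 14-bit), RGGB pattern."""
        h, w = mosaic.shape
        r_mask = np.zeros((h, w)); r_mask[0::2, 0::2] = 1
        b_mask = np.zeros((h, w)); b_mask[1::2, 1::2] = 1
        g_mask = 1 - r_mask - b_mask

        # Averaging kernels for sparse channels (normalized by the mask below).
        k_rb = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]], dtype=float)
        k_g  = np.array([[0, 1, 0], [1, 4, 1], [0, 1, 0]], dtype=float)

        def interp(mask, kernel):
            num = convolve(mosaic * mask, kernel, mode='mirror')
            den = convolve(mask, kernel, mode='mirror')
            return num / den   # weighted average of same-color neighbors

        return np.dstack([interp(r_mask, k_rb), interp(g_mask, k_g), interp(b_mask, k_rb)])

    # One 14-bit grayscale sample per photosite in; full-precision RGB out.
    raw = np.random.randint(0, 2**14, size=(8, 8)).astype(float)
    rgb = bilinear_demosaic(raw)
    print(rgb.shape)  # (8, 8, 3)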
  9. Interesting... I take back what I wrote before. Though 12 bit might still be okay. The 5D Mark III never seemed to have more than 12 stops of DR to me, noisy shadows etc.
  10. Couldn't help myself, picked up a used one for $118. Can you program in animations to simulate flame or a flickering fluorescent or something? I've been looking for a budget Magic Gadget for years. Can you program color loops? Of course the trick with Magic Gadget flame gags is ganging up a bunch of different lights to create some real chaos, usually including a constant base source dimmed way down to a low color temp like embers, so one of these will never replace that, merely supplement it. Aputure's approach seems to understand the need for being able to program in animations; recording a flame and then recreating it on a larger scale is sooooo cool. But even a poor man's version of that would be wonderful, especially for $118 (see the rough flicker sketch after this post).
Lighting with projectors as hard sources and RGB(W) displays as soft sources (look up Sony's Crystal LED technology) is sort of the holy grail of control. This is sort of like a super low res Crystal LED. And until then, Digital Sputniks aren't bad. They are too expensive, though. Hoping the app offers some decent control, not just for color but also for animation. What are your experiences? Edit: just watched the video! So cool.
As for Deakins, I think that stuff is pretty common. I see a lot of batten strips/covered wagons, and when I talked with the DP of the Sopranos he explained that most of their sets are lit by massive grids of 60W (or maybe 100W) incandescent lights on dimmers, at the top corner of each set. The approach is simple: always keep the scene backlit by those, then accent as needed. But you need to be very, very good to dial in the details, and of course the cost of constructing custom lights is often greater than just buying something fast and cheap, and only justifiable or feasible on a big set in the first place...
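For what it's worth, this is the kind of flicker animation I have in mind, as a rough sketch only: the send() function is a placeholder for whatever control channel the fixture actually exposes (DMX/Art-Net, Bluetooth, the vendor's app), and the numbers are guesses, not measured values.
    import random
    import time

    def send(intensity, cct):
        # Placeholder output; a real rig would push these values to the fixture.
        print(f"intensity={intensity:3d}  cct={cct}K")

    def flame_flicker(frames=500, base=90, depth=110, smoothing=0.15, fps=50):
        """Random-walk flicker with a softened attack/decay envelope, roughly ember-like."""
        level = target = float(base)
        for _ in range(frames):
            if random.random() < 0.25:                 # occasional flare or dip
                target = base + depth * random.random() ** 2
            level += (target - level) * smoothing      # soften the attack/decay
            heat = (level - base) / depth              # 0 = embers, 1 = full flare
            send(int(level), 2000 + int(800 * heat))   # brighter = slightly cooler flame color
            time.sleep(1.0 / fps)

    flame_flicker()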
  11. That's true. ProRes would definitely be my codec of choice on any camera, unless I was shooting long events. I used ML raw for a while. Nice footage, but it was a lot more work than I wanted to do for long events and I had reliability problems. I'm honestly surprised Canon added raw to the C200. Probably because the C300 Mk II sold poorly. I guess my point is just that Canon's not going to even try to compete with this because they're aiming for totally different markets. I don't think they're failing, just failing in the enthusiast market. So if it works for you, don't wait up on Canon. BM has really very nice color, too.
  12. I'm actually with you on that. I only own dirt cheap gear, because I'm an enthusiast like most people here, and have limited disposable income. Would I pay more? Not unless I could amortize it in six months of paid work. If I wanted something better for a personal project, I'd just rent, and I probably wouldn't even bother. The entry level is absolutely good enough for anything except Netflix and BBC, but you do have to pick your poison (bad specs for a Canon or a learning curve for BM). I just have different taste in poison. But I don't see it as a rip off with established brands. I see it as a high premium for CaNikon for user experience over specs, and up to the informed consumer to make his or her own choices. Granted, probably only 0.1% of this board agrees with me, and I only really disagree with you in terms of Canon's business savvy, which I think is pretty strong. The rest is subjective. Meanwhile, I see the iPhone as the most disruptive product line and real threat to CaNikon.
  13. Canon is known for consumer-friendly and super reliable, conservatively-spec'd products. That reliability is why they charge a premium. Maintaining that premium requires that most of their cameras are easy to use and extremely reliable. That's their brand. ML raw is not consumer-friendly or super reliable. (Yes, I've used it.) And it requires CF cards that are a bit faster than are widely available for reliable recording, at least it did when the Mark III was first released. It's not something Canon wants associated with its brand, crazy as that seems.
Meanwhile, Blackmagic is known for cutting-edge innovation at all costs, and good price/performance with some bugs. The bugs are the cost of their crazy low prices. And other than their Linux Resolve machines, they have a poor history when it comes to ease-of-use and reliability (think issues with magenta shaded corners, FPN, poorly shimmed mounts, bugs in beta Resolve releases, short battery life, poor ergonomics, etc.).
I don't think it's a matter of one company offering more or one company offering less so much as it's a matter of different segments in a market. For most people on this site, preferring what Canon offers is basically unthinkable. For most wedding videographers, however, a C-series camera might be just the ticket, and every professional photographer or entry-level photographer I meet still shoots Canon or Nikon, often with older lenses or kit lenses. Whereas the A7S or A7R or GH5 is targeted toward a more tech-friendly niche audience. I see a lot of them, but never among seasoned pros (maybe as a b camera) and never among total newbies. That middle ground (enthusiasts) is where the Blackmagic cameras fit in especially well. It's a different market, and I think the market most people on this site find themselves in.
If you're in that market, Canon won't offer what you want at the price you want. Not ever. The pro market is higher margin and the low end market has more unit sales. (I think their market position is actually pretty good, but that's just my opinion; I am terrible at business, unfortunately for me.) Meanwhile, BM won't offer what I want at any price. I'm fine with 1080p, 24fps, etc., but I'm super lazy and don't want a short battery life on any camera that I can't hire a crew to swap batteries for, and a poorly shimmed mount or Resolve issue or SD card dropping frames would ruin my day completely. That might sound crazy, but I just don't care about 4k; in fact I would rather not have it if given the option, because it slows things down or requires an extra transcode for 1080p delivery. (HFR and internal ProRes is cool, I admit!)
So I agree with you, but at the same time I don't think either of us should wait up for something that's never going to happen, nor do I think it's a matter of crippling something so much as marketing something based on specs/final output for emerging brands or based on brand/user experience for established ones. It's very similar to the Red/Alexa debate on the high end, which will never be settled. (But I would take Arri ten times out of ten.) Just buy what you want and don't proselytize to others. Just my 2 cents.... This camera looks sick btw. Like super awesome, even if it's not for me.
Lastly, a truly groundbreaking product can shake up brands at every level. But for an example of that, look to the iPhone, which is the ultimate in user experience over specs. I use my iPhone camera 100X more than my other cameras.
  14. If I had the money I'd upgrade to this. Early Atomos stuff was awful, but each generation seems a bit better. Shame I'm too broke. :/
  15. I also just like the look of F35 and Alexa footage. The lack of skew (none on F35, vanishingly little on Alexa and film) and large camera body just looks better to me. But I realize I shouldn't complain.
  16. I see a LOT of jello and skew in YouTube videos and drone videos, so I take issue with the 5% idea in the first place. But I agree it's not at the top of everyone's wish list, so I'm not really trying to argue there. For me personally, the biggest problems are with strobe effects, flashing lights, etc. There are some workarounds, but they mostly involve using mechanical shutters on the lights or using only tungsten lights (which have a much more forgiving attack and decay envelope), and that's a problem because they're the wrong white balance, aren't very efficient, and shift their color temperature in a strange way as they dim. When the frame is split down the middle it gets really ugly. For horror movies or music videos this can be a really big problem. The other issue that's kind of niche is match moving. I can't get good 3D solves when there's skew, because the scene sort of breaks apart irregularly when you pan and tilt. It's subtle, but a 3D solve can reveal it sometimes. On wide angle lenses I've found this to be less of a problem. The issue I have that's not so niche (imo) is with longer lenses and whip pans, or just long lens footage in general. Or shooting wheels, propellers, guitar strings, etc. At 200mm I have a lot of problems (see the back-of-envelope numbers after this post). With anamorphic lenses I seem to get more skew, too. I'm not arguing that global shutter is first on everyone's wish list, just answering your question as best I can.
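Back-of-envelope numbers for the long-lens case, with assumed (not measured) readout time and pan speed, just to show why skew scales with focal length:
    import math

    readout_ms  = 20.0     # assumed rolling-shutter readout time, top to bottom
    pan_deg_s   = 60.0     # pan speed in degrees/second
    focal_mm    = 200.0    # lens focal length
    sensor_w_mm = 36.0     # assumed full-frame sensor width
    frame_w_px  = 1920

    h_fov_deg = 2 * math.degrees(math.atan(sensor_w_mm / (2 * focal_mm)))  # ~10.3 deg at 200mm
    px_per_deg = frame_w_px / h_fov_deg
    skew_px = pan_deg_s * (readout_ms / 1000.0) * px_per_deg
    print(f"{h_fov_deg:.1f} deg FOV -> about {skew_px:.0f} px of skew across the frame")
    # The same pan on a 24mm lens (~74 deg FOV) works out to roughly 7x less skew in pixels,
    # which is why wide lenses hide rolling shutter and long lenses expose it.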
  17. I wouldn't know since I must be in the other 5%. It's a huge problem for me.
  18. +1 for fixing rolling shutter.
  19. I don't spend much time on set. I see some Panasonic monitors for directors. I think DPs rely on the Alexa viewfinder, which is pretty good, or SmallHD maybe. I don't know about high end sets. I suspect iPads are getting more common for directors and producers, but obviously not for DPs. In post a Flanders or a calibrated Plasma seems sufficient for anything except HDR. I don't pay attention that carefully to brands I can't afford. :/
  20. Codec is WAY more important, but the whole bit thing is kind of a mess. The human eye is estimated to see about 10 million colors, and most people can't flawlessly pass color tests online, even though most decent 8 bit or 6 bit FRC monitors can display well over ten million colors: 8 bit color is 16.7 million colors, more than 10 million (see the quick arithmetic after this post). And remember sRGB/rec709 is a tiny colorspace compared with what the human eye can see anyway, meaning 16.7 million colors fit into a smaller space should be plenty, overkill even. But also remember that digital gamuts are triangle shaped and the human eye's gamut is a blob, so fitting the whole blob inside a triangle requires overshooting tremendously on the chromaticities, resulting in many of those colors in digital gamuts being imaginary colors.... so the whole "8 bit is bad" thing needs a lot of caveats in the first place...
I haven't tried 10 bit raw from the 5D, but I suspect in certain circumstances (100 ISO, just above the noise floor) 10 bit will have visibly higher contrast noise than 14 bit after grading, though the difference will only be apparent if you A/B the exact same frame. That's my guess: something VERY subtle but not truly invisible, though possibly effectively invisible. It's possible there could be banding, too, but the 5D III sensor is quite noisy.
The science behind it is so complicated I gave up trying to understand. The more I learned, the more I realized I didn't understand anything at all. First you're dealing with the thickness of the bayer filter array and how that dictates how wide the gamut is; then you're dealing with the noise floor and quantization error and how that works as dithering, but there's also read noise that can have patterns, which don't dither properly; then you're dealing with linear raw data being transformed with a certain algorithm to a given display or grading gamma, as well as translated to a given gamut (rec709, rec2020, etc.), and how wide that gamut is relative to the human eye and how much of the color there is imaginary color, and then what bit depth you need to fit that transformed data (less than you started with, but how much less depends on a lot of variables); and then you introduce more dithering from noise or more banding from noise reduction, then compression artifacts that act as noise reduction in some places and increase banding via macroblocking in others, then there's sharpening and other processing, then... then it goes on and on to the display and to the eye, and of course that's only for a still image. Macroblocking and banding aren't always visible in motion, even if they are in a still, depending on how the banding behaves over time and whether the codec is intraframe or interframe.
It's possible everyone who's proselytizing about this understands it far better than I do (I don't understand it well at all, I admit). But I frequently read gross misunderstandings of bit depth and color space online, so I rather doubt that every armchair engineer is also a real one. (That said, there are some real engineers online; I just don't understand everything they write, since I'm not among them.) I know just enough about this to know I don't know anything about this. From the armchair engineers, we do have some useful heuristics (overexposed flat log gamma at 8 bits, heavily compressed, will probably look bad; raw will probably look good), but even those aren't hard and fast rules, not even close to it. All you can do beyond that is your own tests.
Even inexpensive monitors these days can display close to 100% NTSC. They should be good enough for most of us until HDR catches on, and when it does, bit depth will matter a lot more.
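The quick arithmetic behind those color counts (the ~10 million figure for the eye is itself only a rough estimate):
    HUMAN_EYE_ESTIMATE = 10_000_000

    for bits in (6, 8, 10):
        levels = 2 ** bits          # shades per channel
        colors = levels ** 3        # R x G x B combinations
        print(f"{bits:2d}-bit/channel: {levels:4d} levels -> {colors:>13,} colors "
              f"({colors / HUMAN_EYE_ESTIMATE:.2f}x the ~10M estimate)")

    # 6-bit panels only reach "well over ten million" via FRC, which temporally
    # dithers between adjacent code values to approximate 8-bit output. And the
    # catch with 8 bit is where those 16.7M colors sit (gamut, transfer function),
    # not how many there are.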
  21. Seems like a cool idea. I'd focus on staying small at first, and maybe even having it be informal. There are lots of generic local festivals, but few that are that interesting. Something small, a starting place to find collaborators, could be cool. If you're going to accept everyone, maybe put a tight limit on how long each short can be. I agree with that. I've heard similar things about Channel 101 being a more insular and self-serving community than it once was. It's sort of turned into what it was a reaction against. But I wouldn't let that sour you on submitting to other festivals entirely! If you like Channel 101 stuff, make a Channel 101 show. The difficulty with being really really creative is that your ideas exist BECAUSE they're unusual and innovative. And so if there are non-creative criteria for entrance somewhere, and the more established the venue the more established the criteria generally, the least creative stuff is valued at the low end or entry level (it ticks the boxes) and the most creative stuff at the high end (it innovates). So you won't be at the level of high end stuff, but you're too creative for the low end stuff, and it's going to be unduly challenging and you wonder what's wrong with you. Well, the question is also what's wrong with the world. Really creative people often never get past the entry level. This is a real problem with companies, the visionary CEO eventually gets replaced with very conservative thinkers. You need to learn to think like your audience, and meet them half way. If this is a problem for you, or you've faced rejection, what to do about it is up to you. Making a festival on your terms is a good idea–it's where Slamdance and Channel 101 were born, even if they later sort of turned into what they began as being defined against. But if you see stuff you like coming out of Slamdance and Channel 101, maybe meet them halfway, and once you break into those communities, be even more and more creative. Once you get in, then you can push the boundaries more and more! Either way, look for other films you like, and find collaborators. Don't be myopic. I'd start small either way! Five-minute run times aren't a bad thing. The challenge is to pack all the creativity into it! Or hone your idea down to the best, smallest version of itself. Your next film won't be your best unless you let it be your last. So keep creating! And don't look (too far) back.
  22. I've yet to see 4k that, per pixel, is as sharp as 1080p downscaled from a 4k source. But stills I've seen from TOTL dSLRs certainly show that potential... the 8K Reds might provide meaningfully more resolution, I haven't seen raw footage from one. Early Red footage was SUPER soft. Now we're seeing 4k that's meaningfully sharper to the eye than 2k/1080p... On what display is the question. Even on IMAX, 2k Alexa looks good to me. And the C100MkII is just as sharp (worse in other respects, of course). Both are way sharper than 35mm film (as projected, but even look at blu ray stills from film and see how surprisingly soft they are). But on a 5k iMac, I notice a bigger difference since I sit so close to it. Even there, the difference isn't huge between 4k and 5k, though. It is with UI elements, not so much with video. And the difference between 4k cameras will be even less significant. For me, 2k is enough for video. For most, I think 4k will be. For now... I think the only substantive shift (beyond already significant meaningful aesthetic differences between cameras and lenses) will be when HDR takes off. And HDR imo is going to evolve in a different direction entirely, even more naturalistic, maybe HFR, etc. maybe even integrating with VR. The C300 Mk II and Alexa (and in practice the F65 and new Reds and Venice probably) all meet that standard of 15+ stops 10 bit rec2020 etc. But HDR is changing fast so I wouldn't even sweat it unless HDR delivery is important to you. Of course, I don't know if by "reasonably affordable" you mean an A7S or an F55. I think there's already a rather big difference there, though the two can be seamlessly intercut if you're careful to mitigate the A7S' weaknesses.
  23. Interesting. I've done similar tests with noisier cameras (C300, Alexa, F35) and not seen any banding because of the heavy dithering from the noise in the source footage. I suppose it goes to show it's worth doing your own tests! Certainly it seems to make a big difference here.
  24. I categorically disagree. I've done extensive tests with everything from dSLRs to high end cinema cameras. With high end cameras shooting near-uncompressed codecs (raw codecs, ProRes 444, etc.), I couldn't incite any banding in 8 bit transcodes, even very flat log transcodes, without first doing noise reduction (or working with footage that had already undergone noise reduction or compression in camera). In most cases, the difference between 8 bit and 10 bit was completely invisible until I zoomed in to 800% and saw a tiny increase in noise in the 8 bit transcodes. I'm not saying my dSLR doesn't have banding in video, though. It has plenty. But it's from poor in-camera processing and a low bitrate, not low bit depth. In my mind, bit depth has to do with tonality, and banding is more an issue of bitrate.
I challenge you to take 10 bit Alexa or F35 footage, even in log, transcode it to an uncompressed 8 bit codec, apply a LUT or a grade, and find any banding (without destroying the image completely in other ways to intentionally muck it up). The noise in the image will dither things out so as to completely obviate any banding (see the sketch after this post). The only difference you'll see between the camera original and the 8 bit transcode is a hit to tonality (higher contrast noise, almost like a subtle sharpening). Even a one bit image can avoid banding with proper dithering: https://en.wikipedia.org/wiki/Color_depth (Fwiw, the "8 bit" image there is 256 shades of color total, not 256 shades of gray per color channel. When we discuss "8 bit" we mean per channel, or 24 bit total, 16.7 million colors. And the dithering in those examples isn't perfect, but you can see how it reduces banding even at extremely low bit depths. The "8 bit" image there, which has next to no banding visible, works out to about 2.7 bits per channel.)
Maybe my eyes are going or I need to upgrade my monitor (the reference I was using was a Flanders Scientific panel, but it was an 8 bit tv panel, not a 10 bit video panel, so perhaps I was blind to something), but what you're writing completely contradicts all my experience nonetheless. But I am open to the possibility of my eyes not being very good. I can't tell the difference between an 8 bit and a 10 bit grading panel. But I recommend running your own tests on Alexa footage to see if you have the same experience I have.
I'm contradicting my above post a bit, but I have seen banding with the C100. Granted, extremely little, but it's there at times. As good as that camera is, it's too clean in the highlights, which are heavily compressed in log, and the AVCHD compression is very heavy and prone to fall apart completely in challenging circumstances, if surprisingly good in easy ones. It's 100X better than an A7S, though. Well, in that regard at least. Worse in many others. Some day I'll have to get my hands on one and try shooting flat walls overexposed with an external recorder... not sure if there will be banding or not.
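Here's a minimal sketch of the dithering effect I mean, on a synthetic ramp rather than real camera footage: quantize a very shallow gradient to 8 bits with and without roughly one code value of noise added first.
    import numpy as np

    width = 1920
    ramp = np.linspace(0.40, 0.45, width)               # shallow ramp, worst case for banding

    clean    = np.round(ramp * 255)                      # straight 8 bit quantization
    noise    = np.random.normal(0, 1.0 / 255, width)     # ~1 code value of noise, like sensor noise
    dithered = np.round((ramp + noise) * 255)

    def longest_flat_run(x):
        """Length of the longest run of identical consecutive values."""
        edges = np.concatenate(([True], np.diff(x) != 0, [True]))
        return np.diff(np.flatnonzero(edges)).max()

    print("longest flat run, clean:   ", longest_flat_run(clean))     # wide steps -> visible bands
    print("longest flat run, dithered:", longest_flat_run(dithered))  # a few pixels -> reads as grain
    # Averaged over neighboring pixels (or frames), the dithered version lands back
    # on the original ramp; the clean quantization stays stuck on the steps.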
  25. I believe "thin" is a term taken from film development. A thin negative is underexposed and can't be processed or manipulated without introducing photochemical artifacts (grain, color issues, etc.). So a thin digital file is any file that's hard to work with in post without introducing artifacts (banding, grain, color problems, etc.). I believe "thin" generally refers to exposure problems and/or problems with the file itself (low bitrate, low bit depth, etc.). The confusing thing is that some digital cameras look good overexposed (ETTR), particularly raw cameras like the Red, whereas other cameras with heavily compressed log codecs will be "thin" in the highlights, so overexposing will introduce the most banding and the worst colors there. Sony cameras also seem to need the right white balance, whereas with Red you can change it with little penalty other than noise. But then the Sonys have way better low light. The best solution is to know your camera well and to expose and white balance properly. (I set my white balance to 5600K and then forget it most of the time, but it depends on the camera.) You might also consider an external recorder if you're going crazy in post. I've seen very good A7S footage from external recorders, but shot by DPs way more technical than me! Some crazy stuff with a Q7+ and a custom LUT that pulled SLOG2 two stops, from 3200 ISO to 800 ISO. Looked really nice.
There are good debanding tools in Resolve, I believe; I've never had a problem with banding in footage, so I've never used them. In After Effects you can use the scatter plug-in on gradients (not ideal), or Sapphire deband (expensive, but should be excellent). (A rough sketch of the usual deband recipe follows this post.)
Fwiw, I disagree with an earlier post claiming that 10 bit acquisition is unheard of on big productions. I'm mostly a hobbyist, but the bigger stuff I work on (not as a director or DP) is always either 10 bit ProRes or raw acquisition, I would say more than 99% of the time.
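A rough sketch of the general deband recipe, for the curious; this is my own guess at the common approach, not Resolve's or Sapphire's actual algorithm: smooth only where the blur barely differs from the original (so real edges survive), then add a touch of grain so the fix survives re-quantization.
    import numpy as np
    from scipy.ndimage import uniform_filter

    def deband(image, threshold=2.0 / 255, radius=8, grain=0.6 / 255):
        """image: float array scaled 0..1. Returns a debanded copy."""
        blurred = uniform_filter(image, size=2 * radius + 1)
        flat = np.abs(blurred - image) < threshold           # near-flat areas, where banding lives
        out = np.where(flat, blurred, image)
        out = out + np.random.normal(0, grain, image.shape)  # dithering grain
        return np.clip(out, 0.0, 1.0)

    # Example: a gradient with 8 bit style banding baked in
    banded = np.round(np.tile(np.linspace(0.40, 0.45, 512), (64, 1)) * 255) / 255
    smoothed = deband(banded)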