Everything posted by HockeyFan12
-
Did you work on the online, or are you referring to the show as viewable on Netflix? Of course the Netflix version has tons of banding. Fincher aims for a very clean image, and then it's massively compressed for Netflix. And generally web content is not compressed with the same care that's taken with Blu-rays, which are processed on high end proprietary Linux machines using special software that can mitigate banding. (Which, yes, is an issue with Blu-rays. I'm not saying bit depth is totally irrelevant, particularly in shadows and highlights.) If you were working on the online, however, that's a very interesting point. Denoising can help with compression efficiency, but a cleaner 8 bit image from a high quality codec can exhibit far more banding than a noisier image at the same bitrate in the same codec. I believe the Prometheus Blu-ray, for instance, had grain added. House of Cards was shot in HDR (dual exposures, then merged) on Reds, I believe. Not sure it was all HDR. So it was something like 28 bit raw in effect.
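For anyone curious what the dual-exposure trick amounts to, here's a toy sketch of blending a normal exposure with a short "highlight protection" exposure of the same frame. Purely illustrative, not Red's actual HDRx math; the stop spacing and knee are made-up numbers.

```python
import numpy as np

def merge_dual_exposure(normal, short, stops_apart=4.0, knee=0.8):
    """Toy merge of two linear-light exposures of the same frame (0-1 range).

    `short` is the same scene exposed `stops_apart` stops darker, so its
    highlights aren't clipped. Illustrative only -- not Red's actual HDRx math.
    """
    short_matched = short * (2.0 ** stops_apart)           # line the exposures up
    w = np.clip((normal - knee) / (1.0 - knee), 0.0, 1.0)  # blend weight near clipping
    return (1.0 - w) * normal + w * short_matched

# A clipped sky in the normal exposure gets its detail back from the short one.
scene  = np.linspace(0.0, 4.0, 1024)      # "true" scene, two stops over range
normal = np.clip(scene, 0.0, 1.0)         # normal exposure clips at 1.0
short  = scene / 2.0 ** 4                 # four stops under, nothing clips
merged = merge_dual_exposure(normal, short)
print("max recovered highlight:", merged.max())   # ~4.0 instead of 1.0
```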
-
Different people see the world differently. The clip you posted shows 8 bit and 10 bit gradients looking nearly identical to me, but for others the slight difference may be very pronounced, judging by how important 8 bit vs 10 bit is to some. For high end HDR (15+ stops) the color gamut and dynamic range are MUCH wider, and there I truly do believe 10 bit color makes a significant difference to tonality. But for me it doesn't matter at all with the cameras I use. And I think everyone has to try this out for themselves!
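If anyone wants to run the test themselves, something like this (Python/numpy, purely illustrative) generates matching 8 bit and 10 bit ramps. Straight out of "camera" they'll look identical on an 8 bit monitor, which is sort of the point; push the contrast hard, as a crude stand-in for a heavy grade, and the 8 bit ramp bands first.

```python
import numpy as np
import matplotlib.pyplot as plt

# A smooth ramp quantized to 8-bit and 10-bit code values.
ramp = np.tile(np.linspace(0.0, 1.0, 1920), (200, 1))
q8  = np.round(ramp * 255)  / 255     # 256 levels
q10 = np.round(ramp * 1023) / 1023    # 1024 levels

# A crude stand-in for a heavy grade: stretch a narrow slice of the ramp.
grade = lambda img: np.clip((img - 0.4) * 8.0, 0.0, 1.0)

fig, axes = plt.subplots(2, 1, figsize=(10, 3))
for ax, img, title in zip(axes, (grade(q8), grade(q10)), ("8-bit, graded", "10-bit, graded")):
    ax.imshow(img, cmap="gray", vmin=0, vmax=1)
    ax.set_title(title)
    ax.axis("off")
plt.show()
```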
-
I'm gonna check this link out! Never seen this, but that's exactly what I'd expect the difference to be. Subtle difference when increasing bit depth, huge difference when increasing bitrate.
-
There's no simple answer here. You really need to do your own tests. Imo, the difference between 10 bit and 8 bit is, on its own, nearly invisible in every circumstance, even with graded log footage. But it's only part of the story.

In theory, if your footage isn't heavily compressed and the camera's sensor is sufficiently noisy (noise is dithering), increasing the bit depth will only improve tonality and reduce the appearance of noise, even if the footage is shot log. It won't have an effect on whether there's banding or not. And on a standard 8 bit display, this difference will be basically invisible. (HDR is another story; you need more tonality for the wider gamut and flatter recording gamma.) But add in in-camera noise reduction and macroblocking (and Premiere having a broken codec for years when interpreting dSLR footage, if I remember correctly) and you get the potential for tons of banding, some of which 10 bit color will reduce (generally in the sky, splitting it into more bands rather than a smooth gradient, but more bands are better than fewer), and some of which 10 bit color won't obviate at all (macroblocking).

The FS5 and F5 have some banding issues in SLOG2 in the highlights even in 10 bit in certain codecs, in my experience. Certain cameras shooting in 8 bit, however (in the right codec), won't exhibit any banding or even suffer from significant tonality problems. (The C300 will rarely exhibit banding with an external ProRes recorder, except perhaps at high ISO with noise reduction on. But the built-in codec can exhibit banding for sure. That sensor is very noisy. Fwiw, the noisiest sensor I've seen might be the Alexa, or maybe it just has less in-camera NR. Can't get banding from that camera no matter what. I found a lot of banding with Canon Technicolor on dSLRs, though. None ever with any raw camera except my Sigma point and shoot.)

The idea of "upscaling" bit depth is specious. I think that thread might have been posted as a troll or a joke. The examples posted in the thread basically just show noise reduction reducing macroblocking (not very well, I might add), which is sort of beside the point. But yes, you can use noise reduction to reduce macroblocking, which I guess is the look people have blamed on codec bit depth, though it really has to do with codec bitrate.

Imo, 10 bit vs 8 bit is hugely overrated and essentially a moot point. But it's one of many, many factors that are responsible for thin footage from certain cameras. Rent and run your own tests. And yes, the A7 line in SLOG 2 can exhibit serious banding. You can clean it up in post, though (with tools other than noise reduction, generally), or shoot to mitigate it. But it is absolutely a problem with those cameras, and whether it's a problem for you, or you're fine dealing with it in post when it appears, is up to you.

So it's two separate questions. Is the difference between 10 bit and 8 bit big? Generally, no. Is banding a problem on certain 8 bit cameras? It sure is. Rent, test, double check, then buy. But, imo, bit depth is far less important than bitrate when it comes to avoiding banding in the image.
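If you want to see the noise-as-dithering thing for yourself, here's a toy sketch (Python/numpy, illustrative numbers only): quantize a shallow gradient to 8 bits with and without a little noise added first. The dithered version trades the bands for grain, which is exactly the trade-off I'm describing.

```python
import numpy as np

rng = np.random.default_rng(0)

# A very shallow gradient: 4096 samples spanning only 8 code values out of 255.
# This is the kind of sky gradient where 8 bit banding shows up.
ramp = np.linspace(0.0, 8 / 255, 4096)

# Quantize the clean signal: it collapses into a handful of flat bands.
clean = np.round(ramp * 255) / 255

# Add roughly one code value of sensor-like noise *before* quantizing.
dithered = np.round(np.clip(ramp + rng.normal(0, 1 / 255, ramp.size), 0, 1) * 255) / 255

# Average over small neighbourhoods, the way the eye (or a resize) does.
local_mean = lambda x: x.reshape(-1, 128).mean(axis=1)

print("distinct local averages, clean    :", len(np.unique(local_mean(clean))))     # a staircase
print("distinct local averages, dithered :", len(np.unique(local_mean(dithered))))  # a smooth ramp
print("per-pixel error, clean    :", np.abs(clean - ramp).mean())
print("per-pixel error, dithered :", np.abs(dithered - ramp).mean())                # noisier pixels
```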
-
How big do you print? I worked with an extremely successful high end photographer who would make wall-sized prints from 12MP. The shots were stunning, but IMO they didn't quite hold up when you stood close. If you print bigger than 11x17 I might reconsider, but it depends on your personal standards, so no one else is going to be able to answer this question for you.
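The quick math, if it helps (assuming a roughly 4000 x 3000 pixel 12MP frame, which is a guess at the aspect ratio):

```python
# Rough pixels-per-inch for a ~4000 px wide frame at a few print widths.
for long_edge_inches in (17, 24, 40):
    print(f"{long_edge_inches} in wide: {4000 / long_edge_inches:.0f} ppi")
# ~300 ppi is the usual close-viewing target; ~100-150 ppi reads fine from
# normal wall-viewing distance, which is why those big prints still work.
```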
-
8bit → 10bit video with temporal noise filtering, stunning results
HockeyFan12 replied to cantsin's topic in Cameras
This is way above my head, but doesn't this have more to do with denoising being able to reduce macroblocking than it does with bit depth? The banding and color problems in all these shots seem to be from color profiles and compression artifacts, not bit depth. None of these image problems are primarily correlated with an 8 bit codec; they have far more to do with macroblocking. Both of the shots of the guy by the stairs, for instance, look awful and riddled with compression artifacts. Admittedly, the sunset does look a lot better, improved by what's a pretty clever trick. Generally, I feel like there's some "10 bit magic" that I don't see. My experience has always been that the strength of the codec, and the color space and gamma assigned to it, are far more important than bit depth. The F5, for instance, still had a lot of banding in 10 bit XAVC or whatever the codec is, because the bit depth is too low and the gamma too flat. (This has since been improved in later updates.) Denoising is definitely powerful! I've used it before to remove compression artifacts in similar situations. I just don't understand what bit depth has to do with it. It's macroblocking that's the far, far bigger issue in all these examples. Unless my eyes deceive me... -
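To be fair to the temporal filtering idea, there is a real mechanism in there: if a static detail sits between two 8 bit code values and the sensor noise dithers it, averaging frames over time recovers the in-between value. Here's a toy sketch of just that (Python, made-up numbers). My point stands that the dramatic-looking fixes in those examples are mostly macroblocking being smoothed away, though.

```python
import numpy as np

rng = np.random.default_rng(1)

# A static scene value that falls between two 8-bit code values.
true_value = 100.4 / 255

# Each captured frame adds a little sensor noise, then gets quantized to 8 bits.
frames = np.round(np.clip(true_value + rng.normal(0, 0.5 / 255, 64), 0, 1) * 255) / 255

single = frames[0]                 # one 8-bit frame: sits on 100/255 or 101/255
temporal_average = frames.mean()   # many frames: converges toward 100.4/255

print("single frame error    :", abs(single - true_value) * 255, "code values")
print("64-frame average error:", abs(temporal_average - true_value) * 255, "code values")
```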
I haven't found that to be the case. From what I've seen, most film schools (at least any with a technical focus) still require at least one class shooting 16mm. AFI requires lots of shooting on film, whereas most schools with a focus on directing assume you'll be hiring your own experienced DP from a program like AFI. The D850 looks great, though.
-
Exposure is a creative choice. Occasionally you'll need to rack aperture, but 99% of the time you're lighting to a given stop.
-
Dynamic Range of ML RAW vs h.264 / h.265 Cameras?
HockeyFan12 replied to Mark Romero 2's topic in Cameras
$100k of gear. Just my opinion, but I find the images in those videos not that great as super high end stuff goes (though they're clearly from a very talented photographer, and they're certainly still way better than I could do). But I still agree: you can get by with a dSLR and tilt/shift lenses. An Arca tech camera and MFDB is overkill for most. I'm just saying that at the very high end, lighting (strobes mostly, some dedos) matters as much as the camera. Although the videos you posted show that extensive post work helps as well... though I'd argue it's harder to get that level of control with video. -
Dynamic Range of ML RAW vs h.264 / h.265 Cameras?
HockeyFan12 replied to Mark Romero 2's topic in Cameras
Unless you have chroma clipping, are inept at setting white balance, or are dealing with a camera with inherently poor white balance (looking at Sonys here), mixed lighting is difficult to work with in general. The Alexa handles it best, even in ProRes, so it's not necessarily about RAW. (Fwiw, I agree about dynamic range; the 5D Mark III has less than the Sonys, but better tonality.) The high end $100k stills camera guys I know use strobes and heavily light their real estate work. For video it's not so easy, but I think (I could be totally off base here) the ultimate solution for great quality is to bring color-correct fixtures and gels with you to swap out with what's there. Lots of LEDs or Kino Flo bulbs to swap, maybe. Really, really cheap to buy, but not always possible, of course. -
What was the first professional camera to shoot LOG gamma?
HockeyFan12 replied to maxotics's topic in Cameras
http://www.digital-intermediate.co.uk/film/pdf/Cineon.pdf
https://pro.sony.com/bbsccms/assets/files/mkt/cinema/solutions/slog_manual.pdf
http://www.panavision.com/sites/default/files/docs/documentLibrary/Panalog Explained.pdf
-
Yep, definitely confusing the two. I am an idiot about practical things lol. And yeah, that's totally fair. If you prefer a more traditional form factor, the F3 seems to be the winner. And it's the winner on price.
-
Canon claims the processing is identical. I've shot them side by side and could not see a difference other than the codec. However, some people online will argue that Canon is using a simplified processor on the C100 and lying about it. The C100 Mk II has slightly cleaner images and noticeably different color from both.

I find the ergonomics and menus on the F3 to be bad, but for the price used it is an amazing value, especially if you can get it with support gear included. @IronFilm isn't the only one who prefers the F3's ergonomics, but I certainly don't. It's too heavy for me, and I find the image subjectively worse than the C300's (worse skin tones), but technically it is a bit better than the C300 in terms of tonality, with a half stop more highlight detail, plus 60p and 10 bit 444 color. The C300 is a little sharper, though. In most ways I prefer the F3 to the F5. Better image, imo. If you like the form factor and need genlock, it's a great option. I wouldn't choose it over the Canon options, but I can't argue with anyone who would. If you're a pro hiring out dual system sound, genlock pays for itself fast, and the C100 lacks it.

Short answer: if there's a difference in image, you'll never notice it. But there are other differences worth considering.
-
What was the first professional camera to shoot LOG gamma?
HockeyFan12 replied to maxotics's topic in Cameras
Agreed, I don't even know how we got on the subject of HDR. :/ Maybe we should just move off that, since it's apparently a contentious topic... and I think hopefully when HDR hits it big it will obviate the need for log gammas, since we'll be able to see fifteen stops of information on screen at once without having to flatten it out.

@Maxotics, as best I can recall, log gammas were introduced for film scans to save file space. 10 bit log could represent the data from a 16 bit linear scan. I believe Cineon was the most common format, designed by Kodak, and (apocryphally) I hear some of that same magic lives on in the Alexa, even though Log C doesn't look at all like a film scan to my eye. There are still formats out there that are wide DR in linear colorspaces, OpenEXR, for instance. Maybe DPX? Though I usually get DPX in log these days.

Panalog was the first log video format I've heard of. I haven't worked with it. The first log format I worked with was possibly from Red (it was poorly implemented; their current IPP2 stuff is much better). The first good log format I worked with was SLOG, on an F3. I was so impressed. Or maybe the Alexa. I forget. I do remember SLOG on the F3 benefited tremendously from 10 bit capture. Later I got to work in post with (but not shoot) some SLOG footage from the F35, and that camera produces a great image to my eye, although it's pretty controversial on some forums, and a pain to use, I'm told. The image reminds me a bit of a really beefed up C100.

Here are some white papers for Cineon, SLOG, and Panalog: I couldn't make it through them. Math isn't my strength; I've really struggled with it. It is interesting that Cineon is apparently 10 bit in a 12 bit wrapper. Not sure why... but it seems to indicate that 10 bit log is sufficient to capture the vast majority of film's dynamic range. Though Kaminski would note that the DI burns the highlights and crushes the shadows a bit... which he liked.

I really can't speak to 10 bit on the GH5. I'd be interested to hear a good colorist's take on the subject. I do think "8 bit" has a bad name among people who don't run their own tests, though. Like you, I have separate cameras for video and for stills... and mostly just use my iPhone for both. -
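For what it's worth, here's roughly how I understand the Cineon convention, sketched in Python. The constants (0.002 density per code value, reference white at code 685, black at 95, a 0.6 negative gamma) are the commonly quoted ones, so treat them as illustrative rather than authoritative.

```python
import numpy as np

# The commonly quoted Cineon convention, as I understand it: 10 bit code values
# in printing-density steps of 0.002, reference white at code 685, black at 95,
# and a 0.6 "negative gamma" relating density to relative exposure.
REF_WHITE, REF_BLACK, STEP, NEG_GAMMA = 685, 95, 0.002, 0.6

def cineon_to_linear(code):
    black = 10 ** ((REF_BLACK - REF_WHITE) * STEP / NEG_GAMMA)
    lin = 10 ** ((np.asarray(code) - REF_WHITE) * STEP / NEG_GAMMA)
    return (lin - black) / (1 - black)   # code 95 -> 0.0, code 685 -> 1.0

for code in (95, 445, 685, 1023):
    print(f"code {code:4d} -> linear {cineon_to_linear(code):.3f}")
# Code 1023 decodes to roughly 13x reference white: the log curve folds several
# stops of over-range highlight detail into the top of the 10 bit scale, which
# is why 10 bit log could stand in for a 16 bit linear scan.
```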
What was the first professional camera to shoot LOG gamma?
HockeyFan12 replied to maxotics's topic in Cameras
I don't get to use fancy cameras that often! I own an 8 bit camera and am happy with it for personal use. I'd rather focus more on filmmaking... something I seem to have neglected lately.

I don't know if the GH5 is really better in 10 bit, or if it has "true" 10 bit color. Haven't used it! Online tests seem to be inconclusive. I'd guess it makes a difference, but not a huge one. Then again, the Alexa has 15+ stops of dynamic range and the GH5 doesn't, so it shouldn't matter as much with the GH5. I'm also not sure why SLOG 2 looks so bad from the A7S. I remember it didn't look great from the F5, either, which is a 10 bit camera (I think). F55 RAW looks better to me, but still not as good as Alexa ProRes by any means... until it's expertly graded, at least. So each step up does improve the image, more in terms of grading potential than initial look, imo.

Still, I believe the arguments about 10 bit vs 8 bit on the C200, for instance, are overrated. I suspect 8 bit is enough for Canon Log 3, so it's only if you need those extra two stops of barely-there dynamic range that the 10 bit codec would be better (letting you use Canon Log 2). But that's not a camera I'm likely to buy. It isn't for me to say what others' needs are. If they need 10 bit, they need 10 bit. Not my concern, since I don't.

I still agree with @jonpais about HDR. In theory (I think), any "normal" or "flat" look, no matter how stretched it is, is just a flat rec709 look, and only has colors within the rec709 gamut. Where log differs is that it takes a container designed for rec709 but lets you capture colors outside that gamut, and then a LUT brings those colors back to a given colorspace. RAW, likewise, is only limited by the thickness of the BFA (which I believe differs between the C100/C300 and C500, fwiw), so you can take a RAW image and bring it into rec709, or into any other given color space, if the color is there in the first place. While I bet you could take a rec709 image and map it into HDR, that would be "faking it." Even a GH5 is (presumably) capturing a significantly wider dynamic range and color gamut than rec709 in V-Log, and shooting log or RAW will let you access those colors. But yes, tonality will suffer. How much? I'm not sure. I'm also not that interested in producing HDR content, at least at the moment, so for me color and tonality matter more. I suspect it requires a really good source to get high end HDR with great tonality, but I wouldn't be surprised if HDR from the GH5 still looks good.

I completely agree with @Damphousse that highlight roll-off matters more than dynamic range to the average viewer. That's why I never got behind the A7S. The chroma clipping is severe. With a C300 or F3, my image might clip, but I can make it clip in a way that's aesthetically pleasing. With the A7S (and the early F5; it's better now) colors clipped wrong and it looked like video. So for me, the camera with the lower dynamic range is the one with the better image and grading potential, subjectively. The Alexa gets both right, though you'll still want to grade to burn it out.

Furthermore, I have reservations about HDR. My first taste of it was Dolby's 10,000 nit demo with high end acquisition, so I'm really spoiled on it. I also can't afford a new TV, so there's that. More than anything, HDR and 4k feel like a simulation of reality, whereas film feels like a more physical, organic medium.
You go to a movie theater and watch film: the image is 24fps in a dark environment, your eyes are in mesopic vision, and the film has a sort of built-in tone mapping from the halation around bright spots, and diffusion filters or anamorphic lenses, both of which I love, bring it out even more with wild flares. There's not so much resolution, but the color is beautiful, and the contrast in the color allows the image to look much richer and higher dynamic range, or higher contrast, than it is. I like theatrical lighting with film, vivid and colorful. It feels a bit more like a dream, but I prefer stylized (not over-stylized) filmmaking. Whereas HDR, though stunning, seems to lend itself better to very naturalistic photography. Granted, it makes anything look way better, but I see it being most useful in VR, with like 8k-per-eye 120fps reality simulation. And I sort of think that's its own medium. Generally I think photography has tended a bit too much toward naturalism lately. So for my own purposes, I'm just happy doing weird stuff to 8 bit footage. But HDR is stunning. HDR nature documentaries are going to be breathtaking. -
What was the first professional camera to shoot LOG gamma?
HockeyFan12 replied to maxotics's topic in Cameras
Same! Maybe we don't disagree that strongly after all. There are issues with log profiles, I'll admit. One of the great things about Red is that it looks great rated at 200 ISO (one could argue it doesn't look so good rated much faster) and pulled down in post, whereas ETTR with log can thin out your signal. But I still think the better-implemented log gammas (Canon Log, Log C, and SLOG 1, which was quite good on the F35 and F3) are the best thing we've got going at the moment. -
What was the first professional camera to shoot LOG gamma?
HockeyFan12 replied to maxotics's topic in Cameras
2k (not HD) 444 ProRes is about 38MB/sec; ArriRAW (2.8k Bayer for 2k delivery) is 168MB/sec. Yes, it's only about a 77% reduction in file size, which is significant on TV shows but perhaps not on the largest feature films. I suppose "tiny fraction" is an exaggeration. But ArriRAW has its own gamma mapping to a 12 bit container from a dual 14 bit ADC that then converts to a 16 bit signal in the camera. So, if you were starting with the true RAW signal, which is either 28 bit or 16 bit depending on how you look at it, the reduction in file size would be dramatically greater. In the case of ArriRAW, the RAW data itself has its gamma stretched (similar to, but different from, log) to save space. So perhaps ArriRAW is not the best example, because it compresses the gamma too, and a 77% reduction in file size isn't that big for your needs (it is for mine).

I'm not sure what I "don't get." My own experience shooting 10bit SLOG 2 on the F5 indicated that the codec wasn't well implemented for that flat a gamma, and I ended up not liking that camera when it was first released. (Overexposed by a stop, it's not so bad, and it's better now.) I think what you miss is that most serious shooters are running these tests for themselves. Problems like sensor banding in the C300 Mk II reducing the stated 15 stops of dynamic range, SLOG 2 on the A7S being too "thin," and Red's green and red chromaticities being placed too close together are well documented on the ASC and ACES boards.

Furthermore, the Alexa is pretty darned good even at 422, which you posit is too thin for log. (And Log C is very flat as gammas go.) Many TV shows shoot 1080p 422 (not even HQ) for the savings in file size. They still shoot log, and the images still have good tonality, if slightly less flexibility than 444 ProRes or ArriRAW affords. Just because a few log profiles aren't all they're cracked up to be doesn't mean log profiles are inherently bad or wasteful. -
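Just to put numbers on the file size point, using the figures above (which are ballpark):

```python
# Ballpark storage math from the data rates quoted above.
prores_444_2k = 38    # MB/sec
arriraw_2_8k  = 168   # MB/sec

print(f"reduction: {1 - prores_444_2k / arriraw_2_8k:.0%}")          # ~77%
print(f"ArriRAW, one hour: {arriraw_2_8k * 3600 / 1e6:.2f} TB")      # ~0.60 TB
print(f"ProRes 444, one hour: {prores_444_2k * 3600 / 1e6:.2f} TB")  # ~0.14 TB
```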
What was the first professional camera to shoot LOG gamma?
HockeyFan12 replied to maxotics's topic in Cameras
Can you send me a link to the video you mention where you discuss bit depth per stop in various formats/gammas? I want to make sure I watch the right one. It is an interesting topic and worth exploring. There are, no doubt, trade-offs with log gammas screwing with tonality. But by distributing data differently (I believe most camera sensors have 14 bit ADCs in RAW, but that data is not stored efficiently) you can maintain good tonality in a smaller package. That is the whole point of log capture. No one says it's better than RAW capture, but in the case of the Alexa, for instance, 10 bit Log C 444 is maybe 99.9% as good, and a tiny fraction of the size.

Furthermore, dynamic range is not the question so much as tonality is. With adequate dithering (or, in the case of most cameras, noisy sensors doing the job for you) you can obviate banding for any given dynamic range at an arbitrarily low bit depth. (At a certain point it'll just be dithered black and white pixels, but no banding!) The color and tonality, however, will suffer terribly. I shoot a bit with a Sigma DP2 and I do notice a lot of poor tonality on that camera relative to the gold standard of 4x5 slide film, despite both having poor dynamic range, and even in RAW. I believe it has a pretty low bit ADC.

While I admire your reasoning and rigor, I agree with @jonpais for the most part. I agree that a ten bit image, properly sourced, will hold up better in the grade than an 8 bit one, but will look the same to the eye ungraded. While I know (secondhand) of some minor "cover ups" by camera manufacturers, none are too nefarious, and consistently it's stuff you can figure out for yourself by running camera tests, things people online identified anyway, and which were eventually rectified to some extent. Most camera manufacturers are surprisingly transparent if you can talk to their engineers, and there are white papers out there. However, this is over my head.

Where I disagree with Jon is his statement that a given log profile from any camera is adequate for HDR content production. In theory, since HDR standards are poorly defined, this might be true. But it doesn't mean it's giving you the full experience. My only exposure to HDR (other than displays at Best Buy, and trying HDR video on an iPhone X) has been a Dolby 10,000 nit demonstration and a few subsequent conversations with Dolby engineers. The specs I was given by them for HDR capture were 10 bit log capture or RAW capture, rec2020 or greater color space, and 15 stops of dynamic range or greater. Of course, there are many HDR standards, and Dolby was giving the specs for top-of-the-line HDR. But still, this was the shorthand for what Dolby thought was acceptable, and it's not something any consumer camera offers. They are, however, bullish on consumer HDR in the future. Fwiw, the 10,000 nit display is mind-blowingly good.

Just because Sony seems to be careless at implementing log profiles (which is weird, since the F35 is excellent and the F3 is good, too) doesn't mean log profiles are universally useless. The idea is to compress the sensor data into the most efficient package while sacrificing as little tonality as possible. The problem arises when you compress things too far, either in terms of too low a bit depth or (much worse) too much compression. I do see this with A7S footage. And I think it's the reason Canon won't allow Canon Log 2 gammas in its intermediate 8 bit codec on the C200.
I realize you wouldn't consider the Varicam LT and Alexa consumer-level, but the images from their 10 bit log profiles are great, with rich tonality and color that does not fall apart in the grade. Furthermore, I suspect the C200's RAW capture would actually fulfill even Dolby's requirements for high end HDR, and $6000 is not that expensive considering. Out of curiosity, do you use Canon Log on the C100? It's quite good, not true log, but well-suited for an 8 bit wrapper. -
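To illustrate the "distributing data differently" point, here's a toy comparison of how many code values each stop gets under a linear encoding versus a pure log encoding. It's not any real camera's transfer function, just the general shape of the argument: linear spends half its codes on the brightest stop, log spreads them evenly.

```python
# Toy comparison: code values per stop for linear vs. pure-log encoding.
bits = 10
codes = 2 ** bits
stops = 6

# Linear: code value proportional to light, so each stop down halves the codes.
linear_edges = [codes * 2.0 ** -n for n in range(stops + 1)]
linear_per_stop = [int(linear_edges[n] - linear_edges[n + 1]) for n in range(stops)]

# Pure log over the same range: every stop gets an equal share of the codes.
log_per_stop = [codes // stops] * stops

print("linear codes per stop (brightest first):", linear_per_stop)  # 512, 256, 128, ...
print("log codes per stop    (brightest first):", log_per_stop)     # ~170 each
```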
What was the first professional camera to shoot LOG gamma?
HockeyFan12 replied to maxotics's topic in Cameras
I agree with you in theory. With dSLRs, at least, I always found ProLost flat to look much better than Technicolor. And I still find SLOG 2, for instance, too flat for 8 bit codecs (I also find it ugly on the F5 while shooting 10 bit, to be fair, though the Kodak emulation LUT on the F5 is quite nice). But my experience beyond that doesn't square with your hypothesis. Since you seem to know much more about this than me, I have a few questions before I respond in more detail:

How is each f-stop distributed in RAW, in rec709, and in log, respectively, in terms of how many bits are afforded to each stop? I've seen linear output from dSLRs, and to my eye it looked like 90% of the information was in the highlights, with the shadows being almost black. I believe that, straight from the camera, each f-stop has twice the information of the next darkest f-stop in RAW, whereas the JPEG (or rec709 video equivalent) conversion looks normal for the screen. I'm not sure how many bits of data each stop has on average in that case. True log looks super flat, and I'm assuming each stop is given an equal amount of data?

Where did you get the figure of 5 stops of true dynamic range for the eye? Maybe our eyes work differently, but I can see into the shadows even with a bright light source or clouds nearby, and a figure of 15 stops, or more, seems much more likely to me. My pupils aren't constantly oscillating, either. Even if the five stop figure is true scientifically, it isn't experientially. -
It's possible it's bad. They've seen it and we haven't. Still, it speaks to a very unhealthy climate for theatrical releases.
-
On the C300 and C100, at least, I find Canon Log looks better than Wide DR. I'd expect the opposite: better tonality from Wide DR, since it's an equal amount of dynamic range spread over a wider gamma (it goes deeper into the blacks). But I haven't found that to be the case. Canon Log consistently gives me the best image. Not sure why. But try it yourself and see! What were the settings in custom? I'm pretty sure you can choose your own gamma there... so if something there works better for you, use it. The XC10 looks so cool! I know there are naysayers, but I thought people ended up really liking it.
-
They all contribute to contrast and perceived resolution. I'm being metaphorical, though. To my eyes, the difference between 4k and 240p is about as dramatic as the difference between a 10,000 nit display and my late 2016 rMBP.
-
To my eye, the contrast ratio and brightness on my late 2016 MacBook Pro aren't even close to HDR-ready, but if Apple labels the display as HDR-certified, it's HDR-certified. It's deceptive if they claim it is, though, imo.

Unlike every previous gamma ever, HDR values are absolute, not relative. It's not just contrast ratio (which is also very poor on the MBP relative to HDR displays) but absolute brightness that gets you an HDR certification. (You need a wide gamut display too, of course, and maybe it does fulfill that requirement.) When an HDR display can't reach a certain level, the signal is tone mapped, but all brightness levels below that point are set in absolutes. So you need a REALLY bright display. HDR starts being meaningful/noticeable around 1,000 nits of brightness, which I believe is the lowest level at which a device can be certified, and starts to look really good around 4,000 nits. But 10,000 nits is where you get "full spec" HDR.

I would be shocked if the MacBook Pro is much brighter than 400 nits at the absolute brightest, and the contrast ratio is very poor anyway compared with an OLED. Apple might say differently, but I don't trust them on this stuff. Vimeo and Netflix have "HDR" content for the iPhone X. While the iPhone X's display is dramatically better than the MacBook Pro's, it still reaches 600 nits at best. I think the Samsung Galaxy 8 can hit 1,000 nits. That might be the only phone that can show off true HDR (granted, only at full brightness, and only at the lowest possible spec, the equivalent of 480p being labelled as HDTV). The Pixel 2 is worse than either at 500 nits. If Apple claims the MacBook Pro is HDR-compatible, they should be sued for false advertising. It's not even close to being close. If 10,000 nits is 4k, it's barely 240p.
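The "absolute, not relative" part comes from the PQ transfer function (SMPTE ST 2084), where a code value maps to a specific luminance in nits rather than to a fraction of whatever the panel can do. Here's a sketch using the constants as I remember them; double-check against the spec before trusting it.

```python
# SMPTE ST 2084 (PQ) inverse EOTF, from the constants as I remember them --
# worth double-checking against the spec. Code values map to absolute nits.
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_encode(nits):
    y = min(max(nits / 10000.0, 0.0), 1.0)   # normalize to the 10,000-nit ceiling
    return ((C1 + C2 * y ** M1) / (1 + C3 * y ** M1)) ** M2

for nits in (100, 400, 1000, 4000, 10000):
    print(f"{nits:>6} nits -> PQ signal {pq_encode(nits):.3f}")
# A 400-nit panel tops out around two thirds of the PQ signal range; everything
# above that has to be tone mapped, which is why peak brightness matters so much.
```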
-
It depends on how the bits are distributed, and on quantization noise. I do remember someone (Scorsese and Prieto?) compared 2k 12-bit and downscaled 4k 10-bit raw on the C500 and preferred the former ever so slightly, but they found the difference to be extremely subtle even when projected at the world's best facilities. I doubt I'd notice it.
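One back-of-the-envelope way to see why the two end up so close: averaging a 2x2 block of noisy 10 bit samples buys roughly one extra bit of precision, so downscaled 4k 10 bit behaves a lot like 2k 11 bit. A toy sketch, nothing to do with the C500's actual processing:

```python
import numpy as np

rng = np.random.default_rng(2)

# A single "true" scene value, sampled by four noisy 10-bit pixels per 2K pixel.
true_value = 0.3217
noisy = true_value + rng.normal(0, 1 / 1023, (2000, 2, 2))   # ~1 LSB of 10-bit noise

q10 = np.round(np.clip(noisy, 0, 1) * 1023) / 1023           # quantize each pixel to 10 bits
downscaled = q10.mean(axis=(1, 2))                           # 2x2 average -> one 2K pixel

print("10-bit single-pixel error:", np.abs(q10[:, 0, 0] - true_value).mean() * 1023, "LSBs")
print("downscaled pixel error   :", np.abs(downscaled - true_value).mean() * 1023, "LSBs")
# The averaged error is roughly half the single-pixel error, i.e. ~11-bit behaviour.
```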
-
Broadly, it depends on who's doing it and what you mean by "censor." (In this case it's just a warning before the video, so I'm not even sure it's censorship.) Generally, I'm for private self-censorship and rating boards, but not for government-mandated censorship.

One reason England has the BBFC (which is allowed to censor and/or ban movies as part of a government mandate) and we have the MPAA (which is simply a private rating board) is that the American film industry took it upon itself to "censor" (or just self-censor among its exhibitors) films produced by its members and exhibited in its theaters. This tamed the moral hysteria of the era and avoided government censorship (which would be, imo, anti-First Amendment) like you see in the UK. So as much as people (often fairly, imo) complain about MPAA ratings, I think the institution is a good one, and the institution of private censorship is a good one, too.

YouTube, Facebook, Twitter, etc. are not public institutions. They have every right to ban and block whomever they want, and those people then have the right to look elsewhere to express their opinions. I'm not for government censorship of hate speech, though. I support The Daily Stormer's right to exist every bit as much as I support the right of the web hosts (private companies) who refuse to host them.

In sum, yes, I support YouTube putting a warning before hate speech. It's their freedom of speech as a private enterprise that allows them to put that warning there, after all. So at the private level, yes, and at the public level, no.