Everything posted by KnightsFan

  1. I agree, but hey, what can we do? As long as everyone knows what everyone else is talking about, that's really the best we can hope for at this point.
  2. This is especially true if you want a definition of RAW that works for non-Bayer sensors. If your definition of RAW includes "pre-debayering," then you have to carve out exceptions for 3-chip designs, Foveon sensors, and greyscale sensors. Compression is a form of processing, so even by @Mattias Burling's words, "compressed raw" is an oxymoron. In fairness, I often see people use RAW to describe lossless sensor data, whereas raw (non-capitalized) is often a looser definition meaning minimally compressed, still-Bayered image data, thus including ProRes RAW, BRAW, REDCODE, and RawLite. So as long as we remember the difference, I grudgingly accept that convention.
  3. Considering both DNxHD and ProRes are lossy codecs, that last bit sounds like false advertising. But it will be very little loss, really not a big deal at all. It sounds like a really neat feature.
  4. (From the thread "HLG explained") @mirekti HDR just means that your display is capable of showing brighter and darker parts of the same image at the same time. It doesn't mean every video made for an HDR screen needs to have 16 stops of DR; it just means that the display standard and technology are not the limiting factor for what the artist/creator wants to show.
  5. That doesn't actually explain why the factor is 1, though. It just explains why it's linear.
  6. I'm not 100% sure about this, but it's my current understanding. The reason you are incorrect is that doubling light intensity doesn't necessarily mean doubling the bit value. In other words, the factor relating scene intensity to bit value does not have to be 1. For example: if each additional bit meant a quadrupling of intensity instead of a doubling, the response would follow a fixed power law rather than a strictly linear one, and 12 bits could hold 24 stops (a quick worked version follows below). As @tupp was saying, there is a difference between dynamic range as a measure of the signal, in dB, and dynamic range as a measure of the light captured from the scene, in stops. They are not the same measure. A 12 bit CGI image has signal DR just like a 12 bit camera file does, but the CGI image has no scene-referred, real-world dynamic range value. It seems that all modern camera sensors respond linearly to light, at roughly a factor of 1 between real-world light and bits. I do not know exactly why that is the case, but it does not seem to be the only conceivable way to do it. Again, I am not 100% sure about this, so if this is incorrect, I'd love an explanation!
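     To make the arithmetic concrete (my own sketch, not part of the original exchange): with a linear encoding $v = kI$, a 12-bit file spans $\frac{I_{\max}}{I_{\min}} = \frac{2^{12}-1}{1} \approx 2^{12}$, i.e. about 12 stops, regardless of $k$. With a square-root encoding $v = k\sqrt{I}$, each doubling of $v$ corresponds to a quadrupling of $I$, so the same 12 bits span $\left(2^{12}\right)^2 = 2^{24}$, about 24 stops.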
  7. @kye Unity is a lot more programmer-friendly than Unreal; it's certainly much easier to build content primarily from a text editor in Unity than it is in Unreal. Unless you need really low-level hardware control, Unity is the way to go for hacking around and making non-traditional apps.
  8. ProRes has very specific data rates. If two programs make files of differing sizes, one of them is wrong: either it's using the wrong ProRes flavor (LT, HQ, etc.) or it's not really creating a proper ProRes file. (A quick way to check is sketched below.)
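     If you want to verify which flavor a file actually is, ffprobe (bundled with ffmpeg) can report the stream's codec, profile, and bit rate. A minimal sketch, assuming a hypothetical file named suspect.mov:
     ffprobe -v error -select_streams v:0 -show_entries stream=codec_name,profile,bit_rate suspect.mov
     If two files claim the same profile but differ wildly in bit rate, the outlier probably isn't genuine ProRes.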
  9. Yes. ProRes is very similar to DNxHR in terms of quality and file size. Both are significantly larger than H.265 files of similar quality, but they are easier on the CPU, which means smoother playback on most systems.
  10. I did some VR development a few years ago when I had access to a university Oculus. It was a ton of fun. Even simple things like mapping a 360 video so you can freely look around are amazing, let alone playing VR games with the handset and everything. What I love in games is when the game never forces you to use a certain item to defeat the monster, thereby encouraging creativity to overcome tasks. You could get that specific item, or you could find a way to bypass the monster altogether--but then that same monster may come up later in the game. That's where cinematic techniques like color come in. The developer may use color to psychologically influence a player toward one decision, which makes it much more rewarding to find a different way to accomplish the task. For a great use of color in a game, think of Mirror's Edge, where objects you interact with are bright red and yellow against a mostly white world. It makes it much easier to identify things you can climb without stopping and breaking your momentum. Films can use color in a similar way, to draw attention to certain objects, but in a game the fact that attention is drawn to an object actually changes how the game is played and, in some cases, the actual plot, whereas a movie exists on a linear timeline no matter where you look.
  11. DNxHR is a much less efficient codec, so you will see a significant file size increase. HQX in 4K should be around 720 Mb/s, and 444 around 1,416 Mb/s. I am not very familiar with DNxHD, to be honest. I think you can include an alpha channel in any version, which would add another 33% to any of those rates. So you can easily see a 10x increase in file size over the XT3, depending on what data rate you shot in (rough numbers below).
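     Putting rough numbers on that (my own arithmetic, assuming the XT3's published H.265 rates of 100-400 Mb/s): from a 200 Mb/s source, 444 is $1416/200 \approx 7\times$ larger; from a 100 Mb/s source it's $\approx 14\times$. One minute of 444 is roughly $1416 \times 60 / 8 \approx 10{,}620\ \text{MB} \approx 10.6\ \text{GB}$.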
  12. It's truly a great time for camera tech. Every month we get something new and awesome! The RAW footage from a given camera will almost certainly be better for keying, compared to 8-bit 4:2:0 from the same camera.
  13. They are not tied to each other, exactly. I said "correlate" and "necessarily" because in the real world, manufacturers usually add more bits to higher-DR images to avoid artifacts. So they usually do correlate in practice, but only because of convention, not because of some intrinsic property. True, I implicitly lumped both of those factors into "an encoding of the same image"; I should have been more specific. Yes, the ADC bit depth does limit the DR, assuming it's linear, but the encoded image might not retain all of that range in the final file.
  14. The problem is that there are many ways to measure DR. If you read "the Sony a7III has 14 stops of DR" and "the Arri Alexa has 14 stops of DR," both may be correct, but they are utterly meaningless statements unless you also know how they were measured. Many years ago, Cinema5D pegged the a7sII at about 14 stops. However, they later standardized their measurement on SNR = 2, which put the a7sII at 12. But whichever way you measure, it's ~2 stops less than the Alexa. Many members here will tell you that Cinema5D is untrustworthy, so take that as you will; I have yet to find another site that even pretends to do scientific, standardized tests of video DR. Cinema5D puts the XT2 and XT3 at just over 11, so that confirms your finding. And again, if you change your methods, maybe it comes out at 13, or 8, or 17--but in every case it should be a stop less than the a7sII when measured the same way. Bit depth doesn't necessarily correlate exactly with dynamic range. You could make a 2 bit camera that has 20 stops of DR: anything below a certain value is a 0, anything 20 stops brighter is a 3, and the values 1 and 2 sit somewhere in between. It would look terrible, obviously, because higher bit depth reduces banding and other artifacts. There is pretty much no scenario in which an 8 bit encoding has an advantage over a 10 bit encoding of the same image.
  15. You can use ffmpeg to convert to DNxHD or DNxHR. I have never done it myself, but the answer on this page seems very thorough: https://askubuntu.com/questions/907398/how-to-convert-a-video-with-ffmpeg-into-the-dnxhd-dnxhr-format (the gist is sketched below). Many Windows applications can decode ProRes; they just can't encode it. What do you mean by DVR?
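     A minimal sketch of the kind of command that page describes, assuming the DNxHR HQX flavor and hypothetical filenames (untested by me):
     ffmpeg -i input.mov -c:v dnxhd -profile:v dnxhr_hqx -pix_fmt yuv422p10le -c:a pcm_s16le output.mov
     The dnxhd encoder handles both DNxHD and DNxHR; HQX is 10 bit, hence the yuv422p10le pixel format.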
  16. That is true, but I would have to actually make a calibration profile for the TV. I started to last week, but after DisplayCAL took 5 hours to analyze the TV, I found I'd made a mistake, and I haven't since had a need for accurate color that was worth 5 more hours. The real problem with color calibration on consumer devices is that, for whatever reason, there is no consistent way to calibrate the entire computer display without an expensive LUT box. Graphics cards should just have a LUT box built in, instead of the mish-mash of ICM profiles, VCGTs, and application-specific LUTs that we currently have. It's a ridiculous headache even with color-managed software, let alone browsers.
  17. Guess I've got a one-of-a-kind XT3 then!
  18. It charges while it's on from a 5V supply.
  19. I have never tried. It's my understanding that ProRes is always in Apple's QuickTime MOV container, as it's an Apple codec. If an MKV works in your NLE, I think it's perfectly fine for personal use. However, I would not use it when you need to send footage to someone else unless you are 100% sure it is compatible with their software.
  20. The XT3 can charge its battery off a regular 5V USB power bank. I don't think it requires 9V.
  21. @MeanRevert Yes, you can use the command line I gave in the original post to convert from H.265 to ProRes 422:
     ffmpeg -i Reference.mov -c:v prores_ks -profile:v 2 ProRes.mov
     "Reference.mov" is the original file, and "ProRes.mov" is the new file you will be creating. Change the 2 after profile:v to a 3 if you want ProRes 422 HQ (or 1 for LT, or 0 for Proxy). If you are transcoding an entire project, you will want some way to batch the files, and if you have an HEVC hardware decoder (many modern GPUs and CPUs have one now), you can use cuvid to speed up decoding; both are sketched below. Keep in mind that ProRes 422 will probably lose some quality, though it's really unnoticeable in ordinary viewing. To me, 422 HQ was visually lossless compared to H.265, even under scrutiny.
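     A hedged sketch of both suggestions (the loop is standard bash; hevc_cuvid assumes an NVIDIA GPU and an ffmpeg build with CUVID support; filenames are placeholders):
     for f in *.mov; do ffmpeg -i "$f" -c:v prores_ks -profile:v 2 "${f%.mov}_prores.mov"; done
     ffmpeg -c:v hevc_cuvid -i Reference.mov -c:v prores_ks -profile:v 2 ProRes.mov
     Note that -c:v hevc_cuvid goes before -i, since it selects the decoder for the input rather than the encoder for the output.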
  22. Sort of... but in my case it's mainly a mis-calibrated monitor. I'm using a fairly nice 4K TV as my monitor at the moment, whereas usually I use a nice (but very old) monitor, which is what my computer is calibrated for. So I'm 100% sure the colors on this TV aren't accurate, and I've got a calibration LUT for previewing in Resolve, so really it's anyone's guess what I'm seeing at this point.
  23. It's certainly interesting comparing the grades across different screens as well.
  24. @MeanRevert We discussed some issues with HEVC and Resolve over in the NX1 forum. The free version of Resolve technically does not offer HEVC decoding on Windows, so the fact that it worked at all was a fluke. As we saw with the NX1, some flavors of HEVC worked but had weird color fringing. So I would say the only reliable way to natively edit HEVC in Resolve is to buy the Studio version, unfortunately.
  25. Idk lol, I initially decided to make it super rich, and then made a bunch of little adjustments with no clear reason other than "this looks cool." I think I increased brightness on the yellows to make those yellow bits on the red poles stand out, and I think I increased saturation on the blues to make his shirt jump out. As usual for my grades, I decreased saturation in the highlights to make blown-out areas a little less "angry." In hindsight, maybe I'd use a similar grade for an Agatha Christie-esque murder mystery: mildly stylized with a loose connection to reality, but sadistically intriguing in how it teases out people's dark secrets.