Leaderboard

Popular Content

Showing content with the highest reputation on 03/01/2014 in all areas

  1. I would think there's a lot of crossover between favorite films and influential films, but in my case there are specific shots or sequences from each that I would call my touchstones for particular types of photography, color, and composition. These represent personal taste, so folks should refrain from projecting theirs onto others or questioning why their own taste isn't represented by others' choices. It's not a contest. You don't win anything by coming up with the most predictable AFI selection. Deviations from the norm are a lot more interesting and revealing than some list that could be generated by a statistical algorithm.
    2 points
  2. Guest

    Slider - "must have", or meh...?

    The Manfrotto MVM500A has replaced the 561BHDV-1. I've used both; they're exactly the same apart from the head. The head on the 500A is supposedly better, but it's larger than the 561's. I have a 561 and its head is fine, and if you can find one it's a bit more compact (not by much). However, I would really recommend buying the two parts separately: the monopod and the head (Manfrotto sells the large video monopod without a head). Just buy the standard 500 head to go with it, because if you buy them together you get a head with no panning axis, which makes it useless if you want to put it on a slider/tripod/etc. If you buy the parts separately, you can lock the panning axis when the head is on the video monopod, but you keep the option of panning on other supports if you need it (the monopod itself pans using the ball at the bottom, near the feet). Of course, if you already have a fluid head, you could just buy the monopod part and try that first.
    1 point
  3. Julian's images: saving the 4K example at quality 6 creates DCT macroblock artifacts that don't show up in the 4:4:4 example saved at quality 10. All the images posted are 4:2:0: that's JPG. To compare the 1080p 4:4:4 example to the 4K 4:2:0 example, bicubic-scale the 1080p image up to cover exactly the same image region as the 4K image (the examples posted are different regions and scales). The 1080p image will be slightly softer but should have less noise and fewer artifacts. Combining both images as layers in an image editor, computing the difference, and scaling the brightness/gamma up until the changes are clearly visible will show exactly what has happened numerically; that's helpful when the differences aren't obvious on visual inspection.

     We agree that 4K 4:2:0 scaled to 1080p 4:4:4 will look better than 1080p captured at 4:2:0 (you'd need to shoot a scene with the camera on a tripod and compare A/B to really see the benefits clearly). 4:4:4 has full color sampling per pixel, whereas 4:2:0 has 1/4 the color sampling (1/2 vertical and 1/2 horizontal). My point is that we're not really getting any significant improvement in per-channel bit depth, which is what provides the serious post-grading latitude of a native 10-bit capture: at best there are ~8.5-9 bits of information encoded after this process, and it will be hard to see much difference when viewed normally (vs. via analysis).

     Another thing to keep in mind is that >8-bit images, e.g. 10-bit (30-bit RGB), need a 10-bit graphics card and monitor to view. Very few folks have 10-bit systems (I have a 10-bit graphics card in one of my machines, but am using 8-bit displays). On 8-bit systems, >8-bit images need to be dithered and/or tone-mapped down to 8-bit to take advantage of the extra information. Everything currently viewable on the internet is 8-bit (24-bit) and almost all 4:2:0 (JPG and H.264). Re: H.264 being less than 8 bits: it's effectively a lot less than 8 bits, not only from the initial DCT quantization and compression (for the macroblocks), but also from motion vector estimation, motion compensation, and macroblock reconstruction (which includes fixing up macroblock edges on higher-quality decoders).
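     A minimal numpy/Pillow sketch of that layer-difference check, assuming two aligned, same-size crops saved to disk (the filenames below are placeholders): subtract the frames, then amplify the residue so differences invisible at normal brightness become obvious.

```python
# Hypothetical filenames: any two aligned, same-size crops to compare.
import numpy as np
from PIL import Image

a = np.asarray(Image.open("1080p_444.jpg").convert("RGB"), dtype=np.int16)
b = np.asarray(Image.open("4k_420_downscaled.jpg").convert("RGB"), dtype=np.int16)
assert a.shape == b.shape, "crops must cover the same region at the same scale"

diff = np.abs(a - b)                      # per-pixel, per-channel residue
GAIN = 16                                 # brightness boost ("scaling the gamma up")
vis = np.clip(diff * GAIN, 0, 255).astype(np.uint8)
Image.fromarray(vis).save("difference_x16.png")

print("mean abs error per channel (R, G, B):", diff.mean(axis=(0, 1)))
print("max abs error:", diff.max())
```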
    1 point
  4. I saw it in at least one person's list already ;) I haven't watched it completely yet, but I saw some of the low-light scenes. There's always this talk about the famous f/0.7 lens used for Barry Lyndon. Yeah, of course that's special, but come on, it was shot at ASA 100 pushed to 200! (At least, that's what I can find about it.) That would be the same as shooting digital nowadays at ISO 800 and f/1.4... whoohoo! That's peanuts for a $500 DSLR and a cheap manual-focus lens. Of course, this doesn't make the film bad, and it is more a testimony to what possibilities we have within reach now. But I kinda fail to be impressed by the technical side of it. Also, they used a shitload of candles, and special ones that burn a lot faster and brighter than the usual ones. /rant ;)
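     A quick back-of-the-envelope check of that equivalence (this ignores the actual lens's T-stop transmission, which the claim above doesn't account for either):

```latex
% Opening from f/1.4 to f/0.7 gains two stops; ISO 200 -> 800 also spans two stops.
\[
2\log_2\!\frac{1.4}{0.7} = 2 \text{ stops}, \qquad
\log_2\!\frac{800}{200} = 2 \text{ stops},
\]
\[
\text{so } f/0.7 \text{ at ASA 200 (pushed)} \;\equiv\; f/1.4 \text{ at ISO 800 in exposure.}
\]
```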
    1 point
  5. The difference between 8-bit and 16-bit might be apparent when comparing two images of the same resolution (on monitors whose resolution and bit depth match each image). Such a difference becomes obvious if the scene contains a gradation subtle enough to cause banding in the 8-bit image but not in the 16-bit image. However, in that scenario, if you could keep increasing the resolution of the 8-bit system's camera sensor and monitor, you would find that the banding in the 8-bit image disappears at some point. By increasing the resolution of the 8-bit system, you are also increasing its color depth, yet its bit depth always remains 8-bit. One can easily observe a similar phenomenon: find a digital image that exhibits slight banding when you are directly in front of it, then move away from it. The banding will disappear at some point. By moving away from the image, you are effectively increasing the resolution, making the pixels smaller in your field of view; the bit depth, however, stays the same regardless of viewing distance. Such a test wouldn't be conclusive unless each monitor matched the resolution and bit depth of the image it displays. Most image makers are not aware that bit depth and color depth are two different properties. In digital imaging, bit depth is a major factor in color depth, but resolution is an equally major factor (in both digital and analog imaging). Therefore, one can sacrifice resolution while increasing bit depth and the color depth remains the same (or even decreases); in other words, swapping resolution for more bit depth does not increase color depth.
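     A small numpy sketch of that trade (illustrative only, not from the post above): quantize a smooth ramp coarsely and it bands; render the same ramp at higher resolution with noise dither added before quantization, then box-average it back down (a stand-in for stepping back from the screen), and the banding largely averages away even though the stored bit depth never changed.

```python
import numpy as np

def quantize(signal, levels):
    """Round a 0..1 signal to a fixed number of levels (a coarse 'bit depth')."""
    return np.clip(np.round(signal * (levels - 1)), 0, levels - 1) / (levels - 1)

LEVELS = 16                        # deliberately coarse so banding is obvious
ramp = np.linspace(0.0, 1.0, 256)  # smooth gradient at "screen" resolution

banded = quantize(ramp, LEVELS)    # plain quantization: 16 hard steps
print("distinct levels, no dither:", np.unique(banded).size)  # -> 16

# Same ramp at 8x resolution with +/- half-step noise added before
# quantization, then box-averaged back to 256 samples ("viewed from afar").
rng = np.random.default_rng(0)
hi = np.linspace(0.0, 1.0, 256 * 8)
dithered = quantize(hi + rng.uniform(-0.5, 0.5, hi.size) / (LEVELS - 1), LEVELS)
recovered = dithered.reshape(256, 8).mean(axis=1)
print("distinct levels after dither + downscale:", np.unique(recovered).size)

# The largest tonal jump shrinks: resolution traded for effective depth.
print("max step, banded:   ", np.abs(np.diff(banded)).max())
print("max step, recovered:", np.abs(np.diff(recovered)).max())
```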
    1 point
  6. I haven't followed the topic, but I was curious whether I could simulate the theory. On Fred Miranda I found this topic that explains how Photoshop saves JPGs as 4:2:0 or 4:4:4. I used a full-resolution JPG from the GH2 and took the following steps (a rough Pillow version is sketched below):
     - Cropped the image to 3840x2160 and saved it as a JPG at quality 6. The result is a 4K 4:2:0 still.
     - Resized that image to 1920x1080 and saved it at quality 10 to make a 4:4:4 image.
     - Also resized the 4K 4:2:0 still to 1920x1080 and saved it at quality 6 to make a 1080p 4:2:0 image.
     200% crops of the above images, in the same order: 4K 4:2:0, 1080p 4:4:4, 1080p 4:2:0. Not sure if this test is correct, but to me it looks like you can gain color resolution by downsampling the 4K 4:2:0 file to 1080p. Correct me if I'm wrong!
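     For anyone who wants to repeat this without Photoshop, here is a rough Pillow sketch of the same three steps (the source filename is a placeholder, and Pillow's 0-95 quality scale only approximates Photoshop's quality 6 and 10; in Pillow's JPEG writer, subsampling=2 writes 4:2:0 and subsampling=0 writes 4:4:4):

```python
# Approximate re-run of the Photoshop test with Pillow (quality values are
# rough stand-ins for Photoshop's 6 and 10; "gh2_full.jpg" is a placeholder).
from PIL import Image

src = Image.open("gh2_full.jpg")

# Step 1: crop a 3840x2160 window and save it as a 4:2:0 "4K" JPEG.
uhd = src.crop((0, 0, 3840, 2160))
uhd.save("4k_420.jpg", quality=60, subsampling=2)

# Step 2: downscale the 4K 4:2:0 frame to 1920x1080 and save as 4:4:4.
hd = Image.open("4k_420.jpg").resize((1920, 1080), Image.LANCZOS)
hd.save("1080p_444.jpg", quality=85, subsampling=0)

# Step 3: same downscale, saved as 4:2:0 for the control image.
hd.save("1080p_420.jpg", quality=60, subsampling=2)
```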
    1 point
  7. I saw the explanation video too; it sounded more like an infomercial for Canon, because I don't know how anyone with eyeballs could advocate the C500 over the Alexa for a theatrical release with a straight face. Especially considering the footage shown in the tests looked like a wedding video. I love Shane's posts though, full of insights.
    1 point