jonpais Posted January 16, 2014

I want to join the 4K party, possibly with the Sony, perhaps the announced GH4, but... am I going to have to invest $12,000 to play? All I've got is the latest 15" Retina with 8GB RAM. It's enough for the tiny bit of editing and minor color correction I'm doing now. I've read about making proxies and such, but most of the sites I've consulted talk about RAID arrays and other things I'm not really familiar with. I'm also prepared to purchase one of the inexpensive 4K monitors that are coming out, though I realize they only have a 30Hz refresh rate. Am I being overly optimistic? It's hard to believe that Sony would release a $2,000 consumer camcorder knowing that your average Joe doesn't have over 10 G's lying around to invest in a build. Just like now, I would only be making videos for online delivery, not theatrical presentation or anything.
bzpop Posted January 16, 2014

Quoting jonpais: "It's hard to believe that Sony would release a $2,000 consumer camcorder knowing that your average Joe doesn't have over 10 G's lying around to invest in a build..."

Believe it, it'll be here in a couple of months. :) I've been shooting 4K since 2010, but I never deliver anything higher than 1080. :) To me, 4K (at least today) is not so much about resolution as about detail and DR. I haven't had a chance to get a closer look at XAVC, but RED R3Ds are extremely light, in fact way lighter than AVCHD for example, and I can edit 4K raw on my slightly modified Samsung i7 laptop. I can't wait to get that little 4K camera!
odie Posted January 16, 2014

And you'd be in the company of a lot of respected DPs who have made the transition to digital. Arri has proven since the introduction of the Alexa that, when it comes to digital, how you fill those rows and columns is ultimately more important than how many rows and columns you're filling. Ever higher resolution, more DR, and a wider color gamut are all goals the engineers busy working away at Sony or RED understand. They're practical problems with practical solutions, overcome by throwing more and faster engineering at them. Those engineers aren't going to create better cameras, though.

Last night I finally got around to watching Paul Thomas Anderson's The Master and immediately thought about where we're going here. It's almost poetic that this film was released the same year as the first installment of The Hobbit, a shitty-looking 4K movie shot on RED. I'm not anti-RED, and there are plenty of good-looking RED films; this isn't one of them, and this film is a good example of what's missing from RED's (and Sony's) path. Reading about the workflow on The Master reinforces my feeling that 4K and beyond, with digital cameras, is really a problem for filmmaking, and it's not just about resolution. Nobody has ever complained that 65mm made actors or sets or props or makeup look bad.

The Master was a mixed 65mm (mostly) and 35mm show, graded photochemically, with both print and DCP distribution. They did four separate release finishes for this film, with true photochemical finishes for the 70mm and 35mm prints. It's crazy. For their DI they scanned the 5-perf 65mm at 8K and the 4-perf 35mm at 6K for an eventual 4K DCP finish. More than once I've read, based on DP and colorist commentary, that anamorphic 35mm with a digital 4K finish is considered the minimum you need if you care about preserving most of what's in the neg, while 2K is an abomination; but 6K for spherical, when you're not shooting 8-perf, seems excessive by conventional wisdom. Everything I've read over the years about how obsessive P.T. Anderson is about the photography of his films leads me to believe they wouldn't have assigned film this level of resolution unless it warranted it. Every aspect of his acquisition and release methodology was thoroughly tested. My own experience on dozens and dozens of films working with scanned 35mm imagery concurs.

Making 4K actually look good with digital cameras isn't something that seems to be part of the conversation at these electronics shows. Soft lenses are the current bandaid, but I think that's all they are. Nobody is confronting head on a fundamental flaw in how these sensors record reality so discretely and unattractively. Resolution gets the blame, but film proves that to be false. For the 35mm portions of The Master the emphasis was on the sharpest lenses available, so that the footage cut well with the 65mm. Meanwhile, DP after DP is on record choosing reportedly soft lenses, like the Cooke S4s, for high-resolution digital acquisition, otherwise finding the look harsh and unappealing. Moreover, P.T. Anderson requires that his films be shot on the slowest stocks available, ensuring the smallest possible grain. DP Mihai Malaimare brought his personal 85mm Zeiss Jena for Panavization, the sharpest 85mm he's ever used or seen, and Panavision found a matching set of expanded focal lengths already in their possession. All this emphasis on resolution didn't make the film less beautiful.
On the contrary, I can't think of the last time I saw close-ups in a motion picture that inspired this much awe. It was just astounding. And the color. Seeing a film, shot on film, with a photochemical grade that looks this amazing also points out just how over-graded a majority of films are, regardless of the origination. There are a lot of "just because you can doesn't mean you should" observations one could make.

Consider all the criticisms levied against The Hobbit, and DP after DP working around and against the "edge" of Ultra HD (some of which, yes, is preserved even in 2K or 1080p reductions): this was never the case with film, shooting with what we know to be higher resolving power than 4K. I think we need to shift the conversation away from the simple number of rows and columns and toward how they're being filled. Video engineers did finally get around to giving us the ability to shoot video with film-like gamma curves, finally appreciating that we don't like our movies looking like the nightly news. Now they need to appreciate that 4K+ for movies needs to be different from 4K+ for covering Formula 1 or football.

Agreed... Alexa over RED... Kodak 35mm overall.
fuzzynormal Posted January 16, 2014

Quoting jonpais: "I've read about making proxies and stuff, but most of the sites I've consulted talk about RAID and so forth, stuff I'm not really familiar with."

The workflow does involve an intermediate editing step, sure, but I did plenty of proxy editing in the mid-'90s (imagine a computer not powerful enough to handle 29.97 640x480 video), and proxy edits are not that big of a deal. I'd rather do proxy cuts than spend $10K on computer hardware. If you do proxy editing you wouldn't really need a RAID. In FCP, for example, you'd simply ingest your media in a low-res proxy format and do your entire edit. When it's finished, you'd load in your high-res clips and reconnect the files for a 4K video. The point is, it might seem like a big deal if you haven't done it before, but once you've pulled it off you'll realize it's not too difficult. I actually cut a documentary this way two years ago, because loading up hundreds of hours of 1080 4:2:2 ProRes was too much data... so we just cut it low-res.
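For anyone curious what that proxy step looks like outside the NLE, here's a minimal sketch that batch-generates low-res proxies with ffmpeg from Python. Everything in it is an illustrative assumption (the folder names, the 960-pixel width, the ProRes Proxy codec choice), not FCP's actual mechanism; FCP does the equivalent internally when you ingest with proxies enabled.

```python
# Minimal proxy-generation sketch; assumes ffmpeg is on the PATH.
# Folder names, scale, and codec settings are illustrative only.
import pathlib
import subprocess

SOURCE_DIR = pathlib.Path("footage_4k")    # hypothetical camera originals
PROXY_DIR = pathlib.Path("footage_proxy")  # low-res copies to cut with
PROXY_DIR.mkdir(exist_ok=True)

for clip in sorted(SOURCE_DIR.glob("*.mov")):
    proxy = PROXY_DIR / clip.name  # same filenames so the NLE can relink later
    subprocess.run([
        "ffmpeg", "-i", str(clip),
        "-vf", "scale=960:-2",                    # roughly quarter-res from 4K
        "-c:v", "prores_ks", "-profile:v", "0",   # profile 0 = ProRes Proxy
        "-c:a", "copy",                           # leave audio untouched
        str(proxy),
    ], check=True)
```

Cut with the proxies, then point the project back at the originals for the conform and the final 4K export.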
Sean Cunningham Posted January 16, 2014

Even at 1080p, most folks don't need to be editing raw, and they shouldn't be scrounging around for specs that let them spend gobs of money to edit with the raw files. They need to really learn how to edit, and to separate their editing from the conform and finish. You can be totally productive with terribly basic hardware when you take a disciplined approach and understand the process.
fuzzynormal Posted January 16, 2014

Quoting odie: "Agreed... Alexa over RED... Kodak 35mm overall."

This is going off topic, but from the online discussions I've seen, what some folks don't seem to fathom about the whole resolution debate, in the context of a film like The Hobbit, is how perceived resolution changes as the frame rate increases. The faster the frame rate, the more that series of still pictures is perceived as more RESOLVED and lifelike... even if the pixel resolution is IDENTICAL and it's shot on the SAME camera. Everyone seems to focus their discussion on pixel resolution, when (I think) it's really about the aesthetic issue of slower frame rates. This is an overlooked cinematic effect that shouldn't be ignored online, but too often is. (Professionals get it, though.)

Look, here's a straw man for ya: "Oh, I like films shot on the Alexa!" people argue. "It's so much better than everything else. So organic and pure." Well, yeah, it kinda is, but are you enamored with the cinematic look of the frame rate or the actual image resolution? I suggest it's both. If you shot 60p on an Alexa, I guarantee a film purist would take one look at the result and be horrified at the "videoness" of the image. Shoot the same exact scene at 24p on the Alexa, play it back, and the film purist would instantly feel more comfortable.

Aside from all that, just watching a film shot at 48p and then played back at 24p will certainly alter the perceived cinematic aesthetic of the film. It will present different visual characteristics. At 24p you'd be watching every other frame of a 48p shoot, and that's all it takes. Watch the footage at its originally shot 48p and it starts to look more "video/electronic" (and thus less cinematic) to the human eye.

Got a camera and monitor that do both 25p and 50p? Shoot 50p, put the clip on a 50p timeline and then on a 25p timeline, and watch the difference. Or shoot a horse race at 50p, shoot another at 25p, and go look at the perceived change in the image; there's a sketch of that every-other-frame experiment below. You'll see in a hurry that it's not resolution that's altering your idea of what it means for an image to be "cinematic." And I'm not even getting into motion blur and shutter speeds, which also greatly alter the perception of moving pictures.

Long rant short: it's not just about the resolution.
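If you want to run the every-other-frame experiment yourself, here's a rough sketch using ffmpeg from Python (ffmpeg assumed installed; the filenames are hypothetical). It keeps every second frame of a 50p clip, so the per-frame motion blur is unchanged and only the temporal sampling drops to 25p:

```python
# Decimate 50p to 25p by keeping every other frame; assumes ffmpeg is
# on the PATH and the input file exists (names are illustrative).
import subprocess

subprocess.run([
    "ffmpeg", "-i", "horse_race_50p.mov",
    # select frames 0, 2, 4, ... then rewrite timestamps for 25 fps playback
    "-vf", "select='not(mod(n,2))',setpts=N/(25*TB)",
    "-r", "25",
    "-an",  # drop audio so the comparison stays purely visual
    "horse_race_25p.mov",
], check=True)
```

Play the result next to the untouched 50p clip: the motion rendition changes immediately, even though every remaining frame is pixel-identical to one from the original.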
fuzzynormal Posted January 16, 2014

Quoting Sean Cunningham: "You can be totally productive with terribly basic hardware when you take a disciplined approach and understand the process."

What this guy said. Read it. Believe it. Comprehend it. Out of all the variables needed to make good motion pictures, gear is such a small part of the equation. I wish there were less fretting about equipment. With all this modern technology, it's all good enough to allow incredible creations... so create. Or collect, I guess. I know I'm guilty of wanting some new "thing" just because it's cool; I have a weakness for collecting gadgets. But the fact is, we all have better imaging power in our hands than most filmmakers did in the '60s and '70s. The bigger question is: what am I going to do with it?
ntblowz Posted January 16, 2014

Quoting jonpais: "I'm also prepared to purchase one of the inexpensive 4K monitors that are coming out, though I realize they only have a 30Hz refresh rate..."

The Asus 28" 4K is reported to do native 4K at 60Hz, unlike the Dell offering, which only does 4K at 30Hz; they're both supposed to cost around $800 or less, I think.

I used to have a 15" Retina MacBook Pro, but over time it would overheat; my friend's power adapter even melted! (Mine clocked over 100°C when rendering; you could probably boil water or cook an egg on it, lol.) I've now bought myself a new desktop for editing and rendering. It exports video at least 70% faster than the 15" MBPR at only a third of the price, runs much cooler, and has a lot of HDD space to spare (with the option to add a PCIe HDD/SSD in the future for ultimate speed). With 32GB of RAM it should handle up to 6K with no problem too, I hope.

By the way, once you've seen a 4K monitor you can instantly tell the difference between 4K and FHD/WQHD monitors. The difference in clarity and detail is quite apparent, and FHD looks soft on a 4K panel compared to native 4K. (On the 15" MBPR the difference is not that great, since 15" is really small.)
Tone13 Posted January 16, 2014

Not sure how a 4K recording has more DR than 1080p? The BMCC has 13 stops of DR and the 4K version has 12. Different sensors, sure, but the 4K version has LESS dynamic range. What I'm saying is that just because something wears the 4K badge does not automatically mean it's better. 4K for the masses is marketing hype driven to get your money, whether you need it or not.
SPG Posted January 17, 2014

A funny thought occurred to me while reading this: EOSHD.com no longer has any love for either EOS or HD. Perhaps it's time to rename the site GH4K.com? :)

But more to the original topic: we'll all be shooting 4K eventually. Maybe not this year or next, but it's inevitable that the consumer electronics companies will complete the push to 4K. They need to sell more TVs and cameras, and the HD ones out there now will eventually wear out and need replacing. Just as there aren't many 4:3 SD TV sets left, so too will HD sets be replaced. I suppose the only real question is when? My personal guess is that we'll be fine shooting and delivering HD for at least another three years or more.
Sean Cunningham Posted January 17, 2014

Quoting fuzzynormal: "This is going off topic, but from the online discussions I've seen, what some folks don't seem to fathom about the whole resolution debate, in the context of a film like The Hobbit, is how perceived resolution changes as the frame rate increases."

That's certainly a big reason why The Hobbit is the biggest offender here. The faster frame rate, even when downsampled to 24p, retains the crispness of the HFR, and therefore high-shutter, footage. But other 4K digital acquisition, at 24p, will still have objectionable aesthetics if you compare it side by side with film scanned at similar resolution.

I'm having a parallel discussion on the subject with some industry colleagues, and apparently some vendors actually have been working on the problem and identifying one of the biggest reasons why, given two images of equal resolution, one originated on film and the other digital, the analog picture will almost invariably be found more pleasing:

"The HVS* is exquisitely sensitive to discontinuities of all sorts, up to the third derivative (for you math geeks). It enhances them, and one's attention goes immediately to them. By evolutionary design. It's vital to staying alive and finding prey in the bush. But in art and entertainment, we want to direct the attention of the viewer, so such artifacts are highly undesirable." (* HVS = human visual system)

We're designed to detect patterns and edges, and the fixed grid of the sensor is the perfect structure to draw our attention. It is also a phenomenon with both spatial and temporal factors (like the HFR of The Hobbit). Film resolves higher than digital of supposedly similar resolution when you include how we perceive the analog image over time versus the fixed, never-changing grid. This is something that Aaton (see the pattern here: Arri, and now Aaton) is or was working on with the Delta Penelope:

"Another revolutionary design is in the sensor, which is the first in the industry to be mounted to a moving assembly. By offsetting the physical position of the sensor by half a pixel with each frame, the spatial resolution is increased over time. This is akin to film image capture, where the random structure of grain and silver halide crystals in each frame creates greater image resolution in the changes that happen on the image surface level between frames. In other words, while an individual film frame may appear relatively low in resolution and high in grain, the random structure of the actual imaging material on a strip of film stock means that information is captured in different spots and grain is deposited in different positions from frame to frame. When these frames are shown successively, one after another, the cumulative effect is an overall increase in resolution and a decrease in visible grain structure. This is increased temporal resolution via increased spatial resolution."

This is my first exposure to the project, but reportedly they were having problems with the Dalsa sensor. Edit: oh, I guess the company is no more. Hopefully something like this will be along the lines of where Arri goes with new cameras.
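To make the half-pixel idea concrete, here's a toy one-dimensional sketch in Python/numpy. It's only an illustration of the sampling principle, and my own construction, not the Delta Penelope's actual processing: detail that aliases at the coarse pixel pitch becomes resolvable once two frames captured half a pixel apart are interleaved.

```python
# Toy 1D illustration of half-pixel sensor offset; a sketch of the
# sampling principle only, not how Aaton's camera actually worked.
import numpy as np

x = np.linspace(0, 1, 1000, endpoint=False)
scene = np.sin(2 * np.pi * 40 * x)   # detail beyond one frame's Nyquist limit

pitch = 20                            # coarse "sensor" reads every 20th point
frame_a = scene[0::pitch]             # frame 1: sensor at its home position
frame_b = scene[pitch // 2::pitch]    # frame 2: sensor shifted half a pixel

# Interleave the two frames: effective sampling rate doubles, so the
# 40-cycle detail (aliased at 50 samples per frame) resolves at 100.
combined = np.empty(frame_a.size + frame_b.size)
combined[0::2] = frame_a
combined[1::2] = frame_b

print(f"{frame_a.size} samples per frame -> {combined.size} combined")
```

In the camera the integration the eye performs across successive frames does the combining; the arithmetic of the resolution gain is the same.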
see ya Posted January 17, 2014

Quoting Tone13: "Not sure how a 4K recording has more DR than 1080p?"

Whatever, but 4K brings with it a new specification, BT.2020: http://www.itu.int/dms_pubrec/itu-r/rec/bt/R-REC-BT.2020-0-201208-I!!PDF-E.pdf

That specification explains how the wider DR and wider gamut will be transported in the codec: a tweaked rec709 transfer curve (at the low end) and different color primaries, wider than rec709, in a 10-bit codec. Maybe some camera manufacturers will still offer 8-bit LOG to try to carry that gamut and DR. 4K is not just about resolution, although anyone can choose what they take from the new 4K standard.

http://www.avsforum.com/t/1496765/smpte-2013-uhd-symposium
http://www.ultrahdtv.net/what-is-ultra-hdtv/

That's all assuming the camera sensor is capable of providing wider DR, and as long as the display technology supports BT.2020 to show it. Bear in mind that LCD technology in these crap computer monitors struggles with contrast ratio and an elevated black level even to display a decent rec709 image (a 5-stop curve); without a UHD-spec'd display, those benefits won't be seen apart from the resolution, and then only as long as the camera provides a decent image. We already read about how camera raw offers wider DR than a typical rec709 or sRGB curve-compressed video output, and the BS talk of "highlight recovery" (aka tone mapping, in this case) to get that wider DR onto crap or rec709-limited monitors. The new BT.2020 transfer curve, it is suggested, along with the 10-bit coding, should let that DR through, even in a compressed codec; there's a sketch of the curve below.

I'd guess the first wave of cameras and displays will offer 4K resolution, a 1080p mode, and rec709 backward compatibility, with later releases bringing the BT.2020 UHD specification and wider gamut, in some mash-up of specifications to pick through. Although we've already had xvColor (xvYCC) extended gamut via rec709 primaries, and "Deep Color" for higher-saturated colors, and neither took off. Just marketing hype.
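For reference, here's a quick sketch of the BT.2020 transfer characteristic from the ITU document linked above, the familiar two-piece rec709-style OETF. The constants shown are the 10-bit values; the spec gives slightly more precise ones (1.0993 and 0.0181) for 12-bit systems.

```python
# BT.2020 opto-electronic transfer function (OETF), 10-bit constants.
ALPHA = 1.099
BETA = 0.018

def bt2020_oetf(e: float) -> float:
    """Map scene-linear light e in [0, 1] to the non-linear signal value."""
    if e < BETA:
        return 4.5 * e                       # linear segment near black
    return ALPHA * e ** 0.45 - (ALPHA - 1)   # power-law segment above the toe

# Example: 18% grey encodes to about 0.41 of full signal range.
print(round(bt2020_oetf(0.18), 3))
```

The wider gamut comes from the BT.2020 primaries, not from this curve; the curve only defines how code values map to light.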
Axel Posted January 17, 2014

@BurnetRhoades Interesting. I didn't know. It all comes back to niceties of aesthetic perception, doesn't it? We are becoming modern alchemists, watching paint dry while life continues around us, unbeknownst to us. :blink:

Makes sense. Take a single frame from a 35mm release print and see how much it can be enlarged. At 15 x 10 inches (rather sooner) it gets grainy; hard to imagine it's fit for a 50-foot screen.

Next big thing: sensor offset. Oh really? You're still recording with a fixed sensor? You can't be serious. I top your so-called 4K resolution with my R-O (random offset) 1080 sensor easily. Didn't you study "Why I am going R-O and why you should too"? Try selling it on eBay.

Quoting see ya: "That's all assuming the camera sensor is capable of providing wider DR, and as long as the display technology supports BT.2020 to show it."

Yeah, and imagine the first time you power on your new 14-bit, 4-8K monitor in three or four years and watch on it the 10-bit raw or 4K video you now deem perfectly graded. Reminds me of Mephistopheles:

"I am the Spirit that denies! And rightly too; for all that doth begin should rightly to destruction run; 'twere better then that nothing were begun."

EDIT: Irony, of course.
Tone13 Posted January 17, 2014

Quoting see ya: "Whatever, but 4K brings with it a new specification, BT.2020... 4K is not just about resolution."

While there's no doubt the 4K codec brings many advantages, it's just plain wrong to say 4K is better when many factors (mainly the sensor) contribute to the image properties. So will the Sony 4K handycam have better DR than the Alexa because it's 4K and the Alexa is not? I can see where you're coming from, but stating that 4K has better DR just because it's 4K is plain wrong. And while the 4K codec is impressive on paper, the same qualities could have been given to a 1080p codec at much lower data rates, with similar results (other than resolution, obviously). But why would they create a killer 1080p codec now, when they're trying to force 4K down our throats? Businessmen in suits have given the masses 4K; 99% of consumers and professionals alike never asked for it.
fuzzynormal Posted January 17, 2014

Quoting Tone13: "Stating that 4K has better DR just because it's 4K is plain wrong."

Indeed. That's kind of a strange assumption. Anyone who's been using a variety of digital cameras should be well aware of the IQ differences between sensors.
fuzzynormal Posted January 17, 2014

Quoting Tone13: "Businessmen in suits have given the masses 4K, even if 99% of consumers and professionals alike have not requested it."

While true, I welcome this particular technology push, even as a marketing tool. I also believe they're doing this for the simple reason that making sensors with more pixels is a ton easier than making a sensor with better dynamic range. Simply: it's cheaper and gives them some marketing clout. Like the megapixel wars, the numbers will matter less as it all equalizes across the competition. When that happens, IQ will have its day, especially as pro and enthusiast photographers start using video as a means of non-stop stills shooting. Those customers are demanding, and IQ matters to them; they're not going to settle for 4K if the IQ is inferior. It's all moving forward, and I'm cool with that.
see ya Posted January 17, 2014

Quoting Tone13: "I can see where you're coming from, but stating that 4K has better DR just because it's 4K is plain wrong."

Where did I say 4K has better DR? I didn't. You asked, "Not sure how a 4K recording has more DR than 1080p?" and I answered: the BT.2020 specification, as long as the camera can provide it. I'm not interested in all the 4K BS debate; it's business as usual. Same old same old. New formats, standards, hype and BS. Choose the camera that suits the individual and f--k what anyone else thinks.
see ya Posted January 17, 2014

Quoting fuzzynormal: "Indeed. That's kind of a strange assumption. Anyone who's been using a variety of digital cameras should be well aware of the IQ differences between sensors."

Is that aimed at me? I guess not, because I never made that strange assumption. ^
fuzzynormal Posted January 17, 2014

Quoting see ya: "Is that aimed at me? I guess not, because I never made that strange assumption."

Correct guess.