
jcs

Members
  • Posts

    1,839
  • Joined

  • Last visited

Reputation Activity

  1. Like
    jcs got a reaction from valery akos in Capturing the best A7s skin tones   
    I started creating a tutorial for good skin tones on the A7S, but higher-priority projects have taken my time. There are many ways to get good skin tones on the A7S: creating custom profiles can help, though stock PP6 can work very well as is. Slog2 gamma with Pro or Cinema color also works well. It is possible to get good skin tones with the A7S using Resolve and no LUTs at all, everything from scratch (scopes are indeed helpful). Casey posted some test images shot in Slog2 on dvxuser. Here is the thread: http://www.dvxuser.com/V6/showthread.php?334239-a7s-skin-tones-slog2-vs-pp-off-(shogun-4k) (IIRC, account login required to view images). Here are the stock Slog2+SGamut images:

    Here are my grades (no LUTs- just the basic tools in Resolve):

    More blue to separate the foreground actor from the background, also changes the mood:

    Casey's PP Off shot:

    I prefer the graded Slog2+SGamut vs. PP Off. I mostly use tweaked Slog2 + Pro color, and CINE1/CINE2/CINE4 + Pro or Cinema (tweaked).
    Casey also created some useful stock PP7 (Slog2+SGamut) LUTs: http://www.dvxuser.com/V6/showthread.php?334831-a7s-slog2-3d-luts (based on the F65/55 LUTs).
  2. Like
    jcs got a reaction from Geoff CB in Visioncolor Impulz Luts   
    Before I got more serious about filmmaking, I created a custom tool for mapping any color to any other color. It used a 3D LUT with trilinear or tricubic interpolation. The UI was in 3D, and 3D glasses were worn to edit the cube lattice (editing was in stereo 3D). The UI also provided 2D rendered slices through the cube to help visualize the transform being created. The tool wasn't a retail product, but rather a tool used to figure out a solution to a specific problem. It became clear that while a 3D LUT is very powerful, the distortions created in the mapping can lead to 'color collapse', meaning many colors get mapped to the same value (banding, solarization, poor skin tones). And because the final values must be mapped back to a [0,0,0] -> [1,1,1] space, clipping or other techniques must be used, which can create further unwanted artifacts.
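    The trilinear lookup described above can be sketched in a few lines of Python (a minimal illustration, not the original tool; the dense N×N×N grid layout and the [0,1] value range are assumptions):

```python
def identity_lut(n):
    """Build an n*n*n identity LUT: lut[r][g][b] == (r, g, b), normalized to [0, 1]."""
    s = n - 1
    return [[[(r / s, g / s, b / s) for b in range(n)]
             for g in range(n)] for r in range(n)]

def apply_lut_trilinear(lut, rgb):
    """Look up an RGB triple in the cube with trilinear interpolation."""
    n = len(lut)
    # Locate the lattice cell and the fractional position inside it, per channel.
    coords = []
    for c in rgb:
        x = min(max(c, 0.0), 1.0) * (n - 1)
        i = min(int(x), n - 2)        # clamp so i+1 stays in range
        coords.append((i, x - i))
    (ri, rf), (gi, gf), (bi, bf) = coords
    out = []
    for ch in range(3):
        acc = 0.0
        # Blend the 8 surrounding lattice points, weighted by distance.
        for dr in (0, 1):
            for dg in (0, 1):
                for db in (0, 1):
                    w = ((rf if dr else 1 - rf) *
                         (gf if dg else 1 - gf) *
                         (bf if db else 1 - bf))
                    acc += w * lut[ri + dr][gi + dg][bi + db][ch]
        out.append(acc)
    return tuple(out)
```

    An identity cube maps every input to itself, which makes a handy sanity check; 'color collapse' corresponds to many distinct inputs landing on the same output triple after the lattice is distorted.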
    3D LUTs work best when the input is exposed the way the LUT 'expects' (per its design); changing exposure before the LUT, or shooting too low or too high, can radically change the output. I purchased Film Convert and Impulz Ultimate, and while both are useful tools, I don't use them very often. I might use them more if they supported my (by far) favorite film stock: Eastman Kodak 100T 5248/7248.
    A 3D LUT cube can be converted to a 2D bitmap (and back to a 3D LUT): I used this method for an iOS app which needed fast real-time 3D LUTs.
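    The cube-to-bitmap packing can be illustrated with a simple index mapping (the tile layout below, with the B axis unrolled into side-by-side tiles, is one common convention and not necessarily the one that iOS app used):

```python
def cube_to_bitmap_xy(r, g, b, n):
    """Map a 3D LUT index (r, g, b) to a pixel (x, y) in an (n*n) x n strip image.

    Each of the n tiles is an n x n slice of the cube at a fixed B value;
    within a tile, R indexes x and G indexes y.
    """
    return b * n + r, g

def bitmap_xy_to_cube(x, y, n):
    """Inverse mapping: a pixel (x, y) in the strip back to a LUT index (r, g, b)."""
    return x % n, y, x // n
```

    Because the mapping is a pure index shuffle, the round trip is lossless, which is what makes storing a 3D LUT as an ordinary texture (and sampling it on a GPU) practical for real-time use.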
    Here's a 3D LUT creator that works similarly to the custom tool I created (but with a 2D UI and 2D bitmap display): http://3dlutcreator.com/
    When I'm watching a movie on Netflix and a scene has amazing color, I stop the movie and look it up on shotonwhat.com. I did this recently while watching Braveheart, on this scene:
    Shot with iPhone 5S on Sony XBR5 HDTV:

    The blue/magenta halo is not visible on the TV; it's an iPhone 5S artifact. Screenshot on a MacBook Pro in Safari (Netflix makes screenshots a challenge: most come out black, and stopping on a good sharp frame is tricky as well; here's a close sharp frame). Note the reduced brightness and contrast: the HDTV image looks much better in real life!

    Reading the Kodak paper on 5248 film (http://motion.kodak.com/motion/uploadedFiles/H-1-5248t.pdf), something interesting is apparent: sharpness varies by channel (B is sharpest, followed by G, then R). Blurring the G and R color channels may help recreate the 5248 look (not possible with a 3D LUT alone). I stopped Men in Black II on a similar shot, with Will Smith in front and blue sky in back. There's something magical about blues, pinkish skin tones, and 5248 film. The Last Samurai, The Fifth Element, Armageddon, Fight Club, American Beauty, Star Trek: First Contact, The Shawshank Redemption, Baraka, and many more favorites were all shot on 5248.
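    The per-channel blur idea can be sketched like this (the radii are illustrative guesses, not values derived from the Kodak datasheet; a real implementation would blur in 2D and calibrate against the film's MTF curves):

```python
def box_blur_1d(row, radius):
    """Simple 1D box blur over a list of samples; radius 0 returns a copy unchanged."""
    if radius == 0:
        return list(row)
    n = len(row)
    out = []
    for i in range(n):
        lo, hi = max(0, i - radius), min(n, i + radius + 1)
        out.append(sum(row[lo:hi]) / (hi - lo))
    return out

def blur_channels(r_row, g_row, b_row, radii=(2, 1, 0)):
    """Apply a different blur radius per channel: R softest, G slightly soft, B sharp."""
    return (box_blur_1d(r_row, radii[0]),
            box_blur_1d(g_row, radii[1]),
            box_blur_1d(b_row, radii[2]))
```

    The point is simply that this is a spatial operation: each output pixel depends on its neighbors, which is exactly what a 3D LUT (a pure per-pixel color mapping) cannot express.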
  3. Like
    jcs got a reaction from Ed_David in Visioncolor Impulz Luts   
  4. Like
    jcs got a reaction from Christina Ava in Canon 5D Mark IV "will be 1080p" with Canon LOG   
    If the 5D4 is truly full-resolution 1080p (as good as the A7S); has Dual Pixel AF with solid autofocus on EF lenses; comes close to the A7S in low light/sensitivity with similar or better DR; offers similar or better slow motion (full-resolution 60p at 1080p); has a crop mode similar to the A7S (and perhaps more sizes); and has audio preamps as good as or better than the A7S's, then it will be a great replacement for the current A7S (video) + 5D3 (stills) package. 10-bit 422 H.264 would be a bonus to go along with log (why not, if they're using 4K to delineate product lines?).
  5. Like
    jcs reacted to johnnymossville in Panasonic GH4 firmware update brings 24p Anamorphic, V-LOG coming in later update / watch footage   
    Dunno, I just thought Technicolor when I saw it. Actually, looking at real Technicolor, I prefer that. Like Marilyn Monroe here.
     
     

  6. Like
    jcs got a reaction from Beritar in Samsung NX1 dynamic range comparison   
    The GH4 can do a bit better with DR by using iDynamic to bring up the shadows, then underexposing to protect the highlights: http://***URL removed***/previews/panasonic-dmc-gh4-sony-alpha-7s/14
     
  7. Like
    jcs reacted to FilmMan in Canon C300 Mark II - $15,999 4K camera   
    Good one. Or maybe the skin tones need more blue with a touch of red. Cheers.
  8. Like
    jcs got a reaction from FilmMan in Canon C300 Mark II - $15,999 4K camera   
    What camera was used to shoot this? Watching on an iPhone 5S (Retina- generally excellent color), the face skintones look too green. Don't tell me he's an alien and the color is accurate. Everyone knows the green aliens top out around 4'.
  9. Like
    jcs got a reaction from zetty in Audio on GH4 and NX1   
    mercer- yes, thanks, Delta was our first narrative short. We had no complaints about the audio/sound (other elements, yes; a learning experience for us!). Glad you got the humor shots (I shot everything; no stock footage). The short was supposed to be campy and fun; the special effects are an homage to old-school video games / Star Wars, etc. We'll improve the story (and everything else) on the next one.
    The simplest solution would be a shotgun mounted on camera with a high-quality isolated mount (far enough away from the lens to mask stabilizer noise, etc., when present), with a decent preamp, either in the mic system itself (Rode etc.) or as another piece of hardware (JuicedLink sounds cleanest for the money; SD's MixPre-D sounds much better and has superior limiters: worth over double the cost IMO. The hacked iRig Pre is perhaps the best low-cost, ultra-small solution: http://www.dslrfilmnoob.com/2012/11/25/irig-pre-hack-cheap-xlr-phantom-power-preamp-dslr/ ). You could also rig a boom to a backpack, etc., to get the mic closer to the subject (you really need the mic pointed down toward the ground to take advantage of its noise rejection).
    Regarding dual-system sound, IMO it's only worth it for larger productions with a dedicated sound person, and only when using Sound Devices or similar-quality gear. I have a Zoom H4n and a Tascam DR100mkII: neither is high enough quality vs. an internal DSLR feed with a preamp to warrant the extra effort of separate sound. The DR680 and newer are good, and the SD 702 and up are preferred. SD gear has amazing preamps: not only clean, but a very full, natural 'Hollywood' sound. The SD limiters are also very, very good; the extra cost for SD gear is worth it: http://www.bhphotovideo.com/c/product/429566-REG/Sound_Devices_702_702_High_Resolution_2_Channel.html . How many 5.0 reviews do you see on ANY piece of gear? Pretty amazing.
    Steve M.- going wired like that can certainly sound better than wireless (until you get to the Zaxcom or Lectrosonics digital wireless level), though the Sennheiser (and similarly priced gear) is more than good enough for indie work (even material planned for paid streaming delivery). Another solution that can be 'free' is to use old cell phones as lav recorders (even your current cell phone). Rode makes a decent lav for iPhone & Android: http://www.amazon.com/Rode-smartLav-Lavalier-Microphone-Smartphones/dp/B00EO4A7L0 . It is technically possible for an app to be remotely controlled to start recording, meaning a bunch of iPhones/Androids could join an ad hoc WiFi network and be triggered to record remotely, then send a compressed AAC copy back over WiFi to the controller, where the mixed audio could be monitored live (if this already exists: very cool). In post, the locally recorded uncompressed WAV files can be used for editing (along with timecode and/or sidecar metadata to make syncing easier).
    Not monitoring lav recordings live is indeed very risky: many times the mic/cable rubs and must be readjusted due to talent movement (cable loop taping and careful placement help, but there are still issues that come up during recording).
     
  10. Like
    jcs got a reaction from SleepyWill in Audio on GH4 and NX1   
    The trick to getting professional audio (low noise, excellent quality) is to use a preamp into the DSLR. In the field, the Sennheiser G3 wireless lav system works great, especially for run & gun and guerrilla shooting when you don't (or can't) have a boom operator. I use two G3 systems for two channels and a simple Y-adapter into the Sony A7S, and it works great. In the studio / on green screen I use a Sound Devices USBPre 2 (also a standalone preamp, same hardware topology as the 744T) and an Audio Technica BP4029 mid-side stereo shotgun. For ADR in post I use a Shure SM7 mic into a Focusrite Scarlett 2i2 (amazing-quality preamps for the price; not as good as Sound Devices, but a more stable solution as a computer input device: SD made some odd choices for the USBPre 2 hardware, as in it never turns off, even when the computer sleeps, plus some driver issues).
    Example audio: http://brightland.com/w/delta/ : jump to 6:30 to hear VO (SM7 + Scarlett) and the BP4029 + Sound Devices (into a GH4) for the prelude to the fight scene. Earlier scenes used the G3s into the A7S. All audio sweetening was done from within Premiere Pro (I used to use Pro Tools, including their crazy expensive hardware) with small amounts of Audition (waveform editing). For shorts/indies/corporate/for-fun work, this level of hardware is plenty good!
  11. Like
    jcs reacted to mercer in Audio on GH4 and NX1   
    jcs, I assume that was a link to your work. It sounded good. Interesting green screen shots too. Your Hollywood stock footage and 'Hooray for Hollywood' interlude was funny. The narration sounded very nice. 
  12. Like
    jcs got a reaction from arellaTV in Magic Lantern run Linux 3.19 on Canon DSLRs - download source code   
    This
    A completely custom GUI to run the camera, with complete control over everything: not running any of Canon's firmware would be possible. Instead of hacking on top of the existing firmware/OS, it's now possible to completely take over: potentially higher performance and new functions/features. Loading codecs: not likely. However, if there's any way to access the custom hardware to perform debayering at higher quality, then send the result to the custom hardware for H.264 encoding at a higher bitrate, it could be possible to have RAW-like resolution with H.264. If it's possible to access the debayer hardware as well as individual elements of the H.264 hardware, such as the DCT, it could be possible to create something compatible with ProRes (it's not likely the ARM processor can do everything by itself).
  13. Like
    jcs got a reaction from neosushi in EditReady now supports the Samsung NX1 / H.265   
    They appear to be using FFmpeg elements for file reading (LGPL libraries vs. FFmpeg itself) and appear to be using AVFoundation (etc.), which is native to OS X, if using Apple's ProRes for final output. At the point where AVFoundation supports H.265 decoding (does it already?), FFmpeg (and its libraries) wouldn't be needed at all to transcode H.265 into ProRes. AVFoundation is super easy to code for (especially compared to DirectShow on Windows). If AVFoundation already supports H.265, a custom tool for H.265 to ProRes would be fairly quick & easy to create (a few hours of coding; the GUI, etc. would take more time). MainConcept also has decent tools/libraries (possibly much faster than the free/GPL/LGPL stuff), though the licensing would drive up the product cost.
  14. Like
    jcs reacted to Julian in New Fast Transcode App with Benchmarking Features   
    Tried some NX1 files (80mbps 4K) with the same settings on my i7 3770 (4 cores 8 threads), 16GB RAM.
    I'm getting about 6.0 MB/s and 16 FPS when converting 4 files at once (I can't set Max Jobs higher than 4).
    It makes sense that your 12-core machine is faster, but how come you get double the FPS while the data rate is not that much different? Does that depend on the source files?
    When using RockyMountain Movie Converter (4K ProRes LT) it says it's working with a bitrate of around 275000 kbps. This translates to around 35MB/s. The 4 files (46 seconds in total) take about 90 seconds to convert.
    With Photon the same job is done in 68 seconds. The resulting files from Photon are a lot smaller though (600MB vs 1,5 GB for RMMC) with those settings.
    Also just noticed that the files from Photon have less contrast: RMMC seems to clip the blacks and whites where Photon still shows detail. Interesting, because the RMMC file looks the same as the original file when played back, but this might be another playback issue with contrast/black levels...
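    For reference, the bitrate arithmetic in the post above works out like this (assuming decimal units, 1 MB = 1000 kB):

```python
def kbps_to_mbytes_per_sec(kbps):
    """Convert a bitrate in kilobits/s to a data rate in megabytes/s."""
    return kbps / 8 / 1000  # 8 bits per byte, 1000 kB per MB

def total_size_mb(kbps, seconds):
    """Approximate file size in MB for a given bitrate and duration."""
    return kbps_to_mbytes_per_sec(kbps) * seconds
```

    275,000 kbps comes to 34.375 MB/s (the "around 35 MB/s" above), and over the 46 seconds of footage that is roughly 1.58 GB, which lines up with the ~1.5 GB RMMC output size reported.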
  15. Like
    jcs reacted to Geoff CB in New Fast Transcode App with Benchmarking Features   
    Awesome, will be passing this around. Put up a PayPal button for those who wish to thank you.
     
  16. Like
    jcs reacted to ntblowz in New Fast Transcode App with Benchmarking Features   
    Thanks for sharing!
  17. Like
    jcs reacted to sqm in New Fast Transcode App with Benchmarking Features   
    thanks jcs
    is it windows only?
     
  18. Like
    jcs reacted to nougat in New Fast Transcode App with Benchmarking Features   
    Thanks for creating this app and sharing.  Can't wait to try it out and report back.
  19. Like
    jcs reacted to The Chris in New Fast Transcode App with Benchmarking Features   
    I need to track down a copy of windows, I'd like to try this out.  As always JCS, thanks for sharing your creations with the community.
  20. Like
    jcs reacted to Geoff CB in New Fast Transcode App with Benchmarking Features   
    Loving the colors I'm getting from ProRes Mode 2. Great work on this.
  21. Like
    jcs got a reaction from arellaTV in New Fast Transcode App with Benchmarking Features   
    I created a fast transcoding app for in-house use and thought it might be useful to others: you can get it here. I added benchmarking features so you can easily see performance differences between codecs and settings: ETA to complete, MB/s, and FPS. With example NX1 files, it can do ~46 FPS converting 4K H.265 to 1080p ProRes LT (12-core 2010 Mac Pro running Win7). Sony A7S MP4 files can be converted to ProRes LT at 192 FPS. It can rewrap A7S and FS700 files for use with Resolve 11 with audio. It also works well for high-quality GH4 4K to 1080p (Lanczos scaler). 422 10-bit H.264 (XAVC) output in both IPB & ALL-I, as well as H.265 output, is also provided for experimental use.
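    For anyone curious about the Lanczos scaler mentioned, here is the textbook Lanczos-3 kernel with a 1D resampler sketch (the app's actual filter parameters aren't published; this is just the standard formulation, applied per row and per column for a 2D scale):

```python
import math

def lanczos(x, a=3):
    """Lanczos windowed-sinc kernel with support a (a=3 is the common choice)."""
    if x == 0:
        return 1.0
    if abs(x) >= a:
        return 0.0
    px = math.pi * x
    return a * math.sin(px) * math.sin(px / a) / (px * px)

def resample_1d(samples, out_len, a=3):
    """Resample a 1D signal to out_len samples using normalized Lanczos weights."""
    n = len(samples)
    scale = n / out_len
    out = []
    for j in range(out_len):
        # Center of output sample j in input coordinates.
        center = (j + 0.5) * scale - 0.5
        lo = int(math.floor(center)) - a + 1
        acc = wsum = 0.0
        for i in range(lo, lo + 2 * a):
            w = lanczos(center - i, a)
            src = samples[min(max(i, 0), n - 1)]  # clamp at the edges
            acc += w * src
            wsum += w
        out.append(acc / wsum)  # normalize so flat signals stay flat
    return out
```

    The negative lobes of the kernel are what preserve apparent sharpness when downscaling 4K to 1080p, at the cost of slight ringing near hard edges.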
  22. Like
    jcs got a reaction from Xiong in Top Gear - Clarkson contract won't be renewed by BBC. Should there be one rule for talent, one rule for "the rest"?   
    Andrew- as a successful blog owner you have the option to help or to hurt the world with your voice. Clarkson is an alcoholic, an addict. His behavior is irrational and he can't be reasoned with while he's drunk. Even sober, addicts tend to not behave rationally. This doesn't mean they should be isolated, in fact the best way to help a person suffering from addiction is to immerse them in compassionate fellowship. Do you know any addicts? Have you seen any turn their life around, and help others to heal? If not, perhaps attend an Alcoholics Anonymous meeting (or similar) to witness addicts helping each other heal through fellowship (addicts are always addicts and never 'cured'; always mindful to avoid falling into old patterns). Since moving to LA in 2006 I was surprised how pervasive drug, alcohol, and sex addiction is in the entertainment industry. People doing drugs and drinking on set isn't healthy and is unfortunately very common here (not allowed on my sets- what one does on their own time is their own business). Instead of turning a blind eye to addicts on productions, we should provide daily reminders that there is free fellowship available to help people deal with life (such as AA). Even better, entertainment companies should provide in-house help to encourage people to live healthy, drug-free lives through fellowship. The BBC did the right thing in letting Clarkson go. Clarkson needs change in his life to give him a chance to deal with his addictions, ideally through positive connections to other people through fellowship.
    Discussing addiction on a filmmaking blog is totally appropriate. If the film industry can help heal its players, then it can help create messages and positive influence to help the millions of people suffering in our world.
  23. Like
    jcs got a reaction from SleepyWill in New Fast Transcode App with Benchmarking Features   
  24. Like
    jcs reacted to Ed_David in Red Weapon Top of Food Chain???   
    I have heard that one so many times.
    It was the Red Epic.
    Then the Red Dragon.
    Then the Blackmagic Cinema Camera.
    Once a camera is announced, it takes about 12 months for the firmware to get the camera up to speed - so it's about a 2-year process.
    So why not just own an older camera that has all its problems figured out, and then get that brand-new camera one year later, used, for cheaper, with its problems figured out - not just be a beta tester for something that isn't perfect.
     
  25. Like
    jcs got a reaction from Ed_David in Red Weapon Top of Food Chain???   
    Part of ARRI's secret sauce is their camera ergonomics and menu system: very easy to use and widely praised. Compare the Red menu system: great for technophiles, but with so much complexity it's not easy to use. I've seen very experienced Red operators hunt around for menu settings. These seconds and minutes add up during production while everyone waits; time is money, and it also gets on the cast & crew's nerves waiting for something that really shouldn't require waiting. Capturing a scene without highlight or shadow clipping, with maximum flexibility in post, is the most efficient workflow: ARRI is currently the best at this. ARRI's ProRes files easily compete with Red's raw files and are much faster & easier to work with in post. ARRI now provides 50Mbps 422 MPEG2 for broadcast work! Smaller files are faster to work with and cheaper in the long run. Sony's XAVC (H.264 422 10-bit), available from the FS7 and up, is very efficient and useful. H.264 with 444 12-bit would be another useful option. H.265 provides twice the efficiency of H.264 and can also support 444 and 10+ bits. Even without using a GPU, current H.265 decoders easily run much faster than real-time on current computers. The trend is clear: even at the high end, compressed codecs are replacing raw (which in Red's case is lightly wavelet-compressed Bayer data).
    In terms of image quality vs. resolution: for Skyfall's 4K they could have shot Red 5K and scaled down; instead they shot on Alexa and scaled up from ~3K. ARRI has years of experience with film cameras and digital film scanners: their cameras produce the most film-like images possible. That said, the F65 is less film-like and more... something else: addictive color for the eyes, perhaps different in the way Technicolor 3-strip was compared to Kodak film:

    The F65 is of course different from Technicolor 3-strip; the point is that the color rendering makes you stare at the beauty of it. ARRI, Canon, Red, Sony, Panasonic (even the NX1!) can produce this kind of color; however, Oblivion & Lucy (even After Earth's F65 shots) look different from other cameras (perhaps not better if you want a film look, but the color is amazingly addictive).