Leaderboard

Popular Content

Showing content with the highest reputation on 07/01/2016 in all areas

  1. Apologies in advance if this is widely known. Personally, I've never found a really good explanation of why front anamorphs produce oval bokeh and rear anamorphs don't, despite reading my fair share of patents, technical papers, internet gossip and the like. Feeling that my own understanding needed some firming up, I finally set up some paraxial models and went through the math in gory detail. It all boils down to how front and rear converters alter (or don't alter) the f/#, combined with basic DOF-style circle of confusion calculations. It has nothing to do with higher-order aberrations, the shape of the front lens, or various mechanical aspects of the lens. Briefly:

  1) A front anamorph is just a special case of a front afocal attachment, and as a result it preserves the f/# of the lens it's attached to. With an anamorphic front lens the focal length is shorter in the powered axis than in the non-powered axis. For example, consider a 2:1 anamorph attached to a 100mm f/2 spherical lens. In this case the net focal length is 50mm in the powered axis and 100mm in the non-powered axis, but in both cases the aperture remains f/2. If you venture into the weeds and do circle of confusion calculations for a given object-space defocus, you discover that a defocused point source evaluated at the image plane is an ellipse with an aspect ratio of 4:1. However, you only need to de-squeeze the image by 2x to correct the in-focus geometry, so you are left with defocused ellipses with an aspect ratio of 2:1.

  2) A rear anamorph is just a special case of a rear-mounted teleconverter, and as a result it *does not* preserve the f/# of the lens it's attached to. In particular, the aperture becomes slower in the powered axis. For example, consider a 50mm f/2 spherical lens with a 2x rear anamorph. Here the net focal length is 100mm in the powered axis, but the aperture has dropped to f/4, while the lens is still 50mm f/2 in the non-powered axis. When you do the circle of confusion calculations with object-space defocus, you find the on-sensor defocused image is an ellipse with a 2:1 aspect ratio. When you desqueeze by 2x, this defocus ellipse becomes a perfect circle.

  Bottom line: rear anamorphs have circular bokeh because they *don't* preserve the f/# of the spherical lens in both axes, while front anamorphs have elliptical bokeh because they *do* preserve the f/# of the spherical lens in both axes.
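  A minimal paraxial sketch (in Python) of the calculation described above. The focus and background distances are made-up numbers, and the standard thin-lens blur diameter c = (f/N) * f/(s_focus - f) * |s_obj - s_focus| / s_obj is assumed and applied independently to each axis:

def blur(f_mm, n, s_focus_mm, s_obj_mm):
    # Paraxial defocus blur diameter on the sensor for a single axis.
    aperture = f_mm / n
    return aperture * (f_mm / (s_focus_mm - f_mm)) * abs(s_obj_mm - s_focus_mm) / s_obj_mm

s_focus, s_obj = 3000.0, 10000.0  # assumed: lens focused at 3 m, background point at 10 m

# Front anamorph on a 100mm f/2: the powered axis acts like 50mm, f/2 preserved.
# The desqueeze stretches the squeezed (powered) axis by 2x, doubling its blur width.
fp, fn = blur(50.0, 2.0, s_focus, s_obj), blur(100.0, 2.0, s_focus, s_obj)
print("front, on sensor:  %.2f:1" % (fn / fp))        # ~4:1
print("front, desqueezed: %.2f:1" % (fn / (2 * fp)))  # ~2:1 -> oval bokeh

# Rear anamorph on a 50mm f/2: the powered axis becomes 100mm but drops to f/4.
# Here the powered axis was stretched optically, so the geometric correction
# shrinks that axis by 2x, halving its blur width.
rp, rn = blur(100.0, 4.0, s_focus, s_obj), blur(50.0, 2.0, s_focus, s_obj)
print("rear, on sensor:   %.2f:1" % (rp / rn))        # ~2:1
print("rear, desqueezed:  %.2f:1" % (rp / (2 * rn)))  # ~1:1 -> round bokeh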
    6 points
  2. Ok, a review unit dropped in my mail today. But first let me say this: I know diddly squat about gimbals. I really don't pay any attention to them. In my work world a gimbal shot is like a drone shot or an underwater shot, in other words something special that you use for a purpose, and then you rent or borrow the gear plus a crew that is good at it. In my hobby and YouTube world I'm far too in love with proper camera shake to want to get rid of it. Of my 200+ YouTube clips, zero are done with a gimbal and maybe 2 clips in total have stabilization in post. Moving on.

  With that little disclaimer out of the way, I will now share my experience with this thing and be as honest as I can, and do the best I can to give it an honest chance and learn how to use it, even though at this time I don't feel any need to have one of these. It would be even more fun, then, if I turn out to really like using it. And YOU can contribute with questions that you want me to find out about and that can be used in a review, since I don't really know what people want to know.

  First impression: It comes in a normal box, and all the specs and such are written in English. So is the manual. Rather surprising is the nice and sturdy "Peli-case" that all the parts are well secured and foamed in. I have a roll of T-Max in a stand develop, so I might not be able to do more than look at it today. Also my BMPCC isn't here today and I think my GM1 is too light. The general feel is that it's well made. It feels sturdy, all the joints are smooth, it's mostly metal, and the plastic also feels nice.

  Here is some info:
  - Weight without battery: 950g
  - Payload: minimum 350g, maximum 1200g
  - Battery run-time: 6-12h
  - 360 degrees on tilt, roll and pan
  - It can apparently sense every 0.02 degree change interval (whatever that is)
  - Claims to be the first handheld gimbal with CCI (camera control interface) to control focus and shutter
  - Tool-less
  - Can be controlled via smartphone
  - First with three 32-bit MCUs running in parallel at 4kHz, "The number is far beyond reach of any other gimbal" (sounds pretty cool I guess, a bit nerdy but cool)
  - Lastly some other long paragraph about degrees and milliseconds... sounds swell.

  That's it, need to rinse my film. Will update as soon as I have it running, and maybe add a test shot.
    2 points
  3. I am quite curious about this one; the specs seem great for a hybrid. Who knows, maybe it turns out great for low-budget video. I just hope that it isn't cropped in 4K mode and has decent DR. Unfortunately it seems that the grip is required to shoot 30 min of 4K. http://www.fujirumors.com/fujifilm-x-t2-full-specs-images-leaked/
  – 24.3MP CMOS sensor
  – X Processor Pro image processing engine
  – Video: 4K 30fps, Full HD 60fps
  – Video bit rate of 100Mbps
  – Maximum shutter speed 1/8000 sec; 1/32000 sec electronic shutter
  – Flash sync speed 1/250 sec
  – Intelligent hybrid phase-detection AF with 325 selectable points
  – Magnesium alloy body, improved dial durability, dust and water resistant
  – 2.36 million dot EVF, 100fps refresh rate
  – 3-way tilting 3-inch 1.62 million dot LCD monitor
  – Dual SD card slots, UHS-II support
  – AF-C custom settings
  – ISO range ISO 200-12800 (RAW)
  – Built-in Wi-Fi, remote shooting support, Instax Share printer support
  – 16 film simulation modes, including ACROS
  – Customizable "My Menu" settings screen
  – Normal / boost mode switching
  – 14-bit lossless compressed RAW, in-camera RAW development
  – Exposure compensation +/- 5EV
  – 256-segment metering (multi, spot, average, center-weighted)
  – Interval shooting
  – 13 creative filters
  – NP-W126S battery, rated for approx. 350 shots
  – 3.5mm microphone jack, 2.5mm remote terminal, USB 3.0, Micro HDMI
    2 points
  4. I'm always trying to get the Super 8 look, and this thread motivated me to try again. This is my BMPCC treated to look like Super 8. I want to post footage, but for some reason a lot of my Gorilla Grain files cause red flash frames when composited over the footage. Fine grain works, but anything heavier has this problem. I'm pretty satisfied with this look; it's the closest I've gotten, I think.
    2 points
  5. I bought an old Panasonic SDX900, the DVX100's big brother. I thought maybe SD had something interesting about it - the lack of resolution and dynamic range. I tested it quickly against my Sony F65 pushed toward an 8mm or 16mm look, and it seems to have its own weird, unique feel that's kind of interesting. Anyway, let me know what you think - I can also post the post workflow and how I kind of "destroyed" the image.
    1 point
  6. Finally! It was long and painful, but I made it after 2 years of work. Here is my Miami video. Enjoy.

  Unlike cities such as New York, Paris or Dubai, no one had ever done a real hyperlapse video of Miami, so I had to fix this. I started this project 2 years ago with hyperlapse experimentation. My first attempts were very bad and most sequences went to the trash bin. It took me a while to capture the hyperlapse sequences correctly (you must be very, very accurate) and then do the post-stabilization frame by frame. Hyperlapse is very time consuming. On average, one second of video takes 1 or 2 hours of work, and I don't even count the failed attempts (either I fucked up, the light was wrong, or something/someone messed with my sequence on site).

  Techniques: The video is 75% hyperlapse, 5% timelapse and 20% drone and aerial (rented plane and helicopter). I had a massive volume of video and photo (RAW of course) and I only used the tip of the iceberg. I failed many times during my learning process, but I now master the most complex types of sequence, such as HDR hyperlapse or Holy Grail hyperlapse.

  Gear and software: Here is what I used:
  ► Hardware:
  - DJI Phantom 3 Pro with Mars Lite & Mayday board
  - TBS Discovery Pro with custom GoPro 4 lens
  - Polar Pro filters
  - Dynamic Perception Stage One & Stage R with NMX controller
  - Panasonic GH4, Lumix 12-35 f/2.8, Canon FD 50mm f/1.4 and custom-made gimbal (Alexmos)
  - Canon 6D, Samyang 14mm f/2.8, Sigma 24 & 35mm f/1.4 Art, Canon EF 50 f/1.4, Canon EF 24-105mm f/4L
  - Cessna 172 & Bell 206
  ► Software:
  - Adobe Premiere & After Effects
  - Adobe Lightroom
  - LRTimelapse 4
  - SNS-HDR Pro

  How many pictures? Honestly I have no idea of the number of pictures taken, and I find this point completely irrelevant. The number of shots and the TB of hard drive space are not good metrics and give no indication of the quality of the final video. What I can tell you is that I finally achieved a good ratio of shooting attempts to keeper sequences. When I started hyperlapse I had to discard 80% of my sequences; now I can keep 70-80% of my clips. I also have a lot of unused sequences that I shot but that didn't make it into the final video. I prefer to squeeze the best out of my project and not fall into the trap of "clip stacking" for the sake of it. This video is already long at 4 minutes.

  Aerial Shots & Safety: I used a DJI Phantom 3 Pro for the aerial shots, along with a TBS Discovery Pro fitted with a GoPro 4 Black. I modded the lens in order to get a longer focal length. Some shots were taken from an airplane (C-172) and I also rented a helicopter (Bell 206). In terms of safety, the drone shots were line-of-sight only and below 400 feet, following the AMA & FAA guidelines. 90% of the drone flying occurred over water, even if that's not visible because of the framing. For the remaining 10%, I flew over parks and empty areas with the Phantom. I do a precise scouting of the place before each flight and run through a thorough checklist. And because shit happens, I also installed a Mars Lite parachute with a North UAV Mayday board (special thanks to Kyle) on the Phantom in order to prevent any damage/injury if something goes wrong. Last, I had to notify the airport manager in some places before my flights, and I stayed away from Miami's Class B airspace (I couldn't get clearance despite my request).

  Special Thanks and Credits:
  ► Intro & opening title by michaelcparadise.com/
  ► Music: Daft Punk - Derezzed (The Glitch Mob Remix) theglitchmob.com/ facebook.com/theglitchmobmusic instagram.com/theglitchmob/ twitter.com/theglitchmob soundcloud.com/theglitchmob
  ► Special thanks: It would be hard to list all the people involved in this video, but here are the main ones who inspired or helped me to make this project: Artem Pryadko / Zweizwei, Aaron Priest, Dimid, Gunther (creator of LRTimelapse), Jeff Colhoun, Jay Burlage & Dynamic Perception, Marco from Timelapse Network, Team BlackSheep, Aufmschlau, b-zOOmi, Keith Loutit, Dustin Farrell, Rob Whitworth, Dominic Boudreault, Michelle, Guille, Mariana, etc.

  You can also follow me at: facebook.com/Oliver-KMIA-1622032868057530 instagram.com/oliverkmia/ twitter.com/OliverKMIA

  Some behind-the-scenes photos
    1 point
  7. http://www.dxomark.com/Cameras/Canon/EOS-1D-X-Mark-II That's a pretty good sensor improvement... I know some Sony/Nikon guys will have a little chuckle, but regardless, 13.5 stops is a beautiful range to work with and Canon tend to have a nice DR range across the ISOs. Let's hope Canon release a 1DC ii that can record the full tonal range, or that the 5D iv has a similar sensor and gets a little ML lovin! Imagine that sensor with raw video.
    1 point
  8. Given the a6300's rolling shutter, it'll be interesting to see what Fuji can do. I love Fuji's lens lineup; the X-T2 may move me away from Sony.
    1 point
  9. Sounds interesting. Personally I couldn't care less about AF. So the question for me is all about the DR at this point.
    1 point
  10. Yeah man, I was really loving the dailies that I was seeing. Can't wait to get into grading them. I wish the Mark II was cheaper, but having an NX1 for slow-mo, I think I could be fine with a C100.
    1 point
  11. You really need to stabilize this camera. Tightly control its movement, or limit it to slow, deliberate moves. Perhaps ironically, I think it's a terrible run and gun camera, despite its small size, for those reasons. I'd put it on sticks, even. I'd focus on framing and composition to help tell the story, like "Ida."
    1 point
  12. Generally, a zoom lens is at a disadvantage, IMO. I used a 70-210 with an Elmoscope II on a full-frame camera and needed at least 100mm to get rid of vignetting. And at the 210mm end, the IQ is poorer too. Usually, a fixed focal length prime has a smaller filter thread than a zoom lens with the same aperture. In other words, for the same filter thread size, a prime lens gives you a larger aperture to work with.
    1 point
  13. http://www.fcp.co/final-cut-pro/articles/1830-hollywood-veteran-lance-bachelder-explains-why-he-has-chosen-to-use-final-cut-pro-x-on-his-latest-feature-film-saved-by-grace Read under the images of the Inspire 1... it seems that he used an NX1 for b-roll... Red Dragon, Inspire 1 and... NX1. :)
    1 point
  14. It gets interesting at the 1:22 mark. They show people playing with zoom rings multiple times in the video while using the gimbal. I hope that's not by accident; that is impossible to do on my Pilotfly H1+. I hope someone gets to review it, because the stabilization market already seems pretty well established. Below $600 on eBay is not a bad price, but it's not going to be a game changer either. It takes two 18650 Li-ion batteries, like the Came-TV Single. That handle looks obnoxiously long, but then again, maybe that's not a bad thing; it's not like I carry the Pilotfly in my pocket. I own a Zhiyun Z1 Smooth-C, a sub-$200 gimbal that works well for my Galaxy Note 4/5 and stabilizes smartphones just as well as my Pilotfly H1+ stabilizes my cameras. Edit: Oh, just a warning before you buy: I'm guessing it's not compatible with SimpleBGC, which is powerful for fine tuning. The Zhiyun smartphone software is very, very basic; it just does firmware upgrades and calibration.
    1 point
  15. color is certainly great on Fujis
    1 point
  16. It's a business, they all do this... Apple, Ford... everyone. Bodies are not investments in hardware, they are investments in your talent. If you want a healthy return on that investment you need to produce paying work.
    1 point
  17. 1 point
  18. I'd say a used C300, or a C100 Mk II with an external recorder. A C300 with the autofocus upgrade would be best, but I think either one would be fine. They are both workhorse cameras that will never let you down. Plus you can't beat that Canon look.
    1 point
  19. Just posted this on another thread: Just spotted a deal for my US chums..... C300 Mk II is now only $10,640, cheaper even than B&H ($11,999) http://cvp.com/index.php?t=product/canon_c300_mark_ii_4k_video_camera
    1 point
  20. For anyone looking to buy a softbox for the LS1, this one is amazing. It collapses into an 8" circle (similar to the big round reflectors). Takes about 30 seconds to unzip, open, snap the magnetic frame, and velcro to the light. http://www.bhphotovideo.com/bnh/controller/home?O=&sku=1258825&gclid=CNnOwb_90c0CFQ-raQodPRgLwA&is=REG&ap=y&c3api=1876%2C52934714882%2C&A=details&Q=
    1 point
  21. Erik Naso found that when he switched to faster memory cards, many of the heating issues were improved.
    1 point
  22. Really, I couldn't care less about the numbers. My 1DX Mk II blows my Sony A7R II out of the water... period. In every respect it's in another league. Anyone who doubts this just needs to do a back-to-back test of both cameras. And if you haven't done a side-by-side comparison, don't bother chiming in; you simply cannot appreciate the difference until you see them both head to head.
    1 point
  23. I've done something similar. I had an old Sony Handycam Hi8 camera from my childhood lying around at the back of my shelf, and one day decided to see if it still worked. Much to my surprise, it did, and even the batteries were functional. Those things were built like tanks; I abused the hell out of it in my teen years shooting backyard movies, and over a decade later it works like a charm. So I fitted it with a cheap 35mm adapter, ran a few tests and then went out to shoot a small art short. Since the 35mm adapter was mirrorless, I was practically shooting blind - the viewfinder displayed a mirrored, upside-down image, which really messes with your brain when you're trying to operate. Nevertheless, I'm happy with what we got. We dubbed it our punk movie, because it has that look of cheap, grainy 16mm film from old underground stuff.
    1 point
  24. Other manufacturers have had higher DR than Canon for many years already, but it seems like Canon is catching up. This seems like good news for anyone using Canon cameras - as soon as they add this to other cameras down the line. The 1DX II sensor has around 2/3 to 1 EV better DR at each ISO compared to the 5D Mark III. Click Measurements and then Dynamic Range: http://www.dxomark.com/Cameras/Compare/Side-by-side/Canon--EOS-1D-X-Mark-II-versus-Canon-EOS-5D-Mark-III___1071_795
    1 point
  25. 1 point
  26. Ricardo is always posting such gorgeous framegrabs in this thread, so I wanted to give his settings a try for myself on my latest project... I'm really liking the way these are looking. (Graded with DeLuts and filmconvert)
    1 point
  27. Fuji would be the last company to buy Digital Bolex, because the sensor inside is a Kodak CCD - likewise with the Ikonoskop. It dates from back when Kodak were frantically selling off patents and IP to keep themselves afloat, before they mismanaged themselves into oblivion. When I worked for them, I actually had a Kodak manager tell me that digital would never replace consumer and professional film cameras... this was while the demolition crew were surveying the factory floor of the film processing facility.
    1 point
  28. jcs

    Resolve 12.5 Is Out

    @cantsin as a real-time game developer and video/image developer: if I can decode HD video, perform complex GPU effects and multi-layer compositing along with multiple DSP audio effects, and save the result to HD H.264, all in real time, then any desktop app can easily do all those things and more, especially with a GTX 980 Ti with 6GB of RAM. How do I know? Because in the above example I can do all of those things on an iPhone! Pretty much everything can be done on the GPU now - 10 or more layers of 4K are easily possible. Note that as nodes are added in Resolve, it doesn't slow down much, since node effects run on GPUs. The bottleneck appears to be video caching and IO, along with audio sync and timing issues (why does everyone have trouble with this? FCPX does pretty well here). In summary, there's no technical limitation on performance for any of these desktop apps. The limitations come from antiquated software design which cannot fully utilize the amazing CPU and GPU power available today. These companies should hire game developers to rewrite their graphics engines. Pretty surprising that FCPX beats PP CC and Resolve, which both use GPUs, by a long way for basic real-time editing.
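    A minimal sketch of the kind of GPU-side multi-layer compositing described above, using CuPy as a stand-in for a real engine's GPU pipeline. The layer count, resolution, random layer data and simple premultiplied "over" blend are illustrative assumptions, not anything taken from Resolve, Premiere or FCPX:

import cupy as cp  # GPU array library; assumes an NVIDIA GPU with CUDA available

# Hypothetical setup: ten premultiplied-alpha RGBA layers at 4K UHD, filled
# with random data just so the sketch is self-contained.
H, W, LAYERS = 2160, 3840, 10
layers = [cp.random.random((H, W, 4)).astype(cp.float32) for _ in range(LAYERS)]

def comp_over(base, layer):
    # Standard premultiplied "over" operator, evaluated entirely on the GPU.
    alpha = layer[..., 3:4]
    return layer + base * (1.0 - alpha)

out = layers[0]
for layer in layers[1:]:
    out = comp_over(out, layer)

cp.cuda.Stream.null.synchronize()  # wait for GPU work to finish before timing or readback
print(out.shape, out.dtype)        # (2160, 3840, 4) float32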
    1 point