Dave Maze Posted September 24, 2017
With iOS 11, Apple opened up the dual-camera depth data API to developers. I've been using the built-in portrait mode on my iPhone 7 Plus for a while now and love it. However, today I downloaded a new app that takes full advantage of the depth-sensing technology, and it's blowing my mind!! The app is called Anamorphic; it cost $2.99, and it may be the best app I've ever purchased. It lets you take an image and then adjust where you want the blur to fall in 3D space. You can then feather the blur the same way you would feather a luma key, except it feathers in Z space. It then applies the most realistic, gorgeous anamorphic bokeh and aberration. Export it, put VSCO Cam on it, and it looks like you took the image with a film camera wearing a vintage T1.4 Cooke anamorphic lens!!! Just testing this app today has made me giddy; it really feels like magic when you use it. I'm so excited to experiment more with this, and even more excited about getting an iPhone X with its faster telephoto lens and OIS! This excites me about the future of cinematography: it's going to be computational! Imagine shooting with a profiled $100,000 cinema lens that you can simulate in post. Lytro is working on this at the high end, but the fact that my tiny, cheap iPhone can do it blows my mind. Below you'll see the normal image that was taken and then the "Anamorphic" one with VSCO applied. Download the app here: https://itunes.apple.com/us/app/anamorphic/id1247287369?mt=8 I would love for this thread to become a place where we can all share shots taken with computational photography methods, if the admins allow it. If you have an iPhone 7 Plus or 8, get the Anamorphic app and start shooting!!
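[For anyone who wants to poke at the same depth data apps like Anamorphic use, here's a minimal Swift sketch of reading the disparity map a dual-camera iPhone embeds in its portrait photos. The function name and the stripped-down error handling are assumptions of this sketch, not anything from the app; the ImageIO/AVFoundation calls are the iOS 11 API the post mentions.]

```swift
import AVFoundation
import ImageIO

// Minimal sketch: pull the disparity map that portrait mode embeds
// as auxiliary data in a HEIC/JPEG (iOS 11+).
func loadDisparityMap(from url: URL) -> CVPixelBuffer? {
    guard let source = CGImageSourceCreateWithURL(url as CFURL, nil),
          // The depth map travels alongside the main image as auxiliary data
          let info = CGImageSourceCopyAuxiliaryDataInfoAtIndex(
              source, 0, kCGImageAuxiliaryDataTypeDisparity) as? [AnyHashable: Any],
          let depthData = try? AVDepthData(fromDictionaryRepresentation: info)
    else { return nil }

    // Normalize to 32-bit float disparity so every photo reads the same way
    let converted = depthData.converting(toDepthDataType: kCVPixelFormatType_DisparityFloat32)
    return converted.depthDataMap  // one disparity value per (downsampled) pixel
}
```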
AaronChicago Posted September 24, 2017
That's pretty cool. I didn't realize third parties were utilizing the depth map.
tellure Posted September 24, 2017
Pretty impressive. The edges still look a bit weird compared to real DOF (e.g. the slight halo around that lady's forehead), but it's impressive how far this tech has come in such a short time. It's also amazing to think about adding DOF as an aesthetic choice in post. If they can do it to stills now, then doing it to video isn't that far off.
Andrew Reid (Administrator) Posted September 24, 2017
A glimpse into the future. This will really influence the look of lenses over the next 10 years. Smartphones got there first! The adjusted stills in your examples are a big leap forward.
jcs Posted September 25, 2017
Computational cameras come up every now and then. We can compute depth data from multiple cameras, from depth sensors (iPhone X), or both. Computing depth from multiple cameras is computationally expensive (though probably not a big deal for modern phone GPUs with just two cameras), and the iPhone X (IR hardware depth sensor, from the same company that built the Kinect for Xbox) doing real-time background subtraction / 'segmentation' / background replacement without a green screen is pretty cool. IR depth sensors have had trouble in sunlight in the past; I'm curious to see how much they've improved with the iPhone X's sensor. Once you have clean, high-quality depth data, a small-sensor camera can be used to simulate pretty much whatever you want in software, and with modern GPUs many effects will be possible in real time, including with video! When the depth data is made available to NLEs (someday in the future for sure), we'll be able to set focus and lens simulations in post.
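[To make the "set focus in post" point concrete, here's a hedged Core Image sketch: blur radius grows with distance (in disparity) from a chosen focus plane, which is exactly the feather-in-Z idea from the first post. simulateFocus, focusDisparity, and feather are invented names, and CIColorAbsoluteDifference shipped in a later OS than iOS 11, so treat this as the idea rather than a period-accurate implementation.]

```swift
import CoreImage

// Sketch of depth-driven focus: pixels blur in proportion to how far
// their disparity sits from the focus plane, softened by a feather width.
func simulateFocus(image: CIImage, disparity: CIImage,
                   focusDisparity: CGFloat, feather: CGFloat,
                   maxBlurRadius: CGFloat) -> CIImage {
    // Constant image holding the disparity of the plane we want sharp
    let plane = CIImage(color: CIColor(red: focusDisparity,
                                       green: focusDisparity,
                                       blue: focusDisparity))
        .cropped(to: disparity.extent)

    // |disparity - focus|: grows as pixels leave the focal plane
    let distance = disparity.applyingFilter("CIColorAbsoluteDifference",
                                            parameters: ["inputImage2": plane])

    // Divide by the feather width and clamp to [0, 1]: 0 = sharp, 1 = fully blurred
    let mask = distance
        .applyingFilter("CIColorMatrix", parameters: [
            "inputRVector": CIVector(x: 1 / feather, y: 0, z: 0, w: 0),
            "inputGVector": CIVector(x: 0, y: 1 / feather, z: 0, w: 0),
            "inputBVector": CIVector(x: 0, y: 0, z: 1 / feather, w: 0)])
        .applyingFilter("CIColorClamp", parameters: [
            "inputMinComponents": CIVector(x: 0, y: 0, z: 0, w: 0),
            "inputMaxComponents": CIVector(x: 1, y: 1, z: 1, w: 1)])
        // Depth maps are lower-res than the photo; stretch the mask to match
        .transformed(by: CGAffineTransform(
            scaleX: image.extent.width / disparity.extent.width,
            y: image.extent.height / disparity.extent.height))

    // Variable blur driven by the mask: white areas get the full radius
    return image.applyingFilter("CIMaskedVariableBlur", parameters: [
        "inputMask": mask,
        kCIInputRadiusKey: maxBlurRadius])
}
```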
tellure Posted September 25, 2017
It's cool to think about all the possibilities this post-process DOF/focus future offers for video. Like doing a rack focus shot entirely in post. Or simulating DOF that goes beyond what physical lenses offer, like a 16mm f/1.0 on full-frame.
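[Building on the sketch in the previous reply, a rack focus in post is just the focus plane animated over time. Everything here is hypothetical: the helper name, the smoothstep easing, and the constants are placeholders, and nearDisparity/farDisparity stand in for values you'd sample by tapping the two subjects.]

```swift
import CoreImage

// Sketch of a post-production rack focus, reusing simulateFocus() from above.
// t is the 0...1 progress of the focus pull across the shot.
func rackFocusFrame(image: CIImage, disparity: CIImage,
                    nearDisparity: CGFloat, farDisparity: CGFloat,
                    t: CGFloat) -> CIImage {
    // Smoothstep easing so the pull accelerates and settles like a focus puller's hand
    let eased = t * t * (3 - 2 * t)
    let focus = nearDisparity + (farDisparity - nearDisparity) * eased
    return simulateFocus(image: image, disparity: disparity,
                         focusDisparity: focus, feather: 0.08, maxBlurRadius: 24)
}
```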
JurijTurnsek Posted September 25, 2017
So, ideally we could have a small sensor with huge DOF and use depth perception to make it variable, but you'd still have the issues of low light and different focal lengths. Some phone manufacturer should pair two periscope zoom camera modules with depth perception so that a smooth zoom would be possible. I wonder why the Light L16 didn't just cram four of these into its design.
ntblowz Posted September 26, 2017
Just bought the app, it is really cool!
Dave Maze (Author) Posted October 10, 2017
Just did an extensive review/tutorial using the Anamorphic app! Had fun making this.