
Leaderboard

Popular Content

Showing content with the highest reputation on 09/25/2013 in all areas

  1. I was going to ask about a wider and a longer addition to the FF lineup. It's very cool that these are in the works. These seem like the perfect answer for folks trying to do anamorphic with full-frame sensors and beating their heads against a wall.
    2 points
  2. I'd love to hear some feedback on the newest development from DSO: a streak filter for TRUMP, designed to work in partnership with oval apertures to get as close as possible to a convincing anamorphic look without the complicated cylindrical elements. http://vimeo.com/75175593 Hans_Punk and I have been beavering away at developing this, and we know there are a few weak points, but our intention is to create something usable while maintaining sharpness and ease of use by keeping things spherical and resorting to the dreaded crop. :) The priority is to nail the TRUMP module, but ultimately this technique will see integration into the FF58 and the upcoming FF35mm and FF90mm. Feel free to be nasty!
    1 point
  3. You are almost certainly panning too fast. It's really surprising just how slowly you need to do it. If you can't pan slowly enough, it's better to do a whip pan, as the brain accepts that better than the jitter of a fast pan.
    1 point
  4. Make sure that the image stabilization is turned OFF on your lens. But I suspect it's more a side effect of shooting at 24 frames per second: if an object takes less than about seven seconds to pass from one side of the screen to the other, you are likely to experience motion judder when viewing on a 60 Hz monitor. It is one of the limitations of that frame rate. If you really need smooth motion, you may find that 30 or 60 frames per second (especially 60) are better for your purposes.
    1 point
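    One reason 24 fps pans stutter on a 60 Hz monitor is 3:2 pulldown: each source frame is held for alternately three and two display refreshes, so on-screen hold times are uneven. A minimal sketch of that cadence (my own illustration, not from the post):

    ```python
    # Sketch: 3:2 pulldown cadence when showing 24 fps material on a
    # 60 Hz display. Each source frame is held for 3 or 2 refreshes,
    # so hold times alternate ~50 ms / ~33 ms instead of a steady 41.7 ms.
    REFRESH_MS = 1000 / 60                 # one 60 Hz refresh ~= 16.7 ms

    pattern = [3, 2, 3, 2]                 # refreshes per source frame
    hold_ms = [n * REFRESH_MS for n in pattern]
    print([round(t, 1) for t in hold_ms])  # [50.0, 33.3, 50.0, 33.3]

    # Sanity check: four source frames still take 4/24 s in total.
    assert abs(sum(hold_ms) - 4 * 1000 / 24) < 1e-9
    ```

    The uneven hold times are what the eye reads as judder on a steadily moving object; at 30 or 60 fps every frame is held for the same number of refreshes, so the cadence is even.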
  5. What's being projected is (mostly) CG; it's just not done through post-production techniques. Because the system knows the position and orientation of the robotic arm at all times relative to the camera POV (much like how the realtime line-of-scrimmage and yards-to-go lines are created for live sports broadcasts), the orientation of the projection surfaces is easier to derive than it would be by tracking them in post. You could additionally sync a second motion-controlled camera to photograph a completely different live scene, instead of using pre-rendered or on-the-fly CGI, and project that onto the surfaces.

    After 5th Element we built a system within Houdini to interface with the Digital Domain motion-control stage. It allowed animators to design physically accurate moves for motion-control miniatures, rather than using the screen-based methodology that was standard in the industry, which often introduced "skating" and other motion or performance artifacts that have plagued flying miniatures for decades. (Miniatures were phased out shortly afterwards, so it was a neat proof of concept that never really got used.) This adds a projection component to that sort of realtime telemetry and view-dependent spatial awareness.

    I didn't see specific info on the website or Vimeo, but I'm wondering whether the projector is on-axis with the composite camera via a beam splitter, or whether they're pre-distorting the projection so that the projector can be fixed up out of frame and not attached to the composite camera. It's neat looking, however it's done.
    1 point
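    Pre-distorting the projection for a fixed, off-axis projector (the second option wondered about above) is essentially keystone correction: warp the image by the planar homography that maps projector pixels onto the tracked surface. A hedged sketch with NumPy, using my own function names and made-up corner coordinates (none of this is from the system described):

    ```python
    import numpy as np

    def homography(src, dst):
        """Direct Linear Transform: find H such that dst ~ H @ src
        for four (or more) point correspondences."""
        rows = []
        for (x, y), (u, v) in zip(src, dst):
            rows.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
            rows.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
        # H is the null vector of the system: the last right-singular vector.
        _, _, vt = np.linalg.svd(np.array(rows, dtype=float))
        H = vt[-1].reshape(3, 3)
        return H / H[2, 2]

    def warp_point(H, pt):
        """Apply H to a 2-D point in homogeneous coordinates."""
        x, y, w = H @ np.array([pt[0], pt[1], 1.0])
        return x / w, y / w

    # Projector frame corners and where they should land on the surface
    # (hypothetical numbers, for illustration only).
    src = [(0, 0), (1, 0), (1, 1), (0, 1)]
    dst = [(0.0, 0.0), (1.0, 0.1), (0.9, 1.0), (0.1, 0.9)]
    H = homography(src, dst)
    print(warp_point(H, (0.5, 0.5)))  # frame center maps inside the quad
    ```

    With the arm's telemetry giving the surface corners each frame, recomputing this small warp per frame is cheap, which is presumably why a fixed projector is feasible at all.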
  6.   Yes, please!  Where can I sign up for the FF35mm and FF90mm?
    1 point
  7. HurtinMinorKey

    Camera/Equipment Bag?

    I have a LowPro Flipside 500, which is huge, and I love it, but I'm not sure it's big enough for all of your gear. It fits a BMCC with its Switronix battery pack, a 5D3, and 6 lenses. Sounds like a lot of gear; you might consider...
    1 point
  8. Probably indeed a problem with the hack. It looks as if the camera was switched off after a longer clip at a high data rate, and the index was never finished. After a long recording, make a very short recording (just 1 or 2 seconds). QuickTime probably will not open the file. I once had such a clip (with FCP 7 at the time). Premiere also didn't recognize the clip, but Toast did: open Toast's media conversion window (or similar; I have a German version), drag and drop your clip into it, and export as QuickTime ProRes (the gear icon lets you set export codecs). My problem was that I could only export the audio as a separate clip (make it AIFF) and had to synchronize both in the NLE. If this doesn't work, either upgrade to 10.8 (AVCHD is played back by QuickTime without conversion, and the upgrade is safe) - but I bet that won't work either, because OS X still doesn't like .mts files with missing or wrong metadata - or take your card to a Windows computer, where you can export both clips with the highest-quality H.264 settings.
    1 point
  9. I am restarting my anamorphic project. If anyone with a Blackmagic camera based in London wants to meet up and give me some feedback, that would be great. It would be handy if you had played around with a scope lens before. Please ping me a message if interested. Cheers.
    1 point
  10. https://vimeo.com/71812380 We are very thankful for John Brawley's first ProRes files from the Blackmagic Pocket Camera. We decided to use Final Cut Pro X for grading because we wanted to see how to get some useful pictures in a short time without using color grading software like DaVinci. Lenses used: SLR Magic 35mm F1.4, Olympus 14-35mm F2.0, Olympus 7-14mm. More info: mindcutfilms.com/blog
    1 point