Everything posted by Axel

  1. Axel

    GH3 short film

    The story is not so interesting, but the photography and the editing are really very, very good. Thumbs up!
  2. The model for the remote follow focus for a Steadicam would be the intuitfocus: http://vimeo.com/9827192

    There have been attempts to DIY something like this. I followed this video: http://www.youtube.com/watch?v=D3-P1rKCEEg (but didn't finish it; I can only confirm the low prices for the parts).

    Parts are cheap (~20 bucks all together). I like the whole 'rig' design in this video. Perfect for an EOS with Magic Lantern.

    You must know: these cheap servo testers

    > allow a rotation angle of 90°, so you can only follow part of your lens's focus range. E.g. the Nokton 25mm rotates 270° to go from 0.17 m to ∞, but only about 60° cover 1 m to ∞.

    > translate the position of the controller knob exactly to the motor. There is no ease-in/ease-out like with the zoom lever on a modern camcorder, so slow focus transitions are not possible.

    Digitally controlled servos are available, but they are expensive. Ask at your model-making shop.
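    A cheap way around the second limitation is to drive the servo from a microcontroller and ease the motion in software. The easing math itself is trivial; here is a sketch in plain Python (the function names and the 0..1 knob-to-angle mapping are my own, not from any of the linked builds):

```python
def smoothstep(t):
    """Cubic ease-in/ease-out: runs 0 -> 1 with zero velocity at both ends."""
    t = max(0.0, min(1.0, t))
    return t * t * (3.0 - 2.0 * t)

def focus_pull(start_deg, end_deg, steps):
    """Servo angles for a slow focus transition from start_deg to end_deg."""
    return [start_deg + (end_deg - start_deg) * smoothstep(i / (steps - 1))
            for i in range(steps)]

# A 2-second pull at 50 servo updates per second would be
# focus_pull(0, 60, 100): the angle creeps at both ends instead of
# jumping with the knob, which is what the cheap testers can't do.
```

    The same few lines fit on an Arduino almost unchanged, which is why the DIY route stays so cheap.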
  3. By increasing contrast you will get a more 'punchy' image, as many tutorials put it. But if 'overcast' also means slightly hazy, foggy or misty (choose the right word, I often don't feel the subtle differences), you would take away the mood of that day with its diffused light. Also, the low saturation could be the pastel colors caused by light absorption. Too punchy an image would take away the depth (contrast and saturation fade with greater distance). A master colorist will know the right dose here.

    On the other hand, an image that's balanced for contrast will look sharper. It can be further sharpened by introducing (or enhancing) color contrast: if the lights are warm, the shadows should have the complementary color cast (3-way CC tools), usually cyan. You can only use this if there is no clearly predominant light, like candlelight at Christmas. At a sunset, the shadow color must be blue, because the sunlight is orange (low color temperature, about 2,000 K) and the sky is deep blue (over 20,000 K).

    Increasing the saturation of blue means lifting the values of the blue channel in RGB. This should be a curve, because the shadows in the RGB model carry little (between ~10 and 30 percent) or no (0 percent) saturation.

    The tools are RGB curves (experiment with them) and the 3-way CC.

    There is a temptation to do too much. Experienced colorists add subtle changes in reproducible and reversible steps. They give their eyes pauses, they work with the unerring scopes, and so forth.

    The sky probably is an evenly bright grey, a combination of hue, saturation and luminance with small tolerances that elsewhere in the image occurs only in a few simultaneously visible pixels in the waves (since the water reflects the sky). You could sample the HSL values of the waves with the eyedropper tool of the secondary color correction (I can't tell you where in Premiere, because in CS5.5 I use Color Finesse or Colorista) and exclude those very few sky pixels. You could fill the holes by feathering the matte or mask (depending on whether you want to change the selection itself, aka 'inside', or the inverted selection, aka 'outside'). Instead of 'tinting' the waves blue, you could enhance their natural saturation. In a second step, you could select the sky and very carefully color it. This will only work if you exposed correctly and the sky doesn't clip. With one RGB value cut off during recording, every change you apply in post will look weird.
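    To make the complementary-shadows idea concrete: per pixel it boils down to adding blue only where luminance is low, with a smooth falloff, which is exactly what a shadow-weighted curve does. A minimal sketch in plain Python (Rec.709 luma weights; the 30 % shadow threshold and the lift amount are my assumptions, not a colorist's recipe):

```python
def cool_shadows(r, g, b, lift=0.08, shadow_end=0.3):
    """Push shadows toward blue, leaving mids and highlights untouched.
    r, g, b are in 0..1; the boost fades linearly to zero at shadow_end."""
    luma = 0.2126 * r + 0.7152 * g + 0.0722 * b   # Rec.709 luminance
    weight = max(0.0, 1.0 - luma / shadow_end)    # 1 in black, 0 above shadows
    return r, g, min(1.0, b + lift * weight)
```

    A dark pixel like (0.05, 0.05, 0.05) gets a visible blue lift, while a bright one like (0.8, 0.8, 0.8) passes through unchanged; the warm-lights/cool-shadows grade is the same logic with the highlights pushed the other way.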
  4. I was trying to say that the depth of field needs to cover every subject along your Steadicam path, and the smaller the aperture, the deeper the field. You probably can't use your high-speed lenses wider than f/4. You should check in advance whether you are happy with that look.
  5. Of your lenses, one would only use the SLR 12mm. Because it has no AF, you need to find the aperture/focus combinations that practically make it a fixed-focus lens (you know that you can't focus during a shot?). Use this online DoF calculator. Before you buy a bigger Steadicam for the heavier lens, make a test at the calculated apertures. I didn't calculate it myself, but I'd say you need f/4 or so.

    A DoF calculator is perhaps the one 'filmmaking app' (available for iOS and Android) that you actually work with on set.
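    Those calculators all use the standard thin-lens DoF formulas, so you can also script the check yourself. A sketch in plain Python (the circle of confusion of 0.015 mm for a Micro Four Thirds sensor is my assumption; pick the value for your format):

```python
def dof_limits(f_mm, f_stop, focus_m, coc_mm=0.015):
    """Near and far limits (in metres) of acceptable sharpness.
    Standard hyperfocal-distance formulas; coc_mm is the circle of confusion."""
    s = focus_m * 1000.0                          # metres -> millimetres
    h = f_mm ** 2 / (f_stop * coc_mm) + f_mm      # hyperfocal distance in mm
    near = h * s / (h + (s - f_mm))
    far = h * s / (h - (s - f_mm)) if s < h else float('inf')
    return near / 1000.0, far / 1000.0

# A 12 mm lens at f/4 focused at 2 m covers roughly 1.1 m to 11.4 m:
near, far = dof_limits(12, 4, 2)
```

    Focused at the hyperfocal distance (here about 2.4 m), everything from half that distance to infinity is acceptably sharp, which is what makes such a wide lens a de facto fixed-focus lens on a stabilizer.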
  6.   The difference between 2.35 and 2.39 is about the width of these framing guides:
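    On a 1920-pixel-wide frame that difference really is just a few pixels; a quick check in plain Python (the 1920 width and the function name are my own example):

```python
def picture_height(frame_w, ratio):
    """Height in pixels of the active picture when masking to a wider ratio."""
    return round(frame_w / ratio)

# 1920 / 2.35 -> 817 px, 1920 / 2.39 -> 803 px:
# a 14-pixel difference in total, i.e. 7 extra pixels per letterbox bar.
```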
  7. No ambition. These three geniuses make me know my place ;-)
  8. They recorded (afaik) with a 1/72 shutter as a compromise between 1/48 and 1/96, the latter being too short to render sufficient motion blur when the film isn't shown at 48 fps in 3D. Was it BD?
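    Expressed as shutter angle (angle = 360° × fps / shutter denominator), the compromise is easy to read; a quick check in plain Python:

```python
def shutter_angle(fps, shutter_denom):
    """Equivalent shutter angle in degrees for an exposure of 1/shutter_denom s."""
    return 360.0 * fps / shutter_denom

# At 24 fps, 1/48 is the classic 180-degree shutter.
# At 48 fps, 1/96 is again 180 degrees, while the 1/72 compromise
# opens up to 240 degrees and so keeps more motion blur per frame.
```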
  9. @Mobillaire   Surely Kubrick as well as Scott would have postponed their projects and tried to come up with something better, had they only seen this ;-)
  10. For the GH3, Blanche's viewfinder solution is okay, but imho not for the GH2, because its display is not good enough to find focus. You'd just collect more useless gadgets (like your toy mattebox, which doesn't even seem to take filters). Yesterday I read a review in a German cinematographers' magazine of a bunch of the 'best' DSLR rigs (Chrosziel DV Studio, 2,200 €; Vocas DSLR pro, 3,500 €; Redrock eyespy deluxe, 1,400 €; Kinomatik Movietube, 4,000 €), and the verdict was that they all had to be rebuilt to work ergonomically with the particular DSLR and camera operator. Only the Movietube had the counterweight right. Even the pro admitted that he would rather buy parts and build his own rig according to his needs. I promised to post a DIY big comfortable eyepiece, and I will. I've been testing materials and improving my prototypes, and in a few days I will present a photo tutorial with detailed instructions.
  11. Axel

    Mobile Filmmaking

    There are no excuses. It's not the camera. They made a scope action movie with a Nokia. http://m.youtube.com/watch?v=PVISpc7EDEk
  12.   That was fun, I didn't know that. It reminds me, seemingly somewhat far-fetched, of this open letter to James Cameron. His Avatar won the VFX Oscar. The film paraphrases the injustice the 'white man' did to the Native Americans. It shows and condemns the greed and low values of our culture. And yet the very individuals who worked hard to make this film the first-rate visual experience (and box office success) it was were treated like the poor seamstresses at the beginning of the industrial revolution. And Cameron remained silent. A shame.
  13. Of course all the real Kubrick fans know this story, which was first published a year earlier in Playboy with the subtitle 'Imagine a mindfuck in four dimensions'. It is funnier than Michael Herr's short biography (also included in The Stanley Kubrick Archives, minus the chapter on Eyes Wide Shut), which also appeared in excerpts in Vanity Fair in 1999, here. Good read.
  14. What VFX slaves tell each other in a bar may be gossip. There may be films in which VFX supervisors instruct the camera operators and gaffers to finish their compositional elements to this or that specification, reducing them to uncreative technicians. As I tried to say earlier, these VFX people then follow the advice of the production designer, whose work had been approved by the director, and so, whereas they indeed did the finishing of the image as seen by the audience, their contributions were reproductive as well. We simply can't tell how much the DoP's work on Pi shaped the final look. Imho, if the real scenes blend perfectly into the CGI work, that speaks for successful teamwork and is an achievement worth acknowledging.
  15. A Thunderbolt RAID for an iMac is of course not cheap, but perhaps the most sophisticated solution. As described above, with the original media, the much bigger optimized media and a good portion of render files, you end up with a lot of redundant data. With roughly 1 TB of footage, I would figure 10 TB to be sufficient.
  16. I am very impressed. Would be perfectly happy to be able to show something as good as that one day. 
  17. Axel

    Grey imports

      'That photo of the flag looks like photoshop'. 'It is, Qasim. I have no Union Jack. But you know what, you're right. I apply one, two filters, like so ...' 'Now it looks crumpled.' 'But more real.' 'As if dragged out of a cow's arse.' 'Nobody would fake that. So it looks real.'
  18.   In 2046, if cinema is to survive, 'cinematography' will mean something completely different. The word, as the Americans use it, isn't quite correct anyway. Literally it means 'drawing or recording movement'. Yes, the French pioneers who made the first 'film' (material: paper) just advanced an older technique called chronophotography, in which the word photo, meaning light, dominates. But early on it was clear that film, as a new technical and cultural phenomenon, was successful not because it was lit in any defining way, but because it allowed one to manipulate time. Take this one from 1898: http://www.youtube.com/watch?v=8oFnOAnL8Ss

    'Cinematography' is synonymous with 'filmmaking', including the cheapest tricks and the costliest.

    In In The Mood For Love there are other major contributors to the look and feel than just Doyle: the director, the composer, the actors, and, quite humble in the background, the production designer and costume designer and editor and associate producer, all four in one person.

    http://www.youtube.com/watch?v=ypY9OaKCfRU

    Wouldn't you agree that however masterfully and tastefully Doyle lit and photographed this film (in a way the bible for the DSLR aesthetic connoisseurs), he doesn't deserve to be called its cinematographer in the original meaning of the word? He is a cinema-photographer, skilled and with a cognizant style. But that's it.

    Would William Chang, who painted the drafts, collected the props, searched the locations, defined the colors of the backgrounds and costumes (and so forth), feel cheated if anyone said it was Doyle who created the look of the film?

    No, because they worked together. It must have been a creative decision of this small team.

    If for Pi Ang Lee had thought that he only needed an experienced blue screen camera operator (in German, the DoP is simply credited as 'Kamera'), he could have hired a mere technician. But he didn't. Nor did he hire the 'artist' Doyle. He hired a talented team player who had already proven that he cared about perfect 'plates' to make the VFX artists' job easier and prevent the CGI from looking like CGI (as he says in the Arri interview).

    Who was entitled to take the award for 'cinematography' for Pi? No one? Was there no cinematography (in the modern sense of 'cinema-photography' as well as in the old sense of 'making images come alive')?

    I'm sure in 2046 we won't wonder any more.
  19. I suppose both of you edit with Premiere.

    With AVCHD (but not, e.g., with 5D clips, which are QuickTime-wrapped already) FCP X makes a copy whether you want one or not. Make sure the copies are on an independent hard drive. One hard drive can (and will) fail, but two at the same time?

    Render files (for preview purposes) are dispensable; you just need events (your footage, the original recordings) and projects (your work: hours, days, weeks). With the first living on two separate drives, you just need a backup of the second. The way to archive your project (FCP X knows no 'save as', only auto-save) is to duplicate the project with a new name. You are then asked where you want to save this duplicated project. This could be another hard drive, or two. It could be a 128 MB USB stick (or two). You could send it as an e-mail attachment to yourself. Just to see how big project files actually are, I took 10 of mine (under 10 minutes each, reason above; a project in FCP X is more or less the same as a sequence in Premiere) and hit 'info': 7.7 MB, and I use compounding a lot.
  20. Scott is not obsessed with 'redundancy'; on the very contrary, he is trying hard to reduce the hard drive space his ever-growing projects will eat. He even frequently asks if it is 'safe' to identify unwanted clips (fka 'footage') and delete them physically from the SD card and/or hard drive.

    He hasn't yet bought the idea that FCP X, unlike other NLEs, needs all original media only at the first stage, which with AVCHD and a current understanding of what a 'big' media file is (quote: Shooting it on a FS700 so the media files are quite big) doesn't present a problem. The hard drive for backing up his original SD cards (so that he can format them for further use) could be USB 2, the cheapest and slowest drives around, on which you can store the 15 hours of footage Scott plans to shoot (source: technical specs of the FS700, 28 Mbps highest bitrate for HD, spread over 960 GB, calculated with VideoSpecs Bitrate Pro; maybe Scott mixed up bits and bytes).

    The drive may be that slow because it is not what FCP X works with anyway. The smart way to import original files (directly from an SD card via card reader, from the connected camera, from a copy of the SD card on a hard drive, or from the recommended 'camera archive' of FCP X; in either case they remain AVCHD as 'x.mts') is to make a selection of what he actually intends to use. These imported clips are copied to FCP X's event folder, and they come in different flavors, to be chosen in FCP X's import preferences:

    1. 'Original media'. The term doesn't refer to the AVCHD on the card/backup volume; it refers to the codec, MPEG-4. FCP X rewraps the files during import 'in no time'* (*somewhat slower from a USB 2 connection, but that's the extent of it). After that, they appear as H.264 copies with the extension '.mov'. The copied clips have the same size as the AVCHD, but if Scott imported, as he wrote earlier, only about thirty percent of the whole footage, these clips will take 320 GB. After that, the external HD can be ejected safely and stay in the drawer for the eventuality that the drive he assigned for the event (by first clicking the volume's icon in the event browser prior to clicking 'new event', with the latter creating a 'log bin' for the clips about to be imported) goes bust. The speed of this event drive is crucial: internal SATA for Mac Pros, Thunderbolt for modern iMacs or MacBooks, at least FireWire for older Macs. What happens if the event drive actually breaks? Well, if he stored the projects (the timelines, project files with diminutive file sizes) on a different volume (they must be assigned as well), he will just have to reconnect the 'offline' files to the backup originals.

    2. 'Optimized media': the same, but intermediates with about four times the file size.

    3. 'Proxy media': only slightly smaller than AVCHD, but fastest to edit. The clips always need to be reconnected to the original media before onlining!

    EDIT: The wise way to organize media in FCP X is to think of projects as sequences of about 10 minutes duration. Each should have a corresponding event. Though one can use clips from all events, things can get confusing on bigger features. What one should worry about is the workspace's appearance, the real estate on your display's surface, not so much the storage. My two cents.

    Safety and redundancy could be guaranteed by making a Time Machine update during any coffee break (this can be a USB 2 drive as well).
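    For the bits-versus-bytes question: converting a camera bitrate into storage is simple arithmetic, and worth sanity-checking against the spec sheet. In plain Python (the function name is mine):

```python
def footage_size_gb(bitrate_mbps, hours):
    """Storage for a recording: bitrate (megabits/s) times duration,
    divided by 8 bits per byte -- the classic place to slip up."""
    bits = bitrate_mbps * 1_000_000 * hours * 3600
    return bits / 8 / 1_000_000_000   # bytes -> gigabytes

# 15 hours at the FS700's 28 Mbps HD maximum:
print(footage_size_gb(28, 15))   # 189.0 (GB)
```

    Forgetting the division by 8 is exactly how a sub-200 GB shoot balloons into a four-digit-gigabyte estimate.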
  21. Perfect.   In a couple of days I will post the *DIY* soft eyepiece for the GH2. In another thread of course. 
  22. I don't know if the one rod is a problem; you must know. I have a lot of tools and I don't throw away metal parts, so I usually have everything I need in the boxes in my cellar. If I'm missing something, I don't go to the department store but to the scrap dealer. If you are no handyman by nature, you can find parts there that already fit more or less. You just need to screw them together. Then all you need is a cordless screwdriver. I assure you: this is no high tech. The dimensions of the parts don't need to be exact, and neither does the weight of the metal rings, bars, blocks or whatever works as counterweights. Elegance costs more time and effort, and you can also buy additional rods and connectors, but you will not find a perfect rig from the industry under 1,000 bucks. I vote for DIY when it comes to the shoulder rig.
  23. It really is very easy. It takes fifteen minutes. The link you provided still has no counterweight, and I think it's too expensive for such a piece of plastic. Besides that, you won't find a shoulder pad for one rod, which is a weak point of your rig anyway. Do what you like. Did you actually watch the Zacuto film? Now remember: this is the GH3. It doesn't have the very good EVF of the GH2. The position of the left hand here would be much more natural and relaxed with the GH2. How can a follow focus help you here? Also, I doubt that your rig is built well enough to provide the said sturdiness. Much more expensive follow focuses add problems; I bet a lot of the shakes you wrote about are caused by the bad quality of the construction. Make it simple.
  24. If, for example, you have such a cheap rig, one that otherwise has some usable features, like the very important option to change the height (the viewfinder has to be ab-so-lu-tely exactly in front of your right eye), you can resell the follow focus and the useless mattebox and add these items:

    If you buy everything new, it costs you under 10 bucks, even if you want the aluminum bar in black (tip: before putting it in the vice, cover it with tape, otherwise you will get ugly scratches).
  25.   It's called 3-point stabilization, and it's valid for classic SLRs (with the grip on the side) as well as classic Super 8 cameras (with a pistol grip).

    1. The right hand takes the grip.
    2. The right eye is pressed to the eyepiece (hurts with the small and flat eyepiece of the GH2, I admit).
    3. The bottom of the camera body (in case of a pistol grip, the lens itself) rests on the left palm underneath. Left thumb and forefinger operate the focus.

    In the Star Trek series they often lament about the 'structural integrity' of their spaceship. This is the first thing your shoulder rig must have: it should be as tightly screwed together (with camera and everything) as if milled out of one solid block of metal ...

    And move your feet? Well, you can practice walking like in 'The Ministry of Silly Walks'. You can absorb quite a lot of instability by keeping your gluteus maximus under tension. No joke.

    But generally, neither a pistol grip nor a shoulder rig is meant for walking.

    Use a Glidecam or something like that for stabilized handheld shots. If they are to look like dolly shots, you have to practice A LOT.