dhessel · Members · 393 posts
Everything posted by dhessel

  1. I have done it a lot with a dual-focus Kowa, so it is possible. It's not something you want to jump into, though. I would suggest starting with one lens and the adapter. Practice a lot, get comfortable with the process, learn how the two lenses' focus scales relate to each other, and get good at estimating focus distance. Eventually you get to the point where you can quickly estimate your focus distance and know exactly where to set focus on your taking lens and adapter (sometimes this is not obvious), then make some quick micro-adjustments to the focus on the taking lens. You will also have to modify your shooting style: maintain the same distance from the subject at all times while shooting, then stop and do a quick refocus for a new distance. It is also worth stating that the more stopped down you are, the more forgiving the focusing gets. If you are stopped down enough, on some lenses you can focus the adapter to the approximate distance and adjust focus with the taking lens. Sometimes you can even do a rack focus this way. Of course, if you are on a FF camera it is less forgiving.
  2. They have a variable-power diopter in front of normal anamorphic elements that are fixed, set at infinity focus. The variable diopter consists of a plano-concave lens all the way at the front, then a plano-convex lens, then the anamorphics. The focus ring moves the two plano lenses closer together and further apart, varying their combined power. When the two plano lenses are touching they have 0 power; at their maximum separation they have 0.5 power, in the case of the Iscos. The design and concept are simple, but finding lenses large enough to work on anamorphic adapters is not easy. I have tested and seen this first hand on an Isco, and with some cheap demonstration lenses I was able to make it work with a Kowa. It worked in concept, but the quality was not so good. Also, in theory the lenses I used should have had 0 power when together, but they didn't, so the setup lost infinity focus.
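The behavior described above can be sketched with the standard thin-lens combination formula, P = P1 + P2 − d·P1·P2. The ±4 diopter element powers and the 3 cm separation below are illustrative assumptions, not Isco specifications; real variable diopters are thick, multi-element systems.

```python
def combined_power(p1, p2, d):
    """Net power (diopters) of two thin lenses with powers p1 and p2,
    separated by d meters (thin-lens approximation)."""
    return p1 + p2 - d * p1 * p2

# Plano-concave + plano-convex of equal and opposite power:
# touching (d = 0), the pair cancels to 0 diopters...
print(combined_power(-4.0, 4.0, 0.0))           # 0.0
# ...separated, a net positive power appears.
print(round(combined_power(-4.0, 4.0, 0.03), 2))  # 0.48
```

This matches the post's description: zero power when the elements touch, roughly +0.5 diopters at maximum separation.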
  3. Not that I have ever seen; it is safe on lens coatings too. Ultimately it's whatever you are comfortable with. If the grease isn't too hard you may just be able to wipe it off. I had to use the fluid since the grease was hard and the lens was completely stuck, as if it had been glued together. When I regreased the Helios-40 I didn't have to use any fluid.
  4. I believe with the close-up mod there is a risk that you unscrew too far and the front will fall off. That is exactly what you want here: unscrew the front part completely to gain access to the threads. I use lighter fluid for cleaning the old grease. It breaks the grease down and evaporates without leaving any residue, and it is a common way of cleaning lenses. I have successfully removed hardened tank grease from the Helios-40 (seriously, that thing is a monster :)) and fungus spots from multiple lenses with it. Once all the old grease has been removed from the threads, re-lube them with white lithium grease, or something else specific to lenses if you want. I have always had great luck with plain white lithium. That's it, really; after my cleaning the 54 was very smooth, just like new. If you find you have to disassemble the lens to get to the threads, do so slowly and carefully. The lens has 4 glass components: starting from the front there is a plano-concave, then a plano-convex, then the 2 anamorphic elements. The first 2 glass elements can be safely removed and put back. There is no reason to remove anything other than the first glass element for this procedure, and it should remain in its housing - remove them together, since you just want access to the threads. The way these lenses work, it is hard to mess up infinity focus the way you can on traditional lenses. Whatever you do, though, do not alter the anamorphic elements at all. They are very precisely aligned, and a tiny offset will result in a blurry image. Good luck.
  5. Rechargeable batteries can and do explode, and it can happen with just about anything. Do a Google image search for "exploded battery". I found lots of results, but didn't see any from a Blackmagic camera.
  6. Sounds like the drive may have been formatted for Mac. If that is the case, the only solution is to transfer on a Mac or purchase a driver that allows HFS to be read on a Windows machine. One is called MacDrive; it costs around $50. You could also try this for free - it claims you can copy the files off the drive, just not write to it. I've never tried it, so I have no idea if it really works or not. http://www.pcadvisor.co.uk/how-to/windows/3369574/how-read-mac-os-hfs-drives-in-windows-for-free/
  7. These things are old and the grease gets hard over time; I had a 54 that was stuck on infinity because of this. The 54 and 36 are very similar in design. I serviced the 54 myself and it was very easy to do - not that I am recommending you do it yourself, since they are expensive. The grease probably just needs to be cleaned off and re-applied. From what I have heard about Bernie, I find that user's statement kind of hard to believe.
  8. That says front element, not front filter size. Since it ships with components that allow it to attach to lenses from 49-62 mm, I am sure it would be compatible with any lens with a filter size in that range, as long as the lens is not so wide-angle that it vignettes.
  9. Actually, from all the other postings and the video itself, it looks like the internal battery exploded.
  10. Those are filtering types for adding motion blur to moving elements in your compositions. It has nothing to do with video footage; it only matters if you want to add some animated text / motion graphics and want motion blur on it. Of the ones listed, Gaussian would be the best and box the worst.
  11. Sounds like a faulty card; the Komputerbay cards are cheap and unreliable. The 5D Mk III should get around 100 MB/s on a 1000x card. Depending on what you are doing, you may want to go with a higher-quality card and save the hassle of tracking down a good Komputerbay. I have seen posts from people trying 3 or 4 before finding one, and several reports of the cards just dying during a shoot, taking the footage with them.
  12. I never use this application, but to address the last part of this, the difference between video levels and full range: the eye has a fixed range of colors that it can see, and digital devices and digital files have a fixed range of colors they can reproduce. This color range is referred to as gamut. The standard video color gamut is much smaller than what the human eye can see, meaning there is a whole range of colors we can see that cannot be displayed on some digital devices. We don't really notice this much because the video gamut has been carefully designed to include the most commonly seen colors, so the ones left out are extremes in one way or another. The full range option will give you a larger gamut, hence a larger number of possible colors - nowhere near what the eye is capable of, but an improvement over video, aka the sRGB or REC 709 colorspace. Why would you want to use full range? Some cameras and newer devices have a larger gamut - I know some of the 4K TVs have been advertising larger gamuts - so working in full range would allow you to preserve that throughout the edit. Almost all current HD TVs and computer monitors are sRGB or REC 709 with a video color gamut, so you will want to use video range unless you have a specific reason not to. What format to choose? If you are working with 8-bit footage and not adding any effects, just doing a straight edit, then 8-bit is fine and you can enjoy the speedup to your workflow. Otherwise you should probably just use 32-bit to ensure the highest quality results. Welcome to the wonderful world of digital color; there is a lot to learn, but the information is out there. I remember asking myself many years ago why computer primaries are red, green and blue when in traditional media the primaries are red, yellow and blue.
  13. Yes. I am pretty sure neither one of those is on the official list of supported SSDs. Apparently SSDs can have fairly variable sequential write speeds that can drop too low, causing the BMCC to drop frames. Here is a list of SSDs that should work without issue. Also, be sure to format the SSD every time after you offload the footage; deleting clips isn't enough, as it fragments the drive and causes a drop in write speed. The BMCC needs a high, stable sequential write speed or it drops frames. http://www.blackmagicdesign.com/support/detail/faqs?sid=27541&pid=27542&os=linux
  14. I have tested mine at 58mm and 85mm and the results were pretty much the same: a 1.41 stretch factor - at least with my lenses and setup - which works out to a 70.9% squeeze. I have found several other reports online that confirm this as well. Unfortunately that gives you an awkward 2.5 aspect ratio, which I have been cropping down to 2.4 or 2.35, taking advantage of the extra resolution for slight reframing. Sometimes you can get away with using 1.5, but I noticed that 1.5 didn't look right when shooting people, which is why I tested to begin with. I find that 1.41 looks much better in those situations, and I can't bring myself to use 1.5 anymore since I know it is wrong.
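The arithmetic behind those numbers, assuming 16:9 source footage (the sensor aspect is the only assumption here):

```python
stretch = 1.41               # measured horizontal stretch factor
squeeze = 1 / stretch        # fraction of width the image is squeezed to
print(round(squeeze, 3))     # 0.709 -> the 70.9% squeeze

# Desqueezing 16:9 footage by 1.41:
native_aspect = 16 / 9
final_aspect = native_aspect * stretch
print(round(final_aspect, 2))  # 2.51 -> the "awkward 2.5", cropped to 2.40 or 2.35
```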
  15. This adapter is a dual-focus lens, so both lenses have to be focused together to achieve a sharp image. If you are confident that you are focusing them properly and it is still not sharp, chances are the front and rear anamorphic elements are slightly rotated out of alignment. This is fairly common given the age and shipping around of these lenses. The alignment is very sensitive; even the slightest offset will result in a blurry image. You may need to have it serviced. I have been able to realign the optics in the past, so you may be able to do it yourself, but do so at your own risk. I am not familiar with this lens, so I have no idea how difficult it would be; however, most of these lenses are very simple in design.
  16. I have one, tested it and found it to be a little over 1.41.
  17. No, the 5D3 is the best for raw, as it can record at higher resolution continuously than any other camera. It also suffers from fewer problems, like moiré, than the others.
  18. Probably, but I am not sure what it is. Here is one that will tell you how much you need to scale an image down to get a desired number of extra bits; if you can solve it for bits you will have your equation (my math is a bit rusty in that area): scale = 1/(sqrt(2^bits)), where scale is a value from 0 to 1 and bits is the number of extra bits you want. Example: 0.5 = 1/(sqrt(2^2)), so UHD to HD gains 2 bits. 4K to 480p would gain about 4 extra bits, for roughly 12 bits total from an 8-bit source - I am considering 480p to be 864 pixels wide with square pixels, even though that is not the case: 0.25 = 1/(sqrt(2^4)), and 3840 x 0.25 = 960, close to that width. Since bits have to be whole numbers, not every resolution change will result in an extra bit being added.
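Solving the formula above for bits gives bits = log2(1/scale²); a quick sketch to check the numbers quoted in the post:

```python
import math

def extra_bits(scale):
    """Extra bits of tonal precision gained by downscaling by `scale`
    (0 < scale <= 1), solved from scale = 1/sqrt(2**bits)."""
    return math.log2(1 / scale ** 2)

print(extra_bits(0.5))                    # 2.0 -> UHD to HD gains 2 bits
print(round(extra_bits(864 / 3840), 2))   # 4.3 -> 4K to ~480p, about 4 extra bits
```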
  19. For a 2x squeeze: if you were to squash your footage rather than stretch it, then you might pick up 1 extra bit, but probably not 2 like the 4K-to-HD scaling. Since 4K is HD x 2 in both x and y, squashing it lets you use 4 pixels to make 1 pixel - 4 data points per pixel, and those 4 data points are worth 2 bits. With anamorphic you end up with only 2 data points per pixel, since you are combining 2 pixels into 1, giving you 1 extra bit, so I would think a 9-bit image would be possible.
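A toy illustration of the two-samples-per-pixel argument: averaging two adjacent 8-bit values can land on a half-step value, which needs one extra bit to store (the sample values are arbitrary):

```python
# Two neighboring 8-bit samples that straddle a quantization step:
a, b = 10, 11
avg = (a + b) / 2
print(avg)  # 10.5 -> a half-step value no 8-bit integer can hold,
            # so preserving it requires one extra bit (9-bit precision)
```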
  20. Just to be clear, some LUTs can cause a loss of data. Some LUTs have flattened curves near the black and white ranges that can cause a loss of color information that is not recoverable. One example, for colors on a 0 to 1 scale (black to white): a LUT may expand the colors in the mid values, pushing a value of 0.25 to 0 - and every value from 0 to 0.25 also to zero. Regardless of whether your working environment is 32-bit floating point or not, there will be no way to recover the crushed blacks in your footage. This is an extreme example, but it is not that uncommon for LUTs to have flattened curves that do this at either the black or white end of the spectrum. To make the effect even clearer, imagine a "black and white" LUT that takes any color and makes it grayscale: once you have applied that LUT, there will be no way to get your colors back. Also, if you were to push some colors to super-white and apply a LUT, you would most likely lose them and they would no longer be recoverable - again, regardless of working environment. That is why it is recommended to apply LUTs at the very end of a color grade.
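The crushed-blacks example above can be demonstrated with a toy 1D LUT (the curve is made up for illustration, not taken from any real LUT):

```python
def crush_lut(v):
    """Toy 1D LUT: expands the mids by mapping everything below 0.25 to 0."""
    return max(0.0, (v - 0.25) / 0.75)

shadows = [0.0, 0.1, 0.2, 0.25]
print([crush_lut(v) for v in shadows])  # [0.0, 0.0, 0.0, 0.0]
# All four distinct input values map to 0.0 - even in 32-bit float,
# no inverse LUT can tell them apart again.
```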
  21. To answer one of your questions: the point of using a film-emulation LUT is that color reproduction on film is different than digital - one is a chemical process and the other electronic, after all. As a result, a scene shot on film will have different color tones than the same scene shot digitally. While you could grade the footage yourself to get a more film-like color response, the emulation LUTs can do that for you. They are just a starting point, however; a proper grade should always follow. Color correction and LUTs go hand in hand - use as little or as much of both as it takes to get the look you're after. As for when to apply a LUT: generally it is the last step or node, since LUTs can crush your blacks or clip your highlights.
  22. If you are shooting in regular mode, then ML always crops to the center. The resulting recording will be the width that you specify, centered on the sensor, matching the aspect ratio as closely as possible by adjusting the height, again centered. If you were to shoot with an aspect ratio of 1, the resulting image size would be 1920 x 1280, since ML always matches the width even if it cannot match the height. If you were to shoot 960 x 640, you would be using an 18 mm x 12 mm portion of the sensor, centered on the sensor. This effect is easy to see in the live view when changing resolutions. In crop mode that may not be true; I know there was a time when it was cropping to the left, but that may have changed. I personally never use crop mode.
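The 18 mm figure follows from the every-third-pixel readout described in the next post. Assuming a 36 mm wide full-frame sensor read out at 5760 photosites:

```python
SENSOR_WIDTH_MM = 36.0
SENSOR_WIDTH_PX = 5760
LINE_SKIP = 3  # non-crop mode reads out every 3rd pixel

def sensor_region_mm(recorded_px):
    """Width of sensor (mm) covered by a non-crop recording of
    recorded_px pixels."""
    return recorded_px * LINE_SKIP / SENSOR_WIDTH_PX * SENSOR_WIDTH_MM

print(sensor_region_mm(960))   # 18.0 -> matches the 18 mm x 12 mm example
print(sensor_region_mm(1920))  # 36.0 -> full sensor width
```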
  23. With ML and card spanning turned on, people are getting about 110 MB/s on a 5DIII, so that is about the limit that camera can do continuously anyway. ML doesn't limit the maximum resolution in any way; those limits are a result of the data stream that they hack into. In non-crop mode the maximum resolution is the max resolution of the camera divided by 3: 5760 x 3840 / 3 = 1920 x 1280. That is because the camera takes the raw sensor data and only reads out every third pixel, then sends that to compression. ML grabs that data after it has been downsampled and dumps it to the card. They have no direct access to the sensor data, so they are limited by what they can access, which I believe is actually a raw data stream meant for the Live View. In crop mode the data stream is much larger (3584 x 1440) and is every pixel instead of every third, so that is the max resolution in that mode. The resolutions have limitations as well and cannot be just any random value: the width in pixels should be evenly divisible by 16, and the number of bits per horizontal line must be divisible by 16. If you do the math you will see that most of the possible resolutions are already included; I recently added one myself at 1792 wide. So shooting a roughly 4:3 aspect, the highest non-crop-mode res you can do is 1728 x 1280. In crop mode you could do 1920 x 1440.
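The divisibility rules stated above can be checked mechanically; the 14-bit sample depth below is an assumption based on ML raw being 14-bit, and `valid_width` is a hypothetical helper, not part of ML itself:

```python
BITS_PER_SAMPLE = 14  # assumed: ML raw records 14 bits per sample

def valid_width(w):
    """True if a recording width satisfies the constraints described above:
    width divisible by 16, and bits per horizontal line divisible by 16.
    (The second condition is implied by the first for 14-bit samples,
    but is kept explicit to mirror the stated rules.)"""
    return w % 16 == 0 and (w * BITS_PER_SAMPLE) % 16 == 0

for w in (1728, 1792, 1880, 1920):
    print(w, valid_width(w))  # 1880 fails: not divisible by 16
```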
  24. It also depends on what format you are recording in. If you are using compressed 8-bit, then you will probably want to expose for your subject, since there is little room for post correction. If you happen to be shooting raw, then you may be able to expose so the highlights are about to clip and bring up the darks in post. The downside will be excessive noise if there is too much range between the two.