
Posts posted by Mokara

  1. 9 hours ago, Nikkor said:

    With the same technology applied to large and to small sensors, guess which one will have larger full-well capacity and less rolling shutter.

    100MP is total overkill for full frame; when will you see diffraction? f/5.6? Will they build in a focus stacking program for landscapes? Moving the IBIS back and forth, using phase detection pixels to make a depth map? I know they are not.

    Even if you encounter diffraction limitations, higher pixel counts mean that you can actually reach that limit without bumping into the approximation errors you would get if your pixel resolution sat exactly at it. As a general rule of thumb, to capture a true image you want your sensor resolution to be significantly higher than your lens resolution so you don't waste any of the lens's performance. Plus, Bayer filters mean that even at the diffraction limit you will lose information, so you have to be significantly below it to capture reasonably accurate true color at the maximum resolution possible with the lens.

    High pixel count sensors are useful for this reason, even if the optics do not resolve down to that level. At a guess I would say that you need about 9-16 pixels (3x3 or 4x4 matrices) on the sensor to properly recover all of the information in 1 pixel of lens resolution.
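    A back-of-envelope version of that rule of thumb, as a minimal Python sketch (the 11 MP lens figure is an illustrative assumption, not measured data):

    ```python
    def required_sensor_mp(lens_resolution_mp, pixels_per_lens_pixel):
        """Sensor megapixels needed to oversample a lens that resolves
        lens_resolution_mp worth of true detail."""
        return lens_resolution_mp * pixels_per_lens_pixel

    # A hypothetical lens resolving ~11 MP of true detail:
    for matrix, n in (("3x3", 9), ("4x4", 16)):
        print(f"{matrix} oversampling: ~{required_sensor_mp(11, n):.0f} MP sensor")
    # 3x3 oversampling: ~99 MP sensor
    # 4x4 oversampling: ~176 MP sensor
    ```

    On those assumptions, a 100MP body is roughly what the 3x3 case calls for.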

    3 hours ago, Towd said:

    Color aware binning that doesn't throw out data.  Sounds cool.  Would love to read the white paper on how they bin the data without throwing anything away.

    In the meantime my point stands that a lower megapixel sensor with the same capabilities would be more useful.  

    It is fairly simple. You use a 3x3 Bayer array to generate a single debayered pixel, which is what is output. If that is done on the sensor itself, the camera does not have to debayer the image separately, and you would essentially be getting a true color image from the camera at full resolution. The 4K output would have the image quality of a 12K camera, but without the computational overhead a conventional 12K camera would normally have. Since computing power is the bottleneck on all video cameras, a sensor that does this automatically would free up enormous resources in the rest of the camera, which could then be used in other, more useful ways, such as monitoring more AF/exposure points, which in turn would allow for tighter, more accurate tracking and things of that sort.
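    For what it is worth, here is a minimal numpy sketch of what such on-sensor binning could look like. This is a guess at the general idea, not any vendor's actual algorithm, and it assumes a plain RGGB mosaic:

    ```python
    import numpy as np

    def bayer_color_mask(h, w):
        """Integer mask (0=R, 1=G, 2=B) for an RGGB mosaic."""
        mask = np.ones((h, w), dtype=int)         # green everywhere by default
        mask[0::2, 0::2] = 0                      # red sites
        mask[1::2, 1::2] = 2                      # blue sites
        return mask

    def bin3x3(mosaic):
        """Collapse each 3x3 block of photosites into one RGB pixel by
        averaging the sites of each color separately, so no samples are
        discarded."""
        h, w = (d - d % 3 for d in mosaic.shape)  # crop to a multiple of 3
        mosaic, mask = mosaic[:h, :w], bayer_color_mask(h, w)
        out = np.zeros((h // 3, w // 3, 3))
        for c in range(3):                        # average each color plane
            vals = np.where(mask == c, mosaic, np.nan)
            out[..., c] = np.nanmean(vals.reshape(h // 3, 3, w // 3, 3),
                                     axis=(1, 3))
        return out                                # true color, 1/9 the pixels

    rgb = bin3x3(np.random.rand(12, 12))          # 12x12 mosaic -> 4x4 RGB
    print(rgb.shape)                              # (4, 4, 3)
    ```

    Every output pixel gets directly measured R, G and B, which is the sense in which nothing is thrown away.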

  2. Yeah, like I said, I have to wait until I get home to check. I don't use the recorder all that often, it is more of a toy to experiment with for me, but I seem to remember deleting individual files the other day.

  3. 16 hours ago, DeMarcus Davis said:

    Of ALL the photogs I share the field with, I have only seen TWO (firsthand) use mirrorless cameras for NFL in-game shooting. One (as I previously mentioned) used a Sony (and since switched to a Canon system out of frustration) and another used what looked like a battery-gripped Fujifilm X-T2. Mirrorless MAY become more mainstream in outdoor sporting events, but that is more wait-and-see than an eventuality.

    Simply put, pro DSLR camera autofocus tracking is superior to ANYTHING a mirrorless body has put out at this point. It's just a hard fact. I REALLY LIKE my Nikon Z6. It's GREAT for portraiture, landscape, and travel photography and is mobile and light (with its native lenses). However, I cannot endorse or promote ANY mirrorless for NFL/fast action sports.

    That will change. A lot of what you see now reflects the systems people currently own; there is no reason for them to throw that gear away just because something newer exists. There is an inertia to change, especially in professional spheres where familiarity with current equipment and systems far outweighs other considerations. For example, old mainframes continued to be used in corporate environments long after superior alternatives were available. It was simply too disruptive to upgrade, so they carried on using the obsolete technology, often for very long periods of time. So what you see on the side of a sports field is irrelevant to the performance argument.

    There is also an element of Pavlovian response involved: the people who use DSLRs for this work generally base their opinion of mirrorless performance on much older systems rather than the latest state-of-the-art gear. Their choices become a self-fulfilling prophecy based on subjective rather than objective considerations.

     

    15 hours ago, webrunner5 said:

    He won't even review a camera unless it has 4:2:2 in it. Yeah, he does more serious work so that is required for it. He is pretty honest and doesn't feel bad about knocking missing features. There really is no reason not to have all that stuff in it in this day and age. It is mostly software based. It is not like it is a 600 dollar camera. And it is their video-based model, for heaven's sake.

    It is also a consumer camera, not a professional camera. It should be evaluated as such. 

  4. On 1/30/2019 at 7:27 AM, Django said:

     

    And furthermore who really wants to use an external recorder on a mirrorless? Not that many people imo. Kind of defeats the whole compact portable purpose.


    Flexibility? So you can use an external recorder when a particular purpose calls for it, but record internally for other applications as needed?

    On 2/1/2019 at 5:12 AM, ivanku said:

    I agree that the weird IBIS artifacts almost disappear at 50mm, but it’s not distortion correction that’s causing this. I tested a couple of manual focus wide angle lenses with corrections turned off, and saw the same wobbles. 

    Those wobbles are a combination of lens distortion changing as the focal point moves and rolling shutter. Your camera can't correct for this. You will have them in most stabilized systems in one form or another, no matter what system is used, unless you have exceptional optics.

    On 3/2/2019 at 1:06 PM, DeMarcus Davis said:

    I have a buddy who shoots sidelines at Oakland Raiders games and he switched from his Sony A9 system to Canon. He complained a lot about autofocus tracking, especially while panning in autofocus continuous high. This is the weakness of mirrorless cameras and why I would only use my Nikon Z6 for pre- and post-game shots. For in-game football action, NOTHING beats a DSLR and optical viewfinder, especially on the D500 and D3s up through to the D5 (OH GOD how I love the D500). Then again, these cameras are nothing but tools. Right tool for the right job. There is no magic Goldilocks camera that will rule them all.

    In regards to hybrid video shooting stabilization, the best thing to do to eliminate lens or IBIS video wobble is to buy a gimbal and turn off IBIS/VR. Sure, it makes any mirrorless system more of a clunky beast, but if you want buttery smooth video with a stills camera (mirrorless or DSLR), you need to place it on a gimbal. NO camera manufacturer is going to cannibalize their pro video cam lineup by introducing a cheaper and compact consumer/prosumer stills camera that can eclipse an FS5/7, Ursa Mini Pro 4.6K, EVA1, or Canon's C100-300 Cinema lineup. And Nikon is too conservative a company to leap into that territory of dedicated video cameras.

    These mirrorless cameras are meant to be nothing but a bridge between standard DSLRs and dedicated video cams. 

    That is not correct. DSLR tracking is inherently limited by hardware while MILC tracking is limited only by computing power. It is inevitable that MILCs will overtake DSLRs in this respect. The latest iterations of Sony firmware are apparently already there or are pretty damned close.

    We are now at the tipping point where photography moves rapidly to MILC systems. My prediction a few years ago was that this would happen around 2019, based on my perception of how the technology was developing, and it is somewhat satisfying that it is actually playing out that way :) . Especially going back to forums like Canon Rumors and the folk there who pooh-poohed that idea. Turns out they were not as astute as they thought :)

  5. 23 hours ago, deezid said:

    Seems like the iGPU is rendering which is really slow in comparison to the 1060.

    Playback does not involve rendering. It is just decoding; the data is not being transformed.

    Usually when you encounter issues like this it is a case of the software doing the decoding rather than the hardware. Since software decoders vary considerably in efficiency, some will give smooth output while others will stutter. In this case, since the issue seems 10 bit centric, my guess is that the various programs are treating those files very differently, which is causing the performance issues.
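    One quick way to test that is to time a pure decode pass with and without hardware acceleration. A rough sketch, assuming ffmpeg is on the PATH and "clip.mov" stands in for one of the problem 10 bit files:

    ```python
    import subprocess
    import time

    def decode_time(extra_args):
        """Decode the file and discard the output; no encode, no display."""
        start = time.perf_counter()
        subprocess.run(["ffmpeg", "-v", "error", *extra_args,
                        "-i", "clip.mov", "-f", "null", "-"], check=True)
        return time.perf_counter() - start

    print("software decode:", decode_time([]))
    print("hardware decode:", decode_time(["-hwaccel", "auto"]))
    ```

    A large gap between the two runs points at the decoder, not "rendering".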

  6. On 3/14/2019 at 12:14 AM, thephoenix said:

    I don't think anybody told me to shoot at 23.976; I think it is my stupid artistic brain that took over my logical brain.

    At the moment I only shoot personal stuff, so I basically do whatever I want, and that allows me to make the mistakes now (which is better than making them on a job)?

    I think I will stick to 24 for my personal work as I like the smoothness of it; I don't like the crispy footage of 60 and kinda like the blur in the movement.

    Not sure there is much difference between 24 and 25. 30 I never tried.

    Do you know if YouTube/Vimeo keep the original frame rate or if they "convert" the footage to a different one to comply with their standards?

    It is not an artistic brain, it is a Pavlovian response to the fact that movies have traditionally been shot at 24p, ignoring the fact that nobody outside of a cinema (or a Blu-ray/DVD disc on a TV) views anything at 24p. You see stuff you admire and try to emulate it without understanding WHY it was shot like that.

    There is a difference between 24p and 25p. Because of the frame mismatch there will be a stutter twice a second (unless you are viewing in a cinema or from a Blu-ray/DVD disc).
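    The arithmetic behind that stutter, as a small sketch:

    ```python
    def uneven_frames_per_second(source_fps, display_fps):
        """Frames per second that are held for an extra display refresh
        when the display rate is not an exact multiple of the source."""
        return display_fps - source_fps * (display_fps // source_fps)

    print(uneven_frames_per_second(24, 25))   # 1  (24p in a 25p timeline)
    print(uneven_frames_per_second(24, 50))   # 2  (24p on a 50Hz display)
    print(uneven_frames_per_second(24, 60))   # 12 (classic 3:2 pulldown judder)
    ```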

  7. 2 hours ago, thephoenix said:

    Thanks.

    I shot 23.976 because in my mind it is the "standard" for movies, even if my project is not going to be a movie...

    I am shooting for YouTube and Vimeo; not sure what their standard is, probably 30fps?

    To get slow motion in my 23.976 timeline, what should I shoot? I thought 60fps (59.94) was OK if slowed down to 40%.

    Sure thing is that I already have some rushes in 23.976, so I am kinda stuck with it on this project.

     

    The people who tell you to shoot at 24p because that is what movies are shot at don't know what they are talking about. Shoot at the frame rate you expect it to be displayed at. 24p works for movies because movie projectors display 24 fps. But if you try to display that on a device operating at a different frame rate, expect problems unless you do sophisticated interpolation (and if you are going to do that, you might as well shoot at the proper frame rate to begin with; far fewer problems later on).

    If you are not shooting footage for cinema distribution, shoot at 25/30 or 50/60 fps (depending on where you live) if it is for commercial TV distribution, or at 30-60 fps for anything else. Flat panel TV sets (but not CRTs) automatically adjust their refresh rate to match the input (cell phones probably do this as well), but computer monitors and online video hosts do not; they run at 60 fps (or whatever refresh rate you have manually selected), and anything else will have frames duplicated or removed, which can cause serious visual artifacts. Broadcast TV uses a fixed frame rate, and any content shot at a different rate has to be re-rendered with sophisticated interpolation to generate something watchable.

    As a general rule of thumb, shoot at 30/60 fps and you should be ok for most viewing devices if you are not shooting commercially. For broadcast TV shoot at the PAL or NTSC standard, depending on where you live.
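    As for the earlier slow motion question, the 40% figure falls straight out of the frame rates:

    ```python
    from fractions import Fraction

    timeline = Fraction(24000, 1001)   # 23.976 fps
    capture = Fraction(60000, 1001)    # 59.94 fps
    # Laying 59.94 footage into a 23.976 timeline plays every captured
    # frame, i.e. at this fraction of real-time speed:
    print(float(timeline / capture))   # 0.4 -> 40% playback speed
    ```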

  8. Graphics cards in laptops are not the same as the ones in desktops in terms of capability. That is likely your problem. A laptop is NOT a replacement for a desktop where video editing is concerned.

    If a hardware decoder is being used, it won't show up in monitoring tools, since the work is not being done in the processor cores. That is why the cores don't look fully utilized (when they probably are, at least as far as the hardware decoder is concerned).

  9. 10 hours ago, mercer said:

    Yep, it seems like it’s an activity camera more than an action camera. At the right price, they will sell a boatload of these. I may even buy one.

    It is a Canon. The colors are lovely. Of course you will!! ;)

    9 hours ago, Mattias Burling said:

    The intended customer of this camera doesn't even know what those numbers mean. They probably couldn't care less.

    I think the intended customers are children.

  10. On 2/2/2019 at 6:36 PM, Hans Punk said:

    I think Apple are way too late to the party...like an arrogant drunk waving his wallet around.

    Apple won't surpass Netflix and Amazon because they've insisted on being too proprietary in their thinking and have no innovation sparkle under the current dictatorship. It is essential these days for consumers to have a choice of platforms to view streamed content. People expect everything to be available on a phone, laptop, desktop, or TV, totally agnostic to which manufacturer's peripheral you are viewing it on. Apple TV is an example of Apple's last clinging attempt to lock and control mass-market consumers into an Apple streaming ecosystem. Apple could have been pioneers in the streaming market if they had just relinquished their proprietary hardware mindset and invested earlier in original quality content creation, thinking about the user experience first and then working backwards - I guess that is how Mr Jobs probably would have done it.

     

    That is absolutely NOT how Mr. Jobs would have done it. He was all about control and squeezing the competition out, and that means proprietary everything. That was his thing.

    Steve Jobs believed that he knew what the consumer wanted better than they knew it themselves (he said so himself). You got what he gave you; he packaged it as "hip" and got opinion leaders to buy into an elitist image, knowing that the sheep would follow. That, together with absolute control over the ecosystem, was his business plan. It was NOT a consumer-friendly business plan; consumers were gullible chumps there to be exploited by sophisticated marketing.

    On 2/5/2019 at 8:22 PM, Video Hummus said:

    Yes, this 100%. Sometimes a lot of the Netflix Originals just look and feel like they are trying too hard. When some of the better movies and films are simply about story and less with all the crazy explosions and fancy (and expensive) special effects and stuff. If the story is good, viewers don't care!

    You mean like Russian Doll?

  11. 16 hours ago, DanielVranic said:

    400Mbps UHD on both

    Heat was not a factor. The camera (XT3) directly next to it ran for 15 mins longer (filled the card, non-heat)

    Have not swapped them in a while, so i do not recall if there has been a change. I will test this afternoon.

     

    95 MB/s is the read rate. Burst write rates are usually a lot lower, and sustained writes lower still.

    Keep in mind that your effective write speed is going to be determined by the lowest write speed; actual write rates fluctuate during a recording, sometimes wildly. What you see in a burst spike is not necessarily what you are going to get when recording for a period of time. What matters is the minimum write speed, not the maximum.

    Also, write speeds can vary considerably depending on the capacity of the card.
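    As a sanity check on the numbers in this thread (assuming the 400 figure is megabits per second, as on the X-T3):

    ```python
    bitrate_mbps = 400                     # recording bitrate, megabits/s
    required_mb_per_s = bitrate_mbps / 8   # sustained writes the card must hold
    print(required_mb_per_s)               # 50.0 MB/s, vs the 95 MB/s READ label
    ```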

  12. On 3/4/2019 at 4:44 AM, BTM_Pix said:

    That is very kind of you to say so.

    Let's reserve judgement on it until it's proven not to kill your phone, camera or nearby pets.

    You can now follow us on Twitter @cda_tek and of course everyone re-tweeting the balls off it would be very welcome ;) 

     

    Is that a possibility? What will happen to cat videos if the app wipes them out? How will new cameras be tested properly in the absence of cats?

  13. On 3/6/2019 at 5:56 AM, BTM_Pix said:

    I would hazard a guess that it might be to do with the method to construct continuous video from it rather than specifically the file format for individual frames, which would make me suspect it was the same company that were suing Atomos until very recently before a licensing arrangement was done.

    Whether the same license arrangement was not offered due to BM being a rival camera manufacturer or BM decided to turn it down would be a question if it was who I suspect it was.

    CinemaDNG is an open format, but BM was using a modified extension of it. It was likely those modifications that ran afoul of other people's IP, not CinemaDNG itself.

  14. 16 minutes ago, Mark Romero 2 said:

    ...and scary...

    Ok, so the dynamic range is the height distance between the ground floor and the top floor, and the bit depth of the ADC is the number of steps the staircase has that goes from the ground floor to the top floor???

    Right. Your dynamic range could be a foot or it could be a mile. ADC bits are the "ticks" on those scales when those distances are converted into a digital code. A bit in itself has no inherent size; a 12 bit tick on your foot ruler does not represent the same quantity as a 12 bit tick on your mile ruler.
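    In numbers, the foot-vs-mile ruler looks like this:

    ```python
    def tick_size(full_scale, bits=12):
        """Analog quantity represented by one ADC code step."""
        return full_scale / 2**bits

    print(tick_size(1.0))      # "foot" ruler: ~0.000244 of a foot per tick
    print(tick_size(5280.0))   # "mile" ruler, in feet: ~1.29 feet per tick
    ```

    Same 4096 ticks either way; only the size of a tick changes.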

  15. 15 minutes ago, androidlad said:

     

    Sensors work in linear fashion: even if the analog components (photosites) are capable of 10000000 stops, a single 12bit ADC would only be able to produce a max of 12 stops in the digital signal.

    Sony's IMX490 is capable of 120dB DR; that's achieved by three 10bit ADCs working simultaneously.

    https://***URL removed***/articles/4653441881/bit-depth-is-about-dynamic-range-not-the-number-of-colors-you-get-to-capture

    Have a read and educate yourselves, gentlemen.

    An ADC converts an analog signal into a digital code. If your sensor can measure 1000 stops of dynamic range (as in, it can accurately measure both a low non-zero quantity and a high quantity, the difference between which is your dynamic range), your ADC will convert that analog response into bits covering that range. An individual bit does NOT correspond to a stop of DR. It could be the equivalent of 10 stops or it could be 0.1 stops; it is completely arbitrary. You could use a 12 bit ADC for your 11 stop DR sensor, or you could use a 256 bit ADC for it, or you could use a 4 bit ADC. The only thing that would change is the size of the steps between the highest and lowest analog values your sensor can detect.

    If your sensor had a 1000 stop DR, your 12 bit ADC would generate a digital version of that analog signal with 4096 possible values. The size of those 4096 steps varies depending on what the dynamic range of your sensor actually is. If it has 4 stops, those 4096 steps correspond to small increments of the response; if it has 1000 stops, those 4096 steps correspond to large increments of the response.
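    A minimal sketch of that point, assuming a linear sensor response:

    ```python
    import numpy as np

    def adc(signal, full_scale, bits=12):
        """Map analog values in [0, full_scale] onto integer ADC codes."""
        step = full_scale / (2**bits - 1)
        return np.clip(np.round(signal / step), 0, 2**bits - 1).astype(int)

    # The same 12 bit converter over two wildly different analog ranges:
    for stops in (4, 1000):
        full_scale = 2.0**stops           # linear range for that many stops
        codes = adc(np.array([0.0, full_scale / 2, full_scale]), full_scale)
        print(f"{stops} stops: codes {codes}, step {full_scale / 4095:.3g}")
    # Either way there are exactly 4096 possible codes; only the size of
    # one step changes.
    ```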

     

  16. 1 hour ago, androidlad said:

    Do you know what an ADC is? ADC bit depth directly determines how much DR can be extracted from the sensor. Sony sensors use a single 12bit ADC for video. The new BMD 4.6K G2 uses dual 11bit ADCs, ARRI uses dual 14bit ADCs.

    You clearly don't know what an ADC is. Your 12 bit ADC can cover 1000 stops of DR if necessary. There would just be big steps instead of little steps.

  17. 4 hours ago, Maccam said:

    In the market for a new camera. I want more powerful video features, and the Z6 with ProRes RAW coming down the pipeline makes it look like a good option. I originally started in photo, and over the years I handed my money to Nikon for their glass; with the FTZ mount it should be a relatively smooth transition, along with the potential to mount just about anything on the Z mount. One thing I noticed when testing the Z6 at my local camera store, something I had overlooked, is that Nikon set a limit of only 3-axis IBIS with F mount lenses. I found this to be the case even with my Nikon 70-200 2.8G VR2. I tried it both ways, VR in the lens on and off, and checked the various IBIS settings in the camera. The results were not good. In fact, just hand holding it, I thought the VR would be smoother on a Nikon DSLR body. For the last few years my main video camera has been the A7sii, and with a dummy Metabones E to F mount adapter I get full 5-axis IBIS. My first thought is there should be no engineering reason for Nikon not to allow full 5-axis with anything mounted on the camera. I know they want to sell their new Z mount lenses, but this is kind of an uncool way to go about it. Maybe I did something wrong; if you have any ideas please tell me. I would very much like to be proved wrong.

    Also, did anyone notice that the new Z mount Nikon 24-70 2.8 is $100 more than the F mount one, which has VR in the lens and costs less? I would think the 24-70 Z would be a few hundred cheaper with no VR in the lens.

    The Sony does 5-axis stabilization in the body; it doesn't care much what lens you have attached. Stabilization on the Nikon, however, happens through interactions with the lens, so if your lens is not fully compatible with the body it is not surprising that it does not offer full stabilization with older lenses.

  18. 3 hours ago, Castorp said:

    Wow, somebody wrote a silly comment about Canon and Apple users being for art snobs and suddenly the Canon people are losing it? hahaha. If anything, many artists don't care too much about what they're using and just go ahead and make the work. Cameras are used over years and years, and insecurity isn't projected into what camera is being used and the need to have "the best". The Canon RP looks to be a great camera making great work. Canons are great. I've never liked the colour or the lenses myself but sure wish I did. Sigh, condemned to being a loser I guess *shrugs*. As is poor loser NASA and countless other institutions that rely on, for example, Nikon.

    They say the truth hurts.

  19. 18 hours ago, DBounce said:

    That would be because arts types are also equipment snobs in general. Most of them probably would not be caught dead with anything other than an Apple product (where applicable), for example. The type of person who shoots documentaries with awards in mind (in other words, as "art" rather than as content) is probably going to use the "right" sort of equipment. It is a self-fulfilling prophecy of sorts: for them, if they don't use a Canon they are not a real artist, so they use a Canon. If they used something else, their fellow artists would consider whatever they made as "not looking right" and hence not "art", so they would not win any awards (or at least not stand much chance of winning). That is why all of those docs were shot on Canon equipment.

    On the other hand people who shoot documentaries primarily as content are going to be more concerned with equipment specs, so their choices will be different.

  20. On 11/15/2018 at 3:38 AM, webrunner5 said:

    For pro work I am not too sure I would go with a cable that is that thin. Just think how thin the wires in it have to be. There are 19, yeah 19, wires in an HDMI cable. If they didn't have to bend it much, sure. But if you break down and set up a rig often, I am not too sure about that.

    The wires are probably the same thickness irrespective of how thick the cable is. Thicker cables are less flexible, however, so less local stress is placed on the wires inside.

    On 11/15/2018 at 3:45 AM, Shirozina said:

    That's true, but the thick cable of the Atomos also puts additional strain on the connectors - you can't win!

    The important part is to stop the wires from flexing too much and failing from metal fatigue, so a thicker, relatively inflexible cable is what you want. Bending is the enemy of wires; you want a cable that minimizes it.
