Everything posted by kye

  1. kye

    Davinci Resolve 16

    I'm just winding you up. I've found and reported a few bugs over on the BM forums too. So far though, not a lot of issues from my end. I have had a few WTF moments, but then thought about it and realised it's probably just a design decision that works for some situations, just not the particular way I happen to be using it.
  2. If you consider the BMPCC4K a cinema camera (which you should) then you'll understand how a cinema camera can put pressure on DSLR/MILC cameras. You only have to look at the impact the BMPCC4K had on people shooting with other cinema cameras (e.g. lower-end REDs). The more cinema cameras that come out challenging the specs-vs-price equation, like the ZCam, Kinefinity, BM, etc, the more pressure it puts on the big cinema names like RED and ARRI, but also on those manufacturers who span the two worlds like Sony, Canon, etc. It's all connected. Sometimes loosely connected, but connected just the same.
  3. kye

    Davinci Resolve 16

    I'm editing the video for my 5-week honeymoon trip in Italy - does that count as important? Seriously though, I've had fewer crashes on 16 in total than I used to get per hour on v12.5. I'm not even kidding. I am doing backups each time I get a new beta though.
  4. kye

    Davinci Resolve 16

    Yeah, and an incremental update where you only have to download the patch, rather than the full standalone version, would be nice. I have tonnes of internet, but making people download the same thing again and again is just annoying and wasteful.
  5. Interesting stuff. Seems like it might be the time of the 6K camera, which for many will be a nice compromise: avoiding the file sizes and cost of 8K while still gaining the ability to crop, stabilise, or downscale in post for a 4K delivery.
  6. kye

    Sports videography

    My 2x teleconverter for PK mount arrived today. It was a bargain, especially considering that when I opened up the original instructions that came with it, I discovered they begin with: "The TELEPLUS is the finest quality lens extender on the market". Score!! Even @BTM_Pix can't argue with that! I took some test images through my Sun 70-210mm f3.8, both with and without the TC, and initial impressions suggest it does a pretty good job of zooming into the horrible CA that comes from the zoom lens! I wonder why the person was selling it - maybe it was the whacking great fingerprint on one side of the glass element... I'll be sure to take some test shots over the weekend to properly evaluate it, but it will be great to have some extra reach, considering I'll now have the FF equivalent of a 280-840mm f16 lens.
  7. kye

    Davinci Resolve 16

    Peter should be able to sort you out, I hope anyway. It's worth trying the next beta and seeing if it resolves it, har har har.
  8. kye

    Lenses

    Nice collection, some in there I'm not too familiar with. If you feel inclined, I'm sure lots of people would appreciate seeing a video with some footage from each of them so they can make a bit of a comparison. Extra useful if they're in the same lighting conditions. If you're still in Melbourne then I'm sure an overcast or rainy day would have pretty consistent lighting - maybe you can get shots of Melbournites celebrating the climate of their home city by madly dashing through a downpour trying to avoid puddles. I've got two Helios 58mm f2 lenses and the Mir 37mm f2.8 and found them to be quite different in character. The Helios is pin sharp in the centre and falls away significantly, even on MFT where you're only looking at the centre of the lens, whereas the Mir renders quite differently. They also all change character from wide open to stopped down a couple of stops, so there are lots of variables going on. I was particularly excited to get the Mir because it is supposed to be apochromatic, and to its credit that did give it a different rendering, but I ended up going in a different direction and have chosen the Konica Hexanon 40mm f1.8, which is not only faster but is also considered one of the sharpest lenses ever made (if not the sharpest). I have no idea if that's true, but it's not that typical vintage look! I remember that video of the busker from when you posted it previously - it's nice work and the IQ is very nice. I think many, many people would be happy with the combination of the resolution and beefy codecs of the P4K through vintage glass that takes that modern sheen off it.
   9. In a sense I'd agree, but the bigger picture is a bit more complicated. Over time computers have been getting smaller and cheaper. Early computers were custom built, cost millions of dollars, and took up entire floors of buildings. They ran on valves (tubes for you US folk), needed huge amounts of power, created amazing amounts of heat, and were subject to all kinds of issues like valves burning out, dust making faulty connections, and insects getting in (where the phrase "a bug in the code" comes from). Then with the microprocessor revolution desktops appeared and eventually became cheap enough for domestic use. There were early portable computers that required mains power but had integrated CRT monitors and keyboards, which were the first step in that direction. Laptops became feasible when battery technology and LCD displays reached sufficient maturity and low enough cost. Smartphones appeared when touchscreens and miniaturisation technologies matured. You can look at it like this - people do - but there are some wrinkles in here. For example, laptops are basically mobile phones with a huge screen, a huge battery, and a keyboard. In many cases the hardware involved is the same, and phones can easily be made to run the same operating systems. Those with laptops who (like me) use them on the go but then 'dock' them to a larger monitor, storage, and potentially eGPU hardware, etc, are essentially bridging the gap between laptops and desktop computers. The only limitations are the comms standards (like USB 3.1 or Thunderbolt 3) and software support. Smartphones have this potential as well - plug in your phone at home and run Resolve on it, for example. We already turn our phones into laptops quite commonly. So, in a marketplace where laptops are sort-of desktops and phones are sort-of laptops, what is the future of the desktop? Is it that desktops will be sort-of super-computers? If so, why shouldn't they be designed to be far more powerful than laptops, and priced accordingly? In a world where things are sort-of the next size up, it sort-of makes sense.
  10. kye

    Lenses

    You're welcome!! I'm assuming they're all Russian lenses? I think I can spot the Helios 44 and Mir 37 in there, but the rest aren't obvious to me. Confused and paranoid man with camera runs around at the seaside and accidentally becomes the saviour... talk about trotting out the old cliches!!
  11. kye

    Davinci Resolve 16

    Did you figure it out? I have no idea, and if Jim couldn't help then it might be a strange bug of some sort, which means trying the normal troubleshooting steps, like copying every clip to a new timeline, or to a new project, etc. Alternatively, you could set the in and out points in the Deliver page to a single frame and then export the video using an image-sequence codec to get a single image that way. It's a bit of a PITA though!
  12. kye

    Davinci Resolve 16

    @heart0less Have you checked the Deliver page? There are some options (maybe under Advanced?) about using optimised or proxy media during the export - maybe these have been accidentally enabled?
  13. Great stuff, keep us informed. PS, I like your writing style, direct with lots of info in few words. Keep it up!
  14. My finger grip for the X3000 arrived yesterday; it is a very interesting and well-designed accessory. The unfortunate aspect is that it completely obscures the rear door, prohibiting the use of an external microphone. I've examined it with the intent of drilling a hole to get a 3.5mm mic jack through the rig and into the camera, but that doesn't look too feasible. I will do some testing with the built-in mics and see what I think. My uses for this camera are mostly without dialogue, so ambient (non-directional) audio is probably fine. I think this rig might be one of the most interesting from a "nano-rig" perspective, and it might also be why there are hardly any third-party rigs for this thing - it covers the tiny-rig category, and if you are fine with larger rigs then the 1/4-20 mount provides you with all the options you would need.
  15. There's an economic theory that basically says 10% of your customers can afford to pay 10x the price for an item if it gives them something they can't get elsewhere. That means (for example) that if Apple are selling 18M computers per annum (link) then there are roughly 1.8M people who would buy a computer that cost 10x what the Mac costs. If a family car costs $25k, think of how many people buy $250k or even $2.5M cars. And if the R&D costs the same to make a super-charged version then you will still recoup it, because you sell the same overall $ amount of them per year (10% of the buyers at 10x the price is the same revenue). You guys might be surprised they're pricing a computer so high, but I'm surprised that they don't make computers that cost $40k and $400k.
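      If it helps, here's a rough back-of-envelope sketch of that arithmetic in Python; the 18M units per annum is from above, but the average selling price is just a made-up figure for illustration:

```python
# Back-of-envelope sketch of the "10% will pay 10x" idea above.
# The 18M annual unit figure is from the post; the average price is a
# hypothetical number purely for illustration.

units_sold = 18_000_000        # assumed annual Mac sales (from the post)
avg_price = 2_500              # hypothetical average selling price, USD

premium_fraction = 0.10        # 10% of customers...
premium_multiplier = 10        # ...willing to pay 10x the price

base_revenue = units_sold * avg_price
premium_revenue = (units_sold * premium_fraction) * (avg_price * premium_multiplier)

print(f"Base line revenue:    ${base_revenue:,.0f}")
print(f"Premium line revenue: ${premium_revenue:,.0f}")
# Both print $45,000,000,000 - 10% of the buyers at 10x the price is the same revenue.
```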
  16. Absolutely. I also think that people don't know they want something until it is invented or becomes feasible. For example, when 1080p was released no-one turned around and said "I want a camera that can record four times the resolution of FullHD, I want it to be the size of a match-box, have a screen on the back and also one on the front so I can film myself and see if I'm in frame, and I want it to do magic so that it looks like a professional steadicam operator is holding it, and I absolutely need all these features and will accept nothing less because I'm going to be using it to make my own TV show starring me and I'll get paid by doing advertisements for brands that reach out to me, not me to them". Now it exists and people are criticising it because the bitrate isn't high enough.
  17. Another data point, slightly OT here, is that I spent 5 weeks in Italy last year and almost every camera I saw from tourists and holidaymakers was a Nikon super-zoom fixed lens model of some kind. I saw a few Nikon DSLRs and a few Canon DSLRs, and of course a zillion smartphones. What I really took from that was that there are pockets of users and they can be quite insulated from each other.
  18. kye

    Sports videography

    That makes total sense, and I can see that it would result in lots of shots where people are running towards you, providing the best opportunities. I'd have to weigh up the better footage against the fact that I normally sit with the wife and watch the game together while I shoot, plus the fact it would make my son all the more aware of me filming him play. Life was never meant to be easy I guess!
  19. @joema thanks for sharing your methods and workflows. In a sense I'm thankful that I don't work with the footage quantities that doco folks work in - a shooting ratio of 600:1 is pretty killer! Plus, the quality bar I have for my projects is exponentially lower, so there is that. Part of the challenge I see is for NLE designers to provide methods for organising and marking-up raw footage that are of sufficient flexibility and scalability to meet the challenge of the project. The smaller and simpler the project, the less comprehensive the tools need to be for doing it all within the NLE. I also see there being a kind of scalability factor: below a certain size the built-in functions of the NLE will work, and above a certain size doing things outside the NLE, and even across multiple external tools, is an acceptable overhead for the size of the project and the capability and capacity of the team of editors. However, I wonder if there is a middle ground where you have too much complexity for the NLE's integrated tools but the editing team can't afford the overhead of external tools and the additional admin this creates. This gap may not exist in FCPX or PP (I've heard that they're mature and well-featured editing packages) but I suspect this might be where Resolve is a bit lacking. I'm not in the territory of this gap, I'm just kind of thinking out loud here. My challenge (and my original call for comments) was that my project outstripped my previous technique of just doing a series of passes over the footage until I had an end product, so I had to dip into some of the clip / timeline management tools available. In future I'm contemplating doing highlight reels that cut across many years of footage and may end up including footage that didn't make it into the final edit of a previous project, so in a sense I'm looking ahead to a possible future project where my many hundreds of hours of footage are the source media. Also, understanding a larger challenge normally helps when you're facing a smaller one, so there's both a longer and a shorter-term benefit to this discussion for me.
  20. 1000-nit screens would be great, but if that's not possible / practical then a decent EVF can work almost as well.
  21. I voted for Internal H264/H265 but I did get a bit torn between that and internal RAW. My rationale is that I want the best quality I can get, but with a limited bitrate for practical purposes like ease of editing and storage. I use the GH5 with UHS-I cards, so I'm limited to the 150Mbps h264 4K mode or the 200Mbps h265 6K mode, and whilst I'd love to shoot the 6K mode all the time, saying it's a PITA for editing is possibly the understatement of the century, so for the moment I've gone back to the 4K mode because the h264 is so much nicer. If the manufacturers started offering internal RAW then I'd be very, very interested in codecs like BRAW 12:1 (46MBps) or maybe 8:1 (68MBps) for short projects or hero shots. But every ProRes flavour except Proxy (22.4MBps) and, at a stretch, LT (51MBps) is beyond practicality for me. I also voted for 6K resolution. I'm currently publishing at "1080 quality", which means that I'll probably export to YT in UHD but I don't sweat it if there's a shot that is only 1080 quality, because either I've cropped into a 4K shot, or it was a 4k50 150Mbps LongGOP shot conformed to 25p and possibly slightly cropped and/or stabilised in post. Of course, I'd love for every shot to look like it was shot natively in the export resolution in full uncompressed RAW, but that's not my expectation. In this sense, taking into consideration my ~500Mbps bitrate limit, it's a toss-up between modes like:
      • a very high bitrate 1080 HFR mode, which would have less resolution to crop into and stabilise, but also less compression, better chroma sampling, and higher bit depth
      • a 4K "middle ground"
      • a 6K oversampling approach, which would have the most resolution but the most compression and the least data-per-pixel as shot, yet may give a better result when downscaled to 4K, especially considering that stabilisation is only as good as its tracking accuracy, which is a function of resolution.
      When 1080 and 720 came out I did some tests comparing which resolution gives the nicest output for a given completed file size. When I compressed a video file to 1080, 720, and SD so that all had the same output file size, the file with the highest resolution was easily the best quality. This test assumed the same codec, and all files were quite compressed, so it may not be directly applicable to a compressed vs RAW comparison, but it suggests the comparison is worth making. Given this logic, the h264/h265 6K or 8K file may have advantages over a lower resolution but much higher bitrate file. I'd be open to both options, especially when you consider that a 4K Bayer sensor doesn't give full colour sampling at 4K, whereas an 8K sensor downscaled to 4K can. This then leads into the question of whether you should downsample from 8K to 4K in-camera before compression, or downsample in post after compression. I suspect there are advantages to each.
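      To make that data-per-pixel trade-off a bit more concrete, here's a minimal Python sketch; the ~500Mbps ceiling and 25p come from my numbers above, while the exact frame sizes (particularly the 6K one) are just illustrative assumptions rather than any specific camera's modes:

```python
# Minimal sketch of the data-per-pixel trade-off at a fixed bitrate.
# The ~500 Mbps ceiling and 25p frame rate are taken from the post above;
# the frame sizes below are illustrative assumptions only.

BITRATE_MBPS = 500   # practical recording ceiling from the post
FPS = 25

modes = {
    "1080p":        (1920, 1080),
    "4K UHD":       (3840, 2160),
    "6K (approx.)": (6144, 3456),  # hypothetical 16:9 6K frame
}

bits_per_frame = BITRATE_MBPS * 1_000_000 / FPS

for name, (width, height) in modes.items():
    bits_per_pixel = bits_per_frame / (width * height)
    print(f"{name:13s} ~{bits_per_pixel:5.2f} bits per pixel per frame at {BITRATE_MBPS} Mbps {FPS}p")

# Higher resolution at the same bitrate means fewer bits per pixel (heavier compression),
# which is the trade-off against having more pixels to crop, stabilise, or downscale from.
```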
  22. This design doesn't seem to be weather sealed, but theoretically if you have the vents pointed down then it would be difficult for water to get in there. Kind of like how houses with roof tiles keep water out most of the time. You can see on the diagram that the water that hits the outside of the tile would have to flow up the tile, then between them through that zigzag pathway to get over the top lip of the tile and into your ceiling. This is why tiled roofs will often leak if there's a strong wind coming from one direction because it will push the water up the tile instead of allowing it to flow downwards. If you have a camera with a convoluted ventilation pathway then you could keep most of the water out, even if there was a small amount of splashing upwards, however if you then put the wet camera down without keeping it upright then all the water clinging to the bottom would find its way inside, which is why they probably don't claim any waterproofing spec on electronics like this. .... just so you know
  23. Promoting the competition.... I would much rather have heard it was yours
  24. Controller app for Android... @BTM_Pix is this yours?
  25. One of the ways I want to use my X3000 is for time lapses - things like sunrises and sunsets, but also busy locations with crowds, and things like family dinners while on holiday. Doing a bit of research around supports got me these ideas. Manfrotto Pocket series: I've got the larger model and use it even on my GH5. The smaller one looks intriguing for nano setups, but for the X3000 it's oriented in the wrong direction, so you wouldn't be able to angle it up or down that much without turning it 90 degrees to the camera, at which point I'm not sure how balanced it would be. There's also the Joby Micro series. However, I'm wondering if there's a DIY opportunity based upon something like the Platypod or the pocket tripod. Not too sure which way I'll go, but I want something that I can angle the camera up or down for compositions, and probably a little bit sideways to level it out.