Everything posted by kye
-
Nice collection - there are some in there I'm not too familiar with. If you feel inclined, I'm sure lots of people would appreciate seeing a video with some footage from each of them so people can make a bit of a comparison. Extra useful if they're in the same lighting conditions. If you're still in Melbourne then I'm sure an overcast or rainy day would have pretty consistent lighting; maybe you can get shots of Melbournites celebrating the climate of their home city by madly dashing through a downpour trying to avoid puddles.

I've got two Helios 58mm f2 lenses and the Mir 37mm f2.8 and found them to be quite different in character. The Helios is pin sharp in the centre and falls away significantly, even on MFT where you're only looking at the centre of the lens, whereas the Mir renders quite differently. They also all change character from wide open to stopped down a couple of stops, so there are lots of variables going on. I was particularly excited to get the Mir because it is supposed to be apochromatic, and to its credit that did give it a different rendering, but I ended up going a different direction and chose the Konica Hexanon 40mm f1.8, which is not only faster but is also considered one of the sharpest lenses ever made (if not the sharpest). I have no idea if that's true, but it's not that typical vintage look!

I remember that video of the busker from when you posted it previously - it's nice work and the IQ is very nice. I think many, many people would be happy with the combination of the resolution and beefy codecs of the P4K through vintage glass that takes that modern sheen off it.
-
The new Mac Pro + 6k monitor has landed - introducing the Cheese Graters
kye replied to Trek of Joy's topic in Cameras
In a sense I'd agree, but the bigger picture is a bit more complicated. Over time computers have been getting smaller and cheaper. Early computers were custom built, cost millions of dollars, and took up entire floors of buildings. They ran on valves (tubes for you US folk), needed huge amounts of power, generated enormous heat, and were subject to all kinds of issues like valves burning out, dust making faulty connections, and insects getting in (where the phrase "a bug in the code" comes from). Then, with the microprocessor revolution, desktops appeared and eventually became cheap enough for domestic use. There were early portable computers around that required mains power but had integrated CRT monitors and keyboards, which were the first step towards portable computers. Laptops became feasible when battery technology and LCD displays reached sufficient maturity and low enough cost. Smartphones appeared when touchscreens and miniaturisation technologies matured.

You can look at it like this, and people do, but there are some wrinkles in here. For example, laptops are basically mobile phones with a huge screen, huge batteries, and a keyboard. In many cases the hardware involved is the same, and phones can easily be made to run the same operating systems. Those with laptops who (like me) use them on the go but then 'dock' them to a larger monitor, storage, and potentially eGPU hardware etc, are essentially bridging the gap between laptops and desktop computers. The only limitations are the comms standards (like USB 3.1 or Thunderbolt 3) and software support. Smartphones have this potential as well - plug in your phone at home and run Resolve on it, for example. We already turn our phones into laptops quite commonly.

So if, in the current marketplace, laptops are sort-of desktops and phones are sort-of laptops, what is the future of the desktop? Is it that desktops will be sort-of super-computers? If so, why shouldn't they be designed to be far more powerful than laptops, and priced accordingly? In a world where things are sort-of the next size up, it sort-of makes sense -
You're welcome!! I'm assuming they're all Russian lenses? I think I can spot the Helios 44 and Mir 37 in there, but the rest aren't obvious to me. Confused and paranoid man with camera runs around at the seaside and accidentally becomes the saviour... talk about trotting out the old clichés!!
-
Did you figure it out? I have no idea, and if Jim couldn't help then it might be a strange bug of some sort, which means the normal troubleshooting steps, like copying every clip to a new timeline, or a new project, etc. Alternatively, you could set the in and out points in the Deliver page to a single frame and then export using an image-sequence codec to get a single image that way. It's a bit of a PITA though!
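If exporting the clip normally still works, another workaround (going outside Resolve entirely) is to pull the frame from the exported file with ffmpeg. A rough Python sketch, assuming ffmpeg is installed and on the PATH - the paths and timecode here are just placeholders:

```python
import subprocess

# Hypothetical example: grab a single frame from an already-exported clip.
# "-ss" seeks to the wanted time, "-frames:v 1" writes exactly one frame.
subprocess.run(
    [
        "ffmpeg",
        "-ss", "00:01:23.500",      # timecode of the frame you want (placeholder)
        "-i", "exported_clip.mov",  # clip exported from the Deliver page (placeholder)
        "-frames:v", "1",
        "still_frame.png",
    ],
    check=True,
)
```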
-
@heart0less Have you checked the Deliver page? There are some options (maybe under Advanced?) about using optimised or proxy media during the export - maybe these have accidentally been enabled?
-
Great stuff, keep us informed. PS, I like your writing style, direct with lots of info in few words. Keep it up!
-
My finger grip for the X3000 arrived yesterday; it is a very interesting and well-designed accessory. The unfortunate aspect is that it completely obscures the rear door, prohibiting the use of an external microphone. I've examined it with the intent to drill a hole in order to get a 3.5mm mic jack through the rig and into the camera, but it doesn't look like that is too feasible. I will do some testing with the built-in mics and see what I think; my uses for this camera are mostly without dialogue, so ambient (non-directional) audio is probably fine. I think this might be one of the most interesting rigs from a "nano-rig" perspective, and it might also be why there are hardly any third-party rigs for this thing - it covers the tiny-rig category, and if you are fine with larger rigs then the 1/4-20 mount provides you with all the options you would need.
-
The new Mac Pro + 6k monitor has landed - introducing the Cheese Graters
kye replied to Trek of Joy's topic in Cameras
There's an economic theory that basically says that 10% of your customers can afford to pay 10x the price for an item if it gives them something they can't get elsewhere. That means (for example) that if Apple are selling 18M computers per annum (link) then there are roughly 1.8M people who would buy a computer that cost 10x what the Mac costs. If a family car costs $25k, think of how many people buy $250k or even $2.5M cars. And if the R&D to make a super-charged version costs about the same, you will still recoup it, because you're selling the same overall dollar amount of them per year (10% of 10x is 1). You guys might be surprised they're pricing a computer so high, but I'm surprised that they don't make computers that cost $40k and $400k.
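To make the arithmetic explicit, here's a quick back-of-the-envelope sketch using the figures above - the base price is an illustrative assumption, not real sales data:

```python
# Back-of-the-envelope: if 10% of buyers will pay 10x the price,
# the premium tier brings in the same revenue as the base tier.
base_units = 18_000_000            # approx. annual Mac sales from the post
base_price = 2_500                 # illustrative base price in dollars (assumption)

premium_units = base_units * 0.10  # 1.8M buyers
premium_price = base_price * 10    # 10x the price

print(base_units * base_price)         # 45,000,000,000
print(premium_units * premium_price)   # 45,000,000,000 -- the same revenue
```
-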
Help The Camera Manufacturers Make Better Cameras: Part 2
kye replied to Andrew Reid's topic in Cameras
Absolutely. I also think that people don't know they want something until it is invented or becomes feasible. For example, when 1080p was released no-one turned around and said "I want a camera that can record four times the resolution of FullHD, I want it to be the size of a match-box, have a screen on the back and also one on the front so I can film myself and see if I'm in frame, and I want it to do magic so that it looks like a professional steadicam operator is holding it, and I absolutely need all these features and will accept nothing less because I'm going to be using it to make my own TV show starring me and I'll get paid by doing advertisements for brands that reach out to me, not me to them". Now it exists and people are criticising it because the bitrate isn't high enough. -
Help The Camera Manufacturers Make Better Cameras: Part 2
kye replied to Andrew Reid's topic in Cameras
Another data point, slightly OT here, is that I spent 5 weeks in Italy last year and almost every camera I saw from tourists and holidaymakers was a Nikon super-zoom fixed lens model of some kind. I saw a few Nikon DSLRs and a few Canon DSLRs, and of course a zillion smartphones. What I really took from that was that there are pockets of users and they can be quite insulated from each other. -
That makes total sense, and I can see that it would result in lots of shots where people are running towards you, providing the best opportunities for shots. I'd have to weigh the better footage against the fact that I normally sit with my wife and watch the game together while I shoot, plus the fact that it would make my son all the more aware of me filming him play. Life was never meant to be easy I guess!
-
@joema thanks for sharing your methods and workflows. In a sense I'm thankful that I don't work with the footage quantities that doco folks work in - a shooting ratio of 600:1 is pretty killer! Plus, the quality bar I have for my projects is exponentially lower, so there is that.

Part of the challenge I see is for NLE designers to provide methods for organising and marking up raw footage that are flexible and scalable enough to meet the challenge of the project. The smaller and simpler the project, the less comprehensive the tools need to be for doing it all within the NLE. I also see there being a kind of scalability factor: below a certain size the built-in functions of the NLE will work, and above a certain size doing things outside the NLE, and even across multiple external tools, is an acceptable overhead for the size of the project and the capability and capacity of the team of editors. However, I wonder if there is a middle ground where you have too much complexity for the NLE's integrated tools but the editing team can't afford the overhead of external tools and the additional admin this creates. This gap may not exist in FCPX or PP (I've heard that they're mature and well-featured editing packages) but I suspect this might be where Resolve is a bit lacking. I'm not in the territory of this gap, I'm just kind of thinking out loud here.

My situation (and my original call for comments) was that my latest project outstripped my previous technique of just doing a series of passes over the footage until I had an end product, so I had to dip into some of the clip / timeline management tools available. In future I'm contemplating doing highlight reels that cut across many years of footage and may end up including footage that didn't make it to a final edit of a previous project, so in a sense I'm kind of looking to a possible future project where my many hundreds of hours of footage are the source media. Also, understanding a larger challenge normally helps when you're facing a smaller one, so there's both a longer and a shorter-term benefit of this discussion for me.
-
Help The Camera Manufacturers Make Better Cameras: Part 2
kye replied to Andrew Reid's topic in Cameras
1000 nit screens would be great, but if not possible / practical then a decent EVF can also work almost as well. -
Help The Camera Manufacturers Make Better Cameras: Part 2
kye replied to Andrew Reid's topic in Cameras
I voted for Internal H264/H265, but I did get a bit torn between that and internal RAW. My rationale is that I want the best quality I can get, but with a limited bitrate for practical purposes like ease of editing and storage. I use the GH5 with UHS-I cards, so I'm limited to the 150Mbps h264 4K mode or the 200Mbps h265 6K codec, and whilst I'd love to shoot the 6K mode all the time, saying it's a PITA for editing is possibly the understatement of the century, so for the moment I've gone back to the 4K mode because the h264 is so much nicer. If the manufacturers started offering internal RAW then I'd be very, very interested in codecs like BRAW 12:1 (46MBps) or maybe 8:1 (68MBps) for short projects or hero shots, but every ProRes flavour except Proxy (22.4MBps) and, at a stretch, LT (51MBps) is beyond practical for me.

I also voted for 6K resolution. I'm currently publishing at "1080 quality", which means that I'll probably export to YT in UHD, but I don't sweat it if there's a shot that is only 1080 quality, because either I've cropped into a 4K shot, or it was a 4k50 150Mbps LongGOP shot conformed to 25p and possibly slightly cropped and/or stabilised in post. Of course, I'd love for every shot to look like it was shot natively in the export resolution in full uncompressed RAW, but that's not my expectation. In this sense, taking into consideration my ~500Mbps bitrate limit, it's a toss-up between modes like:
- a very high bitrate 1080 HFR mode, which would have less resolution to crop into and stabilise, but also less compression, better chroma subsampling, and higher bit depth
- a 4K "middle ground"
- a 6K oversampling approach, which would have the most resolution but the most compression and the least data-per-pixel as-shot, yet when downscaled to 4K may give a better result, especially considering that stabilisation is only as good as its tracking accuracy, which is a function of resolution

When 1080 and 720 came out I did some tests comparing what resolution gives the nicest output for a given completed file size. When you compress a video file to 1080, 720, and SD so that all have the same output file size, the file with the highest resolution is easily the best quality. This quality test assumed the same codec, and all files were quite compressed, so it may not be directly applicable to a compressed vs RAW comparison, but it suggests that such a comparison might be worthwhile. Given this logic, the h264/h265 6K or 8K file may have advantages over a lower-resolution but much-higher-bitrate file.

I'd be open to both options, especially when you consider that a 4K sensor doesn't give you full colour sampling at 4K, whereas an 8K sensor downsampled to 4K can. This then leads into the question of whether you should downsample from 8K to 4K in-camera before compression, or downsample in post after compression. I suspect there are advantages to each.
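To illustrate the data-per-pixel point, here's a rough sketch of how a fixed recording bitrate spreads across different capture resolutions. It ignores codec efficiency, chroma subsampling and GOP structure entirely, and the 6K frame size is an assumption for illustration - it's only meant to show the trade-off in bit budget per pixel:

```python
# Rough data-per-pixel comparison at a fixed recording bitrate.
# This says nothing about codec efficiency; it only shows how the same
# bit budget spreads over more pixels as resolution rises.
BITRATE_BPS = 200_000_000   # e.g. a 200Mbps recording mode
FPS = 25

modes = {
    "1080p":        (1920, 1080),
    "4K UHD":       (3840, 2160),
    "6K (approx.)": (4992, 3744),   # assumed frame size for illustration
}

for name, (w, h) in modes.items():
    bits_per_frame = BITRATE_BPS / FPS
    bits_per_pixel = bits_per_frame / (w * h)
    print(f"{name}: {bits_per_pixel:.2f} bits/pixel per frame")
```
-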
This design doesn't seem to be weather sealed, but theoretically, if you have the vents pointed down, it would be difficult for water to get in there. Kind of like how houses with roof tiles keep water out most of the time. If you look at a cross-section of roof tiles, water that hits the outside of a tile would have to flow up the tile, then between the tiles through that zigzag pathway, to get over the top lip of the tile and into your ceiling. This is why tiled roofs will often leak if there's a strong wind coming from one direction, because it pushes the water up the tile instead of allowing it to flow downwards.

If a camera has a convoluted ventilation pathway then it could keep most of the water out, even if there was a small amount of splashing upwards. However, if you then put the wet camera down without keeping it upright, all the water clinging to the bottom would find its way inside, which is probably why they don't claim any waterproofing spec on electronics like this... just so you know.
-
Promoting the competition.... I would much rather have heard it was yours
-
Controller app for android.. @BTM_Pix is this yours?
-
One of the ways I want to use my X3000 is for time lapses - things like sunrises and sunsets, but also busy locations with crowds, and things like family dinners while on holiday. Doing a bit of research around supports got me these ideas.

Manfrotto Pocket series: I've got the larger model and use it even on my GH5. The smaller one looks intriguing for nano setups, but for the X3000 it's oriented in the wrong direction, so you wouldn't be able to angle it up or down that much without turning it 90 degrees to the camera, at which point I'm not sure how balanced it would be.

There's also the Joby Micro series. However, I'm wondering if there's a DIY opportunity based upon something like the Platypod or the pocket tripod.

Not too sure which way I'll go, but I want something that lets me angle the camera up or down for compositions, and probably a little bit sideways to level it out.
-
Interesting poll. I chose IBIS because I shoot hand-held, size and weight because I have to carry it all day and shoot in places that consumers are welcome but pros aren't. In terms of the top options (as they are now), I shoot GH5 and so I'm probably quite spoiled, but I don't feel I need more DR, the codecs I have are just fine for my needs, I have great IBIS, I don't care about NDs, price is good already, and I use manual focus anyway as I like the aesthetic. Great low-light would have been a good option to add to the poll, I think people care about that to some extent.
-
I agree. I bought the 18-35/1.8 for my APS-C camera on the promise that it is like having 28mm f2.8, 35mm f2.8, and 56mm f2.8 equivalent primes, all without having to change lenses, and it absolutely delivered in that sense. When combined with Magic Lantern and its 3x crop mode, it turned into a 28-168mm equivalent zoom, which would be the perfect walk-around lens. I only went a different way because of problems with ML and the fact the setup had no stabilisation and I shoot hand-held. This lens is the same principle, but with the IBIS in the GH5. Had this lens been around when I swapped to the GH5 from Canon, I might have just bought it and never looked back. Although I wonder how heavy it will be... the 18-35 1.8 is prohibitively heavy for a hand-held walk-around lens.
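For anyone wanting to check the equivalence maths, here's a small sketch - the crop factors are the usual nominal approximations, and the 3x figure is the Magic Lantern crop mode mentioned above:

```python
# Full-frame equivalent focal length and aperture for a lens on a crop sensor.
# Nominal crop factors: Canon APS-C ~1.6x, MFT 2.0x (approximations).
def ff_equivalent(focal_mm: float, f_number: float, crop: float) -> tuple:
    return focal_mm * crop, f_number * crop

# Sigma 18-35mm f/1.8 on Canon APS-C:
print(ff_equivalent(18, 1.8, 1.6))       # ~28.8mm f/2.88 equivalent
print(ff_equivalent(35, 1.8, 1.6))       # ~56mm f/2.88 equivalent

# With a further 3x crop mode at the long end:
print(ff_equivalent(35, 1.8, 1.6 * 3))   # ~168mm of equivalent reach
```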
-
Wow... that extra 100mm sure adds a huge amount to the overall length! And it's strange considering that the longer one is a 3x zoom and the shorter is 5x, which you'd think would be the other way around. Optics doesn't make sense to me a lot of the time! Ha!
-
That's going to instantly be the 18-35mm 1.8 equivalent of the MFT world, you can see it already. I like the fact they went wider at the wide end - 14mm or even 12mm isn't wide enough I think, which is why I have an 8mm prime in my kit; it's either the 2nd or 3rd lens in my bag, depending on the situation, because nothing beats it for those 'wow' wide shots.
-
Indeed it is! TBH I'm quite disappointed. I found his review to be more from the perspective of a YouTuber than a cinematographer. For someone who reviews things like Kinefinity, EVA-1, FS5, C200, Fuji MK cinema lenses, cine primes, etc (and normally quite well and thoroughly, I've thought), he basically just tested the marketing claims of the GoPro and Osmo and gave us a Codec and IQ 101 mini-class. He accepted the premise of the products (as action cameras) instead of ignoring the premise and testing how well they would perform from the point of view of a cinematographer. What was missing was all the stuff that we argue about on here:
- No talk about ecosystems and how they do or don't fit with basically any other equipment (recording time limits, mounts, audio, use on tripods, etc)
- No discussion of post-production except in the context of stabilisation or to examine the DR
- Really not much acknowledgement that anyone would use these cameras as anything other than action cameras

Things we should have learned (some of which I already know) include: that the X3000 has tripod mounts rather than GoPro ones and has a 3.5mm audio-in jack that doesn't require an extra converter; how good the low-light is on any camera other than the GoPro and Osmo; what the 3D flippy camera footage looks like (was it in there at all?); what they might be good for other than stabilising the footage you take with them; etc. Beyond knowing that productions with real budgets use GoPros and other prosumer/consumer cameras, I know very little about why they use them, how they use them, how they hide it, and what things I should be considering. PB knows a lot more about shooting than I do, but he didn't bring anything new to the table on this one, unfortunately.

Speaking of things we didn't learn... "Imagine" is the key word here, because it was notably absent. In fact I would sum up his video as the antithesis of the title - I think it was just another action camera comparison, just like all the others that grabbed as many small form factor cameras as they could, made a thumbnail, introduced the video as a huge camera roundup, then mostly ignored the others, only compared the GoPro to the Osmo, and forgot that Sony, Yi, SJcam, etc even exist.
-
Interesting ideas. I'm familiar with repetition and evolution in music, but how would you do this kind of thing visually? Just thinking out loud here, but at a shot level it would be kind of hard to create in the edit, although you could have little patterns in sequences, like a wide-medium-tight at the beginning of each scene perhaps, or a certain rhythm in the cuts. I've heard that in film school they talk a lot about layers, so you have these recurring themes that don't directly influence the plot but kind of weave the footage together in an aesthetic sense - like colour schemes, or set design, or language, etc. I guess the answer is probably all of these things.

In a sense I have developed my own visual language, with a title sequence that is similar for all videos, and I tend to do straight cuts when I'm cutting within a location or event and use dissolves when I'm transitioning from one location or event to another. I have noticed that other travel films have used a quick fade-to-black for scene changes, which gives a different feel. I'm still working out how best to get from one location to another, whether it should be travel b-roll or not, etc, but it is a repetition of sorts.

I mostly just edit down the events I film in the order they happened, and I know that docos are often highly chopped up. It made sense to me how you could have a three-act structure (or other structure) in a drama, and also in a doco, but I couldn't see a link between my sequential editing and these structures. Now I think I'm starting to understand how I could subtly weave such things into my edits. It wouldn't be obvious, and I already have a kind of emotional arc by using the shape of the music I include, but I can see that you could push things further with subtle things like edit timings, or grading changes that make the film ebb and flow a bit more. These techniques might be the way to give my films an ending without deviating from the chronology of how things actually occurred, which I'm keen to stick to, at least for the moment. Definitely something to contemplate!
-
Ah, I thought you were saying to use it on the new FF Panny, and I'm thinking "ummm........" ???