Everything posted by BTM_Pix

  1. I remember how pedantic the replies to your last trivia question got so I'm going to get an early bid in by saying Metabones.
  2. To be honest mate, he was more than happy to set his gang of rabid little shits off on you over the R5 overheating issue until he belatedly realised he had backed the wrong horse. That was grade A shithouse behaviour.
  3. If it's for travelling and you've got an iPad then I'd be tempted to use Sidecar to use it as an external monitor. It will fit the need for something smaller than a 14 or 15" external type, they are decent screens and of course it will be useful for other things when you aren't using it as an editing monitor. Saves you taking the BM box with you as well. If you prefer to monitor using the screen on the MacBook then you can also flip roles and use the iPad for the UI. Another bonus is that you can put LumaFusion on it to do some assembly editing on the hoof on the plane, train, coach etc and then use its XML export function to carry on in Resolve when you get somewhere more fixed to finish it off. As it can now edit directly from external drives, it's a very viable workflow.
  4. Yes, if you are using electronic EF lenses then there are no additional requirements. If you are using manual focus lenses then of course you need to add Nucleus-N or M motors. As above, the current production run has sold out and there will be another run when the component cost/supply chain issues have settled down, so there is no ETA at the moment. If you have contacted or do contact cdatek regarding future availability then you will automatically receive an email when they are available.
  5. Great, I'll be humming that all night now.
  6. Sigma have quietly-ish released quite a significant update for the FP-L https://www.sigma-imaging-uk.com/major-firmware-update-for-the-sigma-fp-l-ver-2-00/ The big headline is the inclusion of false colour and I'm curious to know whether this can go some way to address the, erm, "quirky" metering issues when shooting RAW. It also adds the linear/non-linear option for the focus ring on compatible lenses and a smart new option to select which exposure settings you want to carry over when switching from Still to Cine. No word yet on whether that will adopt the new Canon industry standard metric for hybrid cameras of taking 8 seconds to switch between.... The big question, of course, is when this is coming to the regular FP as it seems unlikely Sigma would do a snide differentiation thing. Has anyone here actually got an FP-L?
  7. Yes, funnily enough it was a toss-up between that one and the one in the original post as regards an example of the destination, although it's the steps that I use to get there and whether I branch off somewhere else that remain to be seen. My ultimate end point with this may not be using it myself creatively at all but more ending up having developed tools for others to do it. Incidentally, the posts I've made so far are summary recaps and are about two weeks behind real time so there has been some decent progress on that front already that will be coming up soon.
  8. I still think back to when we put the Contax Zeiss 35-70mm on your GFX100 and how fantastic it looked. In the spirit of that particular trip, I would've attempted to buy it off you even though it was actually mine.
  9. Step 3 - Start your engines... So, a few days on and where are we up to? I'm happy to say that we have moved forward but, obviously, it's a very long journey so the extent of that movement is all relative. I'm happy to see though that @majoraxis has put together a good primer regarding the projectors for when (or if 😉 ) I get that far.

The first step forward was installing Unreal Engine and the first decision there was whether to use the current version 4 (4.27 to be precise) or the early access version of the upcoming version 5. Both versions are free so the choice for me really was to stick with the current v4.27 release version as it's more of a known entity, so the learning resources are more plentiful, which is important whilst I'm just paddling in the shallows. There is also less of a resource requirement in terms of the machine to run it on which, as I'm using a MacBook Pro that is really long in the tooth, is a big factor. Going into this, I am well aware of the limitations of the 1.5GB built-in Intel graphics of my MacBook so there is no point trying to explore what v5 brings to the table as, judging by the hovercraft noises coming from it when I'm running v4.27, I'm guessing running v5 would require a fire extinguisher to be at hand. You can have both versions installed on your machine though so if you have a suitable machine then go for it. Just to briefly touch on why v5 is a big deal, the primary aspects are two elements called Nanite and Lumen which offer huge advances in terms of detail and lighting control as discussed here:

Obviously, v5 will be the ultimate destination but that's some way off yet, particularly considering that a wallet-damaging computer upgrade will be required. As what I'm interested in doing first is applicable to both versions, it can wait anyway and I'll be working in v4.27.

OK, so just circling back to the fundamentals of what the hell Unreal Engine is and using very broad brush strokes to discuss it... (apologies for the baby steps) It started as the engine used to produce a game called Unreal back in the late 90s (the clue was always in the name 😉 ) and was then licensed to other game developers over the subsequent years under various licensing models to its current status where it is free of charge in terms of royalties for products grossing less than $1m. Which means it's definitely completely free for us! You can read more about its history here https://en.wikipedia.org/wiki/Unreal_Engine

Such is its longevity, ubiquity and support of all gaming platforms, it's fairly likely that if you have played any games more graphically challenging than Pong in the past few decades then you have already experienced something created with Unreal Engine. If, like me, you have been playing more recent titles such as those that have had you running around Midgar lashing Materia at anything with a Shinra logo on it then you will have noticed how much more cinematic everything is. There have always been the pre-rendered cut scenes that looked cinematic of course, but now the real time in-game content is also taking on an increasingly cinematic look thanks to the simultaneous advancement in the machines we are playing them on and the tools in Unreal Engine to simulate the aesthetic.

Bringing this down to two very basic elements, Unreal Engine offers you the ability to build and light the set and then provides you with the camera with which to capture it. In real time.
And of course, as this emulates the exact paradigm of real world production, the advancements in those two elements are what have sparked the interest in using it for virtual production. From my point of view, the creation of the set is secondary at this point as I don't have the skills or the time at the moment to be creating the assets but, fortunately, I don't have to as there is plenty of starter content freely available from the Marketplace for Unreal Engine that we can use to get going. For me, this is all about the virtual camera, as this is what we will see the scene through and what we will be looking to match to a real camera, so all the initial work I've been doing is using very simple sets and seeing what I can do in terms of operating the virtual camera. In the parlance of Unreal Engine, this virtual camera is referred to as a Cine Camera Actor and here is an example of how you see it within the editor.

So as you can see in this example, we have the Cine Camera Actor pointing at the figure inside the set that has been created and in the bottom right you can see its generated viewport of the scene based on its current settings. The Cine Camera Actor has all the same elements of a real camera in that you can not only move its position and change the lens focal length, aperture etc but also control its processing elements such as white balance, ISO and shutter speed, and all the changes that you make will be reflected in the generated viewport in real time. So, if we change the focal length to be wider then we will get the matching field of view, if we move the position of the camera we will see a different part of the scene, if we open the aperture we will get a shallower depth of field and so on, exactly as it would with a real camera. How we make those changes in real time as we would with that real camera will be covered in the next enthralling episode 😉
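In the meantime, as a little taster of what's going on under the hood, here's a minimal sketch of poking those same Cine Camera Actor settings from the editor's Python console instead of the Details panel. It assumes the Python Editor Script Plugin is enabled and uses the 4.27-era EditorLevelLibrary; the spawn position, focal length, aperture and focus distance are purely illustrative values, and the processing side (ISO, shutter, white balance) lives in the camera's post-process settings which I've left out to keep it short.

```python
import unreal

# Spawn a Cine Camera Actor in the current level.
# (Assumes the Python Editor Script Plugin is enabled; all values are illustrative.)
cam_actor = unreal.EditorLevelLibrary.spawn_actor_from_class(
    unreal.CineCameraActor,
    unreal.Vector(0.0, -400.0, 170.0),  # location in cm
    unreal.Rotator(0.0, 0.0, 90.0)      # roll, pitch, yaw in degrees
)

# Grab the component that holds the "real camera" style settings.
cam = cam_actor.get_component_by_class(unreal.CineCameraComponent)

# Lens: wider focal length = wider field of view, lower f-stop = shallower depth of field.
cam.set_current_focal_length(24.0)  # mm
cam.set_current_aperture(2.8)       # f-stop

# Focus: manual focus distance, measured in cm from the camera.
focus = cam.get_editor_property("focus_settings")
focus.set_editor_property("manual_focus_distance", 400.0)
cam.set_editor_property("focus_settings", focus)
```

The camera's generated viewport updates as each of those calls lands, which is exactly the same behaviour as dragging the sliders in the Details panel.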
  10. There is zero movement (aside from camera shake) in those shots so anything motion-wise would be pointless to be honest. I'll do something a bit more suitable with a few of the CZ lenses I've got and put it up in a few weeks.
  11. Yes, there are a number of possibilities to do it differently with integrations of different bits and pieces. The advantage of the Vive for most people is that it is an off-the-shelf solution. Some of us have shelves with different things on though 😉 Not with my ancient steam-powered Samsung it wouldn't! But yeah, this is what I was referring to about how much shit you need to cobble together to reach the start line whereas the new generations of smartphones have all of the components built in. Including the background removal so you don't have to use chroma key. Not sure about the depth of support for it in Unity. Or, more importantly, the ratio of people that would be using it for VP versus those using Unreal Engine, which means less support material and user experience to help the journey along. Fuck that. This entire project is just a thinly veiled excuse to sneak a 4K ultra short throw laser projector into the house under the premise of essential R&D. Until then, it'll be one of those pop-up green screens that are more difficult to get back in their enclosure than a ship in a bottle!
  12. In their promo video at the cued-up point here, they dedicate a whole section of it to the "hey, switch to stills....hey, let's do video...hey, back to stills". They make it appear to be seamless and pretty much instant when the reality is they should have shown this between each switch. So whether you need it or not isn't the point for me, more that there's no need for such blatantly misleading bollocks from Canon in their promo video. Mind you, it's probably fixable in firmware as we know what they are like when it comes to timers, don't we?
  13. Yeah, I have seen some people looking to integrate them and similar (like the now sadly discontinued Intel RealSense cameras) and they do have Unreal Engine integration. At $499 for the new Zed 2i they work out a fair bit cheaper than a start-from-scratch Vive system. For now the conservative option is the Vive because so many people use it that it's easier to find resources for it, so I'll see where I'm at when the time comes as they will likely be easier for me to integrate. The lens control aspect is the target product development project for me to emerge with from doing this journey. Well, the first one anyway. The idea of having the enormous OLED walls is obviously a pipe dream but I guess the peak at this level for me would be to use one of the 4K Ultra Short Throw Laser projectors like the LG CineBeam etc. Sat off the wall at 38cm they'll give a 2.2 metre screen size and go up to just under 4 metres at about 50cm. Price is roughly €2-2.5K for that type of projector (plus screen) so I'd have to be very keen on this to get one, but in an era when no one is batting much of an eyelid at €6K hybrid cameras then it's not really that eye-watering a sum to invest. Particularly as, if it all turns to shit, you can always have fantastic movie nights with it in even the smallest of spaces! Until then, I'll have a mess about with my current little projector to experiment with. At this point, not to mention for many points afterwards, I'm more concerned with the functional aspect rather than any semblance of quality!
  14. Step 2 - Holy Kit OK, so we've established that there is no way we can contemplate the financial, physical and personnel resources available to The Mandalorian but we do need to look at just how much kit we need and how much it will cost to mimic the concept. From the front of our eyeball to the end of the background, we are going to need the following:

    1. Camera
    2. Lens
    3. Tracking system for both camera and lens
    4. Video capture device
    5. Computer
    6. Software
    7. Video output device
    8. Screen

No problem with 1 and 2 obviously but things start to get more challenging from step 3 onwards. In terms of the tracking system, we need to be able to synchronise the position of our physical camera in the real world to the synthetic camera inside the virtual 3D set (there's a rough sketch of that link at the end of this post). There are numerous ways to do this but at our level we are looking at re-purposing the Vive Tracker units mounted on top of the camera. With the aid of a couple of base station units, this will transmit the physical position of the camera to the computer and, whilst there may be some scope to look at different options, this is a proven solution that initially at least will be the one for me to look at getting. Price wise, it is €600 for a tracker and two base stations so I won't be getting the credit card out for these until I've got the initial setup working, so in the initial stages it will be a locked-off camera only!

For the lens control, there also needs to be synchronisation between the focus and zoom position of the physical lens on the physical camera with that of the virtual lens on the virtual camera. Again, the current norm is to use additional Vive trackers attached to follow focus wheels and translate the movements to focus and zoom positions, so that will require an additional two of those at €200 each. Of course, these are also definitely in the "to be added at some point" category, as is the camera tracker. The other option here is lens encoders, which attach to the lens itself and transmit the physical position of the lens to the computer, which then uses a calibration process to map these to distance or focal length, but these are roughly €600 each so, well, not happening initially either.

Moving on to video capture, that is something that can range from the €10 HDMI capture dongles up to BM's high end cards but in the first instance, where we are just exploring the concepts, I'll use what I have around. The computer is the thing that is giving me a fit of the vapours. My current late 2015 MacBook Pro isn't going to cut it in terms of running this stuff full bore and I'm looking to get one of the new fancy dan M1 Max ones anyway (just not till the mid part of the year) but I'm not sure if the support is there yet for the software. Again, just to get it going, I am going to see if I can run it with what I have for proof of concept and then defer the decision until later regarding whether I'm going to take it far enough to hasten the upgrade. What I'm most worried about is the support to fully utilise the new MacBooks not being there and having to get a Windows machine. Eek! At least the software part is sorted as obviously I'll be using Unreal Engine to do everything and, with it being free (for this application at least), there is no way of going cheaper!

The video output and screen aspect are ones that, again, can be deferred until or if such time as I want to become more serious than just tinkering around.
At the serious end, the output will need an output card such as one of BM's and a display device such as a projector so that live in-camera compositing can be done. At the conceptual end, the output can just be the computer screen for monitoring plus a green screen, saving the compositing for later. To be honest, even if I can get my addled old brain to get things advanced to the stage that I want to use it, it's unlikely that the budget would stretch to the full-on in-camera compositing so it will be green screen all the way! OK, so the next step is to install Unreal Engine and see what I can get going with what I have to hand before putting any money into what might be a dead end/five minute wonder for me.
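Just to make the tracking part above a bit more concrete, here's a very rough sketch of what the glue between the tracker and the virtual camera boils down to: every frame, take the latest physical camera pose and copy it onto the Cine Camera Actor so the two stay locked together. The get_latest_tracker_pose() function below is a made-up stand-in for whatever the Vive/SteamVR bridge actually provides (it is not a real API), and in a real setup this would run continuously inside Unreal via something like Live Link rather than as a one-off editor script, so treat it purely as an illustration of the concept.

```python
import unreal

def get_latest_tracker_pose():
    """Placeholder for the tracker feed (e.g. a Vive Tracker on the camera).
    Returns (location_cm, rotation_deg) already converted into Unreal's
    coordinate space. NOT a real API - just a stand-in for the bridge."""
    return unreal.Vector(120.0, -350.0, 160.0), unreal.Rotator(0.0, -2.0, 85.0)

# Find the Cine Camera Actor that is already placed in the level.
actors = unreal.EditorLevelLibrary.get_all_level_actors()
cam_actor = next(a for a in actors if isinstance(a, unreal.CineCameraActor))

# One "frame" of the sync: copy the physical pose onto the virtual camera.
location, rotation = get_latest_tracker_pose()
cam_actor.set_actor_location_and_rotation(location, rotation, False, False)
```

The lens encoder side is the same idea, except the values being copied across each frame are focus distance and focal length rather than position and rotation.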
  15. Step 1 - The Pros And Not Pros So, to look at sort of what we want to end up with conceptually, this is the most obviously well-discussed example of VP at the moment. Fundamentally, it's about seamlessly blending real foreground people/objects with synthetic backgrounds and it's a new-ish take on the (very) old Hollywood staple of rear projection as used with varying degrees of success in the past. The difference now is that productions like The Mandalorian are using dynamically generated backgrounds created in real time to synchronise with the camera movement of the real camera that is filming the foreground content. Even just for a simple background replacement like in the driving examples from the old films, the new technique has huge advantages in terms of being able to sell the effect through changeable lighting and optical effect simulation on the background etc. Beyond that, though, the scope for the generation and re-configuring of complex background sets in real time is pretty amazing for creating series such as The Mandalorian.

OK, so this is all well and good for a series that has a budget of $8.5m an episode, shot in an aircraft hangar sized building with more OLED screens than a Samsung production run and a small army of people working on it, but what's in it for the rest of us? Well, we are now at the point where it scales down and, whilst not exactly free, it's becoming far more achievable at a lower budget level as we can see in this example. And it is that sort of level that I initially want to head towards and possibly spot some opportunities to add some more enhancements of my own in terms of camera integration. I'm pretty much going from a standing start with this stuff so there won't be anything exactly revolutionary in this journey as I'll largely be leaning pretty heavily on the findings of the many people who are way further down the track. In tomorrow's thrilling instalment, I'll be getting a rough kit list together and then having a long lie down after I've totted up the cost of it and have to re-calibrate my "at a budget level" definition.
  16. Creating emulations of different lens profiles could well become a thing.
  17. You and me both. Like most things these days, the amount of stuff and graft required to get systems set up with "real" cameras just to get to the same starting point in terms of real-time compositing etc as some phone applications is quite mad! The interesting thing is how it may impact the role/expense of the camera and lenses in such productions as, from a practical though simplistic point of view, the onus just becomes achieving a sharp capture of the live element, as all of the background elements, where the character/mojo/hype of real cameras and lenses kicks in, will be synthetically created. It offers the unwanted potential for some horribly synthetic "portrait mode" stuff like some smartphones of course. But with much more expensive cameras and lenses. And tons more computing power required!
  18. They say "A journey of a thousand miles begins with a single step" and that describes exactly where I am with Virtual Production. The reason for setting off on this journey? Curiosity in the process is one aspect - and is something I'm hoping will sustain me when the expected big boulders block the route - but I also think VP is going to be a process that trickles down to lower budget projects faster than people might expect so it is something I'm keen to get involved in.

What aspects of VP am I looking to learn about and get involved with? In the first instance, it will be about simple static background replacement compositing but will move on to more "synthetic set" 3D territory, both for narrative work and multi-camera for virtual studio applications. There will also be a side aspect of creating 3D assets for the sets but that won't be happening until some comfort level with how to actually use them is achieved!

And the ultimate objective? I'm not expecting to be producing The Mandalorian in my spare bedroom but I am hoping to end up in a position where a very low-fi version of the techniques used in it can be seen to be viable. I'm also hopeful that I can get a two camera virtual broadcast studio setup working for my upcoming YouTube crossover channel Shockface, which is a vlog about a gangster who sells illicit thumbnails to haplessly addicted algorithm junkies. I'm fully expecting this journey to make regular stops at "Bewilderment", "Fuck Up", "Disillusionment" and "Poverty" but if you'd like to be an observer then keep an eye on this thread as it unfolds. Or more likely hurtles off the rails. Oh and from my initial research about the current state of play, there may well be a product or two that I develop along the way to use with it too...
  19. I think this is a neat idea. Anything that encourages more viewfinder gazing than navel gazing is a win for me at the moment. I'm in.
  20. Kotaro became quite the celebrity after his sax playing exploits.
  21. He was absolutely desperate to get a grip of that protein shake, wasn't he?