Gianluca Posted March 6, 2022

Hello everyone. I preferred to open a new topic rather than keep overloading "My Journey to Virtual Production". Over carnival I was able to do some more tests with Unreal, and this time I recorded my subject against a yellow wall. As you can see, under controlled working conditions the results can be really good. Obviously there is still a lot of room for improvement: I have to synchronize the two video tracks by recording a common audio track, and I have to balance the gimbal better (I have a Crane-M, and with the phone mounted I exceed the payload it can support, so it vibrates a lot). But apart from that, if I had an actress who did something sensible instead of filming all the time 🙂 I might at this point think I could shoot something decent. What do you think? Opinions, advice?
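For the audio sync, one possible approach is to cross-correlate the scratch audio from the two recordings to find the offset automatically. A minimal Python sketch, assuming both clips have usable audio extracted to WAV at the same sample rate and that numpy, scipy, and soundfile are installed (the filenames are placeholders):

```python
import numpy as np
import soundfile as sf
from scipy.signal import correlate

# Load the scratch audio from both recordings.
a, sr_a = sf.read("phone_clip.wav")
b, sr_b = sf.read("camera_clip.wav")
assert sr_a == sr_b, "resample one track first if the rates differ"

# Mix down to mono if needed.
if a.ndim > 1:
    a = a.mean(axis=1)
if b.ndim > 1:
    b = b.mean(axis=1)

# Cross-correlate: the peak gives the lag between the two tracks.
corr = correlate(a, b, mode="full")
lag_samples = np.argmax(corr) - (len(b) - 1)
offset_seconds = lag_samples / sr_a
print(f"offset between the clips: {offset_seconds:+.3f} s")
```

The sign of the offset tells you which clip to slide in the timeline; a claps-style sync mark at the start of each take makes the correlation peak much sharper.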
KnightsFan Posted March 7, 2022

The matte is pretty good! Is this the repo you are using? You mentioned RVM in the other topic. https://github.com/PeterL1n/RobustVideoMatting

Tracking of course needs some work. How are you currently tracking your camera? Is this all done in real time, or are you compositing after the fact? I assume you are compositing later, since you mention syncing tracks by audio. If I were you, I would ditch the crane if you're over the weight limit: get some wide camera handles and make slow, deliberate movements, and mount proper tracking devices on top instead of a phone, if that's what you're using now.

Of course, the downside of this approach compared to the projected background we discussed in the other topic is that a projected background lets you merge lighting more easily, and with this approach you need to synchronize a LOT more settings between your virtual and real camera. With a projected background you only need to worry about focus; with this approach you need to match exposure, focus, zoom, noise pattern, color response, and on and on. It's all work that can be done, but it makes the whole process very tedious to me.
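For anyone following along, the RobustVideoMatting repo can be driven from a few lines of Python via torch hub. This follows the project's documented quick-start, though the filenames here are placeholders and a CUDA GPU is assumed:

```python
import torch

# Load the pretrained matting model and the video converter helper from torch hub.
model = torch.hub.load("PeterL1n/RobustVideoMatting", "mobilenetv3").cuda()
convert_video = torch.hub.load("PeterL1n/RobustVideoMatting", "converter")

convert_video(
    model,
    input_source="subject_clip.mp4",      # the clip shot against the wall
    output_type="video",
    output_composition="composited.mp4",  # subject over a green background
    output_alpha="alpha.mp4",             # the raw matte, for compositing elsewhere
    seq_chunk=12,                         # frames processed per batch
)
```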
Gianluca Posted March 7, 2022

Yes, it's Robust Video Matting. In this video the tracking is bad, but it's my fault: the fps of the two videos don't match. I'm using Virtual Plugin, a smartphone app that feeds camera tracking into Unreal Engine. A projected background is better, but you NEED a studio, a very large studio, with lights and so on. Here I'm in my garden. Now I want to use Blender camera tracking for videos where an exact match is needed.
Gianluca Posted March 9, 2022

Yesterday, after an exhausting day arguing with Unreal Engine, I finally managed to get a video with credible tracking, and the credit goes entirely to a free smartphone application called blendartrack. I had never managed to get good results with Blender's own camera tracking; what's more, if there are moving subjects you have to sit there deleting the track points by hand wherever the subject overlaps them, then align the planes, and so on. With blendartrack all of this is really a walk in the park: from opening Blender to exporting for Unreal takes no more than 15 seconds, truly a liberation. In this example the tracking is, in my opinion, not even at its full potential, because in the rush, with my son wanting to do something else, I forgot to add markers, but it is already very good. What surprised me is that the focal length used by the phone is also exported, and that the Unreal world (or at least the one used in this example) responds perfectly to real-world movements, without my having to scale anything. The only big drawback at the moment is that the video from the phone is very low quality compared to what I can get from the a6300 and its dedicated optics. I have to figure out how to track with the phone's file and then use footage shot with the mirrorless.
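For the Blender-to-Unreal handoff, one way to move the solved camera across is to bake and export just the camera as FBX, which Unreal's sequencer can import. A sketch of that step in Blender's Python API, assuming the solved camera object is named "Camera" (adjust for your scene):

```python
import bpy

# After blendartrack's solve is imported, select only the tracked camera.
cam = bpy.data.objects["Camera"]  # assumed name; adjust to your scene
bpy.ops.object.select_all(action="DESELECT")
cam.select_set(True)
bpy.context.view_layer.objects.active = cam

# Export it alone as FBX with its animation baked, ready for Unreal's sequencer.
bpy.ops.export_scene.fbx(
    filepath="//tracked_camera.fbx",  # "//" saves next to the .blend file
    use_selection=True,               # only the selected camera
    object_types={"CAMERA"},
    bake_anim=True,                   # bake the keyframed camera motion
)
```

Baking the animation on export avoids any dependency on Blender-side constraints or drivers that Unreal would not understand.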
majoraxis Posted March 10, 2022

@Gianluca Wow, the tracking looks so realistic, especially when the camera shakes! I love the last part where the camera twists and it looks like it tracked it perfectly. This is really getting good. Looking forward to seeing how you make this work with your a6300. How do you synchronize the tracking data from the phone (assuming you put it on top of the camera) with the video file created in the camera? I imagine the sync would have to be spot-on to keep that same level of believability...
Gianluca Posted March 10, 2022

Yes, I was very surprised by the last part. I've done some other tests with this sequence in other environments, and the best part for me is that I can change the location by importing another world and my subject stays perfectly aligned with the floor. This will greatly speed up my work when I combine multiple scenes.

Unfortunately I have not found a way to make the phone's file match the mirrorless's perfectly. The problem is probably that the phone records at a variable framerate while the a6300 records at a fixed framerate: even comparing the two files in Resolve, without doing anything, you can see that they have different durations. Maybe I'm doing something wrong, but at the moment I have to settle for the phone's file, which is unfortunately really bad, also because the application records at most in 1080p. Now I'm trying to upscale the video with Python to see if it improves a bit, but even if it doesn't, I have in mind to make a little movie where both the locations and my son have a cartoon effect, so that definition becomes a secondary problem.
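On the variable-framerate problem, one thing worth trying is conforming the phone clip to a constant framerate before tracking or editing, so its clock matches the a6300's. A hedged sketch driving ffmpeg from Python; it assumes ffmpeg is on the PATH, that 25 fps is the target rate, and that the filenames are placeholders:

```python
import subprocess

# Re-time the variable-framerate phone clip to a constant 25 fps so it can be
# cut against the fixed-framerate a6300 footage without drifting.
subprocess.run(
    [
        "ffmpeg",
        "-i", "phone_vfr.mp4",
        "-vsync", "cfr",   # duplicate/drop frames to hit a constant rate
        "-r", "25",        # target framerate; match the a6300's setting
        "-c:a", "copy",    # leave the audio untouched for syncing later
        "phone_cfr.mp4",
    ],
    check=True,
)
```

Keeping the audio stream untouched means the audio-based sync trick from earlier in the thread still works after the conform.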
Gianluca Posted March 15, 2022

I'm adding this video even though, in truth, it doesn't add anything to the discussion; it's just to show the exceptional tracking that blendartrack manages to achieve with embarrassing ease. Everything is then transferred as-is to Unreal with the same camera settings, and this is the result. If there were no problems with the matting algorithm's interpretation (because the file is low resolution, 1080p from the phone, because of the bicycle and so on, and because the background is not homogeneous enough), you could really do anything with minimal effort. Unfortunately I can only use this type of tracking for a few carefully planned scenes. I also tried to digitize my son with RealityCapture, but I couldn't manage it there either; I'll probably have to try again with different camera settings.
Gianluca Posted March 16, 2022

In this case it was more complicated to combine the two videos. You have to orient the direction of travel perfectly, otherwise it looks like you're walking too fast or too slowly. Even getting the aspect ratio right wasn't easy. What do you think?
majoraxis Posted March 17, 2022

@Gianluca That video was really impressive. Looking at the edge of the Python key, there seems to be a transition to the background at the edge, so it is most convincing when the background of the subject video is similar in darkness/lightness and color to the replacement background. Maybe if you lit your subject from one side and kept the background behind/on the other side of the subject darker, then substituted a similarly lit and oriented virtual background, you would have something even more seamless. Lighting your subject in anticipation of the background lighting you plan to use with the Python script should look even better, if you can get the real and virtual background lighting to match closely.

For my application, the goal would be a projected background kept in sync with the camera and lens position, with the replacement background synced in post to the camera movement relative to the projected background; then, using the Python keying script, I could transition between real and virtual with as little giveaway as possible. My goal is to seamlessly transition from a subject shot on a minimal projected set to an infinite virtual set, in a way that stays compatible with traditional lighting and shooting techniques. I'm looking forward to what you come up with next!
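To make the edge behavior concrete: the matte gives an alpha blend, and wherever alpha sits between 0 and 1 (the soft edge) the composite mixes the real and virtual backgrounds, which is exactly why matching their tone hides the seam. A minimal single-frame sketch with OpenCV, assuming all three images share the same resolution (the filenames are hypothetical):

```python
import cv2
import numpy as np

# Composite one frame: foreground over the virtual background using the matte.
fg = cv2.imread("frame_fg.png").astype(np.float32) / 255.0
bg = cv2.imread("frame_bg.png").astype(np.float32) / 255.0
alpha = cv2.imread("frame_alpha.png", cv2.IMREAD_GRAYSCALE).astype(np.float32) / 255.0
alpha = alpha[..., None]  # broadcast the matte over the color channels

# In the soft-edge region (0 < alpha < 1) the real and virtual backgrounds mix,
# so the seam is least visible when the two backgrounds match in tone.
comp = fg * alpha + bg * (1.0 - alpha)
cv2.imwrite("frame_comp.png", (comp * 255).astype(np.uint8))
```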
Gianluca Posted March 26, 2022

I have made some other small but significant progress. I can finally set up Unreal's sequencer in a way that is fairly fast and predictable; before, there was always some step I had missed that made me waste hours just understanding why it behaved the way it did. And I finally understood how to create foreground planes in a way that increases the realism of the scene. This bad example of an animation really helped me understand it, and as always with Unreal, it was not easy at all, at least for me... ;)