Hi there, fellow EOSHD forum members. I wanted to talk a bit about Virtual Production (hope there are a few interested people here). Setting up a VP studio can be quite overwhelming, especially when it comes to choosing a tracking system. There are several options to choose from, so we've made this quick demo showing how to use Antilatency with Unity. You can track cameras, objects and even body parts.
In this case, we used a tracking area attached to the truss system, one tracker for the camera, and two trackers, one for each leg. All tracking data was transmitted to Unity over a proprietary radio protocol and used for real-time rendering. We didn't apply any post-processing or cleanup, only the raw real-time data.
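For anyone curious what the Unity side of this looks like, here's a minimal sketch of applying a tracked camera pose every frame. Note that the IPoseSource interface below is hypothetical and just stands in for whatever pose data the Antilatency SDK exposes; the real integration should follow the SDK's own Unity samples.

```csharp
using UnityEngine;

// Hypothetical pose provider: stands in for whatever the tracking SDK exposes.
// Replace with the actual Antilatency SDK calls from their Unity samples.
public interface IPoseSource
{
    bool TryGetPose(out Vector3 position, out Quaternion rotation);
}

// Applies the latest tracked pose to the camera every frame,
// so the virtual camera in Unity follows the physical one in real time.
[RequireComponent(typeof(Camera))]
public class TrackedCamera : MonoBehaviour
{
    // Assign a component implementing IPoseSource in the Inspector.
    [SerializeField] private MonoBehaviour poseSourceBehaviour;
    private IPoseSource poseSource;

    private void Awake()
    {
        poseSource = poseSourceBehaviour as IPoseSource;
    }

    private void LateUpdate()
    {
        // LateUpdate keeps the camera in sync after all other motion for the frame.
        if (poseSource != null && poseSource.TryGetPose(out var position, out var rotation))
        {
            transform.SetPositionAndRotation(position, rotation);
        }
    }
}
```

The same idea applies to the leg trackers: each tracked object just gets its transform driven by the incoming pose data, with no smoothing or cleanup in between.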
I would love to hear what you and Andrew think of it, both this demo and Virtual Production in general. We're also open to suggestions on what demos we should do next!