Mo-Sys joins HP and NVIDIA at SIGGRAPH 2018 to demonstrate the latest virtual production technology

At SIGGRAPH 2018, Mo-Sys will showcase its new StarTrackerVFX tool at Waskul Entertainment’s StudioXperience broadcast studio and technology space. The patented optical camera tracking system will demonstrate how it brings virtual and real worlds together in real time, reducing the complexity, time and budget constraints associated with virtual production. The StudioXperience space is sponsored by HP and NVIDIA.

Mo-Sys will demonstrate a real-time composite of interviews filmed on a green screen with backgrounds rendered by the Unreal Engine, reducing the need for costly fixes in post. The captured green screen footage and camera tracking data will then feed into The Foundry’s NUKE compositing suite, where VFX will be added in just twenty minutes – completing the full-VFX workflow on the fly.

In a separate demonstration, the StudioXperience stage will also showcase camera tracking from Mo-Sys’ industry-proven StarTrackerTV solution for TV studios and broadcasters, combined with real-time rendering from Zero Density. StarTrackerTV is currently used by over 100 broadcasters and facilities providers including BBC, Sky, ESPN, FOX, CNN, NHK, ZDF and NEP.

Simplifying the workflow

StarTrackerVFX simplifies an otherwise complex process of aligning a moving camera within the virtual environment. Its direct plug-in for the Unreal Engine enables users to film actors in front of a green screen and effortlessly immerse them in photo-realistic environments. The camera tracking data is recorded with a timecode stamp and can be exported as FBX camera data for post-production, re-rendering and visual effects.
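To illustrate why the timecode stamp matters, here is a minimal, hypothetical sketch in Python. It does not use Mo-Sys’ actual export format; it assumes an invented sample layout (timecode, position, pan/tilt/roll at 25 fps) and simply matches each tracking sample to a footage frame by timecode – the kind of alignment that would precede FBX export or compositing.

```python
from dataclasses import dataclass

# Hypothetical illustration only: the real Mo-Sys export format is not
# documented here. Tracking samples are assumed to carry an HH:MM:SS:FF
# timecode, a position and a pan/tilt/roll rotation, recorded at 25 fps.

FPS = 25  # assumed frame rate

@dataclass
class CameraSample:
    timecode: str          # "HH:MM:SS:FF"
    position: tuple        # (x, y, z) in studio space
    rotation: tuple        # (pan, tilt, roll) in degrees

def timecode_to_frame(tc: str, fps: int = FPS) -> int:
    """Convert an HH:MM:SS:FF timecode into an absolute frame index."""
    hh, mm, ss, ff = (int(part) for part in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

def align_to_footage(samples, first_footage_tc: str):
    """Return {footage frame number: sample} so tracking data and the
    green screen plates line up frame-accurately."""
    start = timecode_to_frame(first_footage_tc)
    return {timecode_to_frame(s.timecode) - start: s for s in samples}

if __name__ == "__main__":
    samples = [
        CameraSample("10:00:00:00", (0.0, 1.6, 3.0), (0.0, -2.0, 0.0)),
        CameraSample("10:00:00:01", (0.1, 1.6, 3.0), (0.5, -2.0, 0.0)),
    ]
    aligned = align_to_footage(samples, first_footage_tc="10:00:00:00")
    print(aligned[0].position, aligned[1].rotation)
```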

How it works

StarTracker is an upward-facing sensor that attaches to any broadcast or film camera. It points at the ceiling rather than looking forward, and uses retro-reflective stickers – known as ‘stars’ – as a reference. These are robust, inexpensive, easily installed and immune to changes in light. A one-time, 30-minute procedure auto-maps the stars’ positions and references them to the real world, after which no further star calibration or ‘homing’ is required.
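To give a sense of the general principle – not Mo-Sys’ actual algorithm – the sketch below uses OpenCV’s standard PnP solver: given mapped 3D star positions on the ceiling and their 2D detections in the upward-facing sensor’s image, it recovers the camera’s rotation and translation. The star coordinates, detections and sensor intrinsics are invented placeholder values.

```python
import numpy as np
import cv2  # requires opencv-python

# Principle only: once the 3D positions of the ceiling "stars" are known
# from the one-time mapping step, camera pose can be recovered from their
# 2D detections in the upward-facing sensor image via a PnP solve.

# Assumed: mapped 3D star positions in studio coordinates (metres).
star_positions_3d = np.array([
    [0.0, 0.0, 3.0],
    [1.0, 0.0, 3.0],
    [1.0, 1.0, 3.0],
    [0.0, 1.0, 3.0],
    [0.5, 0.5, 3.0],
], dtype=np.float64)

# Assumed: where those stars were detected in the sensor image (pixels).
star_detections_2d = np.array([
    [320.0, 240.0],
    [420.0, 238.0],
    [422.0, 338.0],
    [322.0, 340.0],
    [371.0, 289.0],
], dtype=np.float64)

# Assumed intrinsics of the upward-facing sensor.
camera_matrix = np.array([
    [800.0, 0.0, 320.0],
    [0.0, 800.0, 240.0],
    [0.0, 0.0, 1.0],
])
dist_coeffs = np.zeros(5)  # assume lens distortion already corrected

ok, rvec, tvec = cv2.solvePnP(star_positions_3d, star_detections_2d,
                              camera_matrix, dist_coeffs)
if ok:
    # rvec/tvec describe the world-to-camera transform; invert to get the
    # camera's position in studio space.
    R, _ = cv2.Rodrigues(rvec)
    cam_pos = -R.T @ tvec
    print("camera position in studio space:", cam_pos.ravel())
    print("rotation (Rodrigues vector):", rvec.ravel())
```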

As part of the StarTrackerVFX toolbox, the StarTracker ViewFinder also benefits directors, DOPs and VFX supervisors. Mounted on a handheld monitor, it tracks every movement, acting as a window into the virtual world that allows them to frame the scene and create camera moves without having to guess where objects are.

Mo-Sys founder and owner Michael Geissler commented: “StarTrackerVFX lets creatives tell the best version of their story. The end-to-end streamlined solution takes the technical worry away from using CG backgrounds with no compromise on quality. Its advanced technology lets users move the camera freely and understand what the scene will look like in the end.”

StarTrackerVFX from Mo-Sys will be demonstrated in partnership with Waskul Entertainment from August 12 to 16 at SIGGRAPH 2018. Waskul Entertainment’s StudioXperience broadcast studio and technology space is sponsored by HP and NVIDIA. 