I was asked to create a system that would allow an AR presentation to be shown on-location in a theatre and live-streamed globally to branches, with high-quality video output from both a handheld camera and stationary cameras. Due to NDAs, none of the presentation content can be shown, but I can show both of the camera rigs I put together.
At first we intended to film with a HoloLens while the presenter wore their own, the two networked together to share the presentation's location and animations. However, we found the streamed video quality and latency weren't up to our standards. We then tried incorporating a GoPro to provide the video and mixing the feeds on a laptop, but the latency issues were still present, and variable enough that we couldn't sync the two feeds properly.
I rebuilt the rig using a Vive wand mounted to a camera, which required only light modification to the actual presentation code and content. After some tweaks to match the Vive's tracking latency to our video pass-through delay, we had it running smoothly.
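Matching tracking latency to a video pass-through delay essentially means rendering the virtual content against the pose the tracker had when the (older) camera frame now on screen was captured. A minimal sketch of that idea, with a short pose buffer and linear interpolation; the class, its names, and the latency value are all hypothetical, not taken from the actual project:

```python
from collections import deque


class PoseDelayBuffer:
    """Buffers timestamped tracker poses so the renderer can look up the
    pose that corresponds to a delayed camera frame."""

    def __init__(self, camera_latency_s=0.120):
        # Assumed, pre-measured camera pass-through delay in seconds.
        self.camera_latency_s = camera_latency_s
        self.samples = deque()  # (timestamp, pose) pairs, oldest first

    def push(self, timestamp, pose):
        """Record a new tracker sample; prune samples too old to query."""
        self.samples.append((timestamp, pose))
        while self.samples and self.samples[0][0] < timestamp - 2 * self.camera_latency_s:
            self.samples.popleft()

    def pose_at(self, now):
        """Return the pose matching the camera frame shown at `now`,
        linearly interpolating between the two nearest samples."""
        target = now - self.camera_latency_s
        prev = None
        for t, pose in self.samples:
            if t >= target:
                if prev is None:
                    return pose  # no older sample; use the earliest we have
                t0, p0 = prev
                alpha = (target - t0) / (t - t0) if t != t0 else 0.0
                return tuple(a + alpha * (b - a) for a, b in zip(p0, pose))
            prev = (t, pose)
        return prev[1] if prev else None  # target is newer than all samples
```

The design choice here is to delay the tracking data rather than the video, since buffering a few pose samples is far cheaper than buffering video frames, and the camera delay is the fixed quantity you can measure once and hard-code.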