Tuesday 12 February 2019

VR: A work of art

Content marketing for NewTek
The incoming 5G network promises near-instantaneous, high-fidelity, high-bandwidth video communications, opening up tremendous possibilities for interactive remote learning and new immersive experiences. One of the more adventurous demonstrations of its potential to date is currently on show at Russia’s State Hermitage Museum in St. Petersburg, where delicate artworks are being restored remotely by robot.
The first-of-its-kind 5G Trial Zone is intended as a future-technology demonstrator, mixing a live-streamed immersive VR video feed with an interactive haptic setup that controls a remote robot arm.
The project was designed, coded and built by AR, VR and AI platform provider Room One, which in turn approached Focal Point VR to provide the 360° video camera rig and the live stitching and streaming solution for the installation.
The fully immersive live 360° 4K video stream was to be transmitted to Oculus Rift head-mounted displays over a local high-speed network, with direct mirroring of separate haptic devices enabling remote presence for art masters and their students.
Focal Point has developed its own end-to-end VR streaming platform, called Ubiety, which supports everything from delivering an HD 360° stream to YouTube through to multi-viewpoint 360° and stereo VR ultra-high-resolution event coverage.
Such was the pinpoint precision required of the remote art restorer operating a robotic arm around fragile artworks that an ultra-low-latency solution was called for.
“The aim was to give the user the sensation of being involved and to give art restorers the sense of touch, something which is only possible with extremely low latency,” explains Paul James, Head of Production at Focal Point VR. “We did a lot of testing with various protocols, but none had the stability or absolute minimal latency we were looking for. RTSP, for example, exhibits a problem with packet loss in which audio and video can get out of sync.
“The target was super high-quality video transmitted at 200Mb/s. Plus, it had to work constantly without skipping a frame for the entire 18-month duration of the project’s installation.
“The only possible solution was NDI®.”
NewTek’s NDI® (Network Device Interface) is a royalty-free software standard that enables video-compatible products to communicate, deliver and receive broadcast-quality video in a high-quality, low-latency manner that is frame-accurate and suitable for switching in a live production environment.
Focal Point VR’s programming team says it found the NDI® SDK offered by NewTek easy to implement, with initial testing showing NDI to be stable and performing well at ultra-high resolutions.
“Integrating NDI® into our 360° workflow was a very simple task. We wrote software to customise our VR player for NDI playback over Oculus Rift headsets, and the whole NDI network just worked straight out of the box.”
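For readers curious what that integration looks like in practice, the sketch below shows the typical receive pattern from NewTek’s NDI SDK in C++: initialise the library, discover sources on the network, connect a receiver and pull frames in a loop. It is illustrative only, not Focal Point VR’s production player; the loop count and timeouts are arbitrary, and a real VR player would upload each frame to the GPU for headset rendering rather than print its dimensions.

    // Minimal NDI receive loop, following the NDI SDK's standard receiver example.
    // Illustrative only: error handling, loop bounds and timeouts are arbitrary.
    #include <cstdio>
    #include <cstdint>
    #include <Processing.NDI.Lib.h>

    int main()
    {
        if (!NDIlib_initialize())          // start the NDI runtime
            return 1;

        // Discover NDI sources announced on the local network.
        NDIlib_find_instance_t finder = NDIlib_find_create_v2(nullptr);
        uint32_t num_sources = 0;
        const NDIlib_source_t* sources = nullptr;
        while (!num_sources) {
            NDIlib_find_wait_for_sources(finder, 1000 /* ms */);
            sources = NDIlib_find_get_current_sources(finder, &num_sources);
        }

        // Create a receiver and connect it to the first source found.
        NDIlib_recv_instance_t receiver = NDIlib_recv_create_v3(nullptr);
        NDIlib_recv_connect(receiver, &sources[0]);
        NDIlib_find_destroy(finder);

        // Capture frames; a VR player would hand frame.p_data to its renderer.
        for (int i = 0; i < 300; ++i) {
            NDIlib_video_frame_v2_t frame;
            switch (NDIlib_recv_capture_v2(receiver, &frame, nullptr, nullptr, 5000)) {
            case NDIlib_frame_type_video:
                std::printf("Received video frame: %dx%d\n", frame.xres, frame.yres);
                NDIlib_recv_free_video_v2(receiver, &frame);   // return the buffer
                break;
            default:
                break;   // audio, metadata or no data within the timeout
            }
        }

        NDIlib_recv_destroy(receiver);
        NDIlib_destroy();
        return 0;
    }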
Running over a GigE network, the system achieved camera-lens-to-headset-display latency of under 200ms, enabling real-time interaction with the remotely controlled robot.
Increasing number of clients
Focal Point VR says it is being approached by an increasing number of clients looking for low-latency, high-reliability 360° live streams for projects ranging from shared VR to 8K cylindrical 360° video experiences; among them are Royal Holloway University and the University of York, and the company recently partnered with handset developer Huawei to test further 5G applications.
“We’re certainly seeing more museums and universities keen on exploring VR as a very effective way of enhancing someone’s knowledge and experience,” says James. “NDI® is a core part of these solutions.”
Jonathan Newth, CEO at Focal Point VR, adds, “We are also talking to NewTek about how to integrate NDI® more completely into our multi-camera VR workflow, replacing traditional fibre SDI with a more flexible network backbone. We are looking forward to continuing our work with the NewTek team.”
