Monday, 18 March 2024

What Content Creators Need to Know for a Volumetric Video Future

NAB


We are entering a world where video will no longer be captured and presented as a 2D experience on flat screens, but as spatial experiences enjoyed through virtual or extended reality headgear.

The building blocks are being put in place and creators of all kinds had better get ready, says Lucas Wilson, founder and CEO of Supersphere.

“Immersive experiences matter because the user is more engaged. So the right question is not why VR has failed to take off, but what direction content is going in. All the trend lines point to content becoming more immersive.”

At Supersphere, Wilson has helped transform the live performance space by creating hundreds of XR, AR, VR and MR experiences for music artists ranging from Paul McCartney to Billie Eilish and Post Malone. The visionary exec, who will be presenting “Virtual Production for Content Creators” at the 2024 NAB Show, has seen the future and says the next evolution of video and immersive experience is volumetric.

“What doesn’t exist right now — and maybe Apple will create it soon — is the YouTube of spatial experiences. We’re ready for that,” he says.

The ability for anyone to capture and share experiences in the 3D world is coming, he says.

“Meta or Apple or Google will come up with the first true spatial distribution platform. The YouTube of the spatial world. I think that’s where we will all want to live.”

Arguably, commercial VR only really began in 2019, when Meta released Quest, so it shouldn’t be surprising that VR has not moved beyond being a niche industry. Yet Meta has sold 20 million of its headsets, and the Vision Pro, albeit in a limited run, sold out in hours.

Wilson views VR as part of a continuum of immersive experiences which has taken us in short order from analog to HD to UHD TV via stereo 3D TV. “Each tech advance is aimed at delivering a more immersive experience, but while TV set engineering and content distribution have been around for 80+ years, virtual reality is only just getting started,” he says.

“Headsets are a temporary anomaly. I think most people in the industry agree with that. The real answer will be when we have VR glasses.”

Meta’s Ray-Bans are one example. Another is being developed at Brilliant Labs. They are lighter, more comfortable, less obtrusive and, frankly, cooler.

“For a start, they won’t make people want to punch you if you’re wearing them in public,” says Wilson. “Headsets are always going to be a niche market because there are only a certain number of people who will actually want to strap a device to their face, no matter how cool it is.”

He predicts that within a couple of generations of Qualcomm chip development, the electronics will be small enough to fit inside AR glasses.

“Once that happens, with VR headsets in eyeglass form, then I really believe that our fundamental world changes in terms of how we communicate,” Wilson foresees.

“Kids already live and breathe by sharing content and communicating via digital devices. It’s natural to them but they still share 2D images. In the next couple of generations [of consumer electronics] they’re going to start sharing Volumes, they’re going to start sharing spaces and environments that they can interact with each other in. Once that happens, then why would you ever share a 2D photo again?”

Supersphere is getting ahead of the curve by bringing to market a new content creation tool capable of manipulating video and virtual worlds in a native 3D space.

It is called ArkRunr, and it launches right after NAB Show in April, initially targeting virtual production.

Wilson believes there’s a huge market for VP-style content creation, but without the cost and paraphernalia of conventional LED volumes, camera tracking systems, and virtual art departments (VADs).

“Anybody who has worked in virtual production knows that it is complicated, expensive and time consuming to achieve a good outcome. Moreover, no tools exist in the mid-budget to creator range for that kind of work. So, we built our own.”

ArkRunr has already been used by Supersphere on numerous shows “with major artists,” so successfully that Wilson decided to commercialize it.

Wilson calls it a Spatial Performance platform. The software ingests a live video feed (from a smartphone, for example) of an artist performing on a stage, or even in their bedroom, and wraps it in a virtual environment complete with interactive lighting. The platform runs on Windows and requires the computing power of “an average gaming laptop.”
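Supersphere has not published ArkRunr’s internals, but the pipeline Wilson describes (ingest a live feed, composite it into a virtual environment, add responsive lighting) can be sketched at toy scale. The snippet below is a hypothetical illustration using OpenCV, not ArkRunr code: it insets a webcam feed into a stand-in virtual backdrop and pulses a colored wash over the result to stand in for interactive lighting.

```python
# Hypothetical sketch of a "spatial performance" pipeline: live feed in,
# composited into a virtual stage, with a simple animated lighting wash.
# This is an illustration only, not Supersphere's implementation.
import cv2
import numpy as np

cap = cv2.VideoCapture(0)  # live feed, e.g. a webcam or phone camera
backdrop = np.full((720, 1280, 3), (40, 20, 60), dtype=np.uint8)  # stand-in virtual stage

t = 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    frame = cv2.resize(frame, (640, 360))

    # Place the performer feed inside the virtual environment (a simple inset here;
    # a real system would key out the background and track the camera).
    scene = backdrop.copy()
    scene[180:540, 320:960] = frame

    # Stand-in "interactive lighting": pulse a colored wash over the whole scene.
    t += 1
    wash = np.zeros_like(scene)
    wash[:, :, 2] = int(127 * (1 + np.sin(t / 15.0)))  # pulsing red channel (BGR)
    scene = cv2.addWeighted(scene, 1.0, wash, 0.3, 0)

    cv2.imshow("spatial stage (sketch)", scene)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```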

“Every musician, every creator streaming from their bedroom, their living room wants to up their game. This allows them to broadcast in custom XR, AR or VR scenarios with interactive lighting,” Wilson continues. “Another big market for us is corporate. You could imagine a virtual TEDx stage, a video presentation and dynamic lighting for a corporate keynote with high production value.”

With generative AI tools added to the mix, the ability to create digital content is going to be supercharged. Supersphere, for instance, has incorporated AI into ArkRunr to create lighting for specific musical styles.

“We are training [our algorithms] on thousands of hours of real lighting shows according to musical genre.”
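Wilson doesn’t go into the model itself, so the toy sketch below only illustrates the shape of the problem: a genre label and tempo go in, lighting parameters come out. In ArkRunr the hard-coded presets would presumably be replaced by a model trained on those hours of real lighting shows; every name here (LightingCue, PRESETS, cue_for) is hypothetical.

```python
# Hypothetical sketch of genre-conditioned lighting cues, not Supersphere's system.
from dataclasses import dataclass
import random

@dataclass
class LightingCue:
    palette: list[tuple[int, int, int]]  # RGB colors to cycle through
    strobe_hz: float                     # strobe rate, scaled by tempo
    movement: str                        # how fixtures sweep the stage

PRESETS = {
    "edm":   LightingCue([(255, 0, 128), (0, 255, 255)], strobe_hz=8.0,  movement="fast sweep"),
    "jazz":  LightingCue([(255, 180, 80), (120, 60, 20)], strobe_hz=0.0,  movement="slow fade"),
    "metal": LightingCue([(255, 255, 255), (255, 0, 0)],  strobe_hz=12.0, movement="hard cuts"),
}

def cue_for(genre: str, bpm: float) -> LightingCue:
    """Pick a base preset for the genre and scale the strobe to the track's tempo."""
    base = PRESETS.get(genre.lower(), random.choice(list(PRESETS.values())))
    return LightingCue(base.palette, base.strobe_hz * (bpm / 120.0), base.movement)

print(cue_for("edm", bpm=128))
```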

Supersphere’s ambition is to be the “Live Nation of the immersive world,” says Wilson, “because we are licensing virtual representation rights for spaces that exist today and those that no longer exist.”

He elaborates, “If you want to play in the Cavern Club with the Beatles in the 1960s or with the Bee Gees in Studio 54 then we can bring them back to life. If you want to imagine the Cavern Club in a cyber-tech future, you can.”
