Wednesday 16 February 2022

The Building Blocks of the Metaverse

Streaming Media




Long before mainstream media latched onto the Metaverse, it was already being built. It could have been called the Magicverse (Magic Leap), the Omniverse (Nvidia, https://www.nvidia.com/en-gb/omniverse/) or the Cyberverse (Huawei). Others conceptualized it as planet-scale AR (Niantic, nianticlabs.com), the AR Cloud and the Mirrorworld. Mark Zuckerberg characterized it as “the embodied internet.”

The ideas dovetail with spatial computing, aka the 3D Web, or the merger of computer-generated 3D VR with augmented reality as envisioned by VFX pioneers like John Gaeta. Cultural historians and sci-fi buffs will point out that the term Metaverse was coined in 1992 by Neal Stephenson in Snow Crash before being given its Hollywood makeover in Ready Player One.

All of them describe essentially the same evolution of the internet: from flat-screened, text-and-image web pages to one in which the physical world is augmented by the digital and experienced in (at least) three dimensions.

Ori Inbar, the producer of Augmented World Expo, for example, explains the AR Cloud as “a persistent 3D digital copy of the real world to enable sharing of AR experiences across multiple users and devices” that will “infuse meaning into every object in the real world.”

That was in 2017 but recent descriptions haven’t deviated. 

Here’s Rodric David, founder/CEO of XR producer and facilities provider Thunder:  “The Metaverse is the total convergence of streaming, interactivity, and social media. Content, communication, and interaction are presented spatially as deep, evocative experiences that drive consumer behavior and ultimately brand value.” 

Craig Donato, CEO of online game building site Roblox declared: “What the internet is for information, the metaverse is going to do for social connections. I’m no longer bound by physical distance or all these constraints in terms of who I interact with or how I represent who I am. It’s insanely disruptive.” 

No one puts a timeframe on its completion, but proponents are positively evangelical about the profound impact the next-gen internet will have on pretty much everything. It will become the dominant global platform for creating and watching live content, says Rodric David, “with features including interactivity, real time transactional functionality, branded promotion and integration, game mechanics, integrated socialization, blockchain and NFT capability, and gamification tools.”

This article is not going to debate whether much of the activity around the Metaverse today is merely an extension of big tech’s capitalist drive to own more of our data, our money and our souls (there, I said it). We do, though, need to acknowledge that there is a battle being fought right now for the framework of the internet’s second coming.

Open or closed or both 

It is one summed up by Epic Games’ CEO Tim Sweeney, who charges that the current walled gardens of Facebook and Google need to be knocked down if the value of the metaverse (monetary, creative and social) is to be realised.

“Now we’re in a closed platform wave, and Apple and Google are surfing that wave too,” Sweeney outlined to the Washington Post https://www.washingtonpost.com/video-games/2021/09/28/epic-fortnite-metaverse-facebook/. “As we get out of this, everybody is going to realize, ‘Okay we spent the last decade being taken advantage of.'” 

Leaving aside the extent to which Epic’s own applications like Fortnite are also walled gardens, there is wide acknowledgement that for the metaverse’s potential to be fulfilled it requires cross-industry alignment “on a constellation of standards, guidelines and best practices to enable the consistent creation and distribution of scalable cross-platform 3D and XR content,” says Nvidia’s VP of Developer Ecosystems, Neil Trevett.

Standardization task force 

Sweeney thinks a collaboration is needed similar to how the Internet Engineering Task Force formed in 1986 to develop and promote internet standards.

“You need an entire suite of standards, and the Web is based on several [like HTML],” he says. “The metaverse will require a lot of them. File formats for describing a 3D scene, networking protocols for describing how players are interacting in real time. Every multiplayer game has a networking protocol of some sort. They don’t all agree, but eventually they ought to be lined up and made to communicate.” 
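Sweeney’s point about networking protocols can be made concrete with a toy example: a fixed binary message layout for broadcasting a player’s position each tick. The fields and layout below are illustrative assumptions, not any real game’s wire format.

```python
import struct

# Hypothetical multiplayer state-update message: player id, server tick,
# and an x/y/z position, packed into a fixed little-endian layout.
FMT = "<IIfff"  # uint32 player_id, uint32 tick, 3x float32 position

def pack_update(player_id, tick, x, y, z):
    """Serialize one player's state into a compact 20-byte message."""
    return struct.pack(FMT, player_id, tick, x, y, z)

def unpack_update(payload):
    """Decode a received message back into a state dict."""
    player_id, tick, x, y, z = struct.unpack(FMT, payload)
    return {"player": player_id, "tick": tick, "pos": (x, y, z)}

msg = pack_update(7, 120, 1.0, 0.0, -3.5)
state = unpack_update(msg)  # 20 bytes per player, cheap to send 30x a second
```

Games disagree on exactly these choices (field widths, compression, reliability layer), which is why Sweeney argues the formats eventually need to be lined up and made to interoperate.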

Interchange standards and tools, protocols, formats, and services which enable persistent and ubiquitous virtual simulations are perhaps the most important aspect of the entire Metaverse framework. 

“Without them, there will be no Metaverse — only a more virtual and immersive version of today’s mobile internet and app stores,” says technology analyst Matthew Ball. “What’s more, this pale imitation will be far less lucrative, dynamic, and healthy. It will make it harder for new platforms to emerge — and, frankly, for the Metaverse to be built.” 

While most writing on the subject assumes the Metaverse will be singular, it will more accurately be a multiverse. Much like the internet today, with hundreds of millions of individual home pages or applications as access points, the primary entry point to the Metaverse will be via a browser-based URL and a personalized avatar.

Not one, but billions 

“People will be able to use [those] to navigate a virtual metaverse environment with a mobile device using typical game engine mechanics,” says David. “Our endlessly customizable avatars will be virtual versions of ourselves that hold our keys, wallet, and identity.” 

There could be billions of such metaverses, one for each of the planet’s population of digital IDs. But the intent is that they will all be able to synchronize and interact. Our avatars should be able to travel from one metaverse to another and from one hardware device (VR headset, AR mobile) to another, with our movements, our creations, data and wallets tracked on the blockchain, without being locked in or hindered by a walled garden.

“[Blockchain is] an indisputably neutral, distributed way of expressing individual ownership … the most plausible path towards an ultimate long-term open framework where everyone’s in control of their own presence, free of gatekeeping,” says Sweeney. 

Getting to this point requires the ingenuity of privately funded and therefore proprietary endeavors – the closed metaverse – as well as the more open-standards, mass-scalable approach of an open metaverse. There is a debate about the ethics of the closed versus the open approach, which tends to divide along the lines of capitalist monopoly (closed) versus socialist, democratic utopia (open). Inevitably, the reality is more nuanced.

“A closed metaverse is only accessible through a download of a proprietary source IP which users are required to sign an end user license agreement to access,” says David. “Fortnite, Roblox, Call of Duty, Minecraft, and League of Legends are all examples of closed metaverses. Open metaverses are accessible by anyone who creates an avatar and enters it via a browser and a URL on a PC or mobile device. In the next few years, every brand in the world will need to invest in an open metaverse first strategy.” 

If the goal is to integrate as much of the world into the Metaverse as possible, this means interconnecting the many devices and platforms around us, from our cars and home-security cameras to VR and AR headsets, projection cameras and screens, wearables, and more.

Ball says, “Such development will require, or at least benefit from, the use of proprietary standards. All of this creates a burden on developers, and potentially a vicious circle, whereby no platform has enough users to develop for, and no platform has enough content to attract users. There’s a reason you can’t export an experience from Roblox to Minecraft or Fortnite, just as you can’t easily import all your Instagram photos and likes into Twitter or TikTok or Snapchat.”

The components of an open Metaverse  

The core of the open Metaverse requires an open, flexible and efficient way to describe a shared virtual world.  

“This will not be an extension to HTML or a JavaScript rendering library,” explains Michael Kass, Senior Distinguished Engineer at Nvidia. “And it will not be created by a standards committee. It will be an open source 3D scene description honed by years of use under challenging circumstances.”

Nvidia is championing Pixar’s open source USD: Universal Scene Description. USD evolved from Pixar’s need to represent film-quality virtual assets, scenes and animation in a way conducive to effective teamwork among artists and interchange of tools.  

“Its features for teamwork are exactly what’s needed for the collaborative and social aspects of the metaverse,” argues Kass. “Its standardization for interchange is exactly what’s needed to stitch the open metaverse together.” 

Nvidia has used USD as a core part of its Omniverse – a business-to-business platform for companies across industries to build applications for the Metaverse. Nvidia says that its augmentation of USD with an ability to render applications “wherever the relevant rendering process lives” enables the extensibility of the Metaverse.

“The web already has a variety of replication mechanisms - distributed databases, CDNs and caches of various kinds,” Kass explains. “But replicating a very complex 3D virtual world has its own special challenges. If the HTML for a web page changes, it’s possible to resend the whole modified HTML. For a virtual world described by hundreds of megabytes, this is simply not practical. So, any viable open Metaverse must have the ability to replicate by sending incremental updates that specify only what changes.” 
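Kass’s point about incremental updates can be sketched in a few lines: rather than resending a whole scene, a server computes a delta against a replica’s last-known state and ships only that. The flat-dict scene and property names below are invented for illustration; this is not USD’s actual API.

```python
# Minimal sketch of delta replication for a shared virtual scene, assuming
# the scene is a flat dict mapping object properties to values (a toy model,
# not real USD). Only changes cross the wire, never the full scene.

def diff(old, new):
    """Return only the properties that changed, were added, or were removed."""
    changed = {k: v for k, v in new.items() if old.get(k) != v}
    removed = [k for k in old if k not in new]
    return {"set": changed, "del": removed}

def apply_delta(scene, delta):
    """Apply an incremental update to a client's replica of the scene."""
    scene.update(delta["set"])
    for k in delta["del"]:
        scene.pop(k, None)
    return scene

server = {"chair/pos": (0, 0, 0), "lamp/on": True}
replica = dict(server)           # client starts fully synchronized

server["chair/pos"] = (1, 0, 0)  # one property changes on the server
delta = diff(replica, server)    # send this tiny delta, not the whole world
apply_delta(replica, delta)
assert replica == server         # replica converges to the server's state
```

For a world described by hundreds of megabytes, shipping a delta like `{"set": {"chair/pos": (1, 0, 0)}, "del": []}` instead of the full state is what makes multi-participant synchronization practical.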

By building an effective replication system on top of USD, Nvidia believes it is able to synchronize the virtual experiences of multiple participants.  

Ball is a fan: “What’s key to Omniverse is that it can do this irrespective of the file formats and engine/simulation technologies being used. In other words, everything doesn’t have to be on Unity, or Unreal, or AutoCAD. And while Omniverse is, today, intended for design and testing, one can imagine Nvidia using this technology, plus its own industrial computing power, to operate much of the overall mirrorworld live.” 

Mass 3D asset creation 

Nvidia is also one of the backers of the Khronos Group, an industry consortium (including Huawei, Epic Games, Google and Valve) focused on creating open-API and royalty-free standards for graphics, compute and vision acceleration. Khronos governs specifications such as Vulkan, OpenXR, OpenGL ES, WebGL and glTF.

According to the group, the WebGL API has become ubiquitous, allowing users to observe, manipulate and modify 3D objects without the installation of any browser plugins.  

It says VR and AR are also now supported in the browser by WebXR, and the creation and exchange of 3D models is enabled by the glTF 3D file format designed for efficient downloading and rendering. 
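Part of what makes glTF easy to exchange in a browser is that a glTF document is plain JSON; per the glTF 2.0 specification, the only required top-level property is `asset` with a `version`. The minimal scene below (one empty root node) is illustrative scaffolding, not a production asset.

```python
import json

# A minimal glTF 2.0 document. "asset" with a "version" is the only
# required top-level property in the spec; the default scene and single
# unnamed root node here are optional scaffolding for illustration.
gltf = {
    "asset": {"version": "2.0"},
    "scene": 0,
    "scenes": [{"nodes": [0]}],
    "nodes": [{"name": "root"}],
}

# The whole scene graph serializes in one call, which is why glTF is
# convenient to download, parse and render in a web browser.
doc = json.dumps(gltf)
```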

The digital twin of our world mirrored in the Metaverse is already being constructed, with companies like Epic Games, Nvidia and Google as the architects. The ability to scan the real world and translate it digitally is big business. Epic Games, for example, acquired Quixel in 2019, a company with a library of ‘MegaScans’ comprising tens of billions of pixel-precise triangles.

“The ability to map the real world is becoming a major source of IP,” says Ball. “This explains why companies such as Epic and Unity choose to buy and build up real world scans, rather than build from zero. In the coming years, it’s likely we’ll see intense competition in the category, with the likes of Nvidia, Autodesk, Facebook, Snap, and Niantic all choosing to build up their databases.” 

However, the creation of 3D assets still requires highly skilled technicians and artists, presenting a bottleneck for Metaverse growth. 

Khronos developers think this might be solved by the advent of mass-market LiDAR (Light Detection and Ranging). New cell phones (such as the iPhone 12 Pro) contain LiDAR, putting this technology in the average user’s pocket.

Rumors abound that the iPhone 13 Pro could contain a second-generation LiDAR scanner, which, combined with machine learning algorithms, could turn the stills we take every day into three dimensions almost overnight.

“Many experts think 3D snapping is as inevitable as digital photography was in 2000,” reports Techradar. 

It’s not just still images either. LiDAR could hold the key to user-generated volumetric video. As pointed out by AppleInsider, patents published by Apple in 2020 refer to compressing LiDAR spatial information in video using an encoder, “which could allow its ARM chip to simulate video bokeh based on the LiDAR's depth info, while still shooting high-quality video.”

3D media management platforms like Sketchfab (sketchfab.com) and Polycam (https://poly.cam/) are based on interoperability standards such as glTF and already enable viewing and interactive manipulation of 3D models via a web browser.

“LiDAR technology … now allows anybody with the latest iPhone to mass render the physical world, translate it into machine readable 3D models and convert them into tradable NFTs which could be uploaded into open virtual worlds very quickly, populating them with avatars, wearables, furniture, and even whole buildings and streets,” says Jamie Burke, CEO and founder of London-based VC firm Outlier Ventures.

Burke is leading a parallel attempt to lay the groundwork for the open metaverse. Outlier Ventures invests in cryptocurrency, blockchain and start-ups in the emerging ‘Web 3’ stack. 

Its manifesto  states: “The Internet is being fundamentally restructured from the bottom up by a convergence of decentralised technologies to form a new data economy. Where the last 20 years have been dominated by digital globalization of extractive and increasingly anti-social platforms and ‘the cloud’. The next 20 will be defined by a redistribution of value to networks and an unbundling of platform monopolies. From the sovereignty of the platform to the sovereignty of the individual user.” 

The fund wants to accelerate that mission by promoting what it calls The Open Metaverse OS, a shared and open operating system building on the success of decentralised protocols like NFTs. 

It explicitly ties digital currencies and the ‘on-chain transferability of assets through NFTs’ to the core of the emerging metaverse economy.  

Ball (something of a Metaverse sage) agrees: “Blockchain is considered the one major interchange technology that retains most of the values and benefits of an open standard, and also looks likely to thrive in the Metaverse.” 

An open Metaverse OS 

Among the framework technologies identified by Burke are LiDAR, Pixar’s USD, Khronos Group standards and Nvidia’s Omniverse. These, he thinks, can be better monetised in global and open markets than on any one closed platform.

“Increasingly the world of Web 3 and crypto is converging [and] with new environments like gaming and VR there is a generational shift away from Web 2 platforms,” says Burke. “The Open Metaverse OS is best understood as an evolving collection of highly composable technologies that will increasingly, but selectively, be used to make aspects of the Metaverse progressively more open.”

Other blockchain platforms hope for similar social outcomes. OWAKE is described as a ‘Real-time Moment Sharing System that enables communication between people and people, people and machines, and machines and machines.’ It’s developed by the Kronosa Alliance (https://kronosa.org/), whose mission is to build ‘The Sustainable Human Society’ with a next-generation internet for humans to use both the virtual and the real world as their residence and workplace.

Another project, The Open Metaverse Interoperability Group (OMI) says it is focused on bridging virtual worlds by designing and promoting protocols for identity, social graphs, and inventory.  

Hardware counts in large amounts 

Metaverse or not, the sheer amount of data heading across networks demands more than a software response. Optimizing streaming bandwidth, latency and reliability is deemed essential.

“If we want to interact in a large, real-time, shared, and persistent virtual environment, we will need to receive a superabundance of cloud-streamed data,” says Ball. “Cloud data streaming is also essential if we want to seamlessly jump between different virtual worlds.” 

Latency is a current bugbear in live sports, of course, but the general feeling is that by optimizing technologies like LL-HLS and ABR the delay from capture to screen can be reduced to a broadcast-like five seconds. That may be fine for the NFL or EPL but nowhere near good enough for esports, gambling or online multiplayer games, let alone the instant social interaction on which the Metaverse is predicated.

“Slight facial movements are incredibly important to human conversation — and we’re incredibly sensitive to slight mistakes and synchronization issues (hence the uncanny valley problem in CGI),” says Ball.  

It doesn’t matter how powerful your device is, he argues, if it can’t receive all the information it needs in a timely fashion. As a result, he says, the availability and development of computing power will constrain and define the Metaverse. 

Compute, 5G and edge 

The combined force of 5G and the build out of infrastructure at the edge (in datacenters or cell phones) is considered key. Aside from the regular telco developments in this area there are fresh innovations targeting the Metaverse. 

LA-based start-up Lionshare Media is launching the THIN/AIR Metaverse for premium entertainment and immersive media experiences. According to its website, this is a direct-to-consumer, cloud-native, 5G decentralized media distribution platform that ‘empowers creators with their own media channels called “Projects”. Projects, it explains, are spatial 3D Web apps which feature a ‘hyper-cube UI/UX design that leapfrogs the experience of OTT [MPEG-5] video, social media, and livestream sites’.  

But even if we improve the computing power of consumer devices, move more enterprise computing power closer to consumers, and build out more centralized infrastructure, we’re still likely to fall short. 

Ball’s idea is that a form of peer-to-peer networking will take place in which the available compute power of every local PC and device is used to sate demand. Owners will be paid for the use of their CPU and GPU compute power. He thinks this is possible because the transaction will be conducted by blockchain.

“Every computer, no matter how small, would be designed to always be auctioning off any spare cycles. Billions of dynamically arrayed processors will power the deep compute cycles of even the largest industrial customers and provide the ultimate and infinite computing mesh that enables the Metaverse.” 
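Ball’s “auctioning off any spare cycles” can be sketched as a simple market-clearing step: devices advertise spare compute at a price per cycle, and a job is matched greedily to the cheapest offers until its demand is covered. The device names and prices below are invented for illustration, and the blockchain settlement layer he describes is out of scope here.

```python
# Toy sketch of a spare-compute market: each offer is (device, spare_cycles,
# price_per_cycle). A job buys the cheapest cycles first until satisfied.
# All names and numbers are hypothetical.

def allocate(offers, cycles_needed):
    """Greedily purchase the cheapest spare cycles to cover the demand."""
    plan = []
    for device, cycles, price in sorted(offers, key=lambda o: o[2]):
        if cycles_needed <= 0:
            break
        take = min(cycles, cycles_needed)
        plan.append((device, take, take * price))
        cycles_needed -= take
    if cycles_needed > 0:
        raise RuntimeError("not enough spare compute on the market")
    return plan

offers = [("laptop", 40, 0.02), ("phone", 10, 0.05), ("desktop", 100, 0.01)]
plan = allocate(offers, 120)
# buys the desktop's 100 cheap cycles first, then 20 from the laptop
```

A real mesh would add trust, verification of the work done, and on-chain payment; the point of the sketch is only that matching spare supply to demand is a well-understood allocation problem.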
