Wednesday, 20 November 2024

6G: media should plan for the internet-of-senses

IBC

Developers believe 6G will usher in photo-realistic holographic communication complemented by multisensory extensions - and experiments are already happening today.

Real-time film scenes shot 280 miles apart, feeling a crunching on-field tackle in your living room and live holographic broadcasting - all scenarios that could come to pass in less than five years with the arrival of the 6G wireless network.

Each of these concepts has recently been demonstrated using existing technologies as a call to action for media companies to plan ahead.

“The potential is huge, so now is the time to test and create a proof of concept,” says Jessica Driscoll, Director of Immersive Technology at government-funded innovation organisation Digital Catapult. “How do we start thinking about AI-powered interactive storytelling, or what does more collaborative art and music production look like? How will multi-sensory experiences enhance storytelling?”

6G will theoretically deliver speeds between 10 and 100 times faster than 5G, with data rates as high as one terabit per second, latency measured in microseconds and an ability to integrate digital and physical versions of the world.
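
To put those headline figures in context, here is a rough back-of-envelope sketch in Python. The 100MB uncompressed volumetric frame size and the ~10Gbps 5G peak rate are assumptions made purely for the arithmetic, not figures from the article.

```python
# Back-of-envelope arithmetic: what a 1 Tb/s peak rate means for moving
# volumetric video frames. The 100 MB frame size and the ~10 Gb/s figure
# for 5G are assumptions for illustration, not figures from the article.

FRAME_BYTES = 100e6      # assumed size of one uncompressed volumetric frame
SIX_G_BPS = 1e12         # headline 6G peak data rate: 1 terabit per second
FIVE_G_BPS = 10e9        # ballpark 5G peak data rate: ~10 gigabits per second

def transfer_ms(frame_bytes: float, link_bps: float) -> float:
    """Milliseconds needed to move one frame across the link."""
    return frame_bytes * 8 / link_bps * 1000

print(f"6G: {transfer_ms(FRAME_BYTES, SIX_G_BPS):.2f} ms per frame")   # 0.80 ms
print(f"5G: {transfer_ms(FRAME_BYTES, FIVE_G_BPS):.0f} ms per frame")  # 80 ms
# At 0.8 ms the transfer fits comfortably inside a 90 fps frame budget
# (~11 ms); at 80 ms it cannot keep up with even 24 fps.
```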

“It’s not difficult to imagine how the football viewing experience might benefit from advances in innovations such as extended reality (XR) and haptics, which may, for the first time, make Match Day a truly immersive experience,” says Valerie Allie, Senior Director, Video Solutions Group, at technology licensing company InterDigital. “The advent of 6G will accelerate significant XR advancements, so now is the time for the sports leagues to think creatively over the next decade about how to offer immersivity to their fan base.”

Initial work on 6G specifications will start with Release 20 of the 5G standard next year, with Release 21 expected to be ratified by 2028, in time for commercial 6G network launches in 2030, according to mobile industry standards body ETSI.

That means the 2032 Brisbane Olympics could be the first major event to enjoy the benefits of the next generation network.

“6G has advanced significantly from an industrial agenda standpoint,” says Alain Mourad, Head of Wireless Labs Europe, InterDigital. “There's already commitment worldwide and at standards bodies like 3GPP and ITU.”

Next March, 3GPP (the body which develops mobile broadband standards) will hold its first official workshop inviting members to share their views on the agenda and scope.

“Once the first standards are released around 2028, it will take another couple of years before we start seeing some implementations of these specifications and products,” Mourad confirms.

The ITU evaluates and standardises the 6G specifications proposed by 3GPP as IMT-2030. This extends existing services in 5G (IMT-2020) such as Immersive Communication to include a set of new attributes bracketed under the headings Ubiquitous Connectivity; AI and Communication; and Integrated Sensing and Communication.

Together, these capabilities will enable 6G networks to deliver immersive, ubiquitous and sensory digital experiences on a massive scale. This will make it possible for 6G applications to “sense” their surroundings and thereby turn the network into “our sixth sense”, according to a report by the consultancy Capgemini.

Towards 2030, telecoms giant Ericsson expects users to be able to experience all-day XR, with the XR device used as the main device for all our communication, much as the smartphone is today.

 

Dual location performance

The Advanced Media Production (AMP) network, developed by Digital Catapult and motion capture facility Target3D, is the UK’s first interconnected 5G-enabled facility. The government-funded initiative links studios in London and Gateshead with labs in Belfast and Gateshead, and an Immersive 5G Lab in Newcastle city centre. It offers compute power, motion capture cameras, volumetric capture systems and 5G connectivity for media and businesses to experiment with.

Such experiments include ‘dual-site performance’, in which a performer in one place has their actions replicated via holographic video in another location, or where two performers in separate locations combine to deliver a performance on a virtual platform like Roblox.

“We are interested in pushing forward the potential of real time holographic broadcasting,” says Driscoll.  “What’s interesting is the audience interaction that feeds back into those virtual worlds. For example, if you've got two pop stars, one in the north of England and one in the south, performing together in Fortnite then what's the real-time audience experience and the feedback loop that goes back to the performers? That's something that hasn't been cracked.

“As the performer, you can see people moving (virtually) in the metaverse but you can't really discern individual gestures. There’s a lag in the environment. It’s not seamless. But a completely low latency 6G environment would enable real-time interactions. You could have real-time 360 audio.” She continues, “When there’s no noticeable latency everyone can experience something at the same time. You could have meaningful interactions and very high quality volumetric video and sound that is also personalised. These are things we have barely begun to explore.”

In July 2023, researchers from Abertay University showcased how actors could shoot scenes together in real-time from two different locations (Dundee and Manchester) using a 5G internet connection against a consistent virtual environment. The clear practical incentive is to reduce travel time and cut carbon costs.

“There is the appetite to be able to share talent across different geographies but people’s mindset remains very traditional,” says Driscoll. “Until sustainability becomes much higher up people's agenda, and we insist on travelling less to bring down the carbon footprint, real dual-site or multi-site performance won’t take off in the way that we thought it would.”

Electronic arts duo Gibson/Martelli developed a dual-sited performance using motion capture, virtual environments and live music, with artists in AMP’s north and south studios 250 miles apart linked by a 10Gb fibre connection, edge-compute capability and local 5G networks.

The goal was to develop a 'playbook' of ideas and techniques to guide others exploring linked performances.

Gibson/Martelli plan to add in more elements, including streaming of the mocap data to remote VR audiences and giving show control to the dancers via mocap gloves and a machine learning toolkit that can recognise specific gestures.

Digital Catapult runs its facilities for all sectors and finds what it calls ‘creative spillover’, where tools, workflows, processes and content from the creative industries, such as immersive audio, VFX or game engines, are applied in more industrial settings.

“We've had interest from the National Grid to create a digital character and from the Royal Navy to use virtual production techniques to visualise data for submarine operators,” says Driscoll.

Haptics in the loop

While immersive broadcasting may be in its infancy (and stymied by existing connectivity limitations), sports producers could benefit from 6G roll-out as soon as 2030.

“This next generation of telecoms infrastructure has the potential to profoundly transform how fans engage with topflight sporting events,” says Allie.  “It may even usher in a new era of immersive content that elevates the live match excitement of audiences to a different level.”

It’s possible to imagine VR headsets delivering an immersive experience and giving fans the perception that they are actually in the stadium, no matter where they are in the world. Meanwhile, AR enhances the real-world experience by overlaying digital information onto the physical world.

“What we call ‘immersive video’ is video where the user’s point of view can be adapted, with a sensation of depth and parallax,” explains Allie. “Doing that relies on capturing a huge amount of information to generate 3D video in real time and then having some viable transmission at scale that could be deployed on wireless networks.”

Into this mix come digital sensory experiences.

“Let’s say you are watching the normal 2D sports experience and you receive some haptic feedback on your smartphone or a headset or controller, synchronised to the performance on the pitch,” says Allie. “When a player touches the ball, the user receives haptic feedback which brings them closer to the live experience.”

This has been demonstrated by InterDigital using 5G connectivity.

The idea is similar to the vibrations game players already get through their console controller. Devices could produce sensations of pressure, texture, or even heat to increase immersion in virtual scenarios. A haptic glove might deliver different sensory responses to individual fingers, the palm and the back of the hand to imitate a more natural tactile experience, such as holding an object or climbing terrain.

“Consider eSports. A gamer plays their favourite sports car racing game and shares their gameplay with followers on Twitch. Today’s viewers just get a 2D image of this gameplay. With our layers of immersive video and haptic feedback, the remote viewer will be immersed in 3D video and they will feel the exact haptic feedback that the player felt during the gameplay. To do that we carry haptic media signals as an additional track in the bitstream.”
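
To make the “additional track” idea concrete, here is a minimal sketch in Python of haptic samples being interleaved with video frames on a shared timeline. The data structures and field names are invented for illustration; this is not the MPEG haptics coding format or InterDigital's actual implementation.

```python
# Minimal sketch of carrying haptics as an extra track alongside video.
# These structures are invented for illustration only.
from dataclasses import dataclass
from heapq import merge

@dataclass
class VideoFrame:
    pts_ms: int        # presentation timestamp in milliseconds
    data: bytes

@dataclass
class HapticSample:
    pts_ms: int        # timestamp on the same clock as the video track
    actuator: str      # e.g. "controller" or "glove_index_finger"
    intensity: float   # normalised 0.0 - 1.0

def mux(video, haptics):
    """Interleave both tracks by timestamp, as a muxer would when writing
    them into a single bitstream; the receiver demuxes and plays each
    unit at its timestamp."""
    yield from merge(video, haptics, key=lambda unit: unit.pts_ms)

video = [VideoFrame(0, b"frame0"), VideoFrame(33, b"frame1"), VideoFrame(66, b"frame2")]
haptics = [HapticSample(20, "controller", 0.8),   # player touches the ball
           HapticSample(70, "controller", 0.3)]   # ball rolls away

for unit in mux(video, haptics):
    print(f"{unit.pts_ms:>3} ms  {type(unit).__name__}")
```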

InterDigital has built a platform to test and evaluate immersive video experiences from end to end, with its work feeding into the development of MPEG-I.

“The first release of immersive video standards is already available so you can develop and deploy some scenarios today,” Allie says.

AI and Communication

6G is expected to be an AI-native network, with AI embedded in the networking equipment. This will enable the network to learn and manage itself, become more autonomous and be cheaper to run.

What does this mean in practice? Driscoll explains, “For example, if I'm broadcasting from the AMP studio into Glastonbury or another city location, the idea is that the network infrastructure should be self-optimising and self-organising. It can cope with signal reflections off buildings, weather patterns (rain negatively impacts mobile coverage) or if there’s sudden congestion on the local network. It enables real-time multi-site collaborations and also multi-user collaborations.”

Companies like Nvidia are counting on AI being able to automatically optimise every point in the production process to achieve best performance.  

“AI optimisation will help to figure this out because at some points you won't need a certain amount of capability or capacity or throughput in one location, but you will need it in another. Currently that is very difficult to do.”
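
A toy sketch of the self-optimising behaviour described above, in which capacity follows measured demand from site to site. The site names, demand figures and total capacity are invented for illustration, and a real AI-native network would rely on learned traffic prediction rather than this simple proportional split.

```python
# Toy illustration of capacity following demand across sites. All numbers
# and site names are invented for this example.

def reallocate(demand_mbps: dict[str, float], capacity_mbps: float) -> dict[str, float]:
    """Share the available capacity across sites in proportion to measured demand."""
    total = sum(demand_mbps.values())
    return {site: round(capacity_mbps * d / total, 1) for site, d in demand_mbps.items()}

# Early evening: the festival site is the bottleneck.
print(reallocate({"AMP studio": 600, "Glastonbury": 1800, "city venue": 600}, 2000))
# {'AMP studio': 400.0, 'Glastonbury': 1200.0, 'city venue': 400.0}

# Later: the broadcast studio needs the headroom instead.
print(reallocate({"AMP studio": 1800, "Glastonbury": 600, "city venue": 600}, 2000))
# {'AMP studio': 1200.0, 'Glastonbury': 400.0, 'city venue': 400.0}
```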

Security and resilience

The complexity of working with digital media assets across the 6G network is expected to stretch regulatory frameworks and heighten privacy concerns.

An AI-driven metahuman developed by Digital Catapult and Target3D, for example, used one person's physical body, a second person’s movement data and a third person’s voice. Securing each contributor's IP in a compliant way is one issue which 6G could make better or worse.

“The issue of security and resilience has emerged as a lot stronger in IMT-2030,” says Driscoll. “When we were discussing 5G immersive applications there wasn’t such a strong dialogue about security because the applications tended to be discrete, proof-of-concept one-offs, generally for marketing or research. As we shift to 6G and the promise of mass mobile communication the question is how do you make everything safe?”

Safety features are being built into the 6G standard for chipsets, hardware protocols and the software stack in the hope that security will be more robust.

Once bitten

Promises were made to operators that investments in 5G would deliver groundbreaking experiences like VR which they would be able to monetise. Yet many have yet to see any return on that investment, and VR, AR and XR applications have failed to take off.

Current cellular technology is blamed for being too slow to deliver the high data rates and low latency required to stitch augmented and virtual reality together in real time.

“6G’s latency and speed enhancements will answer these issues,” says Philippe Guillotel, a scientist investigating machine learning for video processing at InterDigital. “There will be the opportunity to offer new experiences that create more value for audiences, but understanding how people will use and interact with digital environments will be key to 6G’s success.”

His colleague is more circumspect. “Operators are taking a more pragmatic approach to investment because there are multiple technology enablers already in 5G that have not been deployed yet, or for which no business case has yet been identified,” says Mourad. “But that doesn't mean that in the next couple of years, as 5G matures, these will stay on the shelf. We are already seeing advanced use cases pick up, with XR being one of them.”


 

