Tuesday, 11 November 2025

Touch the future: Immersive video will soon make its presence felt

IBC

As XR devices become more accessible and 6G wireless systems emerge, we’ll move from simply watching video to stepping inside it.


Imagine a world that fuses the digital, physical, and human to create revolutionary immersive experiences. Ericsson calls it The Internet of Senses. Nokia describes “a new world of sixth-sense applications”, and European standards body ETSI talks of the ubiquitous communications network acting as a “radar” to sense and comprehend the physical world.

The dawn of 6G

Video codec and mobile standards developer InterDigital thinks that the world is on the verge of stepping inside video. It forecasts that, with the arrival of 6G, we will experience the coming together of machines, ambient data, intelligent knowledge systems, and new computation capabilities. 

According to Nokia: “One striking aspect of that will be the blending of the physical and human world, thanks to the widespread proliferation of sensors and AI/ML combined with digital twin models and real-time synchronous updates.”

6G is expected to launch commercially by 2030, with an initial specification release planned for 2028. Included in that release is Integrated Sensing and Communication (ISAC), a technology considered to have huge potential. ISAC lets the network become a source of situational awareness by collating signals that bounce off objects, collecting data on the range, velocity, position, orientation, size, shape, image, and materials of objects and devices, expanding the network’s functionality beyond communication alone.

There are 32 potential use cases for ISAC listed in the technical report from the mobile specification group 3GPP. Among them is the ability to build digital representations of the physical world, a so-called digital twin. For example, a digital twin could incorporate a player’s physical environment into an extended reality game. 
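To make the digital-twin idea concrete, the sketch below shows how per-object sensing reports carrying the attributes listed above (range, velocity, position, orientation) might be folded into a simple twin state. The 3GPP report lists use cases, not interfaces, so the class names and update logic here are purely illustrative assumptions.

```python
# Illustrative only: 3GPP lists what ISAC could sense but defines no API.
# These classes and the update logic are assumptions for this sketch.
from dataclasses import dataclass, field

@dataclass
class SensingReport:
    """One sensed-object measurement, using attributes from the 3GPP list."""
    object_id: str
    range_m: float                          # distance from sensing node, metres
    velocity_mps: float                     # radial velocity, metres/second
    position: tuple[float, float, float]    # estimated x, y, z in metres
    orientation_deg: float                  # heading estimate, degrees
    timestamp: float                        # seconds since epoch

@dataclass
class DigitalTwin:
    """Holds the latest sensed state for each tracked object."""
    objects: dict[str, SensingReport] = field(default_factory=dict)

    def update(self, report: SensingReport) -> None:
        # Keep only the freshest report per object.
        current = self.objects.get(report.object_id)
        if current is None or report.timestamp > current.timestamp:
            self.objects[report.object_id] = report

# E.g. a player tracked for an XR game:
twin = DigitalTwin()
twin.update(SensingReport("player-1", 4.2, 1.1, (2.0, 0.0, 1.7), 90.0, 1000.0))
print(twin.objects["player-1"].position)    # (2.0, 0.0, 1.7)
```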

“ISAC will enable motion detection and tracking of people and objects,” says Valérie Allié, Senior Director for Media Services at InterDigital. “We will have all this sensing data that will be integrated with high video quality and ambisonic audio. That will enrich spatial computing and deliver even more exciting XR experiences.”

Analyst firm Futuresource predicts that 6G deployment will coincide with the maturing of XR hardware and software ecosystems, expected between 2028 and 2032. Ericsson likewise expects that by 2030 most of us will be using XR devices for all our communication, much as we use smartphones today.

“As we get closer to 2030 and the release of the first 6G standards, XR entertainment is going to become an expectation. We will see everything from interactive digital sports venues to real-time augmented city guides and digital twins,” says Lionel Oisel, Head of InterDigital’s Video Lab, which is based in Rennes, France. “But the success of these experiences will hinge entirely on the quality of experience – where ultra-low latency, responsive interactivity, and consistent media synchronisation are all essential to unlocking XR’s full potential.” 

Universal haptics 

The research lab also believes that haptics will play a bigger part in how we virtually experience sports, films, and TV. In contrast to visual or auditory interfaces, haptic technology is said to enhance realism by simulating the sensation of touching, grasping, or manipulating virtual objects, making digital landscapes feel more tangible.

In January 2025, the first MPEG-I Haptics Coding standard was published, paving the way for haptics to be encoded, streamed, and rendered on mobile devices, headphones, and XR headsets.

With a standardised format, haptics can now be streamed alongside audio and video in the same bitstream. It can be authored once and played anywhere across networks, devices, and platforms. In short, according to developer SenseGlove: “haptics is finally ready for prime time.” 

The idea is to encode the haptic signal just once and still enable playback on any device, rather than having to build a separate pipeline for each platform, whether from Microsoft, Sony, Apple, or cinema’s D-Box system.
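As a rough illustration of that encode-once idea, the sketch below models a single device-independent haptic track rendered differently on two hypothetical targets, a phone motor and a seat pad. The event fields and renderer functions are invented for illustration; the real MPEG-I Haptics Coding bitstream syntax is considerably richer.

```python
# Hypothetical sketch of "author once, play anywhere": one device-independent
# haptic track, mapped to whatever actuator each device actually has.
from dataclasses import dataclass

@dataclass
class HapticEvent:
    time_s: float        # position on the media timeline, seconds
    frequency_hz: float  # target vibration frequency
    intensity: float     # normalised amplitude, 0.0 .. 1.0

# A single authored track (values are made up for the example).
TRACK = [HapticEvent(0.5, 80.0, 0.3), HapticEvent(2.1, 40.0, 1.0)]

def render_on_phone(event: HapticEvent) -> None:
    # A phone's narrow-band motor: ignore frequency, scale amplitude.
    print(f"phone tap   @ {event.time_s}s amp={event.intensity:.1f}")

def render_on_seat_pad(event: HapticEvent) -> None:
    # A seat pad can reproduce low frequencies directly.
    print(f"seat rumble @ {event.time_s}s {event.frequency_hz}Hz")

for event in TRACK:  # same track, two target devices
    render_on_phone(event)
    render_on_seat_pad(event)
```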

There is a clear use case in gaming. When you play Battlefield 6 with the right haptics gear, such as a seat pad, you will experience over 170 curated effects designed specifically for the game. As Razer describes its Sensa HD Haptics platform: “You’re no longer just reacting to the fight on the screen, your body becomes part of it.”

“You've seen haptics in gaming before, but wouldn't it be cool if somebody could make a movie with haptics that you experience through your TV or on your chair?” posited Liren Chen, CEO of InterDigital.  

Philippe Guillotel, Senior Director at InterDigital and a leader of the MPEG group standardising representations of haptic data, says he is trying to convince streamers like Netflix that physical feedback will bring a new experience and added value to their content.

“Since everything is offline [on-demand], it would be easy to create content with haptics. The issue is the device. One of the reasons we are concentrating on delivering haptics to smartphones, game controllers, and especially to the headset is that most people have these. We need devices to be inexpensive to be adopted by the market.” 

“There is a creative aspect to haptics and we are engineers,” he says. “So, we need artists. We need to educate people in creative schools that haptics is a new modality. [Creatives] can learn how to do it, and they have to understand how people perceive it. Then, we will have a much better content experience.”  

Earlier this year, Apple released a trailer for F1: The Movie that synced on-screen action with the iPhone’s Taptic Engine, “making you feel the roar of Formula 1 engines.” Subtle moments, like a seatbelt snapping or a ping pong ball bouncing, trigger delicate taps, while high-speed crashes jolt your hands.
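One way to picture that synchronisation is a playback loop that fires haptic events as the media timeline passes their timestamps. The sketch below is a hypothetical stand-in, not Apple’s implementation; the event list, intensities, and polling player are all assumed.

```python
# Minimal sketch of keeping haptic events in step with playback time.
# Events and the polling loop are invented for illustration.
import time

# (timestamp_s, intensity): delicate taps vs. hard jolts
EVENTS = [(1.0, 0.2), (3.5, 0.2), (7.2, 1.0)]

def play_with_haptics(events, duration_s=8.0):
    start = time.monotonic()
    pending = sorted(events)                # earliest event first
    while time.monotonic() - start < duration_s:
        position = time.monotonic() - start  # current playback position
        while pending and pending[0][0] <= position:
            ts, intensity = pending.pop(0)
            kind = "jolt" if intensity > 0.5 else "tap"
            print(f"{kind} at {ts:.1f}s (intensity {intensity})")
        time.sleep(0.01)                     # ~100 Hz polling

play_with_haptics(EVENTS)
```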

New video codec underway 

InterDigital is also competing for its technologies to be included in a new video codec, currently under joint development by ISO/IEC and ITU-T as the successor to the MPEG standard Versatile Video Coding (VVC). The new codec, H.267, is intended to be more bandwidth-efficient than VVC without increasing complexity on the decoder side.

A call for proposals is currently out to the industry. Responses will be evaluated in January 2027, followed by a standardisation phase, with the final standard scheduled for release in 2029.

In early testing, InterDigital claims its technologies have demonstrated performance gains averaging 25% over VVC, with some tests showing gains of double that.
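To put those numbers in context, a 25% compression gain means delivering the same quality at roughly a quarter less bitrate. The back-of-envelope sketch below assumes an illustrative 20 Mbps VVC stream; that figure is not from the article.

```python
# What a compression gain means in practice: same quality, lower bitrate.
# The 20 Mbps baseline is an assumed example rate for a 4K VVC stream.
vvc_bitrate_mbps = 20.0
for gain in (0.25, 0.50):   # the average gain, and the doubled best case
    h267_bitrate = vvc_bitrate_mbps * (1 - gain)
    print(f"{gain:.0%} gain: {vvc_bitrate_mbps} Mbps -> {h267_bitrate} Mbps")
```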

The target for H.267 is to deliver improved compression efficiency, reduced encoding complexity, and enhanced functionalities, such as scalability and resilience to packet loss. 

“It's a real big challenge and a great opportunity to develop new ideas, patents, and algorithms,” said Edouard Francois, Senior Director and 2D Codecs Lead at InterDigital. “In particular, we are exploring how AI can be used in synergy with traditional video compression methodologies.”

Other groups likely to respond include Nokia, Ericsson, Fraunhofer HHI, and MediaTek. Oisel explains: “This standardisation period will determine which tools are adopted (therefore licensable). To do that, you have to prove that it delivers huge gains and also that you don't have high complexity. The issue with AI tools is that they put the complexity on the decoder side, which is something that chip makers like Broadcom will fight against because they don’t want to add complexity to their hardware. If you come with a tool with huge gain but also huge complexity, then this won’t be selected.”  

 

