The video industry in general, and sports in particular, is going multi-modal. The picture and the audio alone are no longer enough. The industry has pushed resolution, dynamic range and audio fidelity to the point where further enhancements are no longer monetisable. Attention is turning to new technologies which promise greater immersion in the game.
“2D video is good. Everyone knows that, but if we want to add value to the experience we might want to bring additional media and we also want to bring interactivity,” says Gaelle Martin-Cocher, Senior Director, Media Systems at InterDigital. “The passive experience is good but younger generations demand more.”
InterDigital, the US-headquartered research lab, develops the technologies that underpin many of today’s video compression and wireless communications standards. A key focus now is how the sensations of touch, force, motion and temperature can enrich immersive experiences.
“The goal is to enable interoperable transmission and rendering of tactile sensations across devices and platforms, complementing visual and auditory media,” says Philippe Guillotel, Senior Director at InterDigital and a leader of the group in MPEG that is standardising representations of haptic data.
SVG Europe were given a tour of the company’s European HQ in Rennes, France, during which its experts presented a number of demos. One potential use of haptic signals is to leverage data from pressure sensors in the balls used during the FIFA World Cup 2026 to send vibrations to mobile devices each time the ball is kicked or a goal is scored.
“It’s a very interesting use case and the data would be straightforward to collect,” Guillotel says. “I hope they will use it.”
Another scenario is for Formula One. Guillotel explains, “There are already sensors detecting motion, acceleration and deceleration in the cars, and you can use this information to enhance the feeling of taking a chicane or overtaking. In basketball, when someone barges into you, you could receive that feedback.
“The easiest way to do it would be to have a library of effects ready to go and to have someone push a button for the appropriate effect when an event like a foul or a goal happens during the live broadcast.”
It would also be possible to automate this process using computer vision. Sensory experiences could potentially be conjured from anywhere in the stadium or on the pitch such as inside a rugby scrum or next to a high diver 10 metres above the pool.
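As a rough sketch of the operator-triggered approach described above, the Python below maps match events to pre-authored vibration patterns; the event names, pattern values and the send_to_devices delivery function are hypothetical placeholders rather than anything from InterDigital or the MPEG specification.

# Hypothetical sketch: pre-authored haptic effects keyed to match events,
# triggered by an operator button push (or by an automated detection system).
# Each pattern is a list of (duration_ms, intensity 0.0-1.0) steps.

EFFECT_LIBRARY = {
    "goal":     [(80, 1.0), (40, 0.0), (120, 0.8)],   # strong double pulse
    "foul":     [(200, 0.6)],                          # single firm buzz
    "kick":     [(30, 0.4)],                           # short light tap
    "overtake": [(60, 0.5), (60, 0.9)],                # rising two-stage pulse
}

def trigger_effect(event_name, send_to_devices):
    """Look up the pre-authored pattern for an event and push it to viewers."""
    pattern = EFFECT_LIBRARY.get(event_name)
    if pattern is not None:
        send_to_devices(pattern)

# Example: the operator presses the "goal" button during the live broadcast.
trigger_effect("goal", send_to_devices=lambda p: print("sending pattern:", p))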
Haptics standards
In October 2021, MPEG officially recognised haptics as a core media type, placing it on equal footing with audio and video. Finally published in January 2025 as ISO/IEC 23090-31, the new MPEG-I Haptics Coding standard unifies the encoding of vibrotactile and kinesthetic data, enabling interoperable, high-fidelity tactile feedback across XR, gaming, and broadcast.
This paves the way for haptics to be encoded, streamed, and rendered within the same ecosystem that powers today’s media experiences — from mobile devices to cinema and XR headsets.
“We have developed standards to describe the haptic information and distribute it using traditional broadcast or streaming formats including DASH, the MP4 file format and CMAF. Establishing a consistent approach to the delivery of haptics is important to the technology’s adoption across immersive entertainment.”
Any device implementing the standard interface (gloves, vests, gamepads, or robotic actuators) can decode or adapt tactile data using the metadata-driven mapping layer.
The architecture supports very low bit-rate (about 8 kbps) synthetic effects, meaning that a haptic track is a very lightweight addition to the payload. MPEG is now extending the framework to cover interactive haptics, avatars and object-based touch interactions, forming the next phase of the Haptics standard for advanced XR applications.
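To put the 8 kbps figure in perspective, a back-of-the-envelope comparison against a typical video stream illustrates how small the addition is; the 5 Mbps reference bitrate below is an assumption for illustration, not a figure from the standard.

# Rough overhead of a synthetic haptic track relative to the video payload.
# The ~8 kbps haptic figure is from the MPEG-I Haptics description above;
# the 5 Mbps video bitrate is an assumed, illustrative streaming rate.

haptic_kbps = 8
video_kbps = 5_000

overhead_pct = 100 * haptic_kbps / video_kbps
print(f"Haptic track adds roughly {overhead_pct:.2f}% to the stream")  # ~0.16%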
A key question is whether sports fans want this.
Guillotel says that feedback from tests has been positive but that it’s difficult to judge when the technology is so new. “I’m convinced that it will be something people will use but it’s too early to know. The issue is the device. One of the reasons we are concentrating on delivering haptics to smartphones, game controllers and especially to the headset is that most people have these and are comfortable with them. We need devices to be inexpensive to be adopted by the market.”
Further demos on show at InterDigital included real-time delivery of volumetric video combined with haptics. The video would be captured from a multi-camera array surrounding the athlete and presented against a CG environment such as a sports stadium. Viewable on 2D displays or in VR headsets, the application is considered useful for training, with coaches and athletes able to play back, pause and interact with the 3D video.
Making presence felt
Haptics are part of a wider concept emerging in media which aims to immerse viewers in the sensation of being present.
Valérie Allié, Senior Director for Media Services at InterDigital, says, “Imagine the year is 2032 and you are preparing to watch your country’s athletes compete at the Brisbane Olympics. Maybe you personalise your perspective to replay a winning goal or use AR overlays to cross the finish line with your favourite athletes. This potential to deliver high-quality, immersive content, whether through avatar interactions that will enrich in-stadium experiences, or live 3D immersive content, and the ability to sense this experience thanks to haptic feedback, all depends on advanced compression capabilities and new media formats.”
In this future it’s not only cameras that will capture content, but also Integrated Sensing and Communication (ISAC), which will enable any device to act as a ‘radar’ gathering information about the surrounding environment. This will enable application providers to build detailed and accurate digital twins, or real-time replicas, of an environment.
“Using sensing data gathered from devices around the pitch, the coach's bench, or any other area in this arena, we can create dynamic digital replicas of stadiums full of players, of spectators, and other activities,” says Allié.
“ISAC will enable motion detection and tracking of people and objects. We will have all this sensing data that will be integrated with high video quality and ambisonic audio that will enrich spatial computing and deliver even more exciting XR experiences.”
AI will help to optimise network resources to deliver hyper-personalised content. This could include immersive player tracking, overlaid on the live game feed to provide detailed stats on a player such as speed, stamina and positioning.
Commercial 6G network launches are being primed for 2030, according to mobile industry standards body ETSI. That means the Brisbane Games could be the first major event to enjoy the benefits of the next-generation network, theoretically delivering speeds up to 100 times faster than 5G, with data rates as high as one terabit per second and latency measured in microseconds. At this point the ability to merge a detailed digital twin with its physical counterpart becomes viable.
Energy aware streaming
However, more streaming to more devices comes at an environmental cost. InterDigital research found that streaming the Paris Olympics consumed about 1.25 terawatt hours of electricity, roughly enough to power 400,000 European homes for a year.
Its solution is Pixel Value Reduction (PVR), an algorithm applied to the content at source (such as at the CDN) to manage pixel brightness and lower the energy consumption of any display without impacting visual quality. It is claimed to yield up to 15% savings in display power consumption, with brightness reductions small enough that viewers cannot perceive them. Had PVR been applied to all televisions streaming the 2024 Olympics globally, it could potentially have saved enough energy to power 12,000 European homes for a year.
PVR is part of the Green MPEG standard specification (of which InterDigital is a co-author) planned for the end of 2025.
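The precise PVR algorithm is not described in detail here, but the basic idea can be sketched in a few lines of Python; the uniform 5% attenuation below is an assumed placeholder, whereas the real algorithm is content-adaptive.

import numpy as np

# Illustrative sketch only, not the actual PVR / Green MPEG algorithm.
# Pixel values are attenuated by a small factor the viewer is unlikely to
# notice; on emissive displays (e.g. OLED), power draw roughly tracks the
# average pixel level, so dimmer pixels mean lower energy consumption.

def reduce_pixel_values(frame, reduction=0.05):
    """Apply a uniform brightness attenuation to an 8-bit RGB frame."""
    scaled = frame.astype(np.float32) * (1.0 - reduction)
    return np.clip(scaled, 0, 255).astype(np.uint8)

# Example: dim a randomly generated 1080p frame by 5%.
frame = np.random.randint(0, 256, size=(1080, 1920, 3), dtype=np.uint8)
dimmed = reduce_pixel_values(frame)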
“AI-driven resource management can increase the energy efficiency and reduce the carbon footprint of large sporting events like the Olympics,” says Allié.
New video codec under way
InterDigital is also competing for its technologies to be included in a new video codec which was officially launched this month by ISO and the ITU as a successor to the MPEG standard Versatile Video Coding (VVC). The new codec, H.267, is intended to be more bandwidth-efficient than VVC without increasing complexity on the decoder side, something chip vendors want to avoid.
A call for proposals is currently out to the industry. Responses will be evaluated in January 2027, after which a standardisation phase will lead to finalisation in late 2029.
In early tests, InterDigital claims to have demonstrated compression performance gains averaging 25% over VVC, with some tests showing gains of double that.
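Assuming the reported gain is expressed in the usual way, as bitrate savings at equal visual quality, the headroom is easy to quantify; the 20 Mbps VVC reference bitrate for live UHD sport in the snippet below is an illustrative assumption.

# Bitrate needed at equal quality if the reported savings hold.
# The 25% / 50% figures are from InterDigital's reported tests; the 20 Mbps
# VVC reference bitrate for live UHD sport is an assumed, illustrative value.

vvc_bitrate_mbps = 20.0
average_gain = 0.25    # average reported saving over VVC
best_case_gain = 0.50  # "double that" in some tests

print(f"Average case: {vvc_bitrate_mbps * (1 - average_gain):.1f} Mbps")   # 15.0
print(f"Best case:    {vvc_bitrate_mbps * (1 - best_case_gain):.1f} Mbps") # 10.0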
The target for H.267 is to deliver improved compression efficiency, reduced encoding complexity and enhanced functionalities such as scalability and resilience to packet loss.
“It’s a really big challenge and a great opportunity to develop new ideas, patents, and algorithms,” says Fabrice Le Léannec, Senior Principal Scientist, InterDigital. “In particular, we are exploring how AI can be used in synergy with traditional video compression methodologies.”