Monday, 15 September 2025

Sony Showcasing Growing Spatial Awareness & More at IBC

RedShark News


Sony’s IBC exhibit was, as ever, packed with tech for all corners of the industry, from enterprise-level cloud media via Ci to virtual production and content verification with the PXW-Z300. RedShark was also intrigued by a small but busy demo tucked away at the back of the stand featuring a monitor for viewing pseudo-holographic content.

The Spatial Reality Display (ELF-SR2), first launched over a year ago, seemed at that time to target industrial applications, with a cost (over $4,000) to match. Plus, you’d need serious GPU power to drive it.

However, on the Sony stand it was shown paired with a pair of VENICE Extension System Minis. This new accessory for the VENICE 2 was announced earlier this year and has a footprint the size of an average smartphone.

“We’ve already had users put our original VENICE Mini on helmets or the front of motorbikes for cool action videos, but the Extension Mini is about four times smaller so we thought it would be cool to put two together,” Sony’s Stuart Newton told RSN. “In fact, when we paired them side by side, the distance between the sensors [the interocular] is perfect for 3D stereoscopic capture.”

The monitor at the show was screening previously recorded 3D content, but Newton said you can connect two Extension Minis and live stream 3D to the monitor. “This would be ideal for creating content for something like Apple Vision Pro. Primarily we expect people to use the Extension Mini in singles, but you can also put these cameras in an array and do full 360 with each Mini recording at 8K.”

Now available in two sizes (27-inch and 15.6-inch), the 4K LCD Spatial Reality Display is built with a micro-optical lens over the screen which divides the image between your left and right eyes, giving an auto-stereoscopic (3D) experience. A proprietary high-speed sensor follows your eye movement “down to the millisecond”, sensing pupil position which an algorithm then crunches to process content for each eye “without lag”.
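The pipeline described above can be sketched in a few lines: track the viewer's pupils, derive a camera position for each eye, render two views, and interleave them column-by-column to line up with the micro-optical lens layer. Everything below is an assumption for illustration (the interocular constant, the column interleave, the function names); Sony's actual algorithm is proprietary.

```python
import numpy as np

# Hypothetical sketch of an eye-tracked auto-stereoscopic pipeline.
# The column interleave and 63 mm interocular default are illustrative
# assumptions, not Sony's implementation.

def eye_positions(pupil_midpoint, interocular=0.063):
    """Derive left/right eye positions (metres) from a tracked midpoint."""
    x, y, z = pupil_midpoint
    half = interocular / 2.0
    return (x - half, y, z), (x + half, y, z)

def interleave_columns(left_img, right_img):
    """Alternate pixel columns: even columns -> left eye, odd -> right eye,
    matching a lenticular-style lens that steers each column to one eye."""
    out = left_img.copy()
    out[:, 1::2] = right_img[:, 1::2]
    return out

# Toy frames: all-zeros for the left view, all-ones for the right view.
left_view = np.zeros((4, 4))
right_view = np.ones((4, 4))
panel = interleave_columns(left_view, right_view)
```

In a real renderer the two eye positions would feed two view matrices per frame, which is why the display demands serious GPU power.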

“The spatial content market is starting to take shape,” Newton added. “I think we’ll see a lot of new spatial content from VR/AR devices coming out and a lot more 3D content for movies.”

An all-seeing piece of kit

There was a European outing for the Ocellus camera tracking system, Sony’s first such product, which was released ahead of NAB this year.

It provides marker-free camera tracking through five image sensors and comprises a sensor unit, a processing box, and three lens encoders. The sensor unit is small and fits directly on top of the camera body.

“Traditional camera tracking works by tracking physical dots in a space to calibrate camera movement,” Stuart Newton explained. “This marker-less system uses sensors to create a point space that maps out the entire space, indoors or outdoors.”

It does this using Sony's Visual SLAM (Simultaneous Localisation and Mapping) technology.

“You could attach the module to a small VENICE Mini and move the camera out of sight so some of the sensors are blocked from seeing the full field of vision, but since it’s already mapped out the space, it still knows where it is.

“It's basically an all-seeing piece of kit.”

While it can be used with non-Sony cameras, it is optimised for a Sony chain because its hardware enables the throughput of metadata (including focus, iris and zoom data) from the camera to external devices via Ethernet while shooting.

If the lens does not support metadata acquisition through the camera, lens encoders can be added to the camera to obtain it. The metadata is necessary for virtual production and AR.

Newton said Sony is working on making the calibration easier to configure.

“The good thing about this is that it will work with the whole range of Sony cameras including studio cameras for sports or news or you could even do augmented reality outdoors as well.”

An adjacent demo showed how the system simplifies and automates match-moving by using camera trajectory data from Ocellus together with software from Sony’s Virtual Production Tool Set (version 3 of which is slated for release this winter). New features include viewing-angle colour correction, ray-tracing acceleration, and calibration for third-party cameras.

Recording evidence of authenticity

The new Sony PXW-Z300 was announced in March as the world’s first camcorder to embed C2PA digital signatures for recording evidence of content authenticity.

At IBC, Sony was showing how it could help broadcasters to verify footage by displaying the digital signature information compliant with the C2PA standard.
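The principle behind that verification can be sketched with standard-library primitives: the camera computes a cryptographic digest of the footage at capture time and signs it, and a verifier later recomputes the digest and checks the signature, so any post-capture edit is detectable. C2PA itself uses X.509 certificates and a structured manifest; the HMAC below is a stand-in so the example stays self-contained, and the key name is invented.

```python
import hashlib
import hmac

# Illustrative stand-in for C2PA-style authenticity checking.
# Real C2PA uses asymmetric signatures over a manifest; an HMAC with a
# shared key is used here only to keep the sketch runnable with stdlib.
SIGNING_KEY = b"camera-signing-key"  # hypothetical; stands in for a private key

def sign_content(content: bytes) -> bytes:
    """Digest the footage and sign the digest (as a camera would at capture)."""
    digest = hashlib.sha256(content).digest()
    return hmac.new(SIGNING_KEY, digest, hashlib.sha256).digest()

def verify_content(content: bytes, signature: bytes) -> bool:
    """Recompute the digest and check the signature (as a broadcaster would)."""
    digest = hashlib.sha256(content).digest()
    expected = hmac.new(SIGNING_KEY, digest, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

clip = b"original footage bytes"
sig = sign_content(clip)
```

An untouched clip verifies; flipping even one byte after capture makes verification fail, which is what lets a newsroom trust the provenance of incoming footage.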

The PXW-Z300 incorporates an AI-processing unit and an image processing engine for human subject recognition based on face, eye, skeletal structure, and posture information. It also features an auto-framing function that automatically adjusts composition to keep human subjects centred in the frame.

Live Streaming

The TX1 enables live streaming from Sony cameras. Co-developed by Sony and LiveU, it is due for release in 2026.

The LiveU TX1 supports resilient bonded transmission for faster data transfers using multiple network connections. It enables automatic file transfers simply by connecting the device to a camera via USB. In addition to USB connectivity, SDI support is also planned, enabling compatibility with a wide range of Sony camcorders.
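The bonded-transmission idea above can be illustrated simply: split the stream into sequence-numbered chunks, spread them round-robin across the available links, and reassemble by sequence number at the receiver, so the order and mix of links doesn't matter. The chunk size, link count, and function names are invented for illustration; LiveU's actual bonding protocol is proprietary.

```python
# Minimal sketch of bonded transmission: fan chunks out over several
# links and reorder them on arrival. Purely illustrative assumptions,
# not LiveU's protocol.

def bond_split(payload: bytes, links: int, chunk: int = 4):
    """Assign sequence-numbered chunks to links round-robin."""
    chunks = [payload[i:i + chunk] for i in range(0, len(payload), chunk)]
    return [(seq, seq % links, data) for seq, data in enumerate(chunks)]

def bond_reassemble(packets):
    """Reorder by sequence number, regardless of which link delivered it."""
    return b"".join(data for _, _, data in sorted(packets))

payload = b"live video transport stream"
packets = bond_split(payload, links=3)
# Even if packets arrive out of order across the three links,
# reassembly by sequence number recovers the original stream.
restored = bond_reassemble(reversed(packets))
```

The benefit is resilience as much as speed: if one link stalls, its chunks can be resent over the others without interrupting the stream.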

 

