AV Magazine
Immersive experience venues from the Illuminarium to Cosm Hollywood Park are changing the face of live events, but the Madison Square Garden Sphere promises to top the lot with a whole new entertainment medium.
The $1.9 billion
building is set to become the largest spherical structure in the world when it
opens in the second half of 2023 at The Venetian, Las Vegas. The multi-use
20,000-seat venue will house a 160,000 sq ft display plane that wraps up, over
and around the audience, while the exosphere will be coated with a 580,000 sq
ft display, both of which are programmable. With over 170 million pixels it
will be the highest resolution LED screen in the world at 16K x 16K.
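As a back-of-envelope illustration (not an official figure), the quoted pixel count can be roughly cross-checked from the screen area and the 9.4mm nominal pixel pitch mentioned later in the article:

```python
# Back-of-envelope cross-check of the interior display's pixel count.
# Assumes one pixel per pitch-squared cell at the 9.4mm nominal pitch
# quoted later in the article; real panel layouts vary.
SQFT_TO_M2 = 0.09290304

area_m2 = 160_000 * SQFT_TO_M2   # ~14,864 m^2 of display plane
pitch_m = 0.0094                 # 9.4mm nominal pixel pitch
pixels = area_m2 / pitch_m**2

print(f"{pixels / 1e6:.0f} million pixels")  # ~168 million
```

That lands in the same ballpark as the stated "over 170 million pixels"; the gap is within the 10 per cent pitch variation the fabricators describe later in the piece.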
“This is being
driven by James’ (Dolan, Executive Chairman and CEO of MSG) vision to achieve
the ultimate in immersive experience,” says Andrew Shulkind, senior
vice-president, capture and innovation at MSG Entertainment. “This is not just
about having a bigger screen or the sharpest resolution, though it is that. It
is about marshalling every cutting-edge technology to create an entirely new
platform for immersive entertainment.”
That required
developing a set of content creation tools and a production workflow for this
unique presentation format, which will be retained in-house and put at the
disposal of artists, creatives and performers. The 16K output resolution is
essentially made by combining a number of different video feeds together.
“At 16K x 16K the
screen is higher resolution than any single camera can capture,” Shulkind says.
“We’re using camera arrays and applying techniques like tiling, uprezzing and
stitching to deliver full resolution and colour fidelity to the canvas.”
Hyper-lapse (moving
time-lapse) arrays required a custom-built motion control rig to shoot
repeatable, frame-accurate moves.
All this kit is taken to location (a forest, a race track, a canyon) and
mounted on technocranes or helicopters, strapped to an arm on a boat or on a
Six Flags rollercoaster, meaning a key design consideration is to keep it as
lightweight as possible.
“The 180-degree
field of view and resolution is so unforgiving the final image has to be
seamless,” he says. “There’s no tolerance for artefacts or faulty stitching.”
The display is tilted
at 55 degrees and the seats recline 12.5 degrees so the primary viewing is
straight ahead, but the screen is so vast framing must account for many
different perspectives. “Even if most people are looking at one part of the
canvas for most of the time, it is the periphery of their vision that provides
so much more information.”
Testing, testing
Testing has been taking place at MSG Sphere Studios in Burbank, which is a
quarter-scale prototype of the Las Vegas Sphere.
“One critical thing
we’ve been exploring is the relation of the lens we use in the field to the
field of view in the screen that audiences see. Field of view is more pertinent
than focal length. We have been doing a lot of experimentation to see what
works best to achieve specific psychological effects. In some cases, the best
field of view for wide landscapes is a 165-degree lens, but some people and
faces are better at 120 degrees. There have been many surprises and there are
no hard rules yet.”
“Also, in a concert
venue the best seats are up close but here the closest permanent seat to the
screen is 160 ft away. That doesn’t mean we don’t seat the floor, but we want
every seat in the house to be the best seat in the house.”
At the outset,
Sphere attractions will be less CG and fantasy and deliberately grounded in
what feels natural – epic landscapes, for example. It has even signed with NASA
to send an MSG Sphere camera system to the ISS and record the Earth from space.
Esports and boxing
could also be staged, and the venue is timed to open ahead of the Las Vegas F1
GP, whose track surrounds the Sphere. During the race F1 will take over
the exosphere display for race-related content.
“These are the kind
of creative opportunities we give to partners,” Shulkind says. “We present them
with the technology and learnings we developed in-house and ask them how to make
the best content for the Sphere.”
3D spatial audio
Developed in partnership with Berlin-based Holoplot, Sphere will house a custom
spatial audio system – Sphere Immersive Sound – using 1,600 permanently
installed speaker panels each composed of approximately 100 individual
speakers, resulting in over 160,000 channels of audio.
These speaker
panels sit behind the LED media plane, which is acoustically transparent for
“concert grade” audio – a feat of audio engineering that has been extensively
modelled and tested.
“A sphere is probably the worst choice for a concert venue in terms of acoustics since you’ve got to figure out a technology that directs the energy on to the audience and away from the spherical structure to avoid echoes and reflections,” says Stuart Elby, MSG’s senior vice-president, Advanced Engineering. “We knew we were going to use beamforming to provide the optimal sound mix for every audience member, no matter where they’re seated. But we also wanted to section the seats down to very small groups to be able to programme the experience of sitting there differently.”
Wave field synthesis
Research led them to Holoplot, a startup which in 2017 was using wave field
synthesis to improve speech intelligibility at train stations.
“They had that
nucleus of understanding in the algorithm of how they could do this. Our
collaboration started with that seed and we expanded it into large scale
concert grade audio.”
For example, an
audience member could hear a whisper that sounds like someone is talking
directly in their ear. Guests sitting in different sections can hear different
sounds (languages, instruments) – expanding the possibilities for customised
audience experiences.
“A lot of the work
we’re doing is creating a guide for what works and what to avoid for when
artists and engineers programme this space,” says Elby.
The seats include
haptic devices which allow artists to direct frequencies ranging from ultra-low
rumbles to pitches of 500Hz directly to individual seats, creating an
additional layer of immersion beyond the audio and video.
A proprietary wind
system has been devised to enhance the illusion of being in the natural world.
It can be adjusted from an idle breeze “that you’d hardly notice” to gale force
winds.
“We’ve quite a bit
of IP around real wind effects,” says Elby. “The same system is used to
disperse scent. We’ve tested it at scale to ensure the scent gets to the
distances we need.
“We’re not trying
to totally overload your senses like a five minute theme park ride. We wanted
more of a range than your traditional butt shaker.”
Driving content
management and video playback – including video processing of interior and
exterior displays – is technology from 7thSense, in a continuation of a
decade-long partnership with MSG. It has built a metadata pipe from production
to screen to automate the workflow.
“We’ve spent a long time working to develop a very reliable, very high performance, super high bandwidth network storage solution,” explains Rich Brown, CTO, 7thSense. “The magnitude of pixels needing to be rendered efficiently is extremely challenging. This is the biggest, highest spec display we’ve undertaken and we’ve made a huge step toward the future with network-driven architecture.”
Playback and data management
The chief components from 7thSense are banks of Delta media servers; Juggler,
its FPGA-based pixel processor, which plays back live and recorded camera feeds
at low latency; and a new Generative product that integrates game engines into
the live workflow. The whole network at Sphere is driven by ST 2110 – the
broadcast industry ‘standard’ for streaming media around a facility.
“We’re streaming
multi-layer 12-bit, 60fps uncompressed media,” says Brown. “From day one that
was a requirement. Every single pixel has to be represented correctly. We also
need to add broadcast trucks and live camera feeds and mix it all live so we
wanted to stick with one standard.”
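To see why uncompressed 12-bit 60fps demands that kind of network, consider the payload rate of a single UHD feed. This is illustrative arithmetic only; actual ST 2110 streams add packetisation overhead:

```python
def uncompressed_gbps(width, height, bits_per_sample=12,
                      samples_per_pixel=3, fps=60):
    """Raw video payload rate in Gbit/s, ignoring ST 2110 packet overhead."""
    return width * height * samples_per_pixel * bits_per_sample * fps / 1e9

# One 4K (UHD) RGB feed at the quoted 12-bit, 60fps:
print(f"{uncompressed_gbps(3840, 2160):.1f} Gb/s")  # ~17.9 Gb/s per feed
```

Tiling many such feeds toward a 16K x 16K canvas, plus live truck and camera sources, is what pushes the storage and network into the territory Brown describes.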
Screen control and
synchronisation with audio and haptics are triggered by a bespoke show control
system that sits above 7thSense – another part of MSG’s unique configuration.
Making a spherical
screen of such size and target resolution was a major engineering hurdle for
MSG and Montreal-based LED fabricator, Saco.
Explains Alex
Luthwaite, MSG’s vice-president, Show Systems Technology: “Traditionally your
pixel pitch is an X/Y spacing in a grid arrangement (e.g. 4mm x 4mm) but you
can’t do that with a compound curve surface. You pick either X or Y as a fixed
aspect and adjust the other. You essentially have to lose diodes on one axis to
make a curved screen.”
Consequently, the
Sphere’s pixel pitch is 9.4mm, but this fluctuates by 10 per cent across the
surface so the image still resolves while dropping pixels. The plane is divided
into facets, the biggest of which is 15 sq ft. Each facet is itself composed of
different tile and panel types.
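The geometry behind "losing diodes on one axis" can be illustrated with some simple spherical arithmetic. The radius here is an assumed figure for illustration, not a published Sphere dimension:

```python
import math

def pixels_per_ring(radius_m, lat_deg, pitch_m=0.0094):
    """Pixels that fit around a horizontal ring of a sphere at a given
    latitude, keeping the quoted 9.4mm pitch along the ring."""
    circumference = 2 * math.pi * radius_m * math.cos(math.radians(lat_deg))
    return int(circumference / pitch_m)

R = 78.0  # assumed interior radius in metres, for illustration only
for lat in (0, 30, 60):
    print(f"{lat:>2} deg: {pixels_per_ring(R, lat):,} pixels")
```

Each ring away from the equator is shorter, so it holds fewer pixels at the same pitch: a fixed X/Y grid is impossible, and one axis keeps its spacing while the other sheds diodes.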
“When you cover a spherical structure in LED you’ve got to be careful you don’t get colour shift as you move from one side to the other. This is heightened by the sheer scale and shape of the structure. We didn’t want to look at the bottom and see one colour and the top to find another.”
All hidden behind the screen
To avoid visual occlusion, all rigging, speakers and lights, and LED
components such as chips, IC drivers and power supplies are hidden behind the
screen. The hardware also had to be positioned so as not to interfere with
audio waves passing through, to ensure the screen is at least 50 per cent
acoustically transparent.
“The screen is very
bare. We’ve stripped every single piece of unessential hardware and plastic off
the screen. The less material, the less resonance and occlusion, both of which
are detrimental to audio performance. We studied this in anechoic chambers,
sound booths and scale tests.”
If the interior is
somewhat more of a traditional LED screen, the exterior presented new
challenges, not least to be heat and UV resistant. It’s composed of 1.4 million
80mm ‘pucks’ (each a single pixel) containing 48 diodes apiece.
The team faced
similar challenges in terms of the pixel pitch except this time on a convex
surface so that the pixels don’t appear wider at the equator than they are at
the poles. The solution was to arrange the diodes in each puck in a clock-like
fashion and in rings to scatter the light.
“The building has a
helical metal surface which we had to be sympathetic to. We made mega panels,
trapezoid in shape, to triangulate the exterior, interlacing the pixels with a
225 mm offset spacing. This increases the perceived resolution; the image
resolves at a closer viewing distance as a result.
“Certain elements
have been done on other projects but never to this scale or complexity, and
never all in one place,” he adds. “With such a large exterior, one that is even
visible from flight paths overhead, we want no single point of failure. We have
to make sure there are redundant data paths all over the surface.
“Immersive is a bit
of a buzzword but I don’t think anyone will do it to this degree of detail and
scale.
It might be cool to put VR goggles on but you are isolated from everyone else.
This is a shared experience.”