In a Hoxton warehouse a six-person undercover unit is tasked with cracking a world-changing mystery. On the dark web, a black market known as ‘Origin’ has begun selling contraband, including Covid-19 vaccines, rhino horn extract – even Brad Pitt’s DNA. Suddenly, you’re face to face with dinosaurs.
What is going on?
This is Lost Origin, a research and development project designed to push the boundaries of what’s possible in the nascent world of immersive, interactive, tech-led performance storytelling. Even that is a mouthful, but there’s no getting around the multi-disciplinary fusion of traditional and cutting-edge techniques in the effort to create a unique experience.
“The objective was to create an immersive collaborative experience,” explains Maciek Sznabel, who oversees the creative implementation of immersive projects at project lead, Factory42.
“Virtual Reality experiences can be solipsistic. You can experience VR alone but you cannot share the experience with others. The aim of Lost Origin was to design an interactive experience that people can enjoy together and, to achieve that, we decided to mix theatrical performance with mixed reality technology.”
The project is funded within the Audience of the Future programme by UK Research and Innovation through the Industrial Strategy Challenge Fund. It is the culmination of a series of R&D ‘outputs’ which explore how audiences of the future will enjoy new forms of entertainment and visitor experiences.
Previous ‘outputs’ included a VR project featuring Sir David Attenborough, co-produced by Factory42 with Sky – one of the partners for Lost Origin alongside the Almeida Theatre in Islington, the University of Exeter and MR wearables developer, Magic Leap.
“We began with weekly meetings of all the creative groups and it quickly became apparent that nobody had done anything like this before,” says Sznabel. “We had people with huge experience in film, TV, animation, theatre and interaction. On the one hand we had to apply this knowledge and on the other hand forget what we know, be open-minded and think another way.”
The process started from the basics, such as how many rooms they would need, how many people those rooms would hold, and what the throughput would be. “We slowly started to create a vision for the physical space and the software simulation.”
Sets were built out of cardboard boxes, starting with just a single box, and gradually building up and changing the design. “You couldn’t build the set first and then fit everything around it. This was a constant process of iteration, of trial and error. The starting point was – let’s put this box on the floor and see what we can do with it in AR.”
The storyline quickly took shape as involving elements of science (dinosaurs) and mystery (to spook the players), but it too had to be kept loose, evolving with all the other pieces, including the animation, which typically takes the longest to create. “The process was quite similar to designing a game in the sense that we have computer animation and narrative joining together,” adds Sznabel.
Only this time they had the added layer of real-world sets through which players would move and interact with objects both animated and actual (lights, audio). “It took maybe a year of things moving from paper to software tests and animatics – to test, change and retest – constantly working out how we want people to interact, how many interactions there should be, and what the players experience.”
Iterating to outcome
The final version is just one of many possible iterations. A smaller-scale version was tried out at the Metro Centre in Gateshead, where Magic Leap enabled people to interact with virtual objects like a dinosaur or robot. From this they learned that an original plan to have three large rooms, each with a different application of Magic Leap, wasn’t going to be possible, in part because of the limits of the hardware.
“Magic Leap gave us rules for how to optimise the technology such as not making a virtual object larger than five metres but we wanted our experience to have scale,” says Sznabel.
Another plan was to give each player the chance to have different experiences in Magic Leap by finding and picking up Easter Eggs in the corners of rooms, but this too was shelved when Covid forced the team to reduce the production’s complexity. Instead, two of the spaces used in the final project rely on more traditional (albeit still advanced) technologies with the AR glasses reserved for a big ‘reveal’ toward the end.
The technologies were kept separate to minimise complexity.
Interactive projections were used principally in the Journal Room. This is the point in the experience where things get surreal; the walls of the room look like they’re made from pages of a crumpled journal. Three Intel RealSense cameras were used to capture live depth and spatial data from the Journal Room, as seen from a variety of angles. The data generated from these cameras was then transferred to NuiTrack AI, a middleware package which allowed the team to locate and track the six human figures from each field of view.
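For a sense of the capture layer, the sketch below shows roughly how depth frames are pulled from a RealSense camera using Intel’s pyrealsense2 SDK. It is a minimal illustration, not the production pipeline: the resolution and frame rate are assumptions, and in Lost Origin frames like these were handed on to NuiTrack rather than inspected directly.

# Minimal sketch: reading depth frames from an Intel RealSense camera
# with the pyrealsense2 SDK. Resolution and frame rate are assumptions.
import numpy as np
import pyrealsense2 as rs

pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)
pipeline.start(config)

try:
    for _ in range(30):  # grab roughly one second of frames
        frames = pipeline.wait_for_frames()
        depth_frame = frames.get_depth_frame()
        if not depth_frame:
            continue
        # Depth image as a NumPy array; a skeleton tracker such as
        # NuiTrack consumes frames like this to locate human figures.
        depth_image = np.asanyarray(depth_frame.get_data())
        centre = depth_frame.get_distance(320, 240)  # metres
        print(depth_image.shape, f"centre distance: {centre:.2f} m")
finally:
    pipeline.stop()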
“We used this human pose tracking data with our bespoke motion analysis tools, which allowed us to identify and respond to specific movements and gestures in realtime,” says Mike Golembewski, Factory42’s interactive designer and developer.
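The motion analysis tools themselves are bespoke and unpublished, but the general pattern is straightforward to sketch: watch a tracked joint over a short window of frames and flag a gesture when its accumulated movement crosses a threshold. Everything below – the class, thresholds and joint values – is a hypothetical illustration, not Factory42’s code.

# Illustrative gesture analysis over tracked skeleton joints: flag a
# 'fanning' gesture when a wrist oscillates enough within a window.
from collections import deque

class FanGestureDetector:
    def __init__(self, window=15, min_travel=0.4):
        self.history = deque(maxlen=window)  # recent wrist heights (metres)
        self.min_travel = min_travel         # travel that counts as fanning

    def update(self, wrist_y: float) -> bool:
        self.history.append(wrist_y)
        if len(self.history) < self.history.maxlen:
            return False
        # Sum of frame-to-frame vertical movement across the window.
        heights = list(self.history)
        travel = sum(abs(b - a) for a, b in zip(heights, heights[1:]))
        return travel >= self.min_travel

detector = FanGestureDetector()
# Per tracked frame, feed in the wrist height from the pose data:
if detector.update(wrist_y=1.42):
    print("fanning detected -> drive the flame interaction")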
Show control
The Production Family (TPF) was responsible for implementing the show control systems used to orchestrate and synchronise the media, sound and lighting within the show, all running on a custom show control rack designed, built and programmed by TPF.
“It uses QLab software as the main triggering system and is programmed to create multiple cue lists, all running at the same time,” explains TPF co-founder, Dominic Baker. “A range of timecode, OSC triggering and MIDI is then used across the experience to control our lighting, sound and video servers. This ensures all the rooms are perfectly in sync and can all be running at the same time. Input triggers and buttons are also taken into this system, so that music, sound and lighting can be triggered by buttons, proximity sensors, IR and other servers.”
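QLab’s OSC interface makes this kind of triggering easy to picture: the software listens for OSC messages over UDP on port 53000. The snippet below is a minimal sketch using the python-osc library; the host address and cue numbers are placeholder assumptions, not the show’s actual cue lists.

# Minimal sketch: firing QLab cues over OSC with the python-osc library.
# The address and cue numbers are placeholders, not Lost Origin's cues.
from pythonosc.udp_client import SimpleUDPClient

qlab = SimpleUDPClient("192.168.1.10", 53000)  # QLab machine, default port

# Start a specific cue by number (say, a lighting state)...
qlab.send_message("/cue/12/start", [])
# ...or advance the active cue list, as a GO button or sensor might.
qlab.send_message("/go", [])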
The interactive projections used within the Journal Room were designed and authored as custom standalone applications in Unity, powered by three high-end PCs running Windows. Explains Golembewski: “These PCs received show control signals via OSC, provided live audio feeds to QLab via Dante, and streamed realtime 50fps HD video into a disguise D3 Server via NDI. The audio feeds were upmixed into a 7.1 surround setup, and the video was projected onto the walls using three Panasonic RZ120 projectors.”
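The receiving side of that link is just as simple to illustrate. The production applications were written in C# inside Unity, but the pattern – listen for incoming OSC and dispatch on the message address – looks like this in Python with the same python-osc library. The addresses and port here are assumptions.

# Sketch of an OSC listener of the kind the interactive apps implement.
# Lost Origin's apps were Unity/C#; addresses and port are illustrative.
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

def on_scene_change(address, *args):
    print(f"{address} -> switch to scene {args}")

dispatcher = Dispatcher()
dispatcher.map("/show/scene", on_scene_change)
# Log anything unmapped so stray cues are visible during rehearsal.
dispatcher.set_default_handler(lambda addr, *a: print("unhandled:", addr, a))

server = BlockingOSCUDPServer(("0.0.0.0", 9000), dispatcher)
server.serve_forever()  # blocks; a show app would run this on a thread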
Inside the interactive projections, the Intel RealSense cameras and NuiTrack AI middleware provided the information the team needed to understand where people were located within the room. “Our bespoke gesture analysis software let us understand how the audience was moving, and let us respond to it visually,” he continues. “All of the visual elements projected in the space, and all of the interactive audio elements, were created procedurally, and then mapped to user behaviours using multi-layered interaction controllers.”
A ‘flame interaction’, for instance, in which audience members could fan their arms to build a fire, consisted of 30 uniquely tuned audio interactions and 25 animation-based interactions, layered together and presented simultaneously in a seamless display.
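In code terms, a multi-layered interaction controller of this kind can be pictured as a single normalised intensity value – how vigorously the audience is fanning, say – fading in many response layers, each tuned to its own range. The sketch below is invented purely for illustration (the layer names, ranges and counts are not Factory42’s), but it shows the shape of the technique.

# Hypothetical sketch of a multi-layered interaction controller: one
# normalised input fades in several audio/animation layers, each with
# its own response range. All names and values are illustrative.
class InteractionLayer:
    def __init__(self, name, start, end):
        self.name = name
        self.start, self.end = start, end  # input range this layer responds to

    def level(self, intensity: float) -> float:
        """Map the overall intensity to this layer's 0..1 output level."""
        if intensity <= self.start:
            return 0.0
        if intensity >= self.end:
            return 1.0
        return (intensity - self.start) / (self.end - self.start)

layers = [
    InteractionLayer("embers_audio", 0.0, 0.3),
    InteractionLayer("small_flames", 0.2, 0.6),
    InteractionLayer("roaring_fire", 0.5, 1.0),
]

def apply(intensity: float):
    # Each layer reports how strongly it should currently play.
    for layer in layers:
        print(f"{layer.name}: {layer.level(intensity):.2f}")

apply(0.45)  # partway through building the fire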
Lighting was specific to each room, designed to “enhance the paranormal as well as the more naturalistic landscapes,” says lighting designer, Jess Bernberg. The Journal Room is lit with tungsten light to give off warmth, while most of the other spaces are lit using LED to create a harsher, industrial feel. Lighting also informs the experience: areas light up to show players they have solved a ‘puzzle’ and to tell them where to move or hold their focus.
“Lights are rarely static and help keep the set alive,” adds Bernberg. “In each room there is at least one colour change to signify something happening.”
Animatics to final pixel
The CGI was Sznabel’s responsibility, a task the former Weta Digital artist relished. His CV includes The Hobbit: The Battle of the Five Armies, Dawn of the Planet of the Apes and Avengers: Infinity War.
“Our approach was to make the creatures as realistic as you’d find in a AAA game. The final textures are 2K resolution; seen in the device it’s a bit lower, though you don’t notice it.”
Remarkably, the small team of half a dozen artists completed all the work in-house at Factory42.
Aside from design, blocking and layout, they gave the dinosaurs character. “When a dinosaur looks at you we needed to have it interact with the person – to emote something. Some are scary, others curious.
“For me this was a classical cinematic approach combined with game development, and also a safari in the sense that you want to touch everything but at the same time you can’t. The animal has to behave as if it wants to touch you and at the same time keep its distance.”
The Almeida Theatre advised on the performance element – how players might move around and interact with the set. Actors also worked as guides within the performance space and would help recalibrate an AR headset as needed – staying in character, of course.
Being at the cutting edge also means technology can go wrong, or at least take an unexpected turn. In May 2020, when the team was midway through the project, Magic Leap announced a restructuring away from consumer entertainment to focus on business products. Half of its 2,000 employees were laid off.
“We had had good cooperation with them to that point and they maintained that even while the business was moving more into corporate, medical and industrial applications. It felt like a great device and it still does, but Magic Leap was clearly experimental. It’s why we were using it. In the future we look to work with other XR glasses.”
Opening the doors of perception
The market for XR glasses has until now been divided between Microsoft HoloLens and Magic Leap, but 2022 could see a spate of new wearables from companies including Lenovo and Xiaomi, built on a new XR development platform from chip maker, Qualcomm. Apple is also heavily rumoured to be launching its own XR glasses.
“Personally, I think that this is part of the future of entertainment,” says Sznabel. “I think you can see from the audience reaction that they are experiencing something different from anything that cinema or even VR is capable of.
“From a production point of view, the chief take-away is how this form of experiential activation requires iteration every step of the way. You cannot assume that anything you start out with will be there at the end. The technology is only going to get more powerful and more sophisticated, opening up richer and larger-scale experiences. What we’ve done is open the doors to exploring a new realm of content creation.”