Monday, 21 February 2022

Behind the Best VFX Oscar Nominations

IBC


article here 


Bond, Dune and Spider-Man – the three blockbusters that did most to lure punters back to the cinema – all get nods alongside two Disney films

Five films are vying for the Best Visual Effects Oscar – Dune, Free Guy, No Time to Die, Spider-Man: No Way Home and Shang-Chi and the Legend of the Ten Rings, with the winner set to be announced on 27 March. 

Dune 

The VFX goal for the Warner Bros space opera, briefed to overall visual effects supervisor Paul Lambert, was to try to keep everything as grounded and as photoreal as possible.  

“We weren’t going to have any virtual cameras which could only be done in CG,” he tells BeforesandAfters. “We wanted to embrace all of the natural environments which we were going to visit.” 

This included shooting at Origo Film Studios in Budapest against sand-coloured backdrops rather than blue or green screen, and devising a method of pulling keys from that sand colour.
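Pulling a key from a sand-coloured backdrop is, conceptually, just keying against a different reference colour. The sketch below is purely illustrative – a naive colour-distance matte in NumPy, not the production technique – and the reference colour and threshold are assumptions.

```python
# Illustrative only: a naive colour-distance key against a sand-coloured
# backdrop. The reference colour and threshold are assumed values.
import numpy as np

def sand_key(frame_rgb, backdrop_rgb=(194, 178, 128), threshold=60.0):
    """Return a soft matte: 1.0 = foreground, 0.0 = backdrop."""
    ref = np.array(backdrop_rgb, dtype=np.float32)
    diff = frame_rgb.astype(np.float32) - ref
    distance = np.linalg.norm(diff, axis=-1)          # per-pixel colour distance
    return np.clip(distance / threshold, 0.0, 1.0)    # soften the transition

# Usage (composite keyed foreground over a CG desert plate):
# matte = sand_key(frame)
# comp = matte[..., None] * frame + (1 - matte[..., None]) * desert_plate
```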

He reveals that an opening sequence to the film, set in space above Arrakis, was shot with LED screens, but that it didn’t make the cut. “[DP Greig Fraser] knew that there was no way we could get that harsh, arid, hot, bright environment with an LED screen,” he says. 

Similarly, for the interiors of the ornithopters, they found the highest hill in Budapest and built a gimbal, around which they constructed a 25ft-high sand-coloured 360-degree ramp.

“We called it the ‘dog collar’, the idea being that when it was a bright, sunny day, the sun would bounce off the dog collar into the ornithopter giving us the ideal lighting. Now, the ornithopter was just like a glass bowl. When we had the actors inside the ornithopter and Greig focused on Paul [Timothée Chalamet] and everything was out of focus in the background, it almost looked as if you were over the desert.” 

SFX supervisor Gerd Nefzer came up with the idea of using a massive vibrating plate underneath the sand of Abu Dhabi’s desert to simulate the earth-shaking movement of the sandworms. 

“What we did in post to help extend this 12x12 plate was we shot extensive photography of the effect so we could then replicate it to make it feel like it covered a much larger area,” he says. 

Lambert worked out of lead vendor DNEG, corralling contributions from Rodeo FX, Wylie Co and MPC. Rodeo FX and Weta Workshop worked on concepts for the movie, including the worms, whose mouths were based on a whale’s baleen.

Wylie Co played a large role in creating the spice-infused blue eyes of certain characters, basing individual looks on the colour of each actor’s eyes. This was after rejecting contact lenses (not ideal in a sandy environment) and even using onions. 

“We did a test where we cut up some onions and rubbed them under our eyes to try and get some red going on from the onion which we then tinted blue, but Denis [Villeneuve, director] wasn’t interested in that.” 

Spider-Man: No Way Home 

A triumph in fully integrating a classic superhero into the Marvel Cinematic Universe (MCU), No Way Home was the first film of the Covid-19 era to do pre-pandemic level business, thanks in large part to male moviegoers between the ages of 18 and 34, according to The Hollywood Reporter. It sits sixth on the all-time list of top-grossing films, having clocked over $1.77 billion in revenue and counting. 

Several characters from previous Spider-Man films return to the fray and updating a trio of villains was the main VFX task. 

“The technology from the Sam Raimi and Marc Webb movies is outmoded, but the latest advances and the ability to iterate quickly have led to greater photorealism,” Marvel’s production VFX supervisor Kelly Port tells IndieWire.  

Digital Domain gave the tentacled Doc Ock a complete CG overhaul by creating a fully developed digital character that could hold up in a medium-to-close shot. This was used in combination with live action with Alfred Molina on a platform rig or wires. 

Electro was redesigned to show more of Jamie Foxx’s face and was given a more comic book-accurate look for his electricity. Luma Pictures, Digital Domain and Sony Imageworks worked on different scenes featuring the character. 

Imageworks took a central role writing complex sand simulations of millions of particles for the Sandman. “Finding the look of the character animation was difficult, facially, when he was talking,” says Port. “They’ve made everything look so much more real with light interaction, true bounce and reflections based on the colour of something. It’s integrated in the live-action or CG environment and sits in there more believably than ever before.” 

The climactic battle on the Statue of Liberty also featured the three costumed incarnations of Spider-Man. “We had three Spideys swinging around, so they each have their own characteristic pose, especially when all three land for the face off on the head of the statue. We worked hard to create an iconic image of the three of them landing and hitting their poses backlit by the moon. Swinging around is just great animation.” 

Shang-Chi and the Legend of the Ten Rings 

In the MCU’s first foray into Chinese wuxia fantasy adventure, the standout scene was a fight on a bus careering around San Francisco. This combined location and reference shooting in San Francisco with choreographed stunts filmed against blue screen in Sydney, on buses mounted on motion rigs. The scene was previsualised by The Third Floor and orchestrated by Luma Pictures. 

“Planning the sequence, we had to map out the route that the bus took through the streets of San Francisco, always trying to maintain a downhill motion during the most dramatic parts of the fight,” explains VFX supervisor Chris Townsend to BeforesandAfters. “We managed to get most of what we needed in an eight-camera array set of plates, mounted on a car, driving through the streets.” 

These array plates were fastidiously stitched together to create the world of San Francisco outside the bus. The Sydney blue screen stage had two different complete articulated buses; one was on an air bladder system, for general driving shots, the other on a six-axis gimbal for the more dramatic twists, turns and jumps.  
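As a loose illustration of the plate-stitching idea – nothing like the bespoke show pipeline – OpenCV’s stock stitcher can assemble overlapping stills from a camera array into a single wide background. The file names below are placeholders.

```python
# A minimal sketch of stitching overlapping array plates into one wide
# background using OpenCV's built-in stitcher. Real show work involves
# careful lens solves and manual cleanup; file names are placeholders.
import cv2

plates = [cv2.imread(f"array_cam_{i}.jpg") for i in range(8)]  # eight-camera array
stitcher = cv2.Stitcher_create(cv2.Stitcher_PANORAMA)
status, panorama = stitcher.stitch(plates)

if status == cv2.Stitcher_OK:
    cv2.imwrite("bus_background_panorama.jpg", panorama)
else:
    print(f"Stitching failed with status {status}")
```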

“In terms of getting the physics right, director Destin Daniel Cretton always wanted to have the drama amped up a bit, with the bus lurching downhill. This meant we had to tread that fine line between reality and movie reality. We studied footage of crazy bus stunts and real bus driving plates to get a better sense of what a bus can do… Yes, you can drive a bus on two wheels!”  

Rodeo FX handled the major fight sequence on a skyscraper scaffold and Weta Digital’s 300 shots included the dragons and the power rings themselves. 

No Time to Die 

The key to Daniel Craig-era Bond is the mix of muscular brutality with real pain and sensitivity. This comes to the fore in the final instalment which has registered $774 million at the box office since release last September. 

Director Cary Joji Fukunaga and cinematographer Linus Sandgren avoided green screen as much as possible but still required 1300 VFX shots split between Cinesite, DNEG and ILM. 

The extent of this concept can be felt in the pre-credits sequence filmed in Matera, south Italy, which features 007 back in the classic Aston Martin DB5. 

“It’s a romantic picture postcard location that turns into an action inferno,” says Sandgren. “The whole scene feels violent but intimate and immediate, shot with handheld IMAX cameras [and aerial work]. Everything that goes on in this scene is so intense. You can feel how hard it hurts Bond.” 

When the Aston Martin rotates in the town square, the machine guns that pop out of its headlamps were designed to be shot as in-camera effects by special effects supervisor Chris Corbould (who shares the nomination). 

Framestore’s 400 shots on the film included transforming the opening scenes, shot in Norway in the spring, into crisp, snowy winterscapes, and the set-piece chase featuring Land Rovers, motorbikes and a helicopter. This work involved compositing keyframe-animated CG vehicles into the plates, removing objects like stunt ramps and then replacing terrain. 

A separate team of 40 Framestore artists worked with director Daniel Kleinman solely on the opening titles, Daniel Craig’s last gunbarrel sequence and the Billie Eilish music video for the title track.  

No Time to Die is Bond’s first VFX Oscar nomination since Moonraker. 

Free Guy 

The Truman Show meets Wreck-It Ralph in director Shawn Levy’s comic adventure, in which Guy, a jovial background character in a Grand Theft Auto-style multiplayer game, becomes self-aware and decides to play hero. 

The film blends live-action characters (led by Ryan Reynolds) and sets with a myriad of digital environments and “a neon rainbow of in-game graphics”, says DigitalTrends, which interviewed Digital Domain VFX supervisor Nikos Kalaitzidis about the work. 

The opening scene runs to 3,000 frames and was originally conceived as a single shot from beginning to end, as Badass [Channing Tatum] free-dives into Free City.  

“We threw the kitchen sink at it. We threw in more helicopters, more explosions and a bank truck that smashes into a car and all this money comes out, just to start. And then we just kept on coming up with more ideas, with everyone pitching them to Shawn and his team, and they loved everything.” 

Half the sequence was composed of plates photographed in Boston and Pittsburgh, plus a live-action car chase shot using a custom-built 360-degree camera and a high-speed camera moving at 70mph. 

“You’ve got to be careful when you have actors and stunt people on set – if it made one weird mistake it could chop somebody’s head off,” said Kalaitzidis in a making-of video. 

Digital Domain also used an AI-driven renderer called Charlatan to re-animate the mouth and face performance of a digi-double of Tatum and based the game’s glitching effect on the pixelation from 1980s Atari videogames. 

Scanline VFX had the job of destroying Free City, after first building tonnes of buildings, props and other CG assets. Face replacement was used to create Dude, a heightened version of Guy, which involved working with VFX studio Lola to capture Reynolds’ facial expressions and apply his likeness and movement to the on-set performance of bodybuilder Aaron Reed. 

“Our goal is always to try to make it as photo realistic and roughly apply to gravity, physics, that kind of thing,” says overall VFX supervisor Swen Gillberg. “But the beauty of this story was like anything goes. ‘You know what I think this shot needs? I think this needs a pterodactyl. What do you think?’ And we put a pterodactyl in.” 

Near miss: The Matrix Resurrections 

The VFX centrepiece of director Lana Wachowski’s return to the Matrix was a reimagining of the iconic ‘bullet time’, but this time, instead of having the action play out in the middle of camera arrays, the camera was to move around the action.  

Just to complicate matters they aimed to have Neo (Keanu Reeves) and the Analyst (Neil Patrick Harris) appear to move at different speeds within the same frame. 

“Bullet time in this movie is used primarily against Neo, so it’s power that’s taken away,” explains Dan Glass, the production’s VFX supervisor from DNEG to IndieWire. “We looked initially at shooting underwater because you get this very natural sense of exertion but also slowness of movement but the natural effect of reduced weight on actors’ faces was distracting from the overall performance.” 

The answer was to use a stereoscopic rig, explained DP Daniele Massaccesi to IBC365. “Instead of having each camera in parallax as if to shoot 3D, dual RED cameras were aligned to shoot an identical view with one recording 120fps and the other at 8fps. The footage was then blended in post to create an 11-minute-long scene played back at normal film speed 24fps.” 

There was quite a bit of massaging and trimming before addition of CG hair and retiming for that underwater look. 
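The arithmetic behind the effect is simple enough to sketch: conforming both capture rates to 24fps playback makes the 120fps element read at a fifth of real-time speed and the 8fps element at three times real-time. The snippet below only illustrates that ratio, not the actual conform.

```python
# Illustrative arithmetic only: two cameras see the same action, but
# conforming each stream to 24fps playback yields different apparent speeds.
def apparent_speed(capture_fps, playback_fps=24.0):
    """How fast the action appears on screen relative to real time."""
    return playback_fps / capture_fps

for label, fps in [("120fps element", 120.0), ("8fps element", 8.0)]:
    print(f"{label}: {apparent_speed(fps):.1f}x real-time at 24fps playback")
# 120fps element: 0.2x real-time at 24fps playback
# 8fps element: 3.0x real-time at 24fps playback
```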

The film’s 2350 VFX shots were created by DNEG, Framestore and One of Us with Volucap at Studio Babelsberg. 

Global supply chain worsens outlook for vast majority of broadcast kit suppliers

IBC

article here 

It’s the worst kept secret in the industry, but few manufacturers are prepared to talk about it openly. The supply chain shortage, which has dragged on for over a year, not only shows no sign of abating but has gotten worse. The impact on broadcast technology vendors is near universal with an alarming 63% of them reporting severe problems in sourcing components as of this month.

According to a new industry survey conducted by the IABM – exclusively revealed by IBC365 – the stranglehold on the supply of key product components such as semiconductors is affecting 97% of vendors, with 63% flagging this as severe. That’s up from 40% of vendors in April 2021, when the IABM first polled its members.

An alarming 40% of businesses warn of a severe impact on their financial viability if the issues are not solved in 12 months. Some 86% of companies will experience moderate or severe problems if conditions persist.

“The findings of our first survey on supply chain problems with electronic components last April painted a worrying picture for our industry,” says Peter White, CEO, IABM. “The results of this follow-up survey demonstrate that the position has deteriorated further in the intervening months, and will result in significant financial impact if the problems continue for the next year.”

Lead times have been most affected by the disruption, with companies reporting an average increase of 74.1% over normal levels. Anecdotal evidence suggests prices for certain components have soared by 30-50% in the last year, and this is confirmed by the IABM survey, which pegs inflation at 44%.

Some of this cost is being absorbed, but margins are being cut and average prices of final products are now up by over 25%.

Microprocessors of all types and power devices seem to be the most limited in supply, reports one vendor. “It’s not just chips but motherboards and also a wide-ranging number of other components,” says Lorenzo Zanni, IABM Head of Knowledge, who conducted the survey.

Stockpiling inventory

White says: “There is evidence of component price increases which will either need to be absorbed by media tech suppliers or will feed through into increased end-product prices.”

He adds: “With no early prospect of the component shortage problem being resolved, many media tech companies have responded by redesigning their products to mitigate issues with sourcing specific components, and/or stockpiling components where possible to keep their production lines flowing in the face of dramatically increased lead times.”

For example, Harmonic revealed in an earnings call that its inventory grew to $71.2 million in Q4 2021. The last time the value of its inventory was so high was in 2011 when tech supplies were affected by a tsunami in Japan.

Holding more stock is a means to cushion a company against irregularities in the supply chain but it also means that the company has millions of dollars’ worth of stock on its books rather than funding the business. It also has the knock-on impact of draining more supply from the market.

“It is the bigger companies who have the economies of scale to be able to stockpile,” says Zanni. “It’s similar to the early days of the pandemic when people overstocked on toilet paper at home.”

Patrick Harshman, President and CEO of Harmonic, spoke of “continuing to contend with supply chain challenges” in the firm’s Q4 earnings report on 31 January 2022, “which means that our current outlook for 2022 is supply constrained and burdened by exceptionally high cost… Where possible, we continue to stock up on the inventory at higher-than-normal levels in anticipation of continuing supply chain challenges.”

The firm’s CFO Sanjay Kalra added that margins had been impacted and that it would look to offset this “by price increases which we are working with our customers, at the same time, with the economies of scale which we are seeing because of the growth.”

Evertz, another public company, said in its Q2 2021 report on 31 October that the global supply chain disruption “has caused the company to experience unstable procurement capabilities leading to increased lead times and increased component costs”. To minimise the impact Evertz has increased its spend on raw materials since 30 April 2021 by $8.4 million. The pricing environment continues to also be very competitive, it added, “with substantial discounting by our competition.”

No safety in the cloud

Suppliers and developers of product tagged IP, cloud or software-defined are not immune either. All of these systems rely on hardware. Data centres comprise racks of servers. In fact, 96% of businesses rely on hardware, reports the IABM.

“Cloud is not a strategy that will help the industry out of this,” Zanni said. “Cloud service providers will be affected by this.”

It is an understandably sensitive issue with very few companies prepared to talk about it openly, perhaps fearful of scaring customers and investors with price rises, delays or earnings warnings.

One vendor says: “From the middle of 2021 demand began to rise quite markedly, and global supply of some core components and raw materials became restricted. We have subsequently seen a prolonged rise in the costs and lead times to the point where our own delivery schedules to customers are at risk.”

The vendor adds: “We saw a problem in computer touchscreen displays too, which are used in huge quantities across numerous B2B and consumer industries. Basic raw materials such as metal alloys are also increasing greatly in price.”

The electronic components shortage is just one element of overall industrial inflation. Prices for all television equipment will likely go up, and delivery times will increase.

One tech company says it was forced to change the delivery period for its products from 45 days to 60 days or more and claims some manufacturers cannot guarantee a delivery date at all.

BaM Stock Exchange

Companies can find components on the grey market of distributors and brokers, but they can expect prices to be high. An alternative is the BaM Stock Exchange launched last year to enable IABM member companies around the world to list their excess stock using internationally accepted parts codes and descriptions. IABM member companies who are experiencing shortages can search the BaM Stock Exchange listings to quickly discover if parts they require are available.

The initiative is designed by the industry to sustain itself by offering fair pricing, and yet it has had disappointing take-up.

White says: “It is a value-added service designed to keep short supply components within the industry to the benefit of all. I have been surprised that not many companies have taken advantage of the BaM Stock Exchange so far; perhaps this is the moment for them to do so.”

New market balance required

The first signs of a global electronic components shortage emerged in autumn 2020. These were ad hoc situations when even popular components from major manufacturers, which were previously always available on the shelves, suddenly began to disappear. An example: Nvidia’s GeForce RTX 30 graphics cards “were announced, desired, but it was impossible to buy them,” according to one vendor. The situation became critical in the spring of 2021.

Now, the industry is in a transition period in which new prices must be formed to balance the profitability of the equipment manufacturer with market demand for the number of items produced.

The car market is a representative example. You can buy a car at the recommended price with a delivery period of six to nine months; if you need the car urgently, you can pick it up much sooner by paying a 20-50% premium.

One vendor prepared to go on the record is Slomo.tv, a leading manufacturer of slow-motion replay servers and video-refereeing solutions.

“It definitely did impact us and quite a lot,” says CTO Igor Vitiorets. “We try to adapt to the processes which are taking place. If possible, we increase the inventory of components, which in turn increases the production cost of the systems for us. We do our best and none of our customers has been left without an ordered system.”

He adds: “In our opinion, unfortunately, the crisis will last for a long time, until the new prices for all products in the entire chain are balanced and also until a new consumer behaviour in the television industry is formed.”

Vitiorets elaborates: “Instead of constantly buying new equipment, the reliability of the equipment and careful usage will come to the fore. Equipment should be maintained to ‘live’ long. New products should be released not because a new trade show comes up and it is urgent to release a novelty, but because it actually solves problems or includes new technologies.”

Distributed live takes hold

InBroadcast


 p42 Feb issue here


Adoption of remote production over the past two years may have been born out of necessity, but as we return to a sense of greater normality, many workflows that have been driven by innovation are here to stay.

 

Remote production is growing fast, and more and more events are starting to utilize it. However, traditional workflows are likely to persist because people still need cameras and monitors locally, even if the processing is in the cloud. Productions are also looking to limit their on-site carbon footprint, so the industry is leaning toward a more sustainable hybrid model of half remote, half on-site.

“While remote production has gained serious momentum over the last two years, calling it the new norm would be a bit premature,” says Lawo’s Christian Scheck. “Similarly, SDI and OB trucks are going nowhere anytime soon. Nor do they have to, for almost all remote production setups still involve SDI-based devices which can be controlled in the same way as open-standards IP natives. And new OB trucks are still being built.”

Nonetheless, as one of the companies that pioneered IP-based remote production, Lawo is confident that remote production will one day become the norm.

“An IP network usually involves edge devices that take care of signal ingress and egress (gateways),” he says. “They allow users to mix and match SDI solutions with IP-native devices, the only difference being that SDI signals are converted to IP at one edge, and back at the other. The decision to base operators at the production hub, who then control edge devices stationed on-site, cuts travel costs and allows talented operators to produce more shows.”

Lawo is still proud of its Mix Kitchen and decentralized audio production approach, but its most groundbreaking development so far has been the release of HOME, which is designed to make IP operation both intuitive and secure.

“Many live event productions have requirements to capture the camera feeds into replay and highlight package creation workflows,” explains Alan Repech, Director of Marketing, Telestream. “If cameras do not have wireless connections back to the home studio or broadcast centre, capturing content requires systems on site. However, the editing and processing can occur remotely, even to the point of being able to edit growing files only a few seconds delayed from the action. This means creating highlight packages and other derivative content can be done while the live event is in progress.”

In other words, what is often done today with many creatives and a large technical staff on site is now more practical than ever from offsite locations. Some of the remote processing steps may include creating proxies and archive files, frame rate conversion, and HDR-SDR conversion. All of that can be, and is being, done today. In addition, as test, measurement, monitoring, and synchronization solutions increasingly offer fully functional remote user interfaces, the quality assessment of these workflows becomes increasingly practical.

Telestream offers both a flexible waveform monitor/network analyzer called PRISM and a product called Inspect 2110 designed for monitoring by exception of ST 2110 video and audio streams. These can be integrated into a complete solution. 

“Imagine monitoring all of the most critical contribution and distribution feeds in a live, distributed production environment with the only human effort being to observe a single computer screen,” says Steven Bilow, Product Marketing Manager, Telestream. “Imagine being automatically notified of a PTP timing issue with a distribution stream, being able to explore the issue immediately, and with a single mouse click being able to dive as deeply as you need into every aspect of the ST 2110 streams until you have found and fixed the problem. The result is an environment with lower stress, in the already stressful world of live event and sports production. It is that kind of solution-oriented innovation, that thinking about how to make life easier for users, that we have now; more of which will be continually forthcoming.”
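As a purely hypothetical sketch of what “monitoring by exception” means in practice – this is not Telestream’s API – the logic reduces to only surfacing measurements that drift outside tolerance; the probe and notification hooks below are stand-ins.

```python
# Hypothetical monitoring-by-exception loop: only out-of-tolerance PTP
# offsets reach the operator. get_ptp_offset_ns() and alert() are stand-ins
# for whatever probe and notification hooks a real system exposes.
MAX_PTP_OFFSET_NS = 1_000  # assumed tolerance, in nanoseconds

def check_streams(streams, get_ptp_offset_ns, alert):
    for stream in streams:
        offset = get_ptp_offset_ns(stream)
        if abs(offset) > MAX_PTP_OFFSET_NS:
            # Only the exception is surfaced to the operator's screen.
            alert(f"{stream}: PTP offset {offset} ns exceeds {MAX_PTP_OFFSET_NS} ns")
```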

NDI, the group developing the Network Device Interface protocol and part of Vizrt, is bringing new enhancements that will allow even better usability and integration.

“We are working on refining the latest version NDI 5 to enhance the use of audio, NDI Bridge, switching and routing while simultaneously addressing the need for flexibility in determining bandwidth, quality, and latency to enable limitless content creation for all markets,” says Michael Namatinia, company president.

NDI points to the use that ATP Media, together with AWS and Gravity Media, made of the protocol to complete a large-scale proof-of-concept virtualised live production for the Rolex Paris Masters. The project saw multiple freelance broadcast experts test various AWS-hosted live production solutions. David Sabine at AWS reported that the freelance experts didn’t notice any difference between using cloud-based live production tools and conventional ones.

“To have the time to be able to experiment with combinations of vendors interoperating with each other and to understand the benefits and limitations of the single vendor solutions was invaluable,” said ATP Media CTO, Shane Warden.

Net Insight concurs that cloud-based remote live production and distributed architectures are steadily gaining traction “as a cost-effective alternate to hardware-based on-premise projects,” according to Kenth Andersson, Head of Strategic Alliances.

“Our customers are taking advantage of solutions like our cloud-based media delivery and routing technology Nimbra Edge to use cloud as the means to deliver live feeds to remote distributed locations. They can connect talent sitting at home doing the production on low-resolution feeds.”

Nimbra Edge is a cloud agnostic, multi-cloud and multi-tenant solution that supports the major industry standards such as RIST, Zixi, SRT for ARQ transmission. “In short, our solutions offer openness for technology vendors to integrate their solutions, which is extended as a strong overall solution and media ecosystem for contribution and distribution of media services for customers.”


“We’ve seen the evangelisation of remote production over the past year, and we’ll continue to see media companies break new boundaries with proven cloud-based workflows in 2022,” says Larissa Görner, director of cloud product management, Grass Valley. “Despite traditional workflows such as OBs still having a core role in live content creation, the skyrocketing global demand for content in our market means media organisations are turning to new and pioneering production methods to achieve greater scale and agility.”

GV AMPP, the firm’s cloud-native Agile Media Processing Platform, is in constant evolution. On top of core media production workflows within the platform, recent additions include the AMPP Audio Mixer and AMPP Asset Management solutions.

“GV AMPP is at the heart of our GV Media Universe (GVMU) vision,” says Görner, “a digital ecosystem that allows customers to seamlessly combine on-premise, hybrid and cloud technologies from Grass Valley and verified partners to design live production environments to fit their needs.

“While no company can truly claim to be an ‘end-to-end’ partner, GVMU allows us to be as close to that as possible, providing our customers across the media landscape with seamless access to software and hardware technologies.”

“Remote production is not yet the norm, but it is the direction that most of our live event production customers are heading,” says John Schur, President, Solutions Group, Telos Alliance. “Some of the largest sporting events are being produced almost entirely remote today, but it's taking time for others to change over to new platforms and new workflows.”

The Telos Infinity VIP - Virtual Intercom Platform is a fully virtual and cloud deployable professional intercom system. One of its unique features is the ability to easily integrate with on-prem and remote sites that are using Telos and third-party systems. Infinity VIP is also available as part of the Grass Valley AMPP platform.

French sports broadcaster L’Equipe TV recently launched its OTT platform using a remote production workflow, including a cloud-hosted remote voice-over solution from Broadcasting Center Europe (BCE).

Jérome Aubin, production director at L’Equipe, explains: “For us, it is obvious that sports production must reinvent itself. Voice-over is one part of that. However, we will only move to full remote production when the economic interest is beneficial. To date, however, we cannot really say that this is the case, especially since we mainly produce small sports events.”

Going forward, 2,500 hours of live sport will be commentated using BCE’s remote voice-over solution. “The BCE solution is easy to use. All it takes is an internet connection and a computer. However, to ensure the best sound quality, we decided to add an external sound card. All our commentators have been using the system since the launch of our ‘Live’ platform. Pandemic or not, it was necessary for a channel like ours to find solutions to produce more content while better controlling costs. The remote voice-over is one of the chosen means.”

In addition to sound administration, the cloud remote controller grants access to graphics and titles management. Users can trigger text titles, write the texts during the event or pre-configure the titles in templates. Sign language video feeds can be integrated and, since the solution connects to a webcam or connected camera, users can decide whether to add this view as a picture-in-picture in the live feed.

“In my opinion, remote live TV production is not a replacement for the traditional production workflow, but a good complement and an opportunity to cover events and activities with limited budgets,” says Igor Vitiorets, CTO at Slomo.tv. “For serious events, the traditional OB and SDI-based workflow is more reliable and preferable.”

He makes a pretty convincing argument that, for fairly simple broadcasts with low levels of responsibility and quality requirements, remote production and broadcast automation are suitable at a reasonable cost.

“The situation is different with large-scale and important events, because ‘acceptable’ is not enough. Premier League broadcasts need a large number of cameras with high-magnification zoom lenses, SuperMotion cameras, SpyderCam and cameras on motorized rail systems. This expensive equipment is not permanently installed in the arena and requires on-site set-up.

“They require experienced camera personnel who are able to quickly react to any changes in the game. Any delays in controlling the cameras and receiving the director's commands are simply unacceptable.”

As a rule, he says, video engineers should also be in the arena to adjust camera settings in real time, e.g. Iris settings. 

Therefore, a fairly large number of personnel should be on the site while, in theory, the Remote Production centre can accommodate an audio engineer, broadcast director and replay operators.

“Despite the well-established procedures and ‘standardised’ broadcasts, directors often have to give direct commands to the members of the TV crew. There is a rule of thumb: to comfortably control live processes, the delay should not exceed 300 milliseconds.

“There are also unplanned events that may occur: ‘impossible’ goals, force majeure or conflicts that are of great interest to viewers,” Vitiorets says. “The procedures and algorithm for broadcasting such moments are difficult to formalize and require an instant reaction from the broadcast director.”

It seems hybrid at the venue and decentralized live will be the modus operandi for major events for some time.

While remote video production is still not considered the norm, we’ve seen a huge increase since the beginning of the pandemic.

“Previously considered out of reach, remote production has become an achievable deliverable in a short amount of time,” says EVS’ CTO Alex Redfern. “The pandemic has led to an irreversible change in the way broadcasters and media companies create live content, opening the doors to new work practices. A large percentage of organizations are currently undergoing a transition from SDI sources and systems to IP core infrastructures. This transformation, which is driven by the need for greater agility and scalability, also applies to OB trucks.”

Leveraging cloud processing and machine learning, XtraMotion is EVS’ software application that enables the transformation of footage from any camera angle on a production into high-speed video using frame interpolation. As a result, says Redfern, productions can easily increase their super slow-motion coverage without any extra cost and without the need for additional hardware on site. XtraMotion was first deployed as a Proof of Concept (POC) at Super Bowl LIV, in February 2020, after which FOX Sports decided to make XtraMotion integral to its productions.
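Frame interpolation, in its crudest form, simply manufactures in-between frames from the frames either side. The toy sketch below uses a linear cross-dissolve to make the idea concrete; XtraMotion itself relies on machine learning rather than anything this naive.

```python
# Toy frame interpolation: insert blended in-between frames to simulate
# slow motion. Real systems use ML/optical-flow interpolation; this naive
# cross-dissolve just illustrates the concept of synthesising unshot frames.
import cv2

def interpolate(frames, factor=3):
    """Insert (factor - 1) blended frames between each pair of originals."""
    out = []
    for a, b in zip(frames, frames[1:]):
        out.append(a)
        for i in range(1, factor):
            t = i / factor
            out.append(cv2.addWeighted(a, 1.0 - t, b, t, 0.0))
    out.append(frames[-1])
    return out
```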

“It was at Daytona 500 that XtraMotion truly demonstrated the extent of its storytelling capabilities by allowing viewers to watch super slow-motion replays from the in-car cameras – a first in the history of live sports broadcasting,” says Redfern.

 

 

Impact of 5G

Of all the technologies likely to transform live remote production, it is 5G that holds the most promise. Telestream’s Bilow points to mmWave 5G as having the capacity to provide bandwidth up to about 2 Gbps. This makes it possible to support bandwidth-intensive video up to 4K and potentially 8K. It appears that this bandwidth can even support volumetric video streaming on mobile devices.

He adds, “5G currently performs inconsistently in these applications because, as one moves their devices around, there are frequent handoffs between 5G and much lower performance 4G towers, among other things. Furthermore, 5G is very directional so achieving the theoretical bandwidth is difficult because there are rarely line-of-sight connections. This means that there are challenges to overcome. That said, these challenges are the perfect ones for a company with expertise in streaming media and monitoring to address. They are not insurmountable, and we view mobile and remote production over 5G as a realistic path forward.”
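A quick back-of-the-envelope check shows why roughly 2 Gbps is interesting for contribution. The bitrates below are illustrative assumptions, not vendor specifications.

```python
# Rough headroom check: assumed compressed contribution bitrates against a
# nominal 2 Gbps mmWave 5G link. Figures are illustrative assumptions.
LINK_GBPS = 2.0

assumed_bitrates_mbps = {
    "1080p50 HEVC contribution": 50,
    "UHD/4K p50 HEVC contribution": 120,
    "8K p50 HEVC contribution": 480,
}

for label, mbps in assumed_bitrates_mbps.items():
    share = mbps / (LINK_GBPS * 1000)
    print(f"{label}: {mbps} Mbps, about {share:.0%} of a {LINK_GBPS} Gbps link")
```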

NDI and Vizrt have completed several proofs of concept in 5G. Sky Germany leveraged Vizrt and NDI to deliver real-time 5G for a recent Bundesliga Handball Final. It was the first time a broadcaster made an end-to-end 5G live production with NDI Bridge and Vizrt graphics, analysis, and production tools – all in the cloud.

“5G simplifies and streamlines these benefits significantly with the provision of network slicing and related QoS functionalities,” says Namatinia. “For live production, it brings guaranteed low latency and corresponding bandwidth.

“However, low latency and corresponding bandwidth must not only be ensured in the 5G network; this also plays a role in the further processing of the signals. Corresponding connections to the hyperscalers such as AWS, Azure, or Google must also be ensured.

“Another approach is to use edge computing to reduce the number of hops between the source and the production backend, thus decreasing latency accordingly. Achieving this is something we are currently working on at NDI, to reduce latency, not only for 5G but for other applications using LAN and WAN.”

Net Insight’s Andersson says 5G will unlock smart stadiums by providing an additional means to collect feeds within a venue that is operating a private 5G standalone network.

“It will provide the ability to use more cameras within a stadium such as spot cameras, tracking favourite players or different angles of the field. These feeds can be distributed to production remotely, for example to create a fan zone viewing experience complementary to the main produced broadcast feed.”

Another trend that is starting to have a significant impact on how live events are produced and distributed is the proliferation of cloud-based production platforms. Schur reports that these platforms enable even small venues and specialized events to be produced with professional tools at affordable prices.

“I expect that we'll see many more of these platforms come online, targeting specific types of productions and markets,” he says.

 

 

 

Winter Olympics Trials Virtualised OB Van

Broadcast Bridge

The Olympic movement can always be relied upon to push the boundaries of broadcasting. Most innovations in its history have been incremental, such as the move to color or HD and latterly UHD. Its host broadcast division, Olympic Broadcasting Services (OBS), is arguably in the midst of the most sweeping set of changes ever in transitioning its entire production fabric to IP and cloud, in order to meet the goals of sustainability, flexible production, huge content demands, and new formats and immersive presentation. Broadcast Bridge examines this, including a virtualised OB van project being tested at the Winter Games.

Article here 

Tokyo 2020 saw the full introduction of live coverage in UHD HDR, which was a breakthrough in format delivery for the Games’ live signal. Beijing follows suit.

“Introducing UHD and HDR was much more challenging than when we introduced HD for the Beijing 2008 Olympics,” says OBS Chief Technology Officer Sotiris Salamouris. “UHD isn’t a small increase in data flow. It means a big change in all of the back-end systems and the technical infrastructure that can support such mega large bitrates. In addition, there is the HDR factor, which by itself is extremely complex. In parallel, you also need to keep the system consistent with the SDR HD signal that is still our main content distribution format and the one that most of the broadcasters still use.”

To this end, OBS has created a single HDR to SDR production workflow model that will allow the trucks to generate an HD 1080i output via conversion from the primary UHD HDR signal. Almost all of the content will be produced natively in UHD HDR; however, OBS will also rely on several specialty cameras that at this time can only operate in HD 1080p SDR. The video source of these cameras is up-converted to UHD HDR to be integrated into the main production. A full IP infrastructure has been built to support the transport of the UHD HDR signals for the contribution network.

The OBS Venue Technical Operations (VTO) team has developed a set of LUTs in-house to maximise the quality of all cross-conversions (from/to UHD-HD and HDR-SDR). Because the content is either natively captured in UHD HDR or up-converted to it, then down-converted again, the final HD 1080i signal delivered to broadcasters will offer higher quality across all platforms than a standard HD production would, OBS states.
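For a sense of what such a conversion involves at its simplest, the sketch below decodes HLG-encoded values back to scene-linear light using the BT.2100 inverse OETF and re-encodes them with a naive SDR gamma. OBS’s in-house LUTs also handle the HLG OOTF, tone mapping and gamut conversion, none of which is attempted here.

```python
# Heavily simplified HDR-to-SDR step: BT.2100 HLG inverse OETF to
# scene-linear, then a crude SDR gamma re-encode. Not OBS's actual LUTs;
# tone mapping and gamut conversion are deliberately omitted.
import numpy as np

A, B, C = 0.17883277, 0.28466892, 0.55991073  # BT.2100 HLG constants

def hlg_to_linear(e_prime):
    e_prime = np.asarray(e_prime, dtype=np.float64)
    low = e_prime <= 0.5
    linear = np.empty_like(e_prime)
    linear[low] = (e_prime[low] ** 2) / 3.0                  # E = E'^2 / 3
    linear[~low] = (np.exp((e_prime[~low] - C) / A) + B) / 12.0
    return linear

def naive_sdr_encode(linear, gamma=1 / 2.4):
    # Crude stand-in for an SDR transfer function, with highlights clipped.
    return np.clip(linear, 0.0, 1.0) ** gamma
```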

All broadcasters will receive the International Signal in the host city’s HD standards. For Beijing, the SMPTE 292 standard is used for the production of the 1080i/50 HD-SDI signal. OBS will follow the 50 Hz specification. The UHD production will adhere to the SMPTE 2036-1 standard and follow the 50 Hz specification. The HDR standard will be Hybrid-Log Gamma (HLG). The 5.1.4 audio configuration will be provided for both standards.

This unified new UHD-HD workflow had one unexpected side-effect: it also contributed to improved HD picture quality, thanks to the way the HD signal is derived by down-conversion from the UHD HDR content.

“The picture quality in HD that we managed to achieve for the Tokyo 2020 Games could not have been possible if we had followed the traditional, HD-only, live production workflows from the past,” he says. “Not that our journey did not have its tough moments. Debugging a totally new live production workflow in a new and demanding format like UHD HDR would always have its hurdles.”

Virtualised OB Van Proof Of Concept

Virtualisation will redefine broadcast production requirements, allowing services to be scaled and greatly reducing set-up time. OBS is not alone in this, of course, and in many ways its journey mirrors the wider broadcast universe, not least in having to design, build, plan and replan many times under Covid.

There is a lot of excitement around a virtualised Outside Broadcast van pilot project in Beijing. It is part of a wider aim to explore more flexible and modular production environments with the goal of reducing logistical and operational complexity compared to traditional broadcast infrastructure.

Since January 2021, OBS Advanced Technologies Manager Geert Heirbaut has worked with systems integration teams at Intel to design a forward-looking virtualised OB van that is able to support agile production.

The new live production environment will be tested as a proof of concept at the curling venue during the Olympic Winter Games Beijing 2022. Based on a cloud-hosted, software-based architecture that mirrors the function of a traditional OB van, it will see the venue production crew perform their job from a production gallery in the compound, using COTS solutions that offer a similar user experience to traditional broadcast appliances. An on-premise data centre will replicate the cloud-based architecture platform.

The first stage of this innovative project will give priority to functionality and interoperability, as well as ingesting and processing of the 1080p50 SDR video feeds coming from 18 cameras used for the coverage of one of the ‘sheets’ at curling, alongside the audio feeds. Further, four additional native IP cameras, dedicated to the virtualised OB van project, will be connected to the network stack, eliminating the need for camera control units.
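A rough estimate of the uncompressed bandwidth such a venue stack has to carry, assuming ST 2110-20-style essence flows at 10-bit 4:2:2 (an assumption for illustration, not an OBS figure) and ignoring audio, ancillary data and packet overhead:

```python
# Back-of-the-envelope bandwidth estimate for the camera count described
# above. Assumes 10-bit 4:2:2 video (20 bits/pixel); illustrative only.
def uncompressed_gbps(width, height, fps, bits_per_pixel=20):
    return width * height * fps * bits_per_pixel / 1e9

per_camera = uncompressed_gbps(1920, 1080, 50)   # ~2.07 Gbps per 1080p50 feed
total = per_camera * (18 + 4)                    # 18 coverage + 4 native IP cameras
print(f"{per_camera:.2f} Gbps per feed, ~{total:.0f} Gbps aggregate")
```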

“Flexibility and scalability, and especially the potential to maximise our production workflows, are behind OBS’s cloud strategy and virtualisation efforts,” says John Pearce, Director of Venue Technical Operations. “Virtualised production set-ups will make our workflows a lot simpler, allowing us to deliver a very high-quality production with greater agility, reduced footprint and cost efficiency. Making some OB van functions independent of physical infrastructure and moving them into the cloud will also help reduce our set-up time and broadcast footprint.”

Teaming up with Intel and Alibaba, OBS will also explore more flexible and modular production environments by designing a revolutionary virtualised Outside Broadcast van. With the full adoption of an IP-enabled infrastructure, certain functions of the in-venue production units can be moved away from the legacy broadcast components into using virtualised COTS servers and networking, opening up new opportunities that could lay the groundwork for producing the Olympic Games in an entirely new way in the near future.

“It is not possible to use the same system for each sport with the traditional technological stack,” says Salamouris. “With new technology, it is possible. We can use the same servers and systems in a standardised rack arrangement. We can use virtualisation technologies, along with ICT tools to allow us to configure the system for different sports. The broadcast equipment, such as the vision mixer, audio console, and replay server are software that is placed in standardised hardware, which are then configured to the relevant sport. As such, it optimises planning and preparation for future Games.

“Our idea is that in the near future, we will be able to replace the use of typical OB vans or bespoke flight pack systems, with a standardised ICT architecture of servers and IP switchers, where all standard broadcast applications for live production will be only software functions. This will offer us tremendous flexibility in order to address the major problem of having to deal with so many different systems, like individual production units with each one of them based on a different combination of broadcast boxes which themselves require specialised configurations and overall handling. We are ready for this set-up. We have tested the system on numerous occasions and are now excited about using the virtual OB van in a live Olympic environment.”

Cloud And Sustainability

OBS’ entire broadcast workflow has transitioned to IP with more and more content distribution and post-production workflows supported in the cloud. From Beijing, OBS will distribute feeds in HD and UHD through the cloud for the first time to more than 20 broadcasters.

Not only can RHBs receive all the live content produced during the Games over IP, they can now also send back their live interviews with the athletes from the mixed zone directly to their home headquarters, with ultra-low latency and in UHD.

Cloud technology allows broadcasters to address media management workflows from processing to editing to distribution operations in a better, easier, and faster way. If most broadcast organisations were still in the early stages of deployment and integration of cloud-based systems in the beginning of 2020, the pandemic has clearly pushed forward the adoption of such solutions. Most organisations have been forced to carry out production and distribution workflows from home and, during the crisis, rely on cloud services to support their newly remote production.

With its own Cloud platform, OBS says it can accommodate tailored, fully-fledged cloud-based front and back-end solutions for broadcasters to help them more easily set up all or part of their processes in the cloud. For broadcasters, this is a dramatic inflection point in the cost structure of their on-site production as they reduce up-front investments. Also, they can significantly keep their set-up time to the minimum and have their equipment all prepared for their Olympic coverage before even setting foot in the host city.

Impact On Footprint

Rights holding broadcasters (RHBs) will always need a base in the host city, so no matter how much more efficient and remote workflows become, the IBC will not shrink massively in size. Quite often the issues affecting RHBs in the host city are the same as those in their home country, notably the number of extra personnel required. So, even for the domestic broadcaster of the host country, in this case China Media Group, there is always the need for a big presence in the IBC, because it is not possible to find the extra space in their HQ for their Games-time staff.

“If the IBC was only needed to address the problem of distance from the host country, then the domestic RHB would not be needing space there; however, Games after Games, we have seen that the domestic broadcasters continue occupying one of the largest broadcast areas inside the IBC. It is not like 15 years ago when broadcasters had to be in the IBC to have access to all the content. However, there will always be the need for a large broadcast operations centre in the host city, as well as smaller hubs in remote venues like here at Beijing 2022.”

Elements of an IBC may be different in the future, but the IBC will still be a hugely important component in the broadcast of future Games. By moving all broadcast workflows into the cloud, it can reduce its footprint further.

“Virtualisation is the starting step when moving into the cloud and is all about how one can take advantage of a particular piece of hardware in the most efficient manner,” says Salamouris. “Thanks to virtualisation, hardware is now utilised to levels up to or more than 90 percent, a level that was unthinkable with previous approaches that relied only on physical servers for running the applications software.

“The cloud is the next step as it holds the highest possible density in terms of virtualised assets, together with a highly sophisticated and rich set of tools that allow the just-in-time use and management of this tremendous concentration of computational and networking capacity. There is not a more efficient manner to run applications of any sort.

“For example, in the past, to use a vision mixer in a conventional OB van would have required a huge frame consuming a large amount of power, even if we were to use a much smaller portion of its features and overall capacity. A virtual, purely software-based vision mixer running on the cloud will be a significantly more efficient, and hence sustainable, solution as it can be easily configured and its capacities scaled up or down depending on the real needs. We can expand this notion to almost all the technology systems that are required to support the broadcast of the Games.

“Moving to the cloud allows a rather inefficient consumption of technical resources – mostly power, but also HVAC, built enclosed areas and so on – to be exchanged for a far more efficient one within the cloud. We will always build our technology in such a way that it offers efficiency, sustainability, and reduces our footprint, while at the same time offering the broadcasters the chance to do the same.”

Instead of highly expensive dedicated international telecommunication optical circuits, OBS is delivering all the live multilateral content, in contribution quality and in extra-high availability over public cloud.

“Just a few years ago this would have seemed simply impossible,” says Salamouris. “The most extraordinary aspect of this service is that it can already meet the transmission qualities often associated with satellite distribution, in terms of latency and resilience, while it is already able to outperform it when it comes to expandability, flexibility and, consequently, cost.

“What really surprised us was the fact that we can use live cloud not only to transmit our multilateral signals in HD, but in UHD as well, with the same resilience levels.”

 


Saturday, 19 February 2022

Camera Technicians Apprentice Scheme Builds on Successful First Year

copy written for VMI

The new Apprenticeship Standard for Camera Prep Technicians is looking to expand after a successful first season launch.

article here

Last summer, nine trainees became the first cohort of apprentices to participate in a two-year programme during which they will be trained to Senior Technician standard.

They will emerge in August 2023 with a formal Level 3 NVQ qualification recognised across the industry. Most are guaranteed a job at the camera rental company that currently employs them on successful completion of the scheme.

The next intake, due to start this August, is expected to recruit at least 20 applicants into the camera rental industry, helping to ease the industry-wide skills shortage.

“Every apprentice has had glowing reviews from their employers,” reports Mik Nelson, Assistant Principal at the training and facilities provider London Screen Academy (LSA).

VMI, PixiPixel, Movietech, CVP, Cineark, Pro Motion Hire and Brownian Motion are currently signed up as employers for 2022.

“We are hoping to engage many more companies to build on the successes of the current cohort for the 2022 entry,” Nelson says.

Why the scheme is needed

The project was instigated in 2019 when Barry Bassett, Managing Director of VMI, discovered that he was not alone among camera rental companies in finding it increasingly difficult to recruit and retain camera technicians.

“The prevailing entry route to the TV industry sees university graduates joining as interns and then receiving ad-hoc on-the-job training,” Bassett explains. “This process often sees new candidates leaving their jobs once they are trained, which is unsatisfactory for everyone.

“The idea was to see if we could offer an alternative entry path by setting up formal vocational training that would result in a recognised qualification and a full-time job.”

ScreenSkills and the Institute of Apprenticeships agreed to investigate further and worked with training programme developer SkillSet to design and approve the new standard. Several rental companies also collaborated to design and formalise the standard, which was approved in 2020. However, progress was hit by Covid when the planned 2020 launch was postponed and student numbers on the 2021 course had to be reduced.

Nonetheless, the venture has the support of other organisations including ASPEC (Studio & Production Equipment Companies), which represents a number of UK rental companies, the GTC (Guild of TV Cameramen), the GBCT (Guild of British Camera Technicians) and the Park Royal Business Group (PRBG).

Camera rental companies are going to be key to the scheme’s success and many have joined in sponsoring apprentices. Aside from those previously listed, other companies in the sector have signalled their backing and include Luna Remote, Alan Wells, Shoot Blue and DV Talent. 

Sponsoring companies for 2022 will commit to taking students in March, and a selection event planned for this spring will filter applicants. Interest in the scheme is high: the inaugural year attracted over 200 applicants.

“What is key to employers is that a low first year salary plus a £2K Government grant helps to offset the training cost,” explains Bassett. VMI has three apprentices working at its London and Bristol sites as part of the scheme.  “The £11K training course is 95% paid by the Government or from the Levy fund of levy-paying companies.  Either way, the employers pay very little towards the apprentice training but have everything to gain.”

The sponsoring companies play an active role in the scheme and are planning further collaboration in order to add value to the training process. Initiatives such as group training on specialist equipment and structured job-shares will widen the experience of the trainees. Following feedback from other rental companies, LSA is investigating reducing the training period from two years to 18 months, to increase the speed at which fully-trained technicians enter the industry.

Course design and education

The camera technician’s role, as formalised in the programme, is to prepare camera equipment that is complete, works effectively, is correctly maintained and is appropriately configured and accessorised to be suitable for a given production.

A great deal of thought had to go into designing the best method to test technical knowledge (multiple-choice questions), camera prep-tech skills (observation) and troubleshooting, specialist knowledge and approach (discussion), as well as grade boundaries and definitions.

“The core objective of the role is to ensure that customers are provided with the equipment and support they require, at the time and place they require it, so they are able to make full use of the equipment package,” Nelson explains. “Their knowledge and skills can equally be applied to whatever means and methods are used in the workplace to prepare related equipment ready for use.”

Core duties include (but are not limited to) resource planning and allocation for their own work; equipment preparation to meet specification and deadline; and routine maintenance to ensure working order and the cleanliness of accessories. Lenses not only need to be blemish-free but must also be delivered with accurate scales. Kit needs to be quality assessed and tested.

“Client liaison is a key part of the role as is learning from colleagues so that apprentices are up to date with developments and component compatibility,” says Nelson.

Trainees also learn about booking kit in and out; the importance of keeping accurate equipment lists and records and the return of equipment to suitable specification after use.

A talent pipeline for the industry

Not only does this formal training provide a greater incentive for apprentices to remain in employment with their company for longer, it will also be of tangible benefit to the industry as a whole.

“Training them straight from school whilst they are in full-time paid employment ought to ensure a steady stream of trained technicians at the end of the process,” Bassett says.

Most importantly, at the end of the formal training period, successful apprentices can expect full employment on a good salary. VMI, recognised as the UK’s first certified living wage camera rental employer, pays newly qualified apprentices a minimum starting salary of £22,500, which will rise over time.

“Young talent are more likely to stay in post before potentially moving out into the freelance world or another industry role. Moreover, by delivering such a practical grounding we can hopefully ensure a higher quality of crew for the future of UK TV and film.”

 


Thursday, 17 February 2022

If We’re Already Living in a Simulation, What Does That Really Change?

NAB News

“These creatures you call mice, you see, they are not quite as they appear. They are merely the protrusion into our dimension of vastly hyperintelligent pan-dimensional beings.”

In Douglas Adams’ classic cosmic comedy The Hitchhiker’s Guide to the Galaxy, extra-terrestrial beings who manifest to us as mice are in fact the creators of a massive supercomputer that is running a simulation of Earth and everyone in it.

article here

Even if we found this out to be true, would it actually alter any of our actions? David Chalmers, a philosopher who has ruminated on the nature of reality as a computer simulation, thinks probably not.

“I wouldn’t say it’s ‘likely’ we’re in a simulation, I’d just say that we might be and that we can’t rule it out,” Chalmers told Philosophy Now. “I speculate that there is at least a 25% chance. Maybe more important is the idea that virtual reality is genuine reality: that it is just as real as ordinary physical reality. VR will become an increasingly central part of our lives, and I think life in VR can be perfectly meaningful.”

Chalmers’ latest book, Reality+: Virtual Worlds and the Problems of Philosophy, has generated a lot of interest and he has been doing lots of interviews in support of it.

His argument — that we can live genuine, fulfilling lives in a virtual realm — has been savagely deconstructed by armchair philosopher Michael Sacasas here at NAB Amplify, not least for the scant regard Chalmers seems to pay to “real world” problems like climate change, and for what Sacasas feels is the unlikely scenario in which we would actually want to live most of our time in VR.

Reading Chalmers’ own explanation of his thinking, however, he doesn’t come across as someone who’s drunk on the electric Kool-Aid, just someone who has watched and read an awful lot of sci-fi.

Science Fiction Philosophy

His acknowledged inspiration comes from sci-fi as much as metaphysics, which makes his writing a lot of fun. The Matrix receives more attention than the works of Kant. Sci-fi classics like Snow Crash and Ready Player One stand shoulder to shoulder with Descartes’ Meditations on First Philosophy and Putnam’s Reason, Truth and History.

“The Wachowskis do a beautiful job in The Matrix of illustrating so many deep philosophical ideas,” Chalmers told Ars Technica. “I think almost every science fiction writer is a kind of philosopher because what is a science fiction story but a kind of thought experiment? What if there were machines as intelligent as humans, or what if we were living in a simulation? We think about these scenarios, and we reason about what follows. When done well, that can bring out something important about the nature of the mind or the nature of reality.”

As more and more of our lives play out in the virtual space, or the metaverse, this is becoming more than an abstract debate. Philosophy students at universities today are being asked to ponder questions like “You might be in the Matrix right now,” or “You might be in a simulation right now.”

This is a reframing of a puzzle that goes back to thinkers like Plato and Descartes. If you follow Descartes, you can’t trust anything you experience through your senses as real; the only thing you can trust is your mind. Follow that through, and Chalmers is arguing that virtual realities are genuine and invested with just as much meaning as anything that happens in the physical world.

“Some people will say if we’re in a perfect simulation, then our lives are just illusions — deceptions — and this is a terrible thing,” Chalmers says. “My view is that if we’re in a perfect simulation, it’s not an illusion, it needn’t be a deception, the world around us is perfectly real, and our lives can be just about as meaningful as before. Even if we discovered we were in a simulation, that would be shocking at first, but I suspect after a little while, life would go on.”

Reality or Realities?

“But then there’s the question what does it mean to be real?” Chalmers poses to Philosophy Now. “Joe Biden is real. Santa Claus is a fiction. So what is the difference between being real and not being real? One key difference is: something is real if it has causal powers, if it can make a difference in the world.”

He makes the case that, in principle, if we’re in a simulation-like world, then the digital objects there make a difference. “When I have an experience of, say, a tree in front of me and I’m in a digital world, then that needn’t be an illusion,” he elaborates at Ars Technica. “It’s just that the tree is digital. It’s made of bits. That’s kind of like being made of atoms. We don’t say the tree doesn’t exist just because it’s made of atoms. So, likewise, I think we shouldn’t say the tree doesn’t exist, or the tree isn’t real, just because it’s made of bits. So in principle, a virtual world can be just as real as a physical world.”

Part of the theme of his book is that reality might be made up of many different realities, both physical and virtual. In that case, he explains to Philosophy Now, “reality is more like an interconnected space of goings-on that are all interacting with each other.”

He continues, “We all acknowledge there’s a physical reality and then there are these virtual realities, which are created within the physical reality and to some extent depend on it. We say things like ‘in real life’ all the time in order to draw a distinction between physical reality and the virtual world.

“Maybe we’ve got to rethink the relationship between the mind and reality so that simulations are more real than we might have thought. Some people think, by its very nature, the metaverse can only ever be escapism or illusion, not something on par with ‘real’ life. But if I’m right that virtual reality is a genuine reality, then you can, at least in principle, lead a meaningful life in a virtual world. I think this really matters.”

Thinking this way could be generational. He speculates that older people are much more inclined to count digital worlds as second-class and not fully real, whereas people born in the last 20 years or so are basically digital natives who are used to hanging out in digital realities. “From their perspective, virtual worlds are part of reality and treated that way,” he says.

VR Has Both Utopian and Dystopian Potential

Although virtual worlds and the nascent metaverse are likely to be dominated by corporations in the near future, Chalmers holds out hope that they won’t hold sway in the long term.

 “When people are living half their lives in a virtual world it’s hard to imagine that they’re going to give over control of that to corporations,” he says. “I’m optimistic that they might come to new forms of government and regulation, and not simply be run by corporations. If [the metaverse] does end up being run by corporations that has major dystopia potential. Whoever owns these virtual worlds are going to be basically like the gods. They’ll be omniscient and omnipotent with respect to those worlds, and we don’t really want to put that power in the hands of corporations. But in the long term I think it’s going to develop in ways we can’t even imagine.”

Which pill, red or blue, gives you the outcome you want?

Virtual and Physical Blending is Our Future

Chalmers seems to think it’s inevitable that we’ll reach a point where the virtual world is practically indistinguishable from the physical world, but that this will happen beyond our lifetimes.

For the next two to three decades VR won’t even be that good. “Maybe in 20 or 30 years, we’ll get to really high-quality VR, probably not yet indistinguishable, but at least where things like vision and hearing and so on are concerned,” he says.

“The real challenge is embodiment and having an experience of your body, the sense of touch, the sense of moving your body, the sensations you get from eating and drinking or sex. That’s a much bigger challenge, and that probably is going to require more than just standard virtual reality or augmented reality, maybe something like brain-computer interfaces.

“Once we reach a point where computer processes directly communicate with the areas of the brain associated with the body and with pleasure and so on, you can imagine long-term technologies where that is used to give you a much more realistic sense of living in VR. But I suspect really good brain-computer interfaces like that are probably close to a century away.”

Can Consciousness Be Uploaded?

The simulation hypothesis leads to different ways of thinking about life after death. Science fiction features like Transcendence and the BBC drama Years and Years have explored the idea of “uploading” human consciousness to the internet, where our personalities would live on indefinitely in some sort of collective AI.

“Maybe if we’re all bits of code inside the simulation, then there’s a possibility that upon physical death inside the simulation, that code could be lifted up by the simulators and moved to some other virtual world or some other portion of the simulation,” Chalmers posits. “Who’s to say that couldn’t qualify as some kind of life after death?”

The concept of “uploading” can be directly compared to religious ideas about the afterlife, and Chalmers appears conflicted as to whether he believes this concept or not.

On the one hand, he says: “When I die, I will cease to exist. My conscious self will go out of existence,” and he also says he doesn’t believe in a soul that is separable from the physical brain and body. However, he has also stated that, in thinking about simulation, he is now more open “to the idea that perhaps we could have some existence that goes beyond the mirror existence of this physical body, although it may still be tethered to something quasi-physical in the next universe up.”

And if we are in a sim, it raises the question of who created and controls the sim. It’s another classic ontological puzzler.

“For all we know, the simulator is just a teenaged hacker in the next universe up,” he says, adding “there’s no reason to think we should worship this simulator or build any kind of ethical system around them.”

Free Guy and the Consciousness of Sims

If you thought Ryan Reynolds’ movie Free Guy was a lot of fun but a load of nonsense, think again. What if non-playable characters (NPCs) actually did attain human-level consciousness? What, then, would be the ethics of killing them off?

Chalmers has processed the idea. A central tenet of his work has been arguing that machines can be conscious. Artificial minds are potentially genuine minds, he states, and that’s quite important for thinking about simulated realities.

“Very few people would argue that the NPCs we have right now have moral status on par with humans,” he tells Ars Technica. “If someone shoots an NPC in a video game, we don’t feel they’ve done something morally wrong. On the other hand, if we fast-forward into the future, once artificial intelligence has been developed and we get to human-level AI, then it’s possible there could be NPCs inside a virtual world that are as complex and sophisticated as we are.”

The question then becomes, “do they deserve moral status?” For Chalmers, that question very much turns on a question about consciousness. Human beings are conscious. We have subjective experience. We experience everything that happens to us from the first-person point of view, whether it’s seeing or hearing or feeling pain or pleasure. Will these AI systems or complex NPCs be conscious? Will they have real, first-person subjective experience? Chalmers thinks they would.

“My own view is there’s nothing special about biology. Both silicon and biology, in principle, can support the same kind of consciousness, despite the difference in substrate.”

He even calls Free Guy “an enlightened movie” and quotes from it.

“There’s this great conversation between Guy and his best friend. Guy asks, ‘Does that mean none of this is real?’ And his friend says, ‘Come on, I’m sitting here with my best friend, who’s going through a tough time, trying to help him through all this. If that’s not real, I don’t know what is.’ And in the end, they start what amounts to a civil rights movement for NPCs in this video game.”

How Will We Know When We Are in Virtual Reality?

Today when we put on a headset we know the objects around us are not physical. Virtual objects and physical objects behave in very different ways — at least for now.

“Your hand will go straight through a Pokémon Go creature, but it won’t go through a real animal. The AR objects tend to be a bit cartoonish, but I think it’s going to be important going forward, when we’re using augmented reality, that we have a clear sense of what’s physical and what’s virtual,” Chalmers says.

 “Maybe we can’t know what the world is made of, whether it’s ultimately virtual/silicon or physical/biological,” he adds. “But we can at least know the structure of things. It turns out that’s good enough for many purposes. Maybe I can’t know for sure whether I’m in a simulation, but there is this structural core of reality that we can know about. The more you think about virtual reality and augmented reality and physical reality, at least for me, the more that rings true.”