Wednesday, 4 March 2020

COVID-19: All Eyes on NAB

StreamingMedia
As of today, the NAB Show in Las Vegas is still a go for April 18-22, but as more events across the globe pull the plug, some exhibitors and attendees are beginning to worry that the venerable broadcast and OTT event might be next.
Among the latest international trade shows to cancel or postpone is Prolight + Sound 2020 in Frankfurt, originally set for the end of March and now rescheduled for the end of May. Even that may be risky, given information circulating that the height of the outbreak in Europe could hit in June and July. The television programming conference MIPTV, to be held in Cannes, France March 30-April 2, canceled today. Facebook, Google, Microsoft, Adobe, and others have cancelled major conferences coming up in the next few months, in some cases replacing them with online events. (BuzzFeed is keeping a running list of cancellations.)

Cancellations for Health and Business Reasons

On Tuesday, California-headquartered AJA Video Systems became the first high-profile exhibitor to announce it will not attend NAB, citing coronavirus risks. NAB has 1,600 exhibitors booked.
"While AJA made the difficult decision to withdraw from NAB 2020 this month due to Coronavirus risks, we remain fully committed to transitioning all of our planned NAB announcements to web-based video," explains director of product marketing Bryce Button. "It is not a decision we took lightly, and one that we made out of an abundance of caution for the health and safety of our employees and partners worldwide."
AJA may be the biggest company to pull out of NAB, but they're not the only one. 
"We decided to cancel our visits to NAB and other shows in the next months," says Oliver Lietz, CEO of German-based streaming company nanocosmos. "Our idea is to get more engaged into online presentations and meetings. We have both concerns about public health but also see much smaller audiences and visitors joining, which reduces the value of joining the show as exhibitors. Many shows an Europe already have been cancelled. We see a good chance for the organizers to pick up the challenge and move attention to online gatherings." 
Ensemble Designs, which makes signal processors and other video equipment, has decided not to attend the event. "We've been thinking about this for some time, and we're very concerned about the [virus's] impact on the show," said David Wood, president and chief design engineer, adding that it was a business decision, not one based on personal health concerns. "Attendance will be so uneven because of prohibitions on travel," he said, adding that Ensemble was planning to send five or six people to NAB.
One major exhibitor that attends both NAB and International Wireless Communications Expo (IWCE, scheduled for Las Vegas April 1-3) has pulled out of IWCE. An industry source with knowledge of the company’s decision-making said the company will make a decision about NAB by March 15.

Watching and Waiting

Fellow exhibitors are keeping watch, and many companies we spoke to declined to go on the record. Others, like Akamai, are planning to attend NAB even as they are scaling back other events and monitoring the situation closely.
"Akamai's top priority is the health, safety, and wellbeing of our employees, customers and partners," the company said in a statement. "Given the uncertainty of the COVID-19 virus situation, we have canceled most events globally that are scheduled for March and April and have communicated that news directly to all confirmed participants. 
"We're carefully monitoring what is an extremely fluid situation and will continue to work closely with event organizers, health organizations, and our staff to inform our decision making as it relates our participation in NAB and other third-party events as they approach."
Limelight Networks is still planning to exhibit at NAB, according to a company spokesperson. One publicist who represents more than a dozen exhibitors says all of their clients are still planning to attend, adding that it’s "business as usual, unless NAB itself does/says something."
Imagine Communications VP of global marketing Jennifer Overbaugh's response is consistent with others we've spoken to.
"As of now, Imagine Communications is moving ahead with all plans for NAB, and we look forward to joining our peers once again for this important event. We are closely monitoring international developments with the health and safety of Imagine's employees being our top priority, and will provide an update should circumstances change."
David Cohen, vice president of marketing communications at Grass Valley says, "We will be hugely disappointed if the show is cancelled but realize that preventing the spread of COVID-19 is critical and that the safety of our staff and our industry is of the utmost importance. We are putting in place a contingency plan for if [the] NAB Show is cancelled to make sure we can still share all our new solutions and developments with our customers."

The Show Must Go On...

NAB itself issued a statement on March 3 (updated today), saying the show is proceeding as planned. "We understand there are exhibitors and participants that may decide not to attend this year's show and respect everyone's desire to do what they believe is best. As of today, we are hearing that the overwhelming majority of our exhibitors are looking forward to attending the show and our attendee registrations continue on pace with the normal patterns we see year-over-year."

IBC, the European counterpart to NAB that takes place in September, has also issued business-continuity statements with reassuring notices about onsite health precautions.
"IBC remains entirely confident that IBC2020 will take place as planned. There have been no cancellations from any IBC2020 exhibitors or speakers."
However, if the IOC postpones or cancels the Tokyo Olympic Games, a decision it is under increasing time pressure to make, any event of lesser scale (which is to say all of them) will find it hard to go ahead.
Although the audio-visual industry trade show Integrated Systems Europe went ahead in February, attendance was down by 30,000 from 2019's peak of 81,000. In part this was due to storms disrupting travel that week.
Mobile World Congress organizer GSMA was forced to cancel its massive Barcelona event when the dominoes of exhibitor withdrawals, from Ericsson to AWS, made it inevitable. Unlike MWC and ISE, NAB draws a largely American audience, so while the number of visitors travelling from overseas may tumble, it may still have enough domestic support.

...But What if It Doesn't?

Even the viral outbreak has a silver lining. Vendors of remote collaboration tools, virtual meeting systems, and browser-based edit software are reporting a boom in demand as everything from education to conferences and corporate meetings to creative video production itself is forced into remote distributed workflows.  
"Having been champions of remote and decentralized workflows for a while, for those needing to adapt their workflow practices in the light of the current COVID-19 outbreak, we have the tools ready to go right now – from acquisition over production to distribution, as well as disaster recovery,  all can be handled remotely and / or in the cloud," says Cinegy owner Daniella Weigner. "We are looking forward to showcasing these at NAB. Our sales team is ready to help with any remote demos now. In case the show is canceled or we are advised/not allowed to travel, our pre-and post-NAB webinars will be adapted accordingly and we plan to meet our partners and customers virtually. We hope everyone stays safe."
Nanocosmos' Lietz says he hasn't seen NAB take advantage of online meetings previously, but that they offer the potential for NAB attendees and exhibitors to connect even if they aren't at the show.
Most companies seem to be taking a watch-and-wait approach, but are working under the assumption that NAB will happen.
"PlayBox Neo is following the NAB's coronavirus updates and will make a final determination based on input from the WHO, along with our management team, employees and customers," said Pavlin Rahnev, CEO. "It's the biggest U.S. show for our company, and we are eager to support it but can't make a definitive statement as yet on our participation until we're closer to the show and have more information on the overall health crisis at hand."  
"Never.no are planning to attend NAB with the safety and wellbeing of our staff, partners and customers firmly in mind," says Scott Davies, CEO. "The team will continue to monitor the situation and follow the advice from the organisers and authorities."
SSIMWAVE says, "We're monitoring the situation with the safety of our team members foremost in our minds, of course, but our expectation at the moment is that we will be at NAB 2020 next month."
Not everyone is so sanguine, though.
"I do think NAB should reschedule, but if not, we’ll be there and will reach out to our customers and colleagues virtually as well," Sam Cercone,  managing partner at Brightline, which makes energy-efficient lighting systems.
Others are more blunt:
"Holding an international broadcast show at this time is bananas," says Owen Tyler, operations director at UK postproduction facility Evolutions.

Behind the scenes: The Mandalorian’s groundbreaking virtual production


IBC
Star Wars feature films are shot on massive sets occupying large soundstages at Pinewood in the UK, supplemented with exotic location work in Tunisia or Ireland and copious VFX from the likes of Industrial Light & Magic (ILM).
But for the TV spin-off, Disney+ used a groundbreaking virtual production methodology that radically shrunk the footprint, eliminated location shoots and essentially performed real-time, in-camera compositing on set.
The Mandalorian was shot on an LA stage surrounded by massive LED walls displaying dynamic digital sets. The integrated suite of technologies included a motion tracking system from Profile Studios and Epic Games’ Unreal Engine.
While all these elements have been used in various combinations before, deploying them at the fast pace of episodic television had never been attempted.
For example, the art departments on The Meg, Le Mans 66 (Ford v Ferrari), Ad Astra, and Rocketman all used Unreal Engine for previsualisation. The Lion King’s creative team used the Unity engine for several processes including location scouting. Joker and Murder on the Orient Express included scenes lit by video walls.
“This project demonstrates the most multi-faceted use of our technology yet,” says David Morin who heads up Epic’s LA Lab.
As pioneered by The Lion King director and The Mandalorian showrunner Jon Favreau, multiple people including the art designers, production designer Andrew Jones, visual effects supervisor Richard Bluff and cinematographer Greig Fraser ASC ACS were able to collaborate in VR prior to principal photography in order to choose locations, block and light scenes.
Natural workflow

“The defining characteristic of this workflow is that most of the creative work is done in pre-production instead of post,” explains Morin.
“If you want a photoreal virtual world on set you have to build it before going on set. That shift requires adapting the VFX workflow so that you can use games engine and virtual reality tools to decide ahead of time where to put the camera and, if there are multiple directors shooting in the same world over a series, they need to agree on continuity.”
The Mandalorian’s executive producer and director was Dave Filoni, working with episode directors including Bryce Dallas Howard and Taika Waititi.
The location backdrops were drafted by visual-effects artists as 3D models in Maya, onto which photographic scans were mapped. Photogrammetry teams headed to Utah and Iceland among other locales to shoot these plates, which wound up comprising about 40% of the show’s final backdrops. The rest were created as full CG by ILM and on the studio backlot.

“The good news in this transition is that it brings workflow back to something akin to traditional filmmaking,” Morin says. “You have to build sets before you can shoot them in live action and now that’s the same in our VFX workflows. Virtual environments have to be built before the deadline for principal photography. This workflow is very natural for filmmakers.”
Actors in The Mandalorian performed inside a 20-foot-high, 270-degree semicircular LED video wall, topped with an LED ceiling, enclosing a performance space 75 feet in diameter, called the Volume.
This is where practical set pieces were combined with digital extensions on the screens. The technology’s real innovation is that when the camera is moved inside the space, the filmmakers have the ability to react to and manipulate the digital content in real time.
Visuals in parallax

“When the camera pans along with a character, the perspective of the virtual environment (parallax) moves along with it, recreating what it would be like if a camera were moving in that physical space,” Morin explains.
By the time shooting began, Unreal Engine was running on four synchronised PCs to drive the pixels on the LED walls in real time. At the same time, three Unreal operators could simultaneously manipulate the virtual scene, lighting, and effects on the walls. The crew inside the LED volume were also able to control the scene remotely from an iPad, working side by side with the director and DP.
“We wanted to create an environment that was conducive not just to giving a composition line-up to the effects, but to actually capturing them in real time, photo-real and in-camera, so that the actors were in that environment in the right lighting — all at the moment of photography,” Fraser explained to American Cinematographer.
Light from the panels reflects and refracts off surfaces on set and behaves as if the scene were being shot for real on location.
“By contrast on green screen you had to jump through so many hoops to achieve the desired lighting effect,” Morin says.
Lighting for real

The amount of green and bluescreen photography famously blighted Star Wars episodes 1-3.
“Green screen or black box virtual production is a very intellectual process that requires the actors to imagine how things will look and everyone else to figure it out later. If the director changes the backgrounds in post, then the lighting isn’t going to match and the final shot will feel false. Here, suddenly, it’s a very natural thing. The video walls bring us back to making decisions and improvisations on the set.”
The shiny suit of the title character, for example, would have caused costly green and bluescreen problems in post-production.
Instead, the cinematographers (including Barry Baz Idoine who took over from Fraser after he had set the show’s template) were able to work with the reflections of the LED lighting. For example, the DPs could request a tall, narrow band of light on the LED wall that would reflect on Mando’s full suit, like the way a commercial photographer might light a wine bottle or a car — using specular reflections to define shape.
“LEDs are emissive surfaces so you can essentially create any illumination pattern that you have in real life,” Morin says.
“It’s not just reflections, it will generate the entire light for the scene. You can also exclude the virtual world from being an effect and only light the real set. Everyone is still learning how to take advantage of these possibilities.”
The set was rigged so that the 90-degree open area behind the cameras could be closed off by two additional flat LED panels, wrapping the set in a complete, lit virtual environment.
“The environments started to become bolder and deeper as the team began to understand the size of the virtual space they were working in,” says Morin. “A 75-ft set is impressive, but the environment can look a thousand times bigger. You have an infinite sense of infinity. You can have a football-pitch-sized spaceship hangar or a desert vista with the sunset hundreds of miles away and the illusion is impressive.”
For the actors this approach was beneficial since they could relate more to the story surroundings, for instance knowing where the horizon is, even if the screen was only in their peripheral vision.
That said, the illusion will only appear perfect when viewed from the perspective of the motion-tracked camera.
“If you’re not at the camera’s focal point then it looks weird and a distortion of reality. That’s a signature of these kinds of sets.”
Practical elements, such as the fuselage and cockpit of the Mandalorian’s spacecraft Razor Crest, had to be built with attention to their light reflective properties in order to react to the LED illumination accurately, even if it was not in shot.
The production camera’s position was tracked by the motion-capture system via infrared cameras surrounding the top of the LED walls. When the system recognised the coordinates of the camera, it rendered the correct 3D parallax for the camera’s position in real time. The coordinates were fed from the camera tracking system into ILM’s proprietary Stagecraft virtual production software, which, in turn, fed the images into UE4. The images were then output to the screens.
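To make that parallax step concrete, the per-frame loop amounts to: read the tracked camera position, rebuild an off-axis ("generalized") perspective frustum that passes exactly through the wall rectangle, and redraw the wall content from that frustum. The Python sketch below illustrates the idea for a single flat panel under assumed geometry; the helper names (get_tracked_camera_pose, render_to_panel) are hypothetical placeholders, and this is a sketch of the general technique, not ILM's Stagecraft or Epic's actual code.

```python
# Illustrative off-axis projection for one flat LED panel (Kooima-style
# generalized perspective projection). Hypothetical helper names; not
# ILM Stagecraft or Unreal Engine code.
import numpy as np

def off_axis_projection(eye, pa, pb, pc, near=0.1, far=1000.0):
    """Projection matrix whose frustum passes through the screen rectangle
    defined by corners pa (lower-left), pb (lower-right), pc (upper-left),
    as seen from the tracked camera position `eye` (all in world space)."""
    vr = pb - pa; vr /= np.linalg.norm(vr)            # screen right axis
    vu = pc - pa; vu /= np.linalg.norm(vu)            # screen up axis
    vn = np.cross(vr, vu); vn /= np.linalg.norm(vn)   # screen normal, towards eye

    va, vb, vc = pa - eye, pb - eye, pc - eye          # eye -> corner vectors
    d = -np.dot(va, vn)                                # eye-to-screen distance
    l = np.dot(vr, va) * near / d                      # frustum extents at near plane
    r = np.dot(vr, vb) * near / d
    b = np.dot(vu, va) * near / d
    t = np.dot(vu, vc) * near / d

    # Standard OpenGL-style off-axis frustum (the accompanying view transform,
    # looking along the screen normal from `eye`, is omitted for brevity).
    return np.array([
        [2 * near / (r - l), 0, (r + l) / (r - l), 0],
        [0, 2 * near / (t - b), (t + b) / (t - b), 0],
        [0, 0, -(far + near) / (far - near), -2 * far * near / (far - near)],
        [0, 0, -1, 0],
    ])

# Example: a 6 m x 4 m wall in the z = 0 plane, camera tracked 3 m in front of it.
pa = np.array([-3.0, 0.0, 0.0])   # lower-left corner
pb = np.array([ 3.0, 0.0, 0.0])   # lower-right corner
pc = np.array([-3.0, 4.0, 0.0])   # upper-left corner
eye = np.array([0.5, 1.8, 3.0])   # from the mocap/tracking system
projection = off_axis_projection(eye, pa, pb, pc)

# Per-frame loop (pseudocode): as the camera moves, recompute the frustum and
# re-render, so the wall's imagery shows correct parallax for that position.
# while shooting:
#     eye = get_tracked_camera_pose()
#     render_to_panel(scene, eye, off_axis_projection(eye, pa, pb, pc))
```

Because the frustum is rebuilt for the tracked camera alone, the rendered imagery only lines up from that one viewpoint, which is the effect Morin describes below for anyone standing elsewhere on the set.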
Perfecting workarounds

The whole system also had a time delay of seven frames, says Morin (other reports suggest 10- to 12-frame latency), between the camera and the virtual world as a result of the processing needed to generate content in the software and hardware stack. Though less than half a second, this lag occasionally resulted in the camera moving ahead of the rendered field of view.
“We will reduce the lag further and there are a number of targets we can optimise and eliminate,” says Morin. “It’s an ongoing process.”
Video walls themselves are far from cheap and producers wanting to rent them will have to trade off performance with budget. Even for The Mandalorian’s reported $100 million budget, systems integrator Lux Machina advised a less expensive lower resolution LED for the ceiling portion of the set since these panels were mostly to be used for light reflection and rarely shown in camera.
Advanced LED panels, even the Roe Black Pearl BP2s used on The Mandalorian, may not have the fidelity for capturing extreme close ups and risk displaying moiré patterns.
“We forget that when filmmakers shoot live action it is rarely perfect and they’ve always had to adapt to what they find on the day,” Morin says. “The Mandalorian team wanted to experiment with the system and their expectations were low in terms of final shots they’d get from the wall. They were prepared to replace most pixels in post but knew it would at least give them a template to edit and they wouldn’t need post viz.”
In fact, the virtual production workflow was used to film more than half of The Mandalorian season 1, enabling the filmmakers to capture a significant number of complex VFX shots.

He adds, "But because they adapted to the conditions on set they ended up with a staggeringly high shot count of files good enough for the final picture."
“A majority of the shots were done completely in-camera,” Favreau confirms. “And in cases where we didn’t get to final pixel, the post-production process was shortened significantly because we had already made creative choices based on what we had seen in front of us.”
Potential issues with moiré were ameliorated by shooting with Arri Alexa LF (Large Format) and Panavision full-frame Ultra Vista anamorphic lenses.
“It allows the inherent problems in a 2D screen displaying 3D images to fall off in focus a lot faster,” Fraser explained, “So the eye can’t tell that those buildings that appear to be 1,000 feet away are actually being projected on a 2D screen only 20 feet from the actor.”
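For a rough sense of why the larger format helps, a thin-lens back-of-the-envelope calculation shows how much more the wall defocuses for the same framing and stop. The distances, focal lengths, T-stop and sensor widths in this Python sketch are assumptions for illustration, not production data from The Mandalorian.

```python
# Rough thin-lens comparison of how defocused an LED wall appears when the
# actor is in focus. All numbers are assumptions for illustration only,
# not lens data from The Mandalorian.

def blur_circle_mm(focal_mm, f_number, focus_m, background_m):
    """Diameter (mm) of the defocus blur circle on the sensor for a point at
    background_m when the lens is focused at focus_m (thin-lens model)."""
    f = focal_mm / 1000.0
    blur_m = (f * f * abs(background_m - focus_m)) / (
        f_number * background_m * (focus_m - f))
    return blur_m * 1000.0

actor_m, wall_m = 2.4, 6.0   # actor ~8 ft away in focus, LED wall ~20 ft away

# Same framing and stop: Super 35 (~25 mm wide) vs. large format (~36.7 mm wide),
# so the large-format body needs a longer lens for the same field of view.
setups = [("Super 35, 35 mm lens", 35, 24.9),
          ("Large format, 50 mm lens", 50, 36.7)]
for label, focal, sensor_w in setups:
    blur = blur_circle_mm(focal, 2.0, actor_m, wall_m)
    print(f"{label}: blur {blur:.2f} mm ({100 * blur / sensor_w:.1f}% of frame width)")
```

On these assumed numbers, the large-format setup roughly doubles the absolute blur circle and increases it by around 40% relative to frame width, so the panel surface, its pixel grid and any moiré fall out of focus noticeably faster.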
Where the latency from the camera position information to Unreal’s rendering became a real issue, they resorted to the conventional green screen.
“We got a tremendous percentage of shots that actually worked in-camera, just with the real-time renders in engine,” Favreau explained at Siggraph.
“For certain types of shots, depending on the focal length and shooting with anamorphic lensing, we could see in camera, the lighting, the interactive light, the layout, the background, the horizon. We didn’t have to mash things together later. Even if we had to up-res or replace them, we had the basis point and all the interactive light.”
Post-production was mostly about refining creative choices that they were not able to finalise as photo-real on the set.
Morin says that as production progressed the team became more comfortable using the technology and began to design with it in ways that are subtly apparent in later episodes. 
Season 2 production began in October 2019. 

Tuesday, 3 March 2020

Turbo boost the value of live

content marketing for Blackbird 
In sport a split second can separate the best in the world from the also-rans. It could be a marginal offside decision going the way of the striker in the last minute of extra time, or the blink-and-you-miss-it winning play of an underdog team at an international esports final.
The need for speed is equally true when maximising the value of sports rights.
If the event is not being watched live, and especially if the result is known, it quickly loses appeal to fans. The asset’s shelf-life diminishes rapidly the further it gets from the event itself. It’s why the value of live has always been at a premium and why the ability to let fans share in the action as near to real time as possible is vital if teams, federations and rights holders are to monetize the short window of opportunity.
The sports industry is already harnessing the power of digital and social media and there’s still plenty of room for growth. According to GlobalWebIndex, 22% of consumers say that following sports events is one of their main reasons for using social networks, climbing to 39% among live social video viewers. The challenge for sports broadcasters and sponsors now is to offer an immersive digital experience as well as a compelling reason to watch sport on TV, as they look to maximize audience reach and monetization.
Eleven Sports, for example, operates a dedicated online platform to rapidly clip, edit and publish premium sports content including the UEFA Champions League, La Liga, the NBA and Formula One seconds after live to its social channels and OTT platforms around the world.
In fact, a live stream can be edited just 6 seconds after live in Blackbird, with a curated clip posted to social platforms such as Twitter within 30 seconds.
Like an athlete wearing Nike Vaporfly, Blackbird turbo boosts editorial turnaround to give your sport the competitive edge.

Current events accelerate remote working

content marketing for Blackbird 
Current events have spurred a huge amount of thinking and planning about how workforces operate effectively. Indeed, they have accelerated a long-term trend of flexible and more sustainable home and remote working.
This has resulted in a significant rise in the use of virtual meetings and demonstrated the efficiency gains this flexible working brings as everything from conferences to commerce is re-thought. Corporates are finding they are required to spend less money on office space, infrastructure and travel.
What began as a rapid response by businesses in China has spread to global hubs, such as Silicon Valley, as a policy to discourage centralised congregations while keeping the economy going.
Two of the largest virtual workspace and productivity software platforms in the world, Alibaba’s DingTalk and Tencent’s WeChat Work, were overwhelmed by the number of users at the beginning of last month as the extent of the problem became clear, and have since become a key part of the containment effort.
Amazon, Google, Facebook, Apple and Microsoft are just some of the major corporations that have instructed parts of their workforce to work remotely, in turn driving a surge in demand for videoconferencing and online meeting services like Zoom.
Remote working is key to reducing the risk of transmission among daily commuters, and the benefits of telecommuting, including stress reduction, increased productivity, a wider talent pool and better work-life balance, won't subside with the epidemic.
The emergency adoption of remote work promises to speed up the transformation in the way companies operate – leading to long-term changes in workforce behaviour, according to analysts including Roberta Witty, at research and consulting firm Gartner, as quoted in the Wall Street Journal.
Those working in the professional video production industry have a head start, particularly those able to edit video remotely with Blackbird as easily as if they were on site with their original material.
That’s because Blackbird is the only professional video editor available in a browser. Production staff can work with Blackbird from home on Mac or PC with all the editing tools they need – even on bandwidth as low as 2 Mb/s. It’s a genuinely distributed ecosystem meaning that remote production teams can access video fast from any laptop, making it immediately available worldwide.
With the current crisis very much a developing story, many more companies in hotspot zones will likely begin considering or enforcing home working. When eventually everything does calm down, we may just find that this global disruptor has accelerated positive changes in the evolution of work.

Monday, 2 March 2020

AI is changing the way we edit videos

Redshark News
AI tools for automating the editing, finishing and posting of polished video clips on social media just took two giant leaps forward with the launch of Vimeo Create and fresh multi-million dollar funding for start-up Revl. Both are aimed initially at the burgeoning market of time-strapped but video-hungry business marketers.
Vimeo Create is a short-form video editing platform built out of Vimeo’s acquisition last year of Magisto for a reported US$200 million. Backed by Qualcomm, Magisto analyses images, video, speech and audio uploaded via its app to automate the production of a suitably professional clip for sharing on social. Users can select from a gallery of professionally designed video templates, or they can create a video from scratch using their own footage and storyboard.
“It’s a radically simple tool that shortens the distance from idea to execution so more businesses can have a successful video strategy,” said Vimeo chief executive Anjali Sud.
Venture Beat notes that Magisto “could play a big role in Vimeo’s broader pivot away from its former ‘YouTube alternative’ status as it looks to position itself as a place for creatives and businesses to access the tools they need to make videos.”
Video is vital for every business, yet the barriers to creating video remain high. Vimeo's own research found that while over half of small businesses used video in their marketing strategies last year, only 22% feel that they're using enough video.
“The research is clear: small business owners and entrepreneurs don't have the tools, time or budgets to make videos at the volume and quality needed to compete," added Sud.

The wider implications

San Francisco-based Revl just raised $5.2 million in a funding round led by Nimble Ventures, Silicon Valley Data Capital and Luma Pictures. That brings the five-year-old Revl’s total funds to $10 million.
The funding will be used to fuel the growth of its AI video production service, Revl X, with which it hopes to encourage customers of ‘adventure experiences’ to upload videos and share them online as a form of word-of-mouth marketing.
Co-founder Eric Sanchez admits, in a press release, that editing and delivering fully customised ‘video souvenirs’ of epic experiences in real-time is computationally and programmatically very difficult. Revl has apparently used AI and parallel processing to build a fully automatic system that edits and delivers HD cinematic videos in seconds.
What is unique about Revl is that this only works with its own hardware – an action camera which would be loaned to customers by the operators of adventure experiences – think skydiving, racing/driving, alpine coasters, and zip-lines.
How it works is that users select a video package from an Onboarding app and are provided with a QR sticker, which the 4K Arc camera scans at the beginning of each video. Cameras are then docked into an on-site Editing Box module, which automatically transfers and analyses footage, wipes the SD card, charges the camera and edits the videos and photos, complete with B-roll and animated intros and outros, with RAW files sent to Revl’s cloud.
“The user will find the finished video in the Revl app and can modify the music and share directly to social media,” it explains.
The Arc has a 12-megapixel sensor with a 150-degree field-of-view lens and captures 4K video at 30fps. It has a gimbal integrated into the rear of the camera which keeps it level with the horizon, plus electronic image stabilisation to reduce bumps and camera shake. It’s waterproof to 33ft (10m) and houses a 128GB MicroSD memory card plus two rechargeable lithium-ion batteries rated at 800mAh, 3.7V, 2.96Wh.
Revl also claims its solutions “understand human emotions” to identify key moments in an experience and that the cameras are “geospatially aware”. The AI uses this information to auto-create the clip.
The content also comes with hashtags and other key identifiers pre-embedded, so brands are automatically included in social posts.
“Our goal is to make our technology vertical agnostic in the next 1-2 years so it can be used in almost any kind of adventure or activity,” Sanchez said. “Just like you would expect on the Splash Mountain ride but with a video instead of just a photo, for example.”

What CG simulation and deepfakes mean for the future of performance

IBC
The resurrection of 1950s icon James Dean proposed for a new Vietnam war film is the latest in a growing army of the thespian dead, but CG simulation is posing equally great ethical and legal questions for the living.
With sophisticated facial and whole-body 3D scanning increasingly common, there are unresolved issues over who retains control of the actor’s likeness, their performance and, ultimately, the data.
Actor and director Andy Serkis raised the alert at IBC2019. He said digital scans of an actor’s performance could be captured and then repurposed for other movies or platforms.
“When your performance is captured as data it can be manipulated, reworked or sampled, much like the music industry samples vocals and beats. If we can do that then where does the intellectual property lie? Who owns authorship of the performance? Where are the boundaries?”
He urged acting unions Equity and SAG to examine the issue. “If an actor’s performance from one movie is re-used in another there should be remuneration for that actor, no question. It is their performance and no matter what platform it ends up on they should be paid for it.”
Far from scaremongering, such incidents may already be happening. The computer games industry is currently the largest employer of performance capture, with up to 500 character roles being created for a single game. Actors are typically paid £400 a day for performing body movements in a mocap suit, perhaps more if their facial performance is recorded, while voice work can command in excess of £1,000 a day.
There are reports that actors who have had their performance captured for the game they were contracted for, have had elements of their performance reused without credit (or financial reward) in other titles.
Tracking use of an original data-captured performance is tricky, given that any character or creature you can imagine can be animated using the artist’s work as a base.
This is doubly difficult when more than one actor’s performance is blended to create a character. The Serkis-directed Mowgli, for instance, featured CG creatures composed of body movements, facial and vocal performances captured from different actors.
“We can very accurately capture a body, face and vocal performance and bottle that information,” explains Matt Brown, the CEO of Serkis’ London studio Imaginarium. “Being co-owned by an actor, we take an ethical approach to what happens to that data. We would be very uncomfortable if a producer or studio wanted to use that data for a principal character or even a background character in another piece.”
Image right protection

One avenue worth exploring is whether studios should share some of the data about a performance with the actor so that they have a better record of what they’d delivered on a particular day. Blockchain might be used to track the provenance of information.
“There are complications with this,” Brown suggests. “For example, the actor may have a certain hairstyle, weight or be wearing a costume on the day of the scan so the legal protection about what is name and likeness and how that differs or is used down the line is not straightforward.”
He stresses: “At Imaginarium, we make it clear to actors from the outset how their performance data is being used and we’ve not had an instance where we’ve been asked to provide data at a later date for something that wasn’t contracted."
Conventionally, when an actor contracts with a studio they will assign rights to their performance in that production to the studio. Typically, that would also licence the producer to use the actor’s likeness in related uses, such as marketing materials or video games.

Similarly, a digital avatar will be owned by the commissioners of the work who will buy out the actor’s performance for that role and ultimately own the IP.
However, in UK law there is no such thing as an ‘image right’ or ‘personality right’ because there is no legal process in the UK which protects the Intellectual Property Rights that identify an image or personality.
The only way in which a pure image right can be protected in the UK is under the Law of Passing-Off.
“This essentially comes down to whether there has been a breach of confidence, reputation or ‘goodwill’,” explains Andrew Bravin, associate in the digital media, advertising and technology group at Soho-based lawyers Sheridans. “A breach of goodwill means the image or name has been misrepresented, or suffered reputational damage, by falsely showing an individual to have endorsed a product or service.”
The first successful case where goodwill was found to have been misused was brought by former F1 driver Eddie Irvine against Talksport in 2003.
In the era of deepfakes, though, technology can bootleg a celebrity, create counterfeit news and unauthorised artificial reality that is nearly indiscernible to the untrained eye.
In law, this is a widening grey area. “If there is technology that allows you to copy and mimic how an actor moves and you are able to create a new character based on those movements which the original actor claims they did not consent to then the law has yet to be tested,” says Bravin.
Actors like Serkis have already had to fight to have their motion-captured work recognised as performance in its own right.
“Once you have captured an actor volumetrically you have each element of them in isolation and you can start to control it in different ways,” says Andrew Shulkind, an LA-based director of photography involved in commercials. “If they miss a mark, for example, you could work their foot into just the position you need. There are AI tools which can manipulate the ‘character’ independently of the actual performer. At that point, does the actor still have agency?”
Digital necromancy, digital cryogenics

It is increasingly common for actors to be scanned during production, principally as an aid to VFX (for de-aging or stunt doubles). The stars of BBC series Good Omens also received full photo scans, showing that the process has entered TV.
Volumetric scans also provide insurance to the production in the case of an actor’s passing before the shoot is complete. Most cases of an actor’s likeness being used posthumously are not contentious, at least in so far as the person’s estate has offered consent.
“I think that it is ethical as long as we can bring the digital actor to the same level of talent and if they are used in roles that follow the path of their career,” says VFX artist Arturo Morales, who worked on Paul Walker’s posthumous appearance in Fast and Furious 7.
“Whether a certain project is ethical or not depends mainly on the purpose of using the ‘face’ of the dead actor,” he adds. “Legally, when an actor dies, the rights of their [image/name/brand] are controlled through their estate, which is often managed by family members. This can mean that different people have contradictory ideas about what is and what isn’t appropriate.”
The recreation of Peter Cushing two decades after his death in Rogue One was carried out with the full approval of the actor’s estate, although it is unknown what kind of financial remuneration was offered, or even whether Lucasfilm and Disney needed the estate’s approval at all.
“After all, the appearance of Tarkin in the Star Wars films is the intellectual property of Disney and presumably, they can do whatever they want with that material,” Morales notes. “As this VFX method becomes more and more popular, it will certainly be something every famous actor will start thinking about. In the same way, they leave a will, they will need to decide how their image will be used after they are gone.”
Robin Williams had already spotted this trend and made a clear stipulation in his will to prevent his image from being used for any purpose for 25 years after his death.
Oo-ee-oo you look just like Buddy Holly

Famously protective pop artist Prince called the holography of dead stars ‘demonic’. Last year, holograms of Roy Orbison and Buddy Holly completed a UK tour.
Re-animation doesn’t require digital scans, of course. Walker’s brother stood in for scenes to finish Fast and Furious 7, with previously filmed close-ups of Walker’s face composited over the top. Still photos and archive footage combined with body doubles were used to bring Audrey Hepburn and Gene Kelly back to life for TV ads, both with the consent of the deceased’s estate. James Dean’s appearance will be created using a similar method.
“If you have data or photogrammetry then conceivably you can get another actor in to mimic or puppeteer a dead actor quite accurately,” Brown says.
While the ability to believably simulate the life and soul of an actor’s eyes has yet to be solved in computer imagery, de-ageing techniques have been used effectively in movies like The Irishman and TV shows like HBO’s The Righteous Gemstones, where John Goodman was de-aged 30 years for one episode. In this instance, the performance in Goodman’s eyes was largely left untouched by the VFX team handling his facial de-aging.
More and more A-listers are reportedly selling their image rights for films which will be made when they’re dead. Others may be inclined to bank 3D scans of themselves in their prime as a form of digital cryogenics.
“When you add it all together [dead people] can begin to have new consciousness,” Framestore chief creative officer Mike McGee told The Telegraph last year.
“It’s only a small step to interactive conversations with holographic versions of dead celebrities or historical figures.”
A question of ethics

The advance of performance capture and VFX techniques can be liberating for much of the acting community. In theory, actors would be cast on talent alone, rather than defined by how they look.
“Performance capture is the end of typecasting,” declared Serkis. “With it, anyone should be able to play anything.”
Serkis champions the idea that performance capture be used to promote greater diversity within the industry, but he is also wise to the political sensitivities this throws up. What would the reaction be if a man performance-captured the role of a woman?
“There should be great opportunities for disabled actors to play able-bodied characters,” he said. “It would be possible for an actor of colour to play Abraham Lincoln and for me, as a middle-class white man to play Martin Luther King.”
“The question is whether that is ethically right.”