Monday 9 August 2021

The future of production: Top 5 trends

IBC

The cinematic universes of Star Wars and DC are part of a far wider Metaverse which is being built today but won’t come to fruition for decades. 

https://www.ibc.org/trends/the-future-of-production-top-5-trends/7811.article

“From a technology perspective we are extraordinarily close,” says Ted Schilowitz, Futurist, Paramount Studios. “We are already experiencing the Metaverse in various ways albeit mostly using traditional screens. 

“Overt examples are games like Fortnite where people gather in a virtual space but it’s not just trying to achieve a game objective. It’s about socialising in a more avatar-oriented three-dimensional space.” 

The experience will span digital and physical worlds, private and public networks, and offer unprecedented interoperability of data, digital items/assets and content across each of these experiences. 

According to former head of strategy at Amazon Studios, Matthew Ball, “The Metaverse is likely to produce trillions in value as a new computing platform or content medium. In its full version it becomes the gateway to most digital experiences, a key component of all physical ones, and the next great labour platform.” 

Tech companies like Epic Games, Nvidia and Apple all want to own stakes in the Metaverse. Content owners do too. Heading the charge at ViacomCBS, Schilowitz is exploring how assets from Nickelodeon to Star Trek can interface with the emerging forms of visual computing. 

“That is the next big step in entertainment and it is starting to take off,” he says. “New VR/AR devices will enable us to enter these entertainment worlds and take you to the holodeck.” 

Schilowitz’s advice to producers is: “Don’t try to lock into one platform or another. Speak to the creative core of the business, not the tech side. The tech core should support the creative essence, not the other way around.” 

The final picture may be indistinct but the outline of the edges is already here. 

“The problem is like being a 16th Century surgeon trying to explain how the human body works,” says Sol Rogers, founder and CEO at VR producer Rewind. “We know some of the outside bits and that some bits are plugged together but we’ve no idea how it all works.” 

Sticking with the analogy, our medical knowledge has now advanced to enable us to sequence RNA to fight global pandemics, but the journey has been a long one. When it comes to the Metaverse we’re still in the foothills of learning. 

“It is the layering on of a hybrid representation of ourselves in another [virtual] place or universe that connects humanity using technology,” says Rogers. “That is the direction of travel.” 

 

AI is a tool for hire 

AI/ML has entered media and entertainment to automate speech-to-text captioning or to reduce the cost of storage. Algorithms are also foundational for next-gen video compression.  
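To make the captioning use case concrete, here is a minimal sketch of automated speech-to-text using the open-source Whisper model. The article names no specific tool, so the library choice, model size and audio filename below are illustrative assumptions rather than anything a particular broadcaster uses.

    import whisper  # open-source speech-to-text model (pip install openai-whisper)

    # "base" is a small, fast model; larger variants trade speed for accuracy.
    model = whisper.load_model("base")
    result = model.transcribe("interview.wav")  # hypothetical audio file

    # Print rough caption cues from the timestamped segments Whisper returns.
    for seg in result["segments"]:
        print(f"{seg['start']:7.2f} --> {seg['end']:7.2f}  {seg['text'].strip()}")
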

On the creative side, AI tools such as Adobe Sensei are helping creatives speed up video assembly. Colourlab Ai can quickly match footage, taking the pain out of the more laborious aspects of grading and allowing colourists to focus more on the creative aspects of their work. 

Other AIs can assist in performing a lot of the routine work of visual effects.  Framestore and Bournemouth University are funding a research programme to help solve key problems facing the VFX industry using ML. 

Manne Öhrström, Framestore’s Global Head of Software VFX, says, “The potential for using ML in areas like lighting and rendering is huge.” 

“The goal is not to replace the animator but to get it to the point where the animator can bring it to the next level,” says DNEG’s global head of research Roy C Anthony. “AI has a lot of potential to help express our creative potential by simplifying a lot of frustrating tasks and accelerating work and enabling artists to get that aha! moment as quickly as possible.” 

A number of startups are creating artificial voice actors for hire. No longer stilted and robotic, these synthetic voices can be trained to match a particular style, gender or type of production. 

Sonantic.io makes voices for video-game characters. It claims to reduce production timelines from months to minutes by rapidly transforming scripts into audio. Users can create “highly expressive, nuanced performances with full control over voice performance parameters.” 
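Sonantic’s own tooling is proprietary, so purely to illustrate the script-to-audio idea, the sketch below renders a line of dialogue to a file with the basic offline pyttsx3 engine. The dialogue, voice selection and output filename are invented, and the result is nothing like the expressive neural performances described above.

    import pyttsx3  # simple offline text-to-speech engine

    engine = pyttsx3.init()
    engine.setProperty("rate", 160)                # speaking rate (words per minute)
    voices = engine.getProperty("voices")
    if voices:
        engine.setProperty("voice", voices[0].id)  # pick an installed system voice

    # Render one scripted line straight to an audio file.
    engine.save_to_file("The storm is moving in faster than we thought.", "line_01.wav")
    engine.runAndWait()
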

Controversially, the producers of Roadrunner: A Film about Anthony Bourdain used AI to simulate the television host’s voice for three lines of dialogue (which Bourdain wrote but never said). It was controversial because the filmmakers didn’t acknowledge this up front. 

That’s in contrast to director David France, who openly used an AI-driven VFX technique to replace the faces of key witnesses in Welcome to Chechnya, his undercover reporting of gay rights abuses. The effect was so convincing that France deliberately included a ‘halo’ around each replaced face to highlight its use and avoid deceiving viewers. It also proves that face replacement tools are available to producers of docs, not just Marvel movies. 

Nonetheless, as Karen Hao puts it in an article for MIT Technology Review, “AI voices are cheap, scalable, and easy to work with.” 

That has acting union SAG-AFTRA concerned that actors may not be compensated fairly, or may lose control over their voices, which constitute their brand and reputation. 

 

Immersive real-time production 

If you’ve got $15 million you can make a 35-minute VFX-driven drama on a par with The Mandalorian, but most producers will be lucky to make an entire series on that budget. Fortunately, the virtual production technology and techniques with which Disney+ wowed the industry are rapidly becoming more widely available. Virtual production also happens to be the form of live-action production best suited to COVID safety. 

“Increased virtual set utilisation has taken on two forms,” says Liam Hayter, Senior Solutions Architect, NewTek. “The first is bringing remote participants via chromakey into a virtual environment, especially where presenters and guests alike are unable to travel to a studio. The second is for a set extension from smaller studio spaces, where budget and space are at a premium but the look and feel of a full studio environment is desired.” 

Demand for virtual sets is soaring across broadcast, corporate communications, advertising and events. Using LED screens rather than greenscreen enables realistic reflections and refractions that actors and camera operators can easily see and explore. 

“Allowing the performer or presenter to see what’s happening around them instead of just a blank screen located behind them, creates a more natural flow to a production,” says Lanz Short, Technical Solutions Manager, disguise. “LED volumes also remove the need for chroma keying, which can be challenging when working in a small studio.” 
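For readers unfamiliar with why keying can be fiddly, the rough sketch below composites a greenscreen frame over a rendered background with OpenCV. The filenames are hypothetical and the colour thresholds are guesses that would need per-shot tuning, which is exactly the sensitivity an LED volume avoids.

    import cv2
    import numpy as np

    frame = cv2.imread("presenter_greenscreen.png")   # hypothetical foreground plate
    background = cv2.imread("virtual_set.png")        # rendered virtual set, same resolution

    # Classify "green enough" pixels in HSV space; these bounds are illustrative only.
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, (35, 80, 80), (85, 255, 255))
    mask = cv2.medianBlur(mask, 5)                    # soften ragged mask edges

    # Where the mask fires, show the virtual set; elsewhere keep the presenter.
    composite = np.where(mask[..., None] > 0, background, frame)
    cv2.imwrite("composite.png", composite)
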

Executive producer Erik Weaver at the Entertainment Technology Center@USC argues that VP should be considered an integral part of physical production. “The concept of virtual production as a separate entity exists only because virtual production tools, techniques, and workflows require new skill sets, an adoption of computing and computer language on set, and team leadership with a strong understanding of VFX.” 

The goal for the industry should be to educate everyone in production to acquire skills in computing, real-time technology and baseline VFX knowledge. 

 

Screenlife becomes a genre 

Production has just wrapped on French suspense feature The Pilot, about a drone pilot in Mali whose wife and daughter are kidnapped by terrorists. Directed by Paul Doucet, the film is the latest example of the screenlife genre, in which everything the viewer sees happens on a computer screen, a smartphone or a tablet. 

Other recent successful examples include Searching, Unfriended and Profile, all produced or directed by the genre’s mastermind, Timur Bekmambetov. Teen horror Unfriended made $65 million on a budget of just $1m, spawning a 2018 sequel. 

Previously, filmmakers have wrestled with how to show digital communications on screen. Shots looking over an actor’s shoulder as they type quickly date a film, because the technology shown goes out of date so fast. They can also be a dramatic black hole. 

Screenlife films take place entirely from the viewpoint of a computer interface and webcam. With more of us living our lives on screen, Bekmambetov contends that shooting stories from this perspective can reveal the inner life of characters. 

For Profile, a based-on-fact drama about a journalist who connects with a jihadist online, the Kazakh-Russian filmmaker wrote special software. Using it, the actors perform all of the computer movements that appear on screen including Google search and video calls. 

“We recorded their faces and their manipulation with the screen at the same time,” Bekmambetov explained to Moviemaker. “This is what is revolutionary for Screenlife.” 

Rather than filming the actors with conventional cameras in a studio, the format calls for recording device screens.  GoPros were mounted on the back of a laptop and fixed as close as possible to the device’s built-in camera to reproduce that camera’s angle. Some devices had a microphone mounted on the back as well. The lead actors in Profile were filmed interacting while located in different countries. 

“We all know how to make Instagram photos, we know how to shoot ourselves,” he says. “The actors could compose their shots because they do it in real life, too.” 

Bekmambetov founded Screenlife Technologies, which develops AI and deep learning products for the mass production of Screenlife content. The technique has earned him a place among Fast Company’s Top 10 most innovative video companies in the world, and he has a five-picture deal with Universal. The Pilot proves that Screenlife is not Bekmambetov’s alone. 

 

Video games fuse with film

Virtual production technologies and the arrival of lighter, better-quality, more powerful VR and AR goggles will blur the lines between the previously distinct disciplines of filmmaking and gaming. At the same time, IP from different media – animation, TV, novels – is being built into new story worlds to fuel the Metaverse. These twin trends demand a cross-pollination of skills. 

Netflix’s new gaming division will focus on mobile and could build on interactive TV experiments like Bandersnatch. Netflix COO and CPO Greg Peters said in a Q2 2021 earnings interview video that the company plans to focus on its original IP in order to differentiate itself from what everyone else is offering:

“We are in the business of creating these amazing worlds and great storylines and incredible characters, and we know the fans of those stories want to go deeper.” 

Linear and interactive narrative skills will blend to create content in extended reality (XR), which Sol Rogers, CEO of Rewind, says is “transitioning into an established industry at the forefront of innovation.” 

Creating XR content, or even virtual production content destined for playback in cinema, requires a mix of disciplines from film and TV, gaming and animation. 

The UK’s Rebellion Studios is at the forefront of this change. It’s a highly successful games developer with sound stages and expertise in performance capture and VFX. The studio is currently producing Rogue Trooper, directed by Duncan Jones and based on a character from the 2000 AD comic strip. 

A recent BFI-supported ScreenSkills survey revealed high demand for producer roles in post-production and VFX, as well as for engine coding skills in VFX and for high-end technical operators in post-production. 

A World Economic Forum report identifies core job market requirements for skills in artificial intelligence, blockchain and coding languages. 

“Game making and app development skills are crossing over. Film and TV can bring financing and distribution models (such as co-production partnerships) to the mix,” says Becky Gregory-Clarke, Head of Immersive at StoryFutures Academy. “Be sure to spend enough time scoping the advantages and disadvantages of working on immersive projects with teams from different creative industry backgrounds - they don’t all talk the same language!” 

She is co-producing a forthcoming XR project from Academy Award- and BAFTA-winning director Asif Kapadia. The director of docs such as Amy is adapting the graphic novel Laika into animated VR with animation studio Passion Pictures. 

Kapadia says: “VR works best when it is rooted in something real and my ambition is for viewers to believe they are really there with Laika, on Earth, during training and finally in space. I hope this film encourages a wider audience, young and old, to experience a new form of cinema and technology.” 

 
