Monday 16 August 2021

Video Game Technology’s Impact on Filmmaking Has Only Just Started

NAB

Forty years ago, Tron transferred the visuals of a computer game onto film. Nowadays it is games that are pushing the technical and creative innovations of how we make movies.

https://amplify.nabshow.com/articles/video-game-technologys-impact-on-filmmaking-is-just-beginning/

The video games business overtook Hollywood some years ago. Cinema may be a $100 billion global industry, but video games raked in $180 billion last year.

“Perhaps the most important point is that the game industry can massively outspend the film industry when it comes to developing new storytelling tools,” says Bryant Frazer, blogging at Frame.io. “And that opens up a huge opportunity for them to grow and adapt.”

Even directors of the calibre of the Russo brothers (Avengers: Infinity War) are wowed by games technology and its potential for next-level content creation.

“It feels like we’re moving from filmmaking 1.0, jumping to filmmaking 5.0 — and I don’t see any other path forward that is nearly as exciting, compelling, or freeing for artists as this path,” Joe Russo said recently.

Leaning on Frame.io's article, here's how video games are changing the movie biz, from production to content.

Games engines are the new soundstage

Games engines are being routinely used to render background environments in real time for display on the LED walls of studio volumes. It’s a process that allows scenes to have incredible CG environments with almost no post-production required.

Games engine developers including Notch, Unity and Epic are actively encouraging Hollywood’s use of their tech by adding new features specifically designed for filmmakers.

For example, the new version of Unreal Engine includes a Virtual Camera plug-in that lets you use an iPad Pro as the viewfinder for an imaginary camera.
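The principle behind such virtual camera tools is simple: the handheld device streams its tracked position and orientation over the network, and the engine applies that pose to a CG camera every frame. Here is a rough Python sketch of the idea; the UDP port and packet layout are invented for illustration and are not Unreal's actual Live Link protocol.

```python
# Illustrative only: a pose receiver for a hypothetical virtual camera.
# The packet format (7 little-endian floats: position xyz + quaternion
# xyzw) and the port number are invented, not Unreal's real protocol.
import socket
import struct

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", 9000))  # port the tablet streams pose packets to

while True:
    data, _ = sock.recvfrom(1024)
    x, y, z, qx, qy, qz, qw = struct.unpack("<7f", data[:28])
    # a real integration would apply this pose to the engine's CG camera
    print(f"camera at ({x:.2f}, {y:.2f}, {z:.2f})")
```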

The technology is being applied to deliver a final shot, in-camera, without the resource and time cost of traditional VFX post-production tools. For instance, when Amazon approached production company Aggressive to design the set and show package for the companion TV series Prime Rewind: Inside The Boys, Aggressive spearheaded a virtual-set technique, which it calls “XR SetMapping”, that unified the look of the entire project. The team was able to track multiple live cameras with virtual 3D backdrops, and then add AR elements in real time.

According to Bryant, such technology is spreading like wildfire through scripted programs. Among them are ABC’s Station 19, an action-rescue show lensed by Daryn Okada, ASC, and the upcoming Taika Waititi-helmed comedy Our Flag Means Death.

Adding depth

For many productions, simply shooting beautiful footage is no longer enough. Modern VFX teams need more (and better) information to mesh their digital creations into the real-world scene that was captured by the camera.

This can be achieved through volumetric capture and photogrammetry, techniques which use depth-sensing cameras or camera arrays to extrapolate 3D models of an object or scene. Those models can then be used in a virtual environment or post-production.
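To make the geometry concrete, here is a minimal Python sketch of the core step behind depth-based capture: unprojecting each pixel of a depth map into a 3D point with a pinhole camera model. The intrinsics are invented example values, and this illustrates the principle rather than any vendor's pipeline.

```python
# A minimal sketch of depth-map unprojection, the first step in turning
# depth-camera frames into a 3D point cloud. Intrinsics are invented.
import numpy as np

def depth_to_point_cloud(depth, fx, fy, cx, cy):
    """Convert a depth map in metres into an (N, 3) array of 3D points."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))  # pixel coordinates
    x = (u - cx) * depth / fx                       # back-project in x
    y = (v - cy) * depth / fy                       # back-project in y
    points = np.stack([x, y, depth], axis=-1).reshape(-1, 3)
    return points[points[:, 2] > 0]                 # drop pixels with no reading

# e.g. one 640x480 frame from a time-of-flight or structured-light sensor
depth = np.random.uniform(0.5, 4.0, size=(480, 640))
cloud = depth_to_point_cloud(depth, fx=525.0, fy=525.0, cx=319.5, cy=239.5)
```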

Writer-director Neill Blomkamp (District 9, Elysium) has embraced volumetric capture on a large scale. His new film Demonic includes more than 15 minutes of volumetric footage.

As described by Bryant, for scenes where the film’s main character explores a simulation of the brain of her comatose mother, actors performed inside a 260-camera volumetric rig. Those volumetric captures were composited into 3D environments using the Unity engine and a new, patent-pending technology code-named Project Inplay, which is designed for real-time playback, rendering, and even dynamic relighting of large volumetric point clouds.

Volumetric capture and photogrammetry are not limited to exotic, enterprise-level workflows. Productions of every scale can already start taking advantage of these new tools. Scatter sells software that works with devices like Microsoft’s Azure Kinect and Intel’s depth cameras. Apple even has a new API called Object Capture that will allow developers to turn any Mac or iPhone into a tool for photogrammetric capture.

This leads Bryant to suggest that, as the power and affordability of GPUs continue to grow, these developments “will have a profound impact on how we tackle production and post-production.”

Eventually, volumetric capture, light fields or computational cinematography will lead to the creation of true autostereo 3D or holographic content, with implications for theater-like live performances by avatars. We’ll see.


Previz, techviz, postviz and more

Virtual production is not limited to LED stages and complex volumetric capture rigs. The term, and the technology, encompasses everything from previz to postviz. There are tools for this derived from the games world that can work for filmmakers on any budget.

“Filmmakers are looking at these toolsets and thinking, ‘I used to need a team to do that, but now I can do my shot blocking in the game engine,’” says Jim Geduldick, SVP of Virtual Production at Dimension North America, a branch of London-based Dimension Studio. 

CineTracer is a $90 real-time cinematography simulator. The app uses Unreal Engine to help you work out scene lighting, shot blocking, and storyboards, all inside what is essentially a video game.

“These sorts of tools will be the bridge for many into virtual production,” Bryant says. “Workflows will evolve to include these kinds of software tools, and then as more affordable LED stages open up in major shooting markets, we’ll start to see turnkey services being offered to filmmakers.”

Another example: Ghostwheel’s Previs Pro app creates storyboards from virtual cameras, lights, characters and props in 3D environments. It even has an Augmented Reality mode to help you visualize your scene in a real space.

Matt Workman of CineTracer adds, “If you’re on a small team and you want to make a film using these technologies, you can do that without ever setting foot on an LED volume. Use it to do all of your VFX, blocking, storyboarding and previs, then go off and shoot traditionally.”

Kick-starting the boom in games engine filmmaking was Epic Games’ decision to open up its technology in 2015 by making Unreal source code available on GitHub. As with Unity, and unlike traditional filmmaking gear, a core technology of current and future content production is essentially free for anyone to use.


Joystick cameras and digital puppeteers

When VFX legend Rob Legato helped DP Caleb Deschanel make The Lion King as a photoreal animated production, he enabled the crew to use traditional camera equipment to set up and execute shots in virtual reality, in the same way the shots would be achieved on location.

“Caleb is a fabulous artist but he has no experience of digital translation so my job was to navigate the mechanics of this for him,” explained Legato.

Essentially, that meant providing Deschanel with an interface between the virtual world and the tools of conventional filmmaking, in such a way that the DP could call and operate a shot just as on any other movie.

Camera moves were choreographed using modified camera gear – cranes, dollies, Steadicam (even a virtual helicopter, operated by director Jon Favreau) – to allow the filmmakers to ‘touch’ their equipment with the motion tracked by sensors on the stage ceiling and simulated directly within the virtual world.

“Instead of designing a camera move as you would in previs on a computer, we lay dolly track down in the virtual environment,” says Legato. “If I want to dolly track from this rock to that tree, the dolly has real grip and inertia and a pan and tilt wheel which is sending data back to the virtual environment. It’s not a facsimile. In that way you retain the imperfections, the accidents, the little idiosyncrasies that make human creative choices but which would never occur to you if you made it perfectly in digital.”
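As a toy illustration of what Legato describes, the sketch below accumulates raw encoder ticks from physical pan and tilt wheels into a virtual camera's rotation. The class and the tick-to-degree scale are invented; a production system would feed this data straight into the engine's camera rig.

```python
# Invented illustration: raw wheel-encoder ticks drive a virtual camera,
# preserving the operator's hand-made motion, imperfections and all.
class VirtualCamera:
    def __init__(self):
        self.pan = 0.0    # yaw in degrees
        self.tilt = 0.0   # pitch in degrees

    def apply_wheel_ticks(self, pan_ticks, tilt_ticks, degrees_per_tick=0.05):
        """Accumulate encoder ticks sampled once per frame."""
        self.pan = (self.pan + pan_ticks * degrees_per_tick) % 360.0
        self.tilt = max(-90.0, min(90.0, self.tilt + tilt_ticks * degrees_per_tick))

cam = VirtualCamera()
for pan_ticks, tilt_ticks in [(12, -3), (9, -2), (15, 0)]:  # per-frame samples
    cam.apply_wheel_ticks(pan_ticks, tilt_ticks)
```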

Since video game developers have spent decades perfecting the design of game controllers (the human-machine interface), it makes sense that similar designs are applied when filmmakers use games software.

Some enterprising video editors have even reprogrammed control pads aimed at professional gamers to work with post-production software, by mapping certain keyboard shortcuts to buttons, triggers, and dials, reports Bryant.
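A minimal sketch of such a remap, assuming the pygame and pynput libraries: controller button events are translated into ordinary keystrokes that the editing software already understands. The button numbers and shortcut mapping are invented; real NLE shortcuts vary by application.

```python
# Hedged sketch: translate gamepad buttons into editing keystrokes.
# Button numbers and the shortcut mapping are invented examples.
import pygame
from pynput.keyboard import Controller, Key

BUTTON_TO_KEY = {
    0: "i",        # e.g. A button -> mark in
    1: "o",        # e.g. B button -> mark out
    2: Key.space,  # e.g. X button -> play/pause
}

pygame.init()
pygame.joystick.init()
pad = pygame.joystick.Joystick(0)  # first connected controller
pad.init()
keyboard = Controller()

while True:
    event = pygame.event.wait()  # block until the next input event
    if event.type == pygame.JOYBUTTONDOWN and event.button in BUTTON_TO_KEY:
        key = BUTTON_TO_KEY[event.button]
        keyboard.press(key)      # the NLE just sees a normal keystroke
        keyboard.release(key)
```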

Taking this a stage further, consider a video game controller as a puppeteering device, one that allows you to make on-screen characters interact with the live action on set in real time.

The Jim Henson Company has done just this for the Disney+ show Earth to Ned, which combines live action, animatronics, and real-time on-set performance-driven CG animation. The character BETI is a CG artificial intelligence entity rendered in real time on set in Unreal Engine.

To have BETI appear to be physically on set, the plan was to create ‘rooms’ she could float in with screens inserted into the set. By tracking all the cameras in real time, it was possible to generate the correct parallax to create the illusion of there being volumes behind the screens.
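The standard way to get that parallax right is an off-axis (generalized) perspective projection: the render camera's frustum is pinned to the physical screen's corners, so as the tracked camera moves, the view behind the screen shifts correctly. A minimal sketch of the math, with invented screen and camera positions:

```python
# A minimal sketch (invented coordinates) of the off-axis projection math
# behind tracked-camera parallax: the view frustum is pinned to the physical
# screen's corners, so moving the camera shifts the rendered view correctly.
import numpy as np

def off_axis_frustum(eye, pa, pb, pc, near=0.1):
    """Frustum extents for a screen with corners pa (lower-left),
    pb (lower-right) and pc (upper-left), seen from the eye point."""
    vr = (pb - pa) / np.linalg.norm(pb - pa)  # screen right axis
    vu = (pc - pa) / np.linalg.norm(pc - pa)  # screen up axis
    vn = np.cross(vr, vu)                     # screen normal, toward viewer
    vn /= np.linalg.norm(vn)
    d = -np.dot(pa - eye, vn)                 # perpendicular eye-screen distance
    left   = np.dot(vr, pa - eye) * near / d
    right  = np.dot(vr, pb - eye) * near / d
    bottom = np.dot(vu, pa - eye) * near / d
    top    = np.dot(vu, pc - eye) * near / d
    return left, right, bottom, top

# a 2 m x 1 m screen at z = 0; tracked camera 3 m back, offset up and right
eye = np.array([0.5, 0.2, 3.0])
l, r, b, t = off_axis_frustum(eye,
                              pa=np.array([-1.0, -0.5, 0.0]),
                              pb=np.array([ 1.0, -0.5, 0.0]),
                              pc=np.array([-1.0,  0.5, 0.0]))
```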

About a dozen parameters are connected to character attributes that are programmed in the Henson Performance Control System. This enables the performers to use the puppeteer interface to adjust things like brightness, frequency, and effects scaling using inputs such as hand controls, pedals, and sliders.
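What such a mapping layer might look like, sketched with invented parameter names and ranges (the actual Henson Performance Control System is proprietary):

```python
# Invented illustration of a puppeteer mapping layer: physical inputs
# (sliders, pedals, dials) scaled onto named character attributes. This is
# in the spirit of, but not actually, Henson's proprietary control system.
CHANNEL_MAP = {
    "slider_1": ("brightness", 0.0, 2.0),
    "pedal_1":  ("effect_scale", 0.5, 3.0),
    "dial_1":   ("blink_frequency", 0.1, 4.0),  # blinks per second
}

def scale(raw, lo, hi):
    """Map a raw 0..1 input onto the attribute's working range."""
    return lo + max(0.0, min(1.0, raw)) * (hi - lo)

def apply_inputs(raw_inputs, character):
    for channel, raw in raw_inputs.items():
        attr, lo, hi = CHANNEL_MAP[channel]
        character[attr] = scale(raw, lo, hi)

beti = {"brightness": 1.0, "effect_scale": 1.0, "blink_frequency": 0.5}
apply_inputs({"slider_1": 0.8, "pedal_1": 0.25, "dial_1": 0.5}, beti)
```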

“The guests on the show were really excited when they came on … because I think people assumed it was going to be a post effect,” says Brian Henson. “That illusion was complete for the guests who were there. They couldn't see any puppeteers. They just saw Ned and they saw BETI. And it's fabulous when that happens.”

Altering storytelling

To my mind, the most exciting impact of games tech on filmmaking is how it will alter storytelling. With virtual cameras integrated into the most popular franchises – Fortnite, Minecraft, Roblox, Madden NFL, NBA 2K, and countless more – video games are teaching millions of young people not just visual storytelling but interactive digital production.

Matt Workman points out that young people are already pioneering a new, playful style of real time entertainment that hasn’t really crossed over to linear media yet.

“If anything has the potential to shake up the notoriously conservative film business, it’s a whole new generation of media-savvy creatives with an intuitive understanding of shot framing, action choreography, and editorial techniques and a decided lack of reverence for established styles and genres.”

Beyond Tron to new formats

Tron may be the granddaddy of representing computer games on screen. Other examples, sometimes adaptations of games franchises, include Street Fighter, The Lawnmower Man, Lara Croft, Warcraft, The Matrix and Detective Pikachu. All play out in the linear, directed medium of cinema. None are interactive, which is the essence of the computer game.

As Bryant says, “Video games have now come into their own as a storytelling medium. Modern games boast astoundingly realistic graphics, complex open worlds, and emotionally nuanced narratives that rival the best of Hollywood. But how will the ideas and skills learned from video games influence the next generation of filmmakers?”

The medium of cinema is perhaps not the best fit for the interactive potential of a games storytelling culture. Live streams on YouTube, Twitch and Instagram encourage interactivity, solving a problem that traditional TV has never managed to tackle. Already, the VTuber CodeMiko (a virtual character performed in real time by an L.A.-based animator and coder) has grown into a full-fledged business employing five developers, a management firm and a publicist, with 750,000 Twitch subscribers.

There’s demand for pre-recorded content, too: YouTube viewers watched 100 billion hours of gaming content in 2020, with video related to Microsoft’s megahit Minecraft alone earning 201 billion views.

“If Twitch streaming and Minecraft movies seem like niche interests compared to the shows on HBO Max and Disney+, consider that independent creators on social media are definitely getting paid,” says Bryant. “Even many small YouTubers make the same or more money than many independent filmmakers.”

So, if video game creators are eschewing traditional content distribution models, and still banking cash, what does that mean for film and TV as a business? 

