Monday, 29 June 2020

Virtual Production Can Be a Reality for Everyone. Here’s How


Creative Planet

Think virtual production is the preserve of James Cameron? The confluence of game engines, faster PCs, LED backlots and off-the-shelf tools for everything from performance capture to virtual cameras is bringing affordable real-time mixed reality production to market.

Cameron saw this coming, which is why he has upped the ante to where no filmmaker has gone before and decided to shoot the first Avatar sequel as a virtual production under water. Not CG fluids either, but with his actors holding their breath in giant swimming pools.

“The technology has advanced leaps and bounds at every conceivable level since Avatar in 2009,” says Geoff Burdick, SVP of Production Services & Technology for Cameron’s production outfit Lightstorm Entertainment.
Massive amounts of data are being pushed around live on the set of Avatar 2, Burdick says. “We needed High Frame Rate (48fps) and high res (4K) and everything had to be in 3D. This may not be the science experiment it was when shooting the first Avatar but... our setup is arguably ground-breaking in terms of being able to do what we are doing at this high spec and in stereo.”
That is just the live-action part. Performance capture of the actors finished two years ago; that material is being animated at Weta and then integrated with principal photography at Manhattan Beach Studios.
Avatar 2 may be state of the art but it’s far from alone. Most major films and TV series created today already use some form of virtual production, whether previsualization, techvis or postvis. Epic Games, the maker of Unreal Engine, believes the potential for VP to enhance filmmaking extends far beyond even these current uses.
Examined one way, VP is just another evolution of storytelling – on a continuum with the shift to color or from film to digital. Looked at another way it is more fundamental since virtual production techniques ultimately collapse the traditional sequential method of making motion pictures.
The production line from development to post can be costly, partly because of the timescales and partly because of the inability to truly iterate at the point of creativity. A virtual production model breaks down these silos and brings color correction, animation and editorial closer to camera. When travel to far-flung locations proves challenging, whether due to Covid-19 or carbon-neutral policies, virtual production can bring photorealistic locations to the set.
Directors can direct their actors on the mocap stage because they can see them in their virtual forms composited live into the CG shot. They can even judge the final scene with lighting and set objects in detail.
What the director is seeing, either on a tablet or inside a VR headset, can be close to final render, which is light-years from where directors used to be before real-time technology became part of the shoot.
In essence, virtual production is where the physical and the digital meet. The term encompasses a broad and ever-growing spectrum of computer-aided production and visualization tools and techniques. It means you don’t need the $250 million budget of Avatar 2 to compose, capture, manipulate and all but finish pixel-perfect scenes live, mixing physical and augmented reality.
Game engines
The software at the core of modern, graphics-rich video games renders imagery on the fly to account for the unpredictable movements of a player. Adapted for film production, the technology consigns epic waits for epic render farms to history.
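To make the contrast with offline rendering concrete, here is a minimal sketch of the fixed frame-budget loop a real-time engine lives inside: whatever the camera or actor does, the engine has tens of milliseconds to deliver the next image, rather than minutes or hours per frame on a render farm. The render_frame placeholder and the 30fps target are hypothetical.

```python
import time

FRAME_BUDGET = 1 / 30.0  # 30 fps target: roughly 33 ms to produce each frame

def render_frame(camera_pose, dt):
    # Placeholder for the engine's real work: culling, shading, compositing.
    # In an offline pipeline this single step could take minutes per frame;
    # a real-time engine must finish inside the frame budget.
    pass

def main_loop(get_camera_pose, run_seconds=5.0):
    last = time.perf_counter()
    end = last + run_seconds
    while time.perf_counter() < end:
        now = time.perf_counter()
        dt = now - last            # time since the previous frame (unpredictable)
        last = now
        render_frame(get_camera_pose(), dt)
        # Sleep off whatever is left of the budget so the loop doesn't spin.
        leftover = FRAME_BUDGET - (time.perf_counter() - now)
        if leftover > 0:
            time.sleep(leftover)

if __name__ == "__main__":
    main_loop(lambda: (0.0, 0.0, 0.0))  # dummy static camera pose
```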
The best known is Epic’s Unreal Engine, which just hit version 5 with enhancements intended to achieve photorealism “on par with movie CG and real life”. Its virtualized micropolygon geometry system, Nanite, for example, frees artists to create as much geometric detail as the eye can see. Film-quality source art comprising hundreds of millions or billions of polygons, anything from ZBrush sculpts to photogrammetry scans to CAD data, can be imported directly into the engine and it just works. Nanite geometry is streamed and scaled in real time, so there are no more polygon count budgets, polygon memory budgets or draw count budgets; there is no need to bake details to normal maps or manually author LODs; and there is no loss in quality.
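Epic has not published Nanite’s internals in full, but the general principle, spending geometric detail only where the camera can actually resolve it, can be sketched in a few lines. Everything below (the resolution, field of view and one-triangle-per-pixel heuristic) is illustrative, not Nanite’s actual algorithm; the point is that detail becomes a per-frame function of screen coverage rather than a hand-authored LOD budget.

```python
import math

def projected_size_px(object_radius_m, distance_m, fov_deg, image_width_px):
    """Approximate screen-space diameter of an object, in pixels."""
    angular_size = 2 * math.atan(object_radius_m / max(distance_m, 1e-6))
    fov = math.radians(fov_deg)
    return image_width_px * angular_size / fov

def pick_triangle_budget(object_radius_m, distance_m,
                         fov_deg=40.0, image_width_px=3840):
    """Roughly one triangle per covered pixel (an illustrative heuristic only)."""
    size_px = projected_size_px(object_radius_m, distance_m, fov_deg, image_width_px)
    return int(size_px ** 2)  # covered area in pixels ~ detail worth resolving

# A 2 m prop fills far fewer pixels at 50 m than at 2 m, so it needs far less detail.
for d in (2, 10, 50):
    print(d, "m ->", pick_triangle_budget(1.0, d), "triangles worth of detail")
```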
Epic also aims to put the technology within practical reach of development teams of all sizes by partnering with developers to offer productive tools and content libraries.
It’s not the only game in town. Notch has a new real-time chroma keyer which, when combined with its automated Clean Plate Generation, produces “fantastic” results with almost no setup or tweaking, while providing all the features you’d expect, such as hair and liquid handling and hold-out mattes, all within less than a millisecond.
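For readers wondering what a keyer actually computes, here is a deliberately naive distance-to-green matte in NumPy. The key colour and thresholds are arbitrary examples; a production keyer such as Notch’s adds clean-plate subtraction, spill suppression and fine hair and liquid handling on top of this basic idea.

```python
import numpy as np

def naive_chroma_key(image, key_rgb=(0, 177, 64), tolerance=80.0, softness=40.0):
    """Return an alpha matte: 0 where a pixel matches the key colour, 1 where it doesn't.

    image: array of shape (H, W, 3). Key colour, tolerance and softness are
    arbitrary illustrative values, not anyone's production settings.
    """
    img = image.astype(np.float32)
    key = np.array(key_rgb, dtype=np.float32)
    dist = np.linalg.norm(img - key, axis=-1)          # distance to the key colour
    alpha = (dist - tolerance) / max(softness, 1e-6)   # soft threshold
    return np.clip(alpha, 0.0, 1.0)

def composite(fg, bg, alpha):
    """Matte the foreground over a background of the same size."""
    a = alpha[..., None]
    return (a * fg + (1 - a) * bg).astype(np.uint8)
```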
ILM, which uses a variety of engines, also has its own proprietary real-time engine, Helios, based on technology developed at Pixar.
The Jungle Book, Ready Player One and Blade Runner 2049 all made use of Unity Technologies’ Unity engine at various stages of production thanks to custom tools developed by Digital Monarch Media.
For example, on Blade Runner 2049, director Denis Villeneuve was able to re-envision shots for some of the digital scenes well after much of the editing was complete, creating the desired mood and tempo for the film using DMM’s virtual tools.
Game engines rely on GPU processing power from the likes of Intel, Nvidia and AMD, which has become exponentially faster and now enables real-time compositing.


Digital backlots

The use of video walls in film and TV goes back years, notably as a light source projecting onto Sandra Bullock and George Clooney in Gravity. More advanced versions playing pre-rendered sequences were deployed by ILM on Rogue One: A Star Wars Story and its follow-up Solo, and during a sequence set on a Gotham metro train in Joker. A system is also being used on the latest Bond film, No Time To Die.

The most sophisticated set-ups combine LED walls (and ceilings) with camera tracking systems and game engines to render content for playback not only in real time but in dynamic synchronicity with the camera’s viewpoint. The result allows filmmakers to stage scenes with greater realism than against a green or blue screen, and with a far better chance of making final decisions on set.
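No engine or wall vendor’s internal code is shown here; the sketch below is the standard off-axis (“generalized perspective”) projection calculation that this kind of camera-tracked setup relies on: given the tracked camera position and the physical corners of the wall, compute the asymmetric frustum through which the background should be rendered. The wall corner and camera coordinates are hypothetical.

```python
import numpy as np

def off_axis_frustum(pa, pb, pc, eye, near=0.1):
    """Near-plane extents (l, r, b, t) of a frustum from `eye` through a flat screen.

    pa, pb, pc: screen corners (lower-left, lower-right, upper-left) in metres,
    in the same tracking space as the camera. Standard off-axis projection maths.
    """
    pa, pb, pc, eye = map(np.asarray, (pa, pb, pc, eye))
    vr = pb - pa; vr = vr / np.linalg.norm(vr)            # screen right axis
    vu = pc - pa; vu = vu / np.linalg.norm(vu)            # screen up axis
    vn = np.cross(vr, vu); vn = vn / np.linalg.norm(vn)   # screen normal (towards eye)
    va, vb, vc = pa - eye, pb - eye, pc - eye
    d = -np.dot(va, vn)                                   # eye-to-screen distance
    l = np.dot(vr, va) * near / d
    r = np.dot(vr, vb) * near / d
    b = np.dot(vu, va) * near / d
    t = np.dot(vu, vc) * near / d
    return l, r, b, t

# Hypothetical 6 m x 3 m wall with the tracked camera 4 m back and 1 m off-centre:
print(off_axis_frustum(pa=(-3, 0, 0), pb=(3, 0, 0), pc=(-3, 3, 0), eye=(1, 1.5, 4)))
```

Recomputing this frustum every frame from the tracked camera pose is what keeps the perspective on the wall locked to the lens rather than to a fixed playback viewpoint.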

“The big change has come with more powerful GPUs combined with games engines providing the software for real-time rendering and ray tracing,” says Sam Nicholson who heads Stargate Studios. “When you put that together with LED walls or giant monitors we think that at least 50 per cent of what we do on set can be finished pixels.”

For HBO comedy-thriller Run, the production built two cars outfitted to resemble an Amtrak carriage on a soundstage in Toronto. These rested on airbags which could be shaken to simulate movement. Instead of LEDs, a series of 81-inch 4K TV monitors were mounted on a truss outside each train window, displaying footage preshot by Stargate from cameras fixed to a train travelling across the U.S.

“It’s a smaller scale and less expensive version of Lucasfilm’s production of The Mandalorian but the principle is the same,” explains cinematographer Matthew Clark. “It effectively brings the location to the production rather than moving an entire production to often hard-to-access locations.”
Any light that played on the actors’ faces or on surfaces in the train had to be synchronized with the illumination outside the windows, otherwise the effect wouldn’t work.

“It was important to line up the picture so when you’re standing in the car your perspective of the lines of train track and power lines has to be realistic and continuous. If the angle of the TV screen is off by just a few degrees then suddenly the wires of a telegraph pole would be askew. When we needed to turn the car around to shoot from another angle the grips could flip all the monitors around to the exact angle.”
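A quick geometry check (all figures hypothetical) shows why a few degrees matters: rotate a window monitor about its vertical axis and the apparent bearing of its edges, and of every wire and rail drawn on it, shifts noticeably from the actor’s or camera’s position, breaking the continuity Clark describes.

```python
import math

def edge_bearing_deg(half_width_m, depth_m, yaw_deg):
    """Apparent bearing of a monitor's left and right edges seen from the camera.

    The monitor is centred `depth_m` directly in front of the camera; rotating it
    by `yaw_deg` about its vertical centre axis moves each edge in depth and
    sideways, shifting where its image appears to sit. Figures are hypothetical.
    """
    yaw = math.radians(yaw_deg)
    bearings = []
    for sign in (-1, 1):  # left edge, right edge
        x = sign * half_width_m * math.cos(yaw)
        z = depth_m - sign * half_width_m * math.sin(yaw)
        bearings.append(math.degrees(math.atan2(x, z)))
    return bearings

# An 81-inch 16:9 monitor is roughly 1.8 m wide; say it sits 1.5 m outside the window.
print(edge_bearing_deg(0.9, 1.5, 0))   # correctly squared to the window
print(edge_bearing_deg(0.9, 1.5, 5))   # 5 degrees off: each edge shifts by over a degree
```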

LED displays are measured by pixel pitch (the distance in millimeters from the center of one pixel to the center of the adjacent pixel), and the latest panels’ pitches are narrow enough for the images to be photographed directly. These panels are also capable of greater brightness, higher contrast ratios and 10-bit video.

Rental companies offering LED screens or monitors include PRG and Stargate Studios in the U.S., and disguise and On Set Facilities in the UK, the latter two of which also have operations in LA.
OSF advises that the bigger the pixel, the more light it outputs onto your subject, which means very fine pixel pitches may not be optimal for filming. The pixel pitch of the LED screens used on The Mandalorian was 2.8mm.
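Pitch translates directly into how much image a wall of a given physical size can carry. The wall dimensions in the sketch below are hypothetical; only the 2.8mm pitch comes from the figure above.

```python
def wall_resolution(width_m, height_m, pitch_mm):
    """Pixel resolution of an LED wall of a given physical size at a given pixel pitch."""
    px_w = int(width_m * 1000 / pitch_mm)
    px_h = int(height_m * 1000 / pitch_mm)
    return px_w, px_h

# A hypothetical 6 m x 3 m section of wall at the 2.8 mm pitch quoted above:
print(wall_resolution(6, 3, 2.8))   # -> (2142, 1071), roughly a 2K image across
```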
OSF is set up as a fully managed virtual production studio covering in-camera VFX (LED), mixed reality (green screen), and fully virtual (in-engine) production.
It has a partnership with ARRI and also operates its own virtual private network, StormCloud, connected to the Azure cloud for virtual production. StormCloud enables remote multi-user collaboration in Unreal Engine, powered by Nvidia Quadro technology. Entry points currently set up in London and San Francisco are being tested by “a number of Hollywood Studios and VFX facilities,” says the facility.

Camera tracking

Another essential component is the ability to have the virtual backlot tracked to the camera movement by a wireless sensor. This means that as the DP or director frames a shot, the display, which is often the main lighting source, adjusts to the camera’s perspective. That’s no mean feat and requires minimal to zero latency in order to work.
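To give a sense of why latency matters, here is a toy sketch of one mitigation a tracking pipeline can apply: extrapolate the last measured camera position forward by the measured system delay so the rendered background does not lag the physical move. Commercial systems do considerably more (rotation prediction, filtering, genlock); this only shows the idea, with hypothetical numbers.

```python
def predict_pose(position, velocity, latency_s):
    """Extrapolate a tracked camera position forward by the pipeline latency.

    position, velocity: (x, y, z) in metres and metres/second from the tracker.
    latency_s: measured end-to-end delay of tracker + engine + display processing.
    Purely illustrative; real trackers also predict rotation and filter noise.
    """
    return tuple(p + v * latency_s for p, v in zip(position, velocity))

# A camera tracking sideways at 0.5 m/s with 50 ms of total latency would otherwise
# render the background 2.5 cm behind where the lens actually is.
print(predict_pose((1.0, 1.5, 4.0), (0.5, 0.0, 0.0), 0.050))
```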
Professional camera tracking systems from Mo-Sys and Ncam are the go-to technologies here, but if you are purely filming inside a game engine there are budget ways of creating a virtual camera.
To create raw-looking handheld action in his short film Battlesuit, filmmaker Haz Dulull used DragonFly, a virtual camera plugin (available for Unity, UE and Autodesk Maya) built by Glassbox Technologies with input from Hollywood pre-viz giants The Third Floor. 
Another option is the HTC VIVE tracker, which costs less than $150 and has been tested at OSF. “If you want to shoot fully virtual, shooting in engine cinematic is amazing with a VIVE as your camera input,” it sums up. “If you want to do any serious mixed reality virtual production work or real-time VFX previz, you are still going to need to open your pocket and find a professional budget to get the right equipment for the job.”
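For the curious, driving a virtual camera from a tracker mostly comes down to converting the pose the tracking runtime reports into a position and orientation the engine understands. The sketch below assumes an OpenVR-style 3x4 row-major pose matrix (with the -Z-forward convention); how you actually read that matrix (for example via the openvr Python bindings) is left out, and the sample values are hypothetical.

```python
import math

def pose_matrix_to_camera(m):
    """Convert a 3x4 row-major tracker pose matrix into a camera position plus yaw/pitch.

    m: [[r00, r01, r02, tx], [r10, r11, r12, ty], [r20, r21, r22, tz]]
    Assumes OpenVR-style axes: +Y up, -Z forward. Obtaining `m` from the tracker
    runtime is outside this sketch.
    """
    position = (m[0][3], m[1][3], m[2][3])
    # The third rotation column is the device's local +Z axis in world space,
    # so the device's forward direction is its negation.
    forward = (-m[0][2], -m[1][2], -m[2][2])
    yaw = math.degrees(math.atan2(forward[0], -forward[2]))                 # heading
    pitch = math.degrees(math.asin(max(-1.0, min(1.0, forward[1]))))        # tilt
    return position, yaw, pitch

# Identity rotation, tracker mounted 1.6 m up and 2 m back from the stage origin:
print(pose_matrix_to_camera([[1, 0, 0, 0.0],
                             [0, 1, 0, 1.6],
                             [0, 0, 1, 2.0]]))
```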

Plug-in assets
The Rokoko mocap suit can stream directly into UE via a live link, as demoed by OSF. The facility explains that the suit connects over the wireless network to Rokoko Studio, where OSF assigns the suit a personal profile for the performer, and from there into the UE render engine. It then begins streaming the data into UE by selecting the Unreal Engine option in the Rokoko Studio Live tab (a feature only available to Rokoko Pro licence users). The system is being refined at OSF, with tests for facial capture in the works.
There is video of the demo here: https://youtu.be/5N4OcNJUw9o
Reallusion makes software for 3D character creation and animation, including iClone and Character Creator 3D. The Unreal Live Link plug-in for iClone creates a pipeline for characters, lights, cameras and animation into UE. The simplicity of iClone combined with UE rendering delivers a digital human solution for creating, animating and visualizing real-time characters.
Character Creator includes a plugin called Headshot which generates 3D real-time digital humans from a single photo. Beyond intelligent texture blending and head mesh creation, the generated digital doubles are fully rigged for voice lipsync, facial expression and full body animation. Headshot has two AI modes: Pro and Auto. Pro Mode includes 1,000+ sculpting morphs plus image mapping and texture reprojection tools, and is designed for production-level hi-res texture processing and ultimate face shape refinement. Auto Mode makes lower-res virtual heads, with additional 3D hair, in a fully automatic process.


OSF put this through its paces, using Headshot to automatically create a facial model which was animated within iClone 7 using data from actors performing in Rokoko mocap suits, streamed live to iClone to allow real-time previews and the ability to record animations. OSF also used Apple’s LiveFace app (available for download on any iPhone with a depth-sensor camera) and its own motion capture helmets (https://onsetfacilities.com/product/face-capture-helmet/) to capture the facial animations. The next part of the pipeline is to transfer the assets over to UE with the Unreal Engine LiveLink plugin and the Auto Character Setup plugin, which creates skin textures in the same way as Epic Games’ digital humans.
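As a rough illustration of what the facial side of such a pipeline passes around: ARKit-based apps like LiveFace stream a set of named blendshape weights (0-1 values such as jawOpen or eyeBlinkLeft) every frame, which are then mapped onto matching morph targets on the character rig. The transport, the rig-side names and the gain parameter below are hypothetical; only the ARKit-style blendshape names are real.

```python
# Map one frame of ARKit-style blendshape weights onto a character rig's morph targets.
# ARKit defines around 52 such coefficients (jawOpen, eyeBlinkLeft, mouthSmileLeft, ...),
# each in the range 0.0-1.0. The rig-side names here are invented examples.

ARKIT_TO_RIG = {
    "jawOpen": "Jaw_Open",
    "eyeBlinkLeft": "Eye_Blink_L",
    "eyeBlinkRight": "Eye_Blink_R",
    "mouthSmileLeft": "Mouth_Smile_L",
    "mouthSmileRight": "Mouth_Smile_R",
}

def apply_face_frame(frame_weights, rig, gain=1.0):
    """Copy streamed blendshape weights onto the rig, clamped to the 0-1 range."""
    for arkit_name, rig_name in ARKIT_TO_RIG.items():
        weight = frame_weights.get(arkit_name, 0.0)
        rig[rig_name] = min(1.0, max(0.0, weight * gain))
    return rig

frame = {"jawOpen": 0.42, "eyeBlinkLeft": 0.95, "mouthSmileLeft": 0.30}
print(apply_face_frame(frame, rig={}))
```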

Virtual production on a budget
British filmmaker Hasraf Dulull made the animated sci-fi short Battlesuit using Unreal Engine, on a skeleton budget with a team of just three, including himself.
Rather than creating everything from scratch, they licensed 3D kits and pre-existing models (from Kitbash3D, Turbosquid and Unreal). Dulull animated the assets and VFX in real time within Unreal’s Sequencer tool.
They retargeted off-the-peg mocap data (from Frame Ion Animation, Filmstorm and Mocap Online) onto the bodies of the film’s main characters. For facial capture they filmed their actor using the depth camera inside an iPad and fed the data live into UE.
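Retargeting at its simplest means copying each source joint’s motion onto the equivalently mapped joint of the destination skeleton; the tools inside UE and Maya handle the hard parts (rest-pose and proportion differences), but the core mapping pass looks something like the sketch below. All joint names and values are hypothetical.

```python
# Minimal name-mapping pass of a retarget: copy per-joint rotations from a mocap clip
# onto a character skeleton that uses different joint names. Real retargeters also
# compensate for rest-pose and bone-length differences; this only shows the mapping.

MOCAP_TO_CHARACTER = {            # hypothetical joint names on both sides
    "Hips": "pelvis",
    "Spine": "spine_01",
    "LeftUpLeg": "thigh_l",
    "RightUpLeg": "thigh_r",
    "LeftArm": "upperarm_l",
    "RightArm": "upperarm_r",
}

def retarget_frame(mocap_frame):
    """mocap_frame: {joint_name: (x, y, z) Euler rotation in degrees} for one frame."""
    return {
        MOCAP_TO_CHARACTER[j]: rot
        for j, rot in mocap_frame.items()
        if j in MOCAP_TO_CHARACTER
    }

clip = [{"Hips": (0, 5, 0), "LeftArm": (30, 0, 10)},   # frame 1
        {"Hips": (0, 6, 0), "LeftArm": (32, 0, 12)}]   # frame 2
print([retarget_frame(f) for f in clip])
```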
“We had to do some tweaks on the facial capture data to bring some of the subtle nuance it was missing, but this is a much quicker way to create an animated face performance without spending a fortune on high end systems,” Dulull says.
Powering it all, including real-time ray tracing, Dulull used a Razer Blade 15 Studio Edition laptop with an Nvidia Quadro RTX 5000 card.
Every single shot in the film is straight out of Unreal Engine. There’s no compositing or external post apart from a few text overlays and color correction done on Resolve.
“If someone had said I could pull off a project like this a few years ago that is of cinematic quality but all done in realtime and powered on a laptop I’d think they were crazy and over ambitious,” he says. “But today I can make an animated film in a mobile production environment without the need for huge desktop machines and expensive rendering.”

