Tuesday 10 September 2024

Dry for wet

Definition

Dry for wet: shooting scenes that will appear to be set in or under water on a dry stage, creating the water through lighting, atmosphere and visual effects rather than on location.


ICVFX developments have made shooting in and around water a more fluid process, with real-time effects now standard practice. Find out how studios are maintaining naturalism

Water simulation in CGI has advanced massively over the last 30 years, from James Cameron’s Titanic through to the amazing VFX in Avatar: The Way of Water. The technology has matured and rendering power is at the point where many effects can be run in real time.

“The number one benefit of shooting on LED over green screen in a water-based environment is the reflections and refractions,” says Dan Hall, head of ICVFX at VP studio 80six. “We all know that no VP shoot will look correct without a good relationship between the VAD (virtual art department) and AD. When you have water in the foreground, you get an accurate representation of how it reacts to the light that’s emitted from the background. This helps sell the realism of what you are shooting.”

Hall supervised a high-speed test that involved pouring drinks against an exotic beach background. “We were shooting at over 200fps when I noticed that the virtual sea elements were essentially stationary throughout the pouring. To add more movement to the sea, we did something that wouldn’t have been possible on location – we increased the speed of the waves.

“To the eye, the waves appeared to be moving quickly, but on reviewing the shot they looked like they were rolling up the beach only slightly slower than normal speed while the drinks poured very slowly.”
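The arithmetic behind that trick is worth spelling out. Below is a minimal sketch in Python; the 25fps playback rate and the target apparent speed are illustrative assumptions, not figures from the shoot. Footage captured at 200fps and conformed to normal playback appears eight times slower, so the virtual sea needs a corresponding boost to read as anything close to real time.

```python
def slow_motion_factor(capture_fps: float, playback_fps: float) -> float:
    """How many times slower on-screen motion appears when high-speed
    footage is conformed to a normal playback rate."""
    return capture_fps / playback_fps

def wave_speed_multiplier(capture_fps: float, playback_fps: float,
                          desired_apparent_speed: float = 1.0) -> float:
    """Multiplier for the virtual ocean's wave speed so the sea reads at
    `desired_apparent_speed` (1.0 = real time) in the slowed-down shot."""
    return desired_apparent_speed * slow_motion_factor(capture_fps, playback_fps)

# Shooting at 200fps for 25fps delivery: motion appears 8x slower.
print(slow_motion_factor(200, 25))          # 8.0
# Waves would need an 8x boost to look real-time in the slow motion;
# a smaller multiplier leaves them rolling slightly slower than normal,
# as described in the shot review above.
print(wave_speed_multiplier(200, 25, 0.5))  # 4.0
```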

The way lighting from an LED volume penetrates practical fog on-set lends itself to underwater work. Craig Stiff, lead real-time artist and VAD operator at Dimension, explains: “Green screens cause green spill and would penetrate the fog, so practical fog would likely not be used and instead would be added in post-production. With LED volumes, the light is coming from visible structures which don’t have to be too out of focus. Reflections and out-of-focus edges are also accurate, meaning no painting out or rotoscoping/keying.”  

Having said that, it’s important to consider how the effect will be layered or composited as part of the final image. For example, in survival thriller No Way Up, directed by Claudio Fäh, Dimension and DNEG 360 used SFX elements like fog and haze to give the impression of being deep under the ocean. This, combined with a virtual moon casting light through the fog, sold the idea of it being underwater.

“On-set, we used two haze machines to add atmosphere to the practical set,” Stiff adds. “When used correctly, haze is a great tool for blending the LED wall with the practical set. We then composited VFX elements like ripples, bubbles and fluid simulations in post-production to blend the shots.” 

Plan the dive, dive the plan      

Like anything in virtual production, there are caveats to how you do things. When working with water, you need to take time to ensure everything has been set up correctly. For example, according to Hall, moiré patterns may become an issue, especially if you want to focus on a highly reflective object like water inside the volume.

“Any wire work (such as for swimming) has to be well planned and executed,” advises Stiff. “Hair and floating materials need to be considered because they won’t behave like they are underwater. Therefore, tight clothing, tied-back hair or head coverings are the best option, unless you can account for them in post-production.”

Practical effects

Augmenting video backgrounds with practical water effects is not only possible but encouraged: it produces a more realistic final image, as if you were shooting on location.

For example, says Hall, when working with rain or particle effects, you want to ensure that the physical properties of the virtual rain – including spawn frequency, droplet size, wind direction and material – match those of the physical rain. 

“If there’s rain falling on a subject, it should also be in the virtual environment. Though it’s not always necessary; if you have a shallow depth-of-field in a large volume with physical rain, it may not be needed. This can save on GPU resources.

“The VP supervisor will also be able to advise you on the capability of practical water effects on the volume: LED panels have varying tolerances to humidity.”
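As a minimal sketch of what matching those properties might look like, the snippet below compares a physical rain rig against its virtual counterpart. The parameter names, units and tolerance are illustrative assumptions, not any engine's or plug-in's actual settings.

```python
from dataclasses import dataclass, asdict

@dataclass
class RainSettings:
    """Properties worth matching between physical SFX rain and virtual rain."""
    spawn_rate_per_m2: float   # drops spawned per square metre per second
    droplet_size_mm: float
    wind_direction_deg: float  # bearing the rain is driven from
    wind_speed_ms: float
    fall_speed_ms: float

def mismatches(physical: RainSettings, virtual: RainSettings,
               tolerance: float = 0.15) -> dict:
    """Return the parameters where the virtual rain drifts more than
    `tolerance` (relative) from the on-set rain rig."""
    out = {}
    for key, phys in asdict(physical).items():
        virt = asdict(virtual)[key]
        if phys and abs(virt - phys) / abs(phys) > tolerance:
            out[key] = (phys, virt)
    return out

on_set = RainSettings(120.0, 2.0, 220.0, 4.0, 8.0)
in_engine = RainSettings(90.0, 2.0, 220.0, 4.0, 8.0)
print(mismatches(on_set, in_engine))   # {'spawn_rate_per_m2': (120.0, 90.0)}
```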

However, as we all know, water and electronics do not mix well, so the environment has to be carefully controlled.

“If shooting using a water tank and the LED volume as a backdrop, whether that is for underwater or above, the main thing to worry about is safety,” says Stiff. “These are large electronic devices next to a pool of water with actors. Keeping a decent distance between the water tank and the volume is wise, but this creates a gap between the practical and digital water. To allow the blend between real and digital, camera angles should be kept low to the water surface or high to obscure the edge of the water tank.”
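The low-angle half of that advice comes down to simple geometry: the sight line that grazes the far lip of the tank has to land on the LED wall rather than on the floor behind the tank. The sketch below checks exactly that; the dimensions are made-up examples, it ignores lens choice, and it treats the tank lip as the only occluder.

```python
def gap_is_hidden(cam_height_m: float, lip_height_m: float,
                  dist_to_lip_m: float, dist_to_wall_m: float) -> bool:
    """True if the sight line grazing the far tank lip meets the LED wall
    at or above the studio floor, keeping the dry gap behind the tank out
    of frame. Heights are above the studio floor; distances are horizontal
    from the camera."""
    drop_per_metre = (cam_height_m - lip_height_m) / dist_to_lip_m
    height_at_wall = lip_height_m - drop_per_metre * (dist_to_wall_m - dist_to_lip_m)
    return height_at_wall >= 0.0

# Camera 1.2m up, tank lip 1.0m up, lip 6m away, LED wall 9m away:
# the grazing sight line hits the wall 0.9m above the floor, so the gap is hidden.
print(gap_is_hidden(1.2, 1.0, 6.0, 9.0))   # True
# A mid-height camera position reveals the floor unless the shot goes high
# enough to look down into the tank and exclude its edge entirely.
print(gap_is_hidden(3.5, 1.0, 6.0, 9.0))   # False
```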

Tim Doubleday, head of on-set VP at Dimension and DNEG 360, says the main challenge is limiting practical water to a single area. “When done right, the two elements work brilliantly together since you get all the natural reflections and refractions from the LED wall in the practical water.”

Catch the next wave

Water simulations have advanced to the point where they can now run in real time, including waves, crashing surf and complex water movements.

“The Fluid Flux plug-in for Unreal Engine seems to be widely used in the community and has produced great results,” says Hall. “I also know there has been impressive use of ICVFX for underwater scenes, so I’m interested to see how that progresses.

“As hardware improves, we can do more in real time, which will only lead to more accurate and realistic virtual water elements,” he adds.
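Height-field techniques are a large part of why this is now feasible in real time: instead of a full volumetric fluid solve, the water surface is treated as a 2D grid of heights that can be updated cheaply every frame. The toy NumPy solver below illustrates the general idea only; it is not how Fluid Flux or any particular plug-in actually works.

```python
import numpy as np

def step_heightfield(height: np.ndarray, velocity: np.ndarray,
                     propagation: float = 0.5, damping: float = 0.99) -> None:
    """One step of a toy height-field ripple solver: each cell accelerates
    towards the average of its four neighbours, which propagates waves
    outwards. Cheap enough to run on large grids every frame."""
    neighbour_avg = 0.25 * (np.roll(height, 1, axis=0) + np.roll(height, -1, axis=0) +
                            np.roll(height, 1, axis=1) + np.roll(height, -1, axis=1))
    velocity += (neighbour_avg - height) * propagation
    velocity *= damping            # energy loss so ripples settle
    height += velocity

# A 128x128 patch of calm water with a single disturbance in the middle.
h = np.zeros((128, 128))
v = np.zeros_like(h)
h[64, 64] = 1.0
for _ in range(240):               # roughly ten seconds of ripples at 24 steps/sec
    step_heightfield(h, v)
```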

The way light interacts with water can also be rendered with a high degree of realism. Doubleday thinks we will see further advancements in how objects interact with water, such as a boat carving through water leaving a trail in its wake, or how a heavy object disperses water when dropped from a height.

“These situations can be simulated using complex offline processes, but I can’t wait to see them running in real time on a giant LED wall!”

Deeper dive: Shark Attack 360, Season 2

Diving into the factual landscape of shark behaviour, the second instalment of the Disney+ show When Sharks Attack 360 investigates why sharks bite humans. As the evidence mounts, an international team of experts analyses data in a VFX shark lab to understand attacks in forensic detail.

For the docuseries, animation studio Little Shadow developed a hybrid VP workflow. Instead of a traditional LED volume, it combined custom-built systems with off-the-shelf tools to facilitate live green-screen keying, camera tracking and live monitoring.

“We blended live-action footage with CGI, allowing us to transform an underground theatre in Islington into a virtual 360 shark lab,” explains Simon Percy, director at Little Shadow.

“Initially, we used a LiDAR scan to create a 3D model of the venue, which we then employed to plan and scale the project. Due to the venue’s layout and lack of soundproofing, we ran a 4K signal across 110m and four floors using BNC cable, which allowed us to keep most of the equipment separate from the set.”

The creation and integration of CGI assets, such as the shark models and virtual marine environments, were key to building the immersive underwater settings. These were played back on-set using the VP box, providing immediate visual feedback.

Percy continues: “We built the flight case around a custom PC for Unreal Engine, a pair of Blackmagic HyperDeck Studio recorders, the Ultimatte 12 4K for keying and a Videohub 20×20 for signal management. We also frame synced our cameras to Unreal using the DeckLink 4K Pro. This approach proved both mobile and flexible, ensuring quick playback with real-time asset generation and comping adjustments on shoot days.”

A private Wi-Fi network connected the flight case to an on-set laptop, allowing the team to control it remotely, including live switching via an ATEM 1 M/E Constellation 4K.

“To bring the underwater scenes to life, we used a green screen and the Ultimatte, which allowed us to integrate the virtual 3D elements into the scenes using AR. This enabled the presenter to have a precise real-time interaction with the sharks. In combination with DaVinci Resolve’s AI functions such as Magic Mask for rotoscoping, we were able to blur the lines of where real and virtual production meet,” Percy adds.

Looking ahead, Percy says, technological advancements in water and fluid physics simulations are moving quickly. “With the advent of powerful RTX GPUs from NVIDIA and tools like JangaFX’s Elemental suite, we can now simulate water dynamics in closer to real time – a process that would have previously taken days to complete. Blender’s capabilities for large ocean simulations, augmented by plug-ins like Physical Open Waters, hint at the possibilities for increasingly realistic and cost-effective water effects in the future of television production.”
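For a taste of the Blender side of that, the sketch below drives the built-in Ocean modifier from Python (bpy). The wave, wind and foam values are arbitrary placeholders, and the plug-ins Percy mentions expose their own controls beyond what is shown here.

```python
import bpy

# Start from a simple plane; GENERATE mode replaces its geometry with the ocean grid.
bpy.ops.mesh.primitive_plane_add(size=100.0)
surface = bpy.context.active_object

ocean = surface.modifiers.new(name="Ocean", type='OCEAN')
ocean.geometry_mode = 'GENERATE'
ocean.resolution = 16              # preview resolution; raise it for final renders
ocean.spatial_size = 50            # metres covered by one tile
ocean.wave_scale = 1.5
ocean.choppiness = 1.2
ocean.wind_velocity = 12.0         # m/s, drives wave size
ocean.use_foam = True
ocean.foam_coverage = 0.1

# Keyframe the time value so the swell animates across a 10-second, 25fps shot.
for frame in (1, 250):
    ocean.time = frame / 25.0
    ocean.keyframe_insert(data_path="time", frame=frame)
```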
