Tuesday, 3 September 2019

Blade Runner style holographic displays just got a step closer

RedShark News
The day when you can no longer distinguish real things from synthetic objects has come a step closer, if you believe the claims coming out of Light Field Lab and, more pertinently, its latest investors.
The company, which emerged from the ashes of camera maker Lytro, is pinning its hopes on holographic technology as the future of display, and its prototypes have proved convincing enough for multiple investors to weigh in with $28 million of funding.
The Californian outfit will use the funds to scale its display technology from prototype to product, enabling holographic objects to float in space without head-mounted accessories. The ultimate aim is a holographic TV and holographic cinematic experiences — the holodeck — where redirected walking techniques and haptic effects might be used to simulate touch.
The company has previously talked of augmenting the holographic image with more senses, such as smell.
Last year it demoed a working prototype of a small display measuring 4 by 6 inches that’s capable of projecting 3D holograms. The aim is to build bigger, modular versions which can be fitted together to create large-format location-based entertainment venues.
The plan is to stack the panels together and stitch the images with hundreds of gigapixels of resolution. These 100+ foot wide screens could be placed on floors or ceilings and would project holographic images into a 3D space.
Later versions will be developed for the consumer market, the company confirmed.
This consumer roadmap is reflected in the latest round of investors, which includes Samsung Ventures, Verizon Ventures, Comcast, NTT Docomo Ventures, HELLA Ventures and Liberty Global Ventures, and follows the $7 million secured in January last year.
Verizon, for example, seems excited about pairing the displays with its 5G network to deliver high-bandwidth media.
Another investor, Taiwania Capital, says holographic display technologies “are a new frontier in the display space. The team has created something that was once considered science fiction, but now exists in reality. This enables new business opportunities and unlocks digital environments that no one has seen before.”
Liberty Global reckons Light Field Lab’s displays are the most exciting new technology it has seen in the entertainment space to date. It wants to hook the tech up with the industry’s top content creators “to accelerate holographic media distribution on next-generation networks.”
Light Field Lab is already working with rendering company OTOY to build out a content development pipeline for the technology.
Khosla Ventures, another investor, goes so far as to suggest that the Lab’s technology “is the first to shift the paradigm in the display space since the transition from black and white to colour.”
The company told Variety the most significant updates include improved brightness and detail, increased depth of the projected objects, lots of new video examples, as well as real-time and interactive holographic demonstrations.
Certainly, the ambitious roadmap for developing a commercial holographic display within the next couple of years seems on track.
The acid test will be whether the wait will be worth it.

8K TV: Why here, why now?

Cable Satellite International

Back in 2012, when the ITU-R enshrined UHD in two phases, 4K UHD was already seen as a stepping stone to 8K. UHD-2 was considered so far away that little other than resolution was considered in the specification. While the industry is still some way from deploying 4K, heads are turning towards what’s next. For some, this is an unwelcome distraction from the practicalities of the 4K transition, with a possible risk of consumer confusion; others view it as the natural progression of an industry which has technological advance written into its DNA.
“It’s not unusual for a new technology to emerge and move forward while an earlier one is still being rolled out,” asserts Peter Siebert, DVB Head of Technology. “In this case, it’s relevant to note that 8K TV sets are not necessarily threats to 4K production and delivery, as they could bring improvements in image quality for lower resolution content through upscaling.”
“There will always be technology Luddites,” says Ben Schwarz, speaking as an independent expert, founder of CTOi Consulting and communications chair of the Ultra HD Forum. “A devil’s advocate would say that 4K is a distraction from HD deployment.”
For William Cooper, founder of consultancy Informitv, 8K represents a natural evolution of video resolution. “HD is now mainstream, 4K is already a reality and 8K is now a possibility,” he says. “Although there may be diminishing returns with each increase in resolution, if the objective is a representation of the highest fidelity, then 8K or beyond may be technologically inevitable.”
He points out that there are many dimensions to improving the fidelity of video reproduction. “Spatial resolution is one, the precision of each pixel is another, and temporal sampling is a further dimension,” he says. “Traditional television technology is compromised in all these dimensions, with plenty of room for improvement.”
The entire industry is now working to deploy 4K with HDR and NGA (Next Gen Audio), the culmination of efforts over the last five years.
The Ultra HD Forum, at pains to put 8K on the back burner, recommends that the industry focus on added-value services such as HDR with dynamic mapping, NGA and HFR for sports.
“The industry has yet to explore the right combination of resolutions, taking into account HDR, HFR and NGA,” notes Thierry Fautier, VP of Video Strategy at Harmonic and president of the Ultra HD Forum.
Rian Bester, who runs 4K channel Insight TV, agrees: “If you show a consumer 4K versus HD the difference is not that apparent, but if you show them HDR versus non-HDR or HFR versus non-HFR - especially in fast moving content like sport - the difference is very apparent and there is no doubt as to the benefit. Those two aspects are far more valuable than going from 4K to 8K for current screen sizes.”
While no-one is suggesting 8K holds any benefit on mobile (and even telcos like BT Sport argue for HD HDR as the optimum for handsets), the claim that you need to sit closer to the home screen to perceive the benefit needs re-examining.
Screen sizes are getting bigger – by about an inch a year according to some reports. What’s more, what we think of as a TV set could be on the verge of a radical format overhaul, with MicroLED and rollable screens on the horizon.
“In the not too distant future we will have screens that are significantly bigger than currently and they will be multi-application devices like our phones,” Bester suggests. “For that reason, I don’t think people should get too hung up on the science of traditional viewing distance and screen size. This is completely changing.”
Cooper supports this, “The whole point of increasing resolution is that the pixel structure of the image should be imperceptible. It is a psychovisual effect that results in an image that appears to be more realistic.”
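Cooper’s point about imperceptible pixel structure is usually quantified with a rule of thumb: a pixel effectively disappears once it subtends less than about one arcminute of visual angle (roughly 20/20 acuity). The sketch below is an illustration of that rule under stated assumptions (16:9 panel, small-angle approximation), not a figure from the article:

```python
import math

def pixel_vanish_distance(diagonal_in, h_pixels, acuity_arcmin=1.0, aspect=16/9):
    """Distance in metres beyond which a single pixel subtends less than
    `acuity_arcmin` of visual angle, i.e. becomes imperceptible.

    Assumes a 16:9 panel and ~1 arcminute acuity for normal vision --
    conventional rules of thumb, not hard perceptual limits.
    """
    # Screen width in metres, derived from the diagonal and aspect ratio.
    width_m = diagonal_in * 0.0254 * aspect / math.hypot(aspect, 1)
    pixel_pitch = width_m / h_pixels            # metres per pixel
    theta = math.radians(acuity_arcmin / 60.0)  # acuity in radians
    # Small-angle approximation: pixel_pitch / distance = theta
    return pixel_pitch / theta

for label, h in [("4K", 3840), ("8K", 7680)]:
    d = pixel_vanish_distance(65, h)
    print(f"{label} on a 65-inch screen: pixels vanish beyond ~{d:.2f} m")
```

By this rule the 8K threshold is exactly half the 4K one: on a 65-inch panel a viewer would need to sit within roughly 0.65 m to resolve 8K pixel structure at all, which is why the seating-distance argument looms so large.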
Will the tail wag the dog? This is perhaps the most voluble charge against 8K promotion especially since TV makers don’t always get it right (see stereoscopic 3D for details).
It is no coincidence that brands including Hisense, Panasonic, Samsung and TCL are primary backers of the 8K Association, nor that Samsung, LG and Philips are partnering Japanese-owned Spanish streamer Rakuten TV on plans to stream 8K content later this year, nor that Samsung and Sony are sponsoring 8K productions by Insight TV for marketing purposes.
“Although models are already available as low as $5000 they need to be five times cheaper, and up-scalers don't yet make all 4K or HD content shine at 8K,” notes Schwarz.
HDMI 2.1 is provisioned to support 8Kp120 and we are still in the interoperability phase.
The potential for consumer confusion could be high if misleading messages are spread about near-term 8K content availability, which could destabilise the 4K market.
“While it’s folly to think that the industry can stymie the natural technological progression of display technology, the industry does owe a responsibility to correctly inform consumers of the availability of native 8K content and when it will reach a reasonable critical mass,” says Matthew Goldman, SVP Technology, MediaKind. He believes 8K will only be available for occasional special events like the Olympics or World Cup over the next five years. “We all need to find a compromise between one part of our industry pushing 8K to sell more ‘better’ consumer TVs to increase profit versus another part of our industry pushing back to prevent undermining the wider marketplace for 4K content creation and consumption.”
There is a certain inevitability in cost reduction from TV sets to distribution bandwidth. “We know 8K TV sets will be affordable at some point in time, and that’s when consumers will adopt them at mass scale,” says Fautier. “Therefore, the industry needs to make sure it can offer attractive services for broadcast, live and on-demand streaming, as well as immersive experiences.”
Bester says Insight TV’s 8K experiments are not necessarily to build up a library, “but because we want to understand where we need to adapt the workflow chain from production through to delivery.”
While there is equipment, from cameras to finishing systems and video switchers, capable of a full production chain, the workflow is embryonic. Data volumes alone present a challenge and a cost.
Nonetheless, the industry is steadily moving toward capturing video at higher resolutions to enable pan, scan and zooming. Production costs for HD and 4K can be reduced by capturing in 8K using one camera and extracting the region of interest via AI.
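At its core, that region-of-interest extraction is a crop calculation: pick a centre (tracked by hand or by AI), then keep a fixed-size window inside the 8K frame. The helper below is a hypothetical sketch of that arithmetic, not any vendor’s API:

```python
# Default source dimensions correspond to an 8K (7680x4320) master frame.
def roi_bounds(cx, cy, out_w, out_h, src_w=7680, src_h=4320):
    """Top-left corner of an out_w x out_h crop centred on (cx, cy),
    clamped so the window never leaves the source frame."""
    x = min(max(cx - out_w // 2, 0), src_w - out_w)
    y = min(max(cy - out_h // 2, 0), src_h - out_h)
    return x, y

# Follow a subject near the left edge with a 1080p output window:
print(roi_bounds(500, 2160, 1920, 1080))   # x clamps to 0
```

A real pipeline would also smooth the crop path between frames to avoid jitter, but the clamping above is what keeps a tracked subject framed without ever sampling outside the master.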
For immersive experiences such as VR, it’s necessary to capture at 8K resolution and deliver the field of view to either an HD or 4K display. The benefit of 8K here, according to Fautier, is that it provides “an exceptional QoE in contrast to conventional VR approaches where the full frame is sent and the player up-samples the field of view, leading to a poor experience.”
In the same manner, for personalised broadcast, content can be captured in 8K for end-users to navigate the content (at lower resolution) on mobile devices.
“8K offers a more personalised experience with a high QoE compared to other approaches where the zoom leads to fuzzy picture,” says Fautier. This was demonstrated at the French Open with Harmonic encoding and Tiledmedia packaging.
The applications for ultra-high quality source material are not limited by the production technology. The biggest bottleneck is distribution.
“Even if you look at a high resolution VR application, the problem is not displaying the picture it is getting the content to the consumer via download or streaming,” says Bester.
The only broadcast network capable of supporting 8K today is ARIB DTH (heavily subsidised by the Japanese government via state-run NHK). Neither ATSC nor DVB has made any provision to support 8K.
“In 2019, DTH is probably the only viable way to deliver 8K at scale [but] satellite distribution hasn't yet managed to ride the 4K wave successfully despite active promotion by the likes of SES or Eutelsat,” says Schwarz.
“As far as compression is concerned, the numbers circulating throughout the industry for bandwidth requirements vary. NHK’s commercial service uses 100Mbps, but recent trials with HEVC have shown live sports content at 85Mbps and VOD at 65Mbps.”
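Those delivery figures are striking against the raw signal. A back-of-envelope calculation (assuming typical 8K broadcast parameters of 60 fps, 10-bit samples and 4:2:0 chroma subsampling, which the article does not itself specify) shows the compression ratio a 100 Mbps service implies:

```python
def raw_bitrate_gbps(width, height, fps, bit_depth=10, chroma_factor=1.5):
    """Uncompressed bitrate in Gbit/s. chroma_factor 1.5 models 4:2:0
    subsampling (full-resolution luma plus two quarter-resolution chroma planes)."""
    return width * height * bit_depth * chroma_factor * fps / 1e9

raw = raw_bitrate_gbps(7680, 4320, 60)   # uncompressed 8Kp60
ratio = raw * 1000 / 100                 # versus NHK's 100 Mbit/s channel
print(f"Raw 8Kp60: {raw:.1f} Gbit/s -> roughly {ratio:.0f}:1 compression at 100 Mbit/s")
```

On these assumptions the encoder is squeezing roughly 30 Gbit/s into 100 Mbit/s, a ratio around 300:1, which is why codec efficiency dominates the 8K distribution debate.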
VVC promises to halve those requirements in a few years. The proposed standard winding its way through MPEG is, for Bester, “the silver bullet” required to drive things forward “because whether the content is 4K or 8K, it really addresses the bottlenecks like CDN costs and bandwidth.”
Proofs of concept, by BBC R&D among others, show VVC to be meaningfully more effective than HEVC, with the goal of halving bitrate. But we will have to wait until 2020, when the MPEG specification is finalised, and then 2022 to see it implemented in the first devices.
“We also need to resolve the licensing model of VVC, and the MCIF is working hard toward that,” says Fautier.
Harmonic’s take is that 8K will start with DTH, “but very quickly we will see IP delivery to connected TVs and mobile devices,” though probably limited to 4K.
The codec is an important element, but high-speed broadband networks – fibre, DOCSIS 3.1 and 5G – “are the best fit to carry 8K content, even using the HEVC codec,” says Fautier.
For live applications at scale, multicast will likely be needed, with 5G FeMBMS (Further evolved Multimedia Broadcast Multicast Service) an attractive solution (being trialled in Bavaria).
Globecast, which reports that more than 60 percent of its customers in Europe have yet to make the transition to HD, identifies OTT as the preferred method for content delivery in new formats.
“That was true for 4K and will likely stay true for 8K,” says Juliet Walker, CMO. “That’s because you don’t need to wait for an industry-approved new interoperable tech standard for the signal transmission chain. Innovation comes fast on the internet and device/display vendors are quick to adopt new technologies to sell ‘boxes’.”
As with 4K, Walker thinks sports will lead the way in 8K, even while 4K rollout remains sluggish. Expect to see the first wave of 8K content produced at the 2020 Tokyo Olympics, as well as VoD in 8K streamed to connected TVs in the same time frame.
That’s why, argue the 8K Association and the Ultra HD Forum, it’s important to agree on a standard for 8K that includes support for IP delivery (VoD and live) across all types of networks, 5G included, on all devices (TVs to smartphones). Immersive applications, being different from broadcast ones, will also require guidelines.
The DVB for its part has completed a report into media formats beyond UHD-1 4K. “These formats have the potential to be commercially viable in the coming years,” says Siebert. The report was submitted to DVB’s Steering Board in July as an input document for potential future specification work.
The prevailing view, voiced by Antonio Corrado, CEO at video delivery network MainStreaming, is that “broadcasters won’t be able to justify the cost for a small niche audience that will be able to experience streaming in 8K until its wider adoption by device makers and consumers.”
Thomas Wrede, VP, New Technology & Standards, SES Video says the key ingredient is an effective business model. “We need an equation that encourages subscribers to pay for the quality of the content itself, and not just the screen they unboxed.”
To lay the groundwork for 8K the industry needs an aligned end-to-end ecosystem. Even with two organisations (the UHD Alliance and Ultra HD Forum), guiding 4K deployments, “it was not an easy process,” Fautier admits. “However, member companies have learned to work together, even if they compete in the marketplace. The same collaboration needs to exist for 8K, and the 8K Association is at the forefront to drive those efforts.”
In summary, there is a ton of work required to make 8K a reality. This is a multi-year effort, at least five to ten years depending on the application. The fear is that too much focus is put on classical broadcast, while IP is the low-hanging fruit, at least to connected TVs and mobile devices, and the immersive experience is still the Wild West and will take time to mature, likely in the 2022-2024 period.
“The next generation of video entertainment is based on several pillars of which a higher resolution is but one aspect,” says Schwarz. “I expect a paradigm change on the whole concept of resolution. Specific content will be produced at given resolutions. 8K will remain the upper limit until some new disruptive technology makes something akin to vector-based video a practical reality.”
By the same token, certain territories lagging now but unencumbered by legacy infrastructure could see 8K leapfrog 4K in the same way that cellular did over DSL.
“This is why we shouldn’t get too hung up on the science,” Bester says. “Let us imagine you have an entire wall in your living room at 16K, if you go beyond that it will not make a difference. I think 16K is the top limit where the drive for higher resolutions will end.”

Monday, 2 September 2019

IBC - Automation and 8K take centre stage

Broadcast 

From cloud and IP, content and metadata, through to smart use of automation and AI, IBC2019 is all about the ‘supply chain’
If recent IBCs and NABs are a guide, then expect the trajectory of core post-production tools to move further into the cloud, enriched with time-saving automation. The most common AI applications are facial and object recognition, speech-to-text and enhanced metadata tagging.
Avid’s ‘reimagined’ Media Composer is its biggest redesign in 15 years. It supports Netflix’s mastering and delivery requirements, including ACES (the Academy Color Encoding System).
Tasks that previously took hours can now be done in minutes, claims Avid of a new distributed processing module. Avid NEXIS Cloudspaces effectively extends local offline storage into Microsoft Azure.
Adobe is speeding workflow using AI. In After Effects, a new content-aware option automates the removal of objects like boom mics, signs and even people from footage, filling in the pixels with neighbouring pixel data to complete the scene.
Blackmagic Design’s latest DaVinci Resolve features a cut page tool to speed editing of fast turnaround projects. An upper timeline shows the entire programme while a lower timeline on the same page shows the current work area to avoid users having to zoom in or out. An AI/ML Engine uses facial recognition to automatically sort and organise clips into bins based on people in the shot.
8K ecosystem widens
Indicators for 8K will be spread all over the floor at IBC2019. Both Avid and Resolve software can handle 8K finishing (Avid even claims 16K).
Indeed, it’s the new must-have appendage for vendors (see also AI). The greater data overhead can help render higher quality visual effects or deliver more information to the final image for cinematographers wanting to mix resolution, aspect ratios, and sensor size.
Dramas, like the mini-series Trust, are increasingly shot at higher than 4K resolutions, and more general programming is following suit. 4K channel Insight TV is shooting some content in 8K, including segments of Car Crews with Supercar Blondie, starring social influencer and car nut Alex Hirschi.
Blackmagic Design identifies the corporate video comms market as early adopters. It announced a flurry of 8K capable solutions including the ATEM Constellation switcher earlier this year.
“What’s most important for us when we talk about an 8K product is that it’s inherently available for 4K as well,” explains Craig Heffernan, technical sales director EMEA. “It’s a future proof workflow that enables anyone to test or implement 8K workflows based around UHD budgets and planning, and allows us to provide a foundation to build tools for customers leading the industry into 8K content production.”
The format will also find a home in live production for techniques such as region of interest—extracting 4K or HD images from a single 8K one.
“The benefit is that 8K offers a more personalised experience with a high quality of experience compared to other approaches where the zoom leads to fuzzy picture,” says Thierry Fautier, vp, video strategy at Harmonic which will present results of its 8K over 5G demonstration at the French Open at IBC2019.
Full frame imaging
The crop of large-format digital cinema cameras and lenses continues to grow. Bigger and better sensors are becoming easier to produce - though price is still a factor. Full-frame sensors offer better depth of field control and image quality, particularly in low light situations.
An inexpensive new option is on the verge of launch from lens maker Sigma. Touted as the world’s smallest and lightest full-frame mirrorless (i.e. electronic shutter) camera, the Sigma fp will record 12-bit CinemaDNG raw in 24p 4K. It has the virtues of a point-and-shoot camera with the trappings of a cine-quality imager and is compatible with Sigma and Panasonic lenses.
Rival options include Sony’s Alpha 7R IV, shipping around IBC for about US$3,500 and capable of recording 4K up to 30p. Sony says the 61-megapixel sensor makes it the highest-resolution full-frame camera it has ever introduced.
Look to Chinese vendor Z Cam for release (post IBC) of full frame 6K and 8K versions of its E2 modular cinema camera. The E2-S6 sports a 26MP Super35 CMOS sensor paired with either a Canon EF or ARRI PL mount. It can shoot UHD 4K 60p and will cost a budget-friendly US$3,995.
Harmonising OTT and TV
A new initiative intended to standardise the delivery and presentation of broadband and broadcast delivered television is being put before the industry at IBC.
DVB-I, from cross industry consortium DVB, aims to do for OTT what it did for digital TV. Namely, to enable broadcasters to deploy common services across a wide range of devices and to enable manufacturers to save costs and offer a single consistent user experience for all video services.
“We are not early – but I don’t think we are too late,” says Peter MacAvock, DVB chair. “The OTT march is fully underway and DVB-I is designed to provide the type of standard and rigour to the OTT sphere that DVB brought to digital TV.”
Specifications include the integration of channel lists, so that all broadcast and IP services are discoverable, and a low-latency mode to ensure that the overall delay for live OTT channels is equivalent to broadcast, potentially down to fractions of a second.
Silver bullet compression
Debate continues about the merits of codecs to succeed HEVC for streaming video with cost as much a consideration as compression quality.
The chief contenders are AV1, developed by the Alliance for Open Media (AOM) and Versatile Video Coding (VVC), an MPEG-led standard.
In tests earlier this year, BBC R&D found that compression gains from VVC far exceeded those of either HEVC or AV1, but at the cost of processing time. It also found that AOM has significantly reduced AV1’s computational complexity.
VVC could cut bandwidth requirements in half over HEVC and is considered the silver bullet to make 4K and even 8K fly.
“For an economic broadcast of 8K television the industry needs VVC,” says Thomas Wrede, vp new technology & standards, SES Video.
However, there are question marks about the licence costs of AV1 and VVC which the Media Coding Industry Forum is working to clear up. In the interim, MPEG has fast-tracked development of MPEG-5 EVC as a royalty-free codec competing directly with AV1.
With the race to standardise EVC and VVC due to conclude next year, at a time when AV1 is set to mature, there will likely be a photo finish.
Interoperability - more important than ever
Some form of IP production tool can be found on most booths, but there appear to be two competing systems: the ST 2110 family of standards and NewTek’s NDI.
“The reality is that we’re unlikely to have a single IP solution,” says Ian Wadgin, senior technology transfer manager, BBC R&D. “What we need is a way for the two systems to interoperate and pass content between them.”
A broadcaster may use ST 2110 in their studio environment but have NDI in their live news production workflows where more compressed workflows are important to deal with less than optimal connectivity.
The answer might lie in NMOS (Network Media Open Specifications) enabling an NDI source to appear on a ST 2110 matrix and vice versa.
Since NewTek was acquired by VizRT in the industry’s most dramatic M&A this year, eyes will be on this integration.
“Convergence of the two systems would be welcome and means that the right tool will be available for content producers whatever their requirements or budget,” Wadgin says.
Vox Pops
The entire value chain for 8K needs evolving. If you look at 8K cameras, particularly for ENG type applications, the selection is very limited. Not all post production tools are set up to deal with 8K. Then there’s storage. We’ve started producing projects in 8K, not necessarily to build up the library, but because we want to understand where we need to adapt the workflow chain from production through to delivery. Rian Bester, CEO, Insight TV
The next big thing promises to be shooting in ‘Full Frame’. However, the choice of cameras available to cinematographers is very slim and with the exception of DSLRs, all are very expensive with a gaping hole in the middle. We have a sneaking suspicion that some manufacturers will launch professional Full Frame cameras to make this new format available to the mainstream. Barry Bassett, managing director, VMI
As television continues its transformation into an era of universal IP delivery, the resilience and quality of experience of broadcast must be maintained. Understanding emerging developments in hardware and software, together with a view on how these might be implemented by the television device sector are critical if the viewer experience is to be protected, and industry and government are to derive the maximum economic and social value of the unique opportunity that lies ahead. Richard Lindsay-Davies, CEO, DTG
Having recently migrated our content to the public cloud, our IBC exam question is around how we use this opportunity to help our viewers discover even more UKTV content and to enjoy it when and where they want it. Simple but smart, nimble products that can easily plug in to our existing ecosystem will fit the bill nicely. Sinead Greenaway, CTOO, UKTV

Behind the scenes: The Lion King

IBC
The animated remake of the Disney classic employed a live-action film crew that could work inside virtual reality using traditional camera equipment to set up and execute shots in the animated world the same way they would be achieved in the real world.
For all its pioneering virtual production, the biggest breakthrough in The Lion King is its dedication to putting old fashioned filmmaking front and centre of the creative process.
“You can’t improve on 100 years of filming,” says Rob Legato, the production’s VFX Supervisor. “You start with the actors, you block out the scene, you select the lens, you compose and light the shot. If you shortcut that, it no longer feels like a movie.”
The Lion King is, of course, Disney’s remake of its 1994 animated smash and also a follow-up to The Jungle Book. It reunites director Jon Favreau with the VFX team led by Legato and MPC VFX supervisor Adam Valdez.
Indeed, for Technicolor-owned MPC, the story began as early as October 2016, while still wrapping up work on The Jungle Book campaign and months before Legato and Valdez won the Best VFX Oscar. They began discussing how the pipeline and methodology could continue to evolve from The Jungle Book to take their next project to yet another level.
Unlike The Jungle Book, where actor Neel Sethi (Mowgli) was composited into photoreal CG backgrounds with equally life-like animated animals, this time the entire film would be generated in a computer but shot with all the qualities of a David Attenborough nature documentary.
“Once the decision was taken to treat the movie as if it were live action it has to be filmed with a live action intent,” Legato explains to IBC365. “That means there are creative choices that you make only in analogue. You don’t make them frame by frame, you make them by looking at it and changing your mind, on the spur of the moment, in response to what is happening in front of you. A live action movie is the sum total of the artistic choices of a director, a cinematographer, an editor and many more. You have to find a way of recreating those on a virtual stage.”
Having studied at film school and served as a VFX supervisor, VFX director of photography and second unit director on films like Scorsese’s The Aviator, The Departed and Shutter Island, and Robert Zemeckis’ What Lies Beneath and Cast Away, Legato observes that his best work has not been about creating fantastic worlds but about believable ones.
“From the Titanic sinking to Apollo 13 launching any success I have had is about trying to fool the eye into believing something that looks like any other part of the movie.”
Cinematographer Caleb Deschanel earned his spurs in director Francis Coppola’s camera department on The Godfather and Apocalypse Now, and has six Oscar nominations to his name, including for The Right Stuff, The Natural and The Passion of the Christ. He worked with Legato on Titanic, for which Legato also won an Oscar.
“Caleb is a fabulous artist but he has no experience of digital translation so my job was to navigate the mechanics of this for him,” explains Legato.
Essentially that meant providing an interface for Deschanel between the virtual world and the tools of conventional filmmaking in such a way that the DP could call and operate a shot just like any other movie.
“We don’t want to throw him to the wolves just because he doesn’t come from a digital background,” Legato says. “His intuition about what makes a shot work is essential so we did everything in our power to help his ideas translate.”
That’s not to suggest that the virtual production techniques advanced for The Lion King were designed solely for Deschanel.
“What we’re trying to do is tap into our gut level response to things, our instantaneous art choice – what happens if I pan over here and then, if I move a light over this way, and put that rock there. Everything is designed to be done in real time, instinctively, instead of over thought and intellectualised.”
Virtual reality
The key production advance to achieve this is VR. Trialled on The Jungle Book, its extensive use here enabled the filmmakers to collaborate on shooting the movie at nearly every stage as if it were live action.
Where Avatar broke ground by giving the filmmakers a window on the VFX world — they could see the CG environment in real time during production as if they were looking at it through the camera’s viewfinder — The Lion King inverts that idea by putting the filmmakers and their gear inside a game engine that renders the world of the film.
If that concept sounds a little hard to grasp, Legato explains that it’s not as sophisticated as it sounds.
“VR allows you to walk around the CG world like you would on a real set and put the camera where you want. The physicality of it helps you to psychologically root yourself. You know where the light goes, you know where the camera goes, you know where the actors are – and all of a sudden you start doing very natural camera work because you’re in an environment that you’re familiar with.”
On a virtual stage dubbed the Volume in Playa Vista, LA, Favreau and his crew donned HTC Vive headsets to view 360-degree pre-built panoramas and pre-visualized animation.
Camera moves were choreographed using modified camera gear – cranes, dollies, Steadicam (even a virtual helicopter, operated by Favreau himself) – to allow the filmmakers to ‘touch’ their equipment with the motion tracked by sensors on the stage ceiling and simulated directly within the virtual world.
Effectively, they were making a rough version of the movie in real-time with graphics rendered using a customised version of the Unity game engine.
“Instead of designing a camera move as you would in previs on a computer, we lay dolly track down in the virtual environment,” says Legato. “If I want to dolly track from this rock to that tree the dolly has real grip and inertia and a pan and tilt wheel which is sending data back to the virtual environment. It’s not a facsimile. In that way you retain the imperfections, the accidents, the little idiosyncrasies that make human creative choices but which would never occur to you if you made it perfectly in digital.”
In other words, the storytelling instincts of artists with decades of experience making features is absolutely at the heart of this virtual production process.
“An amateur director can still put cut shots together but if they were given The Godfather to make then they would make a very different movie. A Martin Scorsese directed film is very specific to his point of view. It is not arbitrary, it is very serious art. And that is what we are trying to do here.”
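The wheel data flow Legato describes can be pictured in code. The sketch below is purely illustrative, with invented constants and class names (it is not MPC's pipeline): encoder ticks from physical pan and tilt wheels are mapped directly onto a virtual camera's rotation, deliberately without smoothing so the operator's idiosyncrasies survive.

```python
# Hypothetical sketch of feeding physical wheel-encoder data to a virtual
# camera. Names and values are assumptions, not MPC's actual system.

TICKS_PER_REV = 8192   # assumed encoder resolution per wheel revolution
GEAR_RATIO = 0.25      # assumed wheel-to-camera gearing

def ticks_to_degrees(delta_ticks: int) -> float:
    """Convert a change in encoder ticks to degrees of camera rotation."""
    return (delta_ticks / TICKS_PER_REV) * 360.0 * GEAR_RATIO

class VirtualCamera:
    def __init__(self):
        self.pan = 0.0    # degrees, wraps at 360
        self.tilt = 0.0   # degrees, clamped to +/-90

    def apply_wheel_input(self, pan_ticks: int, tilt_ticks: int) -> None:
        # Raw, unsmoothed deltas are applied directly, preserving the
        # human "imperfections" Legato values.
        self.pan = (self.pan + ticks_to_degrees(pan_ticks)) % 360.0
        self.tilt = max(-90.0, min(90.0, self.tilt + ticks_to_degrees(tilt_ticks)))

cam = VirtualCamera()
cam.apply_wheel_input(pan_ticks=2048, tilt_ticks=-1024)
print(cam.pan, cam.tilt)   # 22.5 -11.25
```

The point of the direct mapping, rather than an interpolated digital move, is exactly the one Legato makes: the data path keeps the grip's inertia and the operator's hand in the final camera motion.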
Pre-building the virtual world
In order for any of this to work there was a major first stage which was pre-building the virtual environment.
The VFX team and core crew including production designer James Chinlund spent two weeks on safari in Kenya photographing vistas and data capturing foliage, the different species of plants and trees, and various lighting environments. They photographed 360-degree HDR images of the sky and the sun and built a huge library of these high-resolution images.
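One common use for a library of 360-degree HDR skies like this is to drive CG lighting from the captured image itself. As a toy illustration only (the function and layout are assumptions, not MPC's tools), the dominant light direction can be estimated from an equirectangular HDR by locating its brightest pixel and converting that pixel to a direction vector:

```python
import math

# Toy sketch: estimate the dominant light (sun) direction from an
# equirectangular (lat/long) HDR sky image. Illustrative only; names
# and conventions are assumptions, not MPC's actual pipeline.

def sun_direction(hdr, width, height):
    """hdr: row-major list of (r, g, b) floats. Returns a unit vector
    (x, y, z) with +y pointing up."""
    # Find the brightest pixel by Rec. 709 luminance.
    best_i, best_lum = 0, -1.0
    for i, (r, g, b) in enumerate(hdr):
        lum = 0.2126 * r + 0.7152 * g + 0.0722 * b
        if lum > best_lum:
            best_i, best_lum = i, lum
    x, y = best_i % width, best_i // width
    # Map pixel centre to spherical angles in the lat/long layout.
    theta = math.pi * (y + 0.5) / height      # 0..pi from the zenith down
    phi = 2.0 * math.pi * (x + 0.5) / width   # 0..2pi around the horizon
    return (math.sin(theta) * math.cos(phi),
            math.cos(theta),
            math.sin(theta) * math.sin(phi))

pixels = [(0.1, 0.1, 0.1)] * 8          # tiny 4x2 stand-in for a real HDR
pixels[1] = (50.0, 50.0, 50.0)          # bright "sun" in the upper row
v = sun_direction(pixels, width=4, height=2)
```

In practice the whole HDR would be used as an environment map, but extracting a key-light direction like this is a standard first step when matching CG lighting to a captured sky.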
Back at MPC, the landscapes were modelled in Maya and, using a bespoke asset management system, integrated into Unity.
Working within the VR world, Favreau and Deschanel were able to explore the locations, effectively scouting for the places to shoot each scene.
Next, Favreau and animation supervisor Andy Jones would work out the mechanics of a scene – the rough blocking and approximate animation, again in VR.
“It just kept evolving and iterating until we get to something that we like,” says Legato. “We could all walk around in the virtual world together, and see things for the first time, look at things from different angles. You could be miles apart in the VR world but three feet apart on the stage but we could talk to each other and say ‘Why not take a look at what I am seeing?’ and we could all snap to their point of view.
“When that was done, Caleb would work with Sam Maniscalco (lead lighting artist) and they would light the scene for each shot.”
As with live action, the action was covered from different angles in order to provide a selection of takes for editorial. When editor Mark Livolsi had made his selections, the resulting shots were sent back to MPC along with the camera tracking data to finesse into final production quality.
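The hand-off described here, selected takes travelling back to the facility with their camera tracking data attached, can be pictured as a simple serialisation step. The field names below are invented for illustration; MPC's bespoke asset system will look nothing like this:

```python
import json

# Hypothetical sketch of packaging an editor's selected take together with
# its per-frame camera tracking data for the VFX facility. All field names
# are invented, not MPC's actual schema.

def export_take(take_id, frames):
    """frames: list of per-frame dicts (position, rotation, focal length).
    Returns a JSON string ready to hand off."""
    payload = {
        "take_id": take_id,
        "frame_count": len(frames),
        "camera_track": frames,
    }
    return json.dumps(payload)

track = [
    {"frame": 1, "position": [0.0, 1.7, 5.0], "rotation": [0.0, 90.0, 0.0],
     "focal_length_mm": 35.0},
    {"frame": 2, "position": [0.1, 1.7, 5.0], "rotation": [0.0, 89.5, 0.0],
     "focal_length_mm": 35.0},
]
blob = export_take("scene12_take3", track)
```

The key property is that the finished, photoreal render can reproduce the exact camera move chosen on the virtual stage, because the move exists as data rather than as a description.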
Months of research went into character development. The final designs were sent to artists at MPC, who built them using new proprietary tools for improved simulation of muscles, skin and fur.
From 12,000 takes of photography, MPC delivered nearly 1,500 shots (170,668 frames, or 119 minutes of final images) to Disney. 145 shots that were started were omitted in the process.
“They perfected the nuances of performance and lit it to look absolutely photoreal, but the creative choices of what we’re shooting had already been selected in this looser live-action virtual space.”
Because they wouldn’t be involved in the film’s principal photography, The Lion King’s human actors (including Donald Glover and Seth Rogen) were often asked to perform with each other in the Volume rather than simply reading script pages from a stationary position at a mic. Legato says the stage environment helped them deliver real, physical performances as references for the animators.
“We photographed with multiple Blackmagic Design cameras so the animators could see the intent of the actor,” said Legato. “But when they pause and they look and you see them thinking, you know that that’s what drives the performance. It’s much more informed than just voices only.”
The output from the cameras was routed over more Blackmagic gear so the team could watch playback or drop the footage directly into Avid for editorial.
“If we needed to throw an image onto a big screen so that the actors can get a sense of working with the pre-viz we could do that,” Legato says. “The Blackmagic kit was like a Swiss Army knife, a useful and necessary tool in this process, which fitted together whichever way we needed it.”
DaVinci Resolve was used as a finishing tool and also to apply colour correction to the animation even before it went through the Digital Intermediate process at MPC.
The main virtual camera was modelled on an ARRI Alexa 65 to enhance the film’s epic quality, paired with the Panavision 70 cinema lenses that were used on the reference trip in Africa.
But it is the tactility and authenticity of using actual camera devices and the instant feedback into the virtual environment which, Legato believes, gave them the ability to iterate the animation to a greater degree than ever before.
“I don’t know of anybody else doing it,” he says. “Even James Cameron isn’t shooting Avatar with VR in this way.”
The technology is already filtering into TV production, albeit the most high-end TV possible. Disney’s Star Wars spin-off The Mandalorian, created by Favreau, is using a similar setup, albeit with the Unreal game engine.
“The ability to see in advance what only can be imagined until fully constructed will create better and better films, plays, concerts, and television shows,” adds Legato. “What you can now preview with great detail can only make for more exciting and original artistic expressions. So, in short, the encouragement to explore will take the advantages of VR to the next level.”

Saturday, 31 August 2019

Behind the scenes: The Man in the High Castle

IBC
Amazon’s The Man in the High Castle imagines what the world would be like if the Axis powers won World War II. With the fourth and final season streaming in November, IBC365 spoke with cinematographer Gonzalo Amat about creating the show’s retro-futurist look which took its initial cues from executive producer Ridley Scott’s own dystopian noir Blade Runner.
“Ridley has always pushed us to be edgier with our work,” explains Amat. “Prepping for series 3 and 4 he’d say things like ‘use less fill’ or ‘be bolder’ which encouraged us to pursue bolder ideas. Sometimes, guest directors would wonder about close-up shots but we always reassured them by saying, ‘Ridley loves that bold stuff - no need to worry about the conventional shots.’”
Scott has been insistent on a cinematic look to the show from its debut in 2015, urging the creatives to stay away from the visual language of regular TV.
“We’re constantly asking ourselves what we’d do if we were doing a movie,” says Amat who has lensed half of the 40 episodes, with James Hawkinson shooting the other half. “We don’t use zoom lenses so you have to actually place the camera for the prime lens that you have instead of zooming in. The use of wide shots and close-ups are not common for TV. Audiences are very visually sophisticated so they notice and value the effort.”
Blade Runner remains the “bible” in terms of look design, to which have been added references for each season and episode. Seasons 1 and 2, for example, also drew on The Conformist and In the Mood for Love, while seasons 3 and 4 were inspired by Japanese cinema.
“The simplicity of blocking the actors and efficient use of the wide shot was something we recovered from Yasujirō Ozu and Kenji Mizoguchi’s films,” says Amat. “Just in general, we looked at great films from great authors, including a lot of Kubrick’s films all the way to current filmmakers like David Fincher. I personally looked at The Road to Perdition and Revolutionary Road (shot by Conrad Hall) as inspiration for these two seasons. Films like The Assassination of Jesse James guided me on how to approach available light, and Bridge of Spies and The Lives of Others were helpful on how to approach period looks on a budget.”
He continues: “My purpose for seasons 3 and 4 was to go back to the basics. To be more expressive and connect with the characters, we created bold frames, graphics, and simpler lighting. We used single source rather than fill light. With more hard lighting in the background and no sunlight on our actors, these were some of the basic concepts used that I have loved since the show’s conception.”
The Philip K Dick adaptation holds particular resonance for Amat who was born in Mexico City to Spanish immigrants.
“My wife’s parents grew up, lived, studied, and worked under Franco’s regime and, as a child, I spent a lot of time in Spain immediately after Franco died,” he says. “Their stories of those times have definitely been an element I used to portray an attempt of having a normal and hopeful life under an authoritarian regime.”
Amat grew up in Mexico in the 70’s and 80’s under the PRI which ruled uninterrupted from 1929 to 2000 and again from 2012 to 2018.
“I have my own account of living in a regime with no democracy, and a government controlled media,” he says. “[MITHC] has felt very personal because of my experience. The fictional world of Nazi and Japanese occupation in the 1960’s feels very similar to Spain during the same time period. In my own research, I have used references from Spain, including family pictures and accounts, from those times.”
While the first series was shot with a Red Epic because they needed 4K footage, they subsequently shifted to the ARRI Alexa, despite its 3.2K resolution. According to Amat, the main reason was the better low-light sensitivity of the Alexa which led to using less lighting on the scenes.
“Working in natural light as much as possible is vital for this project but given the tight turnaround times it can be tricky. Our locations team and production designer [Drew Boughton] have been great about always thinking of windows, orientation, color of walls, and so on, before we even go scouting. It becomes a lot easier if you do that work ahead of time.”
Season 4 is shot on Alexa Mini, instead of the Alexa SXT, so the DPs could make use of the camera’s internal ND filters and its smaller size.
The show is shot on sound stages in Vancouver with various scenes shot on locations in the Californian desert, in British Columbia, farmlands in Denver and various national parks.
“The Nazi-occupied East Coast of the show is designed with de-saturated colours, almost black and white with no red, except for the Nazi iconography. In the Japanese-occupied West Coast, there are more pastels with green, aqua, and warmer tones but they are faded and also de-saturated. In the Neutral Zone, we used faded earth tones and textures reminiscent of Americana. In season 3, which takes place in 1963, we played with the palette inspired by 1960’s US culture which incorporated more colour and sun. With season 4, there were some newer looks too but they were confined within the original design so it still feels like the same show.”
Down the mine shaft
The mine caves sequence in the S3 finale, ‘Jahr Null’, for which Amat is Emmy nominated this year (Hawkinson won an Emmy for his work in 2015), required three different locations, one of which was an actual mine. Amat’s goal was to shoot with the actors’ headlamps and nothing else.
“You can’t put any lights up in the mine since it’s a one-way tunnel, so we used the headlamps and lanterns from our actors and, with their help, they lit the scene. We used atmosphere smoke and haze in order to create the level of darkness needed and to allow the lights to shine. Our gaffer constructed a makeshift handheld light that worked by bouncing light off of the walls of the mine as the actors moved, recreating a natural movement of light.”
The following tunnel sequence required the visualization of a giant gateway to a multiverse.
The VFX team, led by senior VFX supervisor Lawson Deming, are also nominated for their work on this episode. Jahr Null, or Year Zero, refers to a social engineering plan to erase all traces of American history and identity. Tasks ranged from digitally remodeling cities like New York to appear as an earlier time with added sinister details, to the destruction of the Statue of Liberty in favour of another monument called the New Colossus.
“We had to have numerous meetings about what this tunnel was as a concept and then to make sure that the look of the scene matched the concepts of quantum physics within the story,” he says. “I wanted to achieve lighting that felt almost like plasma. We decided to put a mirror at the end of the tunnel with circle lighting right above it. We then created the effect of space travel by using a blast of light that collectively used more than a million watts. It was a complex setup, but fortunately we had a lot of very talented people come together to execute it.”
Perhaps unsurprisingly when shooting on location, many businesses objected to hanging Nazi flags and Fascist propaganda so much of this was added in post.
“In the last couple of seasons, I’ve been experimenting with playing difficult scenes in a different way,” Amat says. “Where for example we have a disturbing scene, and you play it like a normal scene, so not darker, moodier, just normal. Sometimes the audience finds it more powerful when you make something disturbing into a piece of normal life.”

Games engines are changing the rules of production

RedShark News
Virtual reality, in conjunction with real-time rendering using game engines, has become the new must-have accessory of cutting-edge production. Its use has advanced from a tool for pre-visualising select action or VFX-intensive sequences to being the medium through which filmmakers collaborate in a virtual environment, mixing live action and digital puppetry with CG backgrounds.
Two films this year exemplify the approach. In the wildly kinetic, ultra-cool John Wick threequel, a climactic set-piece sequence takes place in a room high up in the Continental Hotel where the ceiling, floor and interior are made of glass.
The cinematography by Dan Laustsen echoes Roger Deakins’ work filming a glass-filled room in Skyfall and recalls the house-of-mirrors shootout from John Wick 2 (itself a homage to Orson Welles’ climactic scene in The Lady from Shanghai).
Just to make it even more difficult, in John Wick Chapter 3 – Parabellum there are giant LED screens playing back vibrant colours inside and outside the glass room.
“[Director Chad Stahelski] wanted this idea from the beginning, and we spent a long time talking about how to achieve it,” Laustsen told IBC.  “They built the set about 800 x 400 ft in a studio. It was really complicated to light, so we shot tests with the big LED screen on the outside. When you have glass surrounding you 360-degrees you have to be very careful to avoid lights and other equipment being in picture but we had the experience of handling something similar from John Wick 2.”
What differed from JW2 was the use of VR to prep the scene. This included a full 3D version of the glass office that the key filmmakers could view in real time in VR goggles well before the location was constructed as a physical set.
According to concept illustrator Alex Nice, this virtual version even included proxy fighters, and the ability to mock-fight in VR. This was all built and played back in Epic Games’ Unreal Engine 4.
Being able to virtually walk around a complex set benefited not just the production design but also the stunt-fight co-ordinators and Laustsen, who was able to ‘pre-visualise’ how to shoot the scene – where to place the camera, how to light it – to a far more accurate and realistic degree than before.
The team even built in a virtual camera which allowed for scene capture, depth of field and lens selection within the game engine. What Laustsen saw, and the decisions he made while wearing the VR headset, could be relayed to the crew by displaying it on a monitor.
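A virtual camera that offers lens selection and depth of field inside an engine has to run the same optics maths as a real lens. A minimal sketch of that maths, using the standard hyperfocal-distance formulae with an assumed circle of confusion (this is textbook optics, not the production's actual tool):

```python
import math

# Minimal depth-of-field calculator of the kind a virtual camera inside a
# game engine might use. Values are assumptions, not the production's tool.

def hyperfocal_mm(focal_mm, f_stop, coc_mm):
    """Hyperfocal distance in mm for a given circle of confusion."""
    return focal_mm ** 2 / (f_stop * coc_mm) + focal_mm

def dof_limits_mm(focal_mm, f_stop, focus_mm, coc_mm=0.03):
    """Near and far limits of acceptable sharpness, in mm."""
    h = hyperfocal_mm(focal_mm, f_stop, coc_mm)
    near = (h * focus_mm) / (h + (focus_mm - focal_mm))
    if focus_mm >= h:
        far = math.inf   # everything beyond the near limit is sharp
    else:
        far = (h * focus_mm) / (h - (focus_mm - focal_mm))
    return near, far

# A 50mm lens at f/2.8, focused at 3m: near ~2.73m, far ~3.33m.
near, far = dof_limits_mm(50.0, 2.8, 3000.0)
```

Swapping in a different virtual lens is then just a change of `focal_mm` and `f_stop`, which is what makes lens selection inside the engine cheap compared with on set.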
This technique, however, was an exception for a film which makes a virtue of shooting as much as possible, including star Keanu Reeves’ stunts, in-camera.
Not so The Lion King, which has arguably pushed the boundaries of virtual production further than any project to date. Entirely animated and featuring singing, talking animals, Disney’s feature is designed to look as if it were shot for real as a live action.
That meant giving the filmmakers, including director Jon Favreau and cinematographer Caleb Deschanel, as close to an experience of shooting a live action movie as possible for the entirety of the production, but within a VR environment.
Once again, UE4 was the engine of choice into which highly detailed CG landscapes (designed from real Kenyan safari vistas) and pre-built character animation had been ported.
The filmmakers did everything from scouting locations in the CG panoramas and blocking scenes with virtual and real actors (whose performances were shot on a stage and then translated into animation by VFX facility MPC) to selecting camera angles, shooting multiple takes for coverage and modifying lighting.
Making the shots feel real was all about emulation. The production created physical representations of traditional gear because Favreau believed it would help the film feel like it was photographed, rather than made with a computer. There was an emulated Steadicam rig and an emulated handheld camera rig. There were cranes and dollies. There was even a virtual helicopter, operated by Favreau himself.
“Instead of designing a camera move as you would in previs on a computer, we lay dolly track down in the virtual environment,” Favreau explains in the film’s production notes. “Even though the sensor is the size of a hockey puck, we built it onto a real dolly and a real dolly track. And we have a real dolly grip pushing it that is then interacting with Caleb [Deschanel], who is working real wheels that encode that data and move the camera in virtual space. There are a lot of little idiosyncrasies that occur that you would never have the wherewithal to include in a digital shot.”
The Lion King can claim to be the first time filmmakers walked around as if on a real set – in sync and in real-time using VR – communicating to each other, pointing things out and manipulating things together.
Francesco Giordana, Realtime Software Architect at MPC calls it “a real milestone to be able to put multiple people into the same space at the same time collaborating this way – where new multi-user workflows meet old school cinematography and filmmaking.”
The significance of this approach is not just the ability to see and manipulate the virtual in real time, but the marriage of the digital with analogue, conventional film grammar, down to mimicking the tactility of actual camera equipment.
“When we're in VR, it gives you the visceral feeling of being there,” says three-time Oscar winner Rob Legato, the film’s overall VFX supervisor. “The whole concept of virtual production and virtual cinematography is about imparting your analogue live choices and not the ones that we have to think about for a long period of time. So, you want to have something that you have instant feedback on. If you line up a camera and something moves in the background, you might change your composition based on live input – that you react to immediately without having to think about it.
“The closer the technology gets to imitating real life, the closer it comes to real life. It's kind of magical when you see it go from one state to another and it just leaps off the screen.”
The film’s virtual production extended techniques developed on The Jungle Book.
Physical devices were custom built, and traditional cinema gear was modified to allow filmmakers to ‘touch’ their equipment – cameras, cranes, dollies – while in VR, letting them use the skills they’ve built up over decades on live-action sets. They no longer have to point at a computer monitor over an operator’s shoulder.
“That was all hugely valuable to the stunt-vis team,” adds Nice, “because then they can really get a sense of things like the stairs and where walls were, and where things were obscured. It actually becomes a really helpful storytelling tool and planning tool way ahead of time. The sooner that you have these guys being able to spatially map out this environment allowed them to do their magic for the film.”