Tuesday, 7 July 2020

Back at work down under - AV in ANZ

AV Magazine
Given the territory’s far-flung urban centres – not to mention its isolation from the rest of the world – remote working was a way of life for many businesses and education facilities down under pre-Covid. Post-pandemic, the need for AV communications has never been more relevant.
“People were quietly confident it was going to be a good year for ANZ and that this would lead into a prosperous 2021,” says Jamie Hind, regional sales director, APAC, Exterity. “Instead, we’ve seen so many people and so many businesses affected, including all of the major vertical markets.”
Hotels and the hospitality market have experienced disruption and, unsurprisingly, a number of projects have either been delayed or cancelled. Other verticals such as stadiums and venues are devoid of patrons for the foreseeable future, making investment choices harder to justify.
“The healthcare sector has been overwhelmed and, aside from making sure robust communications are in place in hospitals and medical buildings, it is hard for other projects to even make the radar with everyone’s health and wellbeing the top priority,” says Hind. “Many projects that were already in motion have continued apace so there is still plenty happening, but it may be another twelve to eighteen months before we really see the full effect Covid-19 has had.”
Covid-19 has required organisations throughout ANZ to review their unified communications solutions. As people return to work, meeting rooms will need tweaking for the new world while those still working from home will need to feel connected.
“Remote, or home-based workers, have now become the norm and there is a good chance that the way we work, and where, will be changed forever,” says Hind.
Unified comms
Stuart Craig, vice-president at Crestron ANZ, agrees. “Our working from home and video engagement learnings will not go away as the pandemic subsides. Projects will be delayed, but the reality is more people are talking about workplace technology now than ever. While we can all expect a bumpy few months, the opportunities in the mid to long term only got bigger through this experience.”
The territory is widely touted as an innovator in technology and early adopter perhaps because of the need to be independent from other regions. “ANZ is often not a friendly time zone to get a hold of support for quickly. For this reason, our market relies heavily on being proficient in our craft,” says Blake Kirby, senior brand manager at Xilica’s ANZ partner, Amber Technology. “We need to have all the answers on hand because it could be eight to ten hours before we’re able to get a response from a manufacturer. Our market has deeply knowledgeable and highly skilled workers in the pro AV world.”
“An example being a very early uptake in LED technology, in airport installations which I saw on my travels way before it was common elsewhere,” says Andy Lee, sales and account manager, Datapath. “Recently, we’ve seen really encouraging adoption in corporate and education markets, which is not always an obvious one for us when compared to our known command and control focus.”
A heightened appreciation for collaboration in Australia, compared with other mature AV markets, was already driving technology adoption trends, reports Alistair Hayward, head of UKI and Asia Pacific, Promethean. “Australia has been at the forefront of IoT, the integration of classroom devices and adoption of education platforms, such as Google and Microsoft.”
The shift to distance learning during lockdown has sparked debate about whether this could be the future of curriculum delivery. Based on market feedback, Promethean doesn’t believe this will be the case.
“Students are social learners, and the collaboration that takes place in the classroom cannot be achieved to the same levels remotely,” says Hayward.
“Recognising that schools will have varying requirements and processes in place, we’ve invested in five high quality video suites around the country that will enable our team to support schools in their IFPD decision making.”
As lockdown occurred during one of the quieter sales periods in the year for IFPDs, it is too early to say for sure how budgets and investment priorities might be affected.
“One thing we can be sure of is a shift in how procurement decisions are made,” says Hayward. “We also appreciate that some schools will still prefer an on-site ActivPanel demonstration when making investment decisions. To support these, we’ve established the necessary protocols to enable us, and our partners, to do so safely.”
Kirby also reports increasing interest in project work at schools and higher ed facilities for AV/IT solutions from Xilica, Wyrestorm and other vendors. “Some facilities are using the lockdown time to make upgrades that were perhaps difficult to conduct during regular opening times. These can include anything from critical infrastructure to changing that ageing projector globe.”
In the aftermath of Covid, live events will be among the last to resuscitate. “AV trends will probably revolve around safe conferencing and crowd and queue control technologies to enable some return to business,” reports Tim Lambert, sales manager for Powersoft’s regional distributor, PAVT. “Given the number of business closures and asset sales, we expect a very buoyant second-hand market, which will create issues for distributors of new products for the foreseeable future.”
One of the key elements of growth pre-Covid in ANZ was the focus on lifting the experience of patrons at theme parks and museums. “Engagement has been a key outcome as it helps attract and encourage repeat business, as well as drive customer growth via word of mouth or social media,” says Claudio Cardile, managing director of Barco ANZ. “We saw an increase in the use of projection and LED solutions to create immersive installations. Our future depends on our social calendars being stripped to the bare minimum,” he says. “Social distancing contradicts the very nature of the AV industry which largely relies on the concept of being social and connected.”
At the same time, Cardile is optimistic that AV will regain ground in the coming months. “This is largely driven by the fact that AV technology provided the vital link between business leaders, employees, customers and supply chains as the world learned how to adjust to the new ‘normal’ almost overnight.”
Quiet neighbour
In New Zealand, the education and government markets were performing reasonably well, pre-Covid, reports Lambert. The staging market, however, was largely stagnant “due to the low return on investment” experienced by most operators.
“The highly competitive nature of the small to medium staging market causes operators to be very cost driven and often looking to utilise second hand products or keep old systems in service for very long periods of time,” he says.
Upcoming tenders in New Zealand include the Waikato Regional Theatre, slated for 2022-23, and the Auckland Convention Centre, which is now being assessed for a rebuild after a fire destroyed its roof late last year.
Exterity points to the New Zealand International Convention Centre in Auckland and the new Sky City Hotel as ongoing projects that have been showing interest in IP-driven AV technologies. “There has been a lot of anticipation as to what shape that might take,” says Hind.
Sporting rivalry
“We are one of the very few pure manufacturers directly represented in our industry in NZ,” claims Craig. Crestron fields an all-NZ team in Auckland. “In many cases integrators and distributors play hybrid roles. NZ business culture and heritage are different and need to be respected.”
“It is easier to obtain a liquor licence in Wellington than in Melbourne, so New Zealand’s hospitality industry is more competitive and less likely to spend large sums on decent AV compared with Australia,” says Lambert.
Perhaps the biggest differences between the Antipodean neighbours are around labour unions. “NZ has no real unions of any strength left and has perhaps smaller operators,” says Lambert.
In Australia, the redevelopment of Stadium Australia and Quay Quarter Development, both in Sydney, as well as the Queens Wharf Development in Brisbane are projects to watch.
“They are set to drive an enormous amount of investment and technology adoption as they start to come out of the ground and move towards opening over the next two years,” says Hind.
Crestron said it enjoyed its strongest (fiscal) year ever in the region in 2019-20 with a sizeable take-up of its cloud, virtual control and Flex (Teams Rooms) solutions.
“Given that the IT department now makes the bulk of the decisions for our solutions, I don’t see that as a surprise,” says Craig. “Another clear trend is the need for standardisation – one consistent hardware and software platform with a consistent, simple user experience across the campus, enterprise or global facilities.”
The University of the Sunshine Coast in Queensland recently completed one such project that incorporates an array of Crestron technology. “The use of cloud and virtual control as a backbone of a scalable system that can be deployed and managed from the cloud allows for a very intuitive services platform beyond AV, and has produced a real benchmark installation for the industry.”

Loupedeck is now compatible with Resolve and live streaming

RedShark News

Loupedeck, the custom photo and video editing console, continues its rapid development path with recent updates meaning it now works with DaVinci Resolve and Avid Pro Tools, while an integration with Streamlabs means the deck becomes a live video mixer for online broadcasting.
The Streamlabs integration allows users to control a stream by switching between scenes, hiding or unhiding unwanted sources, skipping or muting events at any time, and adjusting the audio levels of individual sources.
Peripherals like this, and Elgato's Stream Deck, have arrived to make live broadcasting on Twitch or YouTube Gaming simpler and extremely affordable.
Loupedeck founder and CEO Mikko Kesti believed in this mission so much that he used his own savings to create the first prototype.
After a wildly successful Indiegogo crowdfunding campaign (proudly 488% above target) and the recruitment of some of Finland’s leading developers, the original version of the hardware control surface was released in 2017.
This was a keyboard made for Adobe Lightroom, with dedicated dials and buttons for almost every adjustment in the RAW photo editor. The second-generation Loupedeck+ added support for other creative apps, including Adobe Premiere, and was closely followed at the end of 2019 by the Loupedeck Creative Tool (CT), an editing console which expands the product far beyond the Adobe suite, not least with native support for Final Cut Pro X and the photo editing software Skylum Aurora HDR.
Like Loupedeck+, the Creative Tool is an extension of your keyboard, mouse, tablet and pen, mapping software functions to the console’s buttons. It is designed to make editing quicker and more comfortable, particularly for photographers and videographers on the road (on a plane… remember those days?).
The latest software for the product (free with the US$549/£469 hardware and compatible with both Windows and macOS) enables users to program custom actions and adjustments, shortcuts, keys, delays, macros, text, links, application launches and mouse movements for third-party software including Steinberg Cubase Pro 10.5, Serato DJ Pro, Pixologic ZBrush 2020, Avid Pro Tools, Resolve 16 and Photo Mechanic, and even non-creative tools like Microsoft Excel.
It is also natively compatible with Ableton Live for music promo production.

Custom profile with Resolve

Custom profiles are sets of custom actions which can be mapped from any software onto the buttons of the deck. Unlike Loupedeck’s native integrations, which use the API of the software, a custom profile only uses the keyboard shortcuts available in the program, so it is more limited than a native integration.
In the Loupedeck software itself, the application dropdown list ends with an option to ‘find more’ or ‘get profile list’. That takes you to the Loupedeck website and a list of custom profiles that other users have already created and which you can download.
Do this for Resolve and you’ll find that any hotkey available in Resolve can be mapped to the CT; indeed, there’s already a DaVinci Resolve custom profile available for download. Every page in Resolve – media, cut, edit, Fusion – is covered in the Loupedeck software. The exception is the colour page: its hotkeys can be mapped, but the colour wheels themselves can’t be controlled because Blackmagic doesn’t yet allow Loupedeck to use its API for a native integration (who knows, that could change in future).
Also handy is a macros function, which enables a couple of functions in Resolve to be grouped under one hotkey without having to hit both keystrokes. So, if there’s some function you are doing repeatedly, this could save you a lot of time.
The benefits and capabilities of the Loupedeck CT have been well received. To recap a few of them: reviewers have praised its design and build quality, with an aluminium cover and dials, LED backlight, touchscreens and precision ball bearings. It’s lighter than you might think but carries 8GB of storage and is equipped with a removable USB cable for portability. A set of preset workspaces, specific to each integration, is dedicated to each stage of the editing process and can be tailored to match your workflow.
All its tools and functions work from one consistent user interface regardless of the application, and it’s designed to be controlled with one hand. That’s useful since custom buttons on a keyboard can require both hands to hold a shift or command key plus some other button, so this frees you up.
There’s also a Developer Program which opens the doors to the Loupedeck SDK, exclusively for the Loupedeck CT. This means more supported software and more plug-ins, with Autodesk Fusion 360 compatibility expected soon.

Behind the scenes: Mulan

IBC
Cinematographer Mandy Walker frames for the epic and the intimate in Disney’s touchpaper test for theatrical return.
Disney’s live action version of the Chinese folk tale Mulan will be the first real test of audience appetite for a return to cinemas, post-coronavirus. Set for an August release, it will be one of the first major big screen releases after months of lockdown across the globe. 
While other films, such as Disney’s Artemis Fowl, skipped theatrical and went straight to streaming, Mulan is a suitably big screen experience which Mandy Walker ACS, ASC (Hidden Figures, Australia) and director Niki Caro have made the most of with an epic scale of locations, large format photography and visual effects largely kept in-camera. 
The mythic story follows a young woman who disguises herself as a male warrior to take her father’s place in battle to save him and fight for her country. 
Near the beginning of the film, Bori Khan (Jason Scott Lee), the warrior leader of the Rourans and ally of powerful witch Xianniang (Gong Li) attacks a garrison of Chinese imperial troops. The army is seen galloping on horseback across desert toward the fort, then scaling the walls, in one of the film’s many epic action sequences. 
How it was shot 
“This scene had to be meticulously pre-planned because we are combining several elements of photography shot in different locations at different times,” Walker tells IBC365. 
Background plates of the dune-filled desert were filmed by a scenic unit in northwest China. The horses and riders were filmed on the South Island of New Zealand, and a set was built on a backlot near Auckland for the garrison, including the main front wall and gate, a side wall and the reverse of the structure inside the garrison’s walls. The scene was several weeks in planning and took four weeks to shoot and assemble. 
“We did a lot of previz for the stunts, ordered for filming in such a way that we could capture individual shot elements. We also used storyboards so we could align where and how to join between one location and another shot,” Walker explains. “There’s not a lot of CGI in this movie. We try to use location backgrounds to reduce the amount of matting and as much as possible we shot in-camera. The horses are ridden by Mongolian stunt riders who travelled to New Zealand for this scene.” 
The entire cast underwent two months of military training in preparation for the film before rehearsing battle scenes on location. 
“I went with Niki to these rehearsals which might be of 100 people and 60 horses and with a lens and finder I’d line-up angles and take note of the choreography to plan where our camera should go. 
“In this scene I’ve used a Russian arm to travel with the horse riders and a cable-cam running on the top. There’s also a drone but only for very wide shots. I don’t like using drones, especially large drones carrying heavy payloads, close to actors or horses because it can be dangerous.”
Camera choice 
The galloping riders and stunt work were shot with five cameras, including an ALEXA LF shooting 150 frames a second (sometimes slowed down further in post to the equivalent of 300 and 600fps).  
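For readers who want the numbers, here is a back-of-envelope sketch of that retiming arithmetic. The 24fps delivery timeline is an assumption for the example; the article doesn’t state the project frame rate.

```python
# A rough sketch of slow-motion arithmetic, assuming a 24fps delivery
# timeline (assumption - the article does not state the project frame rate).
capture_fps = 150
timeline_fps = 24

native_slowdown = capture_fps / timeline_fps   # 150fps played at 24fps = 6.25x slower
for target_fps in (300, 600):
    retime = target_fps / capture_fps          # extra 2x / 4x retime applied in post
    total = target_fps / timeline_fps          # overall slowdown relative to real time
    print(f"{target_fps}fps equivalent: {retime:.0f}x retime on top of "
          f"{native_slowdown:.2f}x native = {total:.1f}x total slowdown")
```

In other words, the 150fps material is already 6.25x slower than real time when played back at 24fps; a further 2x or 4x retime in post gives the 300fps and 600fps equivalents.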
“I didn’t want to use GoPros or Phantoms because I want to acquire at a high resolution to retain maximum information into post to blow up or tweak shots if required,” she says. 
The A camera for the picture, though, is the ARRI ALEXA 65 recording ARRIRAW at 6K. “When Niki and I first talked about the story it was clear she imagined it on an epic scale, yet she was adamant that the audience should feel as if they are with Mulan herself at all times. That led me toward the epic quality of 65mm, which is a perfect format for capturing vistas and one in which you can be very intimate with the subject by using depth of field to separate foreground from background. A touchpoint here is the way 70mm is used to show the relationship between epic landscape and central character in Lawrence of Arabia.” 
For wider shots, Walker used Panavision Sphero 65 glass, but for closer portrait shots of Mulan she had a portrait lens specifically made by Panavision and based on Petzval Portrait lenses, developed in the mid-1800s for use with daguerreotype exposures. 
“This lens is used predominantly when we see Mulan. I showed Panavision in LA some of the photographs I’d taken on location and also classic Chinese paintings as images for the movie that I feel were an influence and from that they recalibrated an older glass and modified it to fit 65mm.” 
She continues, “These lenses have the effect of focusing attention on the centre of the frame and dropping elegantly at the edges which is another key to our visual language for Mulan.” 
Lead actress Yifei Liu did most of her own stunts, including sword and martial arts sequences. This meant that in battle sequences Walker could use long 800mm and even 2800mm lenses and still focus on her face through the melee. “Her presence still felt very intimate,” she says. 
Walker went on three location scouts to China, travelling the length and breadth of the country. “I noted that traditional architecture is very symmetrical and Chinese art and painting is often composed with central framing. This is a characteristic of classic Chinese films such as Crouching Tiger Hidden Dragon or those by Zhang Yimou but it’s less of a tradition in western cinema. I felt we needed to centre Mulan in the middle of the frame, aided by shooting spherical, but cropping to widescreen 2.39:1.” 
She had another lens built, based on a Gaussian design, that worked with the 65mm sensor and a 2.39 extraction. “We used that for special moments when Mulan is showing her elite warrior skills. It’s a radial effect with chromatic aberration, which centres her and de-emphasises everything else.” 
Lighting and colour design 
“We knew the desert plates should be shot in the sun, so when we were shooting in New Zealand we aimed to also shoot in the sun, but of course some days were overcast. With a scene that’s so wide you are relying on natural light – and especially where you have horses running – I overexpose the image a little so that when I do lose the sun or have to put in more contrast I can shift it either way in the DI. I sought an exposure that can go either way, otherwise I can’t lift the shadows or I find the highlights are blown.” 
Walker monitored on Sony OLEDs with minor on set CDL tweaks by DIT Chris Rudkin. 
“After shooting digital on a number of films I’ve seen my LUT getting simpler and simpler,” she says. “I find if you have a very tricky LUT that is skewing the colour or contrast it doesn’t always work across the board for every lighting situation – going from night exterior looks to day interiors, with constantly changing light during the day on exteriors. So here I started with a basic ARRI K1S1 LUT with low saturation, which gave us room to punch up some primary colours in post.” 
Rudkin used the CODEX Vault XL with the ALEXA 65 to keep the data wrangling invisible to Walker.  
“We would break the footage in the middle of the day so the dailies colourist could start on what we’d been doing during the day and we could view it by the time we wrapped,” Walker says. “That was a big help. We could check everything, including what the second unit were doing in real time in China on Moxion, which was helpful when we were working in remote locations in New Zealand.” 
She likes to see material at higher resolution, and used PIX for checking continuity on-set, and ensuring that the second unit was on the right track.  
“Our colour palette is devised together with production design and costume design, who based their look on historical research of the fabrics and colours for the Imperial Palace, the guards, the commoners and the Emperor.”  
Caro and Walker worked with VFX supervisor Sean Andrew Faden on background plates, reference plates, and the mapping of lenses. They also created a look bible, as a reference for VFX and for Natasha Leonnet, the DI colourist. 
“What was most important was the red colour for Mulan herself. This isn’t a documentary of course, it’s a Disney film so it had to look beautiful and that meant subtly delivering a heightened colour palette.” 

Monday, 6 July 2020

Sony's new plans could pave the way for software upgradable image sensors

RedShark News
In May, Sony revealed the first two models of its intelligent vision sensors, which the company described as the world’s first image sensors with integrated AI processors. Now the company plans a major shift from hardware sales to “software by subscription” for image sensors that analyse data on location.
While initially directed at CCTV, self-driving and other Internet of Things applications using object recognition, there is scope to extend the idea to photography and computer vision in media and entertainment.
As more and more types of devices are being connected to the cloud, it is commonplace to have the information obtained from them processed via AI on the cloud.
But when video is recorded using a conventional image sensor, it is necessary to send data for each individual output image frame for AI processing, resulting in increased data transmission and making it difficult to deliver real-time performance.
The new IMX500 image sensor products from Sony perform image signal processing and high-speed AI processing (at just 3.1 milliseconds) on the chip itself, completing the entire process within a single video frame. This design makes it possible, says Sony, to deliver high-precision, real-time tracking of objects while recording video.
Fierce Electronics reports that the chip’s AI processing capability came from placing the logic chips on the back of the sensor “thereby giving the sensor more pixels to improve the light sensitivity.”
The extracted data is output as metadata, “reducing the amount of data handled.” This, and the lightning speed of image recognition opens up the possibility of new applications, from the detection of face masks to differentiating between whether a human or a robot is entering a restricted area.
Sony says the sensor “captures the meaning of the information of the light in front of it,” performing image analysis and outputting it all within the chip. In an example used in an explainer video, Sony shows the sensor identifying a basketball and a guitar, for instance.
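To make that data reduction concrete, here is a purely illustrative Python sketch comparing the size of a full image frame against a small metadata payload like the basketball/guitar detections Sony describes. The resolution, byte depth and detection format below are assumptions chosen for the example, not the IMX500’s documented output specification.

```python
# Illustrative only: rough per-frame payload when a camera streams full
# images to the cloud versus on-sensor AI that emits metadata instead.
import json

width, height, bytes_per_pixel = 4056, 3040, 2   # ~12 MP frame (assumed figures)
frame_bytes = width * height * bytes_per_pixel   # roughly 24.7 MB per frame

# Hypothetical detection metadata in the spirit of Sony's explainer example
detections = [
    {"label": "basketball", "confidence": 0.93, "bbox": [412, 310, 188, 190]},
    {"label": "guitar",     "confidence": 0.88, "bbox": [1020, 540, 260, 610]},
]
metadata_bytes = len(json.dumps(detections).encode())

print(f"full frame : {frame_bytes / 1e6:.1f} MB")
print(f"metadata   : {metadata_bytes} bytes "
      f"(~{frame_bytes / metadata_bytes:,.0f}x smaller)")
```

Whatever the exact figures, the point stands: shipping a few hundred bytes of labels and bounding boxes per frame is orders of magnitude lighter than shipping the frames themselves, which is what makes real-time cloud-linked applications practical.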
Just as intriguing is Sony’s business plan for an imaging division that makes US$10 billion a year and which has built dominance over rivals like Samsung and China’s OmniVision Technologies through hardware breakthroughs.
Reuters reports that the sensor software “can be modified or replaced wirelessly without disturbing the camera.”
Sony says it hopes customers will subscribe to the software via monthly fees or licensing, “much like how gamers buy a PlayStation console and then pay for software or subscribe to online services.”
“Transforming the light-converting chips into a platform for software - essentially akin to the PlayStation Plus video games service - amounts to a sea change,” Reuters says.
Analysis of data with AI “would form a market larger than the growth potential of the sensor market itself in terms of value,” explained Sony’s Hideki Somemiya, who heads a new team developing sensor applications.

In order to do that, Sony needs partnerships with internet giants that have an existing development community. And in mid-2019 Sony did just that by striking an alliance with Microsoft to implement some smart camera solutions using Microsoft’s Azure cloud platform.
“These products expand the opportunities to develop AI-equipped cameras, enabling a diverse range of applications in the retail and industrial equipment industries and contributing to building optimal systems that link with the cloud,” the company wrote in a blog post.
However, you won’t find this technology in phones and handheld cameras just yet. These new sensors will, for now, be restricted to commercial applications such as surveillance cameras and smart retail spaces that demand complex computer vision architectures, such as Amazon Go.
That’s not to say the core technology or Sony’s business model can’t or won’t be applied to media and entertainment in future. Super-fast in-camera AI-driven processing of faces and objects might assist applications like performance capture, previsualisation or mixed reality photography.

Future possibilities

Such a chip could have the smarts to reignite light field cinematography by enabling quicker, lighter-weight processing of volumetric data.
If a chip can send metadata to the cloud for processing, the cloud can also send information back to the sensor to upgrade its capability without having to re-engineer it or buy a new body.
More generally, such chips might aid the computer vision in augmented reality glasses, which have stalled in their tracks.
It’s also another chipping away at the bricks and mortar of traditional hardware and black boxes sold into professional environments and all our homes.

Esports assume parity both on and off the field

Copy written for Blackbird
With elite sports around the world halted under lockdown, esports not only filled the gap but assumed a parity that won’t fade as competitions return to the field.
Sports teams and federations, broadcasters and athletes have endorsed esports during the pandemic not as an imitation of the real thing but as an integral part of its future.
By some measures esports audience sizes were comparable to traditional sports audiences even before COVID struck. The largest esports league, League of Legends, boasted higher viewership than MLB, the NBA, or the NHL.
The global esports market was already poised to attain $3 billion in revenues by 2022.
Since the pandemic, gaming as a spectator sport has gone through the roof. For example, Twitch, the leading live gaming platform owned by Amazon, saw audiences rise 31% in March.
Professional sports understand that esports is essential to increasing brand awareness, broadening their audience and providing new revenue streams. From February, many pivoted to digital simulations, with motor racing quickest off the mark. Formula E’s eight-week-long Race at Home Challenge was wildly popular, with ex-F1 driver Stoffel Vandoorne claiming the title. Formula 1 continues to host a series of virtual Grand Prix, with esports and gaming solutions provider Gfinity overseeing the delivery of the tournament and broadcast production. 
Esports viewers are predominantly young and therefore particularly valuable to broadcasters which have also sought to capitalise on demand. Virtual Grand Prix are broadcast live on ESPN; the world’s most famous thoroughbred horse race, the Grand National, was a virtual affair watched by almost 5 million free-to-air viewers in the UK; the National Basketball Association’s NBA 2K League has been airing on ESPN since May—the first linear broadcasts of this major league esports property.
In another sign of the blur between real and virtual world, the ePremier League Invitational Tournament featuring stars from the EPL’s twenty teams was broadcast on Sky Sports, Facebook and YouTube.
The EPL resumed on Sky Sports with a crowd soundtrack provided by EA SPORTS FIFA as further indication that the production values of actual and virtual live broadcasts are indistinguishable.
The leading esports and gaming publishers and developers are bringing their product to fans using Blackbird. Gfinity, for example, adopted Blackbird to frame-accurately clip, edit and distribute highlights from live streams for publishing to digital channels within seconds.
Riot Games, whose titles include League of Legends and VALORANT, has chosen Blackbird for remote fast turnaround video production.
US-based VENN has even taken this moment, as the world emerges from lockdown, to launch a streaming TV network aimed at pop culture and esports audiences. Its cloud-based remote production relies on Blackbird to frame-accurately edit, enrich and publish a wide variety of engaging video content ultra-fast.
Using Blackbird, marketing teams can also rapidly access vast gameplay archives for repurposing across social and web. Sponsor branding and adverts can be added, partners tagged and promotional messaging included to drive monetization.
Whatever the sport, regardless of the sportscaster, esports is a natural extension of commercial strategy. It’s in the game.

Friday, 3 July 2020

Running while standing still: Making HBO's Run

written for Red.com
The creative team behind HBO’s Run pioneered a virtual production technique with potentially far-reaching impact for how future episodic shows are made. Created by Vicky Jones and co-executive produced by Phoebe Waller-Bridge (Fleabag), the show’s deceptively simple premise has two former college sweethearts (Merritt Wever and Domhnall Gleeson) drop everything in their lives to reunite after 15 years apart. The characters spend most of their time travelling across the United States on a train, but cast and crew never planned to leave Toronto.
Cinematographer Matthew Clark (Late Night) explains, “In my first meeting with Director Kate Dennis, Production Designer Denise Pizzini, and the producers, the discussion was all about how we were going to shoot a story which takes place largely on a train hurtling across the country. We ran the gamut of ideas from photographing onboard a real train; partial real, partial studio shoot; or using green screen.”
Stargate Studios in South Pasadena gave them a new direction. It demonstrated how a mocked-up train carriage surrounded by LED screens could display video to simulate a photoreal location.
“What was key is that the display was tracked to the camera movement by a wireless sensor so, as I stood up and shot out of the window, it looked like I was shooting down at the track and if I knelt down I was shooting more of the sky,” Clark explains. “Since we had planned to shoot a lot of handheld the ability to change perspective in real time sold us on the possibility. We then had to figure out if it would work practically.”
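As a rough illustration of the perspective trick Clark describes, the sketch below maps a tracked camera height to a vertical crop of a wide background plate, so a higher camera sees more track and a lower camera more sky. The function name, plate dimensions and height range are invented for the example and are not Stargate’s actual system.

```python
# A much-simplified sketch: the region of a wide background plate shown on
# the "window" display shifts with the tracked camera position. Standing up
# reveals more of the track below; kneeling reveals more of the sky.
def visible_crop(cam_height_m, plate_height_px, window_height_px,
                 min_h=0.5, max_h=2.0):
    """Map tracked camera height to a vertical crop of the plate (hypothetical)."""
    t = (cam_height_m - min_h) / (max_h - min_h)   # 0 = kneeling, 1 = standing
    t = max(0.0, min(1.0, t))
    max_offset = plate_height_px - window_height_px
    top = int(t * max_offset)                      # higher camera -> crop slides toward the track
    return top, top + window_height_px

print(visible_crop(0.6, plate_height_px=2160, window_height_px=800))   # kneeling: sky
print(visible_crop(1.9, plate_height_px=2160, window_height_px=800))   # standing: track
```

A real tracking system would of course solve the full camera-to-screen geometry in three dimensions and in real time; this one-axis version only shows why the view out of the "window" has to change as the operator moves.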
On a soundstage in Toronto, the production built two cars outfitted to resemble an Amtrak carriage. These rested on airbags which could be shaken to simulate movement. Instead of LEDs, a series of 81-inch 4K TV monitors was mounted on a truss outside each train window. The train’s windows were made a little smaller than real life to avoid the camera seeing the edge of the TV screens.
“It’s a smaller scale and less expensive version of Lucasfilm’s production of The Mandalorian but the principle is the same,” says Clark. “It effectively brings the location to the production rather than moving an entire production to often hard-to-access locations.”
At all times Clark wanted to recreate the feeling of being onboard an actual train. “I wanted to make the camera feel as experiential as possible – to put the audience with the characters,” he says. “That meant that any light that played on the actor’s faces or on surfaces had to be synchronized to the illumination outside the windows otherwise the effect wouldn’t work.
“It was important to line up the picture so when you’re standing in the car your perspective of the lines of train track and power lines has to be realistic and continuous. If the angle of the TV screen is off by just a few degrees, then suddenly the wires of a telegraph pole would be askew. The monitors are on a truss so that when we needed to turn the car around to shoot from another angle the grips could flip all the monitors around to the exact angle.”
For this endeavor, he selected Panavision’s DXL2 with the RED MONSTRO 8K sensor in order to work with large-format lenses.
“I wanted to create the sense of shallow depth of field with a large field of view so when we’re inside the train the emphasis of the scene is on the actor not on what’s going on outside,” Clark says. “It is supposed to be overexposed in places to give it a bit of lo-fi realism.
“Sometimes we do focus on the view outside the train which is—let’s not forget—a TV screen that is flat and displays video that is completely in focus. To avoid the whole shot feeling like that, I wanted to control depth of field.
“The camera and lens package helps create an intimacy akin to shooting anamorphic. Crucially, the way this works with the RED sensor is that the image is much more rectilinear, giving a more defined straight edge to the frame and no aberrations on either side of the wider lenses. That’s an important consideration given that we are confined to such a small space.”
Opening up the full MONSTRO sensor enabled him to use large-format Panavision Ultra Speed lenses, typically at speeds from T2 to T2.8 and close-focus distances from 2 to 5 feet.
“The Ultra Speeds did allow for the T1.1 a few times when we wanted to get ‘in the character’s head’ by using razor thin depth of field—one eye sharp or a super soft shot that the character then walks into focus. The fun part of lens/sensor choice was the field of view with relation to the depth of field – perfect for our story.”
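To illustrate just how thin that focus is, here is a hedged, back-of-envelope depth-of-field calculation. The circle of confusion, focal length and focus distance are assumptions chosen for the example (and T-stops are treated as f-stops), not figures from the production.

```python
# Standard thin-lens depth-of-field estimate, with illustrative numbers.
def depth_of_field(focal_mm, f_number, focus_dist_mm, coc_mm=0.05):
    """Return (near, far, total) depth of field in millimetres."""
    hyperfocal = focal_mm ** 2 / (f_number * coc_mm) + focal_mm
    near = focus_dist_mm * (hyperfocal - focal_mm) / (hyperfocal + focus_dist_mm - 2 * focal_mm)
    far = (focus_dist_mm * (hyperfocal - focal_mm) / (hyperfocal - focus_dist_mm)
           if focus_dist_mm < hyperfocal else float("inf"))
    return near, far, far - near

# e.g. an 80mm lens focused at 1.5m, wide open versus stopped down
for stop in (1.1, 2.8):
    near, far, total = depth_of_field(80, stop, 1500)
    print(f"T{stop}: in focus from {near/1000:.2f} m to {far/1000:.2f} m "
          f"(~{total:.0f} mm deep)")
```

With these assumed values the T1.1 setting yields only a few centimetres of usable focus, which is exactly the "one eye sharp" effect Clark describes; stopping down to T2.8 roughly doubles it.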
In pre-production, Stargate’s Sam Nicholson, ASC and Bryan Binder fixed four RED 8K cameras and two Sony VENICE 6K cameras to the outside of a train car and captured footage from Amtrak lines crisscrossing the States, including from Vermont to New York, NYC to Chicago, onward to Los Angeles, and a route from San Francisco back to Denver.
The footage was stitched together at Stargate into a 180-degree bubble for either side of the train “so you could literally see the cars turning in front,” Clark explains.
“My gaffer, Randy Brown, and I would go through the script and figure out how to light each scene in a fairly conventional way. Then we’d find a spot in the VFX footage that I liked as a foundation and shared this with Stargate so they could pixel map the light to match the level on the live-action picture.”
Ambient exterior lighting fixed to the adjustable truss just above the monitors outside the car’s windows was supplied by ARRI SkyPanel S30, S60 and S120 fixtures, augmented on the ground with ARRI L7/L5/L10 LED Fresnels. These were tied to a control board enabling Clark, an operator and the VFX team to tune the lighting. The system was able to deliver a photorealistic moving image, displayed through the train windows, with animated lighting to match the plate, tracked to and composited with the shot in real time.
“For example, if the train is passing in and out of trees then the light would fluctuate or if it travels at night past a red light, that red light would be tracked down the outside of the car. Everything was pixel mapped to the plate footage by Jon Craig at Stargate. We could adjust the color temp, intensity and even the placement of the effect by a combination of traditional dimmer board and VFX map communication.
“In the pilot, there was no baseline there for me so to some extent we were using our best judgement to get it right, but once in the series Jon, Randy, Andrew Read – our fantastic dimmer board operator – and I were able to work out some more detail, making our lighting control much more repeatable.”
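A much-simplified sketch of the pixel-mapping idea described above: sample the region of the plate frame roughly "seen" by a fixture outside a window, average its colour, and use that to drive the light. All names and numbers are placeholders; the production’s actual Stargate/dimmer-board pipeline is not public.

```python
# Illustrative pixel mapping: derive a fixture colour/intensity from a
# region of the background-plate frame (hypothetical, not the show's system).
import numpy as np

def light_value_from_plate(frame_rgb, region, dim_level=1.0):
    """frame_rgb: HxWx3 uint8 plate frame; region: (y0, y1, x0, x1) area
    of the plate roughly 'seen' by one fixture outside a window."""
    y0, y1, x0, x1 = region
    patch = frame_rgb[y0:y1, x0:x1].astype(np.float32) / 255.0
    rgb = patch.mean(axis=(0, 1))               # average colour of the patch
    intensity = float(rgb.mean()) * dim_level   # crude brightness estimate
    return rgb, intensity

# e.g. the train passes a red signal on the left of frame
frame = np.zeros((1080, 1920, 3), dtype=np.uint8)
frame[:, :400] = (200, 30, 30)                  # red-ish area at frame left
rgb, level = light_value_from_plate(frame, region=(0, 1080, 0, 400))
print("fixture colour:", np.round(rgb, 2), "intensity:", round(level, 2))
```

In practice the mapping runs per frame and per fixture, so a red signal drifting through the plate sweeps a matching red cue down the outside of the carriage, which is the effect Clark describes.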
For even greater realism, Pizzini included reflective surfaces in props and materials which flicker and shift with the lighting design from super dark to bright and modern.
“Any glass on the set, even water glasses, reflect the effect of light and picture from the outside world. The characters even take a couple of shots on an iPhone out of the window—which are in fact of the monitor. We could create these layers of actuality because we are able to work with the image right there, something that has always been troubling with greenscreen.”
While some scenes were lit just using the monitors, it proved challenging to augment the window illumination, especially for shots framed close up.
“Being able to get an additional light in to augment the feeling of sunshine coming through a window was a struggle,” Clark says. “It was problematic to move a light near enough so that it wouldn’t interfere with the view of the monitor through the camera.”
Considering the volume of scenes featuring the train in the seven-episode series, it is remarkable how few shots needed additional treatment in post. Clark estimates that just 10 percent of scenes required additional greenscreen as coverage when a character moved quickly through a carriage.
“Usually doing VFX takes a lot of time and productions typically don’t grant the DP much time to work it out in post, but on Run I could set exposure based on what I saw there. Sometimes I wanted to see every detail of the trees, other times I wanted it blown out. I felt this approach maintained a sense of story for Kate, myself and the actors without having to revisit the scene and manipulate the photography later.”
Naturalism was also the tonal key for Clark’s color palette, which is predominantly desaturated blue and silver, in keeping with Amtrak livery, with shadow greens for the landscape.
“We didn’t want to tip our hand and make it super colorful which is the ‘go to’ for comedy. There are moments of gravity and mystery which is why a more muted palette is suitable, especially in the pilot to set the mood. As the story proceeds, the looks evolve but all the while we’re aiming for naturalism and not to push too far in one direction.”
As processors get faster and high-resolution displays become more affordable, the real-time combination of photographic and CG assets will become much more prevalent.
“I do think virtual production will change the way a lot of shows are made,” Clark says. “For me, it’s important that when you make a choice to build a set and ring it with LED screens or monitors you don’t lose the naturalism you get for the location. That is what you are striving for. You don’t want the technology to get in the way of the storytelling.”

Thursday, 2 July 2020

Virtual Production is the hottest thing in North Wales

RedShark News
In 2009, Avatar pioneered virtual production and, by all accounts, Avatar 2, due in 2021 (with its challenging use of actual water rather than CG fluids), is still state-of-the-art. But you don’t need a $250 million budget to merge the physical with the digital into one photoreal story.
Far from it, and in the UK, it is On-Set Facilities (OSF) which is leading the charge.
Set up in 2016 in Corwen, North Wales by Asa Bailey, a former digital creative director at ad agency Saatchi and Saatchi, OSF has grown into a fully managed virtual production studio covering in-camera VFX (LED), mixed reality (green screen), and fully virtual (in-engine) production.
It has sales and support offices in Madrid and LA and partnerships with leading professional camera tracking firms NCam and Mo-Sys and just recently signed with ARRI to become an ARRI certified virtual production partner.
Bailey, who is CEO and CTO, said of this alliance: “We can use our knowledge of virtual production systems to supply global clients with a lot more first-class on-set technology. We can now design and provide complete, turnkey, on-set VFX and virtual production studio solutions with total confidence and better meet the growing on-set technology needs of major studios, production companies, and broadcasters.”
ARRI says its “cross-disciplinary competence” in 4K/HDR camera systems, postproduction, and rental, combined with “a deep understanding of content production workflows and environments and expertise in state-of-the-art lighting”, places the ARRI System Group ahead of the competition.
Sony might disagree. Its VENICE cameras were chosen ahead of the competition by James Cameron to shoot the next Avatars.
Nonetheless, the pact with OSF could send ARRI-led productions OSF’s way and, at the very least, means ARRI camera data will have a solid pipeline into OSF’s real-time production tools.

Virtual production systems

These include camera tracking systems, LED screens for digital backlot and virtual camera modules as well as plug-ins to third party animation and software assets that work with Unreal Engine.
These include the Rokoko mo-cap suit, which can stream directly into UE via Live Link (as demoed by OSF), and 3D character creation tools from Reallusion.
OSF also uses Apple’s LiveFace app (available for download on any iPhone with a depth-sensor camera) and its own motion capture helmets to capture facial animation. 
The firm’s Chief Pipeline Officer, Jon Gress, is also Academic Director at The Digital Animation & Visual Effects School at Universal Studios in Orlando.
OSF even has its own virtual private network, in beta, connected to the Microsoft Azure cloud for virtual production. From a network perspective, StormCloud is designed to connect many thousands of remote VPN users in Unreal Engine. Entry points currently set up in London and San Francisco are being tested by “a number of Hollywood Studios and VFX facilities,” says the facility.
“Filming in-engine and previewing in real time, with pre-production and editorial teams also connected to StormCloud, virtual production is blurring the lines between what happens on physical sets and on virtual sets hosted in the cloud,” says Bailey, who also acts as virtual production director.

The rise and rise of virtual production

Most major films and TV series created today already use some form of virtual production. It might be previsualization, it might be techvis or postvis. Epic Games, the makers of Unreal Engine, believe the potential for VP to enhance filmmaking extends far beyond even these current uses.
Examined one way, VP is just another evolution of storytelling – on a continuum with the shift to colour or from film to digital. Looked at another way it is more fundamental, since virtual production techniques ultimately collapse the traditional sequential method of making motion pictures.
The production line from development to post can be costly, in part because of the timescales and in part because of the inability to truly iterate at the point of creativity. A virtual production model breaks down these silos and brings colour correction, animation, and editorial closer to camera. When travel to far-flung locations may prove challenging, due to Covid-19 or carbon-neutral policies, virtual production can bring photorealistic locations to the set.
Directors can direct their actors on the mocap stage because they can see them in their virtual forms composited live into the CG shot. They can even judge the final scene with lighting and set objects in detail.
What the director is seeing, either through the tablet or inside a VR headset, can be closer to final render – which is light-years from where directors used to be before real-time technology became part of the shoot.
Productions with scenes made using games engines, camera tracking and LED screens or monitors include Run, co-executive produced by Phoebe Waller-Bridge, and Bond 25, No Time to Die, which Waller-Bridge co-scripted.
Coincidence? Yes of course it is.