Saturday, 6 June 2020

Sport's Project Re-start

IBC
How elite sports plan to social distance and get fans closer to the action
Four-year-old ‘Im Sophie’ was the unusual centre of a media scrum on Monday 1 June. The greyhound won the 10:21 at Perry Barr, the first race in the first professional sport in the UK to return following lockdown. Galvanised by the prospect of live sport, the national press seized on the event in anticipation of greater things to come.
Elite sports have the nod to go live. Project re-start is a major exercise in social distancing and leaner on-site workflows that has got every broadcaster and sports producer scrambling to adjust.
“In a nutshell I would say we’re looking at every single contract having a completely different production model post-COVID,” says Frank Callaghan, Director of Live Programming and Production, Gravity Media.
Sport is entering a period of uncertain duration in which competition is allowed to resume only within strict biosecure bubbles. Sports teams aside, this includes continued restrictions on travel for production staff to and from hubs as well as fewer crew at venues.
Even then, a behind-closed-doors Formula 1 race is estimated to require around 1500 people, highlighting the scale of the logistics required to keep everyone safe.
When it resumes on June 15, the English Premier League will be more of a hybrid outside broadcast than it was when play ended in March. There’s no getting around the need for camera operators and technicians on site, but reducing the on-site footprint is imperative, says Nick Symes, Gravity’s Technical Director.
Remote comes to the fore

Gravity Media’s soccer work ranges from producing world feeds of the EPL to acting as the BBC’s production partner for the FA Cup, which resumes with quarter-final action on June 27.
“The most pressing issue is how we operate from a gallery,” says Callaghan. “Normally you’d have a PA, a director, and EVS and graphics operators all in close proximity with a very fine-tuned workflow. We’re finding a way to bring feeds back to production centres, then add graphics and a vision mixing desk with remote operators, possibly even doing commentary remotely. Nothing is off the table.”
Gravity has installed a remote talkback system enabling the team to produce greyhound and horse races with key staff working from home. It also operates post-production and studio hubs in the UK and Australia, meaning that it could mix an Aussie game in the UK – or vice versa – if necessary.
Remote production only goes so far. Mobile units are necessary for most live sport production and these are very cramped at the best of times.
“To overcome this, broadcasters can add extra units to provide greater spacing,” informs Glenn Adamo, Managing Director of The Switch Production Services. “They can also provide plexiglass partitions inside the truck and ensure all surfaces are sanitised before and after each show – as well as monitoring temperature readings to avoid the spread of the virus. Operators will have to certify that the air filters are highly efficient and changed regularly to capture particles. Headsets will have to be disinfected daily. Announcers could similarly be socially distanced in the announce booth.”
Substituting the crowd
Social distancing aside, empty stadia are the main editorial concern. The lack of crowd colour, noise and ambience risks diminishing the broadcast product. Various innovations are likely to be introduced, suited to different sports and dependent on clearing rights.
“This represents an opportunity for broadcasters to be novel and creative,” says Paolo Pescatore, Tech, Media & Telco Analyst at PP Foresight. “More replays and greater focus on players. Expect greater focus on the dug outs to hear what’s going on behind the scenes. Some fans might even enjoy no atmosphere as they will be able to hear every kick, pass, shot and instructions from the bench. Switching commentators off should be an option.
“We might even see some cameras being operated remotely,” he predicts. “This could allow for more cameras to be onsite to provide different angles. However, there will be a supply issue given the number of matches that need to be completed for the current season.”
The German Bundesliga is the canary in the coalmine. As the first major European football league to resume following the pandemic, its behind-closed-doors trial has informed the approach of the EPL and its broadcasters BT Sport and Sky Sports (plus the BBC and Amazon).
The Bundesliga production is familiar but with significant tweaks. It chose to go live without crowd audio, although this proved so alien that local German stations and US distributors like ESPN added fake crowd noise.
Its directors framed the players at tighter angles to cut out empty stands and to capture more of the players’ emotional reactions as a substitute for crowd response. Cameras close to the pitch were repositioned to provide additional safety buffers.
All of these can be expected to crop up in the EPL’s reappearance.
Other possibilities include the use of second screens to enhance the viewing experience, “perhaps allowing fans to pick the angle close to where they are used to sitting in the ground,” speculates Callaghan. New sets of dynamic graphics and the use of augmented reality are on the cards.
The AR³ Football software from wTVision is one product capable of generating virtual crowds as well as tied-to-the-field augmented reality graphics like “man of the match” infographics. It is being used for the resumption of the Portuguese Primeira Liga by broadcaster BTV.
A Danish league game on 25 May included a 40m x 3m screen in front of one stand, showing 10,000 fans watching via Zoom video link. Players could hear viewers in the virtual grandstand chanting.
Other ideas being discussed by the Premier League’s broadcast advisory group include integrating crowd noises from football sim game FIFA 20 into broadcasts, 360-degree replays, coaches and subs being interviewed live during match play and a new tactical camera feed.
“For sure there will be a bigger push towards social and immersive features at home,” Pescatore says. “This could include extended highlights, restart, in-game highlights. Who knows, maybe footballers and athletes will be the new broadcasters providing behind the scenes footage directly from their smartphones.”
With public viewing in pubs and bars currently closed, the in-home TV broadcast of matches is now all the more valuable for fans and broadcasters. Although customers paused their Sky Sports subscriptions during lockdown, this new opportunity gives Sky the chance to come out of the pandemic strongly.
According to Ampere analyst Daniel Harraghy, “As well as the chance to attract new subscribers, Sky will receive financial aid from the Premier League in the form of a rebate … expected to be worth £330m, shared between Sky and BT with the majority of this figure going to Sky.” 
The benefits of being on site are tangible. Producers can instruct ENG crews to react very quickly to events, for example.
“With a disparate production team where comms are more lumpy, things inevitably take longer and that can affect production value if you don’t have the agility to react,” says Symes. “It has been a challenge since we’re all used to working in a certain way and expect to go into a gallery and sit next to the person pushing the buttons on the vision mixer or to shout at the EVS op. COVID-19 has deconstructed that thinking.”
He says, “Every production company and broadcaster is looking at what everyone else is doing in the market. It will be very fluid over the coming months. There is going to be a lot of trial and error.”

Three sports already up and running

Bucking the trend
Professional Bull Riders (PBR), a subsidiary of IMG, was the first professional sport to return to competition in North America with a series of closed-to-fans events in Oklahoma beginning 25 April, then throughout May, broadcast on CBS Sports Network.
The production crew is approximately 20% smaller than pre-COVID. “There are fewer manned cameras, which means that a smaller crew is working harder and doing more things, with most everyone becoming a utility player,” explains Robby Greene, head of TV production for PBR.
Normally, a PBR event would have 8 to 9 manned cameras. To limit the number of crew, it is using some robotic cameras and POVs. Additionally, outside the TV broadcast, two cameras are typically assigned to the in-arena production. Obviously, without fans in the arena viewing a Jumbotron, those cameras aren’t used, nor are the usual producer and director needed for the in-arena video production.
Crews have to follow PBR’s overall safety protocols. “They’re part of a working group, which must remain separated from other groups such as bull riders, bull stock contractors, judges and medical personnel,” Greene says.
At the Lazy E Arena in Oklahoma, PBR had 23 working groups. About half of these made up the television broadcast operation.
“Within the working group, there is social distancing. All participants, including the crew, must be medically tested before they are allowed into the arena. We’re testing for both presence of virus and antibodies. Everyone is quarantined until testing negative for coronavirus, and then they can proceed in manning the production.”
The OB truck is a particularly protected bubble. “One half of the crew in the truck uses one door, the other half uses the other door,” he reports. “They’re all required to wear masks, which is a challenge with a headset on, but one we need to manage during this time.”
PBR doesn’t produce remotely. CBS’s play-by-play announcers and colour analyst are in the arena, socially distanced. Sideline interviews are done at a distance with boom mics.
“Events without fans have created a very different atmosphere,” he says. “It’s a lot more raw – fans can hear the clanging of metal gates and riders cheering for one another. At the three closed weekends in Oklahoma, we didn’t hide visuals of empty seats. Our opening shots showed the full arena, but during the bull riding, we did opt for tighter shots.”
For the upcoming PBR Monster Energy Team Challenge at the closed set at South Point Arena in Las Vegas (June 5-28), the team have created the Let ‘Er Buck Saloon in the Monster Pit, featuring LED boards and an array of speakers. Rather than fans seeing a backdrop of an abyss of empty seats, the video wall will carry a moving picture show of various graphic elements played to loud music.
Come out fighting
The Ultimate Fighting Championship (UFC) returned on 9 May live from VyStar Veterans Memorial Arena in Jacksonville, Florida. After three events, UFC returned to Las Vegas for a month-long series of matches hosted out of the UFC APEX facility. The events will be closed to fans and only essential event personnel will be in attendance with a number of increased health and safety measures in place including strict quarantine procedures leading up to the events.
Overall broadcast production personnel have been reduced by 40 percent. The announcers are seated at separate tables at least six feet apart.  Other production personnel are required to wear PPE at all times.  Plexiglas partitions have been installed in the production truck. Media interviews with fighters (both before and after the event) are conducted virtually.
Essentially, all of the cameras used for normal fight coverage are being used except those used to capture crowd shots.
A UFC spokesperson explains, “There is an opportunity to enhance the broadcast on the audio side.  Since a lot of the microphones that we use to capture the atmosphere aren’t capturing crowd noise, we’ll be able to focus solely on fight action inside the Octagon.  The production team will look to incorporate not only the jolting blows during fight action but also the interactions between fighters and their corners.
“During the fight, you’re really going to hear those punches and kicks that are landing.  When we’ve got 8,000 people in the stands, the viewer at home doesn’t always get the luxury of hearing exactly what it sounds like when a kick lands to the body.  We’ll try to bring that sound into the broadcast more than we would in the past because we aren’t battling thousands of screaming fans.  We will also be able to get very clean, clear audio from the fighters’ camps as they’re calling out instructions both during the fight and between rounds.”

Let’s go racing
Arena Racing Company (ARC) was the first pro sport to get back up and running in the UK. Gravity Media serves UK betting shops with feeds from ARC horse and greyhound tracks and content to ARC’s international partners in Australia, Greece, the U.S. and more.
Production was part-remote before the pandemic in that pictures from the track were relayed to a permanent production gallery housed at Gravity’s Chiswick-based production HQ.  Facilities there include an EVS XT3 for feed ingest and replays, design and implementation of Aston 3D graphics with integration to live betting data, and dedicated audio capability.
The channel director, technical supervisor and EVS operator would normally be housed in the gallery with a commentator in adjacent voice-over booth. On resumption, all those key roles are being fulfilled remotely from home (with no EVS replays for the first couple of weeks back).
“We receive the video feeds into our MCR as normal [via NEP Connect which are managing social distancing at the track] with gallery operations stitched between production and managing the service output,” explains Nick Symes.
Gravity implemented remote access to MCR control systems and a low latency (half to one frame) streaming solution which provides each operator with a multi-view of feeds. Another video stream is used by the commentator, also working from home.

Friday, 5 June 2020

Here's What We Know about the Super-Secret Tech Behind 'Avatar 2'


No Film School

It’s the most hotly anticipated sequel in Hollywood and also its biggest gamble. James Cameron’s 2009 sci-fi spectacle grossed $2.78 billion at the box office for Fox, but it will be twelve years since the original’s release when Avatar 2 premieres in December 2021. With a billion dollars spent on production spanning all four planned sequels, Disney will have its fingers crossed that there’s audience appetite for more tales from Pandora.
You would be a fool to bet against it. If anyone can shepherd a franchise into profit and longevity it’s the Marvel-Lucasfilm-Mouse House. Naysayers at the time of Avatar’s debut were quickly silenced by the astonishing traction generated by the film’s fusion of 3D cinema and high-concept storytelling.
The sequels are equally anticipated in tech circles for their approach to production. The original was a bona fide groundbreaker in pushing forward virtual production techniques such as the use of a virtual camera for the director to visualise CG characters and backgrounds in realtime on a live action set. Only in the last few years, with the advance of game-engine rendering and LED screens as backdrops, has this become standard on films like The Jungle Book or shows like The Mandalorian.
The secrecy surrounding not just plot but production technology is all part of building a sense of excitement ahead of release, as producer Jon Landau knows only too well when posting behind the scenes shots on Instagram. Recently more information has been revealed.
Cameron committed to shooting the sequels in high frame rates almost as soon as Avatar was released, but has never committed publicly to the new film’s exact specification.
Until now. In an exclusive interview https://www.ibc.org/trends/behind-the-scenes-avatar-2-and-3/6007.article with IBC, the European counterpart to broadcast and film trade show NAB, Cameron’s production company Lightstorm Entertainment goes on the record to state that Avatar 2 and 3 (which are being shot and produced in parallel) are captured in 3D 4K at 48 frames a second and in High Dynamic Range.
“Massive amounts of data is being pushed around live every minute,” says Geoff Burdick, senior vice president of Production Services & Technology, Lightstorm Entertainment. “We needed HFR and high res and everything had to be in 3D. This may not be the science experiment it was when shooting the first Avatar but the sync for 3D at those higher frame rates and resolutions is still an issue. Alerting camera to issues is a big part of our job.”
Burdick goes on to explain that the issues he is on particular lookout for during the live action shoot concern stereo capture: ensuring parity between left and right eye lenses (on up to three stereo pairs shooting simultaneously), whether an iris is mismatched, whether the zoom is offset, or whether there are rotational axis issues.
“There are critical camera adjacent monitors for our DP and focus pullers who are working in convergence and dialling in interocular and can look perfect but my small team and I see the same feed live and I can radio to Jim that we have an issue. Nobody would have seen that without this set up.” 
The set-up he refers to is a mobile screening room that simulates the theatrical environment right at the point of capture. The ‘pod’ houses a Christie Digital 3D projector capable of projecting DCI-compliant dailies onto a large screen. While it’s not unusual for high-end shows to screen dailies in this way, this is probably the most state-of-the-art example, since the dailies are being projected in 3D.
The 3D rigs comprise multiple 6K-capable Sony Venice cameras, with the optical blocks disconnected from the camera body by a cable at distances of up to 20 feet. By lowering the weight and improving ergonomics, Cameron and DP Russell Carpenter can wield the cameras with greater flexibility and freedom.
Burdick also describes the workflow enabling the data to be transported around set. The glue is a series of Blackmagic Design boxes which take the signal direct from the Sony Venice cameras and convert it into multiple combinations for thorough on-set reviews. These include 3D 48fps in 2K and 4K, 3D 24fps in 2K and 4K, and 3D 24fps in HD plus SDR and HDR variants.
While the film’s final theatrical release format has not been announced, it is my understanding that Avatar 2 will only have select scenes shown at high frame rate. Unlike experiments by Ang Lee (Billy Lynn’s Long Halftime Walk, Gemini Man), which were given a 120fps full-picture release, Cameron intends to master only certain sequences of his film at HFR 48fps.
This would chime with his recent comments to Collider that HFR’s primary use is to smooth the strobing of action sequences or fast pans that occurs when playing back stereoscopically at 24fps.
“I have a personal philosophy around high frame rate, which is that it is a specific solution to specific problems having to do with 3D,” Cameron said. “When you get the strobing and the jitter of certain shots that pan or certain lateral movement across frame, it’s distracting in 3D. To me, [HFR is] just a solution for those shots. I don’t think it’s a format. I think it’s a tool to be used to solve problems in 3D projection.” 
The one plot point which has been widely shared is that significant scenes will take place underwater. Not just CG water, but with the actors trained to hold their breath for up to four minutes, while wearing performance capture suits and no scuba breathing gear.
The problem with filming this is not the underwater part, of which Cameron has extensive experience not least shooting the actual Titanic in the 3D documentary Ghosts of the Abyss, but the interface between the air and the water, “which forms a moving mirror,” he explained to The Independent newspaper.
“That moving mirror reflects all the dots and markers, and it creates a bunch of false markers. It’s a little bit like a fighter plane dumping a bunch of chaff to confuse the radar system of a missile. We’ve thrown a lot of horsepower, innovation, imagination and new technology at the problem, and it’s taken us about a year and a half now to work out how we’re going to do it.” 
Part of the solution involved covering the surface of the tank in small white balls that prevent overhead studio lights from contaminating the performance capture system below… while still allowing anyone below to surface safely through them should the need arise. 
The interview with Burdick pertains only to the live action shoot on stages in Wellington, New Zealand. There are two other major stages of Avatar production: a performance capture of all the principals, which was filmed first and finished for actors including Kate Winslet as long as two years ago; and a virtual production (at Manhattan Beach Studios near LAX) in which performance capture assets animated at Weta Digital are used by Cameron and his editorial team (including five lead editors) to shape the narrative.
This aspect of production, including use of games engine tech, is still being kept behind closed doors.
It is also worth noting that, should Avatar 2 take a bath at the box office, sequels 4 and 5 may not be greenlit by Disney.





Streaming Protocols Go 4K And Interstellar

RedShark News

The power of live video over the internet got a couple more jolts in the arm this week when financial services giant Bloomberg upgraded its streaming service to 4K, while millions of people caught the space bug watching NASA’s SpaceX launch facilitated by streaming contribution links.
Both were enabled by competing protocols designed to deliver low-latency live video over the internet while eliminating jitter.
 Zixi, which is the leading internet streaming protocol in terms of customers, added Bloomberg TV+ to its roster by helping it deliver a claimed ‘market first’ 4K live transcode processing and distribution of the subscription service’s programming. The direct to consumer service is streamed to desktop, tablet and mobile web devices and now available in full 4K ultra-high definition live on Samsung TV Plus, an app pre-installed on some Samsung 4K tellies.
The UHD streams from Bloomberg studio cameras to consumer screens are of broadcast quality and speeds, according to Bloomberg, and include a transcoding process to HEVC which takes all of 300 milliseconds. 
Zixi explains that its ‘advanced WebVTT’ implementation holds and controls metadata throughout the entire broadcast workflow, facilitating the legal mandate to include closed captioning ‘along with the precise placement of the time code and frame rate for unique and individualized monetization.’
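WebVTT itself is an open W3C format for timed text, so the captioning data being threaded through that workflow is, at its simplest, a header plus timed cues. Purely as an illustration of the kind of data involved (this says nothing about how Zixi signals or transports it), a few lines of Python are enough to write a valid caption file; the file name and cue text below are invented.

# Illustrative sketch only: write a minimal WebVTT caption file with two cues.
def write_webvtt(path, cues):
    # cues: list of (start, end, text), times as 'HH:MM:SS.mmm' strings
    with open(path, "w", encoding="utf-8") as f:
        f.write("WEBVTT\n\n")
        for i, (start, end, text) in enumerate(cues, 1):
            f.write(f"{i}\n{start} --> {end}\n{text}\n\n")

write_webvtt("captions.vtt", [
    ("00:00:01.000", "00:00:03.500", "Markets opened higher in early trading."),
    ("00:00:03.500", "00:00:06.000", "The index is up half a percent."),
])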
Cloud video platform Blackbird is already being used by Bloomberg Media for fast turnaround news editing and social publishing and not coincidentally Blackbird recently declared its backing for Zixi.
“This partnership with world leading technology provider, Zixi, is an integral part of Blackbird’s OEM strategy,” said Blackbird CEO Ian McDonough at the time. “Zixi is the default standard for the ingest and distribution of live video. Blackbird enables customers to distribute and syndicate broadcast quality video content to digital platforms. With our customers insisting on using these two great technologies together, we are very happy to have partnered to bring this offering to market.”
Rival protocols
Perhaps the biggest difference between Zixi and rival protocol SRT from Haivision is that SRT is open source and therefore free to use.
Haivision can call on some high profile users too and none more eye-catching than helping facilitate the launch of the Falcon 9 rocket from the Kennedy Space Center last weekend. The historic moment marked the first time in nine years that astronauts from the US travelled to the International Space Station from US soil.
Haivision’s own Makito X video encoders and decoders streamed live video from the launch pad for real-time monitoring at the NASA and SpaceX control rooms. Using SRT, the teams sent bi-directional audio and video feeds between each of the control rooms to communicate during the launch.  
In addition to that, Haivision’s video streaming solutions already power live and on-demand video streaming and IPTV workflows across NASA facilities.  
While SRT and Zixi work just fine for closed networks such as these, broadcasters wanting to transmit live events at scale over the internet to slash the cost of dedicated fibre and satellite links want the flexibility to work with any professional ISP and any set of vendor decoders and encoders.
A new format is riding to the rescue. Reliable Internet Stream Transport (RIST) is intended as a vendor neutral specification for an interoperable protocol and is rapidly amassing support. While the days of SRT and Zixi are far from numbered – indeed both support RIST since doing so also enables them to sell more equipment – there are some who think that RIST will ultimately become the de facto protocol as broadcasters make IP their primary means of transmission.
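For a sense of how low the barrier to entry now is, both SRT and RIST are already exposed in common open-source tooling. The sketch below is a generic illustration, not any broadcaster’s workflow: it assumes an ffmpeg build compiled with libsrt/librist support, and the hostnames, ports and encoder settings are placeholders.

# Hedged sketch: push a contribution feed over SRT (or RIST) using ffmpeg.
# Requires an ffmpeg build with libsrt / librist; addresses are placeholders.
import subprocess

def push_contribution(source, dest_url):
    cmd = [
        "ffmpeg", "-re", "-i", source,      # read the source in real time
        "-c:v", "libx264", "-preset", "veryfast",
        "-c:a", "aac",
        "-f", "mpegts",                     # MPEG-TS is the usual contribution container
        dest_url,
    ]
    subprocess.run(cmd, check=True)

push_contribution("venue_feed.mp4", "srt://receiver.example.com:9000")
# or, over RIST:
# push_contribution("venue_feed.mp4", "rist://receiver.example.com:9001")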


Wednesday, 3 June 2020

Distributed production under lockdown and beyond

copywritten for Sohonet
pp24 MESA Journal Spring/Summer 2020 https://www.mesalliance.org/wp-content/uploads/2020/06/20MESpringrevlorez4.pdf




As a society, we are all using remote technologies to remain connected during this unprecedented time. Whilst this shift has intensified in recent weeks, it is not a new phenomenon within media and entertainment, and we can assume that when the world returns to relative normality, the requirement for teams to work ‘together apart’ will remain fundamental.
In fact, we believe that the enforced experiment that the whole creative industries sector is taking part in will lead to a revelation about how remote distributed workflows are perceived with long lasting impact on business culture and economics.
When social distancing measures were mandated a few weeks ago our advice to customers and industry colleagues was simple: focus on 1) connectivity (cloud or remote workflows), 2) security, 3) controlling the creative work station, and 4) sharing your work with your colleagues or customers.
These basic steps haven’t changed except that it’s clear to us that in the scramble to get up and running many decisions taken in the first instance are likely to now impact the overall effectiveness of your workflow.
Chief among these is security. Prior to current events, it would have taken an act of Congress to get approval to bring your pre-release content home.  Unless you had the clout of someone like Michael Bay you would never have asked to bring the actual content (the workstation itself, the hard drives) to your house. 
While there is definitely a cohort of creatives working purely through cloud resources, we think the majority of productions are seeking continuity from freelancers and staff who took workstations home. While no one is consciously putting their customer’s IP at risk, the number one mistake we are seeing people make, regardless of craft, is inadvertently connecting their workstation directly to the internet.
It’s a lot to take in when you feel like you are finally getting comfortable with your (temporary) ‘new normal’, but ensuring an ‘air gap’ between your workstation and the internet may be the most important thing you do to protect work and to ensure that when we all return to ‘business as usual’, the option to work remotely becomes an ordinary and easy to deploy choice.
The next issue is sharing.  When thinking through this problem, you need to think about the video problem (frame rate, color fidelity) and the people problem (how many remote viewers, is the sharing synchronous in real-time or can everyone comment on their own timeline asynchronously). You also need to think about how to share the ‘deliverable’. 
‘Synchronising’ your tools as you would in the office is easy when you have tons of bandwidth to send and receive, but most home connections have asymmetrical speeds where the upload is significantly slower than the download. In this scenario, sending the deliverable is a better approach than synchronizing everything, supplemented perhaps with a ‘sync’ at the end of the workday, which can run through the night if needed.
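As a concrete, deliberately simplified sketch of that pattern, the script below pushes the small finished deliverables immediately and leaves the full project sync to trickle up overnight under a bandwidth cap. The paths, remote host and cap are placeholders, and it assumes rsync over SSH is available from the home machine.

# Simplified sketch of "send the deliverable now, sync everything overnight".
# Paths, host and bandwidth cap are placeholders; assumes rsync over SSH.
import datetime
import subprocess

DELIVERABLES = "/Volumes/Work/deliverables/"   # small, finished outputs
PROJECT_ROOT = "/Volumes/Work/project/"        # the full working set
REMOTE = "facility.example.com:/ingest/"

def send_deliverable():
    # Push only the finished deliverables over the thin home uplink.
    subprocess.run(["rsync", "-av", DELIVERABLES, REMOTE], check=True)

def overnight_sync(bwlimit_kbytes=2000):
    # Full project sync, capped so it can run through the night.
    subprocess.run(
        ["rsync", "-av", f"--bwlimit={bwlimit_kbytes}", PROJECT_ROOT, REMOTE],
        check=True,
    )

if __name__ == "__main__":
    send_deliverable()
    if datetime.datetime.now().hour >= 19:     # kick off the big sync after hours
        overnight_sync()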
A remote collaboration tool like ClearView Flex can be used to invite colleagues to a secure live stream. You can do a live edit session or review and approval in real time, with over-the-shoulder instruction like ‘back up two frames’, ‘cut this’, ‘tweak that’, much as you would in a suite.

Color grading is a different story. It’s not unusual for colorists to take home their project on Baselight or Resolve but the catch now is that ‘critical review’ quality output is not yet possible from any cloud tools (though many industry players are working hard to solve this).
Restrictive home bandwidth will make SDR color grade reviews tricky, let alone HDR passes. It is not possible to finalise work in a projection theatre or on professional-grade reference monitors. Your studio can’t send someone to calibrate your home monitor, and we can’t send you complicated pieces of kit and expect you to install them on your own. Any solution in this unique situation has to be really simple and supported by phone, so we can get people working as best as possible.
Society will overcome the current situation and gradually return to work. This will likely be in a phased pattern, dependent on many issues, but key creative talent such as colorists are likely to be prioritised by film and TV productions desperate to finalise shows.
There will be positive consequences resulting from production lockdown. Chief among these will be an enlightened attitude in Hollywood and beyond to the practicality and benefits of a distributed content-production workforce. Production in any industry will be impacted to such a degree it is worth delineating this as Before Covid and After Covid (or BC/AC).
The economics of the whole industry will be hard hit, so studios will be looking for even more cost-efficient means of production going forward. With approximately 30% of a film’s budget associated with travel, remote working will now be central to the producer’s equation.
In a time when the industry globally is increasingly conscious of its carbon footprint, and with individuals naturally wary of taking any kind of flight in the immediate aftermath, we again see remote work as core to production.
Where BC remote collaboration was seen as a necessity for an overbooked director, or was the prerogative of a key creative to have, say, review and approval dialled in to their location, in AC it will become the norm for any member of production, from director and VFX supervisor to editor and production execs, to do more of their work together from where they live, reducing costs and improving speed and agility.
We’ve done the hard yards. Everyone is innovating and solving the problems now. Nothing will be this hard again. We will move from BC in which 10% of the industry had tried remote to one where 85% did so overnight. And guess what? It’s proven, battle hardened in trial and error. Going back to a normal remote work environment with access to high bandwidth, professional grade tools and visiting tech support will seem like a breeze.
Virtual remote work is no longer a ‘love to try it some day’ scenario but one in which we all have real world experience. Executives too.
That’s why remote distributed production will no longer be an academic discussion. It will be ‘Yup, we can get this done.’

Elite sport poised to bring fans closer and safer

copywritten for Blackbird

The great production restart is on and as ever sports is leading the way. Domestic and international top-level sports are taking steps to resume professional competition albeit behind closed doors and with strict biosecurity measures in place for players and coaching staff.
The Bundesliga resumed fixtures on May 16 followed by the Danish soccer league on May 28. Italy’s Serie A clubs have voted to restart the season on June 13 and the English Premier League could kick-off soon after. 
Formula 1 is targeting the Austrian GP on July 5 to start the 2020 season and, in the U.S, the PGA Tour plans for tournament golf to return in June, with the year’s first major, the PGA Championship, scheduled for early August.
While getting sport back on track is key to maintaining competition integrity – not to mention bringing a bit of normal life to millions of fans – there are financial and legal obligations which make a restart attractive and necessary.
The first collision team sport in the world to return was Australia’s National Rugby League (NRL) when round three of the Telstra Premiership kicked off last week. The fact that 4.5m Australians alone watched the first round of matches is a testament to the hunger of sports fans for live action.
Like so many sports federations, the NRL is going direct to fans by streaming match day content to 800,000 fans globally with an NRL Watch account. Streams of the eight games played last week are run through the Microsoft Azure cloud and delivered live while the NRL’s digital team based in Sydney use Blackbird to rapidly edit and publish clips and highlights during and post-match to social channels.
New South Wales state Premier Gladys Berejiklian welcomed the game being back on TV as giving “some normality back” while Alex Glenn, captain of the Brisbane Broncos, said: “We are the first sport that is going to be back in Australia and televised,” adding, “I think the whole nation, New Zealand and England are going to be watching so it’s a huge occasion.”
Remote production operations are one of the critical ways in which sports can ease the burden of social distancing at events while delivering all the action on the field to viewers.
This is fantastic news for Blackbird users such as Eleven Sports, Deltatre and IMG who collectively use Blackbird to create content fast and efficiently for a wide variety of sports including the NFL, European Tour, EuroLeague Basketball, FIBA, UFC, Champions League football, the Bundesliga and Serie A. Clubs who use Blackbird such as Liverpool, Arsenal and the Buffalo Bills will benefit too.
Our sports-starved quarantined population is eager for action. The prospect of certain elite sports, like the EPL, being shown free to air could see viewing figures explode.
Yet hosting events in a neutral venue and without a stadium atmosphere could impact the value of the live experience. To counter this, the EPL is proposing a range of ideas: changed camera angles to mask sparse arenas, TV interviews with players as they leave the pitch at half time, micing up officials so fans can hear their decisions, and coaches or subs being interviewed live during match play.
Provided rights issues can be sorted then some or all of these initiatives could be carried forward into the new campaign.
According to champions elect Liverpool FC it’s becoming clearer that we will be watching a different ‘product’ take shape when our sport returns to our screens.
That product will be bringing fans closer into the stadium with more intimate connection with their heroes than ever before – even at safe distance.

Monday, 1 June 2020

Tales From the Loop DP talks large-format and natural light

PostPerspective 
“Not everything in life makes sense,” a woman tells a little girl in the first episode of Amazon’s series Tales From the Loop. Sage advice from any adult to a child, but in this case the pair are both versions of the same character caught in a time-travelling paradox.
“This is an adventure with a lot of sci-fi nuances, but the story itself is about humanity,” says Jeff Cronenweth, ASC, who shot the pilot episode for director/producer Mark Romanek. “We are representing the idea that life is little different from the norm. There are time changes that our characters are unaware of, and we wanted the audience’s attention to detail. We didn’t want the visuals to be a distraction.”
Inspired by the retro-futurist paintings of Swedish artist Simon Stålenhag, Tales from the Loop gravitates around characters in a rural North American community and the emotional connection some of them feel toward artefacts from a clandestine government facility that litter the landscape.
Rather than going full Stranger Things and having a narrative that inexorably unlocks the dark mysteries of the experimental lab, writer Nathaniel Halpern (Legion) and producer Matt Reeves (director of Dawn of the Planet of the Apes and The Batman) construct “The Loop” as a series of loosely connected short stories.
The tone and pace are different too, as Cronenweth explains. “Simon’s artwork is the foundation for the story, and it elicits a certain emotion, but some of his pieces we felt were overly strong in color or saturated in a way that would overwhelm a live-action piece. Our jumping-off points were his use of light and staging of action, which often depicts rusting, broken-down bipedal robots or buildings located in the background. What is striking is that the people in the paintings — and the characters in our show — treat these objects as a matter of fact of daily life.”
Near the beginning of Episode 1, a young girl runs through woods across snowy ground. Filmed as a continuous shot but edited into two separate shots in the final piece, the sequence follows the child, who has lost her mother and spends the rest of the story trying to find her. “We can all relate to being 9 years old and finding yourself alone,” Cronenweth explains. “We begin by establishing the scale of the environment. This is flat rural Ohio in the middle of winter.”
Photography took place during early 2019 in southwest Winnipeg in Canada (standing in for Ohio) and in sub-zero temperatures. “Our dilemma was shooting in winter with short daylight hours and at night where it reaches minus 32. Child actors are in 80 percent of scenes and the time you can legally shoot with them is limited to eight hours per day, plus you need weather breaks, or your fingers will break off. The idea of shooting over 10 consecutive nights became problematic. During location scouting, I noticed that the twilight seemed longer than normal and was really very beautiful, so we made the decision to switch our night scenes to magic hour to prolong our shoot time and take advantage of this light.”
He continues, “We had a condor [cherry picker] and lights on standby in case we couldn’t make it. We rehearsed two-camera setups, and once the light was perfect, we shot. It surprised everybody how much we could accomplish in that amount of time.”
Working in low, natural light; maximizing time with child actors and establishing figures isolated in a landscape were among the factors that led to the decision to shoot large-format digital.
Cronenweth drew on his vast experience shooting Red cameras on films for David Fincher, including Gone Girl, The Social Network and The Girl With the Dragon Tattoo; he was Oscar-nominated for the latter two. His experience with Red and his preference for lenses led him to Panavision’s Millennium DXL2 with the Red Monstro 8K VV full-frame sensor, which offers a 46.31mm (diagonal) canvas and 16 bits of color.
“It was important for us to use a format with 70mm glass and a large-format camera to give scale to the drama on the small screen,” he says.
Another vital consideration was to have great control over depth of field. A set of Primo 70s was used mainly for second-unit and plate work, while Panaspeeds (typically 65mm, 125mm and 200mm) allowed him to shoot at T1.4 (aided by 1st AC Jeff Hammerback).
“The Monstro sensor combined with shooting wide open made depth very shallow in order to make our character more isolated as she tries to find what was taken away from her,” explains Cronenweth. “We also want to be with the characters all the time, so the camera movement is considerable. In telling this story, the camera is fluid, allowing viewers to be more present with the character.”
There is very little Steadicam, but he deployed a variety of technocranes, tracks and vehicles to keep the camera moving. “The camera movement is always very deliberate and tied to the actor.”
Shooting against blinding white snow might have been an issue for older generations of digital sensors, but the Monstro “has so much latitude it can handle high-contrast situations,” says Cronenweth. “We’d shoot exteriors at the beginning or end of the day to mitigate extreme daylight brightness. The quality of light we captured at those times was soft and diffused. That, plus a combination of lens choice, filtration and some manipulation in the DI process, gave us our look.”
Cronenweth was able to draw on his experience working camera on eight pictures for fabled Swedish cinematographer Sven Nykvist, ASC, FSF, (Sleepless in Seattle, What’s Eating Gilbert Grape). Other tonal references were the films of Russian filmmaker Andrei Tarkovsky (Stalker) and Polish genius Krzysztof Kieslowski (notably his 10-hour TV series Dekalog).
“I was motivated by Sven’s style of lighting on this,” he says. “We were trying to get the long shadows, to create drama photographically as much as we could to add weight to the story.”
Cronenweth’s year spent shooting Dragon Tattoo in Sweden also came into play. “The way exteriors should look and how to embrace the natural soft light all came flooding back. From Bergman, Tarkovsky and Kieslowski, we leaned into the ‘Scandinavian’ approach of tempered and methodological filmmaking.”
The color palette is suitably muted: cold blues and greys melding with warm yellows and browns. Cronenweth shaped the footage using the DXL2’s built-in color film LUT, which is tuned to the latest Red IPP2 color processing incorporated in the Monstro sensor.
Cronenweth recalls, “In talking with [Light Iron supervising colorist] Ian Vertovec about the DI for Tales From the Loop, he explained that Light Iron had manufactured that LUT from a combination of work we’d done together on The Social Network and Dragon Tattoo. That was why this particular LUT was so appealing to me in tonality and color for this show — I was already familiar with it!”
“I’ve had the good fortune of working with Jeff Cronenweth on several feature films. This would be the first project that we’ve done together delivering for HDR,” reports Vertovec. “I started building the show LUT using the camera LUT for the DXL2 that I made, but I needed to rebuild it for HDR. I knew we would want to keep skin tones from going too ruddy and also keep the green grass from getting too bright and electric. When Jeff came in to grade, he asked to increase the contrast a bit and keep the blacks nice and rich.”
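A show LUT of the kind Vertovec describes is, underneath, just a 3D lookup table mapping camera RGB to display RGB. The toy Python sketch below reads a .cube file and applies it with a nearest-neighbour lookup, purely to show the shape of the data; real grading tools interpolate (trilinear or tetrahedral) and manage colour spaces far more rigorously, and the file name is hypothetical.

# Toy sketch: read a .cube 3D LUT and apply it with nearest-neighbour lookup.
# Real grading pipelines interpolate and handle colour management properly.
def load_cube(path):
    size, table = None, []
    for line in open(path):
        parts = line.split()
        if not parts or parts[0].startswith("#"):
            continue
        if parts[0] == "LUT_3D_SIZE":
            size = int(parts[1])
        elif parts[0][0].isdigit() or parts[0][0] in "-.":
            table.append(tuple(float(v) for v in parts[:3]))
    return size, table          # .cube ordering: red index varies fastest

def apply_lut(rgb, size, table):
    r, g, b = (min(size - 1, max(0, round(c * (size - 1)))) for c in rgb)
    return table[r + g * size + b * size * size]

# size, table = load_cube("show_lut.cube")            # hypothetical file
# print(apply_lut((0.18, 0.18, 0.18), size, table))   # mid-grey through the LUT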
The pilot of Tales From the Loop is helmed by Romanek, for whom Cronenweth has worked for over two decades on music videos as well as Romanek’s first feature, One Hour Photo. The remaining episodes of Tales From the Loop were shot by Ole Bratt Birkeland; Luc Montpellier, CSC; and Craig Wrobleski, CSC, for directors So Yong Kim, Andrew Stanton and Jodie Foster, among others.

How to Make High-Quality Animated Sci-Fi in Realtime with Unreal Engine on a Budget

No Film School
Think virtual production is the preserve of James Cameron? Filmmaker Hasraf ‘HaZ’ Dulull creates cinematic quality animated sci-fi in realtime using Unreal Engine, under lockdown, with a crew of just three (+ voice actors and sound composer). Here’s how it was done:
Having made the sci-fi features 2036 Origin Unknown and The Beyond (both were on Netflix) as well as the action comedy show Fast Layne, now available on Disney+, the UK-based director/producer Hasraf ‘HaZ’ Dulull is establishing quite a profile. Neil Gibson, graphic novelist and owner of TPub Comics, reached out to him online looking for a director to create a proof of concept based on one of the publisher’s IP.
Dulull, who grew up watching anime, was asked by Gibson to read his graphic novel ‘The Theory’, and pick a story in the anthology.  HaZ was instantly attracted to Battlesuit.
“I love big mecha robots,” Dulull says. “I’ve always wanted to make a film with big robots and I love a challenge.”
The catch: TPub imagined it as live action… inside of four months to premiere at Comic Con in London.
“Even if I’d worked 24 hours a day I knew that on the small budget they had it just wouldn’t look that good. It was important to me and the audience and also to TPub that we give it the stamp of the highest quality we could to tell this story.”
Dulull suggested producing the story as an animation along the lines of Netflix’s Love, Death & Robots.
“They were very hesitant at first, thought it sounded very expensive, requiring a studio and a large team to do it well,” he says.
To convince them, Dulull made a mock-up in Unreal Engine.
“At the time [late 2019] I was prepping for my next sci-fi live action feature, Lunar, which was supposed to go into pre-production in May 2020 [postponed due to COVID-19] and I’d begun to previz key sequences using Unreal Engine. I remember thinking to myself that the quality of the output from Unreal was good, it’s free to use and it’s real-time. I thought that with a bit more graphics power and love you could make actual narrative content this way.”
The short test sequence was made using the Paragon assets, which are free to download and use from the Epic marketplace.

The reaction online when it was posted was positive and gave TPub the confidence to say: “Wow! If you can do 12 minutes of that we will leave you to it!”
Setting the benchmark
The test informed Dulull that he could pull it off, but it also set expectations for TPub too. “We decided to take a realtime approach mainly because of the low budget and tight schedule we had for this ambitious project.
“We weren’t going to be making something that looks like Pixar but the benchmark was still high. Audiences don’t really care how much budget you have - it’s the end result that matters.  
“The great thing about realtime CGI rendering is - what you see is what you get. I was able to show how it would look regardless of budget.”
Production began over Christmas 2019, from the start with a shoestring crew and an agile way of working. Joining HaZ were Unreal technical director Ronan Eytan and Unreal Engine artist Andrea Tedechi.
The storyline for Battlesuit followed the graphic novel, although the script was continually refined by Gibson at TPub as it was adapted for the screen. Indeed, this was the publisher’s first foray into television production and animation.
“One panel of a comic book tells you so much but to get that coverage in animation could require maybe six shots,” Dulull says. “That’s a different mindset for graphic novel writers to adapt, and Gibson was great to work with in making that transition from graphic novel to screen.”
It wasn’t as if Dulull was an expert at character animation either. True, he’d spent the early part of his career working in video games and then in visual effects for movies like The Dark Knight, and was even nominated for several VES Awards for his work as a visual effects supervisor before becoming a director.
“I’ll be the first to admit I’m no character animator which is why our core team and the approaches we each took were so important. We each had different skills.”
But there were other workarounds too, the likes of which Dulull believes could benefit other indie filmmakers.
Asset building
Rather than creating everything from scratch, they took a kitbash approach to building some of the key assets, licensing 3D kits and pre-existing models (from Kitbash3D, TurboSquid and Unreal’s Marketplace). Tedechi used these as the base to build and design principal assets such as the Mech robots and the warzone environment. Dulull was able to animate the assets and FX in realtime within Unreal’s Sequencer tool and to begin assembling sequences.
Facial animation of the lead character was more of a challenge. They didn’t have the budget to have a full motion capture session with actors, so they bought “tonnes” of mocap data (from Frame Ion Animation, Filmstorm, Mocap Online) and retargeted that data onto the body of the main characters.
But they still needed to find a way of getting original performance on the face of the character.
“Ronan had this genius idea of using the depth camera inside an iPad (used for facial recognition) to capture the movement of the face and eyeballs of our actor. He also developed a pipeline utilising Live Link in Unreal.”
Actress Kosha Engler (known for her voice work in video games such as Star Wars Battlefront and Terminator Resistance) was also the character’s voice artist.  The team was able to record her voice and capture her facial performance at the same time in one session.
“We had to do some tweaks on the facial capture data afterwards to bring some of the subtle nuance it was missing, but this is a much quicker way to create an animated face performance than trying to do it all from scratch or spend a fortune on high end facial capture systems.”
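To make the idea concrete: a depth-camera capture of this kind emits a set of named blendshape weights every frame, and retargeting is essentially a name mapping (plus per-shape scaling) onto the character’s own morph targets. The Python snippet below is only an illustration of that principle, not Battlesuit’s actual Live Link pipeline, and the character-side names are invented.

# Illustration of the retargeting principle only, not the production pipeline.
# Incoming ARKit-style blendshape weights are mapped (and scaled) onto the
# character's own morph targets; the character-side names are invented.
ARKIT_TO_CHARACTER = {
    "jawOpen":       ("Mouth_Open", 1.0),
    "eyeBlinkLeft":  ("Blink_L",    1.0),
    "eyeBlinkRight": ("Blink_R",    1.0),
    "browInnerUp":   ("Brow_Raise", 0.8),   # damp an over-driven shape
}

def retarget(frame_weights):
    # frame_weights: dict of blendshape name -> weight in 0..1 for one frame
    out = {}
    for src, weight in frame_weights.items():
        if src in ARKIT_TO_CHARACTER:
            dst, scale = ARKIT_TO_CHARACTER[src]
            out[dst] = max(0.0, min(1.0, weight * scale))
    return out

print(retarget({"jawOpen": 0.6, "eyeBlinkLeft": 1.0, "cheekPuff": 0.3}))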
Workflow under lockdown
By the time COVID-19 lockdown measures were announced across the UK in mid-March, they were already working in a remote studio workflow.
“The nature of the production and the tools we used allowed us to work from anywhere – including our home living rooms,” he explains.
While Tedechi concentrated on building environments and robot assets and Eytan was busy with the character animation and pipeline side of things in Unreal, Dulull was exploring camera moves and creating each shot.
“My style as a director is to be hands on and this project suited this perfectly. I would ask Andrea for specific assets, take those and build out all the shots, light them and animate before putting them into an edit (Davinci Resolve 16), while Ronan would rig the characters and troubleshoot any technical issues.”
They used a 1TB Dropbox as a backup store and sharing repository and communicated via Skype screen sharing and WhatsApp. To keep production tasks and schedules on track, they used an online management system called Trello to help them coordinate the collaborative workflow and assign tasks.
“It’s far better than using email to communicate and tracking long email threads is a nightmare. Plus, Trello is free for a certain number of users. It’s like a big index card system (which also has Dropbox integration) so we could all make notes and move the cards around informing us who was using certain things.
“We also used Vimeo Pro to upload and share latest versions with timecode annotations. For instance, Andrea could see how I’d used an asset and then work on refining that certain angle.”
Virtual camera work
A large part of the film takes place in a war zone, for which Dulull wanted a visceral, raw handheld action vibe to the camera work.
“I soon realised that trying to keyframe that level of animation would have been too slow and unproductive.”
He turned to DragonFly, a virtual camera plugin for Unreal (also available for Unity 3D and Autodesk Maya) built by Glassbox Technologies with input from Hollywood pre-viz giants The Third Floor. 
With the plugin, its companion app for the iPad and Gamevice controllers, Dulull had a lightweight virtual camera in his hands.
“It was such a tight schedule that learning a new toolset was really tricky but I wanted to use the camera for elements in the virtual cinematography I knew I couldn’t animate.”
After testing it at Epic’s lab in London, Dulull was able to use this in his own front room.
“I felt like James Cameron making Avatar. I treated it like a real-world camera with real-world lens decisions. I was able to review takes and shots instantly. As a filmmaker this gave me the flexibility, speed and power to design and test shoot ideas quickly and easily.
“Virtual production is a gamechanger. Now indie filmmakers can use the same tools used by the big film studios and you can do it in your own home.”
With each scene having a huge amount of assets, the team had to be smart to reduce the polygon count. “We set some rules. All big robots were in high res as we would see them in closeups a lot, but we dialled the level of detail down on the buildings or backgrounds, especially when the camera is moving around fast. The great thing is, you can control all that level of detail within Unreal.”
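Conceptually, those rules boil down to per-asset-class distance (or screen-size) thresholds: hero assets never degrade, background assets step down to coarser meshes. The sketch below illustrates the logic in isolation; it is not Unreal’s own LOD system, and the classes and distances are invented.

# Conceptual sketch of the LOD rules described above (not Unreal's LOD system).
# Hero robots always use the full-resolution mesh; background buildings step
# down to simpler meshes as they get further from the camera.
LOD_RULES = {
    "hero_robot": [0],            # always the highest-detail mesh
    "building":   [0, 1, 2, 3],   # progressively simpler meshes
}

def pick_lod(asset_class, distance_m, thresholds=(15, 40, 100)):
    lods = LOD_RULES[asset_class]
    if len(lods) == 1:
        return lods[0]
    band = sum(distance_m > t for t in thresholds)   # distance bands crossed
    return lods[min(band, len(lods) - 1)]

print(pick_lod("hero_robot", 80))   # -> 0: robots never degrade
print(pick_lod("building", 80))     # -> 2: a simplified background mesh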

All the shots were captured at 2K resolution in ProRes 4444 QuickTime format, with the desert scenes as EXRs – all taken into DaVinci Resolve for editing, colour grading and deliverables.
Workstation Power
Powering it all, Dulull used the latest Razer Blade 15 Studio Edition laptop PC. This comes with an Nvidia Quadro RTX 5000 card on board.
“The Razer Blade was constantly being pushed with most of the scenes being very demanding with the amount of geometry, FX and lighting including scenes that were 700 frames long,” he says. “I was able to pump out raytracing in realtime.”
Every single shot in the film is straight out of Unreal Engine. There’s no compositing or external post apart from a few text overlays and colour correction done in editorial in Blackmagic Design’s DaVinci Resolve (in which Dulull also edited).
That made good use of Nvidia’s GPU and utilised the Razer’s 4K display allowing him to make creative tweaks to each shot, lighting and color right through to deliverables.
The score was also written in parallel with production, as opposed to being composed weeks afterwards in post. Dulull turned to noted composer Edward Patrick White (Gears Tactics) to write the music whilst production of the animation was happening.
“A lot of Edward’s musical ideas influenced my direction of the scene,” Dulull says. “That’s another unusual and creative way to work.”
Review and approval
He sent versions of the edit which would have mostly first pass animation and lighting to the publisher for their notes.
“Their notes were typically about story and dialogue, things that if we were doing this in a more conventional linear route, would entail time and expense to fix. But because we were in a realtime environment and we didn’t have pipeline steps such as compositing or multipass rendering as you would get in conventional CG, we were able to make iterative changes really quick straight out of the engine.
“The ability to make these changes on the fly without making a difference to our budget and schedule is another huge game-changer. It could mean that many more story ideas get made because the risk to the producers and finance execs is so much smaller.”
Indie filmmaking goes virtual
With the cancelling of Comic Con, TPub decided to release the short online exclusively with Razer. There is a TV series in development and it’s hoped Battlesuit can showcase not only TPub’s IP but also what is possible with this way of creating animated narrative content.
“Tools like Unreal Engine, DragonFly, the Razer Blade and Nvidia GPUs reflect the exciting revolution of realtime filmmaking that we are all currently venturing into – where indie filmmakers with small teams can realise their ideas and cinematic dreams without the need for huge studio space or large teams to set up and operate,” Dulull enthuses.
“If someone had said I could pull off a project like this a few years ago that is of cinematic quality but all done in realtime and powered on a laptop I’d think they were crazy and over ambitious. But today I can make an animated film in a mobile production environment without the need for huge desktop machines and expensive rendering.
“I can see a future where realtime filmmaking is going to unleash a wave of new and fresh stories that not only feel big but are full of imagination and ideas that were previously deemed too risky and expensive to do.”