Friday 28 July 2023

Behind the Scenes: The Open

IBC

The 151st Open takes place at Royal Liverpool, 16-23 July with EMG in charge of pretty much all broadcast technical facilities.

article here

Ultimately working for the Royal & Ancient (R&A), EMG provides a full bag of clubs and then some for as many as a dozen different production partners to perfect their week-long, tee-to-green coverage from Hoylake.

“The scale of provision that EMG provide dwarfs any other coverage we do,” said Hamish Greig, Director of Golf Operations. “The Open is everything we normally do for the European Tour but on steroids.”

EMG’s engineering and operations team has covered all DP World Tour events worldwide with European Productions, the production partner of DP World Tour. Last year alone, EMG covered 36 golf events across 15 countries in Europe and the Middle East, including the historic 150th anniversary of The Open last July.

The normal provision for a European Tour event is eight trucks comprising units NOVA 112A and NOVA 112B for main coverage, graphics and editing, plus RF, commentary, office, buggy, scaffolding and cabling trucks.

At The Open, EMG is assigned to produce 12 different production streams as standard, each with its own separate facilities provision. It provides all fibre hook-up, talkback and operating positions where required.

This includes coverage for NBC and the Golf Channel in the US and Sky Sports domestically, for TV Asahi in Japan, BBC Radio and IMG on-course radio, commentary and on-course facilities for Canal+ and the BBC. A Celebrity Challenge match is also being streamed live on the Sunday before the championship, using RF cameras to cover the back nine holes.

EMG builds the International Broadcast Centre (IBC) which houses the heart of the broadcast facilities. This includes a main production gallery with Kahuna mixer, a submix gallery, a super ISO gallery for the World Feed and another complete production gallery for NBC’s unilateral output. Here too, there’s an EVS submix gallery, audio main mix gallery and a graphics gallery. Every camera is isolated and camera ops operate as if they are live all the time.

“There is such a good relationship between NBC and ETP that this removes the need to do a lot of side-by-side production, which was the norm at The Open only a few years ago because rightsholders needed more dedicated resources,” Greig said.

“On top of that we have our own QC area for all the different signal distribution and MCR set up to look after all the routers, vision mixers, multiviewers, 21 x EVS servers, graphics servers as well as distribution and fibre transport.

“Everything fans out from the MCR to either the host feed or to the different trucks assigned to manage different flavours of production. It is a very complex set up.”

Complementing the main feeds are trucks that handle other functions. Marquee Group 1 and 2 are two curated feeds covering, shot by shot, the rounds of the leading (or marquee) groups of players; each has its own production area. There is also a dual par 3 UHD HDR production covering the 13th hole and the iconic 17th.

Behind the Scenes: The Open - What’s the 5G frequency?

“One thing that everyone forgets about The Open is the sheer scale of it,” Greig said. “RF cameras are the extreme of that. We’ve got 35 RF cams, six more than last year, plus two RF vision links (for on-course RF monitors): one for Sky presenters roving around the course and one for the Live at the Range presenter roving around the driving range.”

He adds, “I don’t believe there is another event in the world where you have that number of RF cameras in one space.”

EMG is well versed in handling the logistical complexity. Its division, EMG Connectivity, unites the activities of Eurolinx, Broadcast RF and the RF activities of EMG France to deliver 1,500 RF events a year.

“We are fortunate that we provide and manage all these facilities on site for all these different strands of productions. For example, we’d never be able to do what we do now with RF if there were multiple providers. It would simply be too complex to manage. The frequency management is critical and we can only do it because we have a very close relationship with Ofcom. All our radio talkback is done using low power RF over fibre systems around the course. Previously everyone used to ramp up the power which would knock every other user out.”

He said 5G wouldn’t particularly assist. “You would need your own 5G core [network] and we don’t have one for this level of coverage. There’s already a limit on spectrum and 5G would take another slice of it. Don’t forget that latency is such a key element to RF so whatever model we use we want to ensure the same for every piece of kit in the system so you don’t have latency issues.

“5G will definitely be of benefit to smaller remote venue productions where they can take their own core around and slice up the available spectrum but for larger productions like this the spectrum would still be too limiting, however it could greatly aid data control (Paint), reverse video, Tally and intercom etc.”

Behind the Scenes: The Open - Cable guys

The other major infrastructure element is cable connectivity. The R&A installed fibre rings around the course, into which EMG accesses approximately 20 different nodes to transport cameras, data and audio. There are approximately 14 RF receive points taking signals into the fibre network.

Every feed comes into the broadcast compound where it is transferred to the various trucks via another 20 km of Tac 24 cable. The compound at Hoylake is long and thin, necessitating cables that are themselves 100-150m in length.

“With so many signals and data, fibre management is one of the biggest issues and its management is a big part of the job. Feeds come from the course to a cabinet and then into the MCR before fanning out to different trucks. All the cabling really does look like a yellow sea.”

Behind the Scenes: The Open - UHD HDR

Two of the course’s par 3 holes (13 and 17) are covered in UHD HDR, from which an HD 1080i flavour is extracted for the World Feed. Hole 17, the signature hole, has recently been remodelled in consultation with the R&A and golf course architect Martin Ebert.

The UHD HDR can be used by any rights holder but is particularly in demand by US satellite broadcaster DirecTV, using kit supplied by EMG.

“We use the UHD outputs for the UHD HDR feed, and the HD outputs feed the normal golf infrastructure. So the main director directs his normal cameras. The par three hole director has their green camera and supplements it with two RF cameras on that hole as those holes are covered every minute of the day.”
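Deriving the HD flavour from a UHD HDR chain boils down to two operations: downscaling the picture and mapping the HDR transfer curve down to standard dynamic range. The Python sketch below illustrates the idea only; real broadcast chains do this in dedicated hardware with calibrated LUTs, and the toy tone curve here is an assumption for demonstration, not EMG’s workflow.

```python
import numpy as np

def uhd_hdr_to_hd_sdr(frame: np.ndarray) -> np.ndarray:
    """Illustrative only: derive a 1920x1080 SDR frame from a 3840x2160
    HDR frame whose values are normalised linear light (0.0 and above)."""
    # 2x2 box filter: average each 2x2 block to halve both dimensions
    hd = frame.reshape(1080, 2, 1920, 2, -1).mean(axis=(1, 3))
    # Toy tone map: compress highlights (Reinhard-style), then apply
    # a Rec.709-ish display gamma. Real converters use calibrated LUTs.
    sdr_linear = hd / (1.0 + hd)
    return np.clip(sdr_linear, 0.0, 1.0) ** (1.0 / 2.4)

# Example with a random UHD RGB frame, highlights above SDR range
uhd = np.random.rand(2160, 3840, 3) * 2.0
print(uhd_hdr_to_hd_sdr(uhd).shape)  # (1080, 1920, 3)
```

(A true 1080i output would additionally interlace the resulting progressive frames into fields.)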

Behind the Scenes: The Open - Cameras

There will be 160 cameras of various types around the Royal Liverpool course which plays 7,218 yards. Ninety-eight of these cameras are for the World Feed and for the first time include a drone cam in UHD HDR supplied by Aerial Camera Systems (ACS). There’s also a CAMCAT Colibri wire camera system from ACS running 120m across the 18th Grandstand area. A plane will provide further overheads.

Also new from ACS are high frame rate Phantom cameras positioned for bunker shots, also in UHD. Toptracer cameras track the ball from the tee and deliver graphics on-screen.

There are more super-slo-motion cameras, an SLR camera with shallow depth of field working over RF for beauty shots, and a camera in the courtesy car ferrying golfers to and from the venue. POV cams and remotely operated SMARTheads are dotted throughout the course.

The World Feed cameras are augmented by various broadcasters for their own unilateral coverage. Sky, for example, is using a further four RF and nine cabled cams.

Behind the Scenes: The Open - Mics

The host feed also benefits from over 150 cabled mics (EMG also supplies 50 high-power and 60 low-power radio mics), including Sennheiser 416 and 418 shotgun mics, on-course commentator mics and the more directional 816s, as well as M58s, which feature an internal shock-mount to reduce noise.

“We have FX mics at the Tees, greens and fairways and also the grandstands so from every hole no matter where the RF camera or commentator is you’ve got multiple positions you can take the audio from,” Greig explained.

There are 70 duplex channels serving either radio talkback for the various production streams or interruptible foldback across the full course, plus 24 in-ear monitors, as well as Romeo and Freespeak high-quality digital talkback channels for studios.

In the audio submix area of the IBC, operators ‘sweeten’ all the course mics and convert the feeds to MADI streams before making them available to all productions.
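MADI (AES10), the transport referenced here, carries up to 64 audio channels per stream, so distributing a couple of hundred mic feeds means fanning them out across several streams. A minimal sketch of that channel bookkeeping, with illustrative feed names and counts rather than EMG’s actual patch plan:

```python
MADI_CHANNELS = 64  # AES10 MADI carries up to 64 channels per stream

def madi_assign(mic_names):
    """Map each feed to a (stream, channel) slot on successive MADI streams."""
    return {name: divmod(i, MADI_CHANNELS) for i, name in enumerate(mic_names)}

# Illustrative: 150 cabled FX/commentary mics plus 110 radio mics
mics = [f"fx-{n:03d}" for n in range(150)] + [f"radio-{n:03d}" for n in range(110)]
plan = madi_assign(mics)
print(plan["fx-000"])     # (0, 0): first stream, first channel
print(plan["radio-109"])  # (4, 3): fifth stream, fourth channel
```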

Behind the Scenes: The Open - Live at the Range

Coverage also includes the driving range, which at Hoylake is located outside the course boundaries, across a main road and a rail line. Sky has its main presentation studio here, and Live At The Range, a live stream provided by the R&A, is also based here. For radio cam crews and commentators too, the Range is a staple part of the broadcast.

Trains only stop operating on the Thursday before, giving EMG’s engineers barely a day to connect fibre through a duct under the track and complete the cabling connectivity between the range and the TV compound, in time for rehearsals from the Saturday pre-event.

“The Open is a year-long project,” said Greig. “We start planning the next one as soon as one ends. There are constant changes on all the different production streams as you go along.”

Behind the Scenes: The Open - Hazards

For a links golf course like this, weather will always be a hazard, with wind the main concern. Cameras are lowered on gantries each night and safely secured.

At St Andrews in 2015, a day of golf was lost when high winds caused balls to move on the greens, and the event had to be finished on the Monday.

EMG has put in place a full disaster recovery solution. For instance, one of its main production trucks has been given a separate power and signal circuit, enabling the broadcast to continue to air should a catastrophic loss of power happen elsewhere.

Behind the Scenes: The Open - Graphics

MST Systems delivers the main on-screen leaderboards and lower thirds for The Open, just as it does for ETP.

The PinPoint wind solution provides real time wind speed/direction readouts tailored for various broadcasters. The wind data is provided from a small ultrasonic anemometer that is placed in ‘clear air’ on the golf course. It collects wind speed/direction every second and transmits each reading in real time.

One of the options is a rolling 10-minute average that displays the average wind strength/direction of the last 10 minutes, together with the maximum and minimum wind speeds during that time. A forecast option predicts wind speed and direction, maximum gusts, temperature, percentage chance of rain etc. There are eight fibre drops for its sensors around The Open course.
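The rolling 10-minute option is straightforward to reproduce: keep the last 600 one-second readings in a window and report the mean alongside the extremes. A minimal sketch of the arithmetic (PinPoint’s actual implementation is not public):

```python
from collections import deque

class RollingWind:
    """Rolling wind summary over a fixed window of one-per-second readings."""
    def __init__(self, window_seconds=600):          # 10 minutes
        self.samples = deque(maxlen=window_seconds)  # old readings drop off

    def add(self, speed_knots):
        self.samples.append(speed_knots)

    def summary(self):
        s = self.samples
        return {"avg": round(sum(s) / len(s), 2), "max": max(s), "min": min(s)}

wind = RollingWind()
for reading in [12.0, 14.5, 11.2, 18.3, 13.1]:  # one reading per second
    wind.add(reading)
print(wind.summary())  # {'avg': 13.82, 'max': 18.3, 'min': 11.2}
```

Direction needs slightly different treatment: because headings wrap at 360°, a rolling average of direction should be a circular mean (averaging sine and cosine components) rather than a simple arithmetic mean.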

The Virtual Eye golf system (from a division of ARL) displays changing values over time in data animations regularly used for coverage, including PGA events. These include virtual course flyovers created to scale, with all tees and distance measurements adjusted to match the daily course setup.

Its 3D graphics remain stable during the motion of helicopters or drones in live broadcast, as the graphics are pinned to a hidden virtual model ‘beneath’ the video. Virtual Eye can display green and fairway contour animations to represent the shape, speed and lie of the terrain. A vertical distance tool allows for accurate measurements – on the fly – of the vertical distance between two points.

A common scenario shows the state of play at each hole, based on the tee and pin positions, along with the course conditions of the day. The system gives operators the flexibility to adjust speeds, frame angles, graphic timings, and the content of the animations.

Graphics can be displayed in different parts of the frame. In use, it provides a ‘yardage book’ feel for tee shot setup and operates like the viewer’s caddie: “It keeps up, doesn’t interrupt and yet provides the right information at the right time,” said ARL tartly.

Behind the Scenes: The Open - Remote facilitation and Sustainability

EMG’s provision is also helping broadcasters remote-produce their shows. Sky, for example, will present from site but production is mixed in Osterley. TV Asahi’s programme is cut and polished in Tokyo (EMG provides it with a flypack on-site), as is The Golf Channel’s. Hawk-Eye is also operated remotely from Basingstoke – and for NBC back in America – as are some Toptracer operators in Sweden and some of the graphics provided by Virtual Eye, whose HQ is in New Zealand.

EMG core trucks and crew are busy the week before at the Scottish Open. “It’s very important that we travel to Hoylake without using flights because of sustainability,” Greig said. “Everyone including me will be travelling by bus or train or car where there’s no other way.”

The fleet of new fuel-efficient OB trucks recently launched by EMG will not be present but instead are busy servicing ETP events.

In 2021, EMG became the first OB and facilities company in the UK to install its own HVO (Hydrotreated Vegetable Oil) Green D+ fuel station. This fuel delivers up to a 90% reduction in net CO2 greenhouse gas emissions compared to regular diesel. In January this year it became the first OB specialist to achieve the DPP Committed to Sustainability mark.

Thursday 27 July 2023

It Was Just a Beautiful Dream: Virtual Production for “Live Again”

NAB 


article here

“Live Again” is the tenth collaboration between British dance act The Chemical Brothers and directing duo Dom&Nic.

“It’s a trippy Groundhog Day-esque adventure story through multiple environments in a continuous dance performance by Josipa Kukor,” describes Promonews.

To achieve it they filmed long unbroken shots with background virtual environments switched live without edits.

“The woozy, wonky analog sounds and the dream-like lyric suggested a hallucinogenic visual journey following a character caught in a loop of death and rebirth. The hero in the film wakes or is reborn in different environments, ranging from deserts to nighttime neon city streets and cave raves to Martian worlds,” the directors told Promonews.

“This is an idea that could not really have been achieved with traditional filmmaking techniques. We created virtual CGI worlds and used long unbroken camera takes, without edits, moving between those different worlds seamlessly with our hero character.”

Dom&Nic’s production company Outsider brought together cinematographer Stephen Keith-Roach, production designer Chris Oddy and VFX facility Untold Studios, along with virtual production specialists from ARRI Solutions, Creative Technology and Lux Machina, all hosted on the ARRI Stage London.

Dom&Nic say that the band encouraged them to capture the feel of the track in the cinematic texture and look of the film.

“We were given the challenge to give it the visual equivalent of putting a clean sound through a broken guitar pedal to transform and degrade it into something unique. We love the way the film has an analog and messed up film look to it, it really adds to the visual trippy experience.”

Untold Studios real-time supervisor Simon Legrand added, “After designing seven bespoke virtual worlds in pre-production, we were then able to tweak elements on set, on the fly, giving the directors the freedom to play and experiment. This is the first time that virtual environments have been switched live on set in this way.”

Will Case, director of innovation at Creative Technology, confirmed, “It really pushed the boundaries of working in real-time workflows and technologies to bring to life Dom&Nic’s visually stunning promo.”

Friday 21 July 2023

BTS: Mission: Impossible - Dead Reckoning Part One

IBC

For a machine-tooled $290 million stunt-fuelled blockbuster, it comes as a surprise to learn that the latest Mission: Impossible instalment was made with improvisation and experimentation at its core.

article here 

Director Christopher McQuarrie may not quite be working with improv to the degree of a Shane Meadows or Mike Leigh but there are similarities to their indie drama in his method.

“No-one gets a script,” says the film’s editor Eddie Hamilton ACE, who cut the last two MI films, Rogue Nation and Fallout, as well as Top Gun: Maverick. “Instead, every head of department gets a sense of the type of sequences that are going to happen.

“Chris [aka McQ] and Tom have been having creative discussions for a decade about things they would like to see in the next MI movie. They will build the story around locations that are available and the geography that exists in that location. They cast specific actors that they want to work with and then they will work organically with the actors to evolve the characters giving them a lot of room to play around with dialogue.

“The wardrobe department will design costume for the cast that feels appropriate for each location and the art department design sets appropriate to where the characters are emotionally in the story – but there is no script. McQ has an idea of which way the story is going but not the details.

“He will usually write the scene the night before or even the morning of the shoot. Tom will approve the pages, everyone goes to set and evolves the scene. Ultimately what the script ends up being is what the script supervisor types up at the end of each day. We spend weeks and months discovering the organic fluidity of the movie in the edit room which is very time consuming but ultimately rewarding.”

The broad outline for Mission: Impossible - Dead Reckoning Part One revolved around Grace (Hayley Atwell) reflecting the origin story of IMF characters Benji, Luther and Ethan. She is someone who journeys from selfish to selfless by the end of the film and who has the raw ingredients to make up an IMF agent. Couple that with a wish-list mise-en-scène that Cruise and McQuarrie wanted to achieve – riding a bike off a cliff and crashing a train, respectively – and that was the starting point. A Hitchcock MacGuffin about the hunt for an all-powerful AI called the Entity became the plot’s glue.

“We’ve been working on this for three years and on every tiny nuance and detail,” Hamilton says. “Nothing is too insignificant to hone and craft.”

 

That stunt

In September 2020, Hamilton with the main crew flew to Norway for Day One of principal photography which is now fabled as the day Tom Cruise jumped off a mountain on a motorbike, six times.

“We used the penultimate take of day one. On the second day he had cameras attached to the bike, but with Mission: Impossible you want to see that what he is doing is for real.”

Hamilton explains that there is a standard visual language for a Tom Cruise stunt. This is to use shots that proceed tight to wide or wide to tight, or sometimes tight to wide to tight. This stunt was a little different though.

“For the hero take, which was shot on a helicopter, we start on a medium of Tom riding down the ramp then we pull back so you see him on the bike and we hold on the shot as he goes over. The camera tilts down and sees him falling. We aren’t cutting. It’s more impressive this way even though we have great angles from low down, a wide, a drone, and from the bike.”

A front-on shot of Cruise jumping appears in the titles and in trailers and was first seen in an IMAX behind the scenes short released last December from a 12-minute edit that Hamilton had made in the week of production for Paramount.

“I put together this reel to show the studio what we were doing. They trust Tom and McQ and they give them the resources to make the film but they’re not intimately involved. We didn’t want to send them just a minute of dailies of Tom riding off a mountain. We wanted to show them exactly what eight months of training and preparation had led up to.”

Trimming to the bone

Hamilton was in Soho or adjacent to soundstages at Longcross Studios for most of the shoot, although production was interrupted on several occasions due to Covid. Photography began in September 2020 and the final day was in April 2023, just a couple of months before the film’s premiere.

“When McQ is in production he’s collecting ingredients so he can cook the film in editorial,” Hamilton explains of the unusual process. “He has a sense of how the film will come together but knows he will only discover it in the edit. He overwrites all the scenes so there’s plenty of options. The actors give us a massive variety of emotions in their delivery and if things need improvement we’ll go back and do pick-ups or even throw out a scene if the audience flatly reject it when we test the film (which did happen).

“The reason it works is because Christopher McQuarrie is the secret ingredient, simple as that. He and Tom are constantly discussing where the film’s strength and weaknesses are and what we can course correct on.

“Some days, after having had a conversation with Tom over breakfast, Chris will come into the edit room and say ‘We’ve got an idea for this scene’ which means chopping off this half, rearranging this half, shooting some pickups, adding extra lines of dialogue or ADR.

“It is very stressful for everybody because we are doing all the heavy lifting of the storytelling and breaking the story apart during the process of filming. Chris also has to manage the day-to-day stress of production and problems with rigs on cars or issues with weather and the other 101 things a director has to worry about.”

Like Top Gun: Maverick, MI:7 burned through a lot of footage enabled by shooting digital (Sony Venice) for the first time in the franchise. Up to five times more material than was needed was shot for some scenes.

“There were so many options of Ethan running around the alleyways in Venice, for example, stopping and talking to the IMF team, the Entity instructing him which way to go, bumping into other characters. There was so much of it but I knew it would be refined into a rocket ride – which is one of Tom’s favourite expressions.”

Hamilton aimed to stay on top of all the dailies. “I watch it all and break it down to have a sense of the best angles, best moments, the most dynamic action. We put that on a timeline and label it up.”

Leave them wanting more

One of the guiding principles is to leave the audience wanting more. Keenly aware of “action fatigue” that labours the experience of watching explosion after explosion and prolonged fight sequences in recent blockbusters, “we were very sensitive to the feeling of length for any sequence.”

He says, “If people even breathed a thought that something was too long we’d keep aggressively trimming it down so it felt like just enough or even not quite enough.”

Instinct aside they relied on numerous test screenings with friends and family and with cold audience recruits.

“McQ and Cruise want to make mass entertainment that works all over the world. We don’t fight any feedback if the audience feels strongly about something. [From feedback] they said they’d seen Tom driving a BMW before, but the real fun was when Ethan and Grace get into the Fiat (a much smaller car they switch into and continue a chase through Rome).

“The BMW material was trimmed to the bone. In fact, every sequence is trimmed tight to the frame. I’ve seen the film 700 times over three years, combing through each sequence 50 to 200 times to make sure that every single tiny emotional beat is exactly right for where the audience needs to be for that point in the story.”

“I tend not to worry about it while we’re building. You know it will be too long and lumpy, it won’t make sense, it will be boring.  I know we will compress it and we all trust the process.”

Hamilton acknowledges that they “have the time and resources for it to come together in the end.”

This particular car chase, in which Cruise and Atwell’s characters are handcuffed together, combines action thrills with comedy and works in a way that other films with similar scenes do not. Hamilton thinks this is because they are not relying on editing to create comic timing.

“Everything is done in a two shot. We’re not cheating. You are watching two actors coming up with ideas and trying stuff out, which contributes to the natural ease of what you are seeing. We’ve got these compositions, whether in profile or three-quarters of the two of them, where you can see the geography around them, the Hummer chasing them, and that all contributes to the idea that we’re not using the edit to create comedy timing. You are watching natural comedy play out.”

Dialogue scenes cut like action

Punctuating the action are some dialogue-heavy scenes, including one set in the Department of National Intelligence near the film’s beginning and, later, a night club scene. To retain the audience’s attention, as well as to unsettle them, the filmmakers employ an unusual technique where the eyelines of characters are deliberately mismatched.

McQuarrie and DoP Fraser Taggart shoot A cam and B cam to provide left to right and right to left eyelines. In the edit Hamilton crosses the eyeline on a vital piece of information or where there’s an emotional shift in the scene in order to jolt the audience back into the picture.

“The way audiences watch a film is that they are not checked into the scene the whole time. Certain things the actors say will trigger your own thoughts about your own life and sometimes it will take you out of the film for a few seconds. So, the way we trigger your attention to come back is by cutting on very specific emotional beats or words in the scene.

“It is all extremely precise. Every nuance is crafted and we refine it hundreds of times. Sometimes we watch a 10-minute scene 40 times in a day checking to see where your eye is moving in the frame and if certain pieces of emotion are landing for you and that your understanding of the story is working.”

There’s more going on here. The filmmakers lean into extreme close ups and Dutch angles, eschewing wide shots except to establish the proximity of characters in a scene. Director and DP employ long focal lengths to frame the closeups, a technique that adds intimacy.

“McQ learned which lens worked for each character – a 60mm or 75mm, sometimes a 135mm – and would use the different intimacy levels and common geography (shooting a close-up but seeing another character in the background, or racking focus between them), which delivers emotional impact because you feel present in the scene.

“All those elements allow us to keep the pace of the dialogue very tight. We almost cut the dialogue scene like an action scene.”

One of the criticisms levelled at the film is that its dialogue scenes are too full of exposition.

“They all started out much more dialogue-heavy and we boiled them down to the absolute minimum for the story to completely make sense,” he says. “It’s not like you have to listen to every line, but when we took out more dialogue than is in the movie now, the audience were more confused.”

That train crash

The finale on a runaway steam train manages to do something audiences will probably not have seen before.

“We calibrated it precisely to get exactly the right amount of ‘Holy shit!’ from the audience,” Hamilton says. “In the kitchen carriage element of the scene we are down to the bare minimum you need to comprehend what is going on.”

McQuarrie came up with the idea in February 2020 and previs for it was among the first things designed in prep. The previs involved the broad strokes of the wreck, with the understanding that the actors would be finding performance on the day. McQuarrie has talked about the tendency of the previs team to animate the characters rushing through the sets as quickly as possible, whereas his concept was for slow and suspenseful action.

An actual 70-ton locomotive was built by SFX supervisor Neil Corbould, powered by a diesel engine housed in the coal tender behind the train. The bulk of Ethan and Gabriel’s fight, along with the majority of the wide establishing shots, was filmed on a rail track in Norway. The wreck, including a partial bridge, was shot at a quarry in the UK’s Peak District.

“When you’ve just had a huge fight on the roof of the train (again, compressed massively) you want the audience to get to the end and still be wanting more. We vary the use of music and sound design constantly so you’re not getting too tired of one particular sense. The whole end of the movie has no score at all until they are climbing back through the last carriage.”

Train interiors were shot at Longcross and involved sets tilted on gimbals at up to 90-degree angles. A camera tethered to a rail system on the roof of the carriage set was able to move with Cruise and Atwell as they clambered inside.

“It took forever to film. You are only getting two or three set-ups a day because it was effectively stunt work, where safety is paramount, and it takes hours to rehearse and shoot, but the results speak for themselves. It is very hard to do something you’ve never seen before.

“I promise you we have done it again in Mission: Impossible - Dead Reckoning Part Two. We have already filmed the third act climax. It is jaw dropping from beginning to end.

“On Maverick I was eventually inured to all the aerial sequences because I’d seen them so many times but the visceral raw thrill when you see it for the first time on a big screen with all the sound, colour and VFX is so intense. It’s a lot of resources and experts working for months to make it happen.”

Thursday 20 July 2023

BTS: FIFA Women’s World Cup Australia & New Zealand

IBC

The FIFA Women’s World Cup Australia & New Zealand features a number of production firsts including volumetric ‘datatainment’, fully remote live matchday hosting, and a stream dedicated to TikTok – but it is all in HD.

 

Born as a 12-team tournament in 1991, the FIFA Women’s World Cup was expanded to include 16 countries at USA 1999 and 24 at Canada 2015. At the FIFA Women’s World Cup Australia & New Zealand 2023, 32 nations compete for the first time.

The production plan will build on FIFA’s benchmarking broadcast tradition and will once again be enhanced with innovations and improvements.

Fully remote host

For the first time in the history of FIFA tournaments, the Women’s World Cup will see the implementation of a fully remote live match production, with five host broadcast match directors and their teams operating from an existing technical hub in Sydney.

They will be supported by nine stadium-based production teams, each comprising camera operators and a floor manager. Also at the hub, four teams dedicated to additional content and four slo-mo teams will produce the various match feeds – the Basic International Feed (BIF), Clean International Feed (CIF), World Feed (WF) and Additional Content Channel (ACC). Press conferences will be captured live and offered as a separate isolated feed.

The remote production model relies on a dedicated fibre Broadcast Contribution Network (BCN), which connects each stadium to the central hub in Sydney over 40G redundant links. Further, the hub is connected to the IBC over 400G redundant links.
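As a back-of-envelope sanity check on those link sizes (our arithmetic, not FIFA’s published budget), consider that one uncompressed 1080p/50 feed on 3G-SDI runs at roughly 3 Gb/s:

```python
# Back-of-envelope link budget; assumptions ours, not FIFA's figures.
FEED_GBPS = 2.97   # one uncompressed 1080p/50 feed on 3G-SDI
LINK_GBPS = 40     # stadium-to-hub link, per the BCN description
CAMERAS = 25       # peak per-match camera count quoted below

print(int(LINK_GBPS // FEED_GBPS))    # 13 uncompressed feeds per 40G link
print(round(CAMERAS * FEED_GBPS, 2))  # 74.25 Gb/s to carry every camera raw
# => returning every camera to Sydney uncompressed would exceed 40G,
#    so a contribution network like this must compress or aggregate.
```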

Similarly, non-live production will also use a remote production concept: while some strands, such as the FIFA MAX server technical facilities will still be handled from the Sydney-based IBC, most post-production operations (short-form match highlights and 24-minute Daily Highlights programmes, amongst others) will be housed at a non-live hub back in London.

The host broadcast production format is HD 1080p/50 HDR with 21 to 25 cameras in use, depending on the phase of the tournament. Speciality cameras will include ultra-motions, super-slow motions, cable cams, pole cams and heli cams.

Following their successful use in Qatar 2022, cine-style cameras will be in operation at every single match. All feeds will be distributed in HD 1080p/50 HDR, HD 1080p/50 SDR (Rec. 709) and HD 1080i/50, with audio embedded per SMPTE 299M.

The BIF/CIF feeds carry 16 embedded audio channels including for stereo TV, radio, 5.1 and Multi-Channel International which is encoded in Dolby E.

Five internationally renowned directors will be leading their teams from the central hub in Sydney: Jamie Oakford and Gemma Knight (from the UK), Angus Millar (AUS), Danny Melger (NED) and Sebastian Von Freyberg (GER).

The FIFA MAX server

The entire video and audio content produced for the tournament will be ingested and logged onto the FIFA MAX server in 1080p/50 format. Approximately 3,000 hours of content are expected to be uploaded to the server during the tournament and made available to all rights holders.

 

Rich graphics and VAR

Enhanced graphics introduced for Qatar return for this event. The increase in the number of data points now available from each match at the tournament naturally lends itself to increased opportunities to pass on the most relevant information to the viewer.

The data, which is provided centrally by the FIFA High Performance Programme Department, is picked up by the graphics teams to support the narrative of events on the pitch. In-match enhanced graphics are usually lower thirds or found in the corner of the screen to ensure that the information does not detract from the match itself.

Full-frame graphics are used at half-time and full time; these are often on screen for longer as there is time to get into more detail about how a specific piece of data analysis is impacting the match. The graphics shown at half-time and full time are also useful for broadcasters, who look to use them either as in-studio analysis or as digital assets on their social media platforms.

For VAR, on-field reviews are covered in a Picture-In-Picture (PIP) format, with the main window replicating the images that the referee is offered for review by the VAR. FIFA will continue its trial with broadcasting VAR review decisions in-stadium and to a live television audience.

Non-live production

The FIFA TV Team Crew (FTTC) project will serve additional content, mainly away from the live matches. Thirty-two FTTC producers, each a native speaker of their respective team’s language, will be embedded with that team, follow them throughout the tournament, and be responsible for the centralised production of any content coming from the teams. The producers will be paired with a local camera operator to produce exclusive interviews with players, coaches and backroom staff, plus footage of training sessions.

On matchdays, each crew will be in the stadium to support the HB coverage by providing further content such as pre-match and post-match interviews, fan colour, match ISO and post-match dressing-room filming.

FIFA TV will also create fully produced content pieces that are made available to broadcasters (again via the FIFA MAX). These feature star players, players to watch plus profiles of the head coaches and the backroom staff.

The production of highlights is also key to an event of this magnitude. FIFA TV will produce two-minute packages for broadcasters to insert quickly into their programming or onto their socials. It will then produce a more comprehensive 24-minute round-up programme every matchday. Both of these highlights offerings will be produced at FIFA TV’s remote production hub at Stockley Park in London.

In fact, FIFA TV started producing content around 18 months ago, with footage and short films of the host country/cities, promotional trailers, interviews and so on. A Preview Series of eight 26-minute episodes tells the story of all 32 participating teams: how they qualified, what is special about them, and what and who to look out for at the tournament itself.

Digital first

Vertical, mobile-friendly content will be provided by content creators in every host city using exclusive venue and behind-the-scenes access – such as arrivals, players on the pitch, goals, post-match moments, players with family, etc. Content destined for FIFA-approved platforms is produced in aspect ratios including 1:1, 4:5, 9:16 and 16:9.
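Delivering the same pictures in 1:1, 4:5, 9:16 and 16:9 is, at its simplest, a centre-crop calculation against the source frame. A small sketch of that geometry (illustrative; actual reframing is a creative, shot-by-shot decision):

```python
def centre_crop(src_w, src_h, ratio_w, ratio_h):
    """Largest centred window of the target aspect ratio inside a source
    frame. Returns (x, y, width, height) of the crop."""
    target = ratio_w / ratio_h
    if src_w / src_h > target:            # source too wide: trim the sides
        w, h = int(src_h * target), src_h
    else:                                 # source too tall: trim top/bottom
        w, h = src_w, int(src_w / target)
    return (src_w - w) // 2, (src_h - h) // 2, w, h

# Crops of a 1920x1080 (16:9) source for each delivery ratio
for r in [(1, 1), (4, 5), (9, 16), (16, 9)]:
    print(r, centre_crop(1920, 1080, *r))
# (1, 1)  -> (420, 0, 1080, 1080)
# (4, 5)  -> (528, 0, 864, 1080)
# (9, 16) -> (656, 0, 607, 1080)
# (16, 9) -> (0, 0, 1920, 1080)
```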

Near-live clips will be provided within minutes (stadium sound, no editing), while edited clips such as ‘Digital First Highlights’ will also be provided. This operation will be crewed by eighteen Digital First content creators (two in each host city covering matchday and other team activities), as well as a senior producer and two video editors.

For the first time, FIFA will offer a digital production dedicated to TikTok that includes live streaming, clips, daily shows and content from selected influencers.

Datatainment

FIFA has developed a new streaming concept it calls entertainment with data, or datatainment. Using volumetric cameras, data will be collected from the players and fed into the coverage in the form of augmented graphics. Elements such as player tracking data (speed, distance, ball speed, etc.) will be translated into live data visualisation on dedicated match feeds, allowing fans to follow the action through heat maps, passing accuracy stats, head-to-head/ball possession comparisons and general player/team performance.
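Turning raw tracking captures into the speed and distance numbers behind those graphics is simple kinematics over successive positions. A minimal sketch, assuming pitch coordinates in metres sampled at a fixed rate (the sampling rate and pipeline here are assumptions, not FIFA’s specification):

```python
import math

def speed_and_distance(positions, hz=25):
    """positions: list of (x, y) pitch coordinates in metres, sampled at hz.
    Returns (total distance in metres, peak speed in km/h)."""
    dt = 1.0 / hz
    total, peak = 0.0, 0.0
    for (x0, y0), (x1, y1) in zip(positions, positions[1:]):
        step = math.hypot(x1 - x0, y1 - y0)   # metres moved this sample
        total += step
        peak = max(peak, step / dt * 3.6)     # m/s -> km/h
    return total, peak

# A short sprint along the touchline, 0.3 m per 25 Hz sample
track = [(x * 0.3, 1.0) for x in range(10)]
dist, top = speed_and_distance(track)
print(f"{dist:.1f} m covered, top speed {top:.1f} km/h")  # 2.7 m, 27.0 km/h
```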

Matches to be covered with this concept include all three USA group games (v. Vietnam, Netherlands and Portugal) as well as all matches from the quarter-finals onwards.

Kickoff

The 64 matches will be played at ten stadiums kicking off on 20 July 2023 at New Zealand’s Eden Park in Auckland/Tāmaki Makaurau. Stadium Australia in Sydney/Gadigal hosts the final a month later on 20 August.

The Lionesses begin their tournament against Haiti at Lang Park, Brisbane on 22 July.

 



Ted Chiang: Who Has the Power to Determine AI’s Impact?

NAB 

Esteemed sci-fi author Ted Chiang says that we should reframe debate about AI as one about the ethics of labor exploitation.

article here

Rather than think of AI as some nuclear level threat to humanity he proposes we think of it as a management consultancy, albeit a faceless bureaucratic entity in hock to capital.

On the plus side, this means we do have it within our power to control and shape its impact on society and the workforce in particular.

On the debit side, change demands that executives of already powerful tech companies take responsibility for guiding the ethical future of AI and by extension humanity.

That’s the really scary thought: That we’re reliant on Zuckerberg and Musk and the titans at Microsoft, Apple, Amazon and Google to take the right decisions that are not purely to rack up their profit.

Chiang explains his argument in an essay for The New Yorker, “Will AI Become the New McKinsey?“

“I would like to propose another metaphor for the risks of AI [and] suggest that we think about AI as a management consulting firm, along the lines of McKinsey & Company.

 “Just as AI promises to offer managers a cheap replacement for human workers, so McKinsey and similar firms helped normalize the practice of mass layoffs as a way of increasing stock prices and executive compensation, contributing to the destruction of the middle class in America. Even in its current rudimentary form, AI has become a way for a company to evade responsibility by saying that it’s just doing what ‘the algorithm’ says, even though it was the company that commissioned the algorithm in the first place.”

He says we should ask: how do we prevent AI from assisting corporations in ways that make people’s lives worse?

“It will always be possible to build AI that pursues shareholder value above all else, and most companies will prefer to use that AI instead of one constrained by your principles. Is there a way for AI to do something other than sharpen the knife blade of capitalism?”

When Chiang refers to capitalism he is specifically criticizing “the ever-growing concentration of wealth among an ever-smaller number of people, which may or may not be an intrinsic property of capitalism but which absolutely characterizes capitalism as it is practiced today.”

He says, “If we cannot come up with ways for AI to reduce the concentration of wealth, then I’d say it’s hard to argue that AI is a neutral technology, let alone a beneficial one.”

“By building AI to do jobs previously performed by people, AI researchers are increasing the concentration of wealth to such extreme levels that the only way to avoid societal collapse is for the government to step in.”

He says, “The doomsday scenario is not a manufacturing AI transforming the entire planet into paper clips, as one famous thought experiment has imagined. It’s AI-supercharged corporations destroying the environment and the working class in their pursuit of shareholder value. Capitalism is the machine that will do whatever it takes to prevent us from turning it off, and the most successful weapon in its arsenal has been its campaign to prevent us from considering any alternatives.”

If AI is as powerful a tool as its proponents claim, they should be able to find other uses for it besides intensifying the ruthlessness of capital, he argues.

But the “they” here need to do a lot of work. With power comes great responsibility, he argues.

“The tendency to think of AI as a magical problem solver is indicative of a desire to avoid the hard work that building a better world requires. That hard work will involve things like addressing wealth inequality and taming capitalism.

“For technologists, the hardest work of all — the task that they most want to avoid — will be questioning the assumption that more technology is always better, and the belief that they can continue with business as usual and everything will simply work itself out.”

Interviewed by Alan Berner at Vanity Fair, Chiang says no software that anyone has built is smarter than humans. What we have created, he says, are vast systems of control.

“Our entire economy is this kind of engine that we can’t really stop. It probably is possible to get off, but we have to recognize that we are all on this treadmill of our own making, and then we have to agree that we all want to get off. We are only building more tools that strengthen and reinforce that system.”


Wednesday 19 July 2023

“Oppenheimer” and Technology’s Ethical Consequences

NAB

Beware of what we create might be the message from Oppenheimer, on the face of it a film about the invention of the atomic bomb but with obvious parallels today.

article here

Director Christopher Nolan might have had the nascent Cold War in mind when he began the project, but since then the Russian invasion of Ukraine and the rise of AI have given his film added resonance.

“When I talk to the leading researchers in the field of AI right now, they literally refer to this right now as their Oppenheimer moment,” Nolan said on a panel of physicists promoting the new feature, moderated by Chuck Todd [21:02]. “They’re looking to his story to say, okay, what are the responsibilities for scientists developing new technologies that may have unintended consequences?

“I’m not saying that Oppenheimer’s story offers any easy answers to those questions, but at least it can serve as a cautionary tale.”

Nolan explains that Oppenheimer is an attempt to understand what it must have been like for those few people in charge to have developed such extraordinary power and then to realize ultimately what they had done. The film does not pretend to offer any easy answers.

“I mean, the reality is, as a filmmaker, I don’t have to offer the answers. I just get to ask the most interesting questions. But I do think there’s tremendous value in that if it can resonate with the audience.”

Asked by Todd what he hoped Silicon Valley might learn from the film, Nolan replied, “I think what I would want them to take away is the concept of accountability. When you innovate through technology, you have to make sure there is accountability.

“The rise of companies over the last 15 years bandying about words like ‘algorithm’, not knowing what they mean in any kind of meaningful, mathematical sense – they just don’t want to take responsibility for what that algorithm does.”

There has to be accountability, he emphasized. “We have to hold people accountable for what they do with the tools that they have.”

Nolan was making comparisons between nuclear Armageddon and AI’s potential for species extinction, but he is not alone in calling on big tech to place the needs of society above its own greed.

In an essay for Harvard Business Review, Reid Blackman asks how we can avoid the ethical nightmares of emerging tech including blockchain, robotics, gene editing and VR.

“While generative AI has our attention right now, other technologies coming down the pike promise to be just as disruptive. Augmented and virtual reality, and too many others, have the potential to reshape the world for good or ill.”

Ethical nightmares include discrimination against tens of thousands of people; tricking people into giving up all their money; misrepresenting truth to distort democracy or systematically violating people’s privacy. The environmental cost of the massive computing power required for data-driven tech is among countless other use-case-specific risks.

Blackman has some suggestions as to how to approach these dilemmas – but it is up to the tech firms that develop the technologies to address them.

“How do we develop, apply, and monitor them in ways that avoid worst-case scenarios? How do we design and deploy [tech] in a way that keeps people safe?”

It is not technologists, data scientists, engineers, coders, or mathematicians that need to take heed, but the business leaders who are ultimately responsible for this work, he says.

“Leaders need to articulate their worst-case scenarios — their ethical nightmares — and explain how they will prevent them.”

Blackman examines a few emerging tech nightmares. Quantum computers, for example, “throw gasoline on a problem we see in machine learning: the problem of unexplainable, or black box, AI.

“Essentially, in many cases, we don’t know why an AI tool makes the predictions that it does. Quantum computing makes black box models truly impenetrable.”

Today, data scientists can offer explanations of an AI’s outputs that are simplified representations of what’s actually going on. But at some point, simplification becomes distortion. And because quantum computers can process trillions of data points, boiling that process down to an explanation we can understand – while retaining confidence that the explanation is more or less true – “becomes vanishingly difficult,” Blackman says.

“That leads to a litany of ethical questions: Under what conditions can we trust the outputs of a (quantum) black box model? What do we do if the system appears to be broken or is acting very strangely? Do we acquiesce to the inscrutable outputs of the machine that has proven reliable previously?”

What about an inscrutable or unaccountable blockchain? Having all of our data and money tracked on an immutable digital record is being advocated as a good thing. But what if it is not?

“Just like any other kind of management, the quality of a blockchain’s governance depends on answering a string of important questions. For example: What data belongs on the blockchain, and what doesn’t? Who decides what goes on? Who monitors? What’s the protocol if an error is found in the code of the blockchain? How are voting rights and power distributed?”

Bottom line: Bad governance in blockchain can lead to nightmare scenarios, like people losing their savings, having information about themselves disclosed against their wills, or false information loaded onto people’s asset pages that enables deception and fraud.

Ok, we get the picture. Tech out of control is bad. We should be putting pressure on the leaders of the largest tech companies to answer some hard (ethical) questions, such as:

Is using a black box model acceptable?

Is the chatbot engaging in ethically unacceptable manipulation of users?

Is the governance of this blockchain fair, reasonable, and robust?

Is this AR content appropriate for the intended audience?

Is this our organization’s responsibility or is it the user’s or the government’s?

Might this erode confidence in democracy when used or abused at scale?

Is this inhumane?

Blackman insists: “These aren’t technical questions – they’re ethical, qualitative ones. They are exactly the kinds of problems that business leaders – guided by relevant subject matter experts – are charged with answering.”

It’s understandable that leaders might find this task daunting, but there’s no question that they’re the ones responsible, he argues. Most employees and consumers want organizations to have a digital ethical risk strategy.

“Leaders need to understand that developing a digital ethical risk strategy is well within their capabilities. Management should not shy away.”

But what or who is going to force them to do this? Boiling it down: do you trust Elon Musk or Mark Zuckerberg, Jeff Bezos or the less well-known chief execs at Microsoft, Google, OpenAI, Nvidia and Apple – let alone those developing similar tech in China or Russia – to do the right thing by us all?


Behind the Scenes: Asteroid City

IBC

For a film as singularly American as Asteroid City, Wes Anderson chose a surprising base for his latest production: Spain.

 

 article here

Wes Anderson’s previous movie The French Dispatch was filmed on location in the small French town of Angoulême. For Asteroid City, he set up another bubble for cast, crew and production design, filming for 35 days during 2021 at Chinchón, 50km south-east of Madrid.

 

Other locations, including Death Valley, were scouted, but the desert environs of Chinchón provided unobstructed views for hundreds of yards in all directions, along with the natural light required and room to build large sets on an area the size of a football field.

 

“With the opening pan, you see in every direction,” explains producer Jeremy Dawson. “The car chase went right down the road, almost a kilometre long. Experientially, we wanted that feeling that you're actually in Asteroid City. You saw the set everywhere.”

 

Longtime Anderson collaborator Adam Stockhausen (a production design Oscar winner for The Grand Budapest Hotel) created the buildings and interiors, including the luncheonette, garage and motel. The mountains, boulders and rocks were all constructed too, to such a scale that barely any green screen was used.

 

“We naturally made use of forced perspective,” Stockhausen explains. “The town becomes desert and heads endlessly to the horizon, and it is impossible to tell where one ends and the other begins, which achieves a hyper-reality. When you look off in the distance and see the ramp of the highway and the mountains, they’re pieces of scenery well over 1,000 feet away. Some are five, six stories tall.”

 

Even the sections of the film taking place in New York – essentially anything that appears in black and white – were shot in Spain.

 

“In each town near Chinchón, there is a tiny, little theatre,” Stockhausen reports. “We took those as locations, and all of the backstage shots (the scene introducing the actors) were all set up there. The opening broadcast stage is one of those theatres with everything ripped out. The control booth that the camera pushes through is bolted onto the balcony as a little constructed item.”


The film is a paean to 1950s Americana set against the backdrop of the Cold War, the Hollywood era of Marilyn Monroe and of futurism. It could make an interesting counterpoint to Christopher Nolan’s forthcoming dramatisation of the atomic bomb tests in New Mexico in Oppenheimer.

 

Perhaps Asteroid City’s most obvious cinematic reference is to Steven Spielberg’s Close Encounters of the Third Kind (1977) in which an alien landing becomes a mass viewing spectacle. There’s even a rock formation that resembles The Devil’s Tower from Spielberg’s classic.

 

Jordan Peele used Close Encounters as the jumping off point for his myth-busting satire Nope released last year.


For Anderson and Stockhausen, a major inspiration for the look of the landscape (as well as the town) was Bad Day at Black Rock, the 1955 film directed by John Sturges and starring Spencer Tracy. Shot around Death Valley and the Mojave Desert, the film provided real topography that Stockhausen then worked to duplicate with sculptors and painters.

 

Other key design inspirations included Billy Wilder’s Ace in the Hole (1951) in which a small carnival and caravan of people spring up in a desert outpost, much like in Asteroid City, after the alien has landed. Wilder’s Kiss Me, Stupid (1964) in which the action is focused around a very real gas station, surrounded by studio backlot artificiality, was another influence.

 

The alien spaceship was fabricated as a miniature rather than CGI, and the alien is a three-foot-high stop-motion puppet, animated by Kim Keukeleire and based on a performance by Jeff Goldblum acting in an alien costume.

 

As befits the 1950s period look, Asteroid City is shot on 35mm, the eleventh film that cinematographer Robert Yeoman ASC has shot for Anderson on negative.

 

He loaded ARRICAM cameras with the same combination of film stocks he had deployed on The French Dispatch – Kodak Vision3 200T colour and Double-X Black 5222 for the B&W scenes – and lensed with Cooke S4s and ARRI Master Anamorphics.

 

Film grain was an important element for this production, particularly for the B&W scenes, Yeoman explained in an article for Kodak.  “Wes and I fell in love with the look of the Double X B&W stock on The French Dispatch, as it has a superb scale of tonal contrast and grain. Also, on this film we were essentially shooting in a desert with the sun overhead and a lot of contrast. I was concerned about the highlights burning, but I knew that both film stocks would hold detail in the image.”

 

Rushes were processed at the Hiventy lab in Paris, which then delivered 4K scans of dailies to colourist Doychin Margoevski at Company 3 in London. The final grade was done at CO3 by colourist Gareth Spensley.

 

For the lighting, Yeoman took cues from Bad Day at Black Rock and Wim Wenders’ Paris, Texas photographed by Robby Müller.

 

“They were not afraid to shoot in the harsh midday sun in the desert and actively used that as an expressive element in their stories,” Yeoman remarks. “We visited every location [in Spain] during prep when we talked extensively about the shots. Wes pushed me to embrace frontal and overhead sunlight, and the town exteriors were all shot in natural light.”

 

Skylights were built into the sets of buildings so they could continue to use daylight for interiors, with no traditional movie lights.  Frank Capra’s classic It Happened One Night was used as a reference for the motor court, even down to the shadows cast through the overhead lattice work during the film’s outdoor picnic scene.

 

They covered the skylights of set buildings with a full grid to give a soft, even light, which allowed the actors to move around with no lighting adjustments. It also meant that the interior and exterior shots were balanced. Yeoman added practical lights in the background to scenes set at dusk to give a 'pop' to the image.

 

“There is nothing like hearing the purr of film running through the gate,” the DoP adds. “Everyone on set pays more attention to what they need to do when shooting film compared to digital. Wes does not have a video village, and with a very small crew on set, that means we can move quickly between set-ups. Ultimately, shooting on film creates a more intimate atmosphere for the actors and there are a lot fewer distractions for them."