Monday 29 April 2024

We Need Copyright Laws for AI in Hollywood, But There Are… Issues

NAB

article here

The legal battle between developers of generative AI tools and creators, artists and publishers is often viewed as a zero-sum game: one that will either devastate the business and livelihood of the latter or gut the bottom line of the former.

But the outcome will be more complex according to Paul Sweeting, co-founder of the RightsTech Project and founder & principal of Concurrent Media Strategies.

In a primer on the subject at Variety, he explains that, despite at least 16 high-profile legal cases in the US, the courts are likely to struggle to find precedents that clearly apply.

Defense lawyers for OpenAI/Microsoft and Stability AI, defending respective copyright infringement suits brought by The New York Times and Getty Images, will claim fair use — that the training process is transformative of the input and therefore not infringing under prevailing legal precedents.

As Sweeting explains, the amount of data used to train the largest AI models is in the order of tens of billions of images (or hundreds of billions of words). And what the system actually retains from its training data is not the words or images themselves, but the numeric values assigned to their constituent parts and the statistical relationships among them.
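
To make that concrete, here is a deliberately tiny Python sketch, not any vendor’s actual training pipeline, of what “numeric values and statistical relationships” can mean in practice. The corpus, vocabulary and vector sizes below are invented for illustration; the point is that what gets stored is a table of numbers per token ID plus co-occurrence statistics, not the original text.

```python
# Toy illustration only -- not how any production LLM is actually trained.
import random
from collections import defaultdict

corpus = "the cat sat on the mat the cat saw the moon"   # stand-in training text
tokens = corpus.split()

# 1. Assign each distinct word an integer ID (a stand-in for tokenisation).
vocab = {word: idx for idx, word in enumerate(dict.fromkeys(tokens))}
token_ids = [vocab[w] for w in tokens]

# 2. The stored artifact: a small vector of numbers per token ID ("weights"),
#    not the words themselves.
random.seed(0)
embeddings = {idx: [random.uniform(-1, 1) for _ in range(4)] for idx in vocab.values()}

# 3. Statistical relationships: how often one token follows another.
bigram_counts = defaultdict(int)
for a, b in zip(token_ids, token_ids[1:]):
    bigram_counts[(a, b)] += 1

print(vocab)                       # word -> ID
print(embeddings[vocab["cat"]])    # numbers, not text
print(dict(bigram_counts))         # learned co-occurrence statistics
```

Whether retaining those numbers amounts to reproduction of the underlying works is precisely the technical-legal question Sweeting raises.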

It’s complex.

“Whether that process constitutes actual reproduction of the works in the training data, as plaintiffs have claimed, is as much a technical question as it is a legal one,” he says.

Pamela Samuelson, a professor of law and information at UC Berkeley, tells Sweeting that the biggest challenge plaintiffs in those 16 cases face will be establishing actual — as opposed to speculative or potential — harm from the use of their works to train AI models, even if they can establish that their works were copied in the process of that training.

She still rates the NYT and Getty Images cases as most likely to succeed or compel the defendants to settle because both companies had well-established licensing businesses in the content at issue that pre-date the rise of generative AI.

Meanwhile in Europe, the EU’s AI Act will require developers of large AI systems to provide “sufficiently detailed summaries” of the data used to train their models.

This sounds like good news. Surely we should all want to rein in the march of AI in order to compensate the human creators whose work has helped to build AI tools, now and in the future?

However, some artists are concerned that the balance will be tipped too far, or that any new legislation will not be sufficiently nuanced to allow artists who have used AI to legitimately copyright their work.

The US Copyright Office has a long-standing policy that copyright protection is reserved for works created by human authors. It treats the purely human elements of a work, the purely AI-generated elements, and the AI-assisted human elements as distinct from one another.

Hollywood is similarly concerned about the extent to which narrow interpretations of copyright will throttle the use of AI in production and post-production.

For its part, the Copyright Office is about to publish the first in a series of reports on AI, with recommendations to Congress for any changes to copyright law.

The first such report will address issues around deepfakes. Others will cover the use of copyrighted works in training AI models, and the copyrightability of works created using AI.

Sweeting says there is “broad agreement” that the Copyright Office’s current policy is “unworkable, because the volume of mixed works will quickly overwhelm the system, and because the lines will keep shifting.”

In the absence of those updates or new legal precedents, the picture for working with and training AI remains murky.

Who’s Going to Intervene When It’s Creators Vs. Big Tech’s AI?

NAB

article here

Is the tide turning on the ability to amass art from the internet to train generative AI… without compensating creators? A legal pincer movement from Europe and the US could soon regulate access to copyrighted material and few people in Hollywood or in wider society would shed a tear.

For those who think building generative AI products on other people’s work is wrong, the Executive Order on the safe, secure and trustworthy use of AI already comes too late.

The target of their ire is Gen AI billion-dollar market leader OpenAI, whose video generator Sora, revealed earlier this year, laid bare the potential of the technology to auto-create photoreal content.

Although OpenAI refuses to admit it — to the increasing frustration of media commentators — The New York Times demonstrated that OpenAI has in fact trained its ChatGPT large language model on more than one million hours of YouTube videos, all without payment or consent.

"Why should OpenAI — or any other LLM — be able to feed off the works of others in order to build its value as a tool (or whatever you call generative AI)?” argues IP lawyer-turned-media pundit Pete Csathy in The Wrap. “And even more pointedly, where are the creators in this equation?”

The core argument is that generative AI would not work, nor be a product, without being trained on content, and that the artists and creators of those creative works should be compensated.

OpenAI and other AI companies contend that their models do not infringe copyright because they transform the original work, therefore qualifying as fair use.

“Fair use” is a doctrine in the US that allows for limited use of copyrighted data without the need to acquire permission from the copyright holder.

A tighter definition of fair use is what the Generative AI Copyright Disclosure Act is designed to achieve on behalf of creators. Following the EU’s own historic legislation on the subject, the act introduced last week would require anyone that uses a data set for AI training to send the US Copyright Office a notice that includes “a sufficiently detailed summary of any copyrighted works used.”

Essentially, this is a call for “ethically sourced” AI and for transparency so that consumers can make their own choices, says Csathy, who argues that “trust and safety” should logically apply here too.

“To infringe, or not to infringe (because it’s fair use)? That is the question — and it’s a question winding through the federal courts right now that will ultimately find its way to the US Supreme Court.”

And when it does, Csathy’s prediction is that ultimately artists will be protected. He thinks that the Supreme Court will reject Big Tech’s efforts to train their LLMs on copyrighted content without consent or compensation, “properly finding that AI’s raison d’etre in those circumstances is to build new systems to compete directly with creators — in other words, market substitution.”

As Csathy puts it, simply because something is “‘publicly available’ doesn’t mean that you can take it. It’s both morally and legally wrong.”

Few people, and certainly not Csathy, go so far as to want to ban GenAI development, or to deny that there might be instances where “fair use” is appropriate. What they want is for OpenAI to fess up and be honest, trustworthy, and transparent about the source of its training wheels.

Behind the Scenes: Civil War

IBC

article here

In Alex Garland’s action thriller, cameras are a weapon of truth


There’s a scene in Oliver Stone’s 1986 movie Salvador, about El Salvador’s chaotic civil war, in which a photojournalist played by John Savage is killed in the heroic attempt to capture the money shot, or proof, of military bombs falling on the civilian population.

The heroic nature of photojournalists and the wider importance of upholding the journalistic quest for truth is writer-director Alex Garland’s mission in Civil War - although the lines are blurred. The film’s hero, a veteran war photographer, is among a press pack dreaming of the ultimate money shot: the capture or execution of the US President.

Garland has said he intentionally wanted to embody the film’s action through the grammar of images that people may have seen on the news. This grammar is less cinematic and more documentary-like, a tactic also used by Stone on Salvador and by filmmakers Roland Joffé and Chris Menges on The Killing Fields, another film about war correspondents fighting for truth and justice.

The cinematography reflects the vérité feel of actual combat, eschewing the clean camerawork that Garland and regular DP Rob Hardy ASC BSC have used on previous films like Annihilation.

While the main camera used is the Sony Venice, they shot a lot of the action scenes using the DJI Ronin 4D, a relatively inexpensive camera costing around £6,000.

“I wanted something truthful in the camera behaviour, that would not over-stylise the war imagery,” explains Garland in a feature he wrote for Empire. “All of which push you towards handheld. But we didn’t want it to feel too handheld, because the movie needed at times a dreamlike or lyrical quality.”

“That more handheld look when it comes to combat stuff [is] in my mind the way I view things,” comments Ray Mendoza, the film’s military advisor. “Watching these handhelds — it’s more visceral.”

The cautionary fable takes place in a near-future America that has split into multiple factions embroiled in a civil war. The Western Forces, an armed alliance of states rebelling against the federal government, is days away from pushing the capital to surrender. In the hope of getting a final interview with the president (Nick Offerman), Lee (Kirsten Dunst), a veteran combat photographer, travels 857 miles across the country to the White House with an aspiring photographer named Jessie (Cailee Spaeny).

Garland chose to shoot the $50 million movie chronologically, in part to capture something more truthful in the actors’ performances. The schedule dictated that they shoot quickly, and move the camera quickly, which also lent itself to a more maneuverable camera. Very few shots in the film use tracks and dollies. The crew also mounted eight small cameras to Lee and Jessie’s press van.

“It does something incredibly useful,” Garland writes of the DJI Ronin 4D. “It self-stabilises, to a level that you control — from silky-smooth to vérité shaky-cam. To me, that is revolutionary in the same way that Steadicam was once revolutionary. It’s a beautiful tool. Not right for every movie, but uniquely right for some.”
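
For readers curious what “self-stabilises, to a level that you control” can mean in signal terms, here is a purely illustrative Python sketch, not DJI’s actual algorithm: stabilisation treated as a smoothing filter over camera motion, with a single strength parameter sliding between raw handheld jitter and a near-locked-off move. The pan values and strengths below are invented.

```python
# Illustrative sketch only -- not DJI's stabilisation algorithm.
import random

random.seed(1)
# A jittery handheld pan, in degrees: an intended 0.5 deg/frame move plus noise.
raw_pan = [i * 0.5 + random.uniform(-2.0, 2.0) for i in range(10)]

def stabilise(samples, strength):
    """strength = 0.0 keeps the handheld jitter; closer to 1.0 approaches a smooth, locked-off move."""
    smoothed, current = [], samples[0]
    for s in samples:
        current = strength * current + (1.0 - strength) * s   # exponential smoothing
        smoothed.append(round(current, 2))
    return smoothed

print("raw handheld:", [round(s, 2) for s in raw_pan])
print("verite-ish:  ", stabilise(raw_pan, 0.3))
print("silky-smooth:", stabilise(raw_pan, 0.9))
```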

The point about a combat photographer is that they have to put themselves in a position where they can see the thing that is happening, otherwise they can't take the photo.

The small size and integrated self-stabilisation of the DJI Ronin 4D meant that “the camera behaves weirdly like the human head,” Garland adds. “It sees ‘like’ us. That gave Rob and I the ability to capture action, combat, and drama in a way that, when needed, gave an extra quality of being there.” 

While the camera is not certified as an IMAX camera, Civil War (like The Creator) is presented for IMAX screens because it used IMAX post-production tools and a sound design suitable for the giant format.

Garland provokes by transplanting the images, tools, and euphemisms of modern war — airstrikes, civilian targets, collateral damage — onto American soil. Familiar and iconic images, from the streets of New York to the nation’s capital, are radically recontextualised, like the eerily empty streets of London in Garland’s screenplay for the 2002 zombie film 28 Days Later.

As the son of political cartoonist [Nicholas Garland], Garland grew up around journalists. Lee and Jessie, whose last name in the film is Cullen, are named after two war photographers whose work Garland admires: Lee Miller and Don McCullin.

Iconic images from the Vietnam War, of a young girl who had been burned by napalm, of a Buddhist monk who set himself on fire and of the execution of a VC soldier [in a Pulitzer Prize-winning shot by Eddie Adams], “became reasons why journalism did have an effect and changed the public mood,” Garland said after the film’s premiere at SXSW.

“That's partly why photojournalists are at the heart of this film,” he said. “Often modern journalism of that sort is videoed, rather than stills. But journalism can be fantastically powerful, provided that it's being listened to. And one of the really interesting things about the state that the U.S and the U.K and many others are in right now is that the warnings are all out there on all sides of the political divide, but for some reason they don't get any traction.”

“Is it just that we're not able to absorb information because of the position we already hold?”

Hence, he decided to take such polarization out of Civil War, to the point of refusing to engage with how it started, and instead tries to find points of agreement. It is “de-politicized for a political reason.”

Modern Warfare

It is exceptionally difficult, however, to make a war movie that is, in fact, anti-war.

“War movies find it very, very difficult to not sensationalise violence,” Garland says in A24’s production notes. “Most of the anti-war movies in a way are not really anti-war movies. They have so much to do with camaraderie and courage. It's not that they are trying to be romantic, but they just become romantic. They sort of can't help it because courage is romantic and tragedy in a way is romantic.”

He points to films like Stanley Kubrick’s Paths of Glory (1957) or the harrowing Soviet war epic Come and See (1985) as rare exceptions.

So, in Civil War, when characters are shot, they don't have squibs on them spouting fountains of blood. You don't see big blood splatters up the wall behind them. They just fall down. Blood then leaks across the ground if they've been lying there for the right amount of time.

“There's nothing really glamorous about a mass grave,” he said. “There's nothing really romantic about it.”

Similarly, they deployed blanks for gunfire rather than relying purely on audio FX. These make a loud noise, like a .50 calibre gun, that people react to instinctively by flinching.

The film’s explosive denouement, featuring a siege of the capital, had to have each beat choreographed to be as tactically authentic as possible. Filmed on soundstages in Atlanta, it involved 50 stunt performers, cars, tanks, explosions and gunfire. The aim was to put the audience in the middle of the battleground, surrounded by chaos.

“We'd have a map of the area sketched out, and we would be drawing arrows and drawing little cones over where a camera was positioned,” Garland explains. “You could put together quite sophisticated choreography: this tank will move here, as this Humvee drives forward fast towards the other Humvees, and as it passes, that's when these soldiers will move down. We would just run that choreography again and again and again.”

He gave Mendoza free rein to choreograph the sequence, so long as nothing was embellished. “I hired a lot of veterans, and it's great to see them move through it, get into the scene of it,” Mendoza says. “It's pretty accurate just even from the dialogue, to the mood, and a lot of the gun fighting.”


Saturday 27 April 2024

“Civil War:” The Camerawork to Capture the Chaos

NAB


Perhaps only an outsider could update the American Civil War of the 1860s and imagine what would happen if similar events tore apart the United States today.

British writer-director Alex Garland didn’t have to look far for inspiration: The January 6, 2021 mob attack on the Capitol was a vivid insurrection filmed live on TV in broad daylight. While these events are a thinly disguised template for the finale of his film Civil War, Garland seems less interested in apportioning blame to the political right or left than in asking why we might end up there again.

You could see similar events play out in Britain or any other country, he told an audience at SXSW after the film’s premiere. “Any country can disintegrate into civil war whether there are guns floating around the country or not,” he suggested, adding that “civil wars have been carried out with machetes and still managed to kill a million people.”

“I’ve known a lot of war correspondents because I grew up with them,” Garland said in the same on-stage interview. “My dad worked [as a political cartoonist] on the Daily Telegraph. So I was familiar with them.”

Garland showed cast and crew the documentary Under The Wire, about war correspondent Marie Colvin, who was killed in Syria. His lead characters are news and war photographers played by Kirsten Dunst and Cailee Spaeny, whose characters’ names echo those of acclaimed photojournalists Don McCullin and Lee Miller. Murray Close, who took the jarringly moving photographs that appear in the film, studied the works of war photographers.

“There are at least two [types of war photographer],” said Garland. “One of them is very serious minded, often incredibly courageous, very, very clear eyed about the role of journalism. Other people who have served like veterans are having to deal with very deep levels of disturbance (with PTSD) and constantly questioning themselves about why they do this. Both [types] are being absorbed and repelled at the same time.”

He represents both types in the film. While it is important to get to the truth — in this case, the money shot of the execution of the US President — he questions if that goal should take priority over everything else they come across in their path. At what point, Garland asks, should the journalist stop being a witness and start being a participant?

“Honestly, it’s a nuanced question, nuanced answer,” he said. “I can’t say what is right or wrong. There’s been an argument for a long time about news footage. If a terrible event happens, how much do you show of dead bodies? Or pieces of bodies? Does that make people refuse to accept the news because they don’t want to access those images? Or worse, does it make them desensitized to those kinds of images? It’s a tricky balance to get right.”

In this particular case, one of the agendas was to make an anti-war movie if possible. He refers to Leni Riefenstahl’s controversial 1935 film Triumph of the Will, which is essentially Nazi propaganda.

Garland didn’t want to accidentally make Triumph of the Will, he said, by making war seem kind of glamorous and fun. “It’s something movies can do quite easily,” he said. “I thought about it very hard and in the end, I thought being unblinking about some of the horrors of war was the correct thing to do. Now, whether I was correct or not, in that, that’s sort of not for me to judge but I thought about it.”

Garland establishes the chaos early, as Dunst’s character covers a mob scene where civilians reduced to refugees in their own country clamor for water. Suddenly, a woman runs in waving an American flag, a backpack full of explosives strapped to her chest.

“Like the coffee-shop explosion in Alfonso Cuarón’s Children of Men, the vérité-style blast puts us on edge — though the wider world might never witness it, were it not for Lee, who picks up her camera and starts documenting the carnage,” writes Peter Debruge in his review for Variety.

To achieve the visceral tone of the action, Garland decided to shoot largely chronologically as the hero photographers attempt to cross the war lines from California to the White House.

After two weeks of rehearsals to talk through motivations and scenes and characters, Garland and DP Rob Hardy then worked to figure out how they were going to shoot it. He wanted the drama to be orchestrated by the actors, he told SXSW. “The micro dramas, the little beats you’re seeing in the background, are part of how the cast have inhabited the space.”

Spaeny offers insight into Garland’s “immersive” filming technique in the film’s production notes. “The way that Alex shot it was really intelligent, because he didn’t do it in a traditional style,” she says. “The cameras were almost invisible to us. It felt immersive and incredibly real. It was chilling.”

A featurette for the movie sheds light on Garland’s unconventional filming style, in which he describes Civil War as “a war film in the Apocalypse Now mode.”

While the A-camera was a Sony VENICE, they made extensive use of the DJI Ronin 4D-6K, which gave the filmmakers a human-eye perception of the action in a way that traditional tracks, dollies and cranes could not. They also bolted eight small cameras to the protagonists’ press van.

Speaking to Matthew Jacobs at Time magazine, Spaeny likened the road scenes to a play, though, unlike theater or even a typical movie shoot, Civil War changed locations every few days as the characters’ trek progressed, introducing constant logistical puzzles for the producers and craftspeople to solve.

Dunst’s husband Jesse Plemons makes a brief appearance in the film, but commands the scene as a menacingly inscrutable soldier carrying a rifle and wearing a distinct pair of red sunglasses.

“I can imagine that people might read some kind of strange bit of coding into Jesse Plemons’s red glasses,” Garland says in A24’s notes. “Actually, that was just Jesse saying, I sort of think this guy should wear shades or glasses. And he went out himself and he bought six pairs, and we sat there as he tried them on, and when he got to the red ones, it just felt like, yep, them.”

 


With “Cowgirls on the Moon,” the Workflow’s in the Cloud

NAB

article here

It began as a joke, but putting cowgirls on the moon is a serious attempt to showcase what is possible by using a raft of new filmmaking technologies from virtual production and cloud rendering to generative AI.

Unveiled and demonstrated at NAB Show, the faux movie trailer for Cowgirls on the Moon is a goofy but high-concept challenge led by AWS that conforms to elements of MovieLabs 2030 Vision.

“It started off as a joke,” Katrina King, global strategy leader for content production at AWS, explained. “Let’s do something ridiculously out there that’s really going to force us to lean into modern cloud computing and generative AI. And I said something like ‘cowgirls on the moon.’ It was a joke, but nobody came up with a better idea so that’s what we went with.”

The aim was to demonstrate the power of three technologies working in tandem: generative AI-assisted virtual production, cloud rendering and VFX with the use of AWS Deadline, and holistic production in the cloud.

“At AWS, we believe very strongly in the responsible use of generative AI,” King continued. “So we used applications that allow artists to work more efficiently and to offload the mundane technical aspects.”

For instance, they used text-to-video generator Runway for concept art and storyboards, another AI tool for facial recognition, and an AI speech-enhancement tool included within Adobe Premiere. The latter completely rebuilt the dialogue track as if it had been recorded in an ADR session. “The amount of time that saved us by not having to go into an ADR session was incredible,” King said.

The principal generative AI tool was Cuebric, which was used to generate assets, import them into Unreal Engine, and automate certain technical aspects of the production. All of the backgrounds in the virtual production, animated in Unreal Engine, were generated using Cuebric.

 

“Once Cuebric exports it we have these different layers which are then presented in Unreal, so that as we move the camera and the camera tracks on the volume we get parallax,” King explained.
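
The parallax King describes comes from treating the exported layers as flat planes sitting at different depths: when the camera tracks across the LED volume, near planes shift across the frame more than distant ones. The Python sketch below is purely illustrative; the layer names, depths and focal length are assumptions for the example, not values from the production, and this is not Cuebric’s or Unreal Engine’s actual implementation.

```python
# Minimal 2.5D parallax sketch (illustrative only).
FOCAL_LENGTH_PX = 1000.0  # assumed pinhole focal length, in pixels

# Hypothetical layers exported as separate planes, with assumed depths in metres.
layers = {
    "foreground rocks": 5.0,
    "midground ridge": 25.0,
    "distant lunar hills": 200.0,
}

def screen_shift(camera_move_m: float, depth_m: float) -> float:
    """On-screen horizontal shift (pixels) of a plane at depth_m
    when the camera translates laterally by camera_move_m."""
    return FOCAL_LENGTH_PX * camera_move_m / depth_m

for name, depth in layers.items():
    print(f"{name:20s} shifts {screen_shift(0.5, depth):6.1f} px for a 0.5 m camera move")
```

Because the shift falls off with depth, even flat layered backgrounds read as having depth once the camera starts to move.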

Visual effects facility DNEG delivered 36 VFX shots for the production in just eight days. The whole project was essentially run as a full studio and render farm in the cloud.

Project producer Ron Ames talked up the benefits of the virtual production, such as the ability to swap out entire infrastructure in minutes and to collaborate across multiple locations.

“We first said, ‘We want these machines to be Linux.’ But then we changed our minds. ‘Now we want them to be Windows.’ Literally in minutes we had new machines up and running,” he said. “The ability to work quickly, to collaborate, to tear down walls. We had groups working in Vancouver, in London, LA, Boston, Idaho, Switzerland, Turkey, Tucson, the Netherlands [on the project, all linked to assets by cloud].”

Ames previously used extensive AWS workflows as producer of Amazon series The Lord of the Rings: The Rings of Power. He thinks other producers remain unconvinced about choosing to put their next project into the cloud.

“Petrified would be the word, not reluctant. The notion that we’ve done it before this way, or we have investments in a certain infrastructure, is one of the impediments to moving forward,” Ames said.

“On Rings of Power, the great good fortune we had was a team of producers, and AWS, supporting us to try new stuff, and if it doesn’t work, we’re not going to give up, we’re going to make it work and make it work in a way that actually has benefits. Once we saw the efficiencies, the creative possibilities, and truly the collaborative power of breaking down silos, walls, traditional ways we’ve been working, we realized that this had a great value.”

 


Why “Super Churners” Are Driving Change in the SVOD Model

NAB

article here

Cancelling streaming services is no longer niche or occasional. Churn has gone mainstream and premium SVODs are going to have to employ new tactics to compete for a share of the household wallet.

Finished watching The Bear? Ditch Hulu. Want to watch Fallout? Sign up for Amazon Prime Video. Time for Slow Horses Season 3? Then cancel Amazon (having already binged Fallout) and get Apple for at least a couple of weeks. More and more of us are doing this, partly because price inflation has exhausted the amount people are willing to spend on stacking SVODs, most of whose content they don’t actually watch.

It’s also because the SVOD system of no-contract, one-month viewing — so important in kick-starting the streaming business — makes it so easy to do.

According to data from Antenna, at the end of 2023 nearly one in four US streaming consumers qualified as serial churners: individuals who had canceled three or more premium SVOD services in the past two years. That’s an increase of 42% year-on-year.

Antenna even identifies a group of “super heavy serial churners,” those who make seven or more cancellations within the past two years, and found that they constituted 19% of subscribers in 2023.

More data: serial churners were responsible for 56.5 million cancellations in 2023, up a whopping +54.6% year-on-year, while cancellations by non-serial churners increased 18.5% to 82.8 million in the same period.
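
For clarity, the segmentation described above boils down to counting each subscriber’s cancellations over a trailing two-year window and bucketing the totals. The Python sketch below illustrates the thresholds quoted (three or more, seven or more) with made-up data; Antenna’s actual methodology is not detailed in the article.

```python
# Illustrative churn segmentation using the thresholds quoted above.
from collections import Counter
from datetime import date

# Hypothetical cancellation events: (subscriber_id, cancellation_date).
cancellations = [
    ("alice", date(2023, 2, 1)), ("alice", date(2023, 6, 1)), ("alice", date(2023, 11, 1)),
    ("bob", date(2023, 5, 1)),
]

window_start = date(2022, 1, 1)  # trailing two years, ending late 2023
counts = Counter(sub for sub, d in cancellations if d >= window_start)

def segment(n_cancellations: int) -> str:
    if n_cancellations >= 7:
        return "super heavy serial churner"
    if n_cancellations >= 3:
        return "serial churner"
    return "non-serial churner"

for subscriber, n in counts.items():
    print(subscriber, n, segment(n))   # e.g. alice 3 serial churner
```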

As the headline in The New York Times succinctly puts it, “Americans’ New TV Habit: Subscribe. Watch. Cancel. Repeat.”

While consumers value flexibility, the implications could be significant for the major media companies, especially as it seems likely this behavior will become even more common.

One option outlined by John Koblin in The New York Times is to bring back some element of the cable bundle by selling streaming services together.

Executives believe consumers would be less inclined to cancel a package that offered services from multiple companies. Disney, for instance, is bundling Disney+, Hulu and ESPN+ into one package and, later this year, will launch a sports streamer pooled with Fox and Warner Bros. Discovery.

Another tactic is to promote “coming soon” content prominently on the home page. For instance, Apple TV+ is teasing Dark Matter, a science-fiction series that comes out in its app in May.

Peacock sought to deter new subscribers from cancelling by promoting a special offer: a deal to sign up for a full year at a discount.

According to Antenna research, cancellation rates for those who did sign up did not drop off a cliff a month later, but instead were close to average.

Netflix appears immune, according to Antenna data. Or rather, it is the service most likely to be a fixture in household bundles, with every competitor part of a revolving carousel that consumers pick and mix according to the latest show to land.

Without a predictable revenue stream, it is harder for streamers to invest in new content, causing them to cut production and deliver fewer standout new releases to market, in a vicious cycle that will gather pace unless something changes.

 


Friday 26 April 2024

UEFA relegates UHD for Euro 2024 and Champions League Final

IBC

Sports broadcasters have given up on 4K UHD, for the time being at least, with the UEFA Champions League Final and this summer’s European Championship both being produced in 1080p HD HDR as the host production format.


article here

 

That’s a significant reversal of a more than decade-long trend of upping the ante on broadcast resolution with each successive major tournament.

 

The Champions League Final has been broadcast in UHD since 2015 and the Euros since 2016 (then 2021) but now there’s been a rethink.

 

Nor is UEFA an outlier. “There’s been a change in delivery format for a lot of major international tournaments and also domestic tournaments where broadcasters are now looking at 1080p HDR as the chosen one,” says Eamonn Curtin, Global Client Director for leading outside broadcast facilities provider EMG.

 

“It’s a benefit for us technically since we just have one signal to produce and manage rather than the four that made up the UHD signal.”

 

If UHD cameras were used as a source in a standard (non-IP) outside broadcast truck, the signal would typically be distributed as four links using 3G-SDI (Serial Digital Interface). This Quad Link 4K/UHD signal needed careful monitoring to ensure the four links stayed in sync when combined on output, which made the work of the OB technically more challenging.
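
A rough back-of-the-envelope comparison shows why that matters for an OB crew: quad-link UHD means four feeds of roughly 3 Gbps each that must stay frame-synchronised, against a single cable for 1080p. The Python sketch below uses nominal 3G-SDI line rates only; actual payload rates vary with frame rate and format.

```python
# Back-of-the-envelope comparison of quad-link UHD vs single-link 1080p over 3G-SDI.
SDI_3G_LINK_GBPS = 2.97   # nominal 3G-SDI line rate

uhd_links = 4   # 2160p split across four 3G-SDI links (quad link)
hd_links = 1    # 1080p (including HDR) fits on a single 3G-SDI link

print(f"UHD quad link: {uhd_links} cables, ~{uhd_links * SDI_3G_LINK_GBPS:.1f} Gbps total, "
      "all four must stay in sync")
print(f"1080p HDR:     {hd_links} cable,  ~{hd_links * SDI_3G_LINK_GBPS:.2f} Gbps")
```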

 

That’s not the reason there’s no UHD at either event; UHD could, after all, be produced with upscaling. The main reason is a lack of broadcaster and rights holder interest. Eight years after BT Sport became Europe’s first 4K broadcaster, few others have followed suit. The cost to buy kit and to refresh, rip out and rewire studios and OB vans, let alone buy satellite transponder space, is an expense few could justify when viewers were unwilling to pay a premium for the visual uplift.

 

More tellingly, the visual uplift in resolution wasn’t actually a huge leap when full HD High Dynamic Range came into the equation. Taste tests of HD HDR versus 4K suggested that viewers preferred the sharper contrast and detail in light and shade in the HD version.

 

Add to that BT Sport’s takeover by Warner Bros. Discovery, which closed in September 2022. BT Sport, like Sky Sports before it, had planted a flag as the most consistently innovative broadcaster of its generation. Where Sky Sports led on HD and on-screen graphical presentation and dabbled in stereo 3D, BT Sport pioneered 4K and Dolby Atmos and experimented with VR, with plans for 8K broadcasts thwarted by the pandemic.

 

WBD brand TNT Sports, which is covering the Champions League Final as host broadcaster from Wembley, is not averse to innovation, but it is choosing to put its firepower into other areas. These include virtual presentation studios of the kind we’ve seen at recent Olympics; its remote ‘holographic’-style interview studio, the ‘Cube’, which features at tennis tournaments; and richer data insights gleaned in real time from athletes and equipment, as showcased during the Tour de France.

 

Coverage produced for digital distribution online and social media is another highly significant area of production that WBD (through Eurosport) and rights holders like UEFA, FIFA and Wimbledon are putting more and more resources toward.

Paris 2024 will, however, be produced in UHD HDR and 5.1.4 immersive audio (alongside extensive social media and digital output) by the International Olympic Committee’s production company OBS, in part because of demand for the format in Japan and by NBCU in the U.S. It’s not clear if the BBC will offer a UHD channel as part of its summer coverage.

 

Ursula Romero, Executive Producer at International Sports Broadcasting (ISB), said at the 4K HDR Summit last November, “We are constantly questioning whether we should broadcast in 4K and HDR. It’s a perpetual question mark because there is a big gap between producers and consumers.”

 

Her reasoning was that younger audiences care about other things than resolution – or indeed watching on the main household TV. “Everything revolves around social networks and they are now looking to watch sports by the minute and by the second, even in vertical format.”

 

Yet Isidoro Moreno, head of engineering at OBS, said at the same event that Paris 2024 will consolidate 4K-HDR as the “top” TV standard for the next decade.

 

Most viewers won’t notice the lack of UHD coverage from either UEFA event. Indeed, the widespread use of cine-style cameras at the Euros will provide a cinematic quality to that production that viewers have not seen before.

 

These feeds will be cut live into the broadcast, in a trend that is being ramped up across all major sports. The shallow depth of field from the large-format sensors captures fan reactions, team tunnel entrances and team line-ups in a way that viewers have come to expect from the numerous glossy sports docs on streaming services.

 

For consistency, the bulk of the HD cameras used for the host coverage at the Euros are Sony HDC-3500s. There are in excess of 40 per venue, and they feature a global shutter to eliminate the ‘jello effect’ and banding noise. The 3500s are also capable of HDR as well as standard dynamic range; the HDR format used in the tournament is HLG.

 

Other than that the technical template and editorial formula for Euros coverage will be largely unchanged. “It’s a safe pair of hands,” is how Curtin puts it.

 

EMG servicing UEFA host coverage

 

EMG are providing the host facilities and the unilateral facilities for UEFA at Cologne and Dusseldorf, two of the ten venues in Germany, in a continuation of work for the tournament organisers which began in 2008.

 

Its specialist cameras division ACS is supplying helicopters for match coverage at every venue. It is additionally servicing the official Fan TV production for every venue with flyaway kits, as well as every Technical Operations Centre (TOC). The TOCs are essentially banks of servers, routers and waveform monitors which supervise distribution of all UEFA’s on-site feeds, including those for Eurovision, UEFA broadcast partners and VAR technology.

 

EMG merged with fellow facilities provider Gravity Media in January although it won the Euro 2024 contract a year ago.

 

“One of the keys for us was that our EMG headquarters is in Cologne,” says Curtin. “We can show the strength of our combined group across Europe by sending teams from Belgium, the Netherlands, the UK, and Germany.”

 

Its team is led by operational lead Chrissie Collins and Simon Nichols, who is Gravity Media’s director of engineering. Xavier Devreker is in charge of Fan TV at EMG, Matt Coyde of ACS handles special cameras and Chris Brandrick manages connectivity.

 

The group is sending four trucks to Germany, two per venue. Each is identical in size and floorplan. One is for the host, the other for the unilateral feed, but the unilateral truck is also the back-up should anything in the host fail.

 

EMG is also working with ITV in Germany, providing facilities for ITV’s unilateral coverage such as matchday commentary and a presentation studio in Berlin. ITV feeds will be routed out of a Nova 50 series truck at each venue, with production remoted back to EMG’s operations centre at WestWorks in White City, where there are also greenrooms, six edit bays and office space for ITV’s team. For presentation, all signals, incorporating video, audio and data, are backhauled to the remote centre.

 

Sustainability was a key reason that ITV selected EMG as its partner. The Nova 50 series truck is designed to use less power-hungry equipment, and the 1,700W solar mats fitted to the roof help create fuel efficiencies and internal battery power. A remote surface model means much of the kit that would be housed in a traditional articulated OB can instead be maintained at WestWorks. While the truck is much smaller than conventional units and needs fewer crew, it still features four bays of equipment, including a gallery space, and can accommodate up to nine people.

 

“ITV is able to send fewer production crew for the initial stages of the tournament to help achieve their sustainability goals,” Curtin says.

 

Because EMG is providing facilities at every venue for the Fan TV and TOC, it is able to combine forces and send fewer, regionally based trucks to deliver the kit, “which is another great way of reducing CO2.” EMG Group crew will also travel to Germany by train.

 

This is the start of a bumper summer of sport, particularly for European facilities providers, with trucks and crew catering to Germany in June and then moving across to Paris for the Olympics.

 

“It means most if not all facilities providers are at full tilt,” Curtin says. “There isn’t much resource availability left.”

 

At least one of EMG’s trucks is heading to Paris straight after the Euros. Meanwhile, EMG is still busy delivering T20, ODI and Test cricket in the UK for Sky Sports, as well as golf production for the European Tour; the centrepiece of its summer golf coverage is The Open at Royal Troon, Scotland, which begins the Thursday after the Euros final.