Saturday 27 April 2024

“Civil War”: The Camerawork to Capture the Chaos

NAB


Perhaps only an outsider could update the American Civil War of the 1860s and imagine what would happen if similar events tore apart the United States today.

British writer-director Alex Garland didn’t have to look far for inspiration: The January 6, 2021 mob attack on the Capitol was a vivid insurrection filmed live on TV in broad daylight. While these events are a thinly disguised template for the finale of his film Civil War, Garland seems less interested in apportioning blame to the political right or left than in asking why we might end up there again.

You could see similar events play out in Britain or any other country, he told an audience at SXSW after the film’s premiere. “Any country can disintegrate into civil war whether there are guns floating around the country or not,” he suggested, adding that “civil wars have been carried out with machetes and still managed to kill a million people.”

“I’ve known a lot of war correspondents because I grew up with them,” Garland said in the same on-stage interview. “My dad worked [as a political cartoonist] on the Daily Telegraph. So I was familiar with them.”

Garland showed cast and crew the documentary Under The Wire, about war correspondent Marie Colvin, who was murdered in Syria. His lead characters are news and war photographers played by Kirsten Dunst and Cailee Spaeny, whose characters’ names echo those of acclaimed photojournalists Don McCullin and Lee Miller. Murray Close, who took the jarringly moving photographs that appear in the film, studied the works of war photographers.

“There are at least two [types of war photographer],” said Garland. “One of them is very serious minded, often incredibly courageous, very, very clear eyed about the role of journalism. Other people who have served like veterans are having to deal with very deep levels of disturbance (with PTSD) and constantly questioning themselves about why they do this. Both [types] are being absorbed and repelled at the same time.”

He represents both types in the film. While it is important to get to the truth — in this case, the money shot of the execution of the US President — he questions whether that goal should take priority over everything else the journalists encounter along the way. At what point, Garland asks, should the journalist stop being a witness and start being a participant?

“Honestly, it’s a nuanced question, nuanced answer,” he said. “I can’t say what is right or wrong. There’s been an argument for a long time about news footage. If a terrible event happens, how much do you show of dead bodies? Or pieces of bodies? Does that make people refuse to accept the news because they don’t want to access those images? Or worse, does it make them desensitized to those kinds of images? It’s a tricky balance to get right.”

In this particular case, one of the agendas was to make an anti-war movie if possible. He refers to the controversial 1935 Leni Riefenstahl film Triumph of the Will, which is essentially Nazi propaganda.

Garland didn’t want to accidentally make his own Triumph of the Will, he said, by making war seem kind of glamorous and fun. “It’s something movies can do quite easily,” he said. “I thought about it very hard and in the end, I thought being unblinking about some of the horrors of war was the correct thing to do. Now, whether I was correct or not, in that, that’s sort of not for me to judge but I thought about it.”

Garland establishes the chaos early, as Dunst’s character covers a mob scene where civilians reduced to refugees in their own country clamor for water. Suddenly, a woman runs in waving an American flag, a backpack full of explosives strapped to her chest.

“Like the coffee-shop explosion in Alfonso Cuarón’s Children of Men, the vérité-style blast puts us on edge — though the wider world might never witness it, were it not for Lee, who picks up her camera and starts documenting the carnage,” writes Peter Debruge in his Variety review.

To achieve the visceral tone of the action, Garland decided to shoot largely chronologically as the hero photographers attempt to cross the war lines from New York to the White House.

After two weeks of rehearsals to talk through motivations and scenes and characters, Garland and DP Rob Hardy then worked to figure out how they were going to shoot it. He wanted the drama to be orchestrated by the actors, he told SXSW. “The micro dramas, the little beats you’re seeing in the background, are part of how the cast have inhabited the space.”

Spaeny offers insight into Garland’s “immersive” filming technique in the film’s production notes. “The way that Alex shot it was really intelligent, because he didn’t do it in a traditional style,” she says. “The cameras were almost invisible to us. It felt immersive and incredibly real. It was chilling.”

A featurette for the movie sheds light on Garland’s unconventional filming style, in which he describes Civil War as “a war film in the Apocalypse Now mode.”

While the A-camera was a Sony VENICE, they made extensive use of the DJI Ronin 4D-6K, which gave the filmmakers a human-eye perception of the action in a way that traditional tracks, dollies and cranes could not. They also bolted eight small cameras to the protagonists’ press van.

To Matthew Jacobs at Time Magazine, Spaeny likened the road scenes to a play, though unlike theater, or even a typical movie shoot, Civil War changed locations every few days as the characters’ trek progressed, introducing constant logistical puzzles for the producers and craftspeople to solve.

Dunst’s husband Jesse Plemons makes a brief appearance in the film, but commands the scene as a menacingly inscrutable soldier carrying a rifle and wearing a distinct pair of red sunglasses.

“I can imagine that people might read some kind of strange bit of coding into Jesse Plemons’s red glasses,” Garland says in A24’s notes. “Actually, that was just Jesse saying, I sort of think this guy should wear shades or glasses. And he went out himself and he bought six pairs, and we sat there as he tried them on, and when he got to the red ones, it just felt like, yep, them.”

 


With “Cowgirls on the Moon,” the Workflow’s in the Cloud

NAB

article here

It began as a joke, but putting cowgirls on the moon is a serious attempt to showcase what is possible by using a raft of new filmmaking technologies from virtual production and cloud rendering to generative AI.

Unveiled and demonstrated at NAB Show, the faux movie trailer for Cowgirls on the Moon is a goofy but high-concept challenge led by AWS that conforms to elements of MovieLabs 2030 Vision.

“It started off as a joke,” Katrina King, global strategy leader for content production at AWS, explained. “Let’s do something ridiculously out there that’s really going to force us to lean into modern cloud computing and generative AI. And I said something like ‘cowgirls on the moon.’ It was a joke, but nobody came up with a better idea so that’s what we went with.”

The aim was to demonstrate the power of three technologies working in tandem: generative AI-assisted virtual production, cloud rendering and VFX with the use of AWS Deadline, and holistic production in the cloud.

“At AWS, we believe very strongly in the responsible use of generative AI,” King continued. “So we used applications that allow artists to work more efficiently and to offload the mundane technical aspects.”

For instance, they used text-to-video generator Runway for concept art and storyboards, another AI tool for facial recognition, and an enhanced speech tool included within Adobe Premiere. The latter tool completely rebuilt the dialogue track as if it had been recorded in an ADR session. “The amount of time that saved us by not having to go into an ADR session was incredible,” King said.

The principal generative AI tool was Cuebric, which was used to generate assets, import them into Unreal Engine and automate certain technical aspects of the production. All of the backgrounds used in the virtual production and animated in Unreal Engine were generated using Cuebric.

 

“Once Cuebric exports it we have these different layers which are then presented in Unreal, so that as we move the camera and the camera tracks on the volume we get parallax,” King explained.
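To make that parallax effect concrete, here is a minimal Python sketch (illustrative only, with made-up layer depths and a nominal focal length) of the underlying geometry: when the tracked camera translates, a flat background layer shifts on screen in inverse proportion to its depth, so nearer layers move more than distant ones.

```python
# Illustrative sketch of 2.5D parallax: under a simple pinhole model, a
# sideways camera move shifts each flat layer on screen by an amount
# inversely proportional to that layer's depth.

def screen_shift_px(camera_move_m: float, layer_depth_m: float,
                    focal_px: float = 1500.0) -> float:
    """On-screen shift in pixels of a layer at layer_depth_m metres
    when the camera translates sideways by camera_move_m metres."""
    return focal_px * camera_move_m / layer_depth_m

# Hypothetical layer depths for a lunar set: foreground rocks, crater rim, distant Earth.
layers = {"foreground": 5.0, "midground": 50.0, "sky": 5000.0}

for name, depth in layers.items():
    print(f"{name:10s} shifts {screen_shift_px(0.5, depth):8.2f} px for a 0.5 m camera move")

# The near layer moves orders of magnitude more than the distant one,
# which is the depth cue the layered exports recreate in Unreal.
```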

Visual effects facility DNEG delivered 36 VFX shots for the production in just eight days. The whole project was essentially run as a full studio and render farm in the cloud.

Project producer Ron Ames talked up the benefits of the virtual production, such as the ability to swap out entire infrastructure in minutes and to collaborate across multiple locations.

“We first said, ‘We want these machines to be Linux.’ But then we changed our minds. ‘Now we want them to be Windows.’ Literally in minutes we had new machines up and running,” he said. “The ability to work quickly to collaborate, to tear down walls. We had groups working in Vancouver, in London, LA, Boston, Idaho, Switzerland, Turkey, Tucson, Netherlands [on the project, all linked to assets by cloud].”

Ames previously used extensive AWS workflows as producer of Amazon series The Lord of the Rings: The Rings of Power. He thinks other producers remain unconvinced about choosing to put their next project into the cloud.

“Petrified would be the word, not reluctant. The notion that we’ve done it before this way, or we have investments in a certain infrastructure, is one of the impediments to moving forward,” Ames said.

“On Rings of Power, the great good fortune we had was a team of producers and AWS supporting us to try new stuff and if it doesn’t work, we’re not going to give up, we’re going to make it work and make it work in a way that actually has benefits. Once we saw the efficiencies, the creative possibilities, and truly the collaborative power of breaking down silos, walls, traditional ways we’ve been working, we realized that this had a great value.”

 


Why “Super Churners” Are Driving Change in the SVOD Model

NAB

article here

Cancelling streaming services is no longer niche or occasional. Churn has gone mainstream and premium SVODs are going to have to employ new tactics to compete for a share of the household wallet.

Finished watching The Bear? Ditch Hulu. Want to watch Fallout? Sign up for Amazon Prime Video. Time for Slow Horses Season 3? Then cancel Amazon (having already binged Fallout) and get Apple for at least a couple of weeks. More and more of us are doing this, partly because price inflation has exhausted the amount people are willing to spend on stacking SVODs, most of whose content they don’t actually watch.

It’s also because the SVOD system of no-contract, one-month viewing — so important in kick-starting the streaming business — makes it so easy to do.

According to data from Antenna, at the end of 2023 nearly one-in-four streaming US consumers qualified as serial churners — individuals who have canceled three or more Premium SVOD services in the past two years. That’s an increase of 42% YoY.

Antenna even identifies a group of “super heavy serial churners,” those who make seven or more cancellations within the past two years, and found that they constituted 19% of subscribers in 2023.

More data: serial churners were responsible for 56.5 million cancellations in 2023, up a whopping +54.6% year-on-year, while cancellations by non-serial churners increased 18.5% to 82.8 million in the same period.

As the headline in The New York Times succinctly puts it, “Americans’ New TV Habit: Subscribe. Watch. Cancel. Repeat.”

While consumers value flexibility, the implications could be significant for the major media companies, especially as it seems likely this behavior will become even more common.

One option outlined by John Koblin in The New York Times is to bring back some element of the cable bundle by selling streaming services together.

Executives believe consumers would be less inclined to cancel a package that offered services from multiple companies. Disney, for instance, is bundling Disney+, Hulu and ESPN+ into one package and, later this year, will launch a sports streamer pooled with Fox and Warner Bros. Discovery.

Another tactic is to promote “coming soon” content prominently on the home page. For instance, Apple TV+ is teasing Dark Matter, a science-fiction series that comes out in its app in May.

Peacock tried to deter new subscribers from cancelling by offering a deal to sign up for a full year at a discount.

According to Antenna research, cancellation rates for those who did sign up did not drop off a cliff a month later, but instead were close to average.

Netflix appears immune, according to Antenna data. Or rather, it is the service most likely to be a permanent part of household bundles, with every competitor part of a revolving carousel that consumers pick and mix according to the latest show to land.

Without a predictable revenue stream, it is harder for streamers to invest in new content, causing them to cut production and deliver fewer standout new releases to market, in a vicious cycle that will gather pace unless something changes.

 


Friday 26 April 2024

UEFA relegates UHD for Euro 2024 and Champions League Final

IBC

Sports broadcasters have given up on 4K UHD, for the time being at least, with the UEFA Champions League Final and this summer’s European Championship both being produced in 1080p HD HDR as the host production format.


article here

 

That’s a significant reversal of a more than decade-long trend of upping the ante in broadcast resolution with each successive major tournament.

 

The Champions League Final has been broadcast in UHD since 2015 and the Euros since 2016 (and again in 2021), but now there’s been a rethink.

 

Nor is UEFA an outlier. “There’s been a change in delivery format for a lot of major international tournaments and also domestic tournaments where broadcasters are now looking at 1080p HDR as the chosen one,” says Eamonn Curtin, Global Client Director for leading outside broadcast facilities provider EMG.

 

“It’s a benefit for us technically since we just have one signal to produce and manage rather than the four that made up the UHD signal.”

 

If UHD cameras were used as sources routed in a standard (non-IP) outside broadcast truck, they would typically be distributed as four links using 3G-SDI (Serial Digital Interface). This Quad Link 4K/UHD signal needed careful monitoring to ensure sync when combined on output and made the work of the OB technically more challenging.
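A quick back-of-the-envelope calculation (using nominal interface rates rather than exact payload figures) shows why the quad-link arrangement was needed in SDI trucks, and why collapsing to a single 1080p host feed simplifies the job:

```python
# Nominal SDI interface rates in Gbps (approximate, for illustration).
SDI_3G = 2.97    # carries one 1080p50/60 signal
SDI_12G = 11.88  # carries one 2160p50/60 signal on a single link

# A 2160p picture has 4x the pixels of 1080p, so over 3G-SDI it is split
# into four synchronised links (quad link).
uhd_over_quad_3g = 4 * SDI_3G
print(f"UHD as quad-link 3G-SDI: 4 x {SDI_3G} = {uhd_over_quad_3g:.2f} Gbps")
print(f"UHD on a single 12G-SDI link: {SDI_12G} Gbps")
print(f"1080p host format: a single {SDI_3G} Gbps feed to produce and manage")

# Keeping four coax links per source frame-accurate is the monitoring burden
# described above; one 1080p feed removes it entirely.
```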

 

That’s not the reason there’s no UHD at either event; UHD could, after all, be produced with upscaling. The main reason is lack of broadcaster and rights-holder interest. Eight years after BT Sport became Europe's first 4K broadcaster, few others have followed suit. The cost to buy kit, refresh, rip out and rewire studios and OB vans, let alone buy satellite transponder space, is an expense few could justify when viewers were unwilling to pay a premium for the visual uplift.

 

More tellingly, the visual uplift in resolution wasn’t actually a huge leap when full HD High Dynamic Range came into the equation. Taste tests of HD HDR versus 4K suggested that viewers preferred the sharper contrast and detail in light and shade in the HD version.

 

Add to that BT Sport’s takeover by Warner Bros. Discovery, which closed in September 2022. BT Sport, like Sky Sports before it, had planted a flag as the most consistently innovative broadcaster of its generation. Where Sky Sports led on HD and on-screen graphical presentation and dabbled in stereo 3D, BT Sport pioneered 4K and Dolby Atmos and experimented with VR, with plans for 8K broadcasts thwarted by the pandemic.

 

WBD brand TNT Sports, which is covering the Champions League Final as host broadcaster from Wembley, is not averse to innovation but it is choosing to put its firepower into other areas. These include virtual presentation studios of the kind we’ve seen at recent Olympics, its remote ‘holographic’-style interview studio ‘Cube’, which features at tennis tournaments, and richer data insights gleaned in real time from athletes and equipment, as showcased during the Tour de France.

 

Coverage produced for digital distribution online and social media is another highly significant area of production that WBD (through Eurosport) and rights holders like UEFA, FIFA and Wimbledon are putting more and more resources toward.

Paris 2024 will, however, be produced in UHD HDR and 5.1.4 immersive audio (alongside extensive social media and digital output) by the International Olympic Committee’s production company OBS, in part because of demand for the format in Japan and by NBCU in the U.S. It’s not clear if the BBC will offer a UHD channel as part of its summer coverage.

 

Ursula Romero, Executive Producer at International Sports Broadcasting (ISB), said at the 4K HDR Summit last November, “We are constantly questioning whether we should broadcast in 4K and HDR. It’s a perpetual question mark because there is a big gap between producers and consumers.”

 

Her reasoning was that younger audiences care about things other than resolution – or indeed watching on the main household TV. “Everything revolves around social networks and they are now looking to watch sports by the minute and by the second, even in vertical format.”

 

Yet Isidoro Moreno, head of engineering at OBS, said at the same event that Paris 2024 will consolidate 4K HDR as the “top” TV standard for the next decade.

 

Most viewers won’t notice the lack of UHD coverage from either UEFA event. Indeed, the widespread use of cine-style cameras at the Euros will provide a cinematic quality to that production that viewers have not seen before.

 

These feeds will be cut live into the broadcast, a trend being ramped up across all major sports. The shallow depth of field from the large-format sensors captures fan reactions, team tunnel entrances and team line-ups in a way that viewers have come to expect from the numerous glossy sports docs on streaming services.

 

For consistency, the bulk of the HD cameras used for the host coverage at the Euros are Sony HDC-3500s. There are more than 40 per venue, each featuring a global shutter to eliminate the ‘jello effect’ and banding noise. The 3500s are also capable of HDR as well as standard dynamic range; the HDR format used in the tournament is HLG.

 

Other than that, the technical template and editorial formula for Euros coverage will be largely unchanged. “It’s a safe pair of hands,” is how Curtin puts it.

 

EMG servicing UEFA host coverage

 

EMG are providing the host facilities and the unilateral facilities for UEFA at Cologne and Dusseldorf, two of the ten venues in Germany, in a continuation of work for the tournament organisers which began in 2008.

 

Its specialist cameras division ACS is supplying helicopters for match coverage at every venue. It is additionally servicing the official Fan TV production for every venue with flyaway kits as well as every Technical Operations Centre (TOC). The TOCs are essentially banks of servers, routers and waveform monitors which supervise distribution of all UEFA’s on-site feeds, including those for Eurovision, UEFA broadcast partners and VAR technology.

 

EMG merged with fellow facilities provider Gravity Media in January although it won the Euro 2024 contract a year ago.

 

“One of the keys for us was that our EMG headquarters is in Cologne,” says Curtin. “We can show the strength of our combined group across Europe by sending teams from Belgium, the Netherlands, the UK, and Germany.”

 

Its team is led by operational lead Chrissie Collins and Simon Nichols, who is Gravity Media’s director of engineering. Xavier Devreker is in charge of Fan TV at EMG, Matt Coyde of ACS handles special cameras and Chris Brandrick manages connectivity.

 

The group is sending four trucks to Germany, two per venue. Each is identical in size and floorplan: one is for the host feed, the other for the unilateral feed, but the unilateral truck is also the back-up should anything in the host fail.

 

EMG is also working with ITV in Germany, providing facilities for ITV’s unilateral coverage such as matchday commentary and a presentation studio in Berlin. ITV feeds will be routed out of a Nova 50 series truck at each venue, with production remoted back to EMG’s operations centre at WestWorks in White City, where there are also green rooms, six edit bays and office space for ITV’s team. For presentation, all signals are backhauled to the remote centre, incorporating video, audio and data.

 

Sustainability was a key reason that ITV selected EMG as its partner. The Nova 50 series truck is designed to use less power-hungry equipment, and the 1,700W solar mats fitted to the roof improve fuel efficiency and feed internal battery power. A remote-surface model means much of the kit that would be housed in a traditional articulated OB truck can instead be maintained at WestWorks. While the truck is much smaller than conventional units and needs fewer crew, it still features four bays of equipment, including a gallery space, and can accommodate up to nine people.

 

“ITV is able to send fewer production crew for the initial stages of the tournament to help achieve their sustainability goals,” Curtin says.

 

Because EMG is providing facilities at every venue for the Fan TV and TOC, it is able to combine forces and send fewer, regionally based trucks to deliver the kit, “which is another great way of reducing CO2.” EMG Group crew will also travel to Germany by train.

 

This is the start of a bumper summer of sport, particularly for European facilities, with several trucks and crews heading to Germany in June before moving across to Paris for the Olympics.

 

“It means most if not all facilities providers are at full tilt,” Curtin says. “There isn’t much resource availability left.”

 

At least one of EMG’s trucks is heading to Paris straight after the Euros. Meanwhile EMG is still busy delivering T20, ODI and test cricket in the UK for Sky Sports as well as golf production for the European Tour, the centrepiece of which is The Open at Royal Troon, Scotland, which begins the Thursday after the Euros final.

 

 


The path to Porto: WRC readies for iconic Rally de Portugal

SVG Europe

article here

After Monte Carlo, Sweden, Kenya and Croatia, the FIA World Rally Championship (WRC) moves to Portugal on 9 May for the start of the European gravel phase and promises spectacle in and around the scenic city of Porto.

As well as being the biggest sporting event held annually in Portugal, the Rally de Portugal is ranked as one of the championship’s classic events, with challenging stages including the legendary Fafe jump before the finish. Belgium’s Thierry Neuville, who drives for Hyundai, heads into the event as championship leader.

“Rally de Portugal is a really big broadcast for us. It’s a spectacular rally and one of our biggest in terms of on-site audience attendance,” says Florian Ruth, senior director of content & communication at rights holder WRC Promoter.

Difficult stages

This year’s course is made up of 22 stages, more than any edition of the event since 2012, spread across a large geographic area and reaching a total length of 1,690km. This includes a night stage in the host city itself, around the maze of beautiful streets converging on the river Douro and the famous Dom Luís I metal arch bridge.

This makes the event logistically complex.

“We have a very experienced team on the ground and a very good partner with [Portuguese national broadcaster] RTP with whom we have an excellent relationship to make a great co-production,” Ruth adds.

On the ground, RTP is supporting WRC with production facilities particularly for the Porto night stage. WRC runs four OB vans across the different stages with the live world feed gallery produced at WRC TV’s production hub in Helsinki managed by technical production provider NEP Finland.

Aerial coverage is provided by two helicopters (one with a Cineflex, the other carrying a Shotover) and a pair of drones. These are just four of over 100 camera sources. Each of the first 15 cars is fitted with three to four onboards (around 45 onboards in total). These are bespoke builds developed by NEP for the WRC.

For main stage coverage, feeds from the onboards, helicopters and selected ground cameras are relayed via a relay plane to Helsinki. This isn’t the only connectivity pathway.

Ruth explains: “We can send feeds from the OB trucks via the plane and we can send line cuts from the trucks via satellite and also via LiveU. For example, we transmit all the live TV stages of the main race on satellite. Then, when the support categories [WRC2 and WRC3] begin, we switch to RF contribution. In addition, we have a variety of other camera crew also providing feeds via LiveU. All of RTP’s feeds are contributed over domestic fibre, managed by Tata Communications.”

In Helsinki, WRC teams produce the world feed, adding commentary, graphics and the final mix, before distributing it to rights holders via fibre, SRT streams or satellite.
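As a rough illustration of the SRT distribution option mentioned above, the sketch below pushes a programme file as an HEVC transport stream over SRT using ffmpeg. The host, port, bitrates and file name are hypothetical, and it assumes an ffmpeg build compiled with SRT (libsrt) support; WRC’s actual contribution chain will differ.

```python
# Hypothetical sketch: send a programme feed as an HEVC/MPEG-TS stream over SRT
# with ffmpeg. Requires an ffmpeg build compiled with libsrt.
import subprocess

cmd = [
    "ffmpeg",
    "-re", "-i", "world_feed.mxf",            # stand-in for the programme source
    "-c:v", "libx265", "-b:v", "10M",         # HEVC video, illustrative bitrate
    "-c:a", "aac", "-b:a", "192k",            # stereo audio for the example
    "-f", "mpegts",
    # mode=caller connects out to the receiver; latency is given in microseconds
    "srt://receiver.example.net:9000?mode=caller&latency=2000000",
]
subprocess.run(cmd, check=True)
```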

Races are produced in full HD encoded in HEVC, which WRC and most leading live sports producers still consider a less complex production format than the four channels of HD required for 4K UHD.

Capturing the excitement

Editorially, a chief goal of WRC coverage is to combine the action on the circuit and in the cars with the excitement of the live event. To do this it selects camera positions from which it can pan from racing action to emotion in the crowd.

“The stadium stages of the Rally de Portugal, in particular, will focus a lot on the crowd atmosphere. We expect 30,000-40,000 spectators and we want to see their emotion,” he adds.

Audio comms between driver and co-driver are a unique aspect of this motorsport and one which WRC coverage leans into. A multimedia box installed under the driver’s seat records driver and co-driver exchanges and transmits them via the plane back to the Helsinki base. External mics on the car, positioned to pick up the engine as well as the sound of gravel (rocky or sandy) terrain, are also fed back from the car. Additional audio is gathered from mics arrayed around the circuit and among spectators.

Sunday’s decisive stage will be made up of double passes through the 19.91km of Cabeceiras de Basto, of which 12.6km are completely new, and the iconic 11.18km of Fafe, attended by 100,000 people, many of whom will have stayed to party overnight. The hillsides overlooking the sweeping bends that precede the famous jump are a magnet for fans.

“When the first car comes through, they sing like a football crowd and that is what we want to hear and transmit,” says Ruth.

In all, the team will produce 22 to 25 hours of live coverage across the four-day event, culled from over 400 hours of total content. That translates to between 15TB and 20TB of material.

All key footage is transferred via fibre to the Helsinki hub. There the WRC produces daily highlights, news packages, digital clips, social media content and assets, all of which land in a new digital archive system launched in conjunction with Moments Lab (formerly Newsbridge) and accessible to rights holders, broadcasters and sponsors.

DAZN deal

Last August, the WRC combined its various distribution platforms into Rally.TV, a 24/7 linear and OTT home of all rally and rallycross events. Its broadcast strategy is to serve rights holders and grow the audience for the product, which makes a recent carriage deal with DAZN significant. Rally de Portugal marks the start of its involvement.

“As our co-operation with them is just about to start, DAZN first need to learn our product but we are having good conversations with them and they have some good ideas. Their editorial teams are very promising and the promotion they will do for us will help grow the championship,” says Ruth.

Six of the 22 stages of Rally de Portugal are packaged as live TV stages for distribution on mainstream channels.

“The live TV stages are where we do wider storytelling, character building and updates as to what has happened in the rally so far. If you’re not so deep into watching every moment of every stage we can catch a broader audience,” he adds.

The competition is a race against the clock rather than head-to-head on the track so the editorial leans into the personalities of the drivers and in particular the relationship between driver and co-driver in the cockpit.

“That’s an aspect we focus on a lot in our storytelling because it sets us apart from every other motorsport,” Ruth says.

WRC Promoter is also keen to enrich fan understanding with virtual overlays and time and section comparisons.

“We are showcasing the pinnacle of world rally drivers in cars which are so competitive that even when racing over 20+km to the limit the difference between them at the end is just tenths of a second,” Ruth says. “We are trying more and more to visualise the differences. Every millimetre and every centimetre on every turn counts and we are trying with digital enhancements to make this more and more visible to the viewer.”

Remote production

Production moved to a remote model in 2022 and it is proving successful on several levels, one of which is improved sustainability.

Ruth says the WRC is conscious that its CO2 footprint is being monitored and that its remote production has already had a substantive positive impact.

“Simply in terms of plane travel there’s a lot less people and freight we have to fly. We have reduced our emissions a lot”, he reveals.

There is, of course, a cost benefit in not transporting all the facilities and people to every location, but one plus “more important than cost efficiency” is production resiliency, according to Ruth. Producing a rally in the middle of a safari park live from Helsinki was an ambition WRC had long wanted to achieve, he says; it was realised for the first time last year with the WRC Safari Rally Kenya and successfully repeated this year.

“The first year we travelled to Kenya the production facilities suffered. It’s not only Kenya. Mexico too. The more you rig in remote locations the greater the risk of failure,” he explains.

Driving on bumpy roads tended to loosen screws in the gear, EVS kit needed de-dusting of sand, and 40°C heat didn’t help either.

Now, with all the essential production parts moved offsite, there is less need to cable everything and less need to check and double-check. He continues: “There’s always the chance of a broken connector on-site, so to remove all that risk and be in a very stable environment where everything is perfectly connected and tested in a state-of-the-art IP production hub makes a big difference.”

Looking ahead, in June the WRC returns to Poland, a country with a strong fanbase (it lost its place on the calendar due to repeated breaches of spectator safety), and it is launching a new rally in Latvia in July.

There’s also WRC Finland in August, “our Monaco Grand Prix,” Ruth says, “one of the most prestigious on the calendar.” Indeed, some of the most spectacular rallies in terms of landscape are still to come, with teams set to race at high altitude in Chile before the season finale in Japan, home of reigning World Champions Toyota Gazoo Racing and one of their two full-time drivers, Takamoto Katsuta.

 


Thursday 25 April 2024

Editing in the Cloud Is Easy (As Long As You Have the Right Speed, Storage and Strategy)

NAB

There’s an old tech industry joke that “the cloud” is a fancy way of saying “somebody else’s computer.” That’s a bit of an oversimplification since cloud computing services are a lot more involved than just providing access to a server someone else owns.

article here

But the fact remains that the primary attribute of cloud computing is accessing computing resources — software applications, servers, data storage, development tools, and networking functionalities remotely over the internet.

Increasingly that means everyday post-production processes and crafts, like editing, too. As with much of post-production, the real shift to cloud came with COVID. If productions were to carry on behind closed doors, then remote and collaborative ways of working had to be found.

While many facility managers and editors found those ad hoc attempts at the start of 2020 only just about workable, the fact that the technology proved itself opened people’s minds to the benefits of more permanent cloud-based editing.

Today, at the very least, hybrid work-office scenarios are common, with cloud-based workflows no longer considered unusual across all genres ranging from live news and sports to feature animation, scripted TV and documentaries.

In a series of primers (ostensibly to promote its cloud storage), LucidLink explains cloud video editing and outlines the benefits it can offer.

Much of what the company has to say will be familiar to industry pros, but there’s a no-nonsense clarity for anyone unsure.

Cloud video editing refers to workflows that leverage the cloud rather than on-premises infrastructure. Editors can share their data while still working with the complete toolset of a desktop-based NLE such as Adobe Premiere, Avid or DaVinci Resolve. The key difference is that the data itself is stored in the cloud rather than on local devices. With the right software, cloud-based video editing can also include tools installed on virtual machines that perform parts of an editing workflow.

One of the chief benefits of working this way is remote collaboration. Since cloud-based systems and storage are inherently accessible from anywhere in the world, this enables both hybrid and fully remote workflows for editing teams.

Configured correctly (and the article doesn’t particularly delve into the costs of cloud storage and data transfer, which vary greatly depending on facility needs), cloud can save time and cash.

“Although the cloud offers clear advantages when it comes to smaller files (like low-res video proxies), until recently handling large files was an unsolved challenge for cloud video editors due to lengthy upload and download times,” LucidLink notes, before offering its tech as a solution.
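The scale of that challenge is easy to quantify. A rough transfer-time calculation (nominal link speeds, with an assumed efficiency factor) shows why low-res proxies move comfortably over the internet while camera raw does not:

```python
# Rough transfer-time arithmetic: time scales with file size over usable throughput.

def transfer_hours(file_gb: float, link_mbps: float, efficiency: float = 0.8) -> float:
    """Hours to move file_gb gigabytes over a link_mbps connection,
    assuming only `efficiency` of the nominal bandwidth is usable."""
    bits = file_gb * 8 * 1e9
    return bits / (link_mbps * 1e6 * efficiency) / 3600

for size_gb, label in [(2, "low-res proxy reel"), (500, "camera raw folder")]:
    print(f"{label}: ~{transfer_hours(size_gb, 100):.1f} h at 100 Mbps, "
          f"~{transfer_hours(size_gb, 1000):.1f} h at 1 Gbps")
```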

There’s also a look at the merits of cloud versus on-premise set-ups with files residing in a SAN or NAS system within a facility.

This latter approach, says the vendor, “requires copying large files to hard drives or using file transfer services if collaboration requires working with freelance talent in locations other than the facility itself.

“Even when working with large amounts of raw video data, editors often need to search, analyze and tag files, preferably in real time. The larger the file, the longer it takes to download, upload, render, or share.  Beyond the costly hardware investment, these systems still don’t solve the problem of waiting for files to download or distribute.”

However, it’s not usually a zero-sum game. Most facilities currently prefer to keep one foot in each camp, in part as a safety net against data loss.

There are of course lots of choices when it comes to storage and the right strategy is vital for any production, says LucidLink.

“On-prem SAN and NAS systems can be very performant, but those benefits only exist in one location: a facility. The need to collaborate anywhere however is not addressed by these legacy approaches. This is where a cloud-based approach comes in.”

As we saw at the recent NAB Show, more and more vendors are offering cloud-based workflows. These increasingly start at the camera, where proxies are uploaded directly via the internet to some form of media management platform, from which authenticated users anywhere can download or stream files to work from.
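A minimal sketch of that camera-to-cloud pattern, assuming an S3-compatible object store and AWS credentials already configured for boto3 (bucket name, prefix and paths are hypothetical):

```python
# Hypothetical camera-to-cloud sketch: push proxy files to object storage,
# then hand authenticated editors time-limited links to pull or stream them.
from pathlib import Path
import boto3

s3 = boto3.client("s3")
BUCKET = "example-dailies-proxies"   # hypothetical bucket name

def upload_proxies(local_dir: str, shoot_day: str) -> None:
    """Upload every proxy in local_dir under a per-shoot-day prefix."""
    for proxy in sorted(Path(local_dir).glob("*.mp4")):
        key = f"{shoot_day}/{proxy.name}"
        s3.upload_file(str(proxy), BUCKET, key)
        print(f"uploaded s3://{BUCKET}/{key}")

def review_link(key: str, expires_s: int = 3600) -> str:
    """Time-limited URL an authenticated editor can download or stream from."""
    return s3.generate_presigned_url(
        "get_object", Params={"Bucket": BUCKET, "Key": key}, ExpiresIn=expires_s
    )

upload_proxies("./proxies", "day_03")
```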

In a few years, looking back at the heavy-duty, power-hungry monoliths of Silicon Graphics machines, Quantel boxes or Autodesk hardware, we will wonder just how we ever worked without the internet.

AI Is Definitely Changing (But Not Destroying) Hollywood

NAB

The current consensus appears to be that generative video is not yet a Hollywood-killer and perhaps never will be. While AI is creeping into production, it is doing so to augment certain workflows or make specific alterations, with no sign of it being used to auto-generate entire feature films or push creatives out of a job. But it’s still early days.

article here

“It’s a fraught time because the messaging that’s out there is not being led by creators,” said producer Diana Williams, a former Lucasfilm executive who is now CEO and co-founder of Kinetic Energy Entertainment, at the 2024 SXSW panel, “Visual (R)evolution: How AI is Impacting Creative Industries.”

Certainly, AI is a disruptive technology, but M&E of all industries should be used to taking tech change on board.

Julien Brami, a creative director and VFX supervisor at Zoic Studios, spoke on the panel with Williams, as Chris O’Falt reports at IndieWire. Brami said the common thread with each tech disruption is that filmmakers adopt new tools to tell stories. “I started understanding [with AI] that a computer can help me create way faster, iterate faster, and get there faster.”

Speed. That’s what you hear, over and over again, as the real benefit of generative AI imaging, writes O’Falt, who spoke to numerous filmmakers about the topic.

“Few see a viable path for Gen AI video to make its way to the movies we watch. Using AI is currently the equivalent of showing up on set in a MAGA hat.”

Finding actual artists who are willing to use AI tools with some kind of intention is tough, agrees Fast Company’s Ryan Broderick. Most major art-sharing platforms have faced tremendous user backlash for allowing AI art, and there’s even a new tool called Nightshade that artists are using to stop their images being used to train generative AI.

Graphic designer and digital art pioneer Rob Sheridan tells Fast Company that the backlash against AI tech in Hollywood is directly caused by both tech companies and studios claiming that it will eventually be able to spit out a movie from a single prompt. Instead, Sheridan says it’s already obvious that AI technology will never work without people who know how to integrate it into existing forms of art, whether it’s a poster or a feature film.

“The thing that is hurting that progress — for this to kind of fold into the tool kit of creators seamlessly — is this obnoxious tech bubble shit that’s going on,” he says. “They’re trying to con a bunch of people with a lot of money to invest in this dream and presenting this very crass image to people of how eager these companies are, apparently, to just ditch all their craftspeople and try out this thing that everyone can see isn’t going to work without craftspeople.”

Media consultant Doug Shapiro tells Fast Company that AI usage will increase in Hollywood as studios grow more comfortable with the tech. He also suspects the current backlash against using AI is likely temporary.

“There’s this kind of natural backlash that tends to ease over time,” he says. “It’s going to get harder and harder to tell where the effects of humans stopped, and AI starts.”

Generative AI is cropping up most commonly in relatively small-stakes instances during pre- and post-production.  “Rather than spend a ton of money on storyboarding and animatics and paying very skilled artists to spend 12 weeks to come up with a concept,” Shapiro adds, “now you can actually walk into the pitch with the concept art in place because you did it overnight.”

Studios have also begun using AI to touch up an actor’s laugh lines or clean up imperfections on their face that might not be caught until after shooting has wrapped. In both cases, viewers might not necessarily even know they’re looking at something that has been altered by an AI model.

David Raskino, co-founder and CTO of AI developer Irreverent Labs, suggests to Will Douglas Heaven at MIT Technology Review that GenAI could be used to generate short scene-setting shots of the type that occur all the time in feature-length movies.

“Most are just a few seconds long, but they can take hours to film,” Raskino says. “Generative video models could soon be used to produce those in-between shots for a fraction of the cost. This could also be done on the fly in later stages of production, without requiring a reshoot.”

AI is putting filmmaking tools in the hands of more people than ever, and who can argue that’s not a good thing?

Somme Requiem, for example, is a short film about World War I made by Los Angeles production company Myles. It was generated entirely using Runway’s Gen-2 model, then stitched together, color-corrected and set to music by human video editors.

As Douglas Heaven points out, “Myles picked the period wartime setting to make a point. It didn’t cost anywhere near the $250 million of Apple TV+ series Masters of the Air, nor take anywhere near as long as the four years Peter Jackson took to produce World War I doc They Shall Not Grow Old from archive video.”

“Most filmmakers can only dream of ever having an opportunity to tell a story in this genre,” Myles’ founder and CEO, Josh Kahn, says to MIT Technology Review. “Independent filmmaking has been kind of dying. I think this will create an incredible resurgence.”

However, he says, he believes “the future of storytelling will be a hybrid workflow,” in which humans make the craft decisions using an array of AI tools to get to the end result faster and cheaper.

Michal Pechoucek, CTO at Gen Digital, agrees. “I think this is where the technology is headed,” he says. “We’ll see many different models, each specifically trained in a certain domain of movie production. These will just be tools used by talented video production teams.”

A big problem with current versions of generative video is the lack of control users have over the output. Producing still images can be hit and miss; producing a few seconds of video is even more risky. It’s why humans will need to be involved. But, of course, even as you read this, OpenAI’s Sora keeps getting better and better.

“Right now, it’s still fun, you get a-ha moments,” says Yishu Miao, CEO of UK-based AI startup Haiper. “But generating video that is exactly what you want is a very hard technical problem. We are some way off generating long, consistent videos from a single prompt.”