Tuesday, 16 July 2024

Behind the Scenes: The Acolyte goes old school to capture original Star Wars grit

IBC

The Acolyte looked to capture the handmade, analogue feel of the original Star Wars trilogy in a surprising departure from LED volumes.

article here

The productions of recent Star Wars TV spin-offs have taken different paths. While The Mandalorian, The Book of Boba Fett, and Ahsoka were filmed largely on Industrial Light & Magic’s StageCraft video wall, Andor was filmed at Pinewood and on locations throughout the UK. That’s the route taken by The Acolyte too, in a bid, it seems, to capture the pre-CGI analogue quality of the original films.

“It was decided very early on that The Acolyte was not going to be a Volume show,” explains Chris Teague, who photographed four of the season’s eight episodes. “Sets and locations would fit really well with an aesthetic that was a little bit more handmade, if you will. More textural, less glossy and clean, basically.”

This is Teague’s first foray into the Star Wars universe, as it is for series showrunner Leslye Headland, with whom he shot episodes of the Netflix drama Russian Doll. He is a fan, though, and like many, he reveres The Empire Strikes Back as the best of the franchise’s features.

“I remember having this distinct feeling as a kid watching how it ends on this downbeat note which was striking and surprising to me,” he says. “I didn’t know you could tell a story that way.

“Plus, I just loved the look of it. It’s this futuristic world but at the same time, everything feels very rough around the edges.

“The colour palette resonated too. The original films tend to have this mix of organic neutral tones, browns and greens, and then very solid washes of colour for the lightsabers or, as at the end of Empire Strikes Back, with strong oranges and intense blues. Those colours contrast with each other and it’s a palette Leslye and I wanted to bring to The Acolyte.”

Shared vision

Eschewing virtual production, they shot on sets at Shinfield Studios in Berkshire and Arborfield Studios, Wokingham. Teague talks of a close collaboration between camera and art department to structure the sets so that they could be lit and photographed properly.

“When you walk onto a show with a bunch of different department heads who you’ve never worked with before and who come from very different experiences you just never know creatively if you’re going to end up on the same page.”

With production designer Kevin Jenkins and second unit director Christopher Cowan (Teague calls him an “action designer”), the DOP felt they were all working towards the same goal.

“The enthusiasm level was through the roof. My camera operators were just so excited to have been part of the experience. People just love Star Wars. So to end up able to be a part of it was a lifelong goal for a lot of people.”

In episode 1 there’s a dream sequence where Osha (Amandla Stenberg) meets her twin sister Mae and the environment around her changes from snowy landscape to dark forest to forest on fire. It’s hard to believe it was created in-camera and not using LED walls.

“A lot of the transitions in those dream scenes are done simply through editing, or through match cuts, where one camera’s traveling left to right and then it cuts with another camera, traveling left to right. There’s also a beautiful visual effects shot where we transition from the snowy planet of Carlac to a forest at night. But we also used very old school methods of transitioning. At the end of that sequence, when Osha is looking at the younger version of her sister, the world goes white behind her. For that we used a piece of white gauze and dimmed the lights so when there’s no light on it, it basically looks invisible. As we bring the lights up on it, it illuminates to create an in-camera effect.”

They also used locations in Wales and the Portuguese island of Madeira, chosen for its dramatic and distinctive terrain. “We could be on one side of the island shooting one planet and on the beach on the other side shooting another planet the next day.”

Martial arts

The fight scenes were designed to blend familiar Star Wars action with Asian martial arts – dubbed Force Fu. Fight choreography was planned in a 3D environment then put into a 3D model of the set and virtually photographed.

“We iterate the virtual photography of the scene to the point where we have a pre-visualization of the whole action sequence. We knew cut-to-cut exactly what we were going for. That didn’t mean we couldn’t modify things to some extent on set. We’d make micro-adjustments to camera, or put on a slightly different lens to amp up the look as much as possible.”

Some scenes were storyboarded top to bottom based on artwork by storyboard artist Jim Cornish. “A lot of times with storyboards you do end up deviating from them more than virtual previz because they’re a little bit more atmospheric.”

“For other scenes, when the sets were built, Leslye and I would go onto the set with our stand-ins and shoot photographs of every setup so we could see what our backgrounds would be and how the blocking would work out.”

Lucasfilm were open to camera testing, with Teague opting to shoot with the Sony Venice 2, a camera on which he shot multiple episodes of the Disney+ comedy Only Murders In The Building.

“There are so many excellent cameras out there that make fantastic images which means you kind of can’t go wrong in your choice.”

He paired the large format sensor with anamorphic lenses “because that just felt like the language of Star Wars.”

The Arri Alfas are 2x squeeze anamorphic glass, modified to deliver “an exceptionally crisp central image” but with a focus fall-off around the edges which Teague liked.

“Being able to modulate the depth of field with large format, where you could really pull backgrounds slightly out of focus and give them that kind of magical anamorphic look, was really helpful and I think contributed a lot to the look of the show.”

Light work

Teague lit the sets as naturalistically as possible, working with LEDs for convenience and flexibility.

“For day interiors the lights tend to look more realistic the further away from the camera you can put them. For some sequences including the opening episode’s cantina sequence, we relied a lot on old school Tungsten lighting and arrays of smaller focussed Wendy Lights. When you put a group of them together they created a beautiful beam of sunlight that has this great quality of being both kind of hard and punchy and spread out at the same time. That was something that my gaffer Jonny Franklin introduced me to and was an essential component to the look of that scene.”

Episode four plays out as a race against the night as Osha and a crew of Jedi hunt for Mae through the jungle terrain of Wookiee planet Khofar. The daylight gradually fades until the episode ends at sunset.

“That was a really fun effect to create on a stage. We put two Tungsten 20K lights up on cranes and created a sunset effect live in-camera. That’s something I had never done before, but I felt it worked beautifully.”

The overall ethos of Teague’s work is ‘less is more’. “When we move the camera, it is motivated,” he says. “If we don’t need to move the camera, then we’re not going to. Hopefully that has a bigger impact on the audience. I think that simplicity was very present in the original films.

“Star Wars is also about the ensemble, a team coming together with a common goal even though they come from different places with different motivations,” he says. “That was something we really focused on in the way that we framed our group shots and strove for dynamic group scenes that showed the effort behind the mission.”

Friday, 12 July 2024

The NFL’s Creator Partnerships Touch Down With Audiences

NAB

All sports are chasing lucrative but elusive potential fanbases to sustain and grow their business, with social media a prime route to reach them. Posting official clips on TikTok or Facebook is nowhere near enough.

article here

The NFL has shown how engagement can be driven solidly upwards by working with content creators and social platforms — provided the league is willing to cede control over output.

Content creators — experts in educating, entertaining and exciting their audience and communities — are being drafted to provide fans with multifaceted ways to engage with the game and their favorite teams.

A panel at VidCon, “The NFL’s Trailblazing Approach to Creator Partnerships,” explored how the NFL has collaborated with creators and social media platforms  to extend the sport’s influence on a new, younger, and diverse generation of fans.

Its strategic youth marketing program kicked off six years ago, explained Ian Trombetta, the NFL’s SVP of social, influencer and content marketing.

“It started with listening to the players and the types of things they wanted to do off the field, and we really ramped up our content on the field too,” Trombetta said. “It was about listening to them around fashion, gaming, all the different things they’re interested in, and the ways in which we can support them at the league.

“What that opens up is a lot of opportunities for us throughout the year, whether it’s in season or in the offseason. Increasingly we’re looking to pair them with creators and with partners like YouTube and then weaving that into our programming throughout the year.”

The results have enabled the NFL to “reach younger audiences, more multicultural audiences. Even global audiences now are tuning in in ways that they hadn’t before,” he continued.

“Even on our own channels the Creator content actually outperformed some of our own lead content, which is great.”

Another key component has been securing buy-in to the strategy from all 32 clubs. Trombetta said that wouldn’t have happened five years ago. “But today, whether you’re in Green Bay, Miami, Los Angeles or Las Vegas, you have a creative strategy that’s always on,” he explained.

“That’s been really important for us. They do believe in providing more access to players and to the creators, who have to co-create together and who need access to the facilities, access to coaches and legends.”

One of the NFL’s star creators is Adam W, who has worked with the league for five years and has 55 million followers on social media. He aspired to be an NFL player and is now working up close and personal with its stars, albeit in a different way.

“They’ve been so cool,” Adam W says of the NFL marketing team. “A lot of times I’ll get a brief which is very open and allows me to still be who I am on camera with my videos. That makes it very organic, but at the same time pushing the narrative of the NFL and kind of combining the world of my humor with the sport.”

A recent video post of his with Dolphins wide receiver Tyreek Hill generated 30 million views. In total his videos were watched 14 billion times last year alone.

YouTube head of creators and gaming, Kim Larson, who acts as a creator commissioner, confirmed the new engaged audience demographic is “younger, more diverse, more global,” and said the platform had a couple of goals in partnership with the NFL.

“The first was to diversify and bring in new fans,” Larson said. “The second was to create an unbelievable viewing experience. And I think we did both.”

She stressed that the league had shown respect for creators from the get-go. “I felt like they were empowered with a sense of editorial control and ownership. They gave us all the tools and incredible access at all the tentpole events, the Pro Bowl, the Draft, the Super Bowl, individual games, and then archival footage. Opening up the IP and, more importantly, allowing creators to retain their monetization rights against that was huge.”

It helps that the NFL’s players are mostly in their early twenties and have grown up with social media. “They’ve grown up following different creators. There’s a lot of respect there,” Trombetta said.

“There’s opportunities everywhere for players now. You don’t have to be Tom Brady to actually create a name in this space. You can be an offensive lineman who isn’t getting the most attention on a week to week basis. That’s why they’re really starting to embrace collaboration with creators.”

The league plans to build on this foundation in tandem with YouTube and creators. One goal is to mine the NFL’s extensive video archive.

“We really haven’t figured out the best way to tap into that,” said Larson. “It’s just so voluminous. So we’re working with [the NFL] content team to figure out how we surface more of that, and get that to the right creators at the right time. We’re going to just put more gas on the fire.”

Trombetta is eyeing international growth in Latin America, where the league is breaking new ground by staging its first game in Brazil.

“We’re going to have creators all over that both from the States as well as in Brazil,” he said.

The panel also discussed the importance of authenticity and data-driven decisions to creating content that resonates with diverse audiences and drives long-term engagement and loyalty.


Media Excel Gains Visibility with AI Reality Check and Innovation

Streaming Media

Artificial Intelligence has become a buzzword in streaming video with the promise of revolutionizing how we create, compress, and distribute videos. It’s crucial to separate the hype from reality and rare to find a vendor willing to do so. Media Excel’s CEO Narayanan Rajan is one.
article here
“I think there's pressure on vendors to satisfy certain segments of the market that you're working with things that are AI and ML related,” Rajan tells Streaming Media. “There's also a lot of pressure on customers just to implement the thing with all the buzz in order to satisfy their shareholders, their owners, their constituents.
“Some of the conversations we've had have been very much along that line. They may love the fact that we’ve got AI/ML and it checks a box for them but that solution may not be appropriate for the job they have in mind.”
Rajan stresses, “We don't dismiss the pressure that the market puts on our customers to satisfy the checklist item. It's a real thing and we respect that. We saw similar in terms of cloud transition and we're seeing it now. Some of those customers that transitioned to cloud have returned to on-prem, especially for 24x7 live streams because the cost can be prohibitive.”
Media Excel has spent two decades in the business of delivering encoding and transcoding solutions for broadcast and streaming. Its most recent development, HERO DIVA (Dynamic Intelligent Video Adaptive) encoding, is AI-driven software with a claimed 20% improvement in HEVC efficiency.
Quizzed on this Rajan says Media Excel has trained its algorithm on tens of thousands of hours of HEVC content and has demonstrated at least 20% efficiencies. HEVC was the priority. Now they’re training it on AVC (H.264) data. “Currently on less than 10,000 pieces of content, so we're getting somewhere in the eight to ten percent savings,” Rajan says. “But as we train and tune the model further we expect to be able to increase the level of savings we can get.”
Training DIVA on VVC is the next plan and Rajan anticipates savings with this advanced codec in the 20% range. “Until we actually do it, it's a bit of a hypothetical,” he admits.
Built into DIVA is a video analyzer that looks at the incoming video stream, compares it to what DIVA has already learned with the training data set and then applies the best set of quantization parameters to that particular piece of content.
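To make the idea concrete, here is a minimal, purely illustrative sketch in Python of that kind of content-adaptive encoding: score the incoming content’s complexity, map the score to quantization settings, then hand off to a standard encoder. It is not Media Excel’s code; the complexity heuristic, the CRF thresholds and the ffmpeg/x265 invocation are all assumptions for the sketch, and DIVA would replace the heuristic with its trained model.

```python
# Illustrative sketch of content-adaptive encoding (not Media Excel's code).
# Idea: score the incoming content's complexity, then choose quantization
# settings per asset so simple content spends fewer bits at the same quality.

import json
import subprocess

def _fps(fraction: str) -> float:
    """Parse ffprobe frame rates such as '25/1'."""
    num, _, den = fraction.partition("/")
    try:
        return float(num) / float(den)
    except (ValueError, ZeroDivisionError):
        return 25.0

def source_complexity(path: str) -> float:
    """Crude proxy: bits per pixel per frame of the source stream.
    A trained model (as described for DIVA) would replace this heuristic."""
    probe = subprocess.run(
        ["ffprobe", "-v", "quiet", "-print_format", "json",
         "-show_streams", "-select_streams", "v:0", path],
        capture_output=True, text=True, check=True)
    stream = json.loads(probe.stdout)["streams"][0]
    pixels = int(stream["width"]) * int(stream["height"])
    fps = _fps(stream.get("avg_frame_rate", "25/1"))
    bitrate = float(stream.get("bit_rate", 5_000_000))
    return bitrate / (pixels * fps)

def pick_crf(complexity: float) -> int:
    """Map complexity to an HEVC CRF value (hypothetical thresholds)."""
    if complexity < 0.05:
        return 28   # static, simple content: compress harder
    if complexity < 0.12:
        return 25
    return 22       # busy sport or action: protect quality

def encode(src: str, dst: str) -> None:
    """Encode with libx265 at the content-adaptive rate factor."""
    crf = pick_crf(source_complexity(src))
    subprocess.run(
        ["ffmpeg", "-y", "-i", src, "-c:v", "libx265",
         "-crf", str(crf), "-preset", "medium", dst],
        check=True)

if __name__ == "__main__":
    encode("input.mp4", "output_hevc.mp4")  # hypothetical file names
```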
“We support all deployment modes, whether it’s an appliance equipped with a GPU or a virtual machine or a container in the cloud. The only requirement for DIVA is to be on a GPU instance.”
The software is a licensable feature of its core HERO 6000 platform. It is codec independent and edge device independent.
“One of the motivations for DIVA for us was to contemplate what would happen if our customers did not have to wait for the typical 10- to 12-year codec transition timeline,” Rajan says. “What if they didn’t have to wait to implement the codec in all the edge devices before the end-to-end benefit can be realized? What if we could implement an ML technology that could be quality focused, target bitrate focused and offer quantization parameter modifications, but would not affect anything else downstream at the edge devices?”
“For those streaming at some level of scale the cost to achieve 20% bandwidth optimization would be insignificant compared to the distribution cost saving. Plus, you would not have to wait for all the edge devices to catch up. You could operate with the devices that are out in the market already.”
Candidly he admits that DIVA comes with some computational overhead making it less suitable for customers with a smaller subscriber base.
“But if you are crossing a particular threshold of subscribers and streams there is an intersection point where it absolutely makes sense. For those customers the video infrastructure costs rapidly become irrelevant in terms of the cost savings you get on the distribution side.
“We think that even the customers who choose us initially because AI/ML is a checklist item will benefit downstream from having that feature enabled on their platform.
“There's a deeper truth in our industry which is business uncertainty drives the need to control costs. Because of that customers are always asking the question, ‘what can you do to help me control the total cost of work?’”
“We support all the different workflows including VOD and live to VOD but live streaming is the thing that's hard to do. If you get that right you will get the other stuff right as well. So that's really been a focus point for us.”
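As a rough back-of-envelope illustration of the intersection point Rajan describes (all figures invented for the example, not Media Excel’s): if delivery costs $0.02 per GB and the GPU-backed encode adds $20,000 a month in compute, a 20% bitrate saving pays for itself once monthly delivery passes roughly 5 million GB.

```python
# Hypothetical break-even sketch: when does a 20% bandwidth saving
# outweigh the extra GPU encode cost? All numbers are invented.

CDN_COST_PER_GB = 0.02             # $ per GB delivered (assumed)
EXTRA_GPU_COST_PER_MONTH = 20_000  # $ of additional compute (assumed)
BITRATE_SAVING = 0.20              # claimed HEVC efficiency gain

def net_monthly_saving(delivered_gb: float) -> float:
    """Distribution saving minus the added encode cost, per month."""
    return delivered_gb * CDN_COST_PER_GB * BITRATE_SAVING - EXTRA_GPU_COST_PER_MONTH

break_even_gb = EXTRA_GPU_COST_PER_MONTH / (CDN_COST_PER_GB * BITRATE_SAVING)
print(f"Break-even at about {break_even_gb:,.0f} GB delivered per month")
# -> 5,000,000 GB (5 PB); beyond this, distribution savings dominate.
```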
Media Excel is working on more AI developments including DIVA Pre-Process, which is part of the company’s ongoing work to improve visual quality by detecting artefacts and automatically correcting for them, as well as optimizing HDR.
Another piece of development is around audio detection, subtitle generation and language translation based on existing AI libraries.
“These kinds of efforts are a little bit fragmented. So while Google, for example, may have a great library for this kind of translation, it may not have the best library for, say, Korean audio detection or Japanese translation. We want to be able to choose the best libraries, incorporate them into our product and really create something that stands on its own.”
It also plans to collaborate with other AI developers. One example is a tie-up with a scene detection specialist for fast turnaround and publishing of clips and highlights in sports.
Media Excel claims its products power 400 million multi-screen subscribers worldwide. Its broadcast and streaming customers include ESPN, F1, QVC, the Home Shopping Network and Telenor. Spanish soccer league La Liga also uses Media Excel products in its streaming platform.
But despite twenty years in business the company “feels a little bit new” says Rajan, not because it lacks for product or market acceptance but because it has a “visibility problem.”
That’s partly why the former MediaKind executive was hired as interim CEO ten weeks ago – and has now made the position permanent.
“The company isn’t a MediaKind or a Harmonic with a big portfolio having to drive hundreds of millions of revenue. We're a small engineering-led company but because of the transition through various leadership modes over the last three years now it's time to get out there and let the world know what we're doing.”
Media Excel will be at IBC Show showcasing DIVA and demoing live low-latency streaming at three seconds glass-to-glass with HERO 6000.

Thursday, 11 July 2024

VideoAmp extends video ad planning capabilities to AR with Snap

Stream TV Insider

article here

If Augmented Reality is to become a potent part of brand campaigns, then marketers need data to back up their investment. A new tie-up between VideoAmp and Snap aims to help deliver on that goal, while providing cross-platform insights.

The TV measurement and currency vendor has partnered with the instant messaging app to integrate Snap’s first-party data, video, and AR inventory into VideoAmp’s cross-platform planning solution.

As explained by the companies in a release, as AR adoption increases, VideoAmp’s existing video planning capabilities will provide tools for advertisers looking to explore and test new AR formats. According to Snapchat, helping partners plan for new formats like AR is important as over 300 million “Snapchatters” engage with AR every day on average.

The idea is that advertisers will have more information about where to place budgets to reach audiences whether they are on linear TV, streaming or digital platforms.  

Snap also gains the ability to independently run supplemental measurement and planning alongside its core advertising, the companies said. And agency partners can tap VideoAmp reach planning and measurement tools to gain insights into how their campaigns on Snapchat drive metrics like incremental reach to TV buys and TV tune-in.

The collaboration follows hot on the heels of last month’s study by Snap which found that using AR alongside video drives 5x more active attention compared to other social media platforms.

The study, carried out with Omnicom Media Group (OMG) and Amplified Intelligence, measures 'active attention' as the percentage of ad length during which viewers' eyes remained on the screen. This was tracked by Amplified Intelligence’s eye-tracking technology. Attention paid to video ads alone on Snap is 2x greater compared to similar inventory on other platforms, per the research. Adding Snap’s AR filter Lenses into the mix boosts that further.

As Snap/OMG puts it, video + AR “lifts both short-term brand choice and long-term brand loyalty, with shareable AR experiences supercharging impact.”

In today’s announcement, Pete Bradbury, chief commercial and growth officer for VideoAmp, called its pact with Snap “a game-changing moment for the industry and cross-platform planning.”

Alexander Dao, global head of Agency Development & Sales Partnerships at Snap Inc., said the tie-up gives advertisers “more choices to plan and measure across our video and first-to-market AR formats.” 

Also in the release, OMG’s Megan Pagliuca, chief activation officer, testified that consumer engagement with video content was increasingly fluid and diverse, “whether linear TV, streaming environments like CTV, social platforms like Snap or all of the above.”

She said, “What matters to us is understanding these complex consumption dynamics and reacting with smart decisions. Gaining visibility into Snap’s in-platform video consumption strongly supports our mission of holistic cross-channel video planning.”

Snap claims it helps advertisers reach more than 800 million people on Snapchat every month from 414 million daily active users worldwide, a quarter of whom are based in the U.S. The company reported revenues of $4.6 billion last year, up from $2.5 billion in 2020. According to Business of Apps, Snapchat’s net loss narrowed in 2023 to $1.3 billion and it has not had a profitable quarter since Q4 2021.

VideoAmp has been growing its business while undergoing turbulent internal restructuring that saw founder and CEO Ross McCray step down in January amid a slashing of 20% of its workforce. Since then, VideoAmp was one of two measurement vendors (alongside Comscore) to secure the greenlight from the U.S. Joint Industry Committee for certification of its measurement datasets as a national cross-platform currency suitable for transacting TV ad deals.

In announcing new executive hires for its sales team at the end of April, VideoAmp said that its currency would guarantee more than $1 billion in ad dollars in 2024.

Driving in Virtual Production: Travelling without moving

Definition

article here (interviews and copy written by me)

Creating compelling driving shots used to be a headache – but virtual production has reinvented the wheel.  

In-camera VFX, a crucial pillar of virtual production, offers a controlled environment for shooting simulated travel scenes. An ability to swap between locations and times of day on a whim means a project’s vehicle shots can be captured quickly – and all in one go.

“There are no weather constraints or requirements to schedule shooting at the end of the day in the desperate hope of shooting a key scene during golden hour: these results are guaranteed in VP,” says Jonathan Davenport, executive producer at October Media.

VP allows the director to reset at the click of a button. “If an actor fluffs their line or the director wants to shoot a second take, the vehicle doesn’t need to be taken back to the start location, which saves time and frustration,” says James Franklin, VP supervisor at Dimension.

Low loaders, the traditional means of car process shots, never quite look correct through the lens. “The vehicle sits too high on the horizon,” Franklin explains, “whereas the content on an LED wall can be adjusted vertically to get the correct horizon line. Content can also be spun 180° to shoot reverses or 90° to shoot side profiles. For city shoots, it’s much easier to shoot a 360° plate and take that into a volume, than shut down whole streets for a practical shoot.”

Since opening in 2021, MARS Volume has seen a steady increase in productions choosing LED stages over location (low loader) and green-screen shooting to achieve highly realistic driving sequences.

“Drive under a bridge, and see the shadow of the bridge pass right over the top of the vehicle,” says Bild Studios CCO Joanna Alpe. “No green spill means achieving natural skin tones and colour management is a breeze here when compared with green screen.

“Time and time again, directors will tell us that access to the actors in the vehicle makes a huge difference, allowing them to easily take bathroom breaks, get refreshments and take direction with ease as opposed to the restricted access of a low loader.

“You can shoot at any time of day, in any hard-to-access location,” she says, “such as airport runways or Oxford Street at Christmas time for example.”

October Media is a production service company based in Norfolk that offers a fleet of US action vehicles, virtual production capabilities, studios, workshops, standing sets, crewing and early scheduling/budgeting facilities.

“Driving scenes shot using VP unshackle filmmakers from the many constraints of shooting on real roads, using traffic management and low loaders that never seem to have enough seats for all the crew,” says Davenport. “They can achieve the precise look they want using multicamera plate shots from libraries like drivingplates.com displayed on the LED volume.

“The interactive nature of lighting also enables artists to perform dangerous actions while appearing to drive as passing scenery is reflected in their eyes. Add to this that productions enhance their sustainability credentials by not travelling to locations!”

Tips from the experts

Bigger isn’t always better when it comes to car process shoots on a virtual production stage. “Great results can be achieved with a fairly modest volume,” says Franklin. “DOPs need to be careful of any reflections showing any seams in the volume – such as gaps between the wall panels and the ceiling panels. Wild walls (movable wall panels which can also tilt up and down) are a useful addition in this respect. Moiré is also something to watch out for, especially in the windscreen.”

Davenport warns against relying purely on the LED volume and practical lighting to create the look for driving sequences. “We’ve seen the best results achieved when practical SFX are deployed alongside VP,” he says. “It can be as simple as having a car window open with wind FX moving an artist’s hair as they travel in the car, or the use of rain FX striking the windscreen. These additional practical textures enhance the realism of a scene immeasurably.”

Elements that productions should pay attention to are: drive-in access for vehicles, size of volume (not too big, not too small: the Goldilocks zone), movable LED walls and a manoeuvrable ceiling that can be raised. Studios like MARS Volume complement this with an agile video playback system that syncs to turntables, camera and lighting fixtures.

That’s not to say it’s all plain sailing. Alpe’s advice for DOPs working in this environment is to plan, plan and plan. “Previsualisation is key – knowing where the car needs to be in relation to the screen is important to ensure correct reflection and lighting coverage. We’ve developed a pre-production workflow to support DOPs in this regard.

“Ensure you have adequate pre-light time and allow plenty of time for tweaks. This is the opportunity to play with the medium, see how the light and reflections dance over the metallic vehicle, and maximise these features for creative effect,” she advises.

“Keep an eye on the blend zones on camera stitches. As with compositing plates in traditional VFX pipelines, you have to be careful you don’t capture the blend zones in-camera, in your shots. Working with experienced and astute operators easily overcomes this.”

The road ahead

Virtual production technology is one of the fastest moving in the industry. So it’s worth asking, where is the tech headed?

“We will see archive libraries increase their offerings of driving plates and the generation of plates with period looks,” Davenport thinks. “This is particularly exciting for filmmakers who are looking to enhance production value and generate scenes with period cars or landscapes in the background.”

Alpe points to advances that will make ICVFX more accessible, specifically when it comes to environment creation, and ensures that a portion of MARS Volume’s bookable days are set aside for R&D.

Recent releases of Unreal Engine have enabled playback of 16K EXRs in real time. “This increase in resolution looks spectacular on the wall,” says Franklin, “and the increased colour depth means that we can colour correct shots far more accurately, keeping details in both the highlights and shadows.”

Wednesday, 10 July 2024

Behind the scenes: Multi-format shoulder programming takes Centre Court at Wimbledon 2024

IBC

As The Championships heads to its finals weekend, the host broadcast team reflects on expanded behind the scenes access.
article here
SW19 is dormant from a broadcast point of view outside of the annual Wimbledon fortnight, but the All England Lawn Tennis Club’s (AELTC) in-house host broadcast team, Wimbledon Broadcast Services (WBS), spends the full year on development.
“For the last five years we’ve been in a very good place covering all the tennis across the main draw and last year we began to focus on delivering more in-depth coverage and additional layers of editorial output,” says Paul Davies, Assistant Director, Broadcast, Production & Media Rights, AELTC Wimbledon. “We have had an unprecedented amount of downloads from our central content repository Mediabank.”
The number of beauty cameras has been increased to 22, there are now 26 stand-up positions around the media pavilion, which has now been completed to offer three levels of glass balcony for broadcaster presentation overlooking the fields of play.
The headline new camera is a drone provided by Aerios Solutions, operating from the adjacent golf course. “To get permission we had to be sensitive to players and spectators so that we never impact on their experience,” says Davies. “We did a lot of testing to identify how close we could come to the periphery of the ground and established a red zone where we can and cannot fly. The ability to fly at 1,500ft for views into Centre Court and No.1 Court is something we couldn’t do before and it gives us sweeping imagery over the lake and a number of other areas over the park.”
It’s been known for a while that this was likely to be Andy Murray’s last hurrah, for which a special filmed and on-court tribute was planned at the start of the year.
WBS commissioned Whisper to compose a video featuring highlights of his two decades on tour, which necessitated clearing all the footage to ensure all international broadcasters could show it. Clips of greats like Roger Federer and Serena Williams were filmed, and the piece was edited to a radio-friendly version of Radiohead’s Creep.
The main challenge was orchestrating when this would be played since it needed to be played out on Centre Court, and on the live feed for all broadcasters, simultaneously.
S&P Global estimates that media rights for The Championships account for more than half of AELTC’s total revenue, which in 2022 was £350 million. One of its newest recruits is Amazon Prime Video, which netted four years of rights (2024-27) to the tournament in Germany and Austria, beating out Sky.
“It’s the first one of the internet giants to work with us,” says Davies, who talks of onboarding a major new broadcaster to the Wimbledon experience. “They have a rooftop studio consisting of six 4x4m parasols providing a weatherproof presentation position. They had to acquire all the Wimbledon assets, consult with us about the look and feel and branding of their programming and it’s been really professional and creative to work with them.”
However, neither Amazon Prime nor the largest US broadcaster, ESPN, is airing The Championships in UHD HDR, which is why just Centre Court and No.1 Court are filmed in the format. Instead they are taking 1080p HDR.
“We’re not hearing a clamouring for more UHD HDR including from North America,” says Davies. “The bandwidth and overheads involved in moving UHD around are so much greater versus the trade-off in the increase in picture quality.”
More important from a rights holder point of view, and therefore for WBS’ focus, is the shoulder programming. Davies calls it “critical” and tasked Whisper, Wimbledon’s production partner, to expand it this year.
Whisper expanded Access
As last year, Whisper is producing the World Feed, international highlights and an official film of the Championships, as well as Access All England – the focus of this year’s expansion, explains Executive Producer Harry Allen.
“The feedback from broadcasters was that Access All England offered them a valuable resource to cover more behind the scenes action than they would be able to alone. The difference this time is that it is a curated feed offering another layer of editorial value.”
Access All England is a daily stream of BTS cameras plus additional ones at practice courts, arrivals, player restaurant and dressing room.
Whisper invited rights holders to suggest which players they would like covered on which day. This works both ways, since it means Whisper are able to react to stories around players that are perhaps less well known and outside the top seeds.
Allen says, “We would like to be in a situation where a broadcaster comes to us and says ‘We have one of our big hopes playing tomorrow’ and we can give them wall to wall coverage from the moment the player arrives, to their warm-ups and pre-match interview.”
The latter is particularly new and something Whisper are doing on the Aorangi Terrace, commonly known as ‘Henman Hill.’  
“Rights holders are allowed to capture interviews at designated points but we can go everywhere and provide extra depth,” says Allen.
Other changes include offering an additional shallow depth of field Cine-cam (a Sony Venice) capturing players as they arrive at SW19. Later each day this is offered as a live camera on court, for instance when players change ends and for crowd reactions.
Four BTS cameras have had their audio disabled “because they are in discreet areas where players don’t expect to be heard,” says Davies.
As last year, certain players are mic’d up during their practice sessions on Centre or No.1. Joe Bennett, Whisper’s Creative Director, Digital, explains, “We will give them Apple AirPods and enhance this with directional parabolic mics. It’s a low friction method of gaining insight into game play without hindering the player’s movement. You could also eavesdrop on a three-way conversation between, say, a journalist with Andy Murray and his coach by giving Andy and his coach an AirPod each.”
Plucking an innovation from the ATP Tour, 9-year-old Priya Rose Brookwell is interviewing players including Frances Tiafoe, Alex de Minaur and Aryna Sabalenka “in a cheeky style” which is intended to “bring out a player’s personality,” says Bennett.
Introduced this year is film of the current Champions returning the Wimbledon trophy. Carlos Alcaraz and Markéta Vondroušová were filmed in separate courtesy cars, cuddling their trophies on the way into Wimbledon while reflecting on what lies ahead.
Returning is Wimbledon Threads, a strand fronted by fashion influencer Morgan Riddle, and also Purple Carpet, a series of interviews with celebrities and those in the Royal Box.
Bennett says, “We work on all the biggest properties in sport from Formula One and The Euros to Test Cricket and Paralympics and the one ticket everyone wants is a ground pass for Wimbledon.
“We want to try and sell the society and the occasion element of being at Wimbledon. In the closed season we’ve created a larger red carpet arrival area so that when celebrities arrive they get a sense of occasion and hopefully it will become a focal point for interviews where they want to be seen and have their photo taken.
“Some host broadcaster content can be very bland so we’re trying to bridge the gap between delivering vanilla content and creating content that broadcasters will run again and again.”
This is the second and final year of Whisper’s contract. Its aim is to keep rights holders happy so that they rebook with the AELTC and so, in turn, the Club rebooks with Whisper.
Although Netflix tennis doc Break Point was not recommissioned this year, a number of other doc filmmakers are being accommodated by Wimbledon, including Morena Films’ Alcaraz, a docuseries for Netflix, and another featuring legends Chris Evert and Martina Navratilova.
Social media content
While millions of people watch the linear output, many are also keeping up to date on their phones, so there’s bespoke and expanded coverage to fulfil a diverse range of audiences, says Georgina Green, Senior Broadcast & Production Manager, Wimbledon.
“We’re producing all that in broadcast quality, putting it up on Mediabank and making it available in all formats - 16x9, 9x16, 1:1. It’s a normalised workflow this year.”
Green speaks of “a level of reactivity” in seeing what works on which platforms in which territory. The team works with agencies to help tailor content for territories like China.
Official social media teams have priority access court-side. Players with a social following of 2 million or more are offered the chance to bring their own social media person behind the scenes to film content for their own channels, and which WBS has access to as well.
“We’ve got real energy and buy in from players,” says Green. “The more they do with us the more reaction they get on social.”
Video filmed from the POV of a player in their hotel, putting on their kit, in the car en route to Wimbledon, in the Aorangi and in the stretch area had great traction on social, she says.
“It proves there’s a thirst for that type of content.”
James Muir, Broadcast Technical Manager, says speed is important to make everything available, particularly on the first few days when more nations have more players in the draw.
“Even a ten second clip of a player practicing is important for rights holder coverage so they can play that before they go on court.
“We don’t use AI but do use lots of automated media management workflows. A press conference for example doesn’t need a human to mark in and out or to make a decision to publish. That workflow can be automated.”
NEP and UHD HDR coverage
NEP is the main facilities provider. Sam Broadfoot, Technical Project Manager at NEP UK explains, “This year we have significantly more feeds delivered to Rights Holders in HDR compared to 2023, but continue to offer HD SDR feeds as we have previously. This change has caused a big increase in bandwidth requirements through our IP delivery system, but we have managed to reduce the number of overall conversions to SDR, using over 150 channels of conversion. For the 2024 Championships, every court is available in HDR, as is the new Access All England feed.”
Mediabank, the media asset management solution, is used for remote access to match highlights, press conferences and other content to be ingested, managed and distributed for rights holders. This is being facilitated remotely from IMG to Oslo, Norway via a 10G link provided by NEP Connect.
Onsite NEP division Edit & Ingest are providing over 130 EVS products for the Championships and 1PB of central storage. This includes media asset management for over 10 onsite and remote Rights Holding Broadcasters. Additional broadcast services from NEP include 50 EVS VIA machines, 93 Sony cameras, 150 talkback panels and over 90 km of cable installed each year. More than 500 NEP broadcast engineers, technicians and crew members are onsite supporting the host broadcast and other rights holders.
Seven courts are equipped with the automatic camera system TR-ACE from NEP division Fletcher. TR-ACE cameras use image recognition and LIDAR to automatically track players on the court, meaning a single operator can control and manage the system for all seven courts.
Specialist camera and RF
EMG division Aerial Camera Systems (ACS) is supplying 46 robotic camera systems and a Fancam Cine (in partnership with EMG Connectivity) to WBS. The robotics are predominantly Sony HDC-P50s fitted to SMARThead remotes and supplied with custom-made 12G SMPTE fibre transmission systems. Centre Court features five robotic cameras including a 10m baseline track sitting behind the players and tracking their horizontal movement.
New for 2024 is an ACS high 20m rail camera inside Centre Court capturing the expanse of the court and the spectators within it.
Centre also has two robotic compact cameras, one for each player, fitted discreetly to the Umpire’s chair, plus remotes at camera position 11 and in the northeast corner of the stadium.
No.1 Court is the same minus the NE corner remote, No.2 has two positions and most of the other courts have at least one robotic camera taking a wide master on a high pole or on the side of a building all with bespoke mounts. There’s also another track system on the southern court approximately 25m in length.
Various robotic SMARTheads capture beauty shots from the trophy balcony and clubhouse (which sports a 100:1 box lens), player’s balcony, crowd cam and even a ‘flower cam’ – the latter among those in UHD HDR. The press conference area also has three discreetly mounted robotic cameras for interview coverage.
The practice area is also covered with robotic systems enabling rights holders to provide live coverage of players warming up.
EMG Connectivity is providing RF links across Wimbledon for WBS, including for two handheld cameras, a Steadicam, the Wirecam and the drone, all in UHD HDR. It is also providing four roving handheld RF links, all in 1080p50 HDR, two of them for the Cine-style cameras.
It has 48 antennas around the grounds of Wimbledon to cover every nook and cranny, all extended on fibre back to the RF cabin and switched with an RF router.
All the cameras have GPS positioning systems on them so EMG can locate and track them for efficient use of the RF router.
Thirty-six broadcasters have an on-site presence, with ESPN sending hundreds of staff working out of a refurbished ‘Disney wing’ of the broadcast centre. This is a double-storey area featuring a 12m high window “which effectively is the window on Wimbledon for the American market,” says Davies.
“They’ve created a new studio set, which is absolutely stunning, very modern looking, with production suites, edit suites and green rooms. Certainly other broadcasters are looking on rather jealously.”