Wednesday, 3 September 2025

IBC Conference: Google’s Justin Gupta on how to “maximise digital revenues” with ad-tech

IBC

article here


Justin Gupta joined Google in 2006 just a few months before it acquired YouTube. “It was a super interesting time to join Google and it looks like that was a good deal in hindsight,” he says with some understatement.

Earlier this year YouTube overtook ITV to become the second most-watched video service (after iPlayer) in the UK, according to Ofcom.

“YouTube has proven to be a great distribution platform for content and a lot of the great video content that exists comes from traditional broadcasters,” he says.

Already a seasoned executive working on interactive TV at the BBC and Red Bee Media, Gupta has experienced 19 years at the heart of Google working on everything from Google Maps and YouTube to broadcaster and publisher monetisation products like AdSense, Google Ad Manager and Dynamic Ad Insertion.

In this, his eighth visit to IBC, Google’s EMEA Head of Broadcast and Video Ads is focussed on helping broadcasters to grow their digital revenues.

“When a broadcaster adds a streaming service or wants to use digital technology to enable linear addressable ads then my team help them work on that. We also enable some broadcasters to sell their own media on YouTube using Google Ad Manager.”

Speaking on the IBC Panel: Is this the end of advertising as we know it? New models, new partnerships, new technologies, Gupta will deliver a positive message for broadcasters making the move to streaming.

“The industry is at a really exciting tipping point since every broadcaster is now a digital and a linear media owner.”

He cites data from Omdia revealing that UK broadcasters are currently generating 28% of their revenues from digital. That’s ahead of other major European markets Germany, Italy, France, and Spain which make about 10% of revenues from digital. The direction of travel is nonetheless the same.

“The growth trend is clear. For example, Channel 4 in the UK have stated they will be a digital first broadcaster by 2030. The experience of Channel 4 is that distributing on YouTube has not cannibalised their audience and in fact they are bringing new audiences back to TV. What you want to do is make people aware of your content and bring people back to your own platforms. Channel 4 have been the stars over the last couple of years by really making a success of that.”

In tandem with this feedback loop to attract new audiences, the real push has to be to maximise digital revenues as people move from one-to-many linear experiences to one-to-one digital experiences.

The opportunity to be grasped is programmatic advertising. Gupta says this is crucial, bordering on existential, for broadcasters seeking to enhance the value of their TV media.

“Broadcasters must maximize their digital revenues in the face of increasing competition from global players. Programmatic advertising is the essential path forward to do that.”

Programmatic TV advertising leverages automated technology to facilitate the buying and selling of ad inventory in real time while enabling highly relevant, audience-first targeting across platforms.

“If you think about it, Connected TV now facilitates one-to-one advertising rather than one to many while programmatic advertising improves the efficiency and effectiveness of TV trading. This combination creates exciting opportunities to show new advertising formats on TV that weren't previously possible.”
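To make the trading mechanics concrete, here is a toy second-price auction in Python. This is a generic illustration of how programmatic systems clear an impression, not Google Ad Manager's actual logic; every name and value below is invented:

```python
# Illustrative sketch of programmatic trading for one Connected TV ad slot.
# Hypothetical names throughout; real exchanges add pacing, brand safety,
# frequency capping and far richer targeting signals.
from dataclasses import dataclass

@dataclass
class Bid:
    buyer: str            # demand-side platform submitting the bid
    cpm: float            # bid price per thousand impressions
    audience_match: bool  # does the buyer's target segment match this viewer?

def run_auction(bids, floor_cpm):
    """Pick the highest eligible bid; the winner pays the second price."""
    eligible = sorted(
        (b for b in bids if b.audience_match and b.cpm >= floor_cpm),
        key=lambda b: b.cpm, reverse=True)
    if not eligible:
        return None, 0.0
    winner = eligible[0]
    # Second-price rule: pay max(runner-up bid, floor)
    clearing = eligible[1].cpm if len(eligible) > 1 else floor_cpm
    return winner, clearing

bids = [Bid("dsp_a", 24.0, True), Bid("dsp_b", 31.5, True), Bid("dsp_c", 40.0, False)]
winner, price = run_auction(bids, floor_cpm=18.0)
print(winner.buyer, price)  # dsp_b wins (dsp_c fails targeting), pays 24.0
```

The second-price design is what makes the automation viable at scale: buyers can bid their true value for each individual viewer without overpaying, which is exactly the one-to-one trading Gupta describes.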

Enter SGAI

Gupta highlights emerging industry standards like Server Guided Ad Insertion (SGAI), a hybrid method that combines the strengths of client-side and server-side ad insertion. SGAI, he says, enables new addressable formats, such as ads shown in boxes side by side with content, or L-shaped banner ads where the screen reduces as the advertising is shown. Picture-in-picture experiences, where the content continues in a smaller window on screen, are another possible format.

“Because we can make these formats fully addressable, you can now show different ad experiences to different viewers without disrupting the viewer experience.”
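SGAI is still being standardised and Google's implementation is not public, but the split Gupta describes can be sketched: the server makes the per-viewer ad decision and describes the break, while a lightweight client handles only presentation. The Python below is a hypothetical illustration (all names, URLs and layouts invented):

```python
# Hedged sketch of the server-guided idea behind SGAI, not a real spec:
# decisioning lives on the server (addressable), rendering on the client.
def server_decide_break(viewer_profile):
    """Server side: pick creative and format per viewer."""
    if viewer_profile.get("live_sports"):
        layout = "squeeze_back_l_band"   # content shrinks into an L-shaped frame
    else:
        layout = "picture_in_picture"    # content continues in a small window
    return {
        "creative_url": f"https://ads.example.com/{viewer_profile['segment']}.mp4",
        "layout": layout,
        "duration_s": 20,
    }

def client_render(instruction):
    """Client side: no ad decisioning here, just presentation."""
    return (f"render {instruction['layout']} for {instruction['duration_s']}s "
            f"from {instruction['creative_url']}")

decision = server_decide_break({"segment": "autos_intenders", "live_sports": True})
print(client_render(decision))
```

Because the decision payload differs per viewer while the client logic stays fixed, two households watching the same live match can see different creatives in different layouts without either stream being interrupted.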

Gupta says Google see the most near-term opportunities for SGAI in live sports, where additional commercial opportunities can be added in a non-disruptive way for highly engaged audiences - he will demo this on stage. Google has used the SGAI approach with one of the largest sports leagues in the US.

“SGAI is still nascent so there are some wrinkles here and there but we're seeing a lot of the innovation happening in the sports vertical,” he says. “What I expect will happen is it will launch first at scale with sports and then, over time, other broadcasters will be able to take advantage of this technology. The brilliant thing is it's bringing different advertising formats to TV that we've not had before. And they're also addressable and programmatic.”

Programmatic TV is being promoted by a European consortium of broadcasters, ad agencies and ad-tech providers, of which Google is one. The European Programmatic TV Initiative aims to dismantle the barriers to programmatic TV growth across measurement, currency and identity.

“We’re aligning on standardised approaches, removing obstacles to address technical and operational hurdles and fostering better collaboration to reduce buyer and seller misalignment,” Gupta explains. “As TV advertising has developed uniquely in different countries the industry is now working to harmonise approaches across the five largest European markets.”

The report for Stage One of the Initiative—A Roadmap for Programmatic TV in Europe—has now been published (and can be downloaded here). This sets out a series of practical next steps to unlock the full potential of programmatic TV in Europe, to be taken forward in Stage Two.

Generative AI will democratise advertising

The rise of Generative AI video might have sent shivers up the collective spine of the traditional advertising community, but Gupta points out that AI and machine learning have been used to optimise ad placement for years. He acknowledges that Gen AI has the potential to revolutionise advertising workflows in the coming years, but argues this disruption will unlock huge potential.

“In the same way that YouTube democratised broadcasting, Gen AI video is going to democratise content creation. In advertising that means it's going to become easier to create high quality ad content at an affordable cost, and this unlocks a significant new opportunity within digital advertising.”

He continues, “One promise of addressable advertising was that every viewer could see a different version of the same ad, but in practice the production workflows required you to bake addressability into the production process if you wanted to create a thousand variants of your ad with a different end segment for each town or city in your country. Now you can do this on-the-fly. As these tools mature, you may even be able to create such variants in real time in products like ours.”

Where does that leave the creative agency? Gupta thinks the smart ones will use AI to make their creative better, more efficient and more cost effective so that broadcast advertisers can gain the benefits.

“Ultimately, AI doesn't replace creativity. All it does is speed up some of the execution,” he insists. “Generative AI can enable advertisers to create high-quality creative assets at unprecedented scale, supercharging the creative process.”

For smaller advertisers who were priced out of advertising on TV before, Generative AI video creation and adaptation tools are an enabler. “They'll lower the cost of creating and reversioning video ads and that again will enable small advertisers to create high quality ads.”

Making linear experiences addressable

Since Dynamic Ad Insertion remains a big focus for the broadcasters Google works with, Gupta outlines three main approaches to making linear experiences addressable, supported with Google Ad Manager.

First, in streaming, using Dynamic Ad Insertion - as DAZN recently did for the FIFA Club World Cup, streaming 1:1 addressable ads to millions of concurrent viewers across the world. Watch out for a case study that Google is launching on this with DAZN at IBC2025.

Second, in set-top-boxes with linear ads replaced with digital ones during live viewing. Sky have been doing Dynamic Ad Replacement successfully for around a decade using Sky AdSmart. 

And thirdly, in Connected TVs around free-to-air experiences, using technologies like HbbTV to essentially stream addressable ad breaks ‘over the top’ of the TV content.
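All three approaches ultimately come down to the same operation: splicing per-viewer ad material into a linear stream at a marked break. As a toy illustration (a stand-in for real HLS/DASH manifest manipulation, not any vendor's actual format), playlist stitching might look like this:

```python
# Illustrative sketch of dynamic ad insertion at the playlist level.
# Segment names and the AD_BREAK marker are hypothetical; real systems use
# SCTE-35 markers and rewrite HLS/DASH manifests per session.
def stitch_playlist(content_segments, ad_segments_for_viewer):
    """Replace each 'AD_BREAK' marker with this viewer's chosen ad segments."""
    out = []
    for seg in content_segments:
        if seg == "AD_BREAK":
            out.extend(ad_segments_for_viewer)  # addressable: varies per viewer
        else:
            out.append(seg)
    return out

content = ["c1.ts", "c2.ts", "AD_BREAK", "c3.ts"]
print(stitch_playlist(content, ["ad_petfood.ts"]))  # viewer A's stream
print(stitch_playlist(content, ["ad_suv.ts"]))      # viewer B's stream
```

Whether the swap happens in the streaming origin, the set-top-box or an HbbTV overlay, the principle is the same: the content timeline is shared, and only the break payload is personalised.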

To underscore the importance of broadcasters leaning into programmatic advertising in TV, he highlighted that “TV advertising is funding a significant portion of the content we love. So, if you love TV, Ad Tech will continue to be a strategic enabler for the future of TV.”

Tuesday, 2 September 2025

Esports World Cup: “Everybody born on this planet is going to be a gamer”

IBC

article here

The Esports World Cup demonstrated a future in which competitive video gaming blends with traditional sport and entertainment on a global scale.

In many ways the competition to win the Esports World Cup was a sideshow to the bigger picture laid out in Riyadh, Saudi Arabia of a future that fuses sport, gaming, music, social media, fashion and film with esports at its epicentre.

While the Saudi-backed Team Falcons defended its title on home soil, the seven-week tournament was a showcase for how culture verticals, both virtual and physical, are already mixing.

“Esports and gaming are no longer boxed in as being purely a standalone experience,” said Mike McCabe, COO of EWC organiser the Esports World Cup Foundation. “We see it as something much bigger.”

“Games are no longer just products but platforms to global culture,” added Paul Cairns, EVP & Chief Business Officer, Electronic Arts during a two-day conference running in parallel with the EWC final.

“There’s a reason we called it the ‘New Global Sports Conference’ and not an esports conference,” said Ralf Reichert, CEO, EWC Foundation. “In the not-so-distant future the lines will blur even more and both sports and esports will go hand in hand.”

The gap between traditional sports and their digital versions is closing. F1 drivers, for example, spend 80% of their time in a simulator to find the extra 1% on track. F1 driver Lando Norris visited EWC25 and was quoted as saying the skill on display was insane. “This is not hype,” said Reichert. “This is excellence recognising excellence.”

Although esports has had to fight for acceptance at the head table of governments, mainstream media and sporting governance in many countries, that’s not the case in the Kingdom where support is sanctioned from the highest level down. The Crown Prince of Saudi Arabia, Mohammed bin Salman Al Saud, also known as MBS, attended the closing ceremony while state ministers of communication, tourism, investment and sport took to the stage to espouse esports as a fundamental pillar of the country’s 2030 pivot from oil to human resources.

“He has grown up as a gamer,” said Reichert of MBS. “That's why he understands this. It is endemic here, but generational change across the world means that political acceptance of gaming and esports will happen as a matter of course, and pretty quickly. This is not step by step. This will have a huge impact across the world, to the point where it will blend with most traditional sports.”

That the majority (63%) of the country’s 36 million population are under the age of 30, and that 68% identify themselves as gamers, is an indicator of esports’ domestic popularity.

“The reality is that everybody born on this planet going forward is going to be a gamer,” McCabe said.

Traditionally, video games were a much more solitary experience in which kids sat in their bedrooms playing with a console or PC. Now games publishers are developing their products with mass participation and cross-pollination of IP in mind.

“With esports you combine entertainment, technology and sports, which attracts a lot of young talent,” said EA’s Paul Cairns. “We are evolving from a traditional video games company into more of a connected, community-driven entertainment company.”

This is particularly the case for EA’s biggest franchises like EAFC and NFL title Madden which have become massively connected online experiences.

“The reason we've done that is to offer more for our players to do in and around the game,” Cairns explained. “We’re also giving players the opportunity to create by personalising their avatar or introduce other forms of UGC. It's all about social interaction, self-expression and connection with your friends.”

He added: “Gaming is the fastest growing form of entertainment. We see the future of entertainment, much more like an interactive connected game rather than a passive sit back experience. Those who are natively interactive are in a really good position to lead that going forward.”

One of the most exciting new developments is the introduction of real world live sports matches streamed onto game platforms. Earlier this year EA integrated live broadcast MLS games inside the EAFC mobile app. The potential is for viewers to stream the match action live and then simulate match highlights, putting themselves in as a player in the match.

“Now, you can do everything that we've always dreamed of,” said Matthew Ball, CEO of consultancy Epyllion. “Could we out-dribble Ronaldo? Would we have made a better substitution? Could we perform at the level that we imagined we always could? These forms of investment are about intensifying community, intensifying culture, bringing the game to geographies traditionally overlooked or under-invested. That’s where we're going to find the games industry growing between 2030 and 2050.”

Everyone’s a creator and everyone’s invested in the game

An obvious synergy is at Amazon which is exploring links between live sports matches streamed on Amazon Prime and its video game streaming platform Twitch.

“EA’s Madden is streamed on its own Twitch channel via our cloud platforms,” said Steve Boom, VP, Audio, Twitch & Games, Amazon. “We’re exploring how we bring that together with the NFL matches we broadcast.

“One of the benefits of this form of transmedia across broadcast and video games is we have the opportunity to expose it to a huge audience. Whatever the audience is for a typical NFL game [17 million in the US], we can expose it to a gamer audience. Maybe we can instantly replay [rendered as photoreal graphics] some of the plays that they just witnessed live in the actual game. It's up to us to make it as simple as we can. If viewers don't have a game controller, could they use a TV remote or their smartphone as a controller and start playing? Stay tuned! There is some really cool stuff coming!”

McCabe declared: “The next paradigm shift is the merger of games with mainstream entertainment. We've seen it with IPs that have crossed over from games to TV or games to movies – some amazing, some not so much. Those lines are blurring from a combination of the technology that we have in gaming and the technologies that are being created for TV. We build worlds for games with Unreal Engine and film and TV creators use it to capture and create on volumetric stages and virtual studios.”

Participatory IP is also on the rise. The gamer community is encouraged to contribute and share their own content using game assets. Boom added: “Everyone is a creator, everybody wants to be involved and it actually builds affinity and retention in the game if they have a vested interest in what's going on. We are building those kind of tools.”

‘King of Meat’, a new Amazon-published dungeon-creator action game released in October, includes functionality allowing players to create their own dungeons and upload them for the community to play.

“This is an area where we really see AI playing a big role because AI is going to make it a lot easier for the average person to create,” said Boom. “I would expect to see an explosion of content from that coming soon.”

AI to supercharge player participation

The industry has a long history of players making modifications (mods) to games to make new games out of them. Just as in film and TV production, Generative AI could democratise a domain that used to be the preserve of financial gatekeepers and specialists.

“It shouldn't be foreign to games publishers either,” Boom said. “Maybe you lose a little bit of control over your IP but at the same time your IP gets bigger and people have a stronger affinity to it. The community itself will eventually start enforcing its own standards and rules. If someone tries to take the IP in a direction it shouldn't go, a vibrant community will rein that in. In an age of infinite content, we believe the way to cut through the noise is to build trust in a community. If you're an IP creator, you need to get comfortable with the fact that this is where the world's going.”

Sony’s Chief Strategy Officer, Toshimoto Mitomo seemed relaxed about the idea. “You're going to see interesting advances from the big game development companies but like every new technology the really innovative stuff is going to come from people we've never heard of.

“A lot of innovation will come from the AI-native generation – kids who are graduating college now. For them, AI is just what you do. They don't have to relearn the way they did things. That's why I think in the next 3-4 years we’re going to be blown away by what we see.”

Ball suggested that AI will accelerate the emergence of new game experiences and original ways to play. “We will soon see new genres being born as AI is adopted. I cannot overstate how important that is to the growth of the games industry, which hasn’t seen a new genre or groundbreaking IP in a decade.”

Experiential and global

On the ground at EWC25 the organisers had extended the video game world into a festival of gaming, music and entertainment throughout the venue area of Boulevard City. This block of Riyadh is designed as an entertainment and sports hub all year round. Ronnie O’Sullivan has a branded snooker excellence club there. At EWC25, space was devoted to activations of games open to the public, drawing more than three million visitors in the heat of the summer.

McCabe explained: “Post COVID people have pivoted to experiential moments so running the festival in parallel with the Club Championship has been an opportunity for people to celebrate everything that's gaming.”

Stars from the traditional sports world including Cristiano Ronaldo, Tony Hawk and Alisha Lehmann made appearances to generate headlines. YouTube creators The Sidemen were in town. Superstar game creator Hideo Kojima (Metal Gear) talked about the blurring boundaries of cinema and games. Post Malone and K-pop boy group SEVENTEEN performed at the EWC opening ceremony, which took its cue from Super Bowl halftime shows. Norwegian Grandmaster Magnus Carlsen brought the intensity of world championship chess to the EWC stage, winning the inaugural event and $250,000.

Soccer star Alex Morgan, two-time FIFA Women’s World Cup Champion for the USA, drew parallels between the rise of esports and women’s football.

“Seeing these new esports formats being created in a lot of ways relates to women’s sports because of the progress made. As an advocate for women’s sports I see great change and it’s impressive to witness the change that can be made even in a few months. The progress here in Saudi Arabia with girls and women and the inclusion of them in sport is encouraging.”

By its own metrics EWC25 was a roaring success. It set new records with 750 million viewers and a peak of 7.98 million viewers during the second week’s League of Legends tournament. Some 340 million hours of content were watched, outperforming 2024’s inaugural event across the board (the comparative figures for 2024 saw 250 million hours of content streamed to 500 million viewers and a peak viewership of 3.5 million).

The effort to attract a wider audience outside the esports community was centred on EWC Spotlight, a new global broadcast production managed by IMG. In total, 7,000 hours of live content were produced (proudly proclaimed to be second only to the 2024 Paris Olympics) across more than 800 channels and 97 broadcast partners including DAZN and ITV, in 35 languages.

Casey Wasserman, who leads the Los Angeles 2028 Olympic organising committee, was given a front row seat. He said: “There’s no question esports will be a permanent part of the ecosystem 10 years from now. As the technology evolves to where you’ve got billions of connected devices, zero latency, 4K video and hundreds of people playing together in a peer-to-peer environment, that then becomes a different version of esports, a different sort of competition.”

As if the EWC were not enough, a new multi-title tournament, the Esport Nations Cup, was announced for November 2026 (also in Riyadh), which pits national teams against each other. Nothing on this scale has been attempted in esports before – and with good reason, since pride in playing for one’s country has no legacy in the sport’s development. The Foundation is confident it can change that.

“There are 600 million esport fans around the world and that's already very significant. But there are more than three billion gamers globally. That’s a gap we need to bridge,” said Mohammed Al Nimer, CCO, Esports World Cup Foundation. “By leveraging national fandom and national pride we can achieve an additional step to making it more mainstream.”

Focus on emerging markets

As with film and TV, so it is with the business of gaming. Mature markets like the US, Germany, the UK and even South Korea have stagnated, and the future hubs of global creativity belong to India, China and the Middle East.

“In the US there are actually fewer active players on a weekly basis than prior to the pandemic,” according to Ball. “South Korea has also seen a 5-7% reduction in active players. Globally there has been stagnation or marginal decline since the peak in 2021.”

Yet this is largely a western mature market issue. Emerging markets are exhibiting growth as high as 6% a year. “Look no farther than the Middle East and Africa where over the last five years an average of 45 million new players have onboarded,” added Ball. “That’s more players in five years than the US has in total. In parallel, close to 30% of total global player growth in each of the past five years has come from the region.”

Ball predicted significant returns to growth globally over the next few years as many billions of new players will be found “with billions of new experiences to support them, but we have to look in slightly new areas.”

Key to this growth for Ball was not population demographics or smartphone adoption, but regional specificity. “When we look outside of North America, Australia, and Japan, we see very different leaders, much larger players, and clear differences in culture, religion, in art.

“It’s not Ariana Grande, but Blackpink; it’s not Christmas, but Ramadan. It is Bollywood or Chinese-language movies dominating those markets, not Hollywood. It is essential that games developers and international content creators reflect this cultural understanding if they want to grow in these markets.”

 


Monday, 1 September 2025

Epic and intimate: Shawn Mendes Red Rocks Live in VR

interview and words written for RED

Nominated for a 2025 Primetime Creative Emmy in the Emerging Media category, Shawn Mendes: Red Rocks Live in VR is a groundbreaking venture in the world of immersive music concerts.

article here

Created by Meta, Light Sail VR, Dorsey Pictures and 7 Cinematics, this project delivers the ultimate front-row seat to the concert recorded last October at which Mendes performed his latest album, Shawn, in its entirety.

It is also the latest in a series of high-quality VR music experiences designed to be experienced in a Meta Quest device with performances by artists Louis The Child, Tyler Childers, Santa Fe Klan and DJ Alison Wonderland featuring in the Emmy-nominated first season of ‘Red Rocks Live in VR.’

“We built our workflow on RED right from the very beginning,” says Vincent Adam Paul, CEO, 7 Cinematics. “Our original RED was an Epic Mysterium-X, serial number 000302, and we’ve continued to build our ecosystem around RED through all iterations of the cameras first in the 2D world and now into immersive 3D.”

Red Rocks is a stunning outdoor amphitheatre carved out of red sandstone in Colorado, with a seating capacity of 9,500. Nighttime shows are spectacular and demand a camera that can capture its beauty as well as all the lighting and pyrotechnics of a live stage event.

“Two of the biggest issues when filming any concert performance are confetti and laser lights but with RED the dynamic range (rated 17 stops with up to 20+ in Extended Highlight mode) is incredible,” says Robert Watts, managing partner and executive producer at the creative studio Light Sail VR, a specialist in immersive storytelling. “The dynamic range of a nighttime shoot at Red Rocks really comes through when you're shooting RED. It always looks like you're actually there.”

For the 83-minute Shawn Mendes concert the team arrayed a variety of camera systems at Red Rocks including the RED V-RAPTOR, with a Canon RF 5.2mm F/2.8L Dual Fisheye lens in key positions front of house, on a drone and on a jib.

“A touchstone for us is intimacy,” Watts explains. “For me, VR is about presence – the idea of being in a particular place. We’re trying to replicate the feeling of being there. We see everything from our eyes and from our POV and we want to make it feel very authentic and natural.”

Conveying this sense of presence requires an understanding of how the inter-pupillary distance (IPD) - the gap between the centers of the dual lenses - translates into the optimum distance from camera to subject.

Since the Canon Dual Fisheye has an IPD of 60mm, which is close to most people’s own IPD, Light Sail VR operates in a sweet spot of 5 ft to 15 ft from the subject. One key difference in the storytelling for VR is that camera movement is slower and more considered.

Storytelling cadence in VR180

“Whereas a 2D multi-camera plan has grown into a big symphony of shots to include all manner of camera moves on jibs, cable-cams, drones and Steadicams, shooting VR is more about camera placement because the experience is so personal,” advises Paul. “The lenses are fixed focal length, and the cameras are all locked off to avoid lateral motion which can make a viewer feel uncomfortable when they're not expecting it in the headset.

“We still do slow pushes in and pushes out. We can crane up and crane down because people are starting to get their ‘VR sea legs’ if you like, and getting used to the motion and appreciating the dynamism of the motion.

“Eventually, I think the 2D and the 3D experience will merge but right now we're trying to ride the razor edge of technology to get to that place.”

The Shawn Mendes VR experience was produced, directed and cut in a similar way to conventional concert films destined for theatres. “Every shot has tempo and flow,” says Paul. “It's cut like a movie but optimized for viewing in a headset.”

Watts confirms, “There is a cadence to VR storytelling that is a little slower, but you can actually do frequent cuts. We're cutting every seven or eight seconds. It's not like we’re using long establishing shots. We have enough coverage between all the camera systems to cut between them and create a seamless experience. You can do a lot of really interesting things once you're working with a post team that understands the geometry of how best to capture and edit for a headset.”

Light Sail VR built a preview system which can output a live feed to multiple headsets for select crew and representatives at each show. Watts says, “We can basically live switch between each of the camera positions. We'll see the flat Fisheye feed from every single camera position on a monitor and then we'll have the wrapped VR180 viewable in the headset so the band’s management or Meta executives or the artist themselves can come up and check it out.”

The VR180 format, rather than full VR360, is considered preferable by immersive content producers and hardware developers including Meta, Google and Apple for subject-based content, while VR360 is more suitable for location-based content.

“We have a phrase we use here called ‘pixels per degree’,” Watts explains. “By producing in VR180 versus VR360 you can push all the pixels that would be basically wasted behind you into the front screen and make the resolution much higher and dynamic.”

Adds Paul, “If you're going to the Pyramids and you want to look all around, I'd shoot that in 360 but if you're shooting U2 at the Pyramids we're going to do it in 180 because you're going to be looking at U2.”

In turn, that entails the team working closely with the band and their management to get cameras in the right place to produce a premium VR experience without blocking the sight lines of the audience.

“You can’t even buy a ticket to some of these locations because you are on stage from a reverse angle at the audience, or on a jib, or a drone. VR180 delivers a really rich visceral experience.”

Optimized V-RAPTOR

Light Sail VR used V-RAPTOR cameras owned by Meta, and custom modified by RED to remove the Optical Low-Pass Filters (OLPFs). The team then equips each body with the Fisheye lens to turn it into a 180-degree immersive imager.

“We remove the OLPF to increase the sharpness of the image when paired with the Canon Dual Fisheye lens,” explains Matt Celia, creative director, Light Sail VR. “We did tests and found that with the OLPF in place, the image was less sharp than Canon's R5C. Removing it dramatically increases the sharpness as well as giving us all the benefits of V-RAPTOR with the huge dynamic range, professional connections, and robust construction.”

Capture is at 8K 59.94fps with the final stream delivered as an 8192x4096 file to Meta, but the resolution audiences see is determined by their internet speeds. For best results, Light Sail advises users to ‘cache’ the high-quality playback in Meta Quest TV which renders the full resolution video.

“Recording at high resolution is critical with Fisheye lenses because the number of pixels per degree of the lens is vital to the perceived sharpness,” Celia says. “On RED V-RAPTOR we're able to get around 22 pixels per degree.”
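That figure can be sanity-checked with simple arithmetic. Assuming, for illustration, that the 8192x4096 side-by-side stereo frame gives each eye roughly 4096 pixels spanning 180 degrees:

```python
# Back-of-envelope check of the 'pixels per degree' figures quoted above.
# The per-eye geometry is an assumption for illustration; real fisheye
# projections are not perfectly linear across the field of view.
per_eye_width_px = 8192 / 2          # side-by-side stereo halves of the 8K frame
vr180_ppd = per_eye_width_px / 180   # ~22.8 px/deg, close to the ~22 quoted
vr360_ppd = per_eye_width_px / 360   # same pixels spread over a full sphere
print(round(vr180_ppd, 1), round(vr360_ppd, 1))
```

The same arithmetic is the quantitative case for VR180 over VR360: halving the angular coverage roughly doubles the pixels per degree in front of the viewer, which is where a concert audience is actually looking.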

Post, audio and data management

Recording is made directly on to the cameras. In post at Light Sail VR, each of the Fisheye feeds is brought into Resolve and flattened into a single equirectangular video for editing, before finishing by adding VFX and noise reduction. An editor will cut as normal on a conventional monitor and review cuts in a headset. The final cut will be re-wrapped into a sphere for streaming to a Meta headset.

As you can imagine, the data throughput from camera to post is extraordinary, with each VR concert project running anywhere from 50 to 100 terabytes.
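A back-of-envelope estimate shows how the terabytes accumulate. The bitrate, camera count and number of recording passes below are assumptions for illustration, not figures from this production:

```python
# Rough sanity check on the 50-100 TB per-project figure quoted above.
# Real R3D bitrates vary with resolution, frame rate and compression setting.
GBIT = 1e9
show_seconds = 83 * 60                 # the 83-minute Mendes set
assumed_bitrate_bps = 2 * GBIT         # assumed ~2 Gbit/s per 8K 60p camera
cameras = 8                            # assumed number of camera positions
passes = 3                             # e.g. soundcheck, show, extra material

bytes_total = assumed_bitrate_bps / 8 * show_seconds * cameras * passes
print(round(bytes_total / 1e12, 1), "TB")  # ~29.9 TB under these assumptions
```

Even this conservative sketch lands in the tens of terabytes, so with higher bitrates or more coverage the quoted 50-100 TB range is entirely plausible.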

Ambisonic mics are placed at every camera position to capture spatial audio. The artist provides their final mix as well as the mixed stems with effects. This is handed over to sound design team Q Department, based in New York, to spatialize the mix for Meta headsets.

“They blend in the audience reaction from each camera position so every time you switch camera angle it doesn’t feel like you're in a different position,” adds Paul. “It's a nice balance of being at the concert and feeling like there's people around you. So, when you hear a guitar solo you want to move your head to watch the guitarist. We lead you through that by cutting to the guitarist in VR180 so now you’re immersed with the guitarist.

“As a producer going through thousands and thousands of hours of footage I rarely look [behind me] because I want to focus on the band on stage in front of me. When you watch VR180 you shouldn’t really be aware that there's empty space back there.”

Broadcasting live VR is possible using RED Connect to stream RAW 8K files direct from the V-RAPTOR over IP to a CCU in real-time, but the market for live VR needs to mature.

“In an ideal world, using RED Connect would absolutely be a very advantageous workflow because we could monitor each camera from a video village where we pipe in all the live preview tech,” says Celia. “We could even press a button and go live with an 8K stream which would be very cool! Maybe for next season!”

“There can be no mistakes”

As it stands, the production of the concert shows for immersive 3D actually feels like a live shoot every time. “There can be no mistakes,” Paul stresses. “When you're filming a sold-out show live for one night only, we have very little time to prep. One of our biggest jobs at 7 Cinematics is acting as the liaison between the artist and our team so we can put cameras in place during the sound check. That's about the only opportunity we have.

“There is no rehearsal. We come in, we place our cameras, and we walk through the stage management with the production manager, tour manager, and overall management. They'll tell us, ‘Yes’ or ‘no’ or ‘maybe.’ We push everything a little further by showing them what’s possible with VR in the headset. Then we’ve got to do it live.

“It's literally like a train passing the station. If you're not onboard, it's leaving without you.”

Two more Red Rocks Live in VR shows, produced by Light Sail VR, Dorsey Pictures and 7 Cinematics, have since landed on Meta, featuring performances from Grammy-nominated Omar Apollo and Norwegian singer-songwriter Girl in Red.

 


Riding the Peaks of Success: 100 Foot Wave

 interview and words for RED

The HBO Max multi-Emmy-winning docuseries 100 Foot Wave combines the jaw-dropping skill of elite surfers riding the world’s most awesome liquid mountains with an insider’s view of the camaraderie and lifestyle of this sporting community.

article here

Directed by Chris Smith (Tiger King), the latest instalment focuses as before on legendary big wave surfer Garrett McNamara and features the series’ signature intimate interviews, vérité photography, and dramatic visuals. Stunning water footage showcases the death-defying beauty of big wave surfing as the athletes risk it all in a quest for the ultimate high.

Every single camera is RED, either V-RAPTOR or KOMODO, wielded by expert cinematographers including Laurent Pujol, a two-time Emmy winner for outstanding cinematography for a nonfiction program.

“It’s far from being a studio or a set out there,” says Pujol with some understatement. “You can only control so much. Even when changing your battery, there’s a risk you will miss a moment. You have to be ready every second.”

Season one, released in 2021, was shot with multiple camera types (consumer, professional cine and high frame rate), offering varying capabilities and output. Pujol chose to shoot his work in the water with RED RAVEN, stepping up to RED DRAGON 6K for the second season, and he believes this influenced the production’s decision to standardize on DSMC2 for season three.

“Everyone loved the look of the RED and they thought it would be a great idea if we shot RED across the board. The image quality is exceptional and consistent which makes it a lot easier for post-production to grade.”

By the end of filming on season two, he’d purchased a RED V-RAPTOR which is the camera he used to film all of this year’s episodes. “V-RAPTOR opened up a lot more horizons not to mention the sheer image quality. It’s a full sensor, you can crop into it, stabilize it. You’ve got 200 frames a second in 5K.”

Pujol is nominated for an Emmy again this year, along with fellow cinematographers Michael Darrigade, Vincent Kardasik, Alexandre Lesbats, Karl Sandrock and Chris Smith.

“The series is about 20 percent surfing, divided between the water shots, drone shots and panoramas from shore and the rest is vérité footage,” he explains. “We're trying to make that 15-20 percent of action all be incredible shots.”

While the first two seasons largely documented McNamara’s quest to conquer big waves in Nazaré, Portugal, the latest five-part series explores his professional setbacks and mental challenges as well as filming surfing contests in locales such as Cortes Bank in the Pacific Ocean; Safi, Morocco; Montaldo, Italy; and on O’ahu, Hawaii.

While vérité shooters focus on interviews and everything happening on shore, Pujol leads the water unit, filming from the back of a jet ski. His team includes a driver, plus a spotter on the cliff with binoculars and a walkie-talkie who gives the pair on the ocean information about when the big waves are coming.

“The most important thing to know is whether the next wave is bigger than the one we're filming, because when we’re down there we don't know how big the wave is until one goes by or we're right above it. I'd rather shoot the five biggest waves of the day than fifty good ones.”

The camera has to be easy to pick up, quick to power up, and shoot with at a moment's notice. “I'm the only shooter out there for 100 ft waves, so I need to be mobile. I try to be everywhere although I'm only going to get 75 per cent of what's going on.”

“My V-RAPTOR is basically a 10kg package in a waterproof housing which I tend to hold from underneath and lock my arms to my side to get more stability. I like to put one hand on the handle and hold the housing by the bottom. That's how I feel I'm getting my more stable shots, but even as we’re pulling away to get out of the way of the waves so we don't get hit by it, I’ve still got to shoot because there's a lot going on and I don’t want to miss anything.”

Preparation before heading out onto the ocean is key. He carries several one- and two-terabyte cards and three to four 150Wh batteries. Returning to shore is not an option.

“That's going to take me 30 minutes and I could easily miss the biggest, best wave of the session or of the year. So, I'm changing cards and batteries on the ski. Luckily, the housing I now have has just four clips, so I can make a swap as quickly as possible. Obviously, with water flying everywhere, some water does get in there every time, which is not the best thing for the camera for sure.”

A second RED body (DRAGON 6K) is ready to go just in case. “I love that camera. The quality on it is insane. It’s a perfect backup.”

He usually records RAW at 6K 160fps in 17:9 for 2.35:1 extraction, “to generate as much quality as possible for the editing team to work with.”
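Capturing in a taller 17:9 frame than the 2.35:1 delivery leaves vertical headroom for reframing in the edit. A small sketch of the crop arithmetic, assuming RED's 6K 17:9 frame of 6144x3240 pixels (a standard V-RAPTOR recording format, though the article doesn't state the exact dimensions):

```python
def scope_crop(width: int, height: int, target_ratio: float = 2.35) -> tuple:
    """Dimensions of a widescreen extraction from a taller capture frame,
    keeping the full width and cropping top/bottom."""
    crop_h = round(width / target_ratio)
    assert crop_h <= height, "capture frame is already wider than target"
    return width, crop_h

# RED 6K 17:9 (6144x3240) to a 2.35:1 extraction:
print(scope_crop(6144, 3240))  # (6144, 2614)
```

That leaves roughly 600 pixels of slack to slide the frame up or down, useful when the horizon tilts on a moving jet ski.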

“My sweet spot is 6K 160 frames a second,” he explains. “Sometimes I'll go down to 5K 200 fps. I’ll rarely do 8K 120 because it gets a little shaky out there. The higher the frame rate the more the image stabilizes.”

On seasons two and three, he mostly shot with a Canon 50mm lens at f/1.2. “One reason I use the same lens is to have less confusion as far as my distance is concerned. If my driver and I know the distance between my lens and the subject, it is such an advantage. Experience tells me not to get too close because you can miss the top or the bottom of the wave. You really need the whole wave in the frame to do it justice, so it's almost better to be a little bit away to capture that.”

With production running multiple cameras virtually non-stop every day, the amount of media runs into hundreds of terabytes.

“The more they have, the happier they are! The first season, I would just press record as soon as the guy took off and as soon as he finished the wave I stopped. The edit team came back to me saying ‘Man, you gotta keep the camera rolling. We need to get more of this stuff. We want the pickups. We want the guys stressing out. We want the guy stoked.’

“At first, I didn’t quite understand because I was getting all this killer action, but once I started to see the series in full, I understood how important that material is for the storytelling. It’s more important than the action. In the end, if you've captured somebody who caught the wave of his life and he's out there with tears in his eyes - that's the shot. If the guy gets a big wipeout with blood pouring down his face, as happened to Garrett in season three, I need to be there getting that.

“Sometimes it feels a little intrusive, sticking my camera in their face when he's crying with emotion or a fight breaks out, but it’s part of the vérité of it all.”

Indeed, Pujol’s favorite shot of this season is not awe-inspiring action or a monumental wave (although there are plenty of those). “It’s just a guy sitting on his board in the evening pretty much all by himself with these red sunset colors. I’m getting my driver to go back and forth so there’s some movement. It just felt right for a moment of reflection about Márcio.”

Tragically, Brazilian big wave surfer Márcio Freire lost his life at Nazaré in 2023. It was the first death linked to surfing there, and the event is recounted in ‘Chapter II: Undertow.’

“There was a lot of discussion about what we should include but I think the episode was a great homage to him and to everything he's done for big wave surfing.”

That Chapter also includes the aftermath of Garrett’s own wipeout which left him seriously injured, while brother-in-law, CJ Macias, gets back on a board after recovering from his own significant accident. The dangers are real, something that Pujol knows all about.

“For all the spotters you've got, and knowledge of wave period [the time it takes for two successive crests in a swell to pass a specified point] you never really know whether your subject is going to catch the wave or what's going to happen,” he says.

Pujol’s drivers are critical partners in the endeavor. He has been working with Joao Guedes and Antonio Cardoso for ten years.

“They know exactly what I want. I can just focus on my composition and keeping my camera straight, and I don't have to give any instructions anymore. They just allow me to do my thing.”

With so much unpredictability, Pujol relies on instinct, skill and teamwork to capture the action in a split second – all while making sure he and his driver are safe. He calls it “provoking luck.”

“Sometimes, I'll be in the perfect spot in a position where I can lock my elbows and because of the way the skis are moving I become a human gimbal. When the ski goes right or left, so do I, while trying to keep the camera as straight as possible. Most of the time I'm not even looking at my monitor. I’m holding the camera above my head as we're driving away from a wave trying to get that last second shot.

“It’s a case of constantly adapting to the situation, and it's never the same situation. That’s what makes it exciting. Sometimes what you didn’t think was a good wave ends up being an incredible shot when you check the rushes later. Other times the wave you thought would look amazing doesn't look that big when you play it back. Sometimes it's just luck as to where we are on the wave that gives you that optical illusion of power and performance with perfect framing. It’s luck, but you earn it through experience and perseverance.”

He says he is proud of his work on the show over the last six years and in particular that it has helped audiences to truly understand the motivation of elite surfers willing to test themselves against the power of nature.

“The movie Point Break was a crazy kind of made-up thing which for me never hit home. On 100 Foot Wave we are telling real stories of real surfers and it’s a big deal for us that the public appreciate it.

“Just being out there in the mix, travelling and hanging out with the guys, is something I love. I don't surf the big waves as I used to so I'm kind of living through them now. I’d miss it so much otherwise. It's a part of me.”

Special thanks to DP Laurent Pujol for giving the RED community a closer look at his work on the Emmy-nominated 100 Foot Wave.

Shonda Rhimes: “I'm never going to write a show that doesn't include me”

IBC

The Grey’s Anatomy creator recounts her groundbreaking career and calls Bridgerton “a workplace drama” while receiving the Edinburgh Fellowship Award.

 article here

If anyone can be said to have changed the face of TV drama it is Shonda Rhimes. The creative force behind Bridgerton and CEO of the global media company Shondaland, Rhimes is the first black woman creator and executive producer of a top 10 network television series, Grey's Anatomy, and the first woman to create three television dramas (Grey's Anatomy, Scandal and Private Practice) that have all achieved the 100-episode milestone. She executive produced How to Get Away with Murder, for which Viola Davis became the first black woman to win an Emmy for outstanding lead actress, among other accolades.

“I'm never going to write a show that doesn't include me,” Rhimes told the Edinburgh Television Festival where she was honoured with the inaugural Fellowship award.

“Creative powerhouse doesn't even come close,” said Bridgerton actor Adjoa Andoh, presenting Rhimes the award. “She is a global icon, a woman of colour who redefined the television industry, but did so entirely on her own terms. For so many of us in the UK and beyond, she is the blueprint for those who have had to fight to be heard or seen. Shonda represents what's possible when a black woman dares, not just to write the story but to own the pen, the paper, and the whole damn publisher.”

In an interview following the award, Rhimes reflected on her career.

“If I spend much time thinking how great I am or whatever, then I’m not thinking about telling stories or telling anything authentic for sure,” she said. “I understand that the audiences are what helped me make it, and if you don't stay in tune with those audiences they can go away at any time.”

Roots

Rhimes grew up the youngest of six children and recalls watching Roots, The Cosby Show, and Friends, but also that she didn't watch much television growing up.

Her parents both worked in education and encouraged Shonda and her siblings to go to college where she discovered an interest in writing.

“At first, I wanted to be a novelist but I knew that there were expectations. My parents were very hard workers. They had raised five other children, and then they paid for an Ivy League education for their sixth, which was very expensive in America. It felt important to me to do something real.”

She chose USC film school after reading that it was harder to get into than Harvard Law School.

“I remember sending my parents the New York Times article that said so and I told them that I could be a professor, just like them. I wasn't even interested in television at that time.”

Leaving film school, she worked as a secretary for a social services organisation while sustaining her writer’s dream in the evenings.

“I was nobody's nepo baby. I didn't have any connections. It was about figuring out how I was going to jump in.”

Her first spec script, for a rom-com, sold not once but twice. It was about an older white woman (“in my head, Susan Sarandon”) who falls in love with a younger black man (“in my head, Will Smith”) when she answers the wrong personals ad.

It never got made, but the money from being in development was enough to fuel her ambition to write another. She wrote the screenplays for Crossroads, starring Britney Spears, and The Princess Diaries 2: Royal Engagement, but it was only when she was home alone with her firstborn that the penny dropped.

Catching the TV writing bug

“I started watching television, shows like ‘24’ and I'd binge watch Buffy the Vampire Slayer. It dawned on me that this is where character development's happening. In two hours of a movie you can grow from point A to point B, but on a television show you get tons of character development opportunities.”

Rhimes was 33 when she wrote the pilot for Grey’s Anatomy after learning that the head of Disney [Bob Iger] was on the hunt for a medical drama.

“I was pretty strategic. I’d written the pilot for a show about war correspondents which wasn’t picked up so I wrote a medical show. I was obsessed with surgeries. I loved the idea of the hospital and just wrote a show that I really wanted to watch. Luckily ABC wanted to see it too.”
 
Over 450 episodes later the series is now in production on season 22. “Medical television until then was all about the patients who you care for,” she says. “My show wasn't about the patients. It's about how the doctors feel about the patients. I wanted to show what happens when doctors are careless.”

One early episode's storyline pushed this too far for the network executives' liking. They forced her to reshoot it: her first creative compromise.

“The network felt like it was in the poorest taste possible, and they were actually angry about it. It definitely made me more determined to figure out how to tell the stories I wanted to tell within whatever parameters we were given. It made me more creative because I had to work around these constraints.”

Rhimes was thrust in at the deep end as showrunner on the first season of Grey’s, a role that she relished.

“There are 300 people looking at you asking for a decision. You’re talking to costume designers, you're walking the sets, you're talking to the actors, you're working in a writers' room, you're doing a million things that I had never even thought about before. It was daunting, but also fantastic to be able to see the thing that was in my head actually on the screen, so I knew the answers. Somebody would come up to me and say, like, ‘What colour shirt should Meredith Grey be wearing?’ and I would instinctively know the answer.”

Netflix come calling

By 2007 she had the three leading shows on ABC’s primetime Thursday night, the most commercially valuable time of the week on US networks.

Aside from Grey’s, she created and ran the White House political thriller Scandal, starring Kerry Washington (debuted 2012), and the Grey’s spin-off Private Practice.

“So now I'm responsible for the entire Thursday night of how ABC makes their money. It was a lot of pressure. There’s not a lot of complaining I can do about any of this because I absolutely love my job. It was exhausting, but I had so much fun doing it. It never felt like work itself was a problem. I felt like exhaustion was the problem. How do you fight the exhaustion to get to do what you want?”

While still overseeing her ABC shows, Rhimes struck a multi-year deal to produce content for Netflix in 2017 for a reported $100 million.

“Netflix worked like a startup. Disney worked like a solid, old-school corporation, and I loved the idea of going someplace new and having all the problems be different.”

Bridgerton global franchise

In the three years before Bridgerton’s release in 2020, she admitted to anxiety over the expectations she believed Netflix had of her. She refused invitations to Ted Sarandos’ parties because she felt she hadn’t written anything to justify celebrating, and she might never have found Bridgerton had she not chanced on one of Julia Quinn’s novels while holed up ill in a hotel room.

‘The Duke and I’, about the children of a family in Regency England, was perhaps not the obvious choice for a woman of colour, but Rhimes saw it differently.

“To me, Bridgerton is a workplace drama,” she explained. “The women have no power in any other areas of their lives. The only place they have power is who they marry and how they marry, so that becomes a workplace with colleagues coming together to make sure that their fates and futures are sealed in a way that is positive for them.

“If you didn't marry, you were useless by society's standards. You could become destitute, so literally who you marry is all you've been raised for, and all you've been raised to do after your marriage is to be a wife and a social person. I felt like we could hold up a little light to see what that world is when that's your whole value.”

More importantly, she could see herself in them: “If a black woman in the 21st century can see herself in Eloise in Regency England, then there's a story to be told that can really connect with audiences.”

The show has since become a “lifestyle brand” marketed as such by Netflix. “I can't tell you how many tea sets have been sold or how many people in the US have Bridgerton proms and Bridgerton themed weddings,” she said.

Netflix rewarded her with an increased deal, reckoned to be worth $300-$400m for new global hits.

“It's really rare that a show spawns a franchise. What worries me is that people are producing shows based on algorithms and numbers versus making shows based on creative quality and telling a good story.”

What comes next

She confirmed Bridgerton will run for eight series, each corresponding to the marriage fate of a different Bridgerton sibling, just like the books.

“There's a possibility for prequels and Julia Quinn has written other books that are sort of offshoots of Bridgerton.”

As to the Bridgerton casting she admits to trying to shake things up. “There's nothing wrong with other shows being the way they are. I just find it less interesting when I don't see my own face.”

She won’t respond to criticism of ‘woke’ casting, and wouldn't be drawn, in this session, on her left-leaning politics, which have seen her remove her account from Twitter (“because of Elon Musk”) and resign from the board of the performing arts institution the John F Kennedy Center.

“I learned a long time ago not to read things that are written because if you decide to believe the good things that are written about you, you are obligated to believe the bad things too. So I've decided that none of it matters.”

Shondaland HQ is based in LA, with further offices in New York and London. Rhimes lives in Connecticut, “right in the middle.”

She ceded showrunning control of her series to producers she trusts a long while ago, partly because there was no way she could micromanage every decision as she once did, but also to carve out space for her writing.

“In the midst of all the business parts of my job I always aim to find real, quiet, creative time to sit down and tell stories. That’s what they're paying for!”

She says she takes meetings on Mondays and Fridays leaving the mid-week for writing and thinking during which time “it needs to be an emergency for me to take a meeting.”

The narrative of every season and all major character arcs are still pitched to her, and she’s involved in the casting of major characters, but she no longer oversees the writers' room.

“It has to be somebody else's show, because if my creative brain is there inside the process, then everyone's going to bend towards me, and I don't want that. It's why I don't watch episodes anymore before they air. If I did, I would have opinions and [the showrunner] would have to take those opinions as notes, and then it's no longer their show.”

Right now, she’s in the thinking phase of her next original project, she teased.

“My process is, I think for nine months and write the script in a day.”


ends