IBC
Automated contextual advertising uses AI to help brands fine-tune the contextual relevance of their addressable campaigns.
ITV and Sky Media have both introduced contextual ads which allow advertisers to buy spots based on the specific content within a programme, rather than more broadly against the type of show or film genre. New contextual targeting capabilities enable advertisers to align ad placement using written, audio and visual metadata created with AI.
In a world where consumers are increasingly ad-savvy, context is king, with 72% of consumers saying their perception of an ad is influenced by the surrounding content (Integral Ad Science, The Context Effect, September 2021).
It’s worth reminding ourselves that contextual targeting is nothing new in the world of television. In fact, it was a key component of how many advertisers and buyers targeted television in the past.
“Establishing the appropriate editorial moments for advertising adjacency was pretty much beaten into a television buyer when they joined an agency,” says Rhys McLachlan, Director of Advanced Advertising at UK commercial broadcaster ITV.
“So it’s not new in that respect. What we’re finding in the digital space is that contextual can be done in a much more sophisticated way. And as ITV as a business migrates at a rate of knots from a linear push broadcast ecosystem to an over-the-top-distributed and consumed IP system, the enablement of AI or machine learning-powered contextual targeting makes the canvas just so much greater.”
UK pay TV operator Sky's ad sales arm Sky Media says more than 5,000 pieces of VOD content have been scanned and tagged against ad-tech standards organisation the Interactive Advertising Bureau (IAB)'s content taxonomy (which runs to more than 700 categories). Each tag includes a sentiment score, identifying positive and negative moments with which advertisers may choose to align messages. While Sky's service is in test, ITV Adlabs launched Automated Contextual Targeting (ACT) two years ago. ACT uses AI technology to scan every show across ITV Hub, categorising every scene by mood, object or moment.
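The tagging approach described above can be pictured as a simple data structure: each piece of content carries taxonomy labels with a sentiment score attached. The sketch below is purely illustrative; the category names, the `ContentTag` class and the `positive_moments` helper are assumptions for this article, not Sky's actual schema.

```python
from dataclasses import dataclass

@dataclass
class ContentTag:
    content_id: str
    category: str        # e.g. an IAB Content Taxonomy label
    sentiment: float     # -1.0 (negative) .. 1.0 (positive)

def positive_moments(tags, threshold=0.5):
    """Return tags an advertiser might align upbeat creative with."""
    return [t for t in tags if t.sentiment >= threshold]

tags = [
    ContentTag("ep-101", "Food & Drink", 0.8),
    ContentTag("ep-101", "Crime", -0.6),
    ContentTag("ep-102", "Sports", 0.3),
]
print([t.content_id for t in positive_moments(tags)])  # ['ep-101']
```

In practice the tags would be produced by ML models at scene level across thousands of hours of VOD, but the buy-side logic reduces to filtering on category and sentiment much like this.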
Key themes
The technology, which has the ability to recognise facial expressions and words spoken within shows, initially allowed the team to draw up three key themes for the pilot: Food, Drink and Mealtimes; Moments of Joy; and Beauty and Cosmetics, with pharmacy chain Boots and supermarket Sainsbury’s as pilot brands.
“The tests helped us understand more about the complexities which we anticipated would occur in a live environment,” says McLachlan.
Algorithms must accurately identify the content’s meaning, nuances, and intent to deliver relevant ads. Inaccurate content and context interpretation risks irrelevant or misplaced ads and a poor user experience.
ITV withdrew ACT from the market and put it back into the labs to work with digital intelligence platform Captify to enhance the proposition.
ACT relaunched a year ago, claiming – in a world first – to identify emotions during ITV programmes for advertiser targeting. Captify’s natural language processing and machine learning models are used to extract more precise show themes which are categorised under an extensive contextual taxonomy. Buyable segments are then created from themes and made available for selection in ITV’s ad platform Planet V.
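The pipeline described here, extracted themes rolled up into buyable segments, can be sketched as a simple inversion of a show-to-themes mapping. The theme names, the example shows and the `build_segments` function below are illustrative assumptions, not ITV's or Captify's actual implementation.

```python
from collections import defaultdict

# Theme labels per show; in practice these would come from NLP/ML
# models run over subtitles and other programme metadata.
show_themes = {
    "drama-ep1": ["moments of joy", "mealtimes"],
    "soap-ep7": ["mealtimes"],
    "reality-ep3": ["moments of joy"],
}

def build_segments(show_themes):
    """Invert the show->themes map into theme->shows 'buyable segments'."""
    segments = defaultdict(set)
    for show, themes in show_themes.items():
        for theme in themes:
            segments[theme].add(show)
    return dict(segments)

segments = build_segments(show_themes)
print(sorted(segments["mealtimes"]))  # ['drama-ep1', 'soap-ep7']
```

Each resulting segment is then something a buyer could select in an ad platform, with inventory attached.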
The launch campaign was the DHSC Every Mind Matters campaign (via OmniGOV and Wavemaker) to coincide with World Mental Health Day. For this campaign ACT was used to identify emotional moments in ITVX shows that were “uplifting or joyful”, to find audiences who are more likely to be feeling anxious or stressed, and to direct viewers to the Every Mind Matters online portal.
OmniGOV were reportedly delighted with the results. “It has enabled us to tap into the core audience behaviours and deliver a timely, supportive message to those on their own mental health journey,” said Louise Turpin, Head of Client & Implementational Planning, OmniGOV.
Pete Markey, CMO at Boots said, “Aligning with contextual moments is hugely important in our marketing activity, but particularly at Christmas, as aligning with those moments of joy is integral to our strategy.”
Nearly 200 campaigns have since been run through the ACT system. "We've got 18 cohorts available for ACT on ITVX via Planet V," adds McLachlan.
“From an ITV perspective, ACT has been instrumental in enabling us to unlock budgets for brands or for targeting cohorts that we wouldn’t otherwise be able to,” he said.
Food advertising
The example that constantly crops up when ITV illustrates ACT is food.
“We’re not big on food programming,” McLachlan explains. “The BBC and Channel 4 typically dominate the food magazine shows. If you were to analyse our output you’d probably find that in any year we would do a couple of hundred hours of food programming. But using our machine learning proposition we can scan the catalogue and categorise all the moments where food is consumed.
“There’s a mealtime in every drama and soap or reality show. When you combine them and do that analysis you establish that we’ve got tens of thousands of hours of moments where food’s being consumed. That’s where we start to unlock those brands that are looking to target food environments.”
He also highlights “real success” over the course of the summer with an Olympics ACT proposition. “Evidently, we’re not the broadcast partner for the Olympics but we do have substantial editorialisation of the Olympics across our channels. We were able to find moments that were ‘Olympic adjacent’ in our channels and attracted Olympic partner brands looking to advertise in and around Olympic-adjacent content, or content with an Olympic sentiment. This is where it’s starting to be really smart for us.”
There might be questions as to how much an AI can decipher what's going on in a video. Is someone on screen really cooking, or are they doing something else? (A lot of murders take place in kitchens…)
McLachlan says ITV’s AI is scanning the imagery and, critically, the subtitles. “It’s not just the visual components presented on screen that are being evaluated. We’re also using the audio description and subtitles function, so we know that food is being prepared or the scene is around the meal table. Imagery, audio descriptors and subtitles are combined to give us trusted results.”
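One way to read the "trusted results" idea above is as an agreement rule: only label a scene when more than one independent signal supports it. The sketch below is a hedged illustration of that principle; the `classify_scene` function, its keyword lists and the two-source threshold are assumptions for this article, not ITV's actual model.

```python
def classify_scene(visual_labels, audio_description, subtitles, label,
                   keywords, min_sources=2):
    """Count how many independent signals support a label; accept the
    label only when at least min_sources agree."""
    votes = 0
    if label in visual_labels:
        votes += 1
    if any(k in audio_description.lower() for k in keywords):
        votes += 1
    if any(k in subtitles.lower() for k in keywords):
        votes += 1
    return votes >= min_sources

# A kitchen scene that is really a tense confrontation: the visual model
# sees "food", but neither the audio description nor the dialogue agrees.
is_food = classify_scene(
    visual_labels={"kitchen", "food"},
    audio_description="She backs away towards the door.",
    subtitles="Stay away from me!",
    label="food",
    keywords=["meal", "cook", "eat", "dinner"],
)
print(is_food)  # False
```

Requiring corroboration across imagery, audio description and subtitles is what keeps the kitchen-murder scene out of the food-advertising segment.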
Desired brand building
Commercial broadcasters claim that adjacency to relevant TV content improves advertising relevance, boosting a campaign’s brand effect. Some advertisers might expect a greater measure of performance, of bang for their buck, with such a targeted solution but McLachlan says they still very much desire brand building.
“The way our addressable advertising works with our advertiser partners is there’ll be several lines on a plan. The first line will be a mass reach proposition with a ‘widely defined addressable audience’ using our data. Then it works down through the funnel through some ACT proposition so the brand is adjacent to the moments that matter and that make editorial sense with the tone or other aspects of the advertising copy. Then you’ll get even more granular addressable audiences about people who have previously bought the products or people who are buying competitive products from all the data sources that we have. It’s very rare that we will have an ACT-only campaign.
“What we’re able to do now is provide a myriad of jigsaw pieces that come together to build a cohesive campaign that works across all the addressable targeting components that we have. ACT is one of them.”
It is also possible to automate the placement of ad breaks within programming, but this is not something ITV deems ready for primetime.
“We are not using AI to brute force breaks in content immediately adjacent to relevancy, and that will be our policy for the foreseeable future,” says McLachlan.
He says he finds it “quite remarkable” how SVODs with ad-tiers “are struggling to establish policy” around where and how breaks should be presented in editorial content, such as feature films, that were produced without advertising in mind.
“We will continue to ensure that from an editorial perspective, we are developing and adhering to best practice and break management rather than using the tail to wag the dog.”