Friday 8 April 2022

Hyper-nuanced content discovery

CSI

Content search/discovery has been talked about for years but is now top of the agenda for SVOD services and content app owners. The reason is competition. 

“The old saying is ‘Content is King.’ But since all premium global SVOD services have exclusive AAA content, they are all kings,” says Niklas Björkén, director of product experience, Accedo. “Content discovery has risen as a key way to compete by retaining users in your service. As long as your users keep finding more things to watch, they will stay.” 

Churn, once seen as a challenge for pay-TV providers, is above 30% for SVODs in mature markets like North America. 

“SVODs and content app owners face the same problem with consumers feeling they have come to the end of their pandemic viewing and looking to cut the number of subscriptions,” says Peter Docherty, CTO and co-founder at ThinkAnalytics. “The increase in super-aggregation has also made search/content discovery even more critical. Content catalogues are bigger than ever and users don’t want to launch different services to find content.” 

It's not just rival SVODs competing for attention either. “Social media, games and more are battling for the audience’s time and attention,” says Naveen Narayanan, head of sports and data products, Firstlight Media. “The most powerful search/discovery capabilities do two things: they give me what I want right now; and they predict what I will enjoy watching next, so that I stay engaged with the platform.” 

From an end user perspective, finding content to watch is still a frustrating process. Practitioners point out that it is not metadata per se that is at issue, but how that metadata is used. 

“The customisation of the viewer experience has become a differentiator for SVODs in such a hyper-competitive world,” says Lucia Johnstone-Cowan, sales manager, Codemill. “Service providers need to be able to lock the customer in with tailored profiles, demonstrating a deep understanding of individual subscribers.” 

“For Netflix to make great recommendations to its consumers, it needs not just great metadata, but also great interpretation of that metadata,” says Jonathan Morgan, CEO, Object Matrix. “Is that down to the metadata, the interfaces or both?” 

Greater understanding of metadata 

In many cases the over-proliferation of metadata collection can cloud the objectives for gathering that metadata in the first place. Now that older content from the archives is being leveraged within VOD and OTT services, there is a huge need to identify and categorise material.  

“While the opportunities to monetise archive content are a driving force, the process of collating ‘good’ metadata starts out with clear objectives,” says Johnstone-Cowan. “It’s crucial to define a simple statement regarding particular data that is important to know - for example, instances where actors appear in scenes within the archive.” 

The way the industry interprets metadata needs to evolve, argues Morgan: “During searches some metadata will be more important than other metadata in returning good results. But that is system and situation dependent rather than tied to the metadata itself.” 

Understanding metadata means being able to suggest other content based on user behaviour. Parham Azimi, CEO at iconik, says: “It is about making that metadata actionable with some degree of intelligence applied to determine what types of content will appeal to someone who has just been watching Star Wars, for example. 

“For better discovery, you need to collect as much metadata as possible. However, if you don’t have API-first solutions in place to pull in metadata and transcript data and carry it through to the entire lifecycle of your media, it will be difficult to realise the full value of it.” 
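
To illustrate the API-first idea, the sketch below shows enrichment results being pulled from one service and written back onto the asset record so they travel with the media through its lifecycle. The endpoints, asset fields and response shapes are hypothetical, not any vendor’s actual API.

```python
# Hypothetical sketch of API-first metadata handling: pull enrichment results
# from an analysis service and attach them to the asset record, so search,
# recommendations and editorial tools can all read them through the same API.
# Endpoints, IDs and field names are illustrative only.
import requests

ASSET_API = "https://mam.example.com/api/v1/assets"       # hypothetical MAM endpoint
ENRICH_API = "https://enrich.example.com/api/v1/analyse"  # hypothetical enrichment service

def enrich_asset(asset_id: str, media_url: str) -> dict:
    # Ask the enrichment service for tags and a transcript for this media file
    analysis = requests.post(ENRICH_API, json={"media_url": media_url}, timeout=60)
    analysis.raise_for_status()
    enrichment = analysis.json()  # e.g. {"tags": [...], "transcript": "..."}

    # Write the results back onto the asset so they stay with the media
    update = requests.patch(
        f"{ASSET_API}/{asset_id}",
        json={"metadata": {
            "ai_tags": enrichment.get("tags", []),
            "transcript": enrichment.get("transcript", ""),
        }},
        timeout=30,
    )
    update.raise_for_status()
    return update.json()
```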

When it is working best, metadata is a two-way street. Most obviously it communicates to the end user critical details about the programmes on the service that will inform whether they want to watch.  

 

However, it also needs to communicate information about the consumer back to the service. Michael Kraskin, director of product management at Xperi, says metadata should provide “emotional and psychographic information” about the viewer to enable streaming services to make better recommendations and guide future content investment.    

AI to the rescue? 

Today, much of the industry still relies on manual processes to input metadata, which is costly, inefficient and difficult to sustain at scale for large content libraries. AI can increase speed, reduce costs and improve accuracy, automating the creation of richer metadata at scale.  

Comcast Technology Solutions, for instance, offers VideoAI with which customers can automatically analyse video assets to identify and tag key onscreen moments (hard cuts, black frames, transitions, etc.) and audio events (silence, specific sounds). 

“Using software detectors that analyse video and pre-trained algorithms, VideoAI generates actionable metadata that can be used to create entirely new capabilities for content owners and advertisers,” says Bart Spriester, VP and GM of the Content and Streaming Providers Suite for CTS.  

He says the underlying technology has been applied across millions of video assets to create features such as metadata segmentation for DAI; segment detection, such as identifying intros, credit rolls and auto-chaptering; and automated on-screen highlight reels during live sporting events.  
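
Detectors of this kind can, in their simplest form, compare per-frame brightness and frame-to-frame differences. The sketch below is a generic illustration only, not a description of VideoAI’s internals, and the thresholds are arbitrary example values.

```python
# Generic illustration of video event detection: flag near-black frames and
# hard cuts by measuring frame brightness and frame-to-frame change.
import cv2

def detect_events(path: str, black_thresh: float = 10.0, cut_thresh: float = 40.0):
    cap = cv2.VideoCapture(path)
    events, prev_gray, frame_idx = [], None, 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        if gray.mean() < black_thresh:                  # near-black frame
            events.append(("black_frame", frame_idx))
        if prev_gray is not None:
            diff = cv2.absdiff(gray, prev_gray).mean()  # abrupt visual change
            if diff > cut_thresh:
                events.append(("hard_cut", frame_idx))
        prev_gray, frame_idx = gray, frame_idx + 1
    cap.release()
    return events  # e.g. [("black_frame", 1200), ("hard_cut", 1534), ...]
```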

Other companies, such as Vionlabs, are even making it possible to tag content based on mood, with in-depth analysis of scenes to determine the colours, stress levels and genre of the content being analysed. 

Others think information that reflects the emotional, thematic and aesthetic qualities of a film or TV show is beyond the capabilities of a robot. “It is hard work and likely the largest untapped goldmine of data for content owners and streamers,” says Kraskin.  

Advanced emotionally resonant metadata, he adds, can make connections between programmes that traditional strictly factual metadata cannot. “It can drive carousels based on moods, aesthetic qualities, and more. This enables better recommendations in the moment, but ultimately can drive stronger emotional connections to a service.” 

Even as AI improves, there will likely always be a need for some level of editorial input, “especially when it comes to grouping relevant content,” reckons Azimi. “For example, this might include using metadata to enable topical grouping for use cases such as ‘simulated channels’ collating similar content into a UX likely to appeal to a certain demographic.” 

Docherty argues for a balanced approach of AI tools with expert oversight. “A number of vendors have sprung up saying they have AI for metadata but that’s only half the picture,” he counters. “It’s not until you’ve deployed content discovery to drive engagement that you know what works based on evidence from hundreds of millions of viewers.” 

He elaborates, “Much of what we see in AI metadata enrichment is adding volume to keywords/tags but not necessarily quality to improve viewer engagement. While automation delivers breadth, manual intervention delivers depth. When you have millions of items in your catalogue, you need automation for the reach, but you get better quality metadata by adding human expertise on top of the AI for your premium content.”  
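
That hybrid workflow can be expressed very simply. The sketch below is a hypothetical routing rule, not any vendor’s actual pipeline, in which every title receives AI-generated tags but premium titles are held for editorial review before the tags feed discovery.

```python
# Sketch of "automation for breadth, humans for depth": all titles get AI tags,
# premium titles are queued for editorial review first. Field names are illustrative.
def route_enrichment(title: dict, ai_tags: list[str]) -> dict:
    record = {"title_id": title["id"], "tags": ai_tags, "source": "ai"}
    if title.get("tier") == "premium":
        # A human pass prunes noisy keywords and adds the mood/theme nuance the AI misses
        record["status"] = "pending_editorial_review"
    else:
        record["status"] = "published"
    return record
```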

Opportunities 

For some in this sector more could be done to smooth current business dynamics and commercial agreements. “Media companies need to move away from their existing investments in legacy platforms that are not flexible enough to support dynamic metadata enrichment workflows,” Narayanan says. “They also need to commit to invest to build AI/ML capabilities that allow videos to be enriched automatically with actionable video descriptors and video fingerprints.” 

If this is done, then opportunities for greater metadata-driven user engagement open up. One is providing a “deeply personalised” experience to users.  

“This involves understanding content similarities at a deeper level (that is, going beyond basic genre, language, cast) and overlaying user preferences (likes, dislikes, location, device, day of the week, and more) to surface content that is appropriate to the context in which they are viewing the content,” Narayanan explains. 
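
A minimal sketch of that idea, assuming a hypothetical taste embedding for the viewer and a handful of contextual signals, might weight content similarity against context fit along these lines.

```python
# Illustrative scoring: combine deep content similarity (e.g. theme/mood embeddings)
# with how well a title suits the viewer's current context. Weights, fields and
# signals are hypothetical.
import numpy as np

def score(candidate: dict, user: dict, context: dict, w_sim: float = 0.6, w_ctx: float = 0.4) -> float:
    # Cosine similarity between the viewer's taste embedding and the candidate title
    a = np.asarray(user["taste_embedding"])
    b = np.asarray(candidate["embedding"])
    sim = float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

    # Simple context fit: does the runtime suit the time available, is it mobile-friendly?
    ctx = 1.0 if candidate["runtime_min"] <= context["minutes_available"] else 0.3
    if context["device"] == "mobile" and candidate.get("mobile_friendly"):
        ctx = min(1.0, ctx + 0.2)

    return w_sim * sim + w_ctx * ctx
```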

Another opportunity lies in improving the discoverability of long-tail catalogue content based on its similarity to other popular titles that drive user engagement. Yet another lies in using customised metadata (images, descriptions) for customer acquisition as well as engagement marketing campaigns. 

ThinkAnalytics says it can extract real value based on “content intelligence” which feeds into truly individual personalisation. “This can be proactively presenting a niche piece of content on a subject that the viewer really cares about, or proposing a movie with their favourite actor which has just been added to the catalogue,” Docherty says. “Once true content and audience intelligence is in place, the number of personalised experiences is unlimited.” 

Editorial is also emerging as a discovery mechanism. Video services like HBO Max, for example, are letting actors and directors share their recommendations.  

“We will also likely see more end user filtering for large catalogues,” says Björkén. “Video providers can help users understand the breadth and depth of their catalogue by showing them all the ways to slice it beyond free text search (such as filter by year, genre, keyword, director, country). Metadata is also about imagery and cover art. Most premium VOD services look stunning with perfect artwork for every screen, which will attract users’ interest.” 
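
Faceted filtering of this kind maps directly onto descriptive metadata fields. A simple sketch, with illustrative field names, might look like this.

```python
# Sketch of slicing a catalogue beyond free-text search, using descriptive
# metadata fields (year, genre, director, country). Field names are illustrative.
def filter_catalogue(catalogue: list[dict], year=None, genre=None, director=None, country=None) -> list[dict]:
    results = catalogue
    if year is not None:
        results = [t for t in results if t["year"] == year]
    if genre is not None:
        results = [t for t in results if genre in t["genres"]]
    if director is not None:
        results = [t for t in results if director in t["directors"]]
    if country is not None:
        results = [t for t in results if t["country"] == country]
    return results
```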

Open formats, better standards 

Data licensing issues and laws that hinder metadata collection and usage can be a problem, but these are generally understood to be there for good reason.  

There are other drawbacks, though. The most frustrating situations for Object Matrix are where metadata is held in bespoke systems or databases. “Often the metadata belongs to the customer, but the way of accessing that metadata is via closed solutions. This can’t be right. We need better ways to hold metadata in open formats.” 

Johnstone-Cowan agrees that the increasing preference for microservice systems from multiple vendors within broadcast organisations is snarling up workflows. 

“Whether that is cleaning metadata or developing best practices, too many silos can prevent the whole organisation sharing in these successes. Some teams may use just one system and there isn’t a process in place to share that information with others. Cooperation and sharing information are key; the duplication of effort when it comes to metadata is a big problem and can lead to huge inefficiencies, such as time lost to searching and indexing.” 

The solution is generally considered to be some form of standards, something Firstlight Media terms an absolute ‘must’ to fully realise the potential of video metadata. It calls for metadata standardisation to minimise fragmentation of discovery experiences and says the lack of standardisation is especially an issue for international and Tier 2 and Tier 3 content. 

“The industry needs a standard that unifies all metadata but is flexible enough to capture the various video descriptors/attributes generated through AI/ML and editorial curation,” Narayanan says. “Standards must maintain a consistent representation across languages, include a reference to a centralised video identifier registry, and handle all of the various video types (full length, clips, news, highlights, user-generated) and business models (SVOD, TVOD, AVOD).” 

The standards Object Matrix would like to see are clear text storage of metadata in sidecar objects. Morgan explains that this would allow multiple applications to harvest metadata into their own databases for interpretation.  

“It would always come back to a transportable and open format that would allow the metadata to live for as long as the video / object lives, without being tied to complex structures. The more complex the structure or format of the metadata the more inevitable it is that the format will become short lived - addressing the needs of the day rather than encouraging flexibility.” 
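
A minimal sketch of the sidecar approach, assuming a simple JSON structure rather than any particular standard, is just a plain text file written alongside the media object and readable by any application.

```python
# Sketch of clear-text metadata in a sidecar object: a plain JSON file that
# lives next to the media file for as long as it lives, with no proprietary
# database required. The structure shown is illustrative only.
import json
from pathlib import Path

def write_sidecar(media_path: str, metadata: dict) -> Path:
    sidecar = Path(media_path).with_suffix(".json")
    sidecar.write_text(json.dumps(metadata, indent=2, ensure_ascii=False))
    return sidecar

write_sidecar("interview_ep01.mxf", {
    "title": "Interview, episode 1",
    "language": "en",
    "ai_tags": ["studio", "two-person interview"],
    "rights": {"territories": ["UK", "IE"], "expires": "2025-12-31"},
})
```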

Codemill believes a strict, shared taxonomy and well-defined metadata schemas should be a priority. “Within a QC context, if one content operator uses the terminology ‘loudness issue’ while another uses ‘loudness error’, it can be tricky to follow up and gather useful insights,” says Johnstone-Cowan. “Terminology can also become an issue across different teams: if the team working on archive content via a MAM has one set of definitions for metadata, this needs to be shared with the new-release team for consistency.” 

She stresses, “Teams should collaborate on consistent terminology to build a gold-standard approach to metadata definitions that still incorporates the needs of different content workflows.” 
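
In code terms, a shared taxonomy can be as simple as mapping each team’s local terms onto one canonical vocabulary before metadata is written, so that ‘loudness issue’ and ‘loudness error’ end up as the same searchable tag. The mapping below is purely illustrative.

```python
# Sketch of taxonomy normalisation: map local terms onto canonical tags before
# metadata is stored. The terms and canonical forms shown are examples only.
CANONICAL_TERMS = {
    "loudness issue": "loudness_error",
    "loudness error": "loudness_error",
    "audio too loud": "loudness_error",
    "black frame": "black_frame",
    "blank frame": "black_frame",
}

def normalise_tags(tags: list[str]) -> list[str]:
    cleaned = (t.strip().lower() for t in tags)
    return sorted({CANONICAL_TERMS.get(t, t) for t in cleaned})

# e.g. normalise_tags(["Loudness Issue", "loudness error"]) -> ["loudness_error"]
```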

Ultimately, SVOD providers want to enhance the customer experience as much as possible. That means easy content discovery and content recommendations that make sense. There are lots of ways for this to develop. 
