Tuesday 23 April 2019

Curious kicks on with BBC3

Broadcast 
Dov Freedman on the thinking behind BBC3’s scheduling of Britain’s Youngest Football Boss.
When episode seven of Britain’s Youngest Football Boss lands on BBC Three next month, it will not only update the progress of West Ham United Women and the team’s 19-year-old manager Jack Sullivan. It will also provide further evidence that a linear schedule online can grow a fanbase.
The unusual distribution tactic followed in the footsteps of BBC Studios’ 14 x 15 minutes red-light district documentary Sex, Drugs & Murder in being dropped monthly onto the BBC iPlayer.
“BBC3 had been looking for a show that could replicate a linear schedule when we pitched the idea for Britain’s Youngest Football Boss,” says executive producer and Curious Films chief Dov Freedman, whose credits include Channel 4’s The Island and The Trouble With Dad.
“It was BBC3’s suggestion for a monthly drop and since we’re a new company we jumped at the chance to innovate,” he adds.

Filming on and off the pitch across the season, the series follows West Ham’s squad of female players as they debut in the recently announced, Barclays-sponsored professional Women’s Super League, having leap-frogged two divisions from the part-time third tier. Kicking off before Christmas, the show was neatly timed, with women’s football exploding into the mainstream in the run-up to this summer’s FIFA Women’s World Cup.
For Curious, the series marks its debut commission, with its announcement coming simultaneously with the official launch of the company back in November 2018.
Freedman, a veteran producer of two decades, launched the factual-focused company with long-time production partner Charlie Russell, who has directed shows like Chris Packham: Asperger’s and Me.
Their first challenge as an indie came from former BBC3 channel boss Damian Kavanagh and commissioning executive Michael Jochnowitz, who tasked Curious with making Britain’s Youngest Football Boss into a 10 x 25-minute series.
“The conventional model for this kind of live-access programme is to film for six months, edit for several months and go through all screenings and reviews before getting a slot for TX,” says Freedman.
“In this case, making an episode once a month was immediately challenging to production.”
Besides the logistical challenge of turning around an episode every four weeks, the producer wanted to maintain high production values.
“There could have been a rough and ready way of making the show, but we wanted to stick to our core beliefs and keep that craft of filmmaking in there,” says Freedman.
“It means really thinking about the way the story is shot and trying to tie events on the pitch with the personal stories of the players.”
Storylines run six to eight weeks behind each episode’s transmission date. Freedman likens the editorial approach to running a newspaper.
“We meet every Monday to discuss the latest developments in matches and personnel,” he says. “We didn’t want a show that only appealed to football fans, as it’s also about gender in the workplace. The key for us is to make the audience care about the characters as people and as players.”
Curious kept the production team small in order to build relationships with West Ham’s 18-woman squad, the coaching staff and the Sullivan family (manager Jack is the son of the West Ham chairman David Sullivan). Producer/directors Alana McVerry and Steven Prior alternate between episodes so that while one is filming the other is in the edit working with editor Jamie Williams.
“With any programme of this nature, it comes down to trust between production and the subject,” says Freedman. “We told [WHU Women] our ambition for the show was to make the audience care about them.”
He adds that the show’s main reference point was Netflix’s behind-the-scenes collegiate American football doc series Last Chance U, which he describes as “an aspirational story to the 16-34 audience” and “something that appealed to BBC Three”.
Freedman adds that the BBC3 execs were not fixated on how well the first episode of the show would perform on iPlayer. “They were confident the audience would grow as the series rolled out,” he says.
Indeed, the broadcaster recently began scheduling the whole run in a Saturday evening slot on BBC1 following Match of the Day. The second episode peaked at 1.4 million.
Freedman and Curious are set to link with wildlife presenter Chris Packham again on their next project, a single documentary about over-population for BBC2. A third show has also been greenlit but not yet announced.
“We enjoy working with talent to produce popular factual programmes that have an authentic feel,” Freedman says. “We are producing entertainment, but we want to execute and deliver shows with strong documentary values.”

Thursday 18 April 2019

Quality assurance key for OTT success

Content marketing for Rohde & Schwarz
The cloud allows for revolutionary gains in speed, flexibility and collaboration, with the industry on a fast track to deploying workflows for VOD and live events. Public and private cloud services are being used, as well as hybrid solutions which combine on-premises storage with media processing and layers of AI and analytics in the data centre. There are many benefits to doing so, from serving audiences richer, more personalised and interactive content to the operational cost savings realised by new pay-as-you-go business models.
It’s little wonder that NAB2019 was awash with new product and service launches based on the technology.
However, competition in the SVOD market is increasingly fragmented and intense, putting an onus on service providers to secure and sustain the attention of consumers. For this to be possible they must guarantee a consistent Quality of Experience (QoE) to ensure the highest quality streaming while still reducing operational costs and complexity.
As the market moves to event-based business models where consumers are paying to view a specific event – such as a boxing match or a music concert – effective real-time monitoring of the distribution chain is deemed essential. Without it, service providers are open to real reputation damage, even involving class action lawsuits, as (in)famously happened to Showtime following the Floyd Mayweather Vs. Conor McGregor fight in 2017.
Video issues can crop up at multiple locations in the end-to-end chain, and usually when one bottleneck is isolated, the next will soon spring up. Fluctuating and peak demand can cause unpredictable results, especially where third-party content delivery networks are being used to deliver other voice, data and video at the same time. Using multiple CDNs can ease the pressure, but the bottleneck may just move to the access networks, where it’s not so easy to dynamically switch suppliers.
The challenge of figuring out where the video feed is going wrong is highly complex, and it takes a combination of passive monitoring and active testing to verify the availability of streams in different regions. Ideally, an early warning system for video problems would flag issues such as bad picture quality, accessibility errors, buffering and outages. This includes testing immediately after the content is produced and packaged, and then periodically at multiple geographic locations after it leaves the CDN (in data centres, on premises or in the cloud). Sampled coverage testing at the edge of access networks, whether broadband cable, Wi-Fi or cellular, must also be part of the matrix.
The important part is to put a system in place that can monitor and test before the event to make sure everything is solid, and then during the event provide an early warning across the end-to-end delivery chain, spanning multiple geographic regions and access networks.
In other words, the whole system should be self-aware, probing all aspects of the stream to provide an efficient early warning of distribution faults, so service providers don’t have to wait until they start to receive complaints from viewers.
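As a rough illustration of the active-testing side of such a system, the sketch below (in Python, with a hypothetical manifest URL and thresholds; it is not based on any Rohde & Schwarz product) simply polls an HLS manifest and raises an alert when the fetch fails or slows down.

```python
# Minimal sketch of an active OTT availability probe of the kind described
# above: fetch an HLS manifest, time the response, and flag problems early.
# The URL and thresholds are illustrative assumptions, not a real service.
import time
import requests

MANIFEST_URL = "https://example-cdn.com/live/event/master.m3u8"  # hypothetical
LATENCY_THRESHOLD_S = 2.0   # flag slow manifest fetches
POLL_INTERVAL_S = 30        # probe every 30 seconds during the event

def probe_once(url: str) -> dict:
    """Fetch the manifest once and return basic health indicators."""
    started = time.monotonic()
    try:
        resp = requests.get(url, timeout=10)
        latency = time.monotonic() - started
        ok = resp.status_code == 200 and "#EXTM3U" in resp.text
        return {"ok": ok, "status": resp.status_code, "latency_s": latency}
    except requests.RequestException as exc:
        return {"ok": False, "status": None, "latency_s": None, "error": str(exc)}

if __name__ == "__main__":
    while True:
        result = probe_once(MANIFEST_URL)
        if not result["ok"] or (result["latency_s"] or 0) > LATENCY_THRESHOLD_S:
            print("ALERT:", result)   # in practice, raised to a monitoring dashboard
        time.sleep(POLL_INTERVAL_S)
```

In a real deployment the same probe would run from several geographic vantage points, alongside passive monitoring of the packaging and CDN stages.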
Rohde & Schwarz has done just this with a new cloud-based OTT monitoring solution that broadcasters and content providers can deploy quickly and without dedicated hardware.
R&S PRISMON.cloud was designed to enable OTT providers to easily adapt their monitoring infrastructure, for example when peak loads during the transmission of large events require extended service. A live multiview function, automated analysis of audio/visual data and error assignment in real time make it possible to permanently measure the QoS, store it in a cloud and visualise it in a timeline format on a web interface.
In combination with the on-premise R&S PRISMON monitoring solution, analysis data from physical and virtual sensors can be displayed on a single dashboard. End-to-end analyses quickly and easily reveal errors such as deterioration of video or audio quality, or poor CDN performance as a cause of churn – with just one tool.
What’s more, as a monitoring-as-a-service (MaaS) offering, it allows users to reduce investment costs by ordering individual monitoring services based on current requirements.
Permanent error monitoring and the resulting quality assurance, especially of audio and video data, are the key to high customer satisfaction and a win in the intense SVOD battle that lies ahead.

Tuesday 16 April 2019

Sky and ITV Make Major Ad Tech Moves

Streaming Media
British broadcasters have made separate moves to advance ad tech in a further defence against online competition, but questions of scale, cost and measurement remain. From July, Sky is extending its AdSmart technology to make addressable television advertising available across the Sky and Virgin TV footprint in the UK. It is also working with NBCUniversal, part of its Comcast parent group, to combine AdSmart with NBCU’s Audience Studio to create an international offering.
Meanwhile, commercial broadcaster ITV has found a partner to roll out a programmatic ad platform by the end of the year.
Definitions of "addressable" and "programmatic" often differ depending on who you are talking to, and the two are not necessarily mutually exclusive. By and large, programmatic relates to an automated, software-based buying mechanism, while addressable represents the ability to target and deliver ads directly to specific audiences based on user/household data. 

Sky Expands AdSmart
AdSmart enables different adverts to be shown to households watching the same programme. This gives advertisers and brands the ability to tailor their campaigns to specific audiences and locations.
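To make the idea concrete, here is a deliberately simplified Python sketch of addressable ad decisioning; the household attributes, segments and campaigns are invented for illustration and bear no relation to AdSmart’s actual implementation.

```python
# Highly simplified illustration of addressable ad decisioning: households
# watching the same break receive different spots chosen from their attributes.
from typing import Optional

campaigns = [
    {"ad": "family_suv_30s",    "region": "Yorkshire", "segment": "families"},
    {"ad": "broadband_20s",     "region": "London",    "segment": "young_adults"},
    {"ad": "default_brand_30s", "region": None,        "segment": None},  # fallback
]

def pick_ad(household: dict) -> Optional[str]:
    """Return the first campaign whose targeting matches this household."""
    for c in campaigns:
        if c["region"] not in (None, household["region"]):
            continue
        if c["segment"] not in (None, household["segment"]):
            continue
        return c["ad"]
    return None

print(pick_ad({"region": "Yorkshire", "segment": "families"}))  # family_suv_30s
print(pick_ad({"region": "Wales", "segment": "retired"}))       # default_brand_30s
```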
The Sky deal in the UK will give advertisers access to 30 million targetable television viewers across around 13 million homes, and covers both targeted linear channels and VOD advertising. Sky Media will be the exclusive advertising sales agent across the entire AdSmart network in the UK. Later this year, Virgin Media will also trial AdSmart on its free-to-air TV service in the Republic of Ireland.
"Aside from access to market-leading technology, the key thing that Virgin Media and parent Liberty Global gain from partnering with Sky is scale, not just of reach but also audience data, without which accurate targeting at scale cannot exist," says Matthew Bailey, Senior Analyst, Media and Entertainment, Ovum. "Integrating into one core platform will also make it easier for advertisers, for whom navigating an increasingly fragmented digital marketplace remains a challenge, to move more spend into addressable TV."
With NBCU, Sky plans to launch a global product. According to Linda Yaccarino, chairman of advertising and partnerships at NBCUniversal, in a release: "The world is getting smaller, and the opportunity for international marketers to make an impact with consumers is getting bigger. The industry has demanded a global premium video offering, and now, one will finally exist."
Bringing AdSmart to the U.S. and combining it with Comcast and NBCUniversal's tech and reach should increase its impact in both geographical markets. 
"Ad Smart's technology is held in high regard amongst many in the industry, but its confinement to the UK and European market—which is quite far behind the U.S. in uptake and awareness of addressable TV advertising—has so far limited its overall market impact," says Bailey.
NBCU is also piloting contextual media planning to align brand messaging with relevant scenes across national programming to "enhance advertising effectiveness and provide a more coherent viewing experience." 
The joint offer gives global advertisers the ability to plan and measure TV campaigns across international markets under a unified platform—something that can already be done in the digital world through the likes of Google and Facebook. 
"There are dangers to a globalised approach to TV advertising," the analyst warns. "Consumer tastes and TV advertising market dynamics vary from country to country. Plus, regional broadcasters are already experts at cultivating the content required to engage specific audiences, and some may be reticent to cede control of inventory if the quality and tone of the ad experience doesn't match up."
He suggests that the employment of AI-enabled contextual alignment—which matches ads with broadcast content—should help here.

ITV Hub Goes Programmatic
ITV's deal with ad-tech specialist Amobee will see it launch a new programmatic, premium advertising platform as the broadcaster looks to scale up its VOD content. Amobee, owned by Singapore telco Singtel, acquired ad tech firm Videology last August.
"Currently, advertisers looking for short-term results will mainly avoid TV advertising in favour of digital, as planning and buying a TV ad is often involving and doesn't offer the near-instant visibility of return on investment found on digital platforms," Bailey says. "This also extends to broadcaster VOD, which is still often sold as an extension of TV, rather than an entity in itself despite competing with Google and Facebook in the online space."
Rolling out a self-serve programmatic platform will also enable smaller advertisers to target select groups of consumers through ITV Hub across multiple devices. 
"Advertisers will benefit from the high-quality, brand-safe environment of TV without needing to invest the same amount of time and money required to buy and run a traditional TV ad campaign," Bailey says. "This kind of mechanism will grow in importance for broadcasters and other OTT video service providers as more players start to explore hybrid approaches to monetization, including both subscriptions and ads."
While more broadcasters are starting to experiment with this kind of technology, there are still several creases that need to be ironed out before it becomes the norm. The issue of measurement, for instance, remains a hotly debated topic in the industry. 
"More collaboration will be required across the TV ecosystem before a consensus is reached on exactly what can and needs to be measured to bridge the gap between broadcast and digital/OTT viewing, although some progress has been made in this area," Bailey says. 
It's worth noting that what ITV has announced relates to its digital inventory—applying this kind of technology to linear, broadcast TV advertising represents a completely different challenge. 
"While programmatic platforms are often seen as more efficient and less expensive than traditional ad sales channels, re-educating less agile advertisers and realigning decades-old processes to fit into a programmatic-first TV advertising environment will require both time and financial investment. And, even once this has been achieved, it is likely that significant human input will still be required – you just need to look at the issues faced by Google and Facebook over the past few years to see why."
Bailey adds that addressability has a big part to play in increasing the value of TV advertising for broadcasters and MVPDs, "who should see an uplift in CPM for addressable ads;" for advertisers, "who can effectively target TV ads at a much more granular level;" and consumers, "who will see more relevant and useful ads."
"But TV advertising's biggest strength has always, and continues to be, the ability to reach large, mass, simultaneous audiences with big brand messaging campaigns. As such, it's probably not a case of whether we are able to [deliver ads tailored to zipcode] but rather a case of whether there are enough advertisers that want to."

Monday 15 April 2019

NAB: Building virtual worlds


IBC
NAB 2019: Virtual set solutions powered by games engines proved a big draw for the live broadcast as well as the scripted market.
The fusion of games engine renders with live broadcast has taken virtual set solutions to another level with photoreal 3D graphic objects appearing indistinguishable from reality.
At NAB, all the leading developments in this area came powered by Epic Games’ Unreal Engine. Originally designed to quickly render polygons, textures and lighting in video games, the engine can seriously improve the graphics, animation and physics of conventional broadcast character generators and graphics packages.
Every vendor also claims that their integration of Unreal Engine creates the most realistic content for virtual and mixed reality and virtual sets.
Some go further and suggest that their virtual production system transcends its real-time broadcast boundaries, providing real-time post-production and high-end content pre-visualisation for the episodic and film market.
One of those is Brainstorm, the Spanish developer behind InfinitySet. The latest version of the software takes advantage of Nvidia GPU technology and Unreal Engine 4 (UE4) for rendering. Nvidia’s SLI technology can connect several GPUs in parallel, multiplying performance accordingly, so that InfinitySet can deliver real-time ray tracing for much more accurate rendering, especially in complex lighting conditions.
 “Ray tracing offers more natural, more realistic rendered images, which is essential for photorealistic content generation,” Héctor Viguer, Brainstorm’s chief technology officer and innovation director explained. “InfinitySet can create virtual content which can’t be distinguished from reality.”
These developments open the door, he says, for content providers to create “amazingly rendered” backgrounds and scenes for drama or even film production, “significantly reducing costs”.
He claimed: “For other broadcast applications such as virtual sets or AR, InfinitySet provides unmatched, hyper-realistic quality both for backgrounds and graphics.”
Like competing systems, InfinitySet works as a hub system for a number of technologies and hardware required for virtual set and augmented reality operation, such as hardware chroma keyers, tracking devices, cameras and mixers.
Vizrt’s Viz Engine 4, for example, includes a built-in asset management tool, downstream keyers, switcher functionality, and DVEs. The company billed its NAB update as one of the most important ever, presenting its new Reality Fusion render pipeline for delivering more realistic effects and real-time performance. Integration with Unreal Engine 4 adds the flexibility of having backdrops with physical simulations such as trees blowing in the wind, combined with a template-driven workflow for foreground graphics.
The release uses physical-based rendering and global illumination among other techniques “to achieve realism for virtual studios and AR graphics to levels never seen before,” said Vizrt chief technology officer Gerhard Lang.
ChyronHego offered Fresh, a new graphics-rendering solution based on UE4.
“News, weather, and sports producers no longer need to struggle with multiple types of render engines and toolsets, which can result in varying degrees of quality,” contended Alon Stoerman, ChyronHego’s senior product manager for live production solutions.
“This means producers are able to tell a better story through AR graphics that look orders-of-magnitude better than graphics created with traditional rendering engines. They’re also able to do it faster and easier than ever before, since Fresh can be operated as an integral part of the rundown. These are truly unique capabilities in the industry.”
He explained that a built-in library of 3D graphic objects sets Fresh apart from competitor systems that require the broadcast elements to be created in a traditional graphics-rendering engine and then added as a separate layer on top of the Unreal scene.
“Not only does this requirement add more time and complexity— a liability during a breaking news or weather event — but the resulting graphics lack the realism and ‘look’ of the gaming engine,” Stoerman argued. “With Fresh the graphics are rendered as part of the UE4 scene and carry the same photorealistic and hyper-realistic look as the other scene elements.”
Ross Video had adapted The Future Group’s (TFG) Unreal Engine broadcast graphics software into its virtual set and robotics camera solutions but has now parted ways with the Oslo-based developer.
Instead, it is offering its own UE4-based virtual studio rendering package called Voyager. It works with Ross’ control software and with a variety of tracking protocols and camera mounts.
“Ross has worked hard over the last few years to put Unreal based rendering on the map for virtual production,” said Jeff Moore, EVP at Ross. “We see our in-house development of an Unreal-based system as a natural evolution of our ability to provide more complete solutions and this liberates us to move at a faster pace, without external constraints.”
The Future Group, meanwhile, has been evolving its technology. Rebranded Pixotope (from Frontier), the company makes extravagant claims for the software.
“With Pixotope, we take the incredible 3D, VR, AR and data technology that’s emerging at an ever-increasing rate, and make it possible - easy, even - to use it in every production, at almost any budget,” said Halvor Vislie, chief executive officer.
“The real and the virtual digital worlds are converging, setting the scene for artistic and commercial opportunities on a massive scale. Mixed reality, interactive media productions can engage audiences in new ways, opening up new business models.”
The software runs on commodity hardware and is available by subscription. It’s a model that will change the virtual production landscape forever, TFG claims.
“It’s transformative for the industry,” Vislie said. “All the power, quality and stability demanded by broadcasters, without the need for expensive, proprietary hardware. And no massive capital outlay: just an easy monthly payment.”
Pixotope includes a real-time compositing engine working at 60fps in HD and 4K and claims perfect synchronisation between the rendered objects and the live action. In a partnership with Ikinema and motion tracking technology firm Xsens, TFG is offering a real-time production process for capturing AR character or ‘talent interactive content’.
“Real time animation and live virtual character puppeteering … is one of the most costly and difficult types of production to do. We have collectively created the solution our customers are looking for,” added chief technology officer Marcus Brodersen.
With Arraiy, a provider of computer vision solutions, TFG aims to deliver AI-based real-time VFX. By the end of 2019, Arraiy will release tools to perform real-time matting without green screen and another to generate real-time depth extraction and occlusion solving, both of which will be added to Pixotope.
UK broadcast graphics company Moov, whose clients include Sky and BT Sport, announced at NAB that it would use Pixotope on virtual studio and augmented reality graphics projects.
Virtual reality was little in evidence at NAB, as the technology takes a back seat until 5G enables real-time, high-resolution, high-fidelity 360° broadcasts and more robust ways of making money from its production are realised. One of the leading live VR producers, NextVR, did not exhibit in Las Vegas.
Sports graphics specialist FingerWorks, though, showed how 3D stereo graphics could output for viewing in head mounted displays from PlayStation VR, Gear VR, Vive or Facebook.
“Broadcast VR cameras record the two ‘eyes’ of the stereo VR image. Our graphical telestrator technology is inserted on the broadcast side by an operator or analyst. They then stream out equirectangular images to the headsets where they are stitched together to present a smooth global visual presentation,” explained FingerWorks.
Sony previewed a development of its Hawk-Eye graphics tool that can now be used to generate 3D models of a game. The system tracks 17 different points on the skeleton of every player, as well as the ball, via six cameras around the field to create a large dataset. Combined with virtual graphics, it would allow a fan to see a 3D model of the game and then move around the 3D world.
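As a purely illustrative sketch (not Sony’s actual data model), per-frame skeletal tracking data of this kind might be structured roughly as follows; the field names and player counts are assumptions.

```python
# Illustrative structure for per-frame skeletal tracking: 17 joints per player,
# triangulated from multiple camera views, plus the ball position.
from dataclasses import dataclass
from typing import List, Tuple

NUM_JOINTS = 17  # points tracked on each player's skeleton

@dataclass
class JointObservation:
    joint_id: int                         # 0..16, e.g. head, shoulders, knees
    position: Tuple[float, float, float]  # triangulated pitch coordinates (m)
    confidence: float                     # tracking confidence from the solver

@dataclass
class PlayerPose:
    player_id: int
    joints: List[JointObservation]        # NUM_JOINTS entries per frame

@dataclass
class FrameSample:
    timestamp_ms: int
    ball_position: Tuple[float, float, float]
    players: List[PlayerPose]             # all tracked players in this frame

# A match's worth of FrameSample records is the "large dataset" the article
# mentions; replaying it through a graphics engine yields a navigable 3D model.
```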
Ncam Technologies unveiled the latest version of its camera tracking solution, Ncam Reality, which supports an additional camera bar for tracking environments that are subject to dynamic lighting changes.
Ncam chief executive Nic Hatch said: “This new release solves many of the common challenges faced by AR users, as well as providing new features that will greatly enhance AR projects, whether for VFX pre-visualisation, real-time VFX, virtual production, or live, real-time broadcast graphics.”
VR on hold
Although 10.6 million ‘integrated display’ VR head-mounted displays (including console, PC and all-in-one headsets) shipped in 2018, with an additional 28 million mobile phone-based VR viewers shipped, analyst Futuresource Consulting expects the pace of uptake in VR hardware to remain modest for the foreseeable future, with limited content available to engage the mass market.
In the long-term, the outlook for VR remains positive, Futuresource suggests, with VR technology and hardware continuing to develop and improve user experience. “Access to and the availability of successful VR content continues to be a sticking point throughout the industry, with content development caught between a currently small active base of VR users, limiting the potential sales for VR software, and the substantial investment required to produce high quality VR experiences,” said analyst James Manning Smith.
“We expect that as the installed base of VR headsets grows, there will be further interest in content creation as the potential eyeballs and players for VR content creates a more lucrative market.”


Friday 12 April 2019

Posthumous performances: Should we digitally recreate dead actors?

IBC
The ethics of digitally recreated performances by actors who have passed away were questioned during an SMPTE panel at NAB.
In 2016, a dead man acted in a Star Wars film. Peter Cushing passed away in 1994 but a CG likeness of him was painstakingly recreated from individual frames to perform his scenes in Rogue One.
It is increasingly common today to scan important actors at various stages of the filmmaking process as an aid to VFX. It happens in TV too, with the lead cast of BBC and Amazon show Good Omens getting full photo scans.
But what happens to these assets in the case of an actor’s untimely death? Can they continue to perform posthumously? And who owns our digital likenesses – the studio or the actor? And what happens if they become public domain?
These were some of the questions tackled by an SMPTE panel at NAB, ‘Our Digital Selves in a Post-Reality Era’.
Of course, Peter Cushing was not the first. Oliver Reed, Paul Walker and, more recently, Carrie Fisher passed away during filming and had their performances completed in CG. Go slightly further back and Audrey Hepburn and Gene Kelly were digitally resuscitated to ‘act’ in new commercials.
“Digital technology is clearly coming to a point where photorealistic representation is possible and we must ask ourselves if we feel comfortable using likenesses of celebrities who have passed away,” says VFX artist Arturo Morales, who worked on Walker’s posthumous appearance in Fast and Furious 7.
“I think that it is ethical as long as I can bring the digital actor on the same level of talent and if they are used in roles that follow the path of their career.”
Techniques for re-creating human performances are increasingly sophisticated. Lola VFX, for example, digitally de-aged Samuel L Jackson by about 30 years in Captain Marvel. A stand-in actress was used as a body double in Blade Runner 2049, onto which Sean Young’s face was overlaid. The same film also featured a holographic Elvis and Marilyn Monroe playing in a Las Vegas casino.
“We don’t just have the ability to volumetrically scan an actor’s face, body and movement; there are AI tools which can manipulate that ‘character’ independently of the actual performer,” explains cinematographer Andrew Shulkind.
In a world in which deep fakes are in many cases indistinguishable from the real thing, the ownership of our selves in a digital era isn’t just an issue for A list stars. We may all have an avatar in the 3D internet, so who protects that?

AV1: Codec wars erupt


IBC
Just as AV1 gains significant momentum, its royalty-free agenda comes under threat.
Reporting on developments in compression is like building with sand. No sooner do you think you’ve grasped it than the silica slips through your hand. Imagine how exasperating that must be for device manufacturers and online content providers who need to get a fix on tomorrow’s roadmap.
The good news for supporters of AV1, a rival or successor to HEVC, is that there is growing momentum behind its implementation - but there’s a fly in the ointment.
On the plus side, Samsung announced its intent to join the Alliance for Open Media, the body that oversees AV1 development, and seems likely to put the codec into its consumer devices, from TVs to Galaxy smartphones.
With Apple, Amazon, Netflix, NVIDIA, ARM, Facebook, Microsoft and Google already AOMedia members, the move would seem to lock down AV1 as the codec of choice for streaming media distribution.
It’s only been a year since AV1 specifications were launched but according to analyst Jeff Baumgartner of Light Reading, the codec has made significant progress in terms of adoption and commercial-readiness with “ample evidence that a decoding and encoding product ecosystem is building around AV1.”
Though early implementations of AV1 are software-based, notably running in Google’s Chrome browser and Mozilla Firefox, Socionext’s FPGA-based encoder (claimed to run AV1 encoding ten times faster than software) is among the few AV1 hardware products. More of these can be expected at IBC2019.
New technologies from virtual reality to online gaming and interactive entertainment all have the potential to offer users higher calibre experiences with richer images and a wider range of colours at faster speeds.
“AV1 tackles the challenging technical hurdles to enable this growth – while using compression to reduce data demands,” endorsed Samsung. “By joining AOMedia at the highest level, Samsung will directly participate in expanding adoption of AV1 to help bring better media experiences to customers and their devices around the world.”
Even more significant is the announcement by Intel and Netflix at NAB of a new offshoot of AV1 which is expected to accelerate the codec’s adoption.
The pair are promoting Scalable Video Technology for AV1 (SVT-AV1) capable of running a 4K video stream at 60 frames per second at 10 Mbit/s using the latest Intel Xeon processors.
It’s believed to be the first software-only AV1 implementation capable of such performance, which, according to expert Jan Ozer, “represents an order of magnitude acceleration of AV1 encoding.”
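To put those figures in perspective, a quick back-of-envelope calculation follows directly from the quoted numbers; the uncompressed 8-bit 4:2:0 reference used for comparison is an assumption.

```python
# Quick sanity check of what "4K at 60 fps in 10 Mbit/s" implies.
width, height, fps = 3840, 2160, 60
bitrate_bps = 10_000_000

pixels_per_second = width * height * fps          # ~497.7 million pixels/s
bits_per_pixel = bitrate_bps / pixels_per_second  # ~0.02 bits per pixel

uncompressed_bps = pixels_per_second * 12         # assumed 8-bit 4:2:0 source
compression_ratio = uncompressed_bps / bitrate_bps

print(f"{bits_per_pixel:.3f} bits/pixel, ~{compression_ratio:.0f}:1 compression")
# -> roughly 0.020 bits/pixel, about 600:1 versus an uncompressed 4:2:0 source
```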
The SVT architecture is not new – it has already delivered codecs including SVT-HEVC and SVT-VP9 – but the SVT-AV1 implementation is claimed by Intel and Netflix to be unique in that it “allows encoders to scale their performance levels based on the quality and latency requirements of the target applications”, initially for video on demand and in future for live streaming.
While AV1 is marketed as royalty free, there are those who argue that the complexity of implementing AV1 in practice means its compute cost is higher than HEVC’s and many times higher than VP9’s.
With the processing muscle of Intel behind it, SVT-AV1 could bring that cost down.
“The SVT-AV1 collaboration with Intel brings an alternative AV1 solution to the open-source community, enabling more rapid AV1 algorithm development and spurring innovation for next-generation video-compression technology,” explained David Ronca, Netflix director of encoding technologies.
Importantly, SVT-AV1 is being offered to the open source community and so will be available to AV1 codec developers like Harmonic, Bitmovin and Beamr.
“Compared to today’s most popular codec (H.264 AVC), SVT-AV1 can help service providers save up to half their bandwidth, delivering leading-edge user experiences that can be quickly and cost-effectively delivered at a global scale,” said Intel’s Lynn Comp, VP of the Network Platforms Group.
She pointed to nonlinear interactive programming such as Netflix’s Bear Grylls show You vs. Wild as an example of the type of content that requires a higher level of processing.
AV1 faces patent protection challenge
While AV1 has been promoted as royalty free, in deliberate contrast to HEVC, there have been those warning that the proof is in the pudding. HEVC has been mired in patent issues, and a similar problem is now emerging for AV1.
This is the launch, strategically made prior to NAB, of a new video coding licensing platform targeting AV1 and VP9.
It’s being fronted by Luxembourg-based Sisvel International, a global IP protector and patent pool administrator, whose online video pointedly questions whether AV1 and VP9 should really be free.
The campaign on its website is run with the tagline: ‘Do the right thing. Reward innovation: play the right future’ and essentially argues that successive generations of compression standards have built on the intellectual property of those that have gone before. It suggests that there’s a direct line between some of the companies which held patents in VP8 and VP9 to AV1 and that therefore payment for this development should be expected by the industry.
It has marked out a patent pool for AV1 licenses applied to “consumer display devices” (smartphones, PCs, TVs) and “consumer non-display devices” (e.g. set-top boxes).
The rate for these, states Sisvel, is €0.32 for display devices and €0.11 for non-display.
While it makes clear that the licenses do not cover content encoded in the format, they will cover playback of that content on devices with embedded AV1 decoders.
As streaming video development specialist Mux points out, that could mean a bill for Apple of $29 million a year based on current sales of iOS gear.
Sisvel argues: “If those funding R&D activities are not fairly rewarded, there will be less incentive to innovate further. Securing funding to support R&D activities is fundamental to foster the innovation ecosystem.”
“The video codec technologies used today are the result of decades of investments in innovation by many parties,” it maintains. “These continued investments resulted in different generations of technologies, allowing for continued and drastic improvements in the video codecs domain.
“Revenues from patent licensing can be and regularly are re-invested in R&D activities. This creates a self-sustaining cycle in which the fruits of previous innovation can fund new research, generating an inventive loop in which the intangible assets acquire real economic value. Innovation is a long and expensive process, it requires the possibility of substantial returns to be worth pursuing.”
None of this cuts any ice with the Alliance, which promptly rebuffed the approach.
In a terse statement released at NAB, the group said: “AOMedia was founded to leave behind the very environment that the [Sisvel] announcement endorses – one whose high patent royalty requirements and licensing uncertainty limit the potential of free and open online video technology. By settling patent licensing terms up front with the royalty-free AOMedia Patent License 1.0, AOMedia is confident that AV1 overcomes these challenges to help usher in the next generation of video-oriented experiences.”
Members of the pools behind the Sisvel intervention reportedly include JVC, Philips and Toshiba along with operators Orange and NTT all of whom have also licensed patents to MPEG LA for either the AVC, DASH, or HEVC patent pools.
“The idea that these [multinational corporations with multi-billion dollar revenues] need this component of revenue feels unrealistic,” contends Mux streaming specialist Phil Cluff. “While we agree it’s important for intellectual property to be respected, this feels like potential profiteering on the behalf of the patent holders and Sisvel, especially considering the admirable project goals of AOM and the AV1 project in particular.
“With no transparency of the patents offered by this group, it’s hard to gauge the impact or legitimacy of Sisvel’s patent pool.”
Additionally, MPEG was at NAB explaining its decision to fast-track Essential Video Coding (EVC), a new codec intended to compete with HEVC on bit-rate efficiency and to be “licensing-friendly”.
As Harmonic’s VP of video strategy Thierry Fautier has pointed out, EVC is not royalty free. “Its baseline profile might be, but its compression performance is less than for HEVC.”
He suggests that if EVC achieves 24% bit-rate efficiency savings over HEVC, this will come with a royalty attached: “The devil is in the details.”
The Media Coding Industry Forum, which launched at IBC2018 and includes companies like Canon, MediaKind, Sony, Nokia, and Apple, will have its work cut out policing all of this to avoid another HEVC licence debacle.


NAB 2019: Esports targeted by kit vendors

IBC
The rapid growth of esports is proving to be a godsend for live event kit vendors, and as a result was a major focus at this year’s show in Las Vegas.
With the global audience forecast to hit 450 million and industry revenues set to breach $1 billion for the first time in 2019 according to Newzoo, esports’ rapid growth is a godsend for under-pressure live event kit vendors.
NAB show organisers recognised this too, introducing a dedicated ‘Esports Experience’ exhibit.
“Clearly, esports is a programming content category that is moving beyond millennials and into the mainstream of American society,” explained Dennis Wharton of the NAB.
For the media and broadcasting industries, esports and, more broadly, gaming and game streaming “will be a key component of their future media strategies,” underlined Newzoo’s Cleo Sardelis.
Vendors keen to cash in include those whose camera channels, switchers, replay machines and graphics gear have been staples of the outside broadcast industry for the last 30 years.
Grass Valley, for example, hosted a live esports competition at its booth to demonstrate its ability “to leverage decades of live-production experience to meet the specific needs of esports companies and specialist venues.”
Partnered with esports-production company FACEIT, Grass Valley demoed kit more commonly seen at premier sports environments including LDX 86 series cameras, Korona production switchers and the LiveTouch highlights system.
Grass Valley also attached itself to the NAB esports event alongside fellow sponsors Akamai, eBlue and The Switch.
The latter is a US-based operator of private cloud services for media and it used NAB to launch Switch eSports which is being pitched as a way into live streaming and remote production for broadcasters.
“The real value [of the initiative] is our ability to connect a live gaming or esports producer from anywhere in the world onto The Switch network, enabling them to then use all [our services] including [our] 10 studios across the US and in London, and our dedicated connections to more than 800 content producers and distributors, including the leading digital platforms,” explained Keith Buckley, CEO of The Switch.
It’s an effort that would seem to rival Forbidden Technologies’ Blackbird which offers similar capability. At the show, Forbidden reconfirmed its pact with UK-based esports producer Gfinity which uses Blackbird to create highlights and other content from live streams for social media. Blackbird was also used to drive fan engagement for the live pro-celebrity Fortnite event in LA last year.
Also promoting its presence at the NAB Esports Experience was KVM maker IHSE USA. “Esports is the perfect application for a custom-designed, dependable KVM extender,” reckoned Dan Holland, IHSE marketing manager. “Gaming pros typically require a lot of screens, both onstage and elsewhere, in a venue that can offer sufficient support for high-end, distributed video and direct computer access, together with minimal latency or mouse lag.”
LED video wall manufacturer Absen hoped that the Esports Experience “will lead many NAB attendees to learn more about the new Absen Aries Series 1.5mm panel.”
EVS lays claim to building esports’ first in-game slow motion replay solution in which ‘observer PCs’ are placed into a live game, viewing the action as if they were cameras. Feeds from these are recorded, ingested by an EVS server and slowed down to the broadcast-standard 60Hz. EVS also promotes DYVI, its software switcher, for esports producers to cut together content from the live play output from the gamers’ PCs and for the live programming.
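The slow-motion effect itself comes from the usual frame-rate arithmetic: capture faster than you play out. A minimal Python sketch is below; the 240 fps observer capture rate is an illustrative assumption, not an EVS specification.

```python
# Conceptual frame-rate arithmetic behind a slow-motion replay: a feed captured
# at a high frame rate, played out at the broadcast-standard 60 Hz, runs
# proportionally slower on air.
CAPTURE_FPS = 240          # hypothetical observer-PC capture rate
PLAYOUT_FPS = 60           # broadcast-standard playout rate

slowmo_factor = CAPTURE_FPS / PLAYOUT_FPS   # 4.0 -> quarter-speed replay

def replay_duration(real_time_seconds: float) -> float:
    """How long a captured moment lasts when played back at 60 Hz."""
    return real_time_seconds * slowmo_factor

print(f"{slowmo_factor:.0f}x slow motion; a 2 s in-game moment plays out over "
      f"{replay_duration(2.0):.0f} s on air")
```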
According to Intel, which sponsors esports competitions and has a vested interest in selling PCs with its chips to gamers, esports is the fastest growing spectator sport.
Revenues are on track to hit $1.6bn annually by 2021 but, as with regular sport, it’s the potential earnings from the betting market around esports that are astronomical. Research firm Eilers & Krejcik Gaming forecasts that global esports gambling could top $13bn by next year.
Naturally, the Luxor hotel and casino on the Las Vegas strip has a 30,000 square foot esports games and broadcast arena.
According to analyst eMarketer, the esports fan is a conundrum for brand marketers. Nonetheless, esports viewers “tend to be loyal, tuned-in, digitally savvy and more affluent than the average consumer,” it states. “The trick is learning where, when and how to reach them.”
Keen to keep up with millennials, major sports governing bodies, including the Olympic movement, are embracing esports tournaments while, for the moment, keeping them at arm’s length from their main competitions.
Even cricket, seemingly the most digitally immune of sports (EA Sports axed computer game EA Cricket in 2017), can’t ignore the tide. The sport’s governing body in England, the ECB, has appointed consultancy Strive to evaluate the potential of video gaming and esports as a platform to engage young people.
Despite the sizeable viewership figures, esports constitutes a small niche within gaming. Active gamers in China alone number 619.5 million with figures on the rise. This is driven by the growth in mobile gaming, a sector that was responsible for 49% ($63.2bn) of the games industry’s annual revenue in 2018. It’s these impressive numbers and the associated first-party data opportunity that is driving many governing bodies to explore gaming and esports.
Motorsport is taking a lead. Formula One owner Liberty Media has more than doubled the prize money for this year’s esports world championship and will include more races and live events on the calendar. Last year’s competition, which was fronted by F1 teams, was watched by 5.5 million viewers.
“The first two seasons of esports for F1 have been so successful,” F1’s head of marketing, Ellie Norman, told ESPN. “It will be a matter of time before we see the convergence of these worlds getting an absolute overlap. I think that’s a really exciting apex that’s going to come.”
F1 is behind the curve though. Upstart motorsport series G1, run by Israeli car manufacturer Griip, has already put virtual races on a par with the track. Using simulations of some actual G1 race venues, drivers compete head to head in simulator games and will score points toward this year’s championship. Further plans include fast-tracking an esports competitor into a seat in a G1 car for future series.
LiveU is providing the live streaming technology for the 2019 G1 series and was plugging this by showing off a G1 racecar on its NAB stand.
iRacing, which claims to be the leading online motorsports simulation game, announced it is using Limelight Networks’ content delivery network to deliver software updates containing the latest track and car information to players. Software updates range from 4GB to 5GB for existing players and up to 30GB for new members; customers had complained of inconsistent download performance. Perhaps not any more.
Games engines driving virtual production
The core technology driving real-time computer gaming is also advancing the production possibilities of broadcast TV and feature film. Games engines are integral to so many virtual production and real-time broadcast workflows that developer Epic Games was awarded a technical Emmy in the brand new category ‘3D Engine Software for the Production of Animation’.
At NAB, companies exhibiting workflows with Epic’s Unreal Engine included AJA, Avid, Blackmagic Design, Bluefish 444, Brainstorm, ChyronHego, Maxon, Mosys, Ncam, NewTek, Ross Video, The Future Group, Vizrt, wTVision, and Zero Density.
Epic Games technical product manager for mixed media Andy Blondin noted in an NAB conference session that Unreal Engine is more than 20 years old and on its fourth generation. Epic Games recently launched Unreal Engine 4.22, which introduces ray tracing and a host of virtual production tools, including multi-user collaborative editing features.
“It’s a super-powerful collaborative and filmmaking tool,” he said. “We hope to compress pre-vis, tech-vis, post-vis into one engine, so everyone can visualize in real time.”
Unity Technologies head of cinematics Adam Myhill reported that more people play Unity games than watch TV. “Computer game playing has reached such a massive scale that games are bound to influence movies and other entertainment.”
He observed that James Cameron’s Avatar had pioneered virtual production. “It wasn’t done with game engines, but was the beginning of using real-time systems to try out ideas in movies,” he said. “For directors, it’s like having a finger in every department, from lighting to animation and editing.”

Wednesday 10 April 2019

Behind the Scenes: The Virtues

IBC
The latest drama from This is England writer-director Shane Meadows was created in a semi-improvised fashion which required highly flexible production planning from indie producer Warp Films.
“We broke a lot of the rules for how you are supposed to make a TV drama,” says producer Mark Herbert about the production of Shane Meadows’ new film The Virtues.
“This is clearly authored by Shane, which means it has a huge heart, incredibly powerful performances and that mix of humour he finds in characters and situations, but this is very much a focus on one man rather than an ensemble piece,” Herbert explains.
The Virtues features Meadows’ trademark mix of humanity, revenge, tragedy and bittersweet humour familiar from 2004 crime thriller Dead Man’s Shoes and four series of This is England, the Bafta-winning study of British subculture. But it also represents something of a departure for Meadows.
Stephen Graham (who played Combo in This is England and also features in Martin Scorsese’s upcoming The Irishman) is Joseph, a moral yet troubled man with repressed memories who travels to Ireland to confront demons from a childhood spent in the care system. Helen Behan (also from This Is England), plays Anna, the sister Joseph hasn’t seen since they were separated as children. Frank Laverty (Michael Collins) is her husband.
“It’s a story that Shane has wanted to tell for a while, even before This is England began. It’s all about timing – is there the right actor and right time to tell this story?”
With regular writing partner Jack Thorne, Meadows wrote a ‘scriptment’ for all four episodes but then treated it as a jumping off point open to exploration with the actors.
“The script was a blueprint which was workshopped and changed in rehearsals with the actors,” Herbert explains.
What was unusual is that this process continued right through filming, with the production often stopping for hours, a few days and even three weeks at one point, in order for Meadows to work through a scene with the actors.
“Shane has to have a truth to everything he films and we both felt that this time around we’d break the conventions of the shoot and reach for that truth a different way.
“Normally for an hour-long drama you’d allot two weeks to shoot it and you block all the scenes together for each location, since that’s the intensive and expensive bit of the whole production. What tends to happen is that even if a scene is not quite working you come under pressure to keep to schedule, because you might only have the actors for a certain amount of time. With the way Shane wanted to work we needed to break that structure.”
They set up camp in a disused former school in Sheffield, not far from Warp Films’ offices. It was a space to which the actors could return and rehearse at any time.
Editor Matthew Gray (The Stone Roses: Made of Stone) set up his Avid in the same building so that Meadows could quickly see how things were progressing in the edit.
“It’s not like it was the expense of being in the middle of Soho,” Herbert says. “The rent was peanuts. It was perfect to give Shane the creative space to take a break from filming if he needed.”
Commissioning broadcaster Channel 4 was on side with this, based on its previous relationship with Meadows, including This Is England. “They trusted Shane and knew that we’d deliver on budget and schedule.”
It helped that The Virtues is a contemporary drama. Trying to do this with period sets which may require street dressing or props hired for only limited windows would have been impractical.
Instead of the classic ten weeks preparation, the show had the luxury of five months. Only when they were ready to shoot were technical crew brought onboard.
“Sometimes we stopped for a few hours and cracked it before going to shoot it, other times we had weeks of break. That was stressful for myself and the production team but everyone signed up to the process. I talked with the crew, many of whom we’ve worked with before, and explained how we were going to work from day one and gave them a deal that encapsulated the whole process.”
Scenes were lengthened and shortened, new dialogue was written, character storylines were extended, and until the start of episode four they had no conclusion to the drama.
Director of photography Nick Gillespie (camera assist or b-camera op on Kill List, Sightseers and Stan and Ollie) describes the process as “very organic – a really freeing way to work to tell the story” adding that the director’s approach is like “documenting drama.”
He explains, “From my point of view I was trying to be a little bit invisible, to let the actor’s performance dictate where the camera should be. Rather than locking down focus or blocking a series of shots I wanted to remain flexible to find the moments Shane was looking for.”
To do this Gillespie used multiple cameras, sometimes up to seven shooting simultaneously “to give everyone an angle”.
A and B cameras were handheld ARRI Alexa Minis fitted with lightweight ARRI Alura zooms, to which he added locked-off cameras such as Sony A7S models with Zeiss fixed speed glass.
A series of flashbacks to the 1980s which pepper the narrative were filmed on VHS tape as a cost-effective alternative to 16mm.
“I had used a Panasonic video recorder when I was a kid so when Shane came to the set with one I wasn’t scared to use it,” Gillespie says. “It’s just a point and shoot with old vintage zooms but it has a nice look to it. We attached a mobile phone to the recorder so we could monitor what it was shooting from a rough point of view and just in case the VHS didn’t work.”

Most interiors were shot in Sheffield and Chesterfield, with location work in Belfast, Birkenhead and on the Liverpool to Belfast ferry. Shooting was chronological in order to maintain flexibility with changing storylines.
The taped material was upscaled to HD, the mixed camera formats matched and the grade performed at Dirty Looks, supervised by Meadows.
Although the process is similar to the actor-led creation of much of director Mike Leigh’s work, the big difference here was the starting point of a complete script.
“We had a really solid basis for a story and rounded characters,” Herbert says. “Even for the secondary characters he has developed full backstories. When the actors came in they’d take this on board but bring their own personality and interpretation of the character with them, which is exactly what Shane wanted to develop. Shane and Jack had that worked out, but if something better came along they had the freedom to change it.
“You can’t go off-piste in this way unless you’ve nailed the basics,” he adds. “Shane doesn’t jump into a production unless he one hundred percent believes in it.”