Friday, 13 May 2016

Shot in the arm for HFR

Screen Daily May p63
Ang Lee's latest picture, shot at extreme frame rates, is challenging Hollywood to revise decades-old production and distribution models.
The first studio feature to be produced in a groundbreaking combination of 3D, 4K resolution and 120 frames per second (fps) projection is testing the limits of technology, but could shift the dial in everything from production to distribution.
So innovative is the format which director Ang Lee and Sony's TriStar Pictures have chosen for Billy Lynn's Long Halftime Walk that it cannot be played back in any conventional cinema. Even to get the movie made, a whole pipeline for post-production and mastering had to be developed and built specifically for the project, which has a modest $50m budget.
“The industry has been brainwashed into how to make movies,” said Lee, presenting an 11-minute preview at the SMPTE Future of Cinema Conference in Las Vegas. “Nothing has changed for such a long time. We are all dying for change so that we can go back to being a kid in the cinema again looking forward to something exciting.”
TriStar partnered with Film4 and Jeff Robinov's Studio 8 on the story, which centres on 19-year-old private Billy Lynn and his company, who survive a harrowing battle in Iraq and return to the US on a promotional tour, the centrepiece of which is an American football game.
“It was a perfect chance to test new media,” said Lee. “It’s all about experiencing what [war] veterans feel and people don’t understand. I thought if I can bring that sensation to the Dallas halftime show, that will be incredible and freak people out.”
“It's rare that a filmmaker will come to us with a technical challenge that will change a lot of different parts of the industry at once,” added Scott Barbour, VP production technology, Sony [speaking at SMPTE with Lee and the film's key creatives]. “Usually a film might push one aspect, like vfx, but this pushes boundaries lens to lens.”
Instead of shooting at 24fps, the production has ramped the speed up five times. Not only does this eliminate the strobing and motion blur which blight 3D presentations, it also delivers a hyper-realistic look which, for Lee, bonds the audience closer to the story.
“The motivation for using high frame rate (HFR) emerged when filming Life of Pi,” explained Lee, the Oscar-winning director of the 2012 film. “In scenes where the raft is bouncing around on the water you couldn't see the actor's performance because there's too much motion blur on his face. At 60fps, and more so with 120fps, the viewer has a more natural, spiritual connection to the story.”
While Lee stresses the need to bend technology to his vision in adapting Ben Fountain's novel, the use of HFR and 4K plays into a wider movement to entice audiences with premium cinema experiences. This includes introducing wider colour and contrast ranges (high dynamic range/HDR) and laser projection which ups screen brightness to further enhance visual clarity.
“We are at the beginning of finding out what digital cinema means,” said Lee. “It means more realism, greater detail, higher resolution and proper brightness. This is not yet a commercial application but 120fps is quite revolutionary from what we are all used to seeing.”
Richard Welsh, CEO, Sundog Toolkit, whose software systems were used to manage the project's data, suggests that audiences will find such extreme frame rates a big change in their cinema experience. “From the exhibitor side it needs to be part of a whole package of image enhancements including resolution, HDR and HFR to really give the audience something they will notice and come back for.”
However, the film's native specification will likely have very limited availability on release in November since the format exceeds the capacity of existing DCI-compliant projection equipment.
The experimental system used to preview clips at SMPTE is designed for theme parks. It paired two Christie Mirage 4K projectors together with servers from 7th Sense. 
The installed base of Series 2 digital projectors capable of showing any form of HFR content is also hard to quantify, according to analysts IHS. Estimates range from as low as the 3,000 screens which upgraded to show The Hobbit in 2012 to as many as 60,000 screens, or 40% of all DCI-compliant screens, which have upgraded since.
Systems from vendors NEC, Christie, Barco and Sony will be able to play back either 120fps 2D at 2K or 60fps 3D at 2K, according to Ben Gervais, the film's production systems supervisor. If two projectors are used then 120fps 3D at 2K is also possible, he said. Even at 60 frames it would be the highest frame rate ever seen in a major release.
The Dolby Vision projection system, which deploys dual Christie 4K laser projectors and Dolby's proprietary HDR technology, is also capable of playing back 120fps 3D 2K. There are 22 Dolby Vision projectors installed worldwide (18 of them in the US), with Chinese cinema company Wanda on track to build 100 sites in China by 2020.
Texas Instruments is reportedly developing technology to upgrade projectors to play the format, and work is being done on more efficient compression algorithms to cut data rates without damaging overall image quality.
“The bottom line is that we will need adoption of these better encoders and upgrades to the projection systems to get to the point where 120fps 3D 4K can be distributed as a playable DCP,” says Welsh.
Hindering Billy Lynn's commercial prospects further is the negative perception that many cinema-goers and exhibitors have of HFR. This stems from the mixed critical reception to Peter Jackson's use of 48fps for The Hobbit.
“The Hobbit has damaged the HFR brand,” says David Hancock, director, head of film and cinema, IHS Technology. “Exhibitors were disappointed that there was not a continuation in the availability of HFR releases. Many will be hesitant about the Ang Lee film, questioning whether this will be a one-off or whether there will be more HFR movies in the pipe to justify investment.”
“The industry failed to join forces to communicate HFR as a positive new way of presenting movies,” agrees Sony's sales director digital cinema Europe, Oliver Pasch. “Ang Lee is trying to show what is possible and is helping put HFR back on the agenda. Technically, any combination of technology including 120fps in 4K can be built into a projector. The question is whether audiences want it and who will pay for it.”
“Conversations with exhibitors about HFR are starting up again,” reports Tom Bert, Barco's senior product manager, digital cinema. “We need to rethink how to get HFR into the market because it has been overtaken by other attributes like HDR, laser and 4D cinema.”
Those closer to Billy Lynn's production are promoting the format's potential to give more control to both filmmakers and distributors by extracting multiple versions for all platforms from cinema to iPhones.
“When you shoot in this manner it allows you a lot of different options for exhibition which could change the entire industry,” claims Barbour. “We could deliver in 24, 30, 48, 60 or 120fps in a variety of resolutions.”
The sheer amount of data recorded on-set, ten times more than a conventional 4K film, gives the production greater latitude to tinker with the image in post. 
“Instead of throwing away information from 120fps to get to 24fps we are frame blending to give us a crisper image than if we had originally shot at 24fps,” says Gervais. “We have the ability to change shutter angle in post so we can add motion blur to just one part of the frame or just one character and leave the rest of it sharp. This doesn't involve the use of expensive visual effects. It means we can iterate new creative choices very quickly.” 
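As a rough illustration of the frame-blending Gervais describes, here is a minimal Python sketch (it assumes the 120fps frames arrive as NumPy arrays; the five-to-one averaging and the masked blend are illustrative, not the production's actual pipeline):

    import numpy as np

    def blend_to_24fps(frames_120, shutter_frames=5):
        # average groups of five 120fps frames into one 24fps frame;
        # fewer frames per group simulates a narrower shutter angle,
        # so the result is crisper than a native 24fps exposure
        out = []
        for i in range(0, len(frames_120) - 4, 5):
            out.append(np.mean(frames_120[i:i + shutter_frames], axis=0))
        return out

    def selective_blur(sharp, blurred, mask):
        # add motion blur only where mask=1 (say, one character),
        # keeping the rest of the frame at full 120fps sharpness
        return mask * blurred + (1 - mask) * sharp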
The same technique will allow Lee to select frame-rates for different scenes within the finished film using the same raw material.
“It's a science project,” admitted editor Tim Squyres. He cut the picture in 3D, at 60fps using beta software from Avid, projected onto a 12-foot screen to reproduce the theatrical experience as closely as possible. “We can make some scenes look more normal – like they were shot at 24fps in the context of a film where other scenes are at 120fps. It allows us to create many different delivery formats so Sony doesn't have to decide yet how to deliver it.”
Lee says he is anxious about the audience's reaction to Billy Lynn. “It was a long, difficult uphill road. Focus pulling, lighting, performance and make-up are all different. It’s very complicated, terrifying and exciting at the same time.”
Given his award-winning pedigree, the film is a likely Oscar candidate. Its success would give Lee the confidence to return to the Muhammad Ali boxing project he abandoned for Billy Lynn; he felt the proposed 24fps 3D production wouldn't have sufficient clarity to reveal the micro-expressions of an actor's performance during fast action.
High frame rates are not confined to cinema. Computer games are rendered at over 100fps, virtual reality experiences also benefit from the smoother motion, and broadcasters plan to produce content at frame rates of 60fps and higher for 4K TV sets.
While a wartime drama may be too niche to lift HFR out of the shadows, director James Cameron is on track to deliver the first of four Avatar sequels at higher frame rates from 2017. It could be the franchise to which proponents of the technology look to reignite mainstream exhibition.


Thursday, 12 May 2016

Computational cinematography: Light field imaging is here

TV Technology 
Advances in audio technology have resulted in infinitely flexible object-oriented sound. Could Light Field Imaging usher in an era of object-oriented video?
http://www.tvtechnologyeurope.com/acquisition/computational-cinematography-light-field-imaging-is-here/01239
Instead of recording a flat picture, what if we could capture all the light falling on the camera? And if we could do that, could we then generate a perspective from any position? And possibly even display it as a three-dimensional hologram?
That's the theory behind light field imaging, which has potentially revolutionary consequences for visual storytelling. Recent advances in processing power and sensor technology have made the technology appealing to electronics giants like Microsoft and august cinema engineering bodies like SMPTE.
A light field – a concept originally proposed in 1846 by Michael Faraday – describes the amount of light travelling in every direction through every point in a region of space. It is technically five-dimensional: three spatial dimensions (x, y, z) plus two angular dimensions describing the direction of each ray. To capture a light field you typically either array cameras to simultaneously record different angles of the same scene, or place a micro-lens array in front of a conventional sensor to funnel information about intensity, direction and colour.
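To make that parameterisation concrete, here is a minimal Python sketch of the classic shift-and-add refocusing trick on a light field captured by a camera grid (the geometry and names are illustrative, not any vendor's actual pipeline): summing the views, each shifted in proportion to its position in the array, synthesises focus at a chosen depth.

    import numpy as np

    def refocus(views, slope):
        # views: dict mapping (u, v) grid offsets from the array centre
        #        (e.g. a 4x4 rig) to HxW images
        # slope: pixels of shift per unit of camera spacing; each value
        #        brings a different depth plane into focus
        acc = None
        for (u, v), img in views.items():
            # shift each view in proportion to its offset from the centre
            shifted = np.roll(img, (int(round(slope * v)), int(round(slope * u))), axis=(0, 1))
            acc = shifted if acc is None else acc + shifted
        return acc / len(views)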
At present, there is no way of post-producing the sheer volume of data produced, or of displaying it, but that doesn't mean there aren't useful applications for the technology around the corner.
Researchers at German institute Fraunhofer IIS, for example, have developed a system comprising 16 HD cameras arranged in a 4x4 grid. Last September, it released a plug-in for Nuke as an aid to processing the data and shot a short film, Coming Home, with Stuttgart Media University, which showcased the technique's capabilities for live action filming. The plug-in can be downloaded from the Fraunhofer website.
The chief advantage, Fraunhofer contends, is that light field imaging will offer a more cost effective way to produce film and TV.
“On-location retakes are time-consuming and expensive,” says Frederik Zilly, head of Fraunhofer's Computational Imaging group. “What if the focus was incorrectly set during shooting or the perspective has to be changed? The use of multicamera systems opens the door to a world of new post-production possibilities.”
Among the possibilities are dolly-zooms, Vertigo and Matrix camera tricks which could be rendered out of existing material in the cutting room. “Expensive effects, previously the preserve of cinema, can be brought to TV with light-field recording,” Zilly says.
Also known as computational cinematography, the idea is anathema to most cinematographers. If all the important camera parameters, such as position, viewing angle, depth of field, aperture and exposure, can be determined in post, there are big questions about where this leaves the DP's craft.
“Cinematographers will worry that light fields take away one of their primary tools – composition - because the viewer can move around the space, and see things from different perspectives,” says Ryan Damm, founder and light field systems developer of Visby Camera. “On the other hand, this opens up lots of new creative possibilities, but completely changes the creative toolkit.”
The main driver of interest in light field today is its potential application in virtual reality. Most current VR systems position multiple lenses in a sphere then stitch the resulting images together. Despite some tweaking in software, this approach arguably lacks the subtleties of parallax which allow a VR viewer positional tracking – to move their head side to side, forward and back, look straight up or down – without the illusion breaking. In theory, light field-captured 360-degree video would create a more genuine sense of presence and freedom of movement for live video, which is only possible today in CG VR experiences.
“Cameras shooting 360-video can't use position tracking to synthesise a single perspective,” says Damm. “That is VR video using existing standards, rendered using game engines, and that model won't work.”
Lytro, the Californian maker of the first consumer light field still cameras, announced Lytro Immerge last November and plans to launch it at NAB. Immerge consists of a five-ring globe that captures what Lytro calls “light-field volume”, dedicated servers for storage and processing, an editor for integrating data with NLEs and compositors, and a video playback engine.
Earlier this month Lytro announced the Lytro Cinema camera (pictured below), which it claims is the first system able to produce a light field master by capturing and processing the entire high-resolution light field. Captured data can be rendered into multiple formats, including IMAX, RealD, Dolby Vision, ACES and HFR. The camera features a 755-megapixel RAW sensor which can capture images at up to 300fps with 16 stops of dynamic range. The company calls it “the highest resolution video sensor ever designed”.
“Everybody is talking about light fields and nobody fully understands the potential yet,” said Aaron Koblin, co-founder and CTO of VR production outfit Vrse which helped develop Immerge. “We’re just waiting for the moment when we have the tools. I think both the capture and playback of light fields will be the future of cinematic virtual reality."
VR headsets (Oculus, HTC Vive) and augmented reality systems (Meta, Microsoft HoloLens – both in closed beta) are the only means to display light fields at present. In the pipeline are holographic screens, such as that in development at Leia3D, with Samsung among the tech giants to have filed similar patents.

None of these displays is capable of showing live action video, though that may change with the release of Immerge. The bigger challenge is creating a camera with enough fidelity that it may be better termed a holographic video camera. 
“With a micro-lens approach you end up with an effective resolution equal to the number of micro-lenses,” says Christian Perwass, founder, Raytrix. “Even with a 40 megapixel camera, with 20,000 micro-lenses you will only end up with 20,000 pixels. The higher the effective resolution, the shallower your depth of field becomes, which means you can't take advantage of all the different views.”
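Perwass's arithmetic is easy to check. A back-of-envelope sketch in Python (the figures are those he quotes; the spatial/angular framing is the standard way plenoptic cameras are described):

    sensor_pixels = 40_000_000          # 40-megapixel sensor behind the lenslets
    micro_lenses = 20_000               # one output pixel per micro-lens

    spatial_resolution = micro_lenses                   # 20,000 effective pixels
    rays_per_lenslet = sensor_pixels // micro_lenses    # ~2,000 directional samples each

    # the sensor budget is split between where light arrives (spatial)
    # and which direction it came from (angular); raising one lowers the other
    print(spatial_resolution, rays_per_lenslet)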
Raytrix, a German company selling precision measuring instruments for industrial work, has effected a compromise by deploying a micro-lens array with three different focal lengths. Based on a 42-megapixel sensor, its R42 camera (pictured below) offers an effective resolution of 10 megapixels at 7fps.
Perwass believes existing light field systems are limited by the laws of physics. “They are workable with close-up subjects like a face but if you want to extract depth information for scenes 10-20 metres away you might as well use standard stereo 3D cameras,” he says.
There is a third way, using traditional optics: film a scene with multiple arrays of micro-lens imagers, or with higher-resolution sensors – or ideally a combination of both. Phase One released a 100MP stills camera in January, while Canon is developing one with 120MP and even has a prototype 250MP chip. However, this only shunts the problem down the line.
How much data does a hologram require, exactly? Damm, presenting on the topic for SMPTE at NAB, has done the math. A rough approximation: for a two square metre surface you would need about 500 gigapixels of raw light field data per frame, taking up more than a terabyte. At 60 frames per second that's about 400 petabytes per hour. “That equals a whole lotta hard drives,” he says. “People are cutting various corners to try to make it work, but it's a hard problem.”
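Damm's numbers can be reproduced in a few lines (the bytes-per-pixel figure is my assumption; the article only gives the rounded totals):

    pixels_per_frame = 500e9        # ~500 gigapixels of raw light field data
    bytes_per_pixel = 4             # assumed 32 bits per ray sample

    frame_bytes = pixels_per_frame * bytes_per_pixel    # 2e12, i.e. ~2TB per frame
    hour_bytes = frame_bytes * 60 * 3600                # at 60 frames per second

    print(frame_bytes / 1e12, "TB per frame")           # "more than a terabyte"
    print(hour_bytes / 1e15, "PB per hour")             # ~432PB, the order Damm quotes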
Visby, Damm's company, has a light field codec in development but doesn't plan on releasing anything until next year at the earliest. “In the near term we are able to capture light fields and collapse all the data down to conventional two dimensions for manipulation in post,” says Simon Robinson, chief scientist at The Foundry. “Imagine looking out of a window in your home. Now imagine that as a holographic picture. That is where we are headed in the longer term.”

Wednesday, 11 May 2016

Deep storage: How to save your data for a billion years

TV Technology Europe


Fresh breakthroughs in tape, disc, film and glass herald a new era of eternal data archiving. If employed intelligently, there’s no reason we won’t be able to preserve Keeping Up With the Kardashians for our great-great-grandchildren. Adrian Pennington reports.


Many in the industry are concerned about how to store their data over the next year or two. But how do we preserve our data for the next decade? Or the next century? Or beyond?

In Egypt, around 196 BC, someone carved an honours list in three languages onto a slab of granodiorite. The mundane text was rediscovered in 1799 and finally deciphered to provide the essential key to the modern understanding of ancient Egyptian civilization. The Rosetta Stone is the perfect database. It has physically lasted for more than two millennia and its information can be read without any new technology. If only the quest to find an archive solution for digital media were as simple.

The world is overflowing with digital data, and the digital universe is doubling every two years according to IDC. A share of this digital universe has value in the long term, so what are the options?

In the digital age, tape has proved surprisingly durable. Anyone who has seen the film The Big Easy will know how easy it is to put a magnet next to a tape and erase its contents. Tape is subject to degradation and bit dropout over time, and while industry-standard LTO gets around this by recording data without the revolving head drum used on video tape, the system needs manual intervention every few years to migrate the data stored on it to the latest generation.

The new generation of LTO-7 tape, manufactured by Fujifilm, uses barium ferrite, a medium whose magnetic properties do not deteriorate over time and which gives drive heads a longer lifespan. Capacity has also jumped, from 2.5TB to 6TB.

“It's like a whole new format,” says Fuji's commercial manager Richard Alderson. “Nothing has been done like this in the past and we are the only manufacturer who can provide gen-7 tape.” That matters all the more given the move to UHD.

“A single movie at 4K can need over a petabyte and as the data sets get bigger, customers are realising that tape is far safer and more reliable than disc as a storage medium,” explains David McKenzie, storage and archive specialist, Oracle.

Oracle's StorageTek division is readying a new enterprise version of its tape drive, the T10K, for release in early 2017, with a capacity of 10-15TB. In addition, Oracle is working with the team and the Diva technology from Front Porch Digital, the firm it acquired in September 2014. Meanwhile LTO-8, with a projected 12.8TB capacity and 427MBps transfer speed, is expected in three years.

“Tape is far from dead. In fact it is a lot cheaper than disc. It is more environmentally friendly and, most important, it is far less corruptible. It's the reason why broadcasters like the BBC and Sky choose to archive their programme catalogues on it,” adds McKenzie.

30 year optical disc
The main alternative to LTO is optical disc which, as McKenzie alludes to, can drain power in order to keep mechanisms cool. Earlier this year, Sony and Panasonic launched new optical disc-based storage systems for data centres. Sony's Everspan can store 181 petabytes for 100 years, and four systems can be ganged together to offer 724PB of total storage. To grasp that figure: if you envision one bit of data as the equivalent of one second, then 1PB would equal 285 million years.
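
That comparison holds if the petabyte is counted in binary units; a quick sanity check in Python (the check is mine, not Sony's):

    bits_in_pb = 2**50 * 8                       # binary petabyte expressed in bits
    seconds_per_year = 365.25 * 24 * 3600

    # at one bit per second, how long does 1PB of seconds last?
    print(bits_in_pb / seconds_per_year / 1e6)   # ~285 million years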

Sony says Everspan is able to transfer 18GB of data per second, “outpacing the best performance of tape libraries and archival drive platforms. Because of the durability of optical discs, unlike other storage media, users are expected to never need to migrate data.”

The initiative is led by Frank Frankovsky whose start-up company Optical Archive was acquired by Sony last year. Previously, Frankovsky led a project for Facebook to store the social network's burgeoning data and helped Panasonic develop something along similar lines called freeze-ray. It seems that Facebook is hedging its bets by deploying both Sony and Panasonic variants of Frankovsky's system.
Frankovsky says the goal is to make it possible for customers to store everything for as long as they wish in a low-touch, low-cost optical library. “We’re finally bringing a product to market that will make tape an obsolete technology,” he says.

The Everspan media developed by Panasonic and Sony is the same as that used in the next version of Sony's Optical Disc Archive (ODA), unveiled at NAB 2016. A single cartridge has doubled in capacity to 3.3TB. ODA is designed for use in near-line applications, deep archive storage or disaster recovery systems. Hardware configurations range from stand-alone units to large, scalable robotic archive systems. The main components of ODA Generation 2 are a stand-alone USB drive unit (ODS-D280U), an 8Gb Fibre Channel library drive unit (ODS-D280F) for use in robotic systems, and the Optical Disc Archive media cartridge (ODC3300R).

100 year metal alloy tape
While LTO tape has a lifespan of 30 years, DOTS (Digital Optical Technology System) stores digital data onto metal alloy tape and is claimed to be archival for 100 years. Originated by Kodak and developed since 2008 by Group 47, the technology's software converts a digital file into a visual representation of the data. With sufficient magnification, one can actually see the digital information.

Its specification – the 'Rosetta Leader' – calls for microfiche-scale human-readable text at the beginning of each tape, with instructions on how the data is encoded and on how to construct a reader (it even resembles the Rosetta Stone – see image). Because the information is visible, as long as cameras and imaging devices are available, the information will always be recoverable, the company says.


500 year film
However, the only technology which has proven it can last a century is film. What's more, it has the valuable benefit of easy reading, simply by shining a light through the negative. Yet celluloid is fragile, some types are notoriously flammable and it's expensive, even though the bulk of film stock made by Kodak, and of 35mm scans made from the material, is now destined for the archive market.

With Fraunhofer and Norner, Norway's Piql has devised a way to use the preservation qualities of photosensitive film combined with the accessibility of being part of a standard IT infrastructure. Its turnkey solution includes all equipment and processes needed for writing, storing and retrieving files and is claimed to last 500 years. A high-precision piqlWriter records digital files and related metadata onto photosensitive film. Checksums are applied to verify the integrity of the data. Forward Error Correction is used for controlling errors, making it possible to fully retrieve even damaged or corrupted data.
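
The integrity layer Piql describes pairs two standard ideas: a checksum to detect corruption and an erasure code to repair it. A toy Python illustration (simple XOR parity standing in for real forward error correction, which in practice would be a scheme such as Reed-Solomon):

    import hashlib

    def checksum(data: bytes) -> str:
        # fingerprint the payload so any corruption is detectable on retrieval
        return hashlib.sha256(data).hexdigest()

    def add_parity(blocks):
        # toy erasure code: one XOR parity block protects a group of equal-size blocks
        parity = bytes(len(blocks[0]))
        for block in blocks:
            parity = bytes(a ^ b for a, b in zip(parity, block))
        return blocks + [parity]

    def recover(blocks, lost_index):
        # rebuild any single missing block by XOR-ing all the survivors
        rebuilt = bytes(len(blocks[0]))
        for i, block in enumerate(blocks):
            if i != lost_index:
                rebuilt = bytes(a ^ b for a, b in zip(rebuilt, block))
        return rebuilt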

“Both digital and visual storage of data is possible,” according to the company. “This means users can select between storing data in computer-readable digital format (binary codes), or as text or images. It can even combine the two, allowing users to get visual previews of the data.”

It provides a self-documenting preservation master containing all information needed for decoding and understanding the preserved data. The source code of the decoding software is open and written in text format on the reel.


A billion years and more
Scientists at the University of Southampton have gone much further. Using glass, the university's Optoelectronics Research Centre has developed recording and retrieval processes for five-dimensional (5D) data, which is calculated to survive for billions of years.

The glass isn't the common-or-garden double-glazed variety. The data is recorded via an unbelievably fast laser, with pulses of light lasting 280 quadrillionths of a second fired onto self-assembled nanostructures created in discs of fused quartz.

A file is written in three layers of nanostructured dots separated by five micrometres (a micrometre is one millionth of a metre) and in five dimensions: the size and orientation of the nanostructures in addition to their three-dimensional position.
It sounds like science fiction and has already been christened the ‘Superman memory crystal’, yet Hitachi announced a similar etched-glass data storage solution as early as 2012.

Cultural heritage documents like the Bible and Magna Carta have already been fused in 5D (see image, top) and the team are looking for partners to commercialise the technology.

The medium is thermally stable up to 1,000°C, has a virtually unlimited lifetime at room temperature, and can be read by a combination of an optical microscope and a polariser, similar to that found in Polaroid sunglasses. Just don't drop it.

Tuesday, 10 May 2016

Brazil's AV sector feels the pressure

AV Magazine

Brazil's economy is in a parlous state as the world prepares to celebrate the Rio Olympics, but there are a few signs of hope on the horizon.

This should have been a banner year for Brazil, the B in BRICS for fast-moving emerging economies, with the world’s attention turning to Rio de Janeiro, host of South America’s first Olympic Games. Yet the country is teetering on the brink of economic and political disaster, wracked by financial scandal and looming impeachment proceedings against its president.
By the end of 2016 Brazil’s economy may be eight per cent smaller than it was in the first quarter of 2014, when it last saw growth. GDP per person could be down by a fifth since its peak in 2010 – not as bad as the situation in Greece, but not far off.
How Brazil got itself into this state and how it might get itself out (making hard decisions about pensions) is outside the scope of AV, but the context has unavoidable consequences for any company seeking business in the country.
“In 2014 at the time of the soccer World Cup the economic and political outlook was favourable, with forecasts for economic and social growth,” says Jose Fonseca, commercial director at Savana, Clear-Com’s local distributor. “The recent problems have placed the advertising market in a very different condition with depressed revenues and investments.”
Daktronics opened an office in São Paulo in 2012 to capitalise on the pending World Cup and Olympics and expected demand for airport advertising, DOOH and government infrastructure spend. “Leading up to the World Cup the economy was very prosperous and Daktronics was awarded four stadium projects, two of which were used for the World Cup,” reports international market manager Ben Aesoph. Since then the Brazilian real has declined massively against the US dollar.
Exchange rate woes
In March 2011 the exchange rate was US$1 to R$1.60. This March the rate was US$1 to R$4. “This exchange makes it very difficult for foreign companies to sell products in Brazil,” says Aesoph. “We are seeing fewer opportunities in all segments of our business in Brazil. The OOH market has the best prospects there, but is certainly not where it was in 2011-2013.”
The legacy impact of global events on local AV is debatable since most of the equipment is temporarily imported and deployed by non-Brazilian companies. “Local companies are not investing too much in new AV equipment,” notes Peter Lindquist, CEO at KJPL Arbyte, a Dataton Premium Partner in Brazil. Citing projector imports down by 40 per cent, he adds: “The economic and political situation has a direct influence on the AV market and makes it difficult to do any long-term planning.”
German national Hans Ulmer founded Absolut Technologies in 1998 and says the industry is in the worst state he’s seen. “It is difficult to close business. Everyone is holding budgets because we don’t know the outcome of things like the impeachment process, or whether the exchange rate is going up and down.”
Much of the upgrades in consumer and professional equipment, for example in displays and the national broadcasting and telecoms infrastructure, occurred prior to 2014.
“Sales of displays were high during the World Cup since soccer is the national passion but with people losing their jobs or fearful of losing their job the Olympic Games will not have as much impact,” suggests Carlos Bellei de Siqueira of NEC Latin America. “This holds for professional large-format displays also. The population is saving money.”
According to InfoComm’s 2014 Global Market Definition and Strategy Study, Brazil accounts for about 30 per cent of the Latin American AV market, which is estimated to grow to roughly $6 billion in 2016, up from $3.8 billion in 2012. InfoComm estimates that the market has been growing at around 13 per cent a year – faster than in many other parts of the world.
Now, says InfoComm’s Rodrigo Casassus Coke, CTS, senior director of development for Latin America, Brazil’s economic crisis has led to a freezing or downsizing of projects, especially in the integration and event markets.
Not all is gloomy, though, if only because, as BroadSign’s director of marketing and business development Stephanie Gutnik puts it, Brazilians are a resilient and positive people. Not for nothing is this the home of carnival.
“Even though the prospects are not fantastic, we believe in a couple of years’ time it may be back to normal,” she predicts.
The longer term prognosis may actually indicate growth. “Brazil’s is actually a great economy. The current economic situation is mainly due to political mismanagement,” says InfoComm’s Coke. “Companies and entrepreneurs will be eager to invest once the situation has been resolved, which some say could happen within the next 24 months.”
So what should AV firms be preparing for today? According to Coke the Brazilian AV market is sometimes characterised by the idea that less planning goes into projects than in other countries. “Especially in the rental and staging area, there’s a certain degree of informality to the business,” he says. “Brazil shares with the rest of the Latin American AV market the fact that AV buyers do not usually consider AV a major purchase and tend to cut costs due to a lack of understanding about AV technology.”
Low-cost preferred
Pricing is the number one purchase decision criterion for pro-AV in Brazil. Low-cost solutions are preferred and Chinese pro-AV products are readily available. According to InfoComm specific products or brands are not a major consideration in the overall solution.
“The market is very sensitive to price,” confirms NEC’s de Siqueira. “The market wants displays and related solutions but you must match their expenditure capacity besides showing clearly the value for their money.”
Margins are squeezed further by notoriously high import taxes, levied on some products at 70 per cent on top of cost, insurance and freight. “If an international AV company is looking to do business here they have to deal with very strict import laws, especially when dealing in hardware,” says Gutnik.
For Ulmer, importation is a science. “You need financial and logistical skills,” he says. “It takes at least 30 days to get a product into the country and it will cost double by the time you’ve done so. If you have to send a faulty product back outside the country and you want to re-import it, this will take 60 days, so it’s advisable to keep spares and products here in case of any damage.”
Absolut Technologies, a visualisation and high-tech collaborative environment specialist and member of the Global Presence Alliance, has many years’ experience of this and says it can typically get product imported in 15 days. “We have an office in Miami where we consolidate all the hardware and ship once a week to Salvador or wherever we need it. It is very hard for anyone not familiar with how Brazilian laws and bureaucracy work to establish a business here. But once they do there is an appetite for AV innovation.”
The number of variations of invoices required for one job can be up to 450, says Ulmer. “You need invoices depending on the type of company you are selling to, a national invoice and others for a particular state (Brazil has 27 states). Instead of presenting it on Excel or Word you have to use official online government systems and they will immediately start charging taxes as soon as they receive it, so you’d better make sure your accountancy and planning is right.”
For this reason Fonseca reports that the biggest problem in Brazil is after-sales support: “Because of the difficulties with legislation and shipping of equipment for repair out of the country, it is close to fundamental to have local customer support.”
Nonetheless, Brazil is a vast country with 300 cities and 5,550 municipalities. São Paulo is the country’s financial centre replete with stock exchange (BOVESPA) and the HQs of the biggest banks. Google launched its latest digital entrepreneurship campus there in February, the first start-up community in South America. The state capitals, such as Brasília, Belo Horizonte, Recife, Florianópolis, Curitiba, and Porto Alegre “each has its own distinct market, large enough to sustain growing local companies in every segment of our industry,” says Coke.
In de Siqueira’s opinion the greatest beneficiaries of the current crisis are the tourist cities (Rio, Salvador, Florianópolis, Gramado, Búzios). “The exchange rate helps attract tourists, and also helps domestic tourism because the Brazilian currency versus foreign exchange in general does not favour Brazilians to travel outside the country.”
Partner locally first
Local contacts suggest that AV firms wishing to enter the market should start by partnering with a Brazilian company (probably in São Paulo). Once established, AV firms are in a better position to look for partners in other regions.
“In a continent where the language is predominantly Spanish, having an ability to converse in Portuguese is advantageous,” says Fonseca. “If, on one hand, this hinders integration, on the other hand it represents an enormous market for the production of national content and will favour those companies who hire local people.”
There is a shortage of skilled pro-AV experts in Brazil, something that TecnoMultimedia InfoComm Brazil, now in its second year, is intended to address. The skills shortage represents an opportunity for multinationals willing to invest in local labour development, training and education. “Manufacturers and SIs who have successfully expanded into the region know that this is what it takes,” says InfoComm.
“Quite often we have to find solutions within a short time-frame, so I would define our section of the market as more innovative,” says KJPL’s Lindquist who was involved in outfitting the Museum of the Portuguese Language in São Paulo and the Imperial Museum in Petropolis. “We have been involved extensively in the fixed installation market, such as museums, where the focus is always on finding a good solution, and not necessarily the latest technology.”
Culturally, there is a “curious contrast” between the willingness of Brazilians to be perceived as innovative people in almost everything they do “and the conservative business decisions that most business people make, especially regarding investment in technological equipment,” finds Osvaldo Toshimitsu, Daktronics’ regional manager.
Ulmer agrees that Brazilian AV “tends to copy everything from US or Europe” adding: “In technology terms we are one to two years behind. For example, everyone in Europe is talking about Microsoft Surface but Microsoft won’t release it here for another year. Brazilian AV companies are adopters of technology.”

Friday, 6 May 2016

Nations and Regions: post industry goes west

Broadcast
Along with Salford, Bristol is booming thanks to a busy BBC slate, leaving the likes of Liverpool, Leeds and Birmingham playing catch-up. Adrian Pennington reports

BIRMINGHAM

Birmingham has yet to recover from the switch of BBC factual programming out of the region and there is a feeling that investment in Manchester and Bristol has stalled any resurgence.
“It’s inexcusable that the region is so poorly represented by locally commissioned content,” says Neil Hillman, owner of The Audio Suite. “There is a huge mismatch between programming resources allocated to the region and the money that is raised in the area through the licence fee.”
Recent redundancies at indie Maverick haven’t helped, but Crow TV, the London facility that set up in the city three years ago to service Endemol, says indies are returning. “Producers are either setting up here because of talent or placing work on a regional basis,” says Crow director Victoria Finlay. The facility is currently working with 12 Yard and 7 Wonder (My Kitchen Rules) and Spun Gold (on a gardening project).
For Hillman, the high overheads of the traditional post model make no economic sense in the region. He provides a specialist online mixing service and offers feature film sound design to a much wider client base.
“We opened in Australia because of projected growth in this kind of work,” he says. “However, we’re still confident of growing business in the Midlands.” So much so that The Audio Suite is upgrading with a Fairlight Xynergi to accommodate Dolby Atmos mixes alongside a voiceover studio for ISDN sessions.
“The BBC’s slicing and dicing of Birmingham has not gone down well,” agrees Scott Ledbury, managing director at corporate and promo producer Slinky. “There isn’t really a post scene and many freelance TV crews have departed. However, the wider creative industries are thriving.”
Game developer Codemasters has relocated to The Custard Factory, which is now run by former Wimbledon Studios chief Piers Read, with hopes of providing a focal point for digital media.

BRISTOL 

The diversity of production in the West Country has proved a magnet for London facilities. VFX firms Coffee & TV and Nineteen twenty have facilities in Bristol, with smaller London offices acting as feeders to the more cost-efficient regional bases.
The Farm’s long-running work on Channel 4’s Deal Or No Deal at the Bottle Yard is currently on pause, but the group is opening a finishing shop opposite the BBC on Whiteladies Road.
 “We can open relatively small here, but have the whole might of London behind us,” says operations manager Duncan Armstrong. “It’s an opportunity to give Bristol jobs to Bristol people.” The Farm will target BBC work and has local indies such as Icon, Warehouse 51 and Keo West in its sights.
Evolutions bolstered its substantial presence in March by absorbing Big Bang Post’s assets, people and buildings, giving the group four city centre facilities. It continues to attract factual work from Love Productions and Dragonfly, as well as BBC NHU (Wild New Zealand). But managing director Simon Kanjee suggests there’s been a drop in BBC features commissions. “Like post everywhere, it’s in finishing, not offline, that the money lies.”
The majority of boutique Doghouse TV’s work is for BBC Bristol, including returning strands such as Gardeners’ World and Fake Or Fortune?. “We hope to work more with indies in future,” says business development manager Sarah Miller. Doghouse has seen no direct impact from London facilities poaching work. “If the demand for post does not decrease, we would hope that we remain unaffected – but this remains to be seen,” she adds.

LIVERPOOL

When the British Film Commission hosted studio execs, including representatives of HBO, on a tour of the north of England in March, their visit included Liverpool. The city is used for filming more than any other in the UK bar London, but post resource is scarce. That could change if plans to convert an industrial site a mile from the centre come to fruition.
Manchester developer Capital & Centric is reported to be spending £30m on turning a vacant building adjacent to the existing Wavertree Technology Park into a studio complex. Construction could start at the end of this year.
The company predicts that the facility could double the £20m annual revenue the city earns from location shoots within a couple of years and create more than 1,000 jobs. “It will have a big impact, but it’s got to be fit for purpose,” says Patrick Hall, head of post at indie LA Productions.
Lime Pictures managing director Claire Poyser, however, believes the city is no nearer to securing studio space. “The longer Liverpool doesn’t have a studio, the less chance there is to build a sustainable backdrop for media in the city,” she says.
Merseyside’s producers are typically resourceful. LA Productions handles DIT and dailies for film productions and puts its own drama productions, like Jimmy McGovern’s Reg and Moving On for BBC1, through in-house suites.
With 360 episodes a year of C4’s Hollyoaks, Lime’s 14 Avid suites and five dubbing suites are busy all year round. “There’s little reason to go outside of Childwall except for specialist finishing grades on series like The Evermoor Chronicles,” says Poyser. “We keep end-to-end production in-house for efficiency and economic reasons.”
Dubbing mixer Sam Auguste opened boutique Onomatopoeia in the city at the start of 2016 after freelancing in London. “It is less and less important that you are physically on site,” he says. “I was spending more time working from home, remote from the facility in London.”
Spying an opportunity to plug a gap in the north-west for low- to mid-budget feature post, Auguste picked up sound design for Hurricane Films’ trailer for A Quiet Passion and an animation for local indie Mocha. He is also looking to move into picture post support such as rushes transfer.

LEEDS 

VTR North had a traumatic end to 2015 as it restructured out of administration. “We’ve been fortunate to work with some faithful clients who have helped us back on our feet,” says managing director Spencer Bain.
The company specialises in audio, animation and VFX, with recent commercials for Bupa and Jet2 under its belt, but it’s a full-service house and completed the grade for True North’s Coastal Walks With My Dog for C4.
Since ITV transferred production outside the region a decade ago, Leeds has lacked a production base. True North retains all but specialist crafts in-house, but the region’s locations are popular for drama (Mammoth Screen’s Victoria; Left Bank’s DCI Banks).
“We tend to touch everything that comes up here, even if it’s rushes uploads,” says Chris Davey, head of operations at full-service house The Other Planet. “That said, since programme execs tend to be based in London, the final post disappears back south. Leeds doesn’t have a major studio for light entertainment, and CBeebies work stays in Manchester.”
Rollem Productions (BBC1’s In The Club) posts at The Other Planet, which has picked up factual shows such as Daisybeck Studios’ Channel 5 series The Yorkshire Vet.
At ADBS, owner Andrew Dobson says he has enough business to see him through the year. The firm mostly handles factual jobs such as Emergency Rescue Down Under and Canals: The Making Of A Nation. “We’ve lost some facilities in the region and gained some. Overall, I’d say the Leeds scene is small but thriving.”

MANCHESTER: NORTHERN POWERHOUSE


Bargain Shop Wars: post at Salford’s Core
MediaCityUK is generally applauded as a magnet for business, even if there are gripes about the volume of work that seeps out to indie facilities from anchor tenant the BBC.
In-house post for BBC North is managed by The Farm, while studio service provider Dock 10 is mid-way through a 10-year contract that guarantees a volume of post for shows like Match Of The Day and Dragons’ Den.
“Our main focus is not to be over-reliant on the BBC,” says Dock 10 head of post Paul Austin. “The aim is to become the ultimate one-stop-shop.”
Looking to widen its base, the facility struck a deal with Red Productions, which locked in dramas Happy Valley and The Five (filmed in Liverpool). It also set up a VR division and moved into short-form VFX by acquiring local outfit Edit 19.
“There’s a slow but sure move of facilities from the city centre to MediaCityUK,” suggests Brian Barnes, managing director at Manchester facility Sublime, which works closely with video agency Activideo on live-action and animated corporates.
The Salford hub boasts several established houses, including Flix (which has a link to the capital in partnership with Molinare) and Core (post on Crackit North’s Bargain Shop Wars), but others feel no need to move.
“The ad agencies are in town and it’s easier for talent to get here than Salford,” says David Jackson, managing director at 422 Manchester, which welcomes Caroline Aherne to narrate Gogglebox every Friday.
While ITV and the BBC have pulled out of other regions, Manchester remains a viable centre for production with a significant pool of crafts, from camera ops to make-up. It’s also good for location shoots. “You can close off half a dozen streets in a day, which you could never do in London,” says Jackson.
Phantom Post is the latest MediaCityUK recruit, albeit one set up by former Timeline North executive Eben Clancy within the same Blue Tower building. Since Phantom specialises in providing remote editing and finishing facilities, does location even matter?
“We would not have picked up the work we have if we’d started in London,” says Clancy. “You can get extremely good, relatively inexpensive office space here with fantastic connectivity.” The firm’s first project is the Potato-produced series Bear Grylls’ Survival School.
“The talent in the north-west is of high calibre and MediaCityUK is a big draw,” he adds. “That said, our model works for producers willing to break the mould of big post. Instead of having to sit in facilities for 10 weeks, remote production frees creatives to work in their own space.”

Wednesday, 4 May 2016

Data Is King

Digital TV Europe p18

http://media2.telecoms.com/e-books/DTVE/magazine/aprmay16/files/20.html

Data is the currency on which TV Everywhere players depend but service providers are facing organisational hurdles to managing the wealth of information at their disposal.


Data is everywhere but it's how you use it that counts. Many service providers are reportedly only scratching the surface of what is possible. Even where there's a will to interrogate data and effect rapid responses to consumer needs or business models, many multichannel networks face an uphill task to overcome legacy organisational barriers.

That's in stark contrast to pure-play streamers like Netflix or Amazon, which have built a business on integrated customer service and technical operations. Cross-correlation of data sources is part of their culture, spanning quality of service (QoS), marketing, advertising and content recommendation, and pulling in everything from remote control commands sent from the set-top box over the return path to server logs recording media player interactions.
With traditional broadcast TV networks there wasn't a huge need for a lot of data. The equation could be reduced to quality of content versus how many people watched. With pay TV the model barely shifted. If shows were packaged at a decent price the subscription rate went up. The need for data wasn't great since the variables didn't waver.

When TiVo introduced the concept of time-shifted consumption the cracks began to appear. OTT has taken this to another level.

“The internet opened a floodgate of consumer choice,” says Keith Zubchevich, chief strategy officer at OTT video optimisation specialist Conviva. “It handed control of the TV from networks to viewers. The data that is needed now is of a fundamentally massive order of difference compared to what has gone before.”

Data types

There are broadly three types of data: consumer viewing behaviour, programme metadata, and network performance statistics. The latter has been a factor in network capacity planning for some time but is also starting to be used in other areas, such as content acquisition.

“For a long time QoS monitored bit rates, how many sessions failed. Increasingly the data is about what devices are being used and how user behaviour differs device to device. How long are average sessions and when do they occur?” says Edgeware's VP products, Johan Bolin.

For instance, through insights gained from QoS analytics, a service provider can understand specific customer preferences and experiences, and tailor products more accurately. Understanding what services or content are more popular during certain times of day, in certain geographies or on certain devices can expose holes in content offerings. This data can then be used to adjust existing services, or to add or remove suppliers in the ecosystem to better support delivery goals.
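
In practice that kind of insight starts as plain aggregation over session logs. A minimal Python sketch (the record fields are invented for illustration):

    from collections import defaultdict

    # one record per playback session, collected from player telemetry
    sessions = [
        {"device": "smart_tv", "hour": 21, "title": "drama_ep1", "failed": False},
        {"device": "phone", "hour": 8, "title": "news_bulletin", "failed": True},
        # ...
    ]

    plays_by_device_hour = defaultdict(int)
    failures_by_device = defaultdict(int)
    for s in sessions:
        plays_by_device_hour[(s["device"], s["hour"])] += 1
        if s["failed"]:
            failures_by_device[s["device"]] += 1

    # e.g. phones dominating morning viewing hints at a short-form content gap,
    # while a high failure rate on one device flags a delivery problem
    print(max(plays_by_device_hour, key=plays_by_device_hour.get))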

“A lot of the focus is still on more technical use-cases but the ability to get data from end-customer devices is driving additional areas such as marketing, customer base management, customer care and service management,” says Per Unell, business development, Agama Technologies.

“Processes like fault detection and localisation, network optimization and change management are very much a reality,” Unell adds. “We also see significant use in service management such as SLA and overall service performance tracking. Some customers are also using data to support their customer care and customer understanding processes.
In general, it’s more straightforward to see quantifiable benefits in operational processes – fixing problems faster, solving customer issues at first call. At least as much value can be realized by systematically and proactively tracking down issues before they become problems, but it’s harder to quantify beforehand.”

Another component of consumption analytics is device-level data, which is increasingly important as the number of devices and form factors grows. “The power of this type of data is granularity,” says ABI Research in its white paper on the topic [http://www.conviva.com/conviva-whitepapers/abi-whitepaper/] published by Conviva.
It can be used to guide purchasing decisions for service providers, operators or content creators. Real-time data can help with quick monetisation decisions for unexpectedly popular content, and can guide future investments towards more popular content, states ABI Research. The popularity of live events can be judged more accurately, and licensing and delivery decisions centred around it.
“You’re beginning to look at being able to measure the way that people interact with a TV system in the same way that they interact with the internet,” says Andy Hooper, VP cloud, solutions & services, Arris.



Scratching the surface



“It is relatively early days for the use of big data by service providers,” says Peter Docherty, founder and CTO at recommendations engine provider ThinkAnalytics. “Data is already captured but is not being taken advantage of as much as it could be. The risk of not using data to drive the business is a lost opportunity. Let's say only 40% of your VOD catalogue has been watched. If you don't have data you won't know that, and if you don't have data about what is watched or being routinely declined then you can do nothing about it.”

It's not as if operators haven't recognised the need or that solutions are having no effect. ThinkAnalytics' research suggests that just a few months after integrating its recommendations engine, clients saw their subscribers increase their viewing time by 20-50%, and the number of channels watched rise by 25-35%.

Subscriber management platform PayWizard says it has delivered acquisition campaigns that drive conversions of up to 25%, and has also run churn-reduction programmes that achieve conversions of up to 60%.

The most recent figures released by Sky from its 500,000-home Sky Viewing Panel show that channel switching during Sky AdSmart commercials was 48% lower than for standard non-targeted ads – an effect consistent regardless of channel, household type and amount of viewing. “As viewers cannot distinguish AdSmart commercials from any others, higher viewing levels can only be attributed to customers finding them more interesting or engaging,” observed Jamie West, Sky Media’s deputy MD.

Pancrazio Auteri, CTO at content personalisation firm ContentWise explains some of the ways it uses data to assist customers like Maxdome, Mediaset and Sky Italia. The first is to boost churn prevention by detecting anomalies in behaviour.

“If patterns in behaviour diverge from established patterns this may mean a user is using a competitive service and can be an early detection of cancelled subscription,” says Auteri.
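
A simple version of the anomaly detection Auteri describes can be built on nothing more than each subscriber's own viewing baseline. A hedged Python sketch (the threshold and fields are invented; this is not ContentWise's actual algorithm):

    from statistics import mean, stdev

    def churn_risk(weekly_hours, z_threshold=-2.0):
        # flag a subscriber whose latest week falls far below their own
        # established pattern - an early-warning signal, not a verdict
        history, current = weekly_hours[:-1], weekly_hours[-1]
        if len(history) < 4:
            return False                    # not enough baseline to judge
        mu, sigma = mean(history), stdev(history)
        if sigma == 0:
            return current < mu             # flat history: any drop stands out
        return (current - mu) / sigma < z_threshold

    print(churn_risk([12, 10, 14, 11, 13, 2]))   # True: a sharp drop-off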

ContentWise data is also used to make service provider promotions more relevant. “If users are sent irrelevant items then any associated communication from that service provider will also be seen as irrelevant, so we match the promotion to user habits or tastes.” Doing so has seen a rise of between 20% and 40% in subscribers opening (viewing) a promotion, ContentWise claims.

“The same method can be applied to advertising. If a service provider is sending promotions of different advertisers we look at the profile of the user and lifestyle traits and try to narrow down the promotions while increasing their relevance.”

A third data usage, dubbed ARPU Rebuilder, will be offered as a module within the ContentWise Content Personalization system this fall. Targeting pay TV skinny bundles, it will include a set of algorithms designed to up-sell related micro-subscriptions, focusing on the free trial phase of a service and the first month of subscription to ensure that the new user understands the value of the content offer.

“Based on our research the obstacles to large-scale adoption of TV Everywhere are the lack of awareness from consumers that the content they are interested in is available, and the fragmentation of the applications, making it difficult for users to find and seamlessly consume this content,” says Auteri.

ContentWise uses viewability tracking to ensure that each user can see a different, uniquely personalized UI. “Given the fragmentation of the TVE applications, metrics and KPIs cannot be computed across all the applications owned by single content providers,” says Auteri. “The unified discovery provided by the pay TV operator UI is the best place to measure the user behaviour, across all touch points (i.e. screens, apps, devices).”

Subscriber management systems (SMS) developer PayWizard estimates that between 5% and 8% of a pay TV operator's opex is consumed by subscriber management activities, and that data from the SMS can have “a dramatic impact” on the business's “viability, competitiveness and profitability”.
“SMS are a natural collection point of statistical data regarding a pay TV operator's business processes and subscriber community,” says Bhavesh Vaghela, CMO. “This raw data, when given context and viewed against trends, allows marketers to develop and track sales initiatives. Questions like: 'Do our free month offers lead to full subscriptions?' or 'Which device is the most popular for viewing, so as to impact our application development strategy?' can be answered by a whole host of valuable insights uncovered through reporting and analyses. These answers can help solve short-term issues and improve the longer-term profitability of a pay TV business model.”
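
Questions like Vaghela's first one reduce to simple cohort queries over SMS records; a minimal Python sketch (the schema is invented for illustration):

    # one record per account from the subscriber management system
    accounts = [
        {"id": 1, "had_free_month": True, "converted_to_paid": True},
        {"id": 2, "had_free_month": True, "converted_to_paid": False},
        {"id": 3, "had_free_month": False, "converted_to_paid": True},
    ]

    trialists = [a for a in accounts if a["had_free_month"]]
    converted = [a for a in trialists if a["converted_to_paid"]]

    # conversion rate of free-month offers into full subscriptions
    print(len(converted) / len(trialists))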


Data consolidation

Vendors uniformly contend that from a technical (i.e. from their product) angle, big data collection and analysis is not the issue. The chief problem is the ability of service providers to handle it.

“We have all the tools and databases to perform analysis,” says Docherty. “The challenge is joining the data together from a business perspective. Service providers are trying to gather data from different parts of the business but have quite a way to go. There's a lot of emphasis on the customer acquisition side and on customer retention and churn reduction programmes, but not so much focus on the stages in between. For example, when you've got an active customer, how do you get them to spend more? That's largely down to not having data in one place that can serve a more personalised engagement with customers.”
“The main hurdles are about getting a wider understanding across the operator of what is possible and overcoming organisational 'stove-pipes' in access to data,” agrees Agama's Unell. “It’s about cherry-picking the most promising use-cases and overcoming the compartmentalisation of existing systems. If this can be done the business case is often very positive.”
Arris reports that a number of operators have built out their own big data teams, offering big data as an internal IT service, with mixed results. “This works well in organisations which understand the value that flows end to end from capture and analysis to taking action on big data insights,” says Hooper. “In other organisations, however, there are too many barriers to collecting and sharing data. Departmental teams tend to keep data within their own group. It's a classic big-organisation problem.”

An example, cited by Hooper, is a company that deployed a video monitoring solution into its multiscreen video apps for smartphone and tablet. “Using that solution they were getting a huge amount of data, yet the people in charge of the call centre had no view on when a customer session failed. Despite the operator paying a licence to this vendor, it had no oversight of the poor customer experience arising out of buffering video. Either through inefficiency or deliberate obstruction, that information was not being leveraged end to end across the business.”

As Hooper sees it, the customer's experience with a service provider now spans traditional data silos, from the broadband call centre to the TV call centre, to operational teams examining video player and session data, to digital marketing teams interrogating intent to purchase.

“This is not being done at anywhere near enough scale,” he says. “Customer experience management crosses organisational boundaries. Some service providers are addressing this by installing a chief digital officer or customer experience executive, even at board level, but they need to do more.”

Big data is an established IT discipline in many industries, but most telco or cable companies retain a legacy of network and operations teams separate from marketing and consumer-facing departments.

Hooper points to home network management as another area barely addressed by pay TV operators. “Troubleshooting here falls to the service provider, explicitly or implicitly,” he says. “It includes management of the home network, whether there's good home WiFi, whether the kids are moaning at Dad because he can't download game updates. All these things are part of the subscriber experience, and typically subscribers will end up talking to different bits of the service provider organisation when it should be one entire customer experience journey and one digital strategy, with big data enabled to deliver insight into this.”

Sharing data sets and personalising offers are hampered by limitations on content rights. “Targeted advertising is still in its early phases, with just a few operators using it in production,” says Unell. “It also requires additional infrastructure and generates privacy and personal integrity requirements on the solutions used.”

“Video professionals are struggling to adapt to new business models and must use data analytics to manage and grow their services,” concludes Sam Rosen, vp, consumer at ABI Research. “Leveraging a single, unified dataset for the needs of different functions in an organisation—and opening up the avenues for data sharing between affiliates jointly responsible for a video service—can help align everyone on a common definition of success.”

Edgeware's Bolin believes a single repository is desirable, and perhaps possible, but not soon. “Given that there are so many different systems and sources of data, it would be a challenge to have a centralised data store that is continuously updated and compliant, but as the market matures and business intelligence advances I wouldn't rule it out.”

“You can spend more time crunching data than analysing it,” agrees Zubchevich. “I don't think we'll get to a unified single data service. Publishers and pay TV operators have to break data into chunks and look for key providers of, for example, purchase data, advertising and content recommendation.” Naturally, anything to do with the playback subscriber experience should be measured by Conviva, he says.

“Failure to do so means churn and subscriber loss,” he adds. “The simple fact is that consumers are polling publishers online. The consumer will terminate their relationship with a network if that service provider is not proactive.”

Insight into ad fatigue

Buffering and delivery analysis should also be extended to ads. There is evidence from Conviva that as much as 58% of viewer churn stems from poor online ad experiences.

“The impact on a viewer's experience from ads is massive,” says Zubchevich. “If I watch a show and it's riddled with ads I'm getting ad fatigue and I'm beginning to look for content elsewhere. The fundamental next step for service providers is to monitor ad impact.”

Not coincidentally, Conviva is launching an Ad Insight product which essentially expands the capabilities of its video playback experience monitoring.

“Service providers could do more in this area,” agrees Bolin. “Today, very little is being done, partly due to data being sourced from two different parts of the business. There is data from the ad insertion server about the actions of customers interacting with an ad, and data from the ad streaming server about the rendering of that ad. There is a need to cross-correlate this data. As ad personalisation (targeting) grows, I would assume service providers will see these reports as much more of a requirement.”
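Bolin does not specify the mechanism, but the cross-correlation he describes amounts to joining the two data sets on a shared session and ad identifier so that interaction data and rendering data can be analysed together. A minimal sketch, assuming illustrative field names:

    def correlate_ad_data(insertion_events, streaming_events):
        """Join ad-decision records with playback records on a shared
        (session, ad) key so interaction and rendering data can be analysed
        together. Field names are illustrative assumptions.
        """
        playback = {(e["session_id"], e["ad_id"]): e for e in streaming_events}
        joined = []
        for ins in insertion_events:
            play = playback.get((ins["session_id"], ins["ad_id"]))
            joined.append({
                "session_id": ins["session_id"],
                "ad_id": ins["ad_id"],
                "clicked": ins.get("clicked", False),  # from the ad insertion server
                "rendered": play is not None,          # did the ad actually stream?
                "buffering_ms": play["buffering_ms"] if play else None,
            })
        return joined

    insertions = [{"session_id": "s1", "ad_id": "a9", "clicked": True}]
    streams = [{"session_id": "s1", "ad_id": "a9", "buffering_ms": 1200}]
    print(correlate_ad_data(insertions, streams))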


Acting on Quality of Experience


A study published in March by IneoQuest found that more than half of consumers who watch streaming video have experienced rage as a result of buffering. Buffer Rage is defined as “a state of uncontrollable fury or violent anger induced by the delayed or interrupted enjoyment of streaming video content from OTT services.”

It's no laughing matter. With cord cutting on the rise, and nearly three out of every four consumers watching streaming video daily, a better understanding of the implications of Buffer Rage is essential, IneoQuest argues.

The contention is that QoS metrics (such as packet loss and delay) can be used not only to interpret the performance of networks and services but also to account for the experience of the user. Accounting for subjective user experience with objective data is where Quality of Experience (QoE) comes into play.

“Tracking packet loss can highlight problems in a network, and these problems can be extrapolated to possible user experience difficulties; however, there are a few issues with this,” suggests ABI Research. “Not all service difficulties lead to user experience degradation, users vary in their tolerance for network problems, and different content types (i.e., short-form video versus long-form video) are affected to different degrees by network difficulties.”

Logging a consistent stream of data—such as stream completion or early exit—and correlating this to available user data creates a powerful combination that allows for deeper and more personal data dives. Combining this data stream with dashboards to help with specific content filtering, such as geographic location, content source, user ISP, and more, helps real-time analysis and subsequent decision making.
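As a concrete, if simplified, illustration of that correlation: the sketch below groups sessions by an attribute such as ISP and computes an early-exit rate per group, the kind of slice a dashboard would surface. The session schema and the 25% early-exit cutoff are assumptions made for the example.

    from collections import defaultdict

    def early_exit_rate_by(sessions, dimension):
        """Early-exit rate grouped by a session attribute such as 'isp',
        'region' or 'content_source'. The schema and the 25% cutoff are
        assumptions made for the example.
        """
        totals = defaultdict(lambda: [0, 0])  # key -> [early_exits, sessions]
        for s in sessions:
            bucket = totals[s[dimension]]
            bucket[1] += 1
            if s["watched_s"] / s["duration_s"] < 0.25:  # left in the first quarter
                bucket[0] += 1
        return {k: exits / n for k, (exits, n) in totals.items()}

    sessions = [
        {"isp": "ISP-A", "watched_s": 60,   "duration_s": 1200},
        {"isp": "ISP-A", "watched_s": 1150, "duration_s": 1200},
        {"isp": "ISP-B", "watched_s": 900,  "duration_s": 1200},
    ]
    print(early_exit_rate_by(sessions, "isp"))  # {'ISP-A': 0.5, 'ISP-B': 0.0}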
According to vendors, however, most companies are primarily reactive and have neither the resources nor the time to proactively look for potential issues that might impact customers.

“What service providers need to do is proactively poll the infrastructure and network elements that impact customer experience, and match that performance information against fixed, dynamic and baseline thresholds as well as configurable SLAs, to identify potential issues that will impact customer experience before the customer is affected,” says Gregg Hara, vp, business development and marketing, Centina Systems.

Centina's NetOmnia Cable Assurance is one tool that provides this functionality. It polls all customer premises devices at high frequency, as well as all network devices, then correlates this information in real time against the network to identify issues that impact performance and customer experience.

“This can then automatically kick off a work order or trouble ticket to initiate a truck roll and get a tech onsite to resolve the problem before the customer even notices an issue,” Hara explains.
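The workflow Hara outlines, polling metrics, comparing them against fixed and baseline thresholds, and raising a ticket on a breach, can be sketched as follows. This is a hypothetical illustration, not Centina's implementation; the thresholds, field names and ticketing hook are all assumed.

    def check_device(metrics, fixed_limits, baseline, tolerance=0.3):
        """Compare polled metrics against fixed limits and a per-device
        baseline. Returns breach descriptions; empty means healthy.
        All thresholds and field names are illustrative assumptions.
        """
        breaches = []
        for name, value in metrics.items():
            limit = fixed_limits.get(name)
            if limit is not None and value > limit:
                breaches.append(f"{name}={value} exceeds fixed limit {limit}")
            base = baseline.get(name)
            if base and value > base * (1 + tolerance):
                breaches.append(f"{name}={value} is >{tolerance:.0%} above baseline {base}")
        return breaches

    def poll_and_ticket(devices, fixed_limits, baselines, open_ticket):
        """Poll every device; raise a ticket before the customer calls."""
        for dev_id, metrics in devices.items():
            breaches = check_device(metrics, fixed_limits, baselines.get(dev_id, {}))
            if breaches:
                open_ticket(dev_id, breaches)  # e.g. schedule a truck roll

    devices = {"stb-42": {"packet_loss_pct": 4.0}}
    poll_and_ticket(devices,
                    {"packet_loss_pct": 1.0},
                    {"stb-42": {"packet_loss_pct": 0.5}},
                    lambda d, b: print(f"ticket for {d}: {b}"))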

Hooper explains that Arris has systems in the field that monitor and manage the delivery of bits on the network layer. These can spot when, for instance, a customer’s QoE has dropped due to a network outage, and proactively schedule resources to fix the problem. This is important because viewers are increasingly impatient when QoE deteriorates and ever more likely to vote with their feet.
In its OTT: Beyond Entertainment Survey published last November, Conviva concluded that one in five viewers will abandon poor experiences immediately, regardless of genre, and that 90% of viewers choose to return to services that deliver a superior experience. After a poor experience, one in five will never return to that service.

This leads Conviva's Zubchevich to conclude that content is no longer king. “For the first time you can forget quality of content as being the single most valuable metric. The number one currency is QoE. We are just starting to see operators look at marketing the experience of viewing as being as important as the content itself. This is not something they've ever collected in the past and it's a fundamental shift.”

Publishers still care about the quality of content of course. Zubchevich's contention is that with so many sources to which a consumer can turn to get the same content, the defining factor will be the experience they receive from the site serving it.

“We need to redefine QoS from basics such as 'is the stream available at all?' and 'can I watch it largely uninterrupted?' to questions about resolution. If I watch an HD or UHD TV broadcast and move to an IP-based provider, I am not expecting a poor experience in comparison. We need a zero-tolerance approach to starting the stream. What used to be acceptable is no longer, and failure to address buffering or bitrate issues in that moment means you lose the consumer.”

Arris' Hooper suggests that the mantra 'content is king' has been a fallback excuse for some service providers. “As the market fragments into different content sources you'll find that consumers will gravitate over time to the site where they're having least friction. That means pay TV doesn't just have its business threatened by OTT but by new data pipes which can provide a greater customer experience journey through the content lifecycle. Having exclusive content deals is a defensive mechanism. Enabling a better customer experience will deliver more positive brand benefits in the longer term.”

Freesat expands analysis

Freesat, the BBC and ITV hybrid satellite and broadband platform, launched its Freetime box in September 2012; a December 2015 software update enabled real-time measurement and monitoring capabilities.

It was subsequently able to take advantage of solutions from TVbeat for realtime audience measurement of viewers who opt in to allow collection of their data. Freesat also uses Google Analytics to monitor how its app, guide and content are used.

“One insight [from Google Analytics] was how much numeric entry is used as a shortcut for channel selection,” explains Matthew Huntington, chief technology officer. “Such analytics can inform how channel numbers are used in marketing material. It shows that we need to protect a channel's number and keep it stable. It also tells us that we should not be developing a remote control without number keys at this time, and that we should deter TV manufacturers from supplying only numberless remote controls.”

Another insight was around the service's Now & Next view. “Customer research groups had suggested this was a popular feature but extrapolating that to the whole of our install base is a risky jump,” he says. “We were gratified to learn that the vast majority of EPG usage is indeed with the first page of that 'now & next' information.”

Freesat has yet to put in place a mechanism to monitor the quality of video. “We would like to have done that but we're not yet able to get that information,” says Huntington. “It would be useful to have more insight into signal strength, for example, so if a customer complains we can take a look specifically at that aspect. Our OTT service is provided by third parties, like BBC iPlayer or Netflix, which perform their own QoE monitoring. As yet we've not been able to get inside those players from a QoE or content usage perspective to extract information and do cross platform analysis. Over time we hope our partnership will develop so that they will share that data to the mutual benefit of our platform and their service.”

Huntington feels Freesat has only scratched the surface of the insight it can derive from the data it already collects.

“We are wary of the bottlenecks of trust that occur in organisations when only a few people can access key data and data gets locked into an ivory tower,” he says. “We have taken a democratic approach to exposing data in our company. The next step is to use that data more effectively to find insight or solve a problem.”