Tuesday, 12 June 2018

Innovation strikes at World Cup

Broadcast 
Tech innovation will be at the fore at this year’s World Cup as all 64 matches are captured in 4K UHD and HDR, and more cameras than ever cover the action. Immersive sound, 180-degree live streams and VR content are also in the match-day programme.
Fifa’s TV division and Paris-based Host Broadcast Services (HBS) have pulled out all the stops for the production of the World Cup 2018 in Russia.
The chief innovation is the broadcast of all 64 matches in UHD and HDR, a decision that necessitates a complex wrangling of data in the OB van to produce multiple deliverables.
These include UHD HDR and a pair of HD signals (1080p 50 and 1080i 50) to satisfy the bulk of international transmission.
Broadcasters taking the UHD feed include Sky Deutschland, which will show 25 live games in the format. With a remit to cater for all rights holders, Fifa TV’s watchwords are consistency and impartiality, all the while “keeping a balance between the different production styles used across the world”.
The UHD HDR and HD SDR set-up is being accomplished as part of a single production chain. The UHD feed itself will be produced within the traditional arrangement, in which each 4K camera feed is split into four HD SDI channels for ease of routing around outside broadcast trucks that have yet to make the switch to IP.
Likewise, the matching of all cameras, including for UHD HDR, will be made with the HD production as a reference.
Each of the 12 stadia is outfitted with 37 cameras (up from 30 in 2010 and 34 in 2014), including eight cameras capable of UHD HDR and simultaneous 1080p SDR output, and another 11 cameras with 1080p HDR and 1080p SDR dual output.
This basic match-day coverage is supplemented by eight super-slow-mo cameras, a cable cam and the Cineflex Helicam. Two ultra-motion polecams join the roster from the quarter-final stages. In addition, two RF cams will be added at each venue, focusing on fan coverage in and outside the ground.
UHD HDR
Around 75% of the live match cut is estimated to be in full UHD HDR, with the remainder up-converted. All replays, for example, will be up-converted from HD “with minimal visual impact on the UHD feed due to the natural softness derived from the slow-motion angle”, advises Fifa.
Viewers of each format will see slightly different coverage. The main UHD camera in a central pitch mid-height position will be framed wider than the regular HD main shot and automatically substituted on the UHD HDR feed every time the coverage cuts to the main HD position.
“The UHD shot is too wide for the HD feed, distancing the viewer from the story,” says Fifa. Aside from more picture detail, the wider frame of the UHD camera will offer a “new view of team tactics and options within the match”.
There is an interesting extrapolation here to 8K live event broadcasts, which will no doubt be part of the mix in Qatar 2022. The 8K sports experiments to date, including at the World Cup, have tended to train a wide-angled camera on the pitch with minimal intervention by a director – so great is the panoramic detail on display.
HBS is fielding eight crews, which will each travel between three venues and provide the host feed. This is supplemented by 32 Fifa TV crews, assigned to cover the activities of each team and to generate related colour from the 11 host cities.
As a guide, in Brazil, HBS generated 5,000 hours of content during the tournament, including training ground footage.
The production and flow of content is managed from the international broadcast centre (IBC) in Moscow, a facility that houses a satellite farm, studios and broadcasters’ production teams.
Video Assistant Referee (VAR) technology is another first-time feature of the tournament. It will be operated remotely from the IBC, with the VARs in direct contact with officials at the game. The VARs watch the match in a video operations room with access to all host feeds plus goal-line technology.
HBS will explain the VAR process to viewers via picture-in-picture and “eliminate any confusion that can arise with the more complex referrals using replays and graphics”.
UHD viewers also benefit from 16-channel audio (with an embedded 5.1 bed), although the final mix depends on the configuration of the consumer’s speakers and TV.
The object-based audio makes use of microphones ringed around each pitch and software that tracks each player and the ball.
The software takes into account various parameters such as the polar pattern of the mics, their distance from the ball, or whether the mics are static or moving with the camera.
According to Fifa, the software is also capable of analysing the situation on the pitch to create more than one point of interest in the event that, for example, multiple players are congregating in one spot, such as when arguing with the referee.
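Fifa has not published how this software works, but the principle it describes – weighting each microphone by its polar pattern and its distance from a tracked point of interest – can be sketched in a few lines. The Python below is purely illustrative: the function names, the cardioid model and the inverse-distance weighting are assumptions made for the sketch, not the tournament’s actual system.

```python
import numpy as np

def mic_gain(mic_pos, mic_aim, source_pos):
    """Illustrative gain for one pitch-side mic toward a tracked source.

    mic_pos, source_pos: 3D positions in metres; mic_aim: unit aim vector.
    Combines a simple cardioid polar response with inverse-distance level.
    """
    to_source = source_pos - mic_pos
    dist = np.linalg.norm(to_source)
    cos_theta = np.dot(mic_aim, to_source / dist)
    polar = 0.5 * (1.0 + cos_theta)        # cardioid pickup pattern
    return polar / max(dist, 1.0)          # clamp to avoid blow-up up close

def mix_weights(mics, source_pos):
    """Normalised mix weights across all mics for one point of interest."""
    gains = np.array([mic_gain(pos, aim, source_pos) for pos, aim in mics])
    return gains / gains.sum()

# Toy usage: two touchline mics aimed at the pitch, ball at the centre spot
mics = [(np.array([0.0, -35.0, 1.0]), np.array([0.0, 1.0, 0.0])),
        (np.array([0.0, 35.0, 1.0]), np.array([0.0, -1.0, 0.0]))]
ball = np.array([0.0, 0.0, 0.0])
print(mix_weights(mics, ball))  # symmetric ball position -> equal weights
```

The multiple points of interest Fifa mentions would then simply be extra tracked sources, each with its own weight set, summed into the final mix.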
Contribution relies entirely on a 100Gbps redundant optical fibre network connecting each venue to the IBC, provided by the Russian government. Onward transmission by satellite is the responsibility of Eurovision Media Services, which delivers back to Points of Presence in London, Paris, Frankfurt and New York.
Only limited use of unilateral cameras is permitted during live play by Fifa, so ITV and the BBC will primarily augment their presentation pre-match, at half-time and post-match.
While in competition for viewers, ITV and the BBC will be sharing facilities more than at any other recent World Cup. This saves money and is said to make transmission more resilient.
For example, the broadcasters will share SNG (satellite news gathering) facilities and a studio in Red Square, with related shared router and talkback connections to their teams in the IBC.
After dabbling in virtual reality in Brazil, Fifa has gone a step further to produce live streams in 180 degrees of all matches, available via a World Cup app on platforms like Samsung Gear, and on Facebook and YouTube.
Not quite VR, perhaps, but at least a panoramic presentation and one that can be made more immersive by viewing over VR headgear like a PlayStation VR or Google Daydream.
Three streams will be taken from dedicated UHD cameras positioned behind either goal and in the main stand, with wide-angle lenses covering 160-180 degrees.
Separate 360-degree VR will be recorded as clips rather than full matches. One such rig will be in the tunnel surrounded by the players; another in the stand for a viewing experience surrounded by fans.
Fifa has been unusually restrictive about permitting third-party suppliers to talk about the World Cup. But we know that Sony is providing the bulk of kit for the UHD HDR and HD coverage; EVS is supplying its usual nearline server array and replay systems; and Deltatre is believed to be behind the VR experience.

UK Broadcasters Hope Netflix-style App Will Stop the Rot

Streaming Media

The UK's public service broadcasters (PSBs) have concluded that collaboration is the only way forward if they are to confront the power of global titans like Netflix and Amazon.
This includes a new Netflix-style app aggregating live and on-demand content from the BBC, ITV, and Channel 4, branded as part of the Freeview Play platform.
It could eventually be fleshed out to fulfill the broadcasters' long-held plan to offer a domestic streaming rival with something close to the scale needed to compete with the SVOD giants.
A mood of collaboration has been brewing among UK PSBs for years, but it has been given fresh impetus by a proposal to inject £125 million ($142m) into Freeview Play, the subscription-free connected TV service owned by the BBC, ITV, and Channel 4 along with Arqiva, which owns and operates the nation's transmitter network.
The funds, spread over five years, will develop new services for Freeview Play, the first of which will be a mobile app due later this year that will allow viewers to access live and on-demand content from all the broadcasters on a range of smartphones and tablets.
"This new commitment from our shareholders is a major boost for UK viewers," said Jonathan Thompson, chief executive of Digital UK, which runs the Freeview platform. 
Other key developments include restart functionality that will allow viewers to click catch-up links within the Freeview Play TV guide to start a show from the beginning, improved voice search navigation, and a "further evolution" of the Freeview Electronic Programme Guide (EPG) on TVs.
Digital UK added that investment will help Freeview adjust to changing viewing habits and "exploit the trend towards cord cutting" as viewers look to build their own TV bundles by combining free-to-view TV with low-cost streaming services.
An original proposal for a joint VOD platform from BBC Worldwide, ITV and Channel 4—dubbed Project Kangaroo—was slated to launch in 2008 but blocked by the Competition Commission, which arbitrates on mergers, joint ventures, and other issues relating to competition law in the UK.
That left the door wide open for Amazon, which acquired UK film streaming and DVD service LoveFilm in 2011, and Netflix, which launched in the UK in 2012. 
Netflix has since amassed 8 million UK subscribers and has an annual war chest for original content that dwarfs that of the BBC. Broadcasters are further braced for the launch of Apple's original content service.
The Guardian reported in May that NBC Universal had held talks with the BBC, ITV, and Channel 4 about joining forces to create a joint British streaming service.
Speaking at the DTG Summit in London last month, Ofcom group director and board member Steve Unger said the failure to get an SVOD venture off the ground in 2008 was a "tragedy."
The same organisations apparently also held further discussions in 2016. While the BBC was reportedly keen to use iPlayer as the master brand for a joint SVOD service, what has changed since then is the growing strength of ITV Hub and All 4, and the understandable reluctance of the other two main shareholders to commit.
The launch of Freeview Play on mobile may be the compromise.
"The idea that every individual UK broadcaster will have its own, independently produced online player is just not going to happen," said Unger. "I think some form of collaboration around the next generation of collaborative player is really important."
The BBC's annual report, published in March, revealed that 16- to 24-year-olds spend more time with Netflix in a week than with all of BBC TV including the iPlayer. 
"As the trend shifts towards on-demand viewing, the BBC risks being overtaken by competitors," the report noted. "Maintaining the reach and time that audiences spend with our output is difficult when they have so many other choices at their disposal. This challenge is most acute for young audiences."
The BBC director general Tony Hall has repeatedly stated that the whole approach to UK broadcasting needs to change if PSBs are to remain relevant in the near future.
"The global media landscape is going to be dominated by four, perhaps five, businesses on the west coast of America in the years to come," said Hall in speech coinciding with the annual report.
"Companies with extraordinary technical, financial, and creative firepower. Does music streaming spell mortal danger to radio? Can iPlayer keep pace with a rapidly growing Netflix?"
Earlier this year, Ofcom chief executive Sharon White urged PSBs to collaborate in order to compete, but also to co-operate with large digital rivals.
"By working with the likes of Facebook, YouTube, Netflix, Amazon, and Apple, PSBs can benefit from these companies' immense global reach," White said. "They may look to share expertise in technology, marketing and programme-making, in return for investment or prominence on digital platforms."
She added, "Our PSBs may increasingly need to join forces to increase their bargaining power, just as they are doing with TV manufacturers. Increasingly, they will need to collaborate to compete. We will take account of that need when assessing competition in the market."
Freeview Play launched in 2015 and combines catch-up TV with on-demand and live TV on connected TVs. The new app would free the service from the TV set. More than 3.5 million devices with Freeview Play have been sold in the UK from brands including Panasonic, LG, Sony, and Toshiba, accounting for 60% of smart TV sales.
The 2017 specification for Freeview Play includes support for HDR streaming using the BBC's co-devised HDR format Hybrid Log Gamma.
Commercial broadcasters are also hooking up at a pan-European level to counter FAANG (Facebook, Amazon, Apple, Netflix, and Google) in the advertising space. Germany's ProSiebenSat.1, Mediaset (in Italy and Spain), France's TF1, and Channel 4 have set up the European Broadcast Exchange to programmatically trade a combined inventory of 800 million premium video views to brands with multi-territory video campaigns. This has yet to launch.

Friday, 8 June 2018

In conversation with Paramount Pictures futurist Ted Schilowitz


IBC

Paramount Pictures Futurist in Residence Ted Schilowitz explores the future of entertainment.

The obvious question to ask a futurist is, just what does a futurist do? It’s something Ted Schilowitz, who fulfils the first-of-its-kind role at Paramount Pictures, gets asked a hundred times a day.


 “I define it in different ways depending on the audience,” he says. “The most direct way to describe what I do is that I am a glorified lab rat. I’m experimenting and exploring future media and technology with an open mind, hopefully picking up tell-tale clues as to where it’s all going and then trying to advise, diplomatically, the things that we should pay attention to.
“Another way of looking at it is from the perspective of a really good futurist – and I would add that I am a futurist in training – who studies the past in order to see where the future might be headed.
“You can look at the history of change in entertainment and technology over the last fifty years to define the next fifty.”
A classic case is that of Eadweard Muybridge, the British-born photographer who pioneered the use of multiple cameras to capture motion that the human eye couldn’t see. His experiments in the 1870s had a direct influence on the invention of motion picture cameras and were the first to record the visual effect of ‘bullet time’, later made famous in The Matrix.
This has now come full circle, as we shall see.
A third way to describe Schilowitz’s work is as an attempt to embed the culture of a start-up within a vast and profitable movie studio. Before Paramount, he held a similar position at 21st Century Fox.
Department of future
“My team and I come to work every day in what we call the ‘department of future’,” he says.
“We try to think and act like a start-up because I believe true innovation tends to happen outside of the big media companies. They tend to be too overburdened by history and legacy, whereas it is the smaller companies, an HP or Apple or Microsoft back when they were starting up, which beaver away at the breakthrough ideas and emerge on the right side of history.”
Schilowitz spends a lot of his time talking with Big Tech and Silicon Valley start-ups alike, experimenting with the latest immersive imaging technology, trying to find the most likely path to guide his bosses at Paramount as to what the content creation and consumption experience will be like fifteen years hence.
If this seems like so much random guesswork, there is a consistency in his thought process. It is best summed up as ‘don’t be afraid to take risks.’
“You have to be ready to bet on the wrong side to get it right,” he says. “If you look at early adoption curves of media technology they tend to be super expensive and exotic, but the essence of the idea is often right even if the device itself may not be.”
Schilowitz has form. At camera maker Red he was tasked with convincing Hollywood and leading directors of photography to move to digital away from film at a time when digital cinematography was being dismissed as inferior and Red itself was ranked an outsider.
“Now [digital] is so right that 35mm is more on the wrong side of the curve,” he says. “The emotional and textural reasons for film remain valid, but there is no logical reason why you’d choose film when the raw image output of digital has surpassed that of analogue.”


Similarly, his take on VR and AR is that the essence of the concept is right but that current devices used to experience it “are mostly wrong.”
“VR is exactly where it should be in terms of market adoption based on where the tool sets are. This is an emerging media platform which will change as the friction points to the experience are removed. When I’m speaking with someone who has put a billion dollars into VR and telling them what they are doing is wrong you have to be sensitive. I’m not criticising these companies.
“Tim Cook (Apple), Mark Zuckerberg (Facebook), Eric Schmidt (Google) would all agree they are on a journey and that if they keep chipping away at the problems they will get there, because the essence of the concept is right.”
Community of tomorrow
Being a futurist for Schilowitz is about not settling in the present.
It’s a skin he has inhabited since youth, traced back to the moment his family moved from Brooklyn, New York City to Orange County, Florida.
“This was a culture shock. We dropped down into central Florida which at that time was definitely not a forward-thinking place.”
The time was 1970. A year later Disney World opened with plans for a community of tomorrow (later built as Epcot). It was also in the shadow of the Kennedy Space Centre, where his uncle worked on the Apollo project for NASA.
“Suddenly I was thrust into this world of possibility and futuristic concepts. Looking back, I understand that I was also going through a fairly massive personal change [in relocating to Florida as a six-year-old] and found myself not just exposed to change but embracing it.”
He adds, “I became as comfortable with change as many people are uncomfortable with it. That culture locked into me.”
Growing up surrounded by children’s entertainment, it was natural that Schilowitz should move into the business. He set up a local production company producing promos, commercials and programming for Disney and Nickelodeon.
He recalls being hired by Disney to pretend to shoot a movie on a virtual studio lot as part of the launch promotion for MGM Studios in 1989. “They needed someone who could handle a Panavision camera and who knew something about directing a crew. It was all staged for the public, there was no film in the camera, it was a lot of fun and it was very meta.”
Theme park cinema
These entertainment environments shaped his youthful take on the world. “I spend a lot of time with kids these days, trying to see the world through their eyes. Their view of the changing landscape of tech as it relates to media and entertainment and socialisation is far more spry than anyone of my generation.”
One particular Disney attraction left an indelible mark. Interactive theme park DisneyQuest launched in 1998 with a cutting-edge virtual reality attraction, Aladdin’s Magic Carpet Ride, which did not close until 2016.
“It was what we would now call location-based entertainment, and Disney was the first major entertainment company to attempt it. It was seminally important for a number of people in the industry as a touchpoint for where entertainment would head.”
Following a similar direction, Microsoft co-founder Paul Allen recently opened the Holodome in Seattle, a spherical room offering a 360-degree video, sound and haptic experience to small groups of six at a time.
Schilowitz himself was the Chief Creative Officer at Barco Escape, an immersive theatrical technology using three projectors and three screens to show movies like Star Trek Into Darkness. However, Barco folded the business earlier this year.
“Clearly, we’re not at a stage where theme park cinema can go mainstream. It needs to live in a specialist environment but there is a logic that will make sense at some point. People keep attempting this because there is a belief that the essence of the experience is worthwhile.”
Restless after a decade at his Florida facility, Schilowitz joined forces in 2001 with hardware company AJA on the US West Coast, which had a partnership with Apple building boards and interface devices to transfer video in and out of Apple products. He left to start his own hard drive company, G-Tech, “beginning with cardboard boxes trying to figure out airflow and design”, selling it five years later to Hitachi (which subsequently sold to Western Digital, where the G-Tech brand remains one of the most respected of its type).
It was this success in computer engineering that caught the attention of Jim Jannard, billionaire founder of the Oakley sunglasses empire, who invited Schilowitz to help bring his vision of a digital cinema camera to market.

As Red’s first employee and chief product evangelist, Schilowitz attacked his mission with zeal, seeing the RedOne camera launch in 2007 and eventually winning over the most hardened industry sceptics.
“It was a rollercoaster ride and amazing to me that no-one else wanted to take on the giants of the film industry,” he says. “I travelled all over the globe and had an incredible experience helping bring Red to life, finding people willing to take the risk with us on a digital motion picture camera that held the promise of someday replacing 35mm film as the acquisition medium of choice.”
On resigning in 2013 he was quickly snapped up by Fox.
The Manhattan Project
The most tangible record of his short time at Paramount is an ambitious plan to realise the possibilities of volumetric capture. The film industry is evolving toward the real-time production of live action seamlessly blended with CG-animated and performance-captured characters set into virtual environments. Harking back to Muybridge, this is being accomplished using games-engine renderers and multiple cameras.

Arguably the most advanced model for this is Intel Studios, a 10,000 sq ft complex claimed to be the world’s largest stage for volumetric video capture, which opened in Manhattan Beach, California, at the beginning of the year.
Paramount is the first studio to support it.
“We’re asking what it means to build movie entertainment using a camera array, as opposed to filming a single point of view,” explains Schilowitz.
“When you look at advanced entertainment experiences, we are on a trajectory from watching things displayed on flat surfaces to advanced spatial displays that let us look around and literally walk around the images as we like.
“We can create fairly sophisticated spatial entertainment and motion picture experiences in CGI and now we’re starting to learn how to do it in live action video.”
It’s not that Paramount has any particular project or goal in mind (at least not publicly) but Schilowitz has persuaded executives that you have to at least be in the game in order to understand what’s coming down the track.
 “If you put yourself in the position to be an early learner and an early explorer you end up with a better strategy to position yourself for the future.”
What’s more, he has put his own money on the line as investor and co-founder of HypeVR, a San Diego-based outfit working on advanced maths to create a volumetric experience.
“The idea of just static normal 2D video is the past. The idea of dynamic volumetric video that you can move around in is the future.”


Thursday, 7 June 2018

What is Projection Mapping?


Content marketing for Dataton

Video mapping, projection mapping, 3D mapping, spatial augmented reality or plain old mapping, the technology has plenty of names but the effect can be summed up in a single word: wow! We take a quick look at what it is and when to use it.

From static object to captivating experience
At its very simplest, projection mapping is the art of making multiple projectors work together on a surface to create amazing visual displays. By playing video, animation or graphics off different shapes and textures, the practice creates a captivating experience of light and movement over previously static objects.
Also known as ‘spatial augmented reality’, ‘3D mapping’ or ‘video mapping’, the technology can transform everyday objects – cars, cakes, metro tunnels, airport terminals, entire city districts, even water – into interactive displays. In skilled hands, projection mapping will paint entire visual stories, fantastical illusions and immersive environments, delivering additional impact when combined with stagecraft lighting effects.
Mapping has a long history but has experienced a surge in popularity and accessibility in recent years. It's used in a wide range of applications – from projecting colour and design onto cars in an auto showroom or elevating a product at an international trade show, to providing an ambient backdrop synchronised with the beat of an EDM concert or the atmosphere at a sports stadium. Most commonly, we see the technology used to transform entire building façades or public spaces into art on a vast scale.

The tools
The essential tools for projection mapping are self-evidently a series of powerful projectors and software (like WATCHOUT) which maps the coordinates of objects in relation to the projectors, aligns multiple projections together and controls all the hardware. The XYZ orientation, position, and lens specification of the projector are used to determine a virtual scene. Opaque templates can be used to ‘mask’ the exact shapes and positions of the different elements of the geometry or space of projection. Bringing 3D models straight into the software helps creators design and visualise more complex projection mapping shows.  
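To make that concrete, the geometry can be sketched as a pinhole model: the projector’s position, orientation and lens field of view determine where any 3D point in the scene lands in its raster. This is a generic illustration of the maths only, and implies nothing about how WATCHOUT itself is implemented.

```python
import numpy as np

def look_at(position, target, up=np.array([0.0, 0.0, 1.0])):
    """Build a world-to-projector rotation from a position and aim point."""
    fwd = target - position
    fwd = fwd / np.linalg.norm(fwd)
    right = np.cross(fwd, up)
    right = right / np.linalg.norm(right)
    true_up = np.cross(right, fwd)
    return np.stack([right, true_up, fwd])  # rows are the projector axes

def project(point, position, rotation, h_fov_deg, width, height):
    """Map a 3D world point to projector pixel coordinates."""
    cam = rotation @ (point - position)      # into projector space
    if cam[2] <= 0:
        return None                          # behind the lens
    focal = (width / 2) / np.tan(np.radians(h_fov_deg) / 2)
    x = width / 2 + focal * cam[0] / cam[2]
    y = height / 2 - focal * cam[1] / cam[2]
    return x, y

# Toy usage: a projector 5m back from a facade, aimed square-on
pos = np.array([0.0, -5.0, 2.0])
rot = look_at(pos, target=np.array([0.0, 0.0, 2.0]))
print(project(np.array([0.5, 0.0, 2.5]), pos, rot,
              h_fov_deg=30, width=1920, height=1080))
```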
Advances in high-lumens and high-resolution projection technology – up to 8K and beyond – and increasingly sophisticated software have driven the creative possibilities of the technique. Crucially, reductions in price mean the technology has moved from the budgets of a rarefied few into the hands of any venue owner, advertising brand, design agency or live event specialist. 

A short history
Disney is credited with pioneering the use of projection on three-dimensional objects in its Haunted Mansion attraction at Disneyland in 1969. Two decades later it patented a system for digitally painting an image onto ‘a contoured, three-dimensional object’.
By the late 1990s the technique, then called Spatial Augmented Reality, was the subject of academic research into futuristic office environments. The definition, according to Henry Fuchs, Ramesh Raskar and Greg Welch, was: "In Spatially Augmented Reality (SAR), the user’s physical environment is augmented with images that are integrated directly in the user’s environment, not simply in their visual field. For example, the images could be projected onto real objects, using digital light projectors, or embedded directly in the environment with flat panel displays."
Jump forward to the early 00s and artists like Oliver Bimber began to explore its potential for superimposing images onto paintings. Video projection-mapping gained wider prominence through guerrilla advertising and has subsequently become a staple of campaigns for the world’s largest brands in major cities across the world.

Different surfaces, same technique
Mapping can be loosely split into two kinds: 2D mapping, where flat surfaces such as screens, walls, ceilings and floors are mapped to amplify the event space, and 3D mapping of objects that are often curved, cornered or otherwise irregularly shaped, to create interactive displays. The latter in particular can create mind-bending effects, warping content around geometry with control software and letting images take on physical form.
Mapping requires edge-blending to ensure a seamless effect. Geometry correction and blending multiple projectors seamlessly on a complex surface can be a tedious and time-consuming task. Using an automated camera projection alignment system can save time and energy for both initial calibration and recalibration.
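The arithmetic behind the blend is worth a quick sketch. A projector’s light output is roughly its input signal raised to the display gamma, so the ramps across the overlap must be linearised for the summed light to stay constant. A minimal illustration, assuming a simple power-law gamma (real systems measure the actual projector response rather than assume one):

```python
import numpy as np

def blend_ramps(width, overlap, gamma=2.2):
    """Brightness ramps for a two-projector horizontal edge blend.

    Light output ~ signal**gamma, so ramps of s**(1/gamma) make the summed
    light constant: (s**(1/g))**g + ((1-s)**(1/g))**g = s + (1-s) = 1.
    """
    s = np.linspace(0.0, 1.0, overlap)
    left = np.ones(width)
    right = np.ones(width)
    left[-overlap:] = (1.0 - s) ** (1.0 / gamma)   # fade out left projector
    right[:overlap] = s ** (1.0 / gamma)           # fade in right projector
    return left, right

left, right = blend_ramps(width=1920, overlap=240)
light = left[-240:] ** 2.2 + right[:240] ** 2.2
assert np.allclose(light, 1.0)  # constant luminance across the seam
```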

Map for Impact
Mapping content to an object comes with its own set of technical, creative and budget challenges. But as many artists and marketers have realised, it adds extra dimensions, enhances ambience, reinforces messages or simply creates the ‘wow’ factor that leaves an audience breathless and with a positive impression of the brand. Even better, content, themes and branding created for projection mapping make an event portable, completely customisable – in any space – and unique. Any object can be created for the sole purpose of being projected on – a feat where the only limit is the imagination. Just look at Shogyo Mujo, developed by BARTKRESA studio and Josh Harker, as an example.
Creative and skilled use of projection mapping delivers far more memorable impressions than traditional forms of media and renders the world around us as a digital canvas. 


Tuesday, 5 June 2018

Sony's future might depend on a robot dog

RedShark News
It’s not just traditional media that needs to reinvent itself in the face of competition from the web giants. Big tech needs a reboot too if it’s to survive Silicon Valley’s hi-tech takeover, and Sony is doing just that with a dog.
Not just any dog, of course. Aibo is the robot pooch that was put down in 2006 aged just seven, only to be brought back from the dead last November as a symbol of a revitalised corporation that just a few years ago seemed headed for the scrap yard.
What Aibo Mark II has that its predecessor did not is the apparent ability to bond with its human owners, and it’s able to do this because it’s stuffed with intelligent sensors.
According to Sony, these include “proprietary ultracompact actuators”, deep learning technology for sensing and analysing images and sounds, and unique AI technology that connects Aibo to the cloud.
It is the image processors needed to extract information from the environment which have emerged as essential to Sony’s future.
It just announced it will spend 1 trillion yen ($9 billion) on image sensors between now and 2021, saying that having a lead in sensors is crucial for massive emerging technologies like self-driving cars and artificial intelligence.
According to banking group and analysts Macquarie, Sony already owns half of the global market for smartphone imaging technology.
While Sony’s own smartphone business has shrunk (sales fell from 33 million units in 2012 to just 14.6 million last year), the firm has managed to beef up its imaging sensor business – a technology becoming as vital as the system-on-a-chip (SoC) at the heart of every smartphone – and that includes Apple, which now buys Sony sensors for the iPhone.
In 2016, Sony bought into US-based AI startup Cogitai and established a venture capital fund to build partnerships with researchers and startup companies in AI and robotics.

Self driving cars

It recently began development of a self-driving prototype known as the ‘New Concept Cart SC-1’, equipped with 360-degree image sensors on all four sides of the vehicle. These ultra-sensitive image sensors are intended to combine with internal high-definition displays to give the driver an unobstructed view, even when driving at night without headlights – in fact, eliminating the need for windows altogether. Imagine that.
Sony have. By placing high-resolution displays where the windows would normally be, the SC-1 can display all manner of images or video to passengers, it says.
Sony’s claim for this new image sensor technology is that it can capture information about its environment faster, more accurately and more precisely than the human eye.
The firm’s CMOS image sensor “excels in its speed, lighting range and absence of noise”, touted Sony chief Kenichiro Yoshida.
He’s the new boss at Sony, taking the job in February, after the previous chap, Kazuo Hirai, had by all accounts succeeded in rescuing the company from the brink of ignominious sale.
It lost more than $3.5bn in 2012 and had to write off nearly $1 billion at its movie studio Sony Pictures in 2016, but by axing product lines like Vaio PCs and lithium batteries for phones and focusing on the PlayStation (which still accounts for a quarter of its business), Sony made $6.3bn last year.
One trillion yen isn’t peanuts. But Sony’s deal to buy a stake in the company behind Snoopy and Charlie Brown is. It now owns a 39% chunk of Peanuts Holdings (with the relatives of cartoonist Charles Schulz retaining 20%) as it seeks to expand into content for kids.

Monday, 4 June 2018

BBC Streams FIFA World Cup in UHD HDR, But Not to All Viewers

Streaming Media

Live UHD streams of World Cup football over iPlayer are a BBC first, but HDR is not under the corporation's full control.


The BBC's planned UHD HDR streaming of the FIFA World Cup Russia to viewers in the U.K. will be another milestone on its path to producing and delivering all its programming over IP. It seems to have chosen BBC iPlayer as the main pathway to delivery of UHD content, and in recent weeks has upped its research and development into making this happen.
While a significant leap forward in terms of quality (this is the first time that BBC iPlayer has offered such a high-quality stream) and scale (29 matches will be streamed live), its World Cup workflow will not be the finished article. The BBC has been quite open about this, referring to its World Cup streams as a trial. It expects to reach only tens of thousands of viewers with the full UHD HDR stream because of bandwidth constraints.
One other hiccup is that the workflow for high dynamic range (HDR) video from Russia is not how the BBC would ideally like to move forward.
The BBC will base its coverage on the FIFA UHD HDR host feed and add its own studio presentation up-sampled from HD.
The iPlayer stream will use the HEVC Main 10 profile at level 5.1 with HLG (hybrid log gamma) signalling. This is the same encode used for the BBC’s Blue Planet II trials (available on-demand in UHD HDR), for which the corporation was able to use non-real-time encoding. For the live soccer coverage, however, real-time encoding is required, and so the BBC will be offering the following four encodes: 3840 x 2160p50 at 36Mbit/s, 2560 x 1440p50 at 16Mbit/s, 1920 x 1080p50 at 10Mbit/s, and 1280 x 720p50 at 7Mbit/s.
Since a minimum capacity of 40Mbit/s is needed to get the full 3840 pixel resolution running at 50 frames per second, this restricts the number of viewers able to receive the World Cup streams.
“A 20Mbit/s connection should be capable of providing the 2560 pixel resolution representation,” says Phil Layton, head of broadcast and connected systems for BBC R&D. “As HEVC encoding develops we very much hope the real-time encoders will be capable of matching and eventually bettering the bitrates we use for on-demand, but this will require a big increase in computational power.”
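Given that ladder and the stated connection requirements (40Mbit/s for the 36Mbit/s top rung, 20Mbit/s for the 1440p rung), a player’s choice of representation reduces to picking the highest rung the measured throughput can sustain with some headroom. The BBC has not published iPlayer’s actual adaptation logic; this sketch simply illustrates the principle using the figures above:

```python
# The four live encodes listed by the BBC, bitrates in Mbit/s
LADDER = [
    ("3840x2160p50", 36),
    ("2560x1440p50", 16),
    ("1920x1080p50", 10),
    ("1280x720p50", 7),
]

def pick_representation(throughput_mbps, headroom=1.1):
    """Highest rung whose bitrate, with headroom, fits the throughput.

    headroom=1.1 mirrors the article's figures (a 36Mbit/s stream needing
    a 40Mbit/s connection); a real player adapts this margin continuously.
    """
    for name, bitrate in LADDER:
        if bitrate * headroom <= throughput_mbps:
            return name, bitrate
    return LADDER[-1]  # fall back to the lowest rung

print(pick_representation(40))  # ('3840x2160p50', 36)
print(pick_representation(20))  # ('2560x1440p50', 16)
```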
Streams will be packaged with DVB-DASH using BBC R&D-developed software with final distribution via CDN.  
“The adoption of open international standards means that iPlayer gains the capability to decode UHD HLG using the capabilities of the receiver it is running on,” Layton says.
The BBC has played a major role in developing the HLG production format for high dynamic range, and HLG remains its preferred option. However, for the World Cup it has to take the host feed, which likely uses the Dolby PQ standard.
The BBC will convert the host feed to HLG for iPlayer so that standard dynamic range UHD receivers will be able to exploit the backwards compatibility of HLG.   
This universality of service is core to the public service broadcaster's mandate. It’s why it devoted so much time and effort to developing HLG in the first place.
The BBC has continued that process by developing and testing a means of ensuring that SDR (standard dynamic range) content derived from an HDR workflow is consistent in terms of colour and light values for onward distribution. This "scene-light format conversion" underwent a large-scale test at the Royal Wedding last month.
While the Royal Wedding wasn’t transmitted in UHD, it was captured in UHD HDR (HLG) for posterity.
“Simple scene-light format conversion might at first seem like a small step towards completing the HDR production eco-system,” explains BBC R&D principal technologist Andrew Cotton. “In fact, the problem of ensuring near identical colours in SDR content derived from an HDR workflow, compared with those from an SDR production workflow, is one that the whole industry has been trying to solve. It has so far proved a block to the widespread adoption of HDR live production.”
According to Cotton, this type of conversion has proved particularly difficult to achieve because the majority of HDR down-converters on the market, including the BBC’s own licensed down-conversions, are based on "display-light" conversion technology.
“That means the conversions calculate the light emitted by a reference monitor being fed the input signal, and then convert that signal to one that would cause exactly the same light to be emitted by a reference monitor operating in the desired output signal format,” he says.
The BBC used scene-light format conversions for live coverage of the Royal Wedding. In this case the HLG HDR signal was converted to SDR (BT.709) after the UHD production switcher. 
Further details on this process can be found on the BBC blog.
“For future events we hope the ground-breaking work at the Royal Wedding will lead to simpler, native HLG workflows which allow HD, UHD, and UHD HLG versions of the event to be produced from the same camera feeds and without the complexity of metadata to manage through the broadcast chain,” Layton says.
Although FIFA is offering 16-channel object-based audio (based on Dolby’s system), the BBC explains that multi-channel sound, while desirable, was not possible for its coverage on this occasion. This is due to a number of issues with receivers which “are difficult to work around.” Instead, its audio will be stereo AAC-LC at 128 kbit/s.
UHD Match Coverage
In terms of FIFA’s host production, each of the 12 stadia in Russia is equipped with 37 cameras (up from 34 at Brazil 2014), including 8 UHD/HDR cameras capable of simultaneous 1080p/SDR output and another 11 cameras with 1080p/HDR and 1080p/SDR dual output.
This basic match-day coverage is supplemented by eight super slow-mos, a cable cam, and a Cineflex Helicam. Two ultra-motion polecams join the roster from the quarter final stages. 
In addition, it will use two radio frequency cams at each venue to record fans inside and outside the ground.
Around 75 percent of the live match cut is estimated by FIFA to be in full UHD HDR with the remainder up-converted. All replays, for example, will be up-converted from HD “with minimal visual impact on the UHD feed due to the natural softness derived from the slow-motion angle,” advises FIFA.
Social Media From Russia
FIFA has a dedicated social media production team—another first—tailoring content for Facebook, Twitter, Snapchat, and Instagram in order “to boost engagement and spark online interaction.”
Its shareable content is designed to stand out on busy news feeds and offer short-form video with “simple, enlarged text and thumbnails” that give the user an imperative to press play.
Since most broadcasters already run their own digital platforms, FIFA is making SDKs available, composed of widgets, which deliver key data and editorial components. The aim is to help broadcasters integrate social media content into their existing digital platforms.
The white label web solution includes the matchcast alongside multi-angle content, VOD clips of events between matches, stats, and social media integration.
VR First 
For the first time at a World Cup, VR (virtual reality) video will be available for all matches as a live stream in 180°, as well as 360° VOD to rights holders.
The 360° VOD clips will be made available for Samsung Gear VR, PlayStation VR, Google Cardboard and Daydream, and Oculus Rift players. Clips will also be published to Facebook 360 and YouTube VR. 
FIFA commissioned a white label app from Deltatre (branded "FIFA World Cup VR") for the event, with customizations available to rights holders on request. The BBC is one of the rights holders taking this.

Friday, 1 June 2018

BBC claims to solve live HDR production puzzle


SVG Europe

The BBC says it has found the answer to one of the problems bedevilling the industry’s transition to live HDR production — that of not compromising the standard dynamic range signal.
It has developed a means of ensuring that the SDR content derived from an HDR workflow is consistent in terms of colour and light values, for onward distribution. This ‘scene-light format conversion’ underwent a large-scale test which proved its value at the Royal Wedding last month.
“Simple scene-light format conversion might at first seem like a small step towards completing the HDR production eco-system,” said BBC R&D Principal Technologist, Andrew Cotton. “In fact, the problem of ensuring near identical colours in SDR content derived from an HDR workflow, compared with those from an SDR production workflow, is one that the whole industry has been trying to solve. It has so far proved a block to the widespread adoption of HDR live production.”
The BBC has been trialling live UHD HDR streaming on iPlayer for some time, notably from the York City Knights and Catalans Dragons rugby match in April, working with OB providers to develop live HLG HDR production workflows for BBC use.
According to the BBC, the challenges are much harder for live production than for non-live production, which is now well understood. Not only is there just one chance to get things right with live, but live production must also handle a whole host of different sources (e.g. graphics, HDR cameras, SDR cameras, pre-recorded inserts, slow-motion replays), all available in different signal formats, and blend them together seamlessly into a single programme.
That might sound straightforward with HDR format conversion technology now widely available, but in fact it’s fraught with difficulty. The BBC has been working within the ITU-R’s Rapporteur Group to document current best practice for HDR TV production.
Last October their findings were published in report BT.2408 [https://www.itu.int/pub/R-REP-BT.2408], which drew on the experience and expertise of broadcast engineers, colour scientists and colourists from around the world. As well as specifying signal parameters, camera line-up levels and methods for SDR/HDR format conversion, it documents BBC R&D’s findings on monitoring image brightness to ensure comfortable viewing of HDR images in the home, along with its work adapting the HLG ‘system gamma’ to ensure the highest image consistency outside of the reference viewing environment and, in the most recent revision [https://www.itu.int/pub/R-REP-BT.2408-1-2018], its work on signal levels for skin tones.
These were all put into practice for the UHD HDR production (though not transmission) of the Royal Wedding.
For the event, NEP UK configured 76 UHD cameras (including three UHD radio cameras) in wide colour gamut, BT.2100 Hybrid Log-Gamma (HLG) HDR. An important requirement was that the UHD HLG HDR signal had to be converted to conventional SDR BT.709 for onward distribution.
“It needed to look identical to the SDR signal that would be available from a conventional SDR camera,” explains Cotton. “The SDR signal could not be compromised in any way through having been derived from the HLG HDR signal.”
Display light conversion
According to Cotton, that type of conversion has proved particularly difficult to achieve. That’s because, he says, the majority of HDR down-converters on the market, including the BBC’s own licensed down-conversions, are based on ‘display-light’ conversion technology.
“That means the conversions calculate the light emitted by a reference monitor being fed the input signal, and then convert that signal to one that would cause exactly the same light to be emitted by a reference monitor operating in the desired output signal format,” said Cotton.
They may apply some adjustments, for example to make an SDR image appear brighter when shown on an HDR display, but the principle remains the same.  By doing so, a ‘display-light’ conversion maintains the artistic ‘look’ of the original production format when converting the signal to the new format.
“Display-light conversions work well for non-live graded programmes, but they are not generally suitable for live,” explained Cotton.
Since live workflows frequently need to convert between SDR and the HDR production formats, whichever is being used, this presents a problem because each format yields a different ‘look’.
“By an accident of its design, SDR images tend to be more colourful than nature,” said Cotton. “That was a useful feature when we viewed them on dim CRT displays, as the eye is less sensitive to colour at low luminance levels.”
A good example of this could be seen at the Royal Wedding where, in SDR, the walls of Windsor Castle appeared to be built of a yellowish sandstone rather than the real, paler Clipsham limestone.
“By contrast the HLG HDR images are, by design, remarkably natural in appearance,” said Cotton. “We no longer need that colour boost as HDR images are intended to be viewed on more modern brighter displays, and the way in which the natural ‘look’ of HLG is achieved makes it easier to deliver consistent looking pictures across a range of displays of different peak luminance.”
So, converting between SDR BT.709 and HLG using a conventional display-light technology that preserves the look of the original format does not necessarily deliver what’s required for live production.
The BBC says that a display-light conversion of a BT.709 camera to HLG “would look more saturated than a native HLG camera when cutting between them”. Similarly, a display light conversion from an HLG HDR camera to BT.709 “would tend to look less saturated than the SDR signal from a camera pointing at the same scene.”
The differences arise because each format has a different relationship between the light in the scene falling on the camera sensor, and the light emitted by the display – known as the OOTF (opto-optical transfer function). Similar problems are encountered using display-light conversions with other HDR production formats.
What was needed instead, reckoned the BBC, were ‘scene-light’ format conversions, which avoid those issues.
HDR handling by UHD cameras
To understand the process further, Cotton outlines the HDR handling characteristics of UHD cameras. Per Cotton: many cameras offer two simultaneous outputs – one HDR and one BT.709 SDR. The linear light signal from the camera sensor is considered to be normalised in the range zero to one. In practice other normalisations might be used to achieve the desired signal levels and noise performance.
The linear light signal from the sensor is passed through the HDR camera OETF to convert it to the HDR electrical signal for output.  The camera exposure is adjusted for the HDR output, and the SDR signal is derived by applying a fixed gain to the linear scene-light from the sensor, followed by a conventional BT.709 OETF.
SDR cameras are usually adjusted so that, under controlled lighting, the SDR signal hits full-range for a 90% reflectance test chart i.e. a card that reflects 90 percent of the light falling on it. In practice, under varying lighting conditions such as those found at an outside broadcast, the cameras are adjusted for correct exposure of skin tones, but they are roughly equivalent setups.
ITU-R BT.2408 recommends that the same 90 percent reflectance card should deliver an HLG signal of around 73 percent. By working through the mathematics, it can be found that applying a gain of 4.193 (12.4 dB) to the linear scene light signal before applying the SDR BT.709 OETF ensures that both SDR and HDR signals are correctly exposed. “In practice other gain adjustments may be included in the camera channels,” said Cotton. “We found that we had to dial-in a gain offset of 9.0 dB on the cameras that we were using to achieve the desired exposures.”
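Working through that mathematics as a check (the derivation here is ours; the constants are the published BT.2100 HLG values $a = 0.17883277$, $b = 0.28466892$, $c = 0.55991073$): inverting the HLG OETF at the recommended 73% signal level recovers the linear scene light for the 90% card, and its reciprocal is the SDR gain,

$$E_{90} = \frac{e^{(0.73 - c)/a} + b}{12} \approx 0.239, \qquad G = \frac{1}{E_{90}} \approx 4.19, \qquad 20\log_{10}(4.193) \approx 12.4\,\mathrm{dB},$$

so the 90% reflectance card scales to full range just before the BT.709 OETF is applied.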
For the Royal Wedding, BBC R&D needed to take the HLG HDR signal and convert it to SDR BT.709 after the UHD production mixer (or switcher). To do this, the RGB HLG signal is passed through an HLG inverse OETF to recreate the linear scene-light signal produced by the camera sensor. The scene-light signal is then passed through the same SDR processing chain found in the camera – a scaling of 4.193, followed by a clipping stage and finally the standard BT.709 OETF.
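Putting that chain into code makes it easy to verify. This is a minimal sketch built from the published BT.2100 and BT.709 transfer functions; the production converter would add the tone-mapping and colour management mentioned below, and a BT.2020-to-BT.709 gamut matrix is omitted for brevity.

```python
import numpy as np

# BT.2100 HLG OETF constants
A, B, C = 0.17883277, 0.28466892, 0.55991073

def hlg_inverse_oetf(v):
    """HLG signal (0-1) back to normalised linear scene light."""
    v = np.asarray(v, dtype=float)
    return np.where(v <= 0.5, (v * v) / 3.0, (np.exp((v - C) / A) + B) / 12.0)

def bt709_oetf(light):
    """BT.709 OETF: linear scene light (0-1) to SDR signal."""
    light = np.asarray(light, dtype=float)
    return np.where(light < 0.018,
                    4.5 * light,
                    1.099 * np.power(light, 0.45) - 0.099)

def hlg_to_sdr_scene_light(rgb_hlg, gain=4.193):
    """Scene-light HLG to BT.709 conversion, per RGB channel."""
    scene = hlg_inverse_oetf(rgb_hlg)          # undo the camera's HLG OETF
    scaled = np.clip(gain * scene, 0.0, 1.0)   # SDR exposure gain, then clip
    return bt709_oetf(scaled)

# Sanity check: a 90% reflectance card exposed to the recommended 73% HLG
# signal level should emerge at (or very near) full-range SDR.
print(hlg_to_sdr_scene_light(0.73))  # ~1.0
```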
The HLG to SDR conversion could apply more sophisticated tone-mapping from HDR to SDR and colour management.
The final results from the scene-light conversion were so good, reports the BBC, that its vision engineers were able to shade cameras as normal, using SDR monitoring and the SDR output from the CCUs.
“They knew from our earlier tests that the exposure of the HDR signals would track the iris adjustments made for SDR and deliver spectacular HDR images,” said Cotton. “On the day, they were sufficiently confident with the conversion that it was left to just me and my R&D colleague Simon Thompson to monitor the HDR, and the subsequent down-conversion to SDR BT.709.”
Cotton reports small differences “were just perceptible” when comparing the SDR conversion and native SDR from the camera CCU side-by-side, due to minor differences between the HDR and SDR camera channels.
“However, one was not necessarily better than the other, and the vision supervisors were very happy with the results.”
So much so that the HLG to SDR down-conversion was the signal source for the international 1080i SDR feed, reaching an estimated audience of 1.9 billion viewers worldwide.
The BBC believes this new workflow “greatly simplifies HDR production without compromising quality”, adding: “Now that the solution has been shown to work so well, and on such a large scale, we are hopeful that it will enable a significant increase in HLG HDR television production.”