Wednesday, 21 September 2016

Dan Sasaki; Panavision interviewed


British Cinematographer
Revered by cinematographers as a magician capable of conjuring quintessentially distinctive looks, Dan Sasaki, VP of optical engineering at Panavision, modestly dismisses his lauded status.
https://britishcinematographer.co.uk/dan-sasaki/
“I am definitely no Leonardo da Vinci,” he says. “I'm a person who knows how to deconstruct something and how to learn from others. When I've made mistakes cinematographers have let me know it. I'll bring something that I think answers all of their problems and they'll say, 'Why have you brought me this? It's unusable' or they'll push me much further into developing something with a far greater emphasis than I'd imagined.”
The truth is that, despite his humility, Sasaki embodies the Panavision brand in his dedication to artistic perfection. His successes have included the flare lenses used by Janusz Kaminski ASC on Saving Private Ryan (1998); the specialised 20mm Anamorphic that John Schwartzman ASC used on Pearl Harbor (2001); the design of the Anamorphic Wide-angle Zoom (AWZ), which has become known as the 'Bailey' lens, after cinematographer John Bailey ASC; and the refurbishing of vintage Ultra Panavisions for Robert Richardson ASC to shoot The Hateful Eight (2015).
“Designing a lens is an art form because you're helping artists to express themselves,” he says. “It doesn't come about by some mathematical formula, but by adding or subtracting detail to produce the image that a cinematographer desires. It's a very analogue, human process.”
Of his three-decade-long Panavision career, Sasaki says, “I didn't have a choice. My father had worked at the company since I was a year old, and at the dinner table growing up it was always a fascinating time to hear him talk about his day, analyse the movies and explain the technical intricacies of the cameras.”
Sasaki recalls being invited to an advance cast and crew screening of Close Encounters Of The Third Kind (1977, DP Vilmos Zsigmond ASC HSC), and his father Ralph talking about working on the cameras and lenses for 1970s films like Fiddler On The Roof (1971, DP Oswald Morris BSC) and Jaws (1975, DP Bill Butler ASC).
Although Sasaki senior, who retired as VP of operations in 2008, invited his teenage son to join him during school vacations in the Tarzana headquarters’ workshop, this was more because Sasaki junior showed a passion for mechanics than because of any plan of succession.
“I had an innate curiosity about how things worked and was constantly taking things apart, like TVs and radios,” says Sasaki. “I used to watch Dad put cameras together and think that this was cool, but it didn't really cross my mind to make working with cameras my career.”
In fact, nothing really prepared him for work at Panavision, not even the physics he studied at Long Beach State. “My training really started when I got there,” he says.
He joined the company's optical department in 1986, aged nineteen, and spent the next four years in the servicing division repairing glass: dusting lenses, cleaning, engraving and checking inventory.
“Dad was very protective of Panavision's reputation – he wasn't going to let just anyone get involved,” he says. “Also, he didn't want to show any favouritism, least of all to me, so I believe he held me down for longer than he would have with other people until he felt I knew what I was doing and that I'd matured enough to move up the ladder.”
Sasaki became lead repair operative, then manager of the section, gaining more responsibility and experience over time. It was a formative period, bringing him into contact with cinematographers and assistants. Among the cinematographers who helped him in those early years were Wally Pfister ASC, Samuel Bayer, John Alonzo ASC and John Schwartzman ASC; among the assistants, Richard Mosier, Alan Blauvelt, Dick Meinardus and Norman Parker.
“I vividly remember the first feature I worked on, which was Dances With Wolves (1990) with Dean Semler ACS ASC and camera assistant Lee Blasingame. I think it was then that Dad realised that maybe I'm not too dangerous to be let loose, that I wasn't going to upset the apple cart.”
Sasaki was mentored by Tak Miyagishima, Panavision's fabled lead mechanical designer from 1955 to 2011. “He told me never to give up. He said, 'treat every challenge as a learning process' and he taught me how to get the best out of working with cinematographers and not to be afraid of bringing a different perspective to the table. He impressed upon me that being different is part of the innovation process.”
Sasaki credits his first break as creating the first C-series 60mm anamorphic lens for László Kovács ASC on Multiplicity (1996). “László thought there was too big a focal-length gap between the 50mm and 75mm. His thought was that a 60mm would be an ideal focal length to bridge it, and that it would closely correlate to a 27mm lens in the Super 35 format. The 60mm proved so popular that it became a standard focal length in our future builds.”
Sasaki used The Hunt For Red October (1990), shot by Jan De Bont ASC, as a platform to optimise the classic C-series line of anamorphic lenses. The process included updating the optics and determining ways to flatten the images on the film plane. This paved the way for the C-series lenses in use today.
“It took me a while to figure out a signature look and I'm quite sure that someone else will have another way of customising lenses which will do the job just as well. If I have achieved anything it all stems from the cinematographers and camera assistants, Tak and my Dad who inspired me.”
The look of a show is an alchemic mix of elements from the optical density of the photographic emulsion to the shooting style and visual effects. “It is never as simple or as binary as removing the coating from a lens,” says Sasaki.
When Kaminski sought Panavision's help for Saving Private Ryan, Sasaki figured that, for a WWII film, a gritty old lens would fit the bill. “When we did initial tests Janusz felt it looked too modern, so we removed the coating,” he says. “That only served to add another layer of error into the lenses – which gave them an aberrant quality, something pre-1940 – but it offered a flare that they wanted. That process of trial and luck worked perfectly for that film, but only when combined with other elements like the shaky camera movement and the timing in post.”
Now, instead of removing the coating from lenses, Panavision has found a way to produce the same look without destroying the lens, with better control over the degree of the effect and over unwanted glare.
“Many cinematographers come in asking for a look that’s in their head and it’s our job to try to interpret that,” he says. “Other times we’ll offer them some choices and suggestions. For example, on Star Wars: The Force Awakens (2015) Dan Mindel BSC ASC said he wanted it to have a 1970s kind of aesthetic. We researched what a lens from the period might have looked like with modern film stocks and started to build something that we felt would maintain that aesthetic without the liabilities associated with an older lens.”
Surprisingly, Sasaki says he doesn't own a camera. “I'm no artist or photographer so my impression of what a cinematographer wants may not be quite right. Sometimes they will send me a still photo or example of what they would like to see. Based on the example or description we can start determining methods to customise a lens. It usually takes a couple of iterations before we match the cinematographer’s expectations.” 
The vogue for degrading lenses – decoating, recoating, antiquing – is exemplified by the exacting demands of Robert Richardson and director Quentin Tarantino in their search for a distinctive look to match the widescreen wild west of The Hateful Eight.
“Bob and Quentin wanted to shoot 5-perf 70mm film but the only choices were old System 65s used on Far And Away (1992, DP Mikael Salomon). We tested these for them but they suggested that the image looked round and predictable. Bob challenged me to find a look that hadn't been seen before. Bob always knows what he wants and he knew that if he kept talking my ego would get the better of me. So we looked in the archive for Ultra Panavision lenses which hadn't had light through them since Khartoum (1966, DP Edward Scaife BSC). On most, the grease had atrophied so the lens couldn't move or the glass was fogged up, but one lens was in working order. We threw that onto the test projector and before I knew it Bob declared that they would shoot the picture with that style of lens.”
Sasaki worked with Richardson's first assistant Gregor Tavenner to retrofit 15 lenses to work on a reflex camera. This included converting them for follow focus, increasing the working depth, and creating varying focal lengths (some of which didn't exist in the 1960s), all in just a few months.
The digital revolution has introduced a wave of super sensors making large format much more accessible. Sasaki feels that artistically, large format offers many depth perception cues that are very attractive to the human visual processing system. “Large format offers many features that both cinematographers and directors can identify with immediately, including increased magnification, perspective and character,” he says. “We’ve seen recent films that span from Super 16mm all the way up to IMAX. Cinematographers are no longer bound by old standards in formats or capture media. Ultimately, though, whether 65mm, 35mm, 16mm, Anamorphic or spherical, the cinematographer will choose the best format to fit the story. They now have more choices than ever.”
Dan continues to mine Panavision's library of vintage optics. In some cases, lenses are built from the ground up; in others, they are mechanical updates of the originals.
Richardson turned to Sasaki again for Live By Night (2017), directed by Ben Affleck, combining the ARRI Alexa 65 with the newer Sphero 65 series, also used by Jess Hall BSC on Ghost In The Shell (2017). The Primo 70 series was used by Rodrigo Prieto AMC ASC on Passengers (2016), and System 65 lenses by both Ben Davis BSC on Doctor Strange (2016) and Adam Arkapaw ASC on Assassin’s Creed (2016).
“These three series of lenses have intrinsically different imaging characteristics. The one thing that is becoming essential in this format is the request for T2.0 or faster optics. In nearly every case we had to create optics from scratch to achieve lenses with such large imaging diagonals and high speed.”
Another example is the re-optimization of the Ultra Panavision 70 anamorphic lenses used by Greig Fraser on Rogue One: A Star Wars Story (2016). “We started with the base 1.25x anamorphic squeeze lenses used on The Hateful Eight and made many modifications to suit Greig's needs and accommodate the Alexa 65 camera. In many cases, we had to completely start over with the base lens and rebuild it to become a more modern version that met his expectations.”
He is currently working with Hoyte Van Hoytema ASC and director Christopher Nolan on Dunkirk (2017), at least 60 per cent of which is being shot in native IMAX and for which Sasaki has built optics to fit airplane cockpits. He also recently developed anamorphic lenses for Steven Spielberg's Ready Player One (2018, DP Janusz Kaminski).
Has he ever not managed to solve a problem? “There are times when I try to violate the laws of physics. I sometimes try to violate the Lagrange invariant [an optical conservation law] by trying to get more light through a system than it will allow. We have to try.”
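For the curious, the law he is bending against can be stated in one line of textbook paraxial optics (an editorial gloss, not from the interview; sign conventions vary):

$$H = n\,(\bar{u}\,y - u\,\bar{y})$$

where $n$ is the refractive index, $u$ and $y$ are the marginal ray's angle and height, and $\bar{u}$ and $\bar{y}$ are the chief ray's. $H$ holds the same value at every surface in a lens, and total light throughput (étendue) scales as $H^2$, so no arrangement of glass alone can pass more light than the system's aperture and field of view already permit.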

Saturday, 17 September 2016

One-Shot Wonders


Screen Daily

New technology and old-fashioned editing smarts are coming together to produce strikingly long single shots with no noticeable cuts.

P72/73 http://edition.pagesuite-professional.co.uk//launch.aspx?eid=053e0638-8ed0-4c26-848e-f924ae2f0ec5



A scene in Captain America: Civil War shows the character Tony Stark in flashback with his parents, as both a younger man and his older self. The scene is notable not only for the 4,000 frames of digital face-lift applied by vfx artists to actor Robert Downey Jnr, but for its three-minute duration without a noticeable cut.

“Marvel films have a reputation for aggressive cutting and whip-smart action sequences,” explains Jeffrey Ford ACE, the film's editor (whose credits include The Avengers and Iron Man 3). “In this picture we made a conscious effort to expand some scenes and play them without as many cuts in order to create a different pace and dynamic to the narrative.”

It is one of a spate of recent films in which the single shot – or 'one-er' in Hollywood terminology – has been employed. Alejandro Iñárritu and director of photography Emmanuel Lubezki won Oscars for Birdman and The Revenant, in part for their groundbreaking design of extended sequences blended together in post from shorter shots. Similarly, Spectre opened with an apparently continuous four-minute sequence of Bond in Mexico City moving from street level into, then out of, a building.

“Using a long duration shot as a way of lulling the audience, then completing it with a full action impact, can be very effective,” explains cinematographer John Seale ACS, ASC (Mad Max: Fury Road). “I'd use it to extend an emotion of an actor if a cut would take the audience out of the moment. Others have used it as an explanation of situation or to establish boundaries.”

Seale suggests that The Revenant is a good example of holding onto a close-up for long periods to bring the audience into the scene. “Some thought it was overdone and boring because of the screen time per shot, but the director wanted to get the audience very visually involved to experience 'being there'.”

Long-duration shots have been part of the lexicon of cinema for decades. Directors have always sought to push the boundaries of cinematic time and space: think of Orson Welles' classic three-and-a-half-minute opening to Touch of Evil (1958), or Alfred Hitchcock's experimental black comedy Rope (1948), composed of several 10-minute film reels shot as single takes.

The introduction of the body-mounted camera rig Steadicam in the 1970s freed the cinematographer to plot ever more complex and fluid compositions. A three-minute shot in Martin Scorsese’s Goodfellas, in which Ray Liotta’s mobster leads his date into a nightclub, and Alexander Sokurov's 96-minute Steadicam sequence traversing St Petersburg's Winter Palace in Russian Ark (2002) are highlights.

“I can’t say I'm enthralled with one-ers unless they’re both sensible and valuable,” says Steadicam inventor Garrett Brown ASC. “But the freedom to get the lens exactly where it’s wanted, to carry on up steps and over doorways in French curves that would drive a dolly crew berserk, remains completely seductive.”

More recent advances in technology have enabled filmmakers to go even further. All 138 minutes of indie drama Victoria were shot in a single take (on the third attempt) on a Canon EOS C300, a camera small enough to be handheld without Steadicam, allowing cinematographer Sturla Brandth Grøvlen DFF to improvise with the actors.

“Being able to shoot HD on a very lightweight, light-sensitive camera is very helpful technically,” explains Grøvlen. “We were convinced that even if the single take didn't work and we would have to cut, the process of making the film like a one-take would influence the style of the film in a direction that we felt was right for the story.” Although Victoria was conceived as a single shot by director Sebastian Schipper, he planned a version using jump cuts as insurance for the film's financiers.

The combination of lightweight gear and the ability to stitch shots together in post “substantially helps productions to attain desired shots and makes for an awesome weapon,” says Seale.

Peter Honess ACE (LA Confidential) suggests that the main purpose of a single shot used for a complete scene, with no other cover [alternative takes], is to give the director complete control over that part of the film. “Without cover, the only way changes to a single-shot sequence can be made is by removing it entirely from the film.”

However, single shots composed from several elements – visual and audio – continue to draw on the editor's craft. Iñárritu dissolved between shots in Birdman and The Revenant in the middle of a pan [the quick horizontal movement of the camera] following editor Stephen Mirrione's (ACE) advice that this is when an audience would least notice a cut. Mirrione was also involved in the meticulous choreography of the scenes' shifting character perspective.

“What looks like a seamless long take can represent a lot of editorial and vfx work,” says editor Tim Squyres, ACE (Life of Pi). “There are often hidden cuts and split screens, pieces of the frame or dialogue taken from alternate takes. Editors bring special insight into pacing and storytelling, and can help in the design of long shots. There's one long, complicated shot in Pi that's actually five separate pieces of footage plus one section of pure CG.”

Ford was heavily involved in the design of the Tony Stark sequence. “The shot was built around [Downey Jnr's] performance. When I edit the elements together I'm still looking for the best take. I had to choose those performances and find the most elegant way to integrate them against each other. I chose the transitions, adjusted some dialogue and sound, and had input into the pacing and character design – just as I would normally.”

Squyres points out that single shots are rare because of their complexity. “It's much harder to fill a minute of screen time with a single take that works well technically and for performance, and is interesting visually, than it is to get multiple takes of four setups and cut between them.”

Yet with previsualization (modelling a sequence in a computer) and virtual production (mixing CG with live action on set) a director and cinematographer can increasingly work alongside the editing and vfx team to help make the shot practical, affordable and appropriate to the story.

“Filmmakers are becoming more comfortable with digital technology,” says Ford. “Rather than making a film in the order of pre-production, production and post, these disciplines are happening all at once. Directors can be confident of pulling off longer takes as a stylistic approach because the technology gives them greater flexibility than they've ever had to take the brave choice toward cutting less often.”

While Technocranes can give the shot elevation, as used by Hoyte van Hoytema ASC in Spectre, cameras mounted on gyro-stabilised drones will enable filmmakers to take the long single shot airborne.

Seale, though, warns against making the shot for its own sake. He points out that Mad Max: Fury Road had an average shot length of 2.3 seconds. “I believe that the human mind can analyse a frame very quickly, and if a shot then keeps going and going, an audience can get bored very quickly. If they then use that time to notice the technique or to 'jump ahead' of the story in anticipation, then the film hasn’t worked.”

Modernising the red button

Broadcast

The UK is finally set to adopt the Europe-wide HbbTV standard, which will deliver more functionality to its interactive TV services and a seamless customer experience.



What is HbbTV?
Hybrid broadcast broadband TV is a standard (ratified by the European Telecommunications Standards Institute) established in 2010 to fuse broadband delivery of internet content with digital terrestrial television (DTT).

Why are we talking about it?
The BBC, in collaboration with Freeview, Digital UK and the DTG, says it will adopt HbbTV in time for consumer equipment manufacturers to build the technology into their 2018 product releases.

Why has it taken them so long?
The UK was the first country in Europe to move to DTT, launching in the late 1990s before HbbTV existed. The BBC pioneered the interactive red button service based on the MHEG (Multimedia and Hypermedia Experts Group) encoding format. Since then, all connected TVs, set-top boxes and multiscreen devices sold in the UK market (some 100 million) have been required to be built with MHEG-5 in accordance with the DTG's D-Book, which sets out the technical specification for DTT in the UK. With the launch of HD on Freeview in 2009 the D-Book was updated but retained MHEG as the underlying technology for interactivity.

What has changed?
A primary aim of HbbTV is to harmonise the manufacturing process for vendors who would prefer not to have to develop different models for different markets. The DTG admits to some pressure on it from manufacturers to adopt the standard, but more importantly, MHEG is no longer fit for purpose.

What is MHEG used for?
As well as red button/digital services, MHEG is currently used for: radio slates – logos and on-screen information for audio-only services; pop-ups – used during Digital Switchover and retune events to notify the viewer; time-exclusive services – handling services that use the same capacity but at different times of the day; obfuscation – keeping adult content behind an application to require payment or login; and MHEG Interaction Channel services – linkages from a logical channel number to IP-delivered video services.

Why is MHEG outdated?
MHEG was developed to include an interactive channel based on internet protocols but it is limited in scope, designed to run on very basic devices and is primitive compared to modern browser-based systems. The chief building block of HbbTV, on the other hand, is HTML, the universal coding language of the internet.
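To make the contrast concrete, here is a minimal sketch of the bootstrap logic an HbbTV application runs when it loads, written in TypeScript. It assumes the page embeds the standard OIPF application-manager object from the HbbTV specification; the typings and error handling are illustrative, and the calls only succeed on an HbbTV-compliant receiver, not in a desktop browser.

```typescript
// Assumes the page embeds the OIPF application manager:
//   <object id="appmgr" type="application/oipfApplicationManager"></object>

// The OIPF objects are not in the standard DOM typings, so cast to `any`.
const appManager = document.getElementById('appmgr') as any;

try {
  // Ask the application manager for the application that owns this
  // document, then make it visible to the viewer.
  const app = appManager.getOwnerApplication(document);
  app.show();
  // A real app would go on to register for remote-control key events
  // before reacting to red-button presses.
} catch (e) {
  // Outside an HbbTV runtime these objects don't exist; fail gracefully
  // so the same page can be developed in an ordinary browser.
  console.log('Not running on an HbbTV terminal', e);
}
```

Anyone with ordinary web skills can read and extend this, which is the recurring argument for the switch.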

What else is in the HbbTV technical specification?
HbbTV references existing specs, including CEA, DVB, OIPF, MPEG-DASH and W3C. The latest version, 2.0, incorporates the web technologies HTML5, CSS3 and JavaScript, plus the compression scheme HEVC.
“MHEG requires specialised knowledge whereas HTML offers a more web-like experience and allows applications and services to be developed using widely adopted web skill sets,” explains Simon Gauntlett, CTO, DTG.
Implementation of the specs into hardware is performed by manufacturers and technology suppliers, while broadcasters and producers develop HbbTV compliant apps. There are also apps built in HbbTV which have little to do with TV, such as a gaming portal in Germany.

Where is HbbTV already deployed?
Fifteen countries have adopted HbbTV, including France, Germany, Turkey and Spain (but not Slovenia, Croatia, Romania, and Greece), with over 35 million compatible receivers and 250 apps made with it. According to GfK, more than 60 per cent of televisions sold in Germany are now connected TVs, 92 per cent of which are HbbTV compliant. Portugal and Sweden are trialling HbbTV. The other chief outlier in Europe has been Italy, which rolled out equipment based on MHP (Multimedia Home Platform). Italy, too, is moving to HbbTV. There are also deployments in New Zealand, Australia, Singapore and North Africa.
Klaus Illgner, chairman of the HbbTV Association, heralded the BBC's announcement as “a really substantial move in light of the UK's long tradition in interactive TV. I think it sends out a bold message that it makes sense to agree on a common technology platform.”

Will consumers notice any change?
Ideally, no. MHEG will not be switched off overnight and indeed will exist in parallel with HbbTV for some time. Broadcasters and operators, together with equipment manufacturers, must work out when to pull the plug on MHEG so as not to impact any of the legacy devices in people's homes.

What difference will HbbTV make?
Since the economies of scale of a larger market will make it easier for manufacturers to build product, the cost of smart TVs, STBs and other hardware should fall. “However, we don't know how much of the total bill of materials an HbbTV stack makes up for any one product, so these savings may not be passed on to the consumer,” notes Illgner.
HbbTV will also make it easier to write and deploy services such as accessing a website via a red button feature, launching a specific app, or accessing additional video content that can be streamed over-the-top of the broadcast. The W3C WebSocket technology supports second screen interactivity.
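As a sketch of what that second-screen hook looks like to a developer, the snippet below uses the standard W3C WebSocket API from a companion app. The address and message format are invented for illustration; in practice the TV's app-to-app communication endpoint is discovered at runtime rather than hard-coded.

```typescript
// Hypothetical endpoint: a real companion app discovers the TV's
// app-to-app service address at runtime instead of hard-coding it.
const socket = new WebSocket('ws://192.168.0.10:8900/hbbtv/app2app');

socket.onopen = () => {
  // Tell the TV application what the companion app is showing,
  // so the two screens can stay in sync.
  socket.send(JSON.stringify({ type: 'sync', programmeId: 'example-123' }));
};

socket.onmessage = (event: MessageEvent) => {
  // The TV app can push timeline positions, triggers for extra
  // content, voting prompts and so on.
  console.log('From TV app:', JSON.parse(event.data));
};
```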
“There should be a more seamless interaction between linear and online services and more tools for developers to enhance those services,” says Gauntlett. “The main focus is to make sure viewers get a consistent experience.”
Adds Illgner, “HbbTV has so much functionality that users should be able to use very rich service formats and see fascinating new service formats in time.”

What are the next steps?
The DTG initiated the move with a white paper ‘Towards a Common European Technical Standard for Interactive Services on Free-To-Air TV Platforms’ published in January. Tests of HbbTV 2.0 software incorporating elements required to migrate the UK and Italy are timed to finish this summer. The idea is to enable manufacturers to ensure correct behaviour of the new HbbTV-based services. The DTG will publish the ninth iteration of the D-Book in October containing the new specs. Freeview Play is already based on HbbTV 2.0.

Who is in charge?
The HbbTV Association was launched in 2009, led by IRT, the German institute for television research, and ANT, now part of Espial, and OpenTV, now part of Nagra. Other founding members included satellite operator SES Astra, and French broadcasters France Télévisions, TF1, and Canal+. The DTG has been a member of HbbTV for many years and is on its Steering Board. The DTG's head of technical development chairs the HbbTV Test Group and the Certification Group.

NASA’s plans to share Mars with the world

Red Shark News
NASA’s mission to Mars could be the dominant world news story of the 2020s, with the agency saying that video is essential to its technical and scientific success, as well as to mankind’s ability to experience an entirely new world.
As Carlos Fontanot, NASA’s imagery manager for the International Space Station, explained during a fascinating IBC presentation, the space agency recently greenlit the next stage of the mission, which will see a new rover launched in summer 2020 and landing on the planet in February 2021 for two years of investigation.
What’s more, NASA wants image analytics and optical specialists to work with the space agency on building cameras to take to Mars.
“We are looking for small cameras with low data rates that have the intelligence to auto-track objects and detect change,” said Fontanot. The cameras must also store images for review. “If you have something that helps document what’s happening to people on the ground we want to hear from you.”
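As a toy illustration of that 'detect change' requirement (an editorial example, not NASA's algorithm), an onboard camera could compare successive greyscale frames and only store or downlink a frame when enough pixels have moved, keeping data rates low:

```typescript
// Compare two greyscale frames (one byte per pixel) and return the
// fraction of pixels whose brightness changed by more than `threshold`.
// The threshold and frame format are invented for this sketch; a flight
// camera would use far more robust detection.
function changedFraction(prev: Uint8Array, curr: Uint8Array, threshold = 25): number {
  if (prev.length !== curr.length) throw new Error('frame size mismatch');
  let changed = 0;
  for (let i = 0; i < prev.length; i++) {
    if (Math.abs(curr[i] - prev[i]) > threshold) changed++;
  }
  return changed / prev.length;
}

// e.g. keep a frame only when more than 1% of the scene has changed:
// if (changedFraction(lastFrame, frame) > 0.01) store(frame);
```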
NASA also plans to mount VR cameras on the vehicle.  “We use VR to engage the world so that we can all come along for the ride,” he explained.
"Cameras with a 360-degree field of view with no moving parts would also be advantageous for spaceflight because we wouldn't have to fly the mass of a pan tilt unit as well as design a pan tilt unit that could survive the vacuum of space and extreme temperature variations," he said.
During the presentation NASA showed off its latest 360-degree experience of astronauts training for space walks at NASA’s Neutral Buoyancy Laboratory in Houston. 
The 2020/21 rover mission is a key next stage in sending humans on board the Orion spacecraft to an asteroid by the mid-2020s, and to Mars, also on board an Orion craft, in the 2030s.
The mission’s overall goal is simple — and strangely familiar — “to seek the signs of life,” according to Project Scientist, Ken Farley.
“Most of our astronauts are very interested in imagery,” Kelly O. Humphries, news chief at NASA’s Johnson Space Center, told IBC. He was also the voice of mission control for more than 50 shuttle missions. “It’s like when you go on vacation you want to come back and share what you’ve seen. Astronauts are the same.”
Astronaut training includes building 4K camcorders from component parts in orbit. “We write procedures for them to build the rig, mount a lens, put a cage around the body,” explained Humphries.
They create so many videos that it’s a full-time job to process them all back home. “We need to review the video to make sure it’s not restricted for proprietary reasons or a medical privacy issue,” said Humphries. “Plus, the files are so large they take a lot of bandwidth and the archive grows by terabytes a year. As we get into petabytes of video we have to be more aware of what we archive and what imagery we discard.”
The Mars Science Laboratory Curiosity rover is still sending back radiation data and images from the red planet having landed in 2012. 
The rover being built by NASA’s Jet Propulsion Lab (JPL) is a similar 1 tonne Wall-E style design but will pack a new set of scientific equipment including an upgraded suite of imagers.
The new rover’s main camera, Mastcam-Z (think of Wall-E’s head) will be able to take panoramas and stereoscopic images just like Curiosity, but it will also be able to zoom in further than the current rover can. 
Microphones have flown on previous missions to Mars, including NASA's Phoenix Mars Lander in 2008, but have never been used on the planet’s surface. That will change in 2021 when a mic will capture sounds from Mars and in particular the entry, descent and landing sequence. The AV data will be used to assist in planning future Mars landings, for example in understanding what a parachute looks like when opening in the Martian atmosphere.
JPL is already using mixed reality sources to build the rover. Its ProtoSpace project uses Microsoft HoloLens to project images of the rover design in 3D. A second mixed reality project, called OnSight, is a VR rendering of Mars based on Curiosity’s data. 
A HoloLens is on board the International Space Station (ISS). "Instead of receiving written instructions or being talked through a task on radio, astronauts use the system to get real-time instruction and reduce the training regime," said Fontanot. "The camera on the HoloLens transmits what the astronaut sees to ground control and a NASA expert can superimpose text and graphics annotations over this view."
So no more Apollo 9 pencil-and-paper math then...
The agency has always ensured that its space exploration program is chronicled, first with film cameras and then video, including the globally viewed footage of Neil Armstrong setting foot on the moon in 1969. Last November it launched an Ultra HD channel in the US filled with decades of stock footage and the latest material from the ISS and Mars rover.
The same parameters under discussion at IBC are also beneficial to NASA, but in a different way. These include high dynamic range (HDR) and high frame rates. HDR is critical for space flight, explained Fontanot. Changes in colour, contrast and intensity are essential for imagery analysis as it applies to the performance of rocket engines, structures, satellites and other hardware. Any slight discoloration on a surface, it was explained, may be an indicator of material fatigue resulting from abnormal temperatures or radiation. Cameras that can capture HDR may be key during troubleshooting.
Perhaps it’s worth noting that the US won’t have Mars all to itself, though. China also plans to send a rover to explore the Red Planet in 2020.

Friday, 16 September 2016

A massive boost in bandwidth makes 8K inevitable and 16K feasible

Red Shark News 
A decade ago, recording in 8K resolution, let alone broadcasting it, looked about as realistic as a manned landing in the Andromeda Galaxy, but it was apparent at IBC that 8K is being lumped into a suite of media enhancements set to hit us by 2020 - or sooner. 8K is becoming a regular part of the conversation and what’s remarkable is that this is no longer a surprise.
The reason for all this lies in a dramatic leap in bandwidth and a parallel demand for video which is pushing tech innovation.
Spencer Stephens, CTO of Sony Pictures, summed up the picture at IBC: “Bandwidth drives content and content drives bandwidth,” he said. “As we get more bandwidth we can do more things with it, but if people want to do more things with it, that creates greater demand for bandwidth.”
The good news is that bandwidth is coming in the guise of 5G, the fifth-generation mobile communications network. Its target specification calls for peak speeds of up to 20Gbps, extremely low latency and very low power consumption - a combination that will make high-resolution, individually delivered media a reality.
Factor in the scalable concept of end-to-end IP production, where the underlying fabric won’t have to be ripped out and replaced but simply expanded in modular fashion to achieve greater data throughput, and 8K is a real prospect.
All of this will come together around the Tokyo Olympics in 2020. Not for nothing did Olympic host broadcaster OBS capture 130 hours of 8K content in Rio, downgrading it to 4K for playout by NBC and others.
While NHK plans live terrestrial transmissions of 8K to domestic audiences, in theory the arrival of 5G could speed 8K video to the home anywhere. It could be a hybrid delivery model with 8K issued over fast broadband pipes and transferred around the home on WiFi.

5G: transformational

“5G will be transformational,” said Ulf Ewaldsson, CTO of Ericsson. “It means we can change the production of content, change the way we distribute things and we are able to create new content such as combining 8K with AR. This is not so far away.”
Indeed, Discovery Communications CTO John Honeycutt said his company — owner of Olympic rights in Europe for a decade from 2018 via Eurosport — is exploring VR and overlays of extreme high resolution pictures with AR (virtually racing against Usain Bolt was Honeycutt’s vision).
Whether there’s any benefit to viewing 8K on a mobile device is beside the point. The point is that bandwidth speeds will increase so dramatically that an unprecedented wealth of data will be available to mix and match applications like AR, VR, 3D, 4K, 8K in realtime.
“While 8K may be still a moonshot for many, demonstrating just how easy it can be helps underscore our future-proofing message; that is, there is an affordable answer today that costs the same for SD, HD, 4K, and, if you wish, 8K,” said Jan Weigner, CTO and Co-founder at Cinegy.
The German developer says its software products, principally its Daniel2 GPU video codec, are primed for 8K today (and 16K tomorrow).
Compression will still be key to transporting video over 5G, though. Advantech, for example, has debuted an HEVC video encoder rated to process 8K at 60 frames a second.
As the firm’s David Lin, senior director of video solutions, points out, the performance of this type of technology benefits not only 4K and 8K but also VR and 360-degree video applications.
It’s not the only codec developer with 8K on its mind. V-Nova also makes the case for its Perseus codec to unlock more reliable and better-quality UHD, VR and mobile services.
“We are demonstrating delivery of UHD VR experiences at 4K,” says Fabio Murra, SVP Product and Marketing. “It has to be at least 4K and it will probably be more... 8K, 16K.”
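A little arithmetic shows what these codecs are up against; the bit depth and delivered bitrate below are illustrative assumptions rather than any vendor's figures.

```typescript
// Raw 8K/60 video versus a plausible compressed delivery rate.
const width = 7680, height = 4320;  // 8K UHD pixel dimensions
const fps = 60;
const bitsPerPixel = 30;            // assuming 10 bits per colour channel

const rawBps = width * height * fps * bitsPerPixel;
console.log(`Raw 8K/60: ${(rawBps / 1e9).toFixed(1)} Gbps`);  // ~59.7 Gbps

// Trials of compressed 8K delivery have been reported at around
// the 100 Mbps mark (an assumption here for illustration).
const deliveredBps = 100e6;
console.log(`Compression factor: ~${Math.round(rawBps / deliveredBps)}x`);  // ~597x
```

Even a multi-gigabit 5G peak rate falls well short of raw 8K, which is why encoder announcements dominate these conversations.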
AV Stumpfl was showing an 8K ready version of its Wings Engine Raw server intended for digital advertising and live event installations. The machine delivers four streams of 4K uncompressed content at 60Hz, plus media overlays, text generation and show control on top.
A single server can manage content over multiple LED screens, or drive a 4x4K projection system with soft edge blending, mapping and geometry correction. Alternatively, Wings Engine Raw can be used to blend or overlay multiple parallel HD, 4K, 5K or up to 6K streams in realtime.

8K cameras

While kit manufacturers like Ikegami have been commissioned by NHK to develop an 8K live production range (and an OB vehicle used in Rio), development is also happening in cinema. Panavision's Millennium DXL, due for release in early 2017, is a collaboration between Panavision, RED and VFX house Light Iron. It combines RED's 8K sensor, core electronics, and recording format with Panavision optics (notably all Primo 70 lenses) and Light Iron's colorimetry to replicate a highly stylised cinema look. It can be rented exclusively through Panavision.
RED itself is arguably innovating faster than any other camera maker. Its 8K Dragon sensor, housed in a $60,000 Weapon camera body, is shooting Guardians of the Galaxy 2, while the company is also developing a new 8K sensor for the Weapon, called Helium. This will cram 8K's worth of pixels (that's 16 times the resolution at which most current films are projected) into a smaller, Super 35mm-sized chip making it suitable for a wider array of lenses. RED custom built a camera containing the chip for director Michael Bay, who plans on using it on Transformers: The Last Knight.
The point of shooting resolutions beyond 4K is less about image sharpness and more about the point at which visible artefacts disappear. Pixelation is most noticeable in VFX-intensive scenes, where the additional image information can prove useful. That's why other camera makers are keen to push 8K. Canon has previewed a prototype of an 8K camera housed in an EOS body. Sony, which already permits 8K shooting via its CineAlta F65, currently denies it will launch a slimline version for TV work, but its consumer electronics division is reportedly working on one.
The flagship finishing suites from SGO and S.A.M. (Quantel as was) are already capable of manipulating 8K images.

NHK broadcasts

At IBC, NHK showed off an OLED panel made of plastic and just 1mm thick. The rollable screen is being devised for Japanese homes, which tend to be smaller than western counterparts. Putting up giant 100-inch TVs to view 8K transmissions may hog too much living space, but not if the screen can simply be scrolled up tidily into the ceiling. It’s this attention to detail, plus an unswerving belief in its vision, that has seen NHK successfully roll out daily 8K test transmissions with a view to going live as early as 2018, and going ballistic around the 2020 Games.
Clips from Rio were viewable on NHK’s stand via a 130-inch display composed of 4 x 65-inch 4K ultra-thin glass OLEDs. The Japanese broadcaster had two 8K production teams in Rio manning four production vehicles. To generate the 22.2 surround sound, NHK explained that in Rio it had devised a ‘microphone tree’ of different types of mic, such as shotgun, omni and cardioid. Each tree had eight to 12 microphones arrayed in two or three layers, and three or four of those trees were placed around the venue.
NHK is currently trialling daily 8K broadcasts, aired at the broadcaster’s local bureaux, which are open to the public. Content includes material from Rio, a concert by J-pop star Kyary Pamyu Pamyu and footage of Japanese conductor Seiji Ozawa in a Beethoven concert.
Now, about that rocket trip to the neighbouring galaxy...

Wednesday, 14 September 2016

IBC Wrap: IP Lifts Media Into a Phase of Experimentation

Streaming Media 

IP and IT are transforming the broadcast industry, causing seismic shifts in both technological and business strategies, say execs from EBU, ITV, IABM, and Ericsson in this look at the most important trends coming out of IBC 2016.


The IBC 2016 show will be seen as a watershed for IP/IT networks in content production, as TV executives began to see the technology's potential beyond cost savings and toward new creative possibilities.
"Before IBC the use of IP in a programme environment was theoretical, but what we've learnt is that IP is working," said David Wood, deputy director of EBU Technology & Innovation's technology and development, in a summary of the event. "This makes all the difference to programme makers since they can see how flexible it is to create new forms of content. IP is now at a tipping point, no doubt."
At previous broadcast kit trade events the only discussion seemed to be about picture quality and the latest video format. The industry has been trending away from this in recent years, but it's now clear that IP/IT is transformational across all aspects of the business. It is not just about changing one set of standards for another, or new kit replacing an older set with substantively the same workflows. This change impacts skills and workflow, and fundamentally how the media industry does business.
"We are in a multi-format world in which SD, HD, [and] UHD, as well as elements like high frame rates, high dynamic range, and 360° video can be acquired and need managing," said John Ive, consultant, technologist for the IABM (the international trade association for suppliers of broadcast and media technology). "Quite simply the traditional video production system can't cope. Something new is needed which is format agnostic, and this is a key driver to IT and IP."
"IT/IP should enable a future where you can add UHD to a plant without having to rebuild it," said Simon Fell, director of technology and innovation, EBU.  "More than that, it means we can plan for the next 30 years. IP/IT is more than just moving video and audio from A to B but about new opportunities to deliver content and services in ways we have yet to think of."
Tom Griffith, director of broadcast and distribution technology at UK broadcaster ITV, summarised the plethora of new technologies as "digital disruption." "There are a whole range of opportunities out there and the question is how we take advantage of them, blend them into the media creation process, without losing the best of what we have at the moment," he said.
"The industry is going through a period of experimentation," said Ive. "Nobody has all the answers but we need to scale up what works well and turn off what doesn't. Having IT, TV, and telco people in one place is creating a very important dialogue about the sorts of creative applications we need to go to market."
The underlying technologies of faster bandwidth, higher-resolution sensors, greater storage capacities and incoming internet protocols enable this agility. "Often in the past you had to be pretty sure about what you are going to do because it was very expensive and it took time to set up," says Griffith. "Now you can launch and try something out and if it doesn't work you can simply dump it and move on."
There are also warnings to the industry about GAFA—the Google, Apple, Facebook and Amazon tech giants—which should be seen, according to WPP CEO Sir Martin Sorrell, as media companies. He highlighted Amazon in particular—a retailer—as the key player to watch, one which will treat content as product and viewers as consumers. But what all have in common is the use of analytics to feed back customer behaviour in a way that broadcasters have barely tapped.
"Companies like Amazon and [Chinese retailer and net portal] Alibaba and Tencent have big aspirations in media. They think about the way they move assets around in a very different way," noted Lindsay-Davies.
"It's all about moving away from the mindset of a production chain—which is linear—and toward managing digital assets," said Ted Malone, VP products, strategy planning and management, Ericsson. "Your job is to help monetise these assets by delivering to the right customer. This means that technology does not set agenda but technology is an enabler for lot of options and either end users or consumers will be the ones choosing between options and determining success or failure."

IP Interop Shows Consolidation

IP networks are destined to replace SDI, and there was evidence of this consolidation at IBC. The Interop Zone featured nearly 40 suppliers collaborating to make IP/IT systems work together, capable of reproducing everything done in a broadcast chain from acquisition to distribution. Issues around the synchronisation of AV signals over IP seem to have been resolved, although questions over standardising the process remain outstanding.
A new IABM survey released in Amsterdam revealed that 85% of media organisations—the customers of kit vendors—believe it vital that suppliers provide them with interoperable product. Yet more than 90% of them also indicate they will adopt cloud technology in 2-3 years or are already doing so.
"The transition is massive and it's as much a cultural as a business thing," said IABM CEO Peter White. "This is changing the face of this industry. We are going to have to work together. We need to agree on standards."
"Traditionally our industry is about selling expensive black boxes on premises, but that capex model appears broken," added Ive. "We are now moving toward a revenue based model based on opex."
In other words media organisations are being encouraged to rent or subscribe to services—playout for example—running "virtually" in a data centre.
Ericsson—owner of playout service provider Red Bee—has in fact signed up UKTV for the UK's first virtualised live event broadcast. Meanwhile, Sony is working with Swiss telco Swisscom to perform live switching entirely on Swisscom's data centres.
Although Sony calls this a pilot, it is the customer, not the tech, that it is testing. "The technology works," says Michael Harrit, marketing director, Sony Europe. "The discussions we are having are around how much customers are willing to pay for this service."
These SaaS models need working out. "I get told by lots of vendors that I can pop up a channel in 5 minutes, but when I ask whether I can pay for that for 24 hours or by the minute they always want to sell me perpetual licences or rental for a year," observed Griffith.
Aside from the near-term transactional aspects of moving to IT/IP the industry is also looking upon the technology as an enabler for whole new ways of going to market.

VR Standardisation Bodies Form

There is particular enthusiasm about the creative possibilities and revenue generating potential of VR, AR, and mixed reality—the main challenge now being to work out exactly what to do with it.
"The onus on making this work passes back up to the creatives in terms of giving us new storytelling experiences," said the EBU's Fell.
Interoperability of VR systems from headsets to compression is an issue which the industry is scrabbling to resolve.
Lindsay-Davies revealed that the DTG would participate in a global VR/AR alliance.
There are other moves in this regard too—and perhaps they are all talking of the same thing, or will merge. For example, the DVB is heading an investigation into whether, and what, standards are needed. The study group consists of 30 members, including the BBC, Qualcomm, and Ericsson.
"Right now we have two systems, one based on mobile phones and another based on headsets, and there could be more," said David Wood, who is heading the investigation. "Will it be commercially successful? Absolutely, but there's also motion sickness, and we need to know the type of technology that will be most suited."
Germany's Fraunhofer IIS is also pressing for standardisation of VR and its possible successor, light field.
"Before professionals can tap into all that this technology has to offer, strides must be made," said Fraunhofer's Dr. Siegfried Foessel. "This includes simplifying and enhancing any workflow for processes that combine the real world and CGI, such as 3D or VR."
Several technologies are necessary for VR, including image stitching, metadata enrichment, and video coding, he said. This leads to an array of different types of formats, coding, delivery mechanisms, 3D projection, metadata or signalling. The result is market fragmentation. "Each of these components must be standardised for optimal interoperability," he said.

IBC '16: Google Says Nobody Owes Publishers a Living

Streaming Media

Google blames some publishers' ineffective digital strategies for "waterfalls of intrusive ads" but says technology offers opportunities to innovate out of the impasse.


Online publishers, advertisers, and agencies have a collective responsibility to set and enforce digital ad standards or risk going out of business, warned a group of stakeholders including Google and Unilever.
"We are not here to kill off old legacy businesses, but if they don’t adapt they will be left behind," urged Mark Howe, managing director for EMEA agencies at Google. "The same goes for advertisers and agencies. The reality is that some businesses feel the world owes them a living."
Graham Wylie, senior director of market development and channels for AppNexus, a cloud-based ad optimization platform, described the publishing model as evolving from a content-based entertainment and information platform to being an audience-based monetization engine. "If things continue on this path you will end up with publishers not able to support their business on the revenues they are seeing and ultimately a much smaller, less healthy internet."
The consensus at the IBC conference session "Ad Blockers: Can Internet Advertising Be Fixed?" was that consumers do not have a blind entitlement to content but the industry needs to better communicate this message.
"From a consumer perspective, what is driving the rise of ad blocking is annoyance, irrelevance, intrusion of poor quality ads," said Chris Le May, SVP and MD of Europe and emerging markets for DataXu. "It’s a simple human nature factor that says 'I don’t like what I wanted to do to be interrupted'."
In fact many consumers do not want to opt out of the advertising contract. More than two-thirds of British consumers are willing to be exposed to advertising, as opposed to paying hard cash for content, as long as the ads are less intrusive and more relevant, according to IAB data.
"As an industry we've got to get together in all markets, in Europe and the U.S., to set global standards and then we've all got to buy into those standards and adopt them," said Wylie (who is also chair of IAB Europe’s programmatic committee).  "It’s about understanding what the consumer is doing, building standards, offering consumers choice and then as an industry, not just talking about doing it, but actually enforcing those standards and rooting out bad practice."
Unilever's media operations and strategy director for Europe, Richard Brooke, agreed. But he pointed out that the practice isn't new. "The first ad blocker was the kettle. The second was the remote control. The responsibility for it lies in everyone's hands. The advertiser has responsibility to create good quality ads, publishers need to create and present inspiring content that consumers want to engage with, TV companies need to manage their ecosystem, and consumers need to understand the relationship with good quality content. It is also incumbent on creative agencies to ensure that formats they make are not data heavy."
Why have ad formats become intrusive and abusive to consumer trust in the first place? Wylie laid this at the door of online players disintermediating legacy media.
"Giant media business that create no content but signpost other people's content and have intermediated themselves with brands so they're taking the majority of the revenue available in the ad market, leaving the publisher to push ever harder to monetise their remaining traffic," he argued.
"We're seeing a tension as publishers are pushed as far as they can to create ad formats that deliver revenue to supports their business but ad blocking technology allows the consumer to opt out."
Unsurprisingly, Google's Howe disagreed. "No-one owes publishers a living. In an age of full information, the consumers are in charge. If a consumer is not visiting a publisher’s website for whatever reason, and publishers then have waterfalls of appalling ad loads on their publications, this leads to ad blocking. Waterfalls of intrusive ads mean they aren’t monetizing their business effectively. They need to re-evaluate their own digital transformation, and if some of them fall by the wayside then that's the consumer choice."
However, publishers and brands have the technology to innovate their way out of the impasse.
"There is far more opportunity to engage with consumers than there's ever been due to fragmentation in the market," said Le May.
Howe highlighted opportunities to create innovative formats around streaming, VR, UHD, AI, and machine learning capabilities "that can drive content to consumers in a very different way."
"Mobile is the biggest transformation facing us from a video and news point of view and a consumption point of view," he added. "Publishers must find a way to deliver high-quality journalism and conversation with their consumers but achieved in different formats and in different locations."
The next generation of digital advertising won't just be about the quality of content or the user experience, but also whether the ad serves a purpose.
"That's the great opportunity for digital, because if you look at the feedback loop between the digital consumer and the brand, the brand can signpost and nudge and use technology to be timely and relevant in way it couldn't in conventional media," argued Wylie. "But it has to do that in a way that respects what the consumer is trying to achieve.
Sky Advance, the broadcaster's targeted ad platform, was highlighted by Le May "as a much more sympathetic way to engage with consumers on behalf of the advertiser. In a lot of cases the engagement can be initiated by a consumer's own actions. By downloading a VW Golf brochure from a website, they may then see an ad for a VW Golf when they get home and watch TV. That may be old media in that it's still the same 30-second spot, but it makes the TV experience far more meaningful for the consumer."
Panelists repeatedly stressed that those publishers and companies that embrace new technology can turn the internet revenue model on its head by creating new brand/consumer relationships, but they also pointed at a few legacy models which have not done this, perhaps because they have less need to turn profit, and which have consequently fallen behind.
"It worries me that that The Guardian, still one of the world’s top news sites, lost money on digital revenues last year. That’s unacceptable," said Howe.