Monday 29 October 2018

Craft Leaders: Robert Richardson, Cinematographer


IBC
Three-time Oscar-winning DoP Robert Richardson reflects on a career that has spanned 30 years, working with directors such as Martin Scorsese, Quentin Tarantino and Oliver Stone – and in formats ranging from digital 3D to ultra-wide 70mm.
There are few more hot-blooded or revered auteurs in the last thirty years of cinema than Oliver Stone, Martin Scorsese and Quentin Tarantino, so it must take a cool head to be the eyes for them all.
 “I will always put the director and the story first. That does not make producers overly happy but I don’t know another way,” says Robert Richardson, ASC.
His extraordinary career has seen him craft nine films with Stone, five with Scorsese and five (and counting) with Tarantino. He’s experimented with everything from digital 3D to an ultra-wide 70mm format, landed three Oscars – for JFK, The Aviator and Hugo – and is the DP to whom Robert De Niro, Ben Affleck and Andy Serkis turned to shoot their directorial features (The Good Shepherd, Live by Night, Breathe).
“At the beginning I was raw - now I offer experience,” he says. “I believe I understand how to tell a story.”
With his latest film, A Private War, he has come full circle, back to his first film, Salvador. Based on events leading to the murder in Syria of venerated war correspondent Marie Colvin, A Private War is documentary filmmaker Matthew Heineman’s narrative debut. Salvador (1986) is a polemic on Reagan-era Central American conflicts, told through the lens of gonzo journalist Richard Boyle (James Woods). It still retains its venom.
The power of the story
“I do feel I’m returning to my roots on this project,” says Richardson of Heineman’s film. “The subject matter aligns quite remarkably [with Salvador]. Both are low budget and both are essentially documentary in style. Of course, I felt blessed to have been asked to make Salvador but now I have the choice.
“In this case, Marie Colvin was a monumental inspiration, her influence is incalculable. In addition, Matt’s clear willingness to make this film does not hold back on the truth – Syria, in particular, where genocide continues. He is an extremely focused and honest filmmaker which is rare. Very rare.”
It is the chance to shoot a good picture rather than a good-looking picture which fires Richardson up.
“I began making movies at a time when the subject material was vital, and with directors who dealt with material that meant something,” he explains. “I’m drawn to their work and I’ve been incredibly fortunate to work with them.”
With Stone he shot Platoon, Wall Street, The Doors, Born on the Fourth of July and Nixon. For John Sayles, Richardson shot Eight Men Out, about an infamous baseball game-fixing scandal, while the Errol Morris documentary Standard Operating Procedure concerned Iraq’s notorious Abu Ghraib prison.
“Today, the subject matter for major pictures is less vital than its commercial potential,” he says, sadly.
Pioneering style
Richardson’s search for challenging material is reflected in an adventurous style that has seen him play with acquisition materials, visual textures, and aspect ratios to achieve the right emotional resonance for the story.
“I don’t ever want to stay the same. I believe technology and technique and advances in everything from cameras to projectors are there to be used to tell the story.”
Hugo remains one of the few mainstream films shot with dual cameras and a mirror to provide the parallax. “It’s creating true 3D as opposed to a post conversion,” he says.
The manic visual palette of Stone’s Natural Born Killers mixed 8mm, 16mm, 35mm colour and black and white with Betacam video, rear projection and double exposures to mimic the impression of someone flicking between channels or to alternate subjective viewpoints.
“The choice of format shifted depending upon the sequence and sometimes altered within a sequence to provide editorial alternatives to texture,” he explains.
For Errol Morris’ Fast, Cheap & Out of Control, Richardson employed a similar fusion of formats as well as Super 8 film, stock footage and cartoons to create an impressionistic collage of images.
Then for Kill Bill, his first film with Tarantino, Richardson duplicated footage over and over again until it attained the texture of old Kung Fu movies for one scene; shooting the whole film (and its sequel) with snap zooms, stylised lighting, and lurid colours in reference to the exploitation movies the director wanted to evoke.
Later when Tarantino wanted to release The Hateful Eight as a 70mm movie, Richardson captured the film’s super-widescreen images with Ultra Panavision 70 lenses that hadn’t been used since 1966’s Khartoum.
“I’m currently shooting film for Quentin on Once Upon a Time In Hollywood. He loves film, as do I, because there is a beauty to it, particularly in the way it captures skin tones. But for me, if film dies as a medium then so be it. Everything is ultimately released on digital today anyway. The Hateful Eight was shot on film, graded on film and released on digital but Quentin couldn’t care less since he wanted to make a special release on 70mm in theatres.”
He stresses: “I have no issues with new technologies. In fact, I search them out.”
On Affleck’s 1920s gangster picture Live by Night, Richardson shot some scenes at 2000 ASA on an Arri Alexa 65 – a sensitivity to low light that is near impossible with negative film. Similarly, sequences depicting Homs in A Private War were lit solely with flashlights, random street lamps or street fires and very little else.
“I shot at 1600 ASA (on Arri Alexa Mini) with Zeiss Super Speeds and used standard light bulbs on strings to give additional light where necessary. It was far more minimal than I generally work with but this is the direction the future will take as digital capture improves in both range and fidelity of skin tone.”
He is also keen to experiment with high speed cinematography, an aesthetic explored by directors Peter Jackson and Ang Lee but which has yet to find either an audience or the right story.
“I met with Ang Lee to discuss shooting The Thrilla in Manila [a project about Muhammad Ali and Joe Frazier’s legendary boxing clash] in 120fps HDR,” he explains. “I was looking forward to it because I think this is a visual step that has to be addressed. I would love to understand whether 120 can work to tell a story. It does deliver absolute clarity, almost like you can step into the picture. At the same time it means that production design, costume, make-up all need to catch up because it will show any imperfection. You can’t lie with this system.”
“It may be the case that you can play with the speed of different scenes as part of the grammar of storytelling,” he muses. “You could choose to shoot the boxing action at 24 frames and switch between 24, 96, back to 48 or 120, where it suited the story. You could do it subtly, such that an audience wouldn’t be aware of what you were doing other than having the scene emotionally resonate with them. That’s my feeling about 3D too - that if done well it can be a subtle enhancement to the narrative and organic to film language.”
Richardson was also lined up to shoot Disney’s live-action The Lion King before a scheduling conflict denied him. Like director Jon Favreau’s VFX Oscar-winning smash The Jungle Book, it is being filmed using virtual production techniques – green-screen sets with minimal live-action elements integrated seamlessly into photoreal computer-generated environments and animations.
“I would have loved to have made that film,” he says. “I am really excited by the possibilities of the technology.”
Background and influences
Born in Massachusetts in 1955, Richardson developed an interest in photography while studying at the University of Vermont, but it was films like Lawrence of Arabia, 2001: A Space Odyssey and in particular the work of Swedish auteur Ingmar Bergman (Persona, The Seventh Seal) that altered his perception.
“I dropped everything to concentrate on film. I didn’t feel I was capable or mature enough to be a director but I was fascinated by creating stories and began to appreciate more and more that doing so needed the craft of a cinematographer. I made a decision to go in that direction.”
He joined the film department at the Rhode Island School of Design and continued his studies at the American Film Institute [AFI] in LA, where he apprenticed with Bergman’s cinematographer Sven Nykvist and with Nestor Almendros, the Spanish DP who lensed the films of French director Eric Rohmer as well as American new wave classics like Terrence Malick’s Days of Heaven.
He says being on set with Nykvist shooting Cannery Row (1982) and then much later with iconic Bergman actor Max von Sydow on Shutter Island (2010) was like “touching upon the time of gods.”
He had done some second unit work, notably on Alex Cox’s cult classic Repo Man (1984), and several documentaries, including one for PBS on the civil war in El Salvador, but in 1985, when gung-ho writer/director Oliver Stone invited him to Mexico, it was both adrenalin rush and gamble. It was his first feature credit as sole DP.
“On Salvador I really didn’t know a great deal but Oliver and I collaborated intuitively and learned as we went along.”
Immediately afterward they went to the Philippines to shoot Platoon. Both films were released in the same year, with the controversial Vietnam pic, also shot in semi-documentary style, landing the 1987 Best Picture Oscar and Richardson’s first Academy Award.
“We worked on successive films by each putting together our thoughts on the whole script. I would storyboard or shot-list the entire script and present it as my thinking and I would get a negative or positive response. We did that for every film, always balancing and adjusting feedback from each other, fine-tuning the story.”
When Martin Scorsese asked Richardson to shoot his Vegas-set mob opus Casino, Richardson tried the same approach.
“Before shooting I was called into the production office. Marty said he hadn’t read my ideas and nor was he going to until he was happy with the script. He said, ‘When I’m happy with the script I will give you every shot in the movie [to devise]’. Which he did. This transformed my vision substantially. Here I was responsible for lighting, framing and operating every shot on the movie. This was a massive learning experience for me and for my crew too.”
It’s worth highlighting his crew, since for three decades Richardson has worked, by and large, with the same key grip, Chris Centrella, and gaffer, Ian Kincaid, and for fifteen years with camera assistant Gregor Tavenner; “They are all masters of their craft,” he says.
Aside from Nykvist he singles out John Alton’s composition and stylised lighting in work for director Anthony Mann in the 1940s (T-Men, Raw Deal, Reign of Terror) as key influences, but it is Vittorio Storaro (The Conformist, Apocalypse Now, Reds, The Last Emperor) whom he reveres as the greatest ever cinematographer.
There is, though, one shot of his own making with which Richardson is deeply in love – and it is not one you’d necessarily pick. It’s the final sequence from Platoon, shot from a helicopter as the injured Chris (Charlie Sheen) leaves the hell of Vietnam behind.
“He is looking down, in shock, trying to comprehend the experience of all the lost lives and everything that is destroyed below. This wasn’t a hard shot, technically, but something about it still resonates with me - and not everything does that.”

Keeping the pirates at bay


FEED

It’s predicted that $52 billion in revenue will be lost to content piracy by the year 2022. The solution isn’t just technology – it’s also education and more user-friendly services.
This time last year the industry was on high alert. Hackers had breached Netflix, Disney and HBO, threatening to release script details or entire shows to the web unless ransoms were paid. Even then, Game of Thrones season seven was pirated more than a billion times, according to one estimate.
In recent months no such high-profile incident has occurred – or at least been made public. The industry would appear to have stemmed the tide. This could be partly due to the firepower being thrown at the problem.
Analyst Ovum estimates that spend on TV and video anti-piracy services will touch $1 billion worldwide by the end of 2018 – a rise of 75% on last year. Increasing adoption of measures such as DRM, fingerprinting, watermarking, paywalls and tokenised authentication will see losses fall three percentage points, it predicts, to 13% of overall TV revenues in 2018.
Even at 13%, the revenue expected to be lost this year by global online TV and video services (excluding film entertainment) amounts to $37.4 billion.
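For context, a quick back-of-the-envelope reading of those percentages (an assumption-laden sketch rather than Ovum's own calculation: the 16% starting point is inferred from the quoted three-point reduction, and the revenue base from the $37.4 billion / 13% pairing):

```python
losses_2018_bn = 37.4                                     # quoted loss at a 13% piracy rate
implied_tv_revenue_bn = losses_2018_bn / 0.13             # ≈ $288bn of online TV/video revenue
losses_at_16_percent_bn = implied_tv_revenue_bn * 0.16    # ≈ $46bn had losses stayed at 16%
print(round(implied_tv_revenue_bn), round(losses_at_16_percent_bn))  # 288 46
```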
A report from Digital TV Research forecasts the cost of lost revenue due to piracy will reach $52 billion by 2022. Piracy – euphemistically known as content redistribution – is rife in sports broadcasting, too.
At the start of the World Cup this year, Saudi TV channel BeoutQ was alleged by FIFA to be illegally broadcasting the opening games. Viaccess-Orca research, across 17 first round matches, recorded over 1 million views of illegal streams via Periscope, 3.1 million via YouTube and 7.5 million via Facebook. It identified the same top five ISPs hosting the sites used for streaming: two in the Netherlands (NForce and Quasi), Private Layer in Switzerland, Marosnet in Russia and Contabo in Germany. These illegal streaming links were not stopped by tracking services used by rights owners or TV operators.
Most of this piracy is the work of sophisticated, well-equipped organisations, using set-top boxes, Conditional Access (CA) technology and mainstream payment systems. But with a good screen and a good camera, anyone can create their own instant illegal streaming facility, redistributing content using Facebook, YouTube, Periscope, Twitch or other platforms and apps.
Awareness is being raised on all fronts. Netflix, HBO, Disney, Amazon and Sky are among more than 30 studios and broadcasters to have formed the anti-piracy Alliance for Creativity and Entertainment. Earlier this year it shut down Florida-based SET Broadcast, pending a lawsuit alleging content piracy, and it has also initiated legal action against Kodi set-top box makers in Australia, the UK and the US for providing illicit access to copyrighted content.
As the technical quality of content is raised to UHD and HDR, its value and therefore attractiveness to pirates has risen too. MovieLabs, which was formed by the Hollywood studios to set technical specifications for the distribution of premium content, identified watermarking as one of the key security mechanisms for securing 4K UHD content back in 2014.
A forensic watermark, also called a digital watermark, is a sequence of characters or code embedded in a video to uniquely identify its originator and authorised user. Forensic watermarks can be repeated at random locations within the content to make them difficult to detect and remove.
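To make the idea concrete, here is a deliberately simplified toy in Python (all function names invented; real forensic watermarks use far more robust, imperceptible transforms that survive re-encoding, scaling and camcording): a payload identifying the authorised user is written into pixels at pseudo-random positions derived from a secret key, and can only be read back with that same key.

```python
import numpy as np

def embed_watermark(frame, payload_bits, key):
    """Hide payload bits in the LSBs of key-derived pseudo-random pixel positions."""
    rng = np.random.default_rng(key)                 # the key decides where the bits hide
    marked = frame.copy()
    h, w = marked.shape
    ys = rng.integers(0, h, size=len(payload_bits))  # position collisions ignored in this toy
    xs = rng.integers(0, w, size=len(payload_bits))
    for bit, y, x in zip(payload_bits, ys, xs):
        marked[y, x] = (marked[y, x] & 0xFE) | bit   # overwrite the pixel's least significant bit
    return marked

def extract_watermark(frame, n_bits, key):
    """Recover the payload by re-deriving the same positions from the key."""
    rng = np.random.default_rng(key)
    h, w = frame.shape
    ys = rng.integers(0, h, size=n_bits)
    xs = rng.integers(0, w, size=n_bits)
    return [int(frame[y, x] & 1) for y, x in zip(ys, xs)]

# Example: hide an 8-bit subscriber ID in a synthetic 8-bit luma frame
frame = np.random.randint(0, 256, size=(1080, 1920), dtype=np.uint8)
subscriber_id = [1, 0, 1, 1, 0, 0, 1, 0]
marked = embed_watermark(frame, subscriber_id, key=2018)
assert extract_watermark(marked, len(subscriber_id), key=2018) == subscriber_id
```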
Last year, the Ultra HD Forum, a promotional body for UHD founded by Dolby, Harmonic, LG and Comcast, included forensic watermarking in its guidelines, and in August MovieLabs updated its own specs for systems to securely mark video both at the server and/or the client end.
“There are a variety of use cases for watermarking, and different approaches are required for video-on-demand and live content, but the defining moment for watermarking has undoubtedly come with the rapid growth of 4K UHD content,” says Peter Cossack, VP of cybersecurity services at Irdeto.
He expects that, over the coming year, rights owners will increasingly mandate watermarking and other anti-piracy requirements into their licensing contracts.
Paul Hastings of watermarking tech specialist Friend MTS also reports some content owners writing into contracts a stipulation that broadcasters or service providers must be able to provide subscriber-level watermarking for set-top boxes and OTT.
Of course, particular focus on watermarking should not mean neglect of other security methods.
 “Operators ask if they still need to expend so much effort on secure chipsets,” says Cossack. “Well, yes, you do: if there is a weakness there, the pirates will go for it.”

Consumer education is required, too. Illegal streaming services are increasingly sophisticated, with slick websites and advertising, secure payment facilities and money-back guarantees to trick consumers into subscribing to an illegal service.
Three-quarters of pirate streamer sites openly advertise payment methods including Visa, MasterCard and PayPal, according to a survey by Irdeto. It suggests more could be done by these brands.
“If media organisations threaten to vote with their feet against payment platforms that enable piracy, it’ll be fascinating to see who blinks first,” suggests Mark Mulready, an Irdeto cybersecurity expert.
Cryptocurrencies, incidentally, only accounted for around 4% of payment method mentions on the sites it analysed.
Of course, there are many people who knowingly head to an illegal streamer – the equivalent of getting the dodgy DVD with the xeroxed cover from a bloke down the pub. Some visit pirate sites out of frustration with trying to pay for and access a pay-per-view event, as happened en masse just ahead of the Mayweather vs McGregor boxing clash.
Many of us also think sharing passwords for paid streaming services is, well, an okay thing to do. A study from Hill and Magid found that 35% of 18-30 year-olds (yes you, millennials) share login credentials. US streamer Hulu loses $1.5 billion a year due to such nefarious activity, according to business management specialist Cleeng.
However, a study from Ampere Analysis suggests that password-sharing among Netflix users is not as problematic as most believe. Only one in ten users share Netflix passwords with their family or peers, it found.
What this suggests, according to Cleeng, is that although account and credential sharing have risen, the most successful OTT services have happy customers.
“In short, rather than focusing on locking people out, success will lie largely in encouraging loyalty and enticing fresh subscribers through a winning mix of incredible content, a flawless user experience and an innovative approach to the service,” the company advises.
The most sensible protection strategies apply layers of security mechanisms and close the loop by monitoring and then enforcing – in the courts if needs be – action against breaches.
“The most effective approach to countering threats of piracy starts with education, then moves into rights expertise, with rights enforcement being the final step,” says Verimatrix CTO Petr Peterka.
Experts candidly state that no device, content or data is ever 100% safe, but a 360-degree security approach with constantly updated protection technologies will at least ensure theft is minimised.


Thursday 25 October 2018

AV in Russia: Market report

AV Magazine

As Russia emerges slowly from economic recession, doing business in the world’s largest country can feel like a slog, but vendors can capitalise on the halo left by the World Cup.


In preparation for this summer’s World Cup, Russia was a whirlwind of activity. Eight stadiums were fully renovated, hotels were either revamped or newly built, and investment poured into airports, railway stations and roads. However, as quickly as the final whistle blew, the pop-up fan zones with giant screens and networked audio were taken down and the AV business reset.
Few vendors expect any serious growth or major new opportunities over and above their normal business out of the region. That’s because oil-related price erosion, inflation, political isolation, and the West’s punitive economic policy are impinging on Russia’s ability to grow.
Nonetheless, IHS Markit anticipates that the Russian pro AV industry will cease to contract and help the wider sub-region grow at nine per cent year-on-year on average, from $2.9 billion in 2016 to over $5 billion in 2022.
The impact of US and EU sanctions triggered by Moscow’s annexation of Crimea and the recession caused by the fall in energy prices shouldn’t be underestimated. At the end of 2014, for example, the rouble had lost half its value against the US dollar.
“This meant customers had half the money to buy foreign pro AV equipment,” explains Andy Lee, sales and account manager for Datapath. “On the other hand, it made local, cheaper, AV brands more popular. For the same reason, Chinese vendors are increasingly in demand and taking a lot of market share from EU/US manufacturers.”
Economic recovery
Barco has charted some recovery in local business since 2016 but says annual growth is not as fast as it would like. “It’s difficult to predict the future. We hope the current trend toward growth will continue but the economy isn’t healthy yet,” says Andrey Mankos, business development manager.
“The weak currency and unpredictable exchange rates are a main challenge. Customers think in roubles and if we have strong exchange rate deviations, it affects budgets for sure. Projects tend to be postponed, have longer life cycles or their budgets cut.”
Russia’s economy grew 1.5 per cent last year, according to data from the country’s Federal Statistics Service – the first annual rise for three years. 2018 should mark another year of recovery, with growth expected to reach 1.7 per cent, according to a World Bank forecast – well above the near three per cent contraction the country saw from 2015-2016.
Going forward, vendors expect a stable business without significant growth. Any fluctuation in the oil price or further political and economic sanctions will have a negative effect on the market.
However, none of this is to suggest that pro AV is stagnating in the country. Instead the local industry has matured.
“During the past five years pro AV has grown rapidly,” reports Lee. “A number of new installers have appeared. Large IT system integrators have created pro AV departments and developed this business.”
Digitisation of billboards and posters has taken longer in Russia than in many other markets, though this is changing. The majority of digital signage in Russia is still indoor, but the combination of online and DooH has inspired many network owners to invest in public screens.
Barco’s Mankos suggests that more and more companies desire “eye-catching entrance zones with a digital signature”, others are investing in equipment for collaborative work.
World Cup halo
“The market is maturing with many more professional designers and engineers than a few years ago. Customers are more experienced and demanding too,” he says.
He also feels that the FIFA World Cup has opened a number of business opportunities: “The business approach hasn’t been changed, but competition has grown.”
Melinda Von Horvath, vice-president of sales and marketing, EMEA at Peerless-AV, says: “The World Cup gave the Russian people an insight into new and exciting AV installations, and so they became more interested in this direction.” She reports that Peerless-AV has received many requests since, especially for its IP68-rated outdoor all-weather solutions.
Maxim Prokhorov, channel sales manager at NEC Display Solutions, points out that many companies received “invaluable experience” in implementing such large-scale projects and that the World Cup has helped attract investment “and show that Russian pro AV has great potential.”
Moscow central
Moscow and St Petersburg are the main cash-flow centres where the headquarters of the biggest companies are located. Some 60-70 per cent of business is thought to be made in Moscow.
“Most of our business is in the western part of Russia and most locations can be accessed by one- to four-hour flights from Moscow – or from London,” says Lee. “Generally, demand is driven by more and more cheap videowalls, be it for retail or in command and control.”
Siberia has traditionally demanded “heavyweight solutions” for control rooms in the oil and gas sector, says Prokhorov. Corporate solutions for engineering companies and heavy metals are favoured in the Urals.
“The fact is that all business and processes can be controlled without leaving Moscow,” he says. “The same is true in regard to working with state budgets. All decisions and the bulk of projects come from Moscow.”
However, the country can be unlocked from regional centres like Novosibirsk, Ekaterinburg and Kazan, which have their own business activity and local integrators.
“We are counting on the amusement park market,” says Mankos. “It’s nearly zero now, but there are many projects in the works.”
Promethean’s distribution partner, for example, is responsible for managing the reseller relationship, supported by Promethean. There are also different types of resellers – some work from the main cities while others concentrate on working within their regions only. There are 83 regions across Russia, some of them very large – the Krasnoyarsk region, for example, stretches from the Arctic Ocean in the north almost to the Mongolian border in the south.
The most actively developing industry segments in Russia are energy, transport, education and B2G. In the business arena, Russia currently has more than 50 large infrastructure projects with a budget of more than $1 billion for the period 2018-2024, mostly linked to transport.
Massive investment
By 2030, it’s estimated Russia will have spent nearly a trillion dollars on infrastructure projects, including a North-South Transport Corridor connecting St Petersburg to India via Azerbaijan and Iran, and a Kazan-Moscow high-speed rail route. The 770 km line is expected to cost $22 billion, open in 2020 and pass through the cities of Vladimir, Nizhny Novgorod and Cheboksary. The route is the first section of an ambitious, continent-straddling high-speed line linking Beijing with Moscow.
“There are a lot of advantages to selling in such a vast country,” says Von Horvath. “A broad product portfolio allows a company to offer professional AV integrators the most suitable solutions for their diverse mix of projects.”
A key challenge that Peerless-AV is observing is ‘import substitution’, where local manufacturers compete for market share with lower-cost products, but without the same level of quality and safety assurance offered by trusted suppliers.
According to Datapath’s Lee, the main challenge for foreign businesses is the lack of Russian government support for small to medium sized companies.
“High taxes, with no (or very limited) access to bank financing are also having an effect on business,” he says.
The size of the market also adds a layer of complexity from a logistics and costs point of view, “because deliveries across the country add to the time before the product is available, as well as extra costs for delivery,” says Svetlana Harwood, head of business development Russia at Promethean.
“Visiting Russia answers so many questions about the infrastructure, developments, level of business engagement and skill as well as support available,” says Harwood. “It’s also important to find a good partner/distributor. Depending on the nature of the business, this might be one or several, but equally important is to have a Russian speaker in the team.”
Anyone considering doing business in Russia should establish a long-term strategy. She adds: “This is not a ‘quick buck’ market.”

Wednesday 24 October 2018

Behind the scenes: Bohemian Rhapsody


IBC

The Editor and Production Designer of Freddie Mercury biopic Bohemian Rhapsody tell IBC365 how the film recreated the classic Live Aid concert from Wembley Stadium – with 800 extras at Bovingdon airfield near Hemel Hempstead.
Queen’s showstopping 21-minute gig during 1985’s Live Aid is a contender for the greatest live performance in rock music history and is “the Star Wars Death Star sequence” of a new film according to its Editor.
“It is the moment the whole movie is building towards so there was a lot of pressure to fulfil people’s expectations of what that might be,” explains John Ottman, the editor of Freddie Mercury biopic Bohemian Rhapsody. “The biggest fear making this film was whether ending at Live Aid, with no epilogue, was going to be memorable and leave people emotionally satisfied or leave them blank.”
The film has had a turbulent history befitting its rock and roll subject. Over a decade, a revolving door of stars (Sacha Baron Cohen and Ben Whishaw were signed to play the lead) and directors (Tom Hooper and Dexter Fletcher) were attached, but the final script has the blessing of the band’s guitarist Brian May and drummer Roger Taylor. The director is Bryan Singer (whose credits include most of the X-Men franchise), although Fletcher returned to direct about a third of principal photography after Singer was fired, allegedly due to misconduct on set.
“The film is the classic narrative arc of the alienated artist with an inflated ego, who is then humbled and who finally brings the band back together for a triumphant comeback,” says Ottman. “We tell it in a down-to-earth fashion with nothing hyper-real about it except for the concert pieces. These, I made a little more stylistic. I didn’t want to represent them as straight[forward] concerts, otherwise you’d risk losing sight of the story, but instead to have each piece move the narrative forward.”
During the band’s first big tour of the US, for example, a sequence is inserted about Freddie’s visit to a truck stop bathroom. “It’s the first glimpse into his sexuality,” says Ottman. Also in the US tour sequence, Mercury calls out the names of the states and cities they play as a device to speed the timeline.
While all editors will have a hand in sound design and many have musical or sound editing experience, Ottman is unusual in regularly composing the movie score as well as picture editing. He’s done this with Singer for The Usual Suspects, Apt Pupil, Superman Returns, Valkyrie, and X-Men: Days of Future Past. He’s also composed music for other directors and their films, notably The Cable Guy, Kiss Kiss Bang Bang and Fantastic Four.
“I can’t wait to finish cutting the picture because what I look forward to most is the score,” he says. “I’m always seeing the film from both sides, from the point of view of music and picture.”
Bohemian Rhapsody, though, has no score, relying instead on Queen’s extensive back catalogue although Ottman embellished the soundtrack with opera, for instance in a scene where Mercury is on the phone to his partner Mary.
“I didn’t want to have the schmaltzy track list of a typical biopic but to add some theatricality and depth,” he says. “Brian May was a great help in supplying all the original stems for the tracks so we had a lot of control over mixing it. In a few instances where we needed to replace the vocals for technical reasons we used vocals from Marc Martel [a winner of the Queen Extravaganza Live Tour auditions]. Even the band couldn’t tell the difference between Freddie and this guy.”
The film features May and Taylor’s precursor band Smile. Queen contacted lead singer Tim Staffell, who lent the production multi-tracks of concert footage from 1969. Staffell and Smile also re-recorded tracks at Abbey Road for use in the film.
“As amazing as Queen’s music is they wrote it to be an interactive experience with an audience so the music doesn’t really soar until you hear the audience sing along, applaud and clap,” explains Ottman. “That took a lot of multi-track work to finesse.”
The biggest challenge, though, was recreating the ambience of Live Aid, arguably the biggest concert in rock history. With reunions of Led Zeppelin, The Who and Black Sabbath, plus U2 in their prime, David Bowie and Wham!, the worry for Queen was whether they could match the competition.
The live televised phenomenon bookends Bohemian Rhapsody and shows Mercury in his element as a great entertainer. Cinematographer Newton Thomas Sigel gave the picture a suitable epic scope by shooting with the large format ARRI Alexa 65 paired with Hasselblad Prime DNA and Prime 65-S lenses. Live Aid takes up a substantial portion of the film’s third act and the filmmakers had to battle with the studio, Fox, not to cut it down.
“You always tend to battle with a studio in terms of length or pacing but in this instance I had to do a Hail Mary pass to save it,” Ottman relates. “I cut sections in other areas in order to save more of the Live Aid material.”
It was here that lead Rami Malek’s (Mr Robot) performance really came into its own “as he channelled Mercury into an extraordinary rendition” of tracks including an abbreviated Bohemian Rhapsody, Radio Ga Ga, Hammer to Fall, Crazy Little Thing Called Love and the finale of We Will Rock You into We Are the Champions.
An extended version of the Live Aid sequences shot for the film is being produced for release, possibly as a Blu-ray extra.
It’s not all career highlights though. The lyrically dubious track Fat Bottomed Girls features in scenes of the band’s US tour. Explains Ottman: “In reality the track was released a year later but Bryan [Singer] was adamant he wanted this in because it’s such a crowd pleaser.”
Replicating Live Aid
The task of finding a space to build a replica of Wembley for Live Aid fell to Production Designer Aaron Haye. He says it was by far the film’s biggest challenge, due to shooting in the UK’s autumnal weather in 2017.
“We put a lot of effort into how and where and how much of the stadium we were going to recreate,” he says. “Finding a location big enough was tough. We looked at a number of studio backlots but we figured it would be a muddy mess by the time we’d finished, with all the potential rain. Ideally, we needed an area with a hard surface.”
They located a site at Bovingdon airfield near Hemel Hempstead. “Even that wasn’t too easy since the land was split among five different owners and we were going to be straddling a number of them. We had to push a racetrack out of the way, liaise with two different farm properties and divert a weekend market.”
Working from footage and photographs taken on the day as well as the advice of several technicians and artists who were there, Haye created a miniature model of the set but admits it wasn’t entirely faithful.
“Wembley has had its layout changed several times over the years and we scoured the local council and libraries but couldn’t unearth any plans from the 1950s to the 1990s. We know how it was originally designed and we worked from photographs to make our version of the back stage.
“In reality there were a bunch of trailers behind the stadium and dressing rooms in the concourse under the venue so we hybridised it to look as if everything is connecting together. Bryan [Singer] fell in love with an airstream trailer as a visual so rather than use a corridor we go straight from trailer – as Queen’s dressing room – to the stage.”
Haye also roped in Serious Stages, the rigging company which had worked on Live Aid, to assist in the build. Attention to detail included adding track marks on the stage, the exact (and fairly significant) height of the stage above the press section and audience, and a couple of sofas placed in the trusses off-stage where riggers had watched the original concert.
The eventual 70,000 sq ft set included reconstruction of the stage itself, the backstage and concourse areas (under Wembley) and part of Wembley Way, the approach to the stadium. The entire set was dressed in a massive tarpaulin to protect it from the weather. Set extensions, including Wembley’s iconic towers, were added as visual effects by Dneg.
Around 800 extras, outfitted in eighties summer festival costumes, filled the area between the stage and front of house mixing desk and were photographed from dozens of angles performing choreographed dance and body moves. Facility Dneg used this as a basis to digitally replicate the look and feel of 72,000 spectators.
 “The concept for the design was to tell the story of Freddie’s arrival and journey to the dressing room, the band’s apprehension before the show and their walk from dressing room to stage all in one camera move.”
While that one-shot camera move was truncated in the final cut, much of the material remains.
It took three months of planning, design and build for two weeks of shooting including just six days of principal photography with the main cast.
“Rami and [the film’s] band memorised and performed Queen’s entire 20-minute, five-song set. Brian May was there and he turned to me and said it had raised the hairs on the back of his head.”

Friday 19 October 2018

The rise of the robotic camera


BroadcastBridge

The rise of the PTZ camera, discussed in a previous post, fits into the wider trends for remote operation, robotic cameras and automated content production. We spoke with Mark Roberts Motion Control (Nikon-owned since 2016), one of the pioneers of robotic cameras, to gain insight into the drivers for this, new applications and future developments.

The film and TV industry continues to benefit from automation efficiencies, with motion control robotics compatible with third-party products throughout entirely automated workflows. Machine vision and real-time image analysis are helping to augment the control of moving cameras, while robust IP architecture permits extreme remote operation. Systems are getting smaller, shrinking camera footprints and opening up a wider range of potential positions, while layers of system redundancy and control options reduce risk in live operations.
Assaff Rawner, CEO of MRMC suggests another reason for take-off: “Taking viewers closer to the action and providing more unique perspectives and engagement is one. Another is consistency of output made possible through repeatable programmed moves. Also, the demand on space, whether seats in stadiums or modular studios, is increasing pressure to reduce the operational footprint of cameras whilst accommodating for the demand of increasing camera angles.”
Simplicity of integration into existing production workflows is one of the key advances of robotic cameras. MRMC has focused on a simple plug-and-play approach to broadcast robotics, releasing a range of Polycam and MHC products tailored to remove the complex user operation usually associated with multi-head robotic camera systems.
Its acquisition of Camerobot last year has also brought together best-of-breed hardware and software to create a new level of multi-axis robotic studio solutions for broadcast, including VR and AR integration with all the major vendors.
With the rise in quality of small-format cameras, advancing motor technology and motion control software, together with new levels of product design aesthetics, robotic camera positions are bringing a level of motion usually associated with manually controlled fluid heads, and in positions that add further value to productions. Such advances allow robotic camera moves to be cut live to air (rather than just used in replays) and, through a wider range of payload and mounting options, open the possibility of more camera positions without compromising venue audience space.
“Another area of growth for robotic cameras has been in the rise of remote productions (or REMIs),” says Rawner. “With the increasing availability of stable high-bandwidth networks, the control of camera robotics over IP is an attractive proposition to lower production costs and minimise travel. MRMC has standardized on IP control for all of its robotic range with built-in features such as network diagnostics, IP video encoding at the camera head and localised user client applications for full feature remote control.”
Automated camera moves have been used in news studios for many years. In highly choreographed productions where consistency of product is key, robotic camera moves are integral to the overall workflow automation.
Although this level of automation can produce highly efficient productions, there is little room for improvisation. The company has developed methods to provide all the functionality of studio camera automation but with a level of tracking interaction that allows the presenter to lead the camera position within the programmed move. These moves within moves provide a more natural engagement between camera and presenter whilst retaining the high production look of a fully programmed move.
Sports and live events are far from choreographed, and camera motion requires immediate response to high-speed changes of direction and unpredictable trajectories. Experienced operators understand their subject and create the incredible shots we have become accustomed to at the top end of sports broadcasts. Space constraints in many venues, together with an increasing demand for bespoke camera positions for broadcast and non-broadcast applications (such as OTT add-ons, analysts, coaching etc), are an ongoing challenge.
Automating camera motion in sports can be achieved by using machine vision to analyse ball and player positions in real time, feeding those positions to the robotic cameras and automating the camera motion. Advanced algorithms working in real time are used to frame the shots in a fluid and highly adaptive way to provide this level of automation.
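As a rough sketch of that loop (not any vendor's implementation; the names, limits and smoothing factor below are invented for illustration), the core idea is to turn each tracked ball position into a smoothed, rate-limited pan target for the robotic head:

```python
import math

MAX_PAN_SPEED = 120.0   # degrees per second the head is assumed to manage
SMOOTHING = 0.15        # low-pass factor: lower = smoother but laggier framing

def frame_ball(current_pan, ball_xy, camera_xy, dt):
    """Return the next pan angle (degrees) that keeps the tracked ball near frame centre."""
    dx = ball_xy[0] - camera_xy[0]
    dy = ball_xy[1] - camera_xy[1]
    target_pan = math.degrees(math.atan2(dy, dx))                    # bearing to the ball
    desired = current_pan + SMOOTHING * (target_pan - current_pan)   # ease toward it, don't snap
    step = max(-MAX_PAN_SPEED * dt, min(MAX_PAN_SPEED * dt, desired - current_pan))
    return current_pan + step                                        # rate-limited move sent to the head

# One control tick at 50 Hz: ball at pitch position (30, 12) metres, camera at the origin
new_pan = frame_ball(current_pan=0.0, ball_xy=(30.0, 12.0), camera_xy=(0.0, 0.0), dt=0.02)
```

A real system would also handle tilt, zoom and prediction, and the kind of operator override described below could simply replace the automated target while the loop keeps running.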
Polycam Player is one example of a system that provides automation for certain camera positions in football using robotics.
Some automated systems use multiple camera views stitched together to form a panoramic view, with virtual camera motion created by cropping out of the larger image. Due to the nature of the stitched-image method, resolution is lost through digital zooming and the scene is presented from a single, fixed perspective.
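A rough worked example of that resolution trade-off, using assumed camera counts and crop sizes:

```python
# Assumed figures only: a panorama stitched from three UHD cameras covering the
# whole pitch, versus a virtual camera that crops roughly a sixth of it.
panorama_width_px = 3 * 3840            # 11,520 px across the full pitch
crop_width_px = panorama_width_px // 6  # 1,920 px in the cropped "camera" view
print(crop_width_px)                    # 1920: delivering this at 1080p leaves no headroom,
                                        # so any tighter framing means upscaling, whereas an
                                        # optical zoom keeps full sensor resolution on the subject
```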
“Robotic cameras offer the level of optical resolution for close-up action and work in real time, fitting into existing multi-camera productions and offering a range of angles that enhance the production,” says Rawner. “Integrating robotics into existing workflows allows for the best of both worlds – great storytelling and emotive shots from manned camera positions, and consistency of coverage, space saving and unique angles from automated cameras. However, within any automated live event camera workflow there needs to be, in our experience, a level of human intervention that is seamless to the operation.”
He says that both MRMC’s Polycam Player and Chat solutions allow users to adjust framing within the automated tracking without having to switch off the tracking. Automation continues to operate behind the manual operation, allowing the user to relinquish control at any time without adversely affecting the camera motion.
“We believe augmenting manual control with auto tracking or auto tracking with manual control is key to the level of hybrid interoperability required to deliver the best range of options as the industry adapts to the benefits of new technologies and workflows.”

Applications
MRMC has been involved in Sky Sports’ coverage of the PDC World Championship Darts events since 2013. The productions use a range of robotics systems, including Ulti-Heads, SFH50 and AFC-100. The robotic heads support both full-size and compact cameras (HDC-1500, 4300 and P1s) and 40x lenses. Control is extended to a remote position and operated either by encoded pan bars or the new dartboard-inspired touchscreen MHC controller, designed specifically for the two main front-of-stage camera positions.
Russia World Cup
MRMC provided a number of high-payload Ulti-Heads, together with multi-axis StudioBot robotics, for the Fox Sports Red Square studio. All of the camera moves were fully programmable with manual joystick override and provided real-time positional data to the AR graphics engine.
 Australian Open Tennis
Earlier this year MRMC provided rental AFC-100 systems to Gearhouse Broadcast for its coverage of the Australian Open tennis. The system’s form factor, payload capacity and IP control allowed Gearhouse to integrate the robotics into its production workflow. The heads were placed on top of high-rise buildings to provide stable, high overhead shots of the tournament. System control was linked to an operator position over a kilometre away via RF, providing completely wireless pan, tilt, housing wiper, camera and lens control. Gearhouse subsequently invested in its own AFC-100 stock for a fully integrated robotic camera rental service.


Wednesday 17 October 2018

Editing and graphics trends

InBroadcast 

Editing tools become lightweight clients in the cloud, while virtual set integrations with games engines render broadcast graphics photoreal.


As the demand for OTT, mobile and social video continues to accelerate, publishers need to expand the volume of content they create and increase the speed at which they produce it. One way of accelerating production workflows to meet the demands of mobile and digital audiences is to host the editing application in the cloud.
Grabyo Editor is a set of browser-based editing tools for creating and distributing short video clips, highlights and social video from live streams, VOD and mobile sources. Combined with Grabyo’s live production platform and publishing capabilities, it will offer digital teams a cloud-based platform for fast, flexible editing with distribution to social, mobile and OTT platforms.
The browser-based experience will support picture-in-picture frames, graphics overlays, audio controls, and time-based asset-splitting.
“By removing the complexity of more traditional editing software, users will be able to take advantage of powerful video editing features without the need for extensive training or specific hardware or software,” Gareth Capon, Grabyo CEO, said.
Adobe Creative Cloud updates include selective colour grading and management with new Lumetri Color tools in Premiere Pro and After Effects; new tools for 180-degree immersive video, including ingest, effects and output in Google VR 180 for viewing on YouTube; and audio clean-up features that let you dial down or remove background noise and reverb from a sound clip.
Perhaps the more significant announcement from Adobe is the beta of Project Rush, a video editing app available on mobile and desktop designed specifically for online content creators. Work automatically syncs to the cloud, so you can start on your phone or tablet and move to your laptop for further editing. Rush has all the same features on mobile and desktop, allowing you to work wherever you want without losing creative flexibility.
Rush includes colour correction and uses Audition to auto-detect audio to improve sound quality and reduce background noise. Adobe Stock is integrated right into Rush, for access to a constant stream of new Motion Graphics templates and it should be easy to create and publish different versions of video to suit different social media outlets.
Forbidden Technologies has what it calls “the workstation experience” in the cloud. Now released for Mac as well as Linux, Blackbird Edge enables video ingest and editing for teams in remote locations with bandwidth as low as 2Mb per second. The Blackbird codec transcodes the video and uploads it to Microsoft Azure for collaborative production.
It plays edited media directly without any need for pre-rendering, and supports jog and shuttle as well as any playback speed and direction. It is claimed to be the only solution to provide such responsive, frame-accurate navigation in the cloud.
“Blackbird Edge also removes the need for original sources to be uploaded before cloud editing can start, with the Blackbird video being used to create the editing decisions, which are later reflected in a full-resolution render,” explains Forbidden CEO Ian McDonough. “With the ability to publish locally, in many workflows only the final high-res version of the edit is uploaded to the cloud for distribution, reducing internet bandwidth requirement at the Edge location by up to 99%.”
Avid focused on Avid On Demand, software as a service running on Microsoft Azure that allows users – principally large editorial teams – to access web-based versions of MediaCentral, the company’s media production and management platform.
Avid Nexis E5 NL is a new storage solution that sports a web-based app for managing, controlling and monitoring its installation. The appliance can be accessed through MediaCentral Cloud UX or Media Composer, and also integrates with Avid’s production management system to drive collaboration.   
Other announcements from the firm include NewTek NDI (network device interface) output from Media Composer and Avid Artist DNxIP for media workflow connectivity supporting SMPTE 2022-6.
Explained Jeff Rosica, Avid CEO and president: “We’ve accelerated our delivery of open platforms, tools, apps, services and solutions to make it easier for our customers to end disparity among technology in their operations, so they can work faster on smaller budgets, and still thrill and retain viewers on any platform.”
Marquis Broadcast’s Fotonflite can be used as a point-to-point transfer system for Media Composer projects and media workspaces. It allows files to be moved between source Avid ISIS/NEXIS systems and a range of target storage types, including Avid, generic and proprietary storage. This tool is claimed to be unique in its ability to directly connect and securely synchronise live Avid ISIS/NEXIS systems, especially with Avid work-in-progress, and is ideal for connecting production centres over the internet, for example, Pinewood and Hollywood.
Games engines
Games engines are being integrated into film and broadcast virtual studio environments, where their superior rendering power makes photorealistic scenery possible.
Frontier is Ross Video’s graphics platform based on Epic Games’ Unreal Engine and used on The Future Group game show Lost in Time. It uses Ross’ UX control platform as an operator-friendly front end, so operators are not required to know Unreal to use the system.
Brainstorm’s virtual set and AR solution InfinitySet 3 allows for the combination of the Brainstorm eStudio renderer alongside Unreal. Merging of Aston graphics projects within InfinitySet adds yet another level of potentially visually engaging content for virtual studio environments. The latest features include dynamic control of external lights and chroma keyer settings.
“This virtual and augmented reality solution is the benchmark for high-end, real-time photorealistic virtual studio content,” declares David Alexander, Brainstorm’s commercial director. “Technologies like TrackFree or unparalleled features such as TeleTransporter, 3D Presenter or VideoGate are helping customers to engage audiences while significantly reducing costs.”
Vizrt and tracking company Ncam Technologies have demonstrated how the latest version of NcamAR for UE4 integrated with Viz Engine can combine template-based graphics and texts with Unreal Engine game assets for enhanced storytelling within live productions.
Mo-Sys has combined its real-time StarTracker camera tracking technology with Unreal Engine’s interface and the ray-tracing render strengths of Chaos Group’s V-Ray to create what it describes as a complete, automated virtual production workflow for VFX. Projects with more modest budgets – such as soaps, commercials, corporate videos and micro-budget feature films – can use it to create “a superior render quality” in real time and near-real time, according to Mo-Sys founder Michael Geissler.
“Before StarTrackerVFX, it was difficult and expensive for studios to film on a green screen when the camera wasn’t stationary. But StarTrackerVFX’s integration with the likes of Chaos Group has now given all-sized budgets the ability to precisely match the real camera’s position, orientation and lens distortion with the virtual world.”
Broadcast graphics
Vizrt’s render engine with Viz Artist 3.11 features new bone and skin-based mesh deformation tools as well as real-time motion capture and photorealistic rendering. Three-dimensional models can be imported with predefined animations, but can also be driven by live motion capture streamed to Viz Engine. Vizrt has also implemented HDR in Viz Engine supporting HLG and PQ HDR formats and compatibility with the S-Log3 HDR format developed by Sony.
Ross Video’s motion graphics engine XPression v8.5 includes NLE plug-in support for Adobe Premiere Pro for Windows, an HTML5 version of the XPression MOS plug-in and enhancements to XPression’s DataLinq tools. A 64-bit edition of the product extends XPression’s cache management, allowing for larger XPression projects. As part of this, users will also find support for importing Cinema4D models and scenes.
LyricX 3.4 is the latest iteration of ChyronHego’s graphics creation and playout solution. It now includes support for DNxHD with a new GTC clip player and clip workflow tools such as a GTC clip converter.
PRIME Graphics 3.1 addresses five use cases within a single graphics design and playout platform: character generation, a clip player, a video wall solution, a graphics-driven touch-screen platform, and a branding solution. ChyronHego’s CAMIO 4.5, the company’s graphic asset management solution, includes a new render engine; the ability to publish content straight from the newsroom to social media; and the latest version of LUCI5, an HTML5 plugin.
The firm’s weather graphics solution Metacast 2.6 arrives fully integrated with the CAMIO Universe and adds new features such as advanced animations for all layers, including weather model layers. In addition, Metacast also includes new playout tools that make efficient use of the keyboard for cueing and taking graphics to air.
VFX
Foundry’s 3D painting and texturing tool Mari 4.2 introduces mirror projection features that bring simultaneous, symmetrical painting workflows to artists without the need for specialised UV layouts. Previously, explains the developer, painting the same designs on both sides of a symmetrically formed model required a considerable amount of asset preparation and a lot of repetitive actions. Now, artists can paint on one side of a mirror plane while Mari projects the same paint to the other side, dramatically increasing efficiency.
This is the second major Mari update in 2018. The next feature release will feature the debut of the Mari Material System. This will ensure Mari can utilize any texture driven material data in a material painting workflow, and will be rolled out in phases.
There can be few VFX facilities which don’t have a tool or two from Boris FX. The developer is launching new versions of its flagship products next year but gave a sneak peek to IBC visitors.
According to founder Boris Yamnitsky: “Fans of Particle Illusion, now part of Continuum 2019, will love creating stunning motion graphics elements with an easy-to-learn interface and blazing GPU acceleration. The legendary Lens Flare inside Sapphire 2019 has been improved with a new streamlined designer UI and professional presets, and Mocha Pro’s new workspace and magnetic spline tools deliver significant advances in motion tracking and roto-masking.”
It is also releasing a new Boris FX App Manager that will make activating and deactivating Sapphire, Continuum, and Mocha Pro licenses a breeze.
The big news from Blackmagic Design is the introduction of a RAW codec for its cameras including the URSA Mini Pro. This is intended to retain more colour information from acquisition throughout post. Kees Van Oostrum, President of the ASC, commended the initiative, saying Blackmagic RAW “could entirely change the workflow going from camera through post production. It will be an important change for post because the editorial team can work with the camera original files, which are fast enough to use for everyday editing. That means less confusion in regards to creative choices I make at the camera. The images can now travel throughout the entire workflow because we’re shooting, editing and grading with the same files.”
Support for the codec is available in DaVinci Resolve from the BMD support website as a free upgrade for existing users. It joins the now-shipping version of Resolve 15 which, among a myriad of other things, adds an entirely new Fusion page with over 250 tools for compositing, paint, particles and animated titles, and includes a major update to Fairlight audio.
Netflix Post Alliance
Finally, it’s worth noting the formation by Netflix of a Post Technology Alliance of vendors, intended to raise the production value of Netflix content, particularly around UHD and HDR. Products from selected vendors meeting Netflix specifications will be certified with a logo. Members of the Alliance include vendors involved in cameras, creative editorial, grading and IMF packaging, with products from Adobe, Arri, Avid, Blackmagic, Canon, Colorfront, Fraunhofer IIS, Filmlight, Marquise, MTI Film, Ownzones, Panasonic, Red, Rohde & Schwarz, and Sony.