Friday 31 May 2019

BT Sport on target for HDR first at UEFA Champions League Final


IBC
BT Sport COO Jamie Hindhaugh outlines how the broadcaster’s Champions League final coverage will be the first time a live sports event will be broadcast in HDR to mobile devices.
The all-English clash in the UEFA Champions League Final will make a winner out of BT Sport, which aims to prise maximum publicity out of the expected massive domestic interest in the game between Liverpool and Tottenham Hotspur.
For a start, BT will ensure its coverage can be watched by anyone by broadcasting it for free on YouTube.
“The aim is to make this the most connected Champions League Final ever,” asserts Jamie Hindhaugh, chief operating officer, BT Sport.
The event on June 1 will also act as a showcase for BT Sport’s High Dynamic Range (HDR) offering as the telco seeks to wrest the title of most innovative broadcaster from Sky. The company claims this will mark the first time a live sports event has ever been broadcast in HDR to mobile devices.
Mediapro is broadcasting UEFA’s world feed, but it will not feature HDR, so BT Sport has taken upon itself the task of producing separate HDR-specific coverage.
In fact, it will produce three entirely separate versions of the live match from three OB trucks at the Wanda Metropolitano stadium in Madrid – home ground of Spanish side Atlético Madrid.
Aside from a version of the host feed, which will be augmented by BT’s unilateral cameras and presentation, it will produce a different BT-specific HDR version and another bespoke one in 360 VR.
The world feed is produced in 4K UHD with Dolby Atmos for distribution in the UK on BT Sport channels (also carried by Virgin and Sky) and, without Atmos, on BT Sport’s YouTube channel and website.
Main match coverage will follow the template laid out in previous Champions League finals: a host feed of approximately 50 cameras, into which BT will insert replays for its own presentation.
Gary Lineker leads studio punditry with talent including Rio Ferdinand. Additional BT crews will be at fan parks in Tottenham and Liverpool for pre-match, halftime and post-match reaction.
The HDR feed is being produced out of a Telegenic truck and made available to BT Sport’s 5 million+ subscribers through the BT Sport app on mobile devices – taking advantage of the HDR displays on most new handsets – as well as on devices including Xbox One, Samsung Smart TVs and Apple TV.
It’s a curtain raiser to an HDR mobile service launching for the start of the next football season, in which up to 70 events a year will be produced live in HDR.
“The HDR feed will be the only place to see the whole of the opening ceremony,” Hindhaugh says. “With HDR you see the real pitch colour, detail and colour in the shirts and you’re much more immersed in the game.”
He explains: “When we’ve done HDR trials previously we’ve done so as a single workflow. Because of the nature of the event in which we’re not the host we are doing a separate coverage from 20 camera positions. The key is to make sure the World Feed stays clean.”
There are four broadcast cameras behind each goal for example: two for the host and two for BT Sport’s HDR output. There will be a bespoke commentary.
“We need to replicate the normal camera plan as far as possible. It needs to be a compelling watch. This is no experiment,” adds Hindhaugh.
“With our expertise in HDR we think we are further ahead of the game than most other broadcasters. Even when we’ve output SDR feeds recently we’ve been down-converting those from HDR.”
The format is HDR10 PQ since alternative format HLG works less well on mobile – and mobile is the telco’s primary target for the HDR service, particularly given the launch on 30 May of 5G networks in six UK cities by BT-owned mobile operator EE.
The HDR is being output in HD, even though 4K resolution is possible. “To be frank if we’re looking at mobile first then data usage for 4K is prohibitive whereas with HD HDR it is minimal. Also, when you consider the size of the mobile screen then adding 4K adds nothing [in terms of perceptual image quality]. Most of our perception of depth comes from contrast, not resolution.”
While its strategy for mobile is HD HDR, users will also be able to get 4K HDR when the proposition launches.
“Depending on what device and connectivity you have, you will get different flavours of the same event.”
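The data-usage argument behind choosing HD over 4K for mobile can be made concrete with some back-of-envelope arithmetic. The bitrates below are illustrative assumptions, not figures from BT Sport:

```python
# Rough illustration of the mobile data argument: per-hour data usage
# at illustrative (assumed, not BT-confirmed) streaming bitrates.

def gb_per_hour(bitrate_mbps: float) -> float:
    """Convert a video bitrate in Mbit/s to data used per hour in GB."""
    return bitrate_mbps * 3600 / 8 / 1000  # Mbit/s -> seconds -> bytes -> GB

# Assumed ballpark bitrates for HEVC-encoded streams.
hd_hdr_mbps = 8    # 1080p HDR
uhd_hdr_mbps = 25  # 2160p HDR

print(f"HD HDR: ~{gb_per_hour(hd_hdr_mbps):.1f} GB per hour")
print(f"4K HDR: ~{gb_per_hour(uhd_hdr_mbps):.1f} GB per hour")
```

At these assumed rates a full match in 4K would use roughly three times the data of the HD HDR stream, which is the trade-off Hindhaugh is pointing at.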
A third BT truck is dedicated to 360-VR. This will be captured in 4K (output as HD), unlike a trial during the FA Cup Final earlier this month which was produced in 8K.
The VR from Madrid will be captured from 12 multi-camera rigs, stitched live and directed with separate commentary from Spencer Owen. BT has been producing 360-degree highlights for major events for some time and plans to continue that going forward, but for this Champions League Final it will live stream the entire match in virtual reality.
“We can offer different angles to that which you would normally get,” explains Hindhaugh.
“Ultimately where we want to migrate to is to capture VR in 8K and offer users the chance to ‘pinch and zoom’ into the picture on their phones,” he says. “The image quality at 8K is so much better for this interactive experience.”
The 8K test at Wembley used two rig positions and Blackmagic Design Micro Studio cameras. Pictures were rendered using Tiledmedia’s encoding software and switched using the BMD ATEM Constellation 8K.
Hindhaugh says: “I’ve always believed that people will gravitate to watch events like this on the biggest screen available, but we are also about giving different viewing experiences, all of which will take fans close to the heart of the live sport.”
BT Sport debuted as Europe’s first 4K broadcaster in 2016, produced the first 4K UHD host broadcast of a Champions League Final in 2017 from Cardiff, and made a worldwide debut of Dolby Atmos broadcast sound in 2017.
“What we are doing in Madrid, we did at Cardiff only in much more trial form,” he says. “It’s easier when you’re in control of the end-to-end chain; when you are not, it creates a bigger challenge. This final is also not on our doorstep, so the logistics are more challenging, but that is a challenge we are comfortable with.”
A few years ago, when Sky was the only game in town, it was pursuing stereo 3D as the height of the live sport viewing experience. Hindhaugh doesn’t think it will make a comeback.
“I’ve always said we’ll never do it. When you see HDR in purist quality with 4K then it looks three dimensional. I don’t think 3D works for live sport since sport is a social event you want to watch with friends, not behind goggles.”
Hindhaugh was speaking to IBC365 from Madrid having just flown in from Baku, where he oversaw BT Sport’s coverage of the other all-English final – the UEFA Europa League. He acknowledged that the atmosphere before and during the game suffered from it being held in a location many fans could not get to.
He adds: “Baku went as well as it could when you have a stadium that size and the pitch is a long way from the seats. We can only work with the stadium we are in and the audience who are there, but our team covered it well.”
There will be no such doubts ahead of Saturday’s match which will see England dominate Europe for a short while at least.


How Formula E and Virtually Live are pioneering a live broadcast-game hybrid


SVG Europe 
Formula E is arguably the most progressive of sports in its bid to capture a new generation of fans through media tech but its latest initiative could top the lot. Ghost Racing is the world’s first live racing game to allow gamers to race live, in real time, with real drivers, on the real tracks of the ABB FIA Formula E Championship.
While pioneered by the all-electric all city-circuit motorsport series, the technology behind the game could be applied to “speed bikes, yachts or camels,” according to the CEO of developer Virtually Live.
“With Formula E we have accomplished a quantum leap in gaming, full of innovation including the world’s first in-game live race commentary, laying the foundation for Virtually Live’s success in the esports industry,” Markus Tellenbach told SVG Europe.
Ghost Racing launched in April to coincide with the Formula E-Prix in Paris. It is free-to-play and available on iOS and Android mobile devices. It combines live racing against real Formula E race drivers with hyper-realistic scenery and graphics, engaging online live commentary, upgradeable car collection and continuously updated challenges.
“The backbone of live is broadcast and without it a sport’s sponsors and share of fans nosedives. Live broadcast is the bloodstream of big sports events but it’s obvious that’s been under threat for some time from advances in distribution that have created an irreversible change in consumer behaviour.”
Tellenbach speaks with some authority as a veteran of the international media scene. Before launching Virtually Live he had been running Polish media group TVN for seven years, having previously been at companies including SBS Broadcasting, Sky Deutschland and Kirch.
“To Generation X, broadcast is non-existent,” he stressed. “That is extremely alarming for any sport because if you can’t create a fan following for your product then you have a long term structural issue.
“What you can do, though, is stream content across devices and match their expectation for snackable content. Without streamed and snackable content the younger generation are not interested. Period.”
Tellenbach launched Zurich-headquartered Virtually Live in 2017 having identified gamification of sports as the type of content that could match the expectations of Gen-X, and gamers in particular.
“Of course, gamification itself is not new,” Tellenbach said. “There is a great library of video games for each major sport out there. Anyone can create a fantastic sports video game but there’s never been a solution for live sport. That’s where it gets difficult.”
He found a partner in Alejandro Agag, founder and CEO of Formula E, willing to get onboard. “Alejandro is a visionary who thinks in a fan-centric way,” Tellenbach said. “When you ask younger demographics what Formula One means they will reply ‘Formula What?’ but Formula E has probably the most fan-centric strategy in sport since it reaches out to younger fans.
“Alejandro understood the idea of gamification as one way of reaching Gen X. Our vision was to create an immersive racing experience far beyond what existing racing games offer.”
Virtually Live’s starting point was much the same as any other computer game developer in rendering ‘scenery’ (race tracks) and ‘assets’ (cars) as accurately as possible in graphics.
“We start about 4-5 weeks prior to each race to produce the basic scenery and we will continue to adjust and finalise it up to a few days before the race until we have 99% accuracy,” explains Tellenbach.
The main sources for the scenery are CAD drawings of each city track, its buildings and installations. Other sources include Google Earth, stills photos and drone video taken by Virtually Live at each circuit, and information from Formula E on the exact placement and content of billboards.
The next stage – and the critical element that makes the game live – is to ingest the assets into the CG scenery using realtime positional data.
For this they turned to Italian tech company Magneti Marelli which devised a GPS-based sensor, approved by the series’ regulatory body the FIA, for fitting to each of the cars.
“This allows us to track each car with near zero latency – a few milliseconds – and much faster than a broadcast signal.”
The GPS-driven data provides time and position and, therefore, speed. This is processed using Virtually Live’s proprietary machine learning engine to track the position of all 22 assets (cars) racing at 100mph or more, often wheel to wheel, and to deliver this contextualised in the scenery as a live stream.
“The way we treat the raw data from the sensor is vital in creating a continued CGI stream and one in which we have a lot of patents,” said Tellenbach. Treatment includes various algorithmic filters, validations and predictions to define the exact position of each car.
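Virtually Live’s filters and predictions are proprietary and patented, but the general technique of smoothing noisy positional samples while predicting ahead can be sketched with a textbook alpha-beta filter. This is an illustrative sketch only, not the company’s actual algorithm:

```python
# A generic alpha-beta filter of the kind used to smooth and predict
# positions from noisy sensor samples. Virtually Live's actual filters
# are proprietary; this is only an illustrative sketch.

def alpha_beta_filter(samples, dt, alpha=0.85, beta=0.005):
    """Return smoothed positions from noisy 1-D position samples."""
    x, v = samples[0], 0.0  # initial position estimate and velocity
    smoothed = []
    for z in samples:
        x_pred = x + v * dt        # predict position one step ahead
        residual = z - x_pred      # innovation: measurement vs prediction
        x = x_pred + alpha * residual
        v = v + (beta / dt) * residual
        smoothed.append(x)
    return smoothed

# Example: a car moving at a steady rate, sampled once per time step.
track = alpha_beta_filter([0.0, 1.1, 1.9, 3.2, 3.9], dt=1.0)
print([round(p, 2) for p in track])
```

A real system would run this (or a Kalman-style equivalent) per axis per car, with the prediction step also covering sensor dropouts between updates.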
The resultant virtual stream can be applied in a number of ways. Users can, for example, select any viewing angle: following their favourite driver, taking a seat in any cockpit live during the race, or watching from the tribune.
“You make your own viewing experience and, if you wish, you can rebroadcast it live on Twitch or YouTube. You can be a professional video-blogger or a car manufacturer – anyone can upload it and create a dedicated fan channel.”
The live CGI stream is an actual replication of the live event and allows Virtually Live to ingest a 23rd car onto the grid. This is Ghost Racing.
“A user can pilot their own car in real time in a real event,” said Tellenbach.
Formula E’s broadcast production partner Aurora Media is also able to utilise the CGI feed within its output. This includes using the live CGI feed as a virtual track guide, highlighting ideal braking lines, tactical considerations or when to use Attack Mode. This feature, introduced this season, temporarily boosts energy levels from 200kW to 225kW and gives drivers a chance to chase down their rivals or build a lead to defend position.
“[The virtual stream] is a perfect tool to look at certain perspectives such as cars travelling through a tunnel where a slightly different angle gives you a different perspective on the driver’s performance.”
Braking points and steering wheel angle are also used to create fidelity with the live action. “It will be increasingly used as an analytics tool integrated into the broadcast to highlight, for example, which driver had a better line and how it impacted their race. The positional data was only used previously for marshalling purposes and not for broadcast. This is new to Formula E and will develop massively, allowing broadcasters to explain in richer detail to an audience why an event has happened.”
It is increasingly possible that the worlds of esports and actual physical sport will merge. Brendon Leigh, a 19-year-old from Oxfordshire, has won the F1 Esports Series tournament for the past couple of years playing 25-minute races simulated on existing Grand Prix circuits. Esports players like him are being scouted by motorsport teams alongside the traditional routes of karting.
“He has been posting lap times ahead of official drive times, so I wouldn’t be surprised if, in the very foreseeable future, an esports player finds their way into a cockpit for real,” Tellenbach said.
It’s still very early days but Tellenbach reveals that the number of downloads for Ghost Racing has sped past 100,000. In development are plans to extend the product to PC and more serious rather than casual gamers. The quality of graphics will be upped too, including adding humidity, improved lighting and weather conditions for next season.
Nor is Virtually Live’s product restricted to automotive sports; it could be applied to ‘Ghost Sailing’ events like a Volvo Ocean Race “or anything where participation makes sense” – such as camel racing.
Other sports require a little more imagination. Tracking the position and action of football players during a live game, for example, can be done now.
“We need to understand the contextual side of how a gamer is involved in live football,” he said. “We have some ideas.”
The co-inventors of Virtually Live are Jesús Hormigo and Jamil El-Imad. Hormigo is the firm’s co-founder and CTO who previously led the technology R&D teams at NeuroPro, a Swiss based medical company that develops innovative solutions for brain data capture and analysis. El-Imad is a former software engineer developing code on IBM systems. He’s an Honorary Senior Research Fellow at the Institute of Biomedical Engineering at Imperial College where his team is developing algorithms for epilepsy prediction and other neurological disorders. He is also launching a bio-sensory cloud service offering big data management, storage and processing solutions.

Thursday 30 May 2019

Post grapples with gender gap

Broadcast
More women in senior management positions but under-representation continues in tech roles.
Women are securing more creative post-production roles but remain under-represented in the sector, several high-level execs have warned.
While women are now better represented in senior management positions, the number in engineering, sales, operations and technical jobs remains much lower than the number of men employed in similar roles, said Carrie Wootten, director of Rise, a broadcast tech support network for women.
“Businesses are struggling to find female talent,” she said. “The pipeline of talent entering the industry with diverse characteristics just isn’t there and until this is changed, the make-up of the sector will remain the same.”
Companies are taking several approaches to changing the narrative. Technicolor subsidiary Mill Film has set a target of reaching a 50/50 gender balance in its creative workforce by 2020.
Post house Envy has a 60-40 gender split in favour of men, which co-founder and creative director Natascha Cadle wants to improve.
“We know diversity is vital to our business, so we go around the whole country visiting colleges and universities to talk about post-production and find the next generation of talent from all sorts of backgrounds,” she said.
The Farm Group has found maintaining a strict 50% hiring rule tough as application numbers aren’t commensurate. “Women are typically underrepresented in some technical roles,” said Nicky Sargent, joint chief executive. “However, we have roughly 70-80% women in management and production roles.”
Rise’s research suggests less than 2% of chief executives running technology divisions in the broadcast sector are female. Sadie Groom, founder of TV tech PR firm Bubble, said change is happening, with more women “staying in the industry” and “raising their own profiles to act as role models”.
However, client expectations that senior-level production or creative staff should be available at all hours have hindered progress, as they discount childcare issues, said The Farm’s Sargent.
Initiatives for change include Rise’s mentoring programme, Women in Tech, and the Mama Youth Project, which is supported by Tinopolis, Fremantle, The Farm, Warner Bros and Procam.
“As a female business owner with a female business partner, we know we are rare,” said Lucy Ainsworth-Taylor, chief executive and founder of Bluebolt.
“On the VFX facility side, women at the top are few and far between. But as a new industry, these areas take time to grow foundations and get talent trained up. I hope we will see a lot of change over the next five years.”
Goldcrest Films senior colourist Jet Omoshebi added: “There has been a concerted effort to employ more women, but the lack of people with an ethnic background has not been addressed with anywhere like the same vigour.
“We must start to generate more ethnic diversity by being open and proactive about our hiring choices, training programmes and grassroots outreach. There has to be a willingness to make a change.”

Wednesday 29 May 2019

Mainstreaming Targets New Lows in Encoding and Latency

Streaming Media

Can a 60-minute HD video be encoded in less than 2 minutes? Italian startup MainStreaming claims its technology can – and, what’s more, it says it can do so fifteen times faster than Amazon.


Armed with a fresh multimillion-dollar investment the company is now looking to expand its "half world presence" in Europe and the U.S. by targeting Asia, and broadcaster video streaming and online gaming markets in particular.
"Our core business is focussed on helping improve quality of service (QoS) for our customers," explains CEO Antonio Corrado. "We are not a traditional CDN, since we have control over the entire stack."

Corrado, who worked for Computer Associates and IBM, describes himself as an entrepreneur. He founded Mainsoft Group in 2000, developing software for banks and telcos, and was its CEO until 2015, before leaving to start MainStreaming with fellow Mainsoft executive Giovanni Proscia, now MainStreaming’s CTO.
"The problem we wanted to solve was the universal issue of buffering, lag, and poor QoS benchmarked by Conviva, which was turning viewers away from online," he explains. "CDNs are great for caching but less successful at streaming. We are built for real-time video streaming."
The Milan-based company cites a recent report from PwC, which found that consumers consider the quality of the user experience just as important as content. This explains, it says, why streaming platforms are trying to gain a competitive edge on their rivals through delivery of the best streaming experience possible.
A key client is Sky Italia, which uses the service to stream premium live, linear, and on-demand content directly to 4 million subscribers on the Sky Go, Sky Q, and Now TV platforms.
"Italy is often seen as a lab for advanced technology," Corrado says. He adds that the company’s solution could be adopted by Sky Deutschland—which, like Sky Italia, is part of Comcast.
In the U.S., it is working with the Denver Broncos, Rolling Stone, and tier 2 broadcasters, with further customer announcements pending.
Earlier this month it announced that it had secured $6 million in a new round of funding, led by Indaco Venture Partners, together with Sony Innovation Fund, and existing investor United Ventures. This brings the overall amount of funding to $10 million.
Indaco Venture Partners aims to help MainStreaming focus its efforts on expanding in international markets, particularly Japan. Sony's participation is intended to help accelerate delivery of cloud gaming.
Corrado calls cloud gaming an "infinity business," adding, "Gaming companies don’t want to sell hardware at all in 4-5 years' time but to give customers a Netflix-like service."
"Current best efforts for streaming games services is more than 50 ms of delay. Our network has the ability to cover Europe in less than 25 ms, halving the latency and making online gaming a reality." 
MainStreaming has built a fully vertically integrated solution that it says solves all the main streaming workflow stress points. 
"Our proprietary HyperNode technology focuses on vertically integrating every phase in the streaming environment to offer a complete solution or one that integrates with your existing workflow."
Its suite includes solutions for origin ingest, storage, encode, transcode and transmux, hosting, management and API integration and delivery.
Its Ultra-Fast Encoding technology is compatible with formats and protocols for on-demand content including HLS, MPEG-DASH, WebRTC, and HSS, with claimed ABR proficiency and support for repacketisation and progressive downloading.
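As an illustration of the ABR formats mentioned, an HLS delivery exposes a master playlist advertising several renditions, from which the player picks according to available bandwidth. The ladder values here are assumed for illustration, not MainStreaming’s:

```
#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=800000,RESOLUTION=640x360
360p/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=2500000,RESOLUTION=1280x720
720p/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=5000000,RESOLUTION=1920x1080
1080p/index.m3u8
```

MPEG-DASH works along the same lines with an MPD manifest in place of the playlist; repacketisation lets one encode serve both.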
The company claims its delivery throughput is 3x faster than Conviva’s Average Bitrate benchmark.
Using high-performance computing, MainStreaming also says that in side-by-side tests, content using its technology is encoded faster than competitors. A 60-minute 1080p video, for example, can be produced in five different formats in 2 minutes, versus 85 minutes for Zencoder, 50 minutes for Encoding.com, and 31 minutes for AWS Elastic Transcoder, its own research suggests.
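Taking the company’s own figures at face value, the claimed speedups are straightforward to work out:

```python
# Speedup factors implied by MainStreaming's own benchmark figures
# (minutes to produce five formats of a 60-minute 1080p video).

times_min = {
    "MainStreaming": 2,
    "Zencoder": 85,
    "Encoding.com": 50,
    "AWS Elastic Transcoder": 31,
}

base = times_min["MainStreaming"]
for name, minutes in times_min.items():
    if name != "MainStreaming":
        # Ratio of competitor time to MainStreaming's claimed time.
        print(f"{name}: {minutes / base:.1f}x slower")
```

The AWS figure works out at roughly 15x, which is where the "fifteen times faster than Amazon" claim comes from.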
"Another reason our solution is unique is that we are the only company optimising routing in real time and delivering an incredibly stable connection to users," Corrado says. "Akamai or Limelight normally manage the cache with no routing policy change. We don’t cache. If we observe congestion in a specific territory affecting QoS for 200 users, our algorithm can decide to change the path to those 200 users to try to solve the problem."
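The congestion-triggered rerouting Corrado describes can be sketched in a few lines. This is a deliberately simplified illustration of the decision, not MainStreaming’s actual algorithm:

```python
# Illustrative sketch of congestion-triggered rerouting: if QoS on the
# current path drops below a threshold, move the affected users to the
# best-scoring alternative. Hypothetical logic, not MainStreaming's.

def pick_path(paths: dict, current: str, qos_threshold: float) -> str:
    """Return the path to use; switch only when the current path's
    QoS score falls below the threshold."""
    if paths[current] >= qos_threshold:
        return current  # no congestion: leave users where they are
    # Congestion detected: move affected users to the healthiest path.
    return max(paths, key=paths.get)

# Example: route-a is congested, so affected users are moved to route-b.
paths = {"route-a": 0.62, "route-b": 0.95, "route-c": 0.88}
print(pick_path(paths, current="route-a", qos_threshold=0.8))  # route-b
```

In practice such a decision would also weigh switching cost and hysteresis so users are not bounced between paths on transient dips.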

Tuesday 28 May 2019

Zappware Strategy Summit: WIND Hellas reinvents TV experience with UX Focus

DTVE
A year ago, Greek telco WIND Hellas launched its new multi-screen pay-TV service in Greece. Built and operated by Zappware as a turnkey service, WIND VISION has taken the market by storm, combining DVB and OTT in one hybrid Android TV box and propelling rapid growth.
“Our goal was to offer our customers a unique TV experience both in terms of technology and content,” explained Hermann Riedl, Chief Strategy & Digital Transformation Officer of WIND Hellas, speaking at Zappware’s Strategy Summit in Ghent, Belgium on May 15. “The problem was that while we were the number three mobile and fixed operator in Greece (with four million mobile subscribers and 600,000 broadband customers) we were the only one in our market not to have a TV product.”
WIND Hellas identified TV as a strategic must-have if it were to grow its market share but knew it would be launching against incumbent OTE, pay TV satellite platform Nova and Vodafone’s IPTV offer.
“Coming last to market with a zero-customer base it was clear to us that going for exclusive content was not an option. For us it would never make sense and be prohibitively expensive. But we did have an opportunity to differentiate ourselves around the UX – if we acted quickly.”
In April 2018, WIND VISION was among the first in Europe to launch with Android TV including 5000+ apps and Google Assistant voice search.
“We required a platform to really engage customers, create loyalty and trigger users to enjoy all the available content. The only viable differentiator for us was the UX.”
The process began in earnest in late 2016 when the company ran an RFP out of which it selected Zappware to bring its high-end TV services to the Greek market.
“We had no idea how to run TV,” Riedl admits. “That’s why we turned to Zappware as our managed service provider and its NeXX 4.0 turnkey video solution.
“If anything did go wrong, we would know who to kick,” he adds, candidly. “But we had no major issues. On the contrary: Zappware built the complete end-to-end system, from satellite dish and video head-end up to a leading-edge UX on the devices, in less than 12 months.”
Zappware’s turnkey solution reaches from video head-end infrastructure and cloud-based ‘back-end’ to client software for Android-TV set-top-boxes, smartphones, tablets and web-browsers. The whole video-chain is provided and managed by Zappware.
Features include linear and catch-up TV, and network recordings. Along with the cloud-based back-end, the Android TV STB platform offers the flexibility to integrate third party content and applications to end-users in one seamless experience.
“The key UX criteria for us were simplicity and openness,” says Riedl. “We believe in forging a new role for media that goes beyond the walled garden of pay.”
To help entice consumers, the operator designed a compact stylish STB which its ad agency described as “iconic” and made the device a centrepiece of its marketing campaign.
“It has a very small form factor – one button, one LED – complemented by fewer buttons on the remote to avoid complicating the user experience,” explains Riedl. There are no colour buttons, for example.
Simplicity also meant WiFi connectivity. “The STB is a full unicast ABR solution to give our customers the option to unicast. Feedback tells us it is the most highly valued feature.”
The UI itself is a “Netflix-like interface” on which WIND exposes catch-up services alongside app integrations. “Some run directly from our UI, including Netflix and Spotify, and the rest from the Android Play Store. The UX is cross-device. A customer’s user name and password for the mobile app is the same for the TV platform.”
By using Android TV, the integration of third-party content from the likes of Netflix and YouTube is done in an efficient way, resulting in a great user experience.
Openness, in WIND’s strategic equation, means dominating HDMI 1. “Our base proposition is pay TV channels with catch-up, plus all FTA channels.”
Riedl explains, “Due to the geographic complexity of the Greek market I cannot make an agreement with channel owners in all 100+ islands but with a DTT tuner we allow any viewer to have any content they want on our box.”
‘Openness’ further means being open to Chromecast and Bluetooth connectivity. The STB is 4K-ready and compatible with HDR10 and HEVC.
“Currently, except for high performance online gaming, you can use our STB for any TV or gaming app you like,” Riedl says.
Zappware’s Amazon AWS-based cloud back-office serves the Android TV boxes as well as the mobile devices with WIND’s new multiscreen TV service. This makes for a very scalable, flexible and future-proof architecture.
By fortune or design, the launch of WIND VISION coincided with that of Netflix in Greece in April 2018. It meant that while not having major TV brands or exclusive content from day one the operator was able to advertise availability of Netflix exclusively on its service.
“One year in and our expectations are more than fulfilled,” Riedl shared. “We have amassed a 40,000-sub base representing about 4% of the market and are on track to grow this rapidly. We have significant churn reduction and our NPS (Net Promoter Score) is a double-digit number of points higher than for our fixed subs without TV.
“What’s more we have achieved a substantial shift in brand image. TV is the key product that differentiates us now. Our competitors simply don’t have this functionality.
“Our users are now able to enjoy a unique TV experience in which smooth integration of all available content is available on all screens. Users are now one click away from their favourite channel, sports game or Netflix series. We now look forward to growing our business with our new attractive household propositions.”

Behind the scenes: Once Upon a Time in Hollywood

IBC
Robert Richardson ASC describes recreating the summer of 1969 for Quentin Tarantino’s new blockbuster.
Any new Quentin Tarantino film is an event, not least because the director is far from prolific. It’s been four years since his last film, The Hateful Eight, was released, during which time anticipation has been building toward his next opus. Once Upon a Time in Hollywood has the added spice of touching on the infamous Tate-LaBianca murders of 1969 and is by all accounts quintessentially Quentin.
The story “oscillates between humorous, serious and spooky,” according to cinematographer Robert Richardson ASC, who spoke exclusively with IBC365.
“I was mesmerised by the sheer spread of the story,” he says.
In classic Tarantino fashion the production is peppered with pulp film and TV references from spaghetti westerns to cult B-movies and comes with a soundtrack of 1970s hits including one from Paul Revere and the Raiders.
Having had a draft of The Hateful Eight leaked online, secrecy was so tight that principal collaborators including Richardson and editor Fred Raskin were only allowed their first look at the script in the privacy of Tarantino’s LA home.
“I hadn’t seen Quentin for a substantial period of time, so we had coffee and just spent a simple hour talking loosely about what had taken place since Hateful Eight,” Richardson explains. “He asked if I wanted to read the new script, sat me down at a small dining room table and put the screenplay before me. While I read, he sat nearby. I knew very little about the subject matter beyond what was circulating on the internet. The screenplay was so rich that I had to take notes to keep up with all that it referenced in respect to films, television shows, actors, directors, music and so on.
“We proceeded to eat a meal with [Tarantino’s then girlfriend, now wife] Daniella and we all spoke about the script as well as my reactions to it. The entire first read was an intimate, almost an out of body experience. I can recall only fragments of that first read. I was spellbound by its richness and texture.”
The Sony Pictures release is set at the height of the counter-culture and centres on Rick Dalton (Leonardo DiCaprio), a washed-up actor struggling to make it in the movies, and his long-time friend and stunt double Cliff Booth (Brad Pitt). They happen to be neighbours to the beautiful, fast-rising star Sharon Tate (Margot Robbie). The action takes place over two days and has a sword of Damocles hanging over its characters because you know that at some point the Manson family is going to show up at Cielo Drive, the fateful Tate/Polanski residence. Also in the ensemble are Al Pacino, Tim Roth, Dakota Fanning, Emile Hirsch, Kurt Russell and the late Luke Perry.
 “It is immensely enjoyable to work with actors of this calibre,” Richardson says. “The combination of putting Leo and Brad together always brought a huge smile to my face.”
There are elements of Pulp Fiction, with its intersecting Los Angeles-based storylines, and of Kill Bill, with the picture journeying through different genres of film with pastiches of TV shows and commercials.
One storyline involves DiCaprio’s character playing the villain for a TV Western called Lancer. The series aired on CBS from 1968-1970, and its pilot was directed by Sam Wanamaker (played in the movie by Nicholas Hammond).
Cultural references
This is Richardson’s sixth film with Tarantino, a relationship that began on the two volumes of Kill Bill. At an early stage on ‘Hollywood’, Richardson says he knew which elements of the script would require a certain look.
“I knew when we would be in black and white or in colour and if we would be 1.33 or 2.40 aspect ratio. I knew if he wanted a contemporary look for the series of commercials that are within the script or whether to create a more retro, less slick feel. The answer was different for the various commercials - one was to be black and white, another with a period feel, another more modern but not stunningly clean.”
He began by researching the cultural references thrown up in conversation with the director. He watched Lancer and contemporary western TV shows including every episode of Alias Smith and Jones, the entire Maverick collection, as well as box sets of Wanted Dead or Alive, The F.B.I. and martial arts crime caper The Green Hornet. For films he rewatched the classic 1963 war drama The Great Escape, starring Steve McQueen, who is played by Damian Lewis in Tarantino’s movie, and more obscure war films like Eagles Over London, directed by Italian Enzo Castellari, who cameoed in Inglourious Basterds.
“Of course, I turned to books to learn aspects of the Manson story,” Richardson says. “When it came to music, Quentin was extremely specific. After I’d read the script for the first time, he played some major hits from the late ’60s as the type of music he was considering. The soundtrack is vital to the movement of the film. It’s bold… a game changer in respect to the film.”
Once production had begun, Tarantino screened movies for the cast and crew including The Valley of the Dolls (1967) starring Sharon Tate and The Wrecking Crew, a vehicle for Dean Martin which also co-starred Tate, scenes from which are recreated in ‘Hollywood’.
We see a couple of clips from Bounty Law, the 4x3 format black-and-white Western which made DiCaprio’s character a star. For that, the VFX team roughed up the edges of the frame, while the sound editorial team, headed by Wylie Stateman, Harry Cohen and Leo Marcil, compressed the audio and added a warble to give the sense that it was being projected in 16mm.
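The “warble” the sound team describes is essentially wow: a slow, periodic drift in playback speed. One way to approximate it digitally is to resample the audio along a read position that advances at a gently oscillating rate. A minimal numpy sketch; the function name and the depth/rate values are illustrative assumptions, not the production’s actual settings:

```python
import numpy as np

def add_warble(audio, sr, depth=0.002, rate_hz=0.5):
    """Simulate projector 'wow' by resampling at a slowly varying speed.

    depth is the peak fractional pitch deviation (0.2% here) and rate_hz
    the modulation rate - both illustrative values, not the film's.
    """
    n = len(audio)
    t = np.arange(n) / sr
    # Instantaneous playback speed oscillates around 1.0; integrating it
    # gives the (fractional) read position into the original audio.
    speed = 1.0 + depth * np.sin(2 * np.pi * rate_hz * t)
    pos = np.cumsum(speed)
    pos = np.clip(pos - pos[0], 0, n - 1)   # start at sample 0, stay in range
    # Linear interpolation between the two nearest source samples.
    i0 = np.floor(pos).astype(int)
    i1 = np.minimum(i0 + 1, n - 1)
    frac = pos - i0
    return audio[i0] * (1 - frac) + audio[i1] * frac
```

Because the output is a convex combination of neighbouring input samples, it never exceeds the original signal’s peak level, so it can be layered under other degradation (compression, band-limiting) without clipping.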
There’s a scene from The Fourteen Fists of McCluskey, a 1.85 men-on-a-mission movie from the early ‘60s, which VFX supervisor John Dykstra took through a duping process to make it look like a print from the era. Additionally, Richardson allowed the editing team to seamlessly replace an actor in a popular TV show from the mid-1960s with DiCaprio, photographing him from the exact same angles and exactly matching the lighting and grain of the film stock.
Location, Location, Location
Production designer Barbara Ling (with whom Richardson had worked on The Doors) scouted the film’s eighty locations with Tarantino around Los Angeles. One of them was the Spahn Movie Ranch, which in the 1940s and ‘50s was used to shoot B-movie westerns. It was also notorious as the primary residence of Charles Manson and his followers in 1968-69. The sets burned down in 1970; the land is now parkland, and all that is left is scrub and rock.
“It was simply barren land,” Richardson says. “Barbara and Quentin had scouted this and planned a large set build. My instinct at that time was not positive because I thought it would prove hard to get good light for the sequences listed there. That turned out correct, but I didn’t create a fuss.”
He adds that on location scouts “Quentin is as specific as possible so that we can all make sure we arrive on set with the equipment we need. He doesn’t storyboard unless for a specific sequence that requires VFX or SFX to plan out in advance what is in his mind - but that is rare. Quentin doesn’t shoot a lot of coverage - only what he requires. He’s like Marty [Scorsese] in that respect.”
For the Lancer sequences Richardson attempted to create a blend of film and TV styles from that period; “cleaner than other sequences but with a roughness to the visual aesthetics,” he says.
“I was hoping to achieve a smooth quality but with visible limitations. Quentin and I wanted a ’70s feel, or ’70s retro, something not perfectly definable so that when watching the film, it feels off balance. Just enough so that you recognise something familiar from the past but sitting in the present.”
Like The Hateful Eight, the film retains an epic cinemascope presentation, but rather than 70mm this time he shot mostly in 35mm anamorphic using Panavision E-, C- and T-series lenses. The film stocks were Kodak 5219, 5213 and 5222 (black and white), with one small sequence on Super 8 Ektachrome and part of one short scene on 16mm Ektachrome. “Quentin loves the wide screen and didn’t want to consider a spherical Super 35 option.”
“We’re pushing colours to places we do not ordinarily go these days,” Richardson says. “A touch more grit and grain than we’re accustomed to. There’s a blend of past and present in the wardrobe, set design, hair and make-up which all greatly influenced the overall aim to look toward the past without sliding into cliché. I believe it’s fresh yet maintains a truth about the time period in which the film is set.”
Editor Fred Raskin says Tarantino loves the editing process just as much as the other stages of filmmaking. “Sometimes my assemblies end up in the film virtually unchanged. Other times, we’ll go back to dailies and start from scratch. In the latter cases, it’s generally a reflection of Quentin’s passion for a particular sequence. He conceived of the scene, wrote it and directed it, and he wants the opportunity to assemble it as well.”
Dailies were developed and printed at FotoKem. Colourist Yvan Lucas at Harbor Picture Company in Santa Monica worked with the facility to set the look of the film dailies and also supervised the digital intermediate.
“The DI for Quentin is only necessary in order to get the film out into theatres, not to correct the look that we have captured,” Richardson says. “There was some colour correction shot to shot, as it’s impossible to colour correct days of film ingestion properly, but the work was to be as minimal as possible.”
The lighting package was much the same as on The Hateful Eight or Django, with some small variations including strings of light bulbs. “The lenses are not the fastest so with film, unlike digital, you need to abide by that restriction,” Richardson says.
Almost all movement in the film is from either dolly or crane. “We had an underwater camera for one sequence in a pool. Steadicam was used on a few scenes as was a stabilised head (Pursuit system) on a car to work with horses.”
Another scene, among Richardson’s favourites, is the reveal of Dalton studying his lines for Lancer in his pool. “The camera moves from a wide overhead, across the roof of an adjoining house, and down that roof into a medium shot of Sharon Tate and Roman Polanski as they exit their house and move to their car,” he says.
“The most challenging aspect of the shoot was the scale,” Richardson says. “The script reaches far and beyond the normal and it is brilliantly written and performed and as a camera person or production designer or costume or hair or producer … one needs to rise to that level. It’s a day to day challenge.”




Thursday 23 May 2019

The Chi: Bringing life on the street closer to home

Panavision
One of the strengths of the acclaimed series The Chi, created by Lena Waithe, is its multifaceted take on the experiences of living in Chicago’s South Side. It also does not shy away from the big issues that impact the city. Indeed, the nuanced complexity of its storylines has been likened to The Wire.
Showtime’s eagerly anticipated second run is lensed by cinematographer Abraham Martinez, who moved into the role after serving as second unit DP on the show’s first season.
For the second installment, Showtime requested delivery in 4K rather than HD. Martinez employed Panavision’s Chicago location to assemble his camera package, with the Alexa Mini as the show’s primary camera (with ARRI Amira for Steadicam).
“One of the first things I did after we decided on the camera was talk with Katie Fellion (co-founder and head of business development and workflow strategy at Light Iron, Panavision’s post production division), and we decided to up-rez the footage from its native sensor to 4K,” says Martinez.
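Up-rezzing footage from a camera’s native raster to a 4K deliverable is, at its simplest, a resampling operation. A minimal bilinear sketch in numpy, purely illustrative of the idea; production pipelines like Light Iron’s use far more sophisticated filtering than this:

```python
import numpy as np

def uprez_bilinear(frame, out_h, out_w):
    """Bilinearly resample a frame (H x W x C float array) to a larger raster.

    Each output pixel is a weighted average of the four nearest source
    pixels - a toy stand-in for a real up-res filter.
    """
    in_h, in_w = frame.shape[:2]
    # Fractional source coordinates for every output row/column.
    ys = np.linspace(0, in_h - 1, out_h)
    xs = np.linspace(0, in_w - 1, out_w)
    y0 = np.floor(ys).astype(int)
    x0 = np.floor(xs).astype(int)
    y1 = np.minimum(y0 + 1, in_h - 1)
    x1 = np.minimum(x0 + 1, in_w - 1)
    wy = (ys - y0)[:, None, None]   # vertical blend weights
    wx = (xs - x0)[None, :, None]   # horizontal blend weights
    top = frame[y0][:, x0] * (1 - wx) + frame[y0][:, x1] * wx
    bot = frame[y1][:, x0] * (1 - wx) + frame[y1][:, x1] * wx
    return top * (1 - wy) + bot * wy
```

A call such as `uprez_bilinear(frame, 2160, 3840)` would take a sub-4K frame to a UHD raster; the interpolation invents no detail, which is why the colour-matching and grading work described here matters so much downstream.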
“A big plus for me in terms of working with Light Iron on this show – and also on other shows for which I’ve shot a mix of formats – is the ability to color match the cameras seamlessly,” Martinez notes.
Panavision Chicago outfitted each camera with its own set of Panavision Super Speeds (14-150mm) with PVintage as the go-to glass. These were complemented by an ARRI Ultra Prime 8R for time lapse and a Nikon Nikkor 600 telephoto to capture the hustle and danger of life on the street.
A notable change in style between seasons was the incorporation of more negative space in frame composition, but otherwise Martinez retained The Chi’s signature “handheld and alive” camera movement.
The lighting choices marked a more definitive enhancement from season one, a topic the DP discussed at length with Light Iron Supervising Colorist Steven Bodner. The change was spurred by new showrunner and executive producer Ayanna Floyd Davis.
Martinez explains, “Ayanna mentioned that she wanted more of a differentiation between the two main dramatic focuses of The Chi, which are the mob and the family. I felt that we needed to focus on making the interiors for the family feel warmer and more domestic. The creative intent was to put the audience inside people’s homes in the South Side.”
This was set to contrast with the background violence that permeates The Chi as it depicts life on the street with gangs and guns. “For the mob, we wanted to lean into this urban backdrop, make it a little more heightened, but still the touchstone is realism,” Martinez relates. “We embrace the sodium vapor feel of street lights and tone the color temp down.”
At the same time the thread running throughout this series is one of fatherhood – boys growing into men. “We have a great range of skin tones and I really wanted to bring those out and create a real intimacy and reality to our story,” he continues. “These were the emotional points I talked over with Steve throughout production. We worked together to ensure there was a consistency of look for each of these two worlds, the domestic and the street. For example, in one scene we see a mom at her front door with a feel of domestic warmth and she’s talking to a kid on the street, and when we cut to that character, we feel the cooler blue exterior of that world.”
Martinez adds, “Lighting the space and atmosphere was as important as lighting the actors; having the freedom for authentic character movement was essential. The show has so much movement and energy to it that I wanted to liberate our camera operators to go into tight places and follow the actors. In turn, that meant working with Steve to draw out the darker skin tones that might otherwise get lost in the shadows. This wasn’t a cut and paste job. I felt that at each step I was able to control and fine-tune the process working with dailies and into color grading.”
With the show shot on location in Chicago, Martinez greatly appreciated the ease with which he could talk over the fine points of photography or the grade with Light Iron in Chicago, New York or Los Angeles. While Martinez was in Panavision’s New Orleans office, a conform would be sent to Light Iron in LA, which in turn would send a conformed ProRes file to Bodner in New York City.
“I’d do a first pass on the full episode, usually taking two days, and then we’d arrange a remote DI session, which I would run in New York with Abe in New Orleans and his post team at the Los Angeles office,” Bodner says. “These three-way color sessions are becoming more and more common now that we have a far reach with our technology.”
“No matter the time zone I am in, Light Iron is always open,” notes Martinez. “I can always reach out to someone in their offices with any issue. It’s so important to me to always have this line of communication. From the moment I develop a camera package with Panavision to the moment I start conversations with Light Iron, I am extremely confident I am being taken care of – and that’s a weight off my back.”
Martinez graded 27 episodes with Light Iron last year, including all ten episodes of The Chi with Bodner, 13 episodes for Queen of the South (season 3), and the four-part documentary Alternative Living.
“I was in Light Iron New Orleans when I was working on the doc series and on the same day able to remote grade an episode of The Chi while reviewing test shots of grain for Queen sent from New York City. That, I felt, was a game changer in the sense that the job really can be done remotely and flexibly thanks to Light Iron and Panavision.”

Creating the Look of Late Night

Panavision
A legendary talk show host employs a young scriptwriter and sparks fly in the new indie feature Late Night, premiering at the Sundance Film Festival. Written by and starring Mindy Kaling, and co-starring Emma Thompson, the set-up depicts them as poles apart in generation and culture, adding further twists in the comedy.
The interplay between the two leads, as well as the desire to convey differences in the characters’ personalities and environments, led director of photography Matthew Clark to work closely with Panavision and Light Iron, with whom he had previously collaborated on Set It Up and Little Evil.
“Director Nisha Ganatra (Transparent) wanted to try and create as naturalistic a world as we could for the actors. That way, they would feel free to play around on set and location,” explains Clark. “I feel that with comedy especially, you want to allow your actors the freedom to move around physically and to improvise in order to find that true comic moment. So, we needed to keep our camera and lighting package small and our sets an open space. To be clear, we weren’t aiming for the natural style of a documentary, but more of a heightened realism.”
The film’s three main locations are given a different tonal look. “The parts of the story which cover Katherine’s (Thompson) personal life at home in a town house in Brooklyn were given an elegant, rich, warm feel to which we added a little smoke haze to create some depth,” notes Clark. “The talk show studio (on a set in a Greenpoint Studio/Warehouse) was crisp and clean with a more neutral feel to it. The writers’ room, which we shot in an office on the 17th floor in a building on Manhattan’s east side, was given a cooler palette, more office or business-like.”
Clark chose a Panasonic VariCam 35 to accommodate a tight budget and 25-day schedule. “Sometimes, we had three or four locations a day. That means you have to move fast. I liked the ability to shoot at 5,000 ISO at night on location. I knew we would have a good base to start. Then, Ken Shibata (gaffer), Tommy Kerwick, Jr. (key grip) and I could use a minimal lighting package and still get our look while working at that speed.”
For lens selection, he visited Panavision in Woodland Hills, and got his pick of ‘70s era Ultra Speeds and Super Speeds. “Dan Sasaki presented me with two tables full of lenses from classic old Primos to new glass. I had the whole place to myself and was able to test shoot a lot of it.”
Clark continues, “I took the tests to Light Iron in Hollywood where colorist Corinne Bogdanowicz and I tried different looks, playing with saturation, shadows, highlights and even adding grain. That color prep really helped us move quickly and seamlessly when the production moved from L.A. to New York for principal photography where I collaborated with Light Iron NY colorist Sean Dunckley.”
“We tried out some pretty stylized filmic emulations, and then adjusted the levels of color in the blacks and highlights to create a more refined look,” notes Bogdanowicz. “The grain tests helped us add some subtle texture to the images.”
Clark also worked with Light Iron Dailies Colorist Aaron Burns for a couple days to establish a LUT which Dunckley then finessed. “I like to light for the LUT on location and I tend not to change it,” says Clark, “but if needed I will play around with it later. That’s what Sean and I were able to do.”
On finessing that LUT, Dunckley adds, “I worked with Matt to create a LUT that was a bit more dynamic, pushing some cyan into the shadows and smoothing the skin tone colors. The LUT was used for dailies, and because Matt takes the time in pre-production to make the post process easy, we were a step ahead of the game when starting the DI.”
Clark continues, “One of the things you used to be able to do with film was pre-flash the raw stock. This opened up the blacks and allowed you to add color in the shadows depending on the percentage and color of the flash. It’s a way to create an emotional effect. I wanted to do this digitally for our story. So, we mapped out the emotional change as we went through different scenes. The basic narrative arc is that the characters start out as individuals and wind up working together as a team. So, we start out with colder hues in the office, introducing a steely look in the shadows, and as the story moves forward we minimize blue for more rosy, warm tones with the addition of some grain. There’s a big juxtaposition between a scene in Katherine’s bedroom, which is tonally rich, and the next day when she comes into the office and realizes she has to do something to get her job back. It’s not a straight arc from blue to warm but changes according to the scene and its context.”
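Photochemical pre-flashing adds a small uniform exposure before the image, so shadows rise the most in relative terms while highlights are barely touched. The digital equivalent can be sketched as a simple per-pixel lift toward a chosen hue. This is an illustrative approximation, not Light Iron’s actual grade math; the function name and parameter values are assumptions:

```python
import numpy as np

def digital_preflash(img, flash_level=0.06, flash_color=(0.2, 0.4, 1.0)):
    """Lift the blacks toward a chosen hue, mimicking film pre-flashing.

    img: float RGB array in [0, 1]. flash_level sets how much the floor
    is raised; flash_color biases the lift (a steely blue here, echoing
    the cold office look described in the text). Values are illustrative.
    """
    img = np.asarray(img, dtype=np.float64)
    flash = flash_level * np.asarray(flash_color, dtype=np.float64)
    # Add the flash as a floor and scale the image down slightly so
    # highlights stay in range: shadows gain color, whites barely move.
    out = flash + (1.0 - flash_level) * img
    return np.clip(out, 0.0, 1.0)
```

Varying `flash_level` and `flash_color` per scene is one way to trace the kind of cold-to-warm emotional arc Clark describes, without ever crushing the blacks to pure zero.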
Lens choices included an 11:1 Primo Zoom – “the tried and trusted workhorse which is still beautiful and very creamy,” Clark says. The older glass he chose helped break down the digital image in the way he wanted. “The coating on the 40 delivers a warm glow which we mostly used for Katherine’s personal world while several of the lenses had this rainbow flare which I felt gave us something different. These lenses are not ‘perfect’ which makes them perfect for this film. The aberrations, the older coatings, and the glass itself are less precise than most contemporary lenses, giving the digital image a softer look with more personality.”
The two-camera, 4K production was lensed mostly on tripods and dollies with some Steadicam and occasional handheld.
“A lot of the time, we were shooting close to wide open, maybe 2-2.8. That’s hard on a 100mm with minimal lighting, so all credit to my A camera AC Pedro Corcega and B cam Adriana Brunetto-Lipman. Pulling focus on a Steadicam moving through a party is not for the faint-hearted. If you don’t have talented people like this on your team then you would have to adjust for them and then it changes your look.”
Following the initial grade, Clark had Dunckley make an HDR pass. “We didn’t shoot for HDR but it was amazing to see what Sean was able to do to adjust for things like clouds in the sky and enhancing depth in the image. It revealed things I didn’t know were there.”
“After the theatrical grade, I used Baselight’s color management to map the project to a 1,000 nit HDR display,” notes Dunckley. “We then worked on contrast and highlights to soften the feel of the image. Overall, it was a really seamless DI process.”
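Mapping a grade to a 1,000-nit display ultimately rests on the PQ transfer function defined in SMPTE ST 2084, which encodes absolute luminance up to 10,000 nits into a signal value; a 1,000-nit master simply occupies the lower portion of the curve. A sketch of the encode side (the constants come from the standard; the helper name is ours, and this is not Baselight’s internal implementation):

```python
import numpy as np

# SMPTE ST 2084 (PQ) constants
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_encode(nits):
    """Encode absolute luminance (cd/m^2) to a PQ signal value in [0, 1].

    ST 2084 is defined over 0-10,000 nits; 1,000 nits lands at roughly
    75% of signal, which is why HDR grades leave so much headroom.
    """
    y = np.clip(np.asarray(nits, dtype=np.float64) / 10000.0, 0.0, 1.0)
    ym = np.power(y, M1)
    return np.power((C1 + C2 * ym) / (1.0 + C3 * ym), M2)
```

The steep allocation of code values to low luminances is what lets a colorist recover detail like the clouds Clark mentions: shadow and midtone information survives quantisation far better than in a gamma-encoded SDR signal.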
Late Night — which also stars John Lithgow and Veep actor Reid Scott — is produced by FilmNation Entertainment, 30West, Scott Rudin Productions and Imperative Entertainment.