Friday, 17 August 2018

Broadcast Management Systems: Moving Just in Time Further Up the Line


InBroadcast

BMS solutions are powering dynamic changes and individualised delivery to different audiences
Broadcasting is shifting from a ‘create one distribute to many’ to a narrowcasting model in which multiple variations of content are created for distribution to different groups, platforms and ultimately individuals. The Broadcast Management System (BMS) is an essential component in this evolution required to streamline and automate workflows. Content is being delivered directly to the consumer as OTT, on social media, or in packages sold to third-party platforms. Pop-up channels are created for specific audiences and events.
This shift impacts every area of the media business – the nature and structure of IP rights, the scheduling of media and how advertising packages are put together and sold. However, the change has not been a ‘big bang’ and, while BMS vendors are increasingly implementing digital-first solutions, traditional linear television still represents the largest revenue source for many of their customers.
“The key to supporting our customers through this change has been to provide them with a suite of applications and solutions that can handle both broadcast and narrowcast, non-linear and linear in a single application,” says Sina Billhardt, product manager, Arvato Systems. “This is combined with automation in workflows to mitigate extra workload from the additional variations and an approach to delivering software and solutions that anticipate and can accommodate future shifts and opportunities without knowing the detail of what they might be.”
According to Geert Van Droogenbroeck, marketing manager for MediaGenix, the changes make it essential for operators to recommend content based on usage. “Algorithms are written to track viewing preferences and content is personalised based on this information. Metadata becomes all important in creating sufficient relevant tags to define preference in ever greater detail. AI is being used to process large amounts of data in a meaningful way on a unique user level.”
He adds, “The more sophisticated this becomes, the more personal the service will be as the UI will be different for each user. Added complexity comes when a service needs to take into account both what a user is watching in a linear / catch up environment and an on-demand environment.”
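To make that mechanism concrete, the sketch below shows a minimal, tag-weighted preference model of the kind Van Droogenbroeck describes. It is illustrative only: the function names and data shapes are hypothetical and are not drawn from any vendor's recommendation system.

```python
# Illustrative only: a minimal tag-weight preference model. All names here
# (build_profile, recommend, the dict shapes) are hypothetical.
from collections import Counter

def build_profile(viewing_history):
    """Aggregate metadata tags from watched content into per-user weights."""
    profile = Counter()
    for item in viewing_history:              # item: {"tags": [...], "completion": 0..1}
        for tag in item["tags"]:
            profile[tag] += item["completion"]  # weight tags by how much was watched
    return profile

def recommend(profile, catalogue, top_n=5):
    """Score catalogue titles by overlap with the user's tag weights."""
    def score(title):
        return sum(profile.get(tag, 0.0) for tag in title["tags"])
    return sorted(catalogue, key=score, reverse=True)[:top_n]

# Example: a user who finishes nordic-noir dramas gets more of them surfaced first.
history = [{"tags": ["drama", "nordic-noir"], "completion": 1.0},
           {"tags": ["comedy"], "completion": 0.2}]
catalogue = [{"title": "A", "tags": ["drama", "nordic-noir"]},
             {"title": "B", "tags": ["comedy", "panel-show"]}]
print(recommend(build_profile(history), catalogue, top_n=1))
```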
Another challenge is that content has to adapt to the destination device, not just technically but also editorially. A long-form story formatted for Facebook needs to be different from one viewed on a website or a connected TV. “The changes have a huge impact on content itself,” says Van Droogenbroeck. “Traditional players have become more creative in how they package content for a target group. They split content into smaller parts that fit social and on-demand media, repackage seasons into different themes or add bonus content for the true fan.
“Media companies need to be able to slice and dice content for use on different platforms,” he says. “They need to manage rights for new versions of content that are assembled from many existing content parts. They will also want to present existing content in new and different ways, and group titles into collections for selling, planning or re-packaging. This requires a flexible content-centric system and quick editorial decisions that rely on a powerful management of media and material workflows and of complex rights and underlying rights and royalties.”
The BMS should therefore assist producers, curators and schedulers in managing this complexity and make it simple to connect the right audience to the right content version.
“Unfortunately, the total revenues of the market remain flat or show only a limited growth and broadcasters have to survive with lower incomes per channel,” points out Michal Stehlik, director - product development, Provys. “This leads to an increased pressure on the automation of all processes associated with each individual distribution channel.”
Provys believe that this automation process is possible only when it is built on a strong foundation of content and rights libraries, regardless of the type and coverage of the channel. The key to success, it asserts, is finding the right content to offer, efficiently utilising all available rights and using information to support further content procurement or production.
“This is why we think BMS is a key system for transformation from a linear channel broadcaster to a content centric, multichannel operator,” says Stehlik. “We guarantee that Provys is the right solution to support this transformation.”
Just In Time assembly
In general, it is more efficient to package the channel for delivery at the end of the broadcast technology chain – a strategy which supports efficient reuse of content. The days when commercial breaks were compiled onto a single tape, subtitles were burnt into the picture and audio tracks were recorded together with the picture are over. Today, playout automation assembles all the necessary pieces ‘just in time’ (JIT), with graphics rendered during playout and no need to use post-production resources.
“It is now possible to introduce changes just a few minutes before transmission and produce multiple feeds with different branding from a single media,” says Stehlik. “From the BMS system perspective, we first define the rules and then schedule individual elements as separate objects. We expect that more and more broadcasters will discover the beauty and power of the information kept in our system which enables an enhanced, individualised experience for viewers of the future.”
In many ways, scheduling applications used in broadcast environments have been applying a just-in-time methodology for a while. Placeholders are commonly used for content that is not yet available while workorders, analogous to those used in manufacturing production processes, are sent to Media Asset Management systems to ingest/create/produce the media.
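As an illustration of that placeholder-and-workorder pattern, here is a minimal sketch. It assumes simplified, purely hypothetical BMS and MAM structures: a schedule of placeholders is checked against the MAM inventory and workorders are raised for anything not yet available.

```python
# Illustrative sketch only: resolving schedule placeholders "just in time" by
# raising workorders against a MAM. The classes below are hypothetical, not
# any vendor's actual BMS/MAM API.
from dataclasses import dataclass

@dataclass
class ScheduleEvent:
    start: str                 # planned transmission time, e.g. "18:00:00"
    house_id: str              # placeholder reference agreed between BMS and MAM
    media_ready: bool = False  # flips to True once the MAM holds the media

@dataclass
class Workorder:
    house_id: str
    due_by: str
    task: str = "ingest"       # could equally be "transcode", "subtitle", ...

def raise_workorders(schedule, mam_inventory):
    """For every placeholder with no media in the MAM, raise a workorder."""
    orders = []
    for ev in schedule:
        if ev.house_id in mam_inventory:
            ev.media_ready = True
        else:
            orders.append(Workorder(house_id=ev.house_id, due_by=ev.start))
    return orders

schedule = [ScheduleEvent("18:00:00", "PRG001"), ScheduleEvent("18:30:00", "PRG002")]
print(raise_workorders(schedule, mam_inventory={"PRG001"}))  # -> workorder for PRG002
```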
“The key to making this work is tight integration between the MAM and BMS systems and an understanding in the workings of both,” says Billhardt. “This is not common in the industry as few vendors offer solutions in both areas and on the user side, solutions are often specified and implemented by different departments. It’s an area where Arvato Systems offers a unique and valuable perspective.”
Looking a little way into the future, there’s work in progress to extend the Interoperable Master Format (IMF) specification for advertising. According to Billhardt this has the potential to move the JIT assembly process even closer to the consumer and presents some significant opportunities to further push the boundaries of ad sales.
Stream Circle, maker of the eponymous TV automation platform, also anticipates that the future lies in network streaming to narrow groups of viewers “provided that we know their exact profiles,” says CEO Josef Vasica.
Stream Circle works with raw content and, using its own graphics engine, “is able to assemble the final stream at the very last moment in the light of the latest available information,” says Vasica.
“Our system functions on the basis of a generic programme definition which is then enhanced by secondary events, graphics, ads, self-promos, etc. strictly on a JIT principle.
“IBC2018 can expect to see our latest multi-layered playout functions with all the latest and greatest features of IP-based television,” he adds.
Live sports programmes are at the forefront of just-in-time content assembly, as small segments are created from the live event and used quickly.  Catch-up and on-demand follow the linear programme with ever shorter delay. A soccer match can be available on-demand in a matter of minutes after a game has ended.
ProConsultant Informatique (PCI), which refers to its BMS (called LOUISE) as a Business Management Solution, says the BMS must be integrated with Business Process Management (BPM) tools to manage the workflow’s operations and tasks and to bring significant operational efficiency to media customers.
“Non-linear platforms are going to be more and more specialised, addressed to precise targeted individuals or groups, based on their characteristics, with multiple variations of content,” says PCI’s Laurence Thill.
“In this framework, LOUISE is providing integrated functionalities to manage these different variations and to personalise the specific content addressed to the individuals and/or groups of final viewers. Media companies using LOUISE will be able to easily combine all of this information in order to precisely adjust and feed non-linear platforms with the appropriate content addressed to viewers.”
Clearly, all of this must be done in compliance with the rights and rules associated with each piece of content. Since the rise of non-linear platforms has significantly changed the rights management needs of broadcasters, PCI will introduce at IBC2018 a fully integrated module within LOUISE that enables users to manage the sale and/or re-sale of rights to third parties.
International media groups are centralising their content in a global content management system so that it is ready to be shared by channels and platforms across the globe. As the media asset management in MediaGenix’ WHATS’ON pilots all video, audio and subtitle flows, the content is ready to be shared by all channels in whichever region, platform, version or language the content is needed.
Swiss public broadcaster RTS can, for instance, automatically create clips for fast publication online.  WHATS’ON users open the frame-accurate player from their WHATS’ON screen and set markers segmenting the content.  This facilitates its distribution on any platform while tracking the various rights on every individual segment of the content. The system informs users about rights problems and the additional costs for clearing the content.
At IBC2018, MediaGenix promises to deliver on a new concept of content itself – one that “breaks down the barriers between interstitials and products, episodes and programmes, but also between media and non-media content, such as derived products, apps, books, entertainment events.”
This will apparently make it even easier in WHATS’ON to split content up, assemble new content from constituent parts, schedule additional content, present it in alternative ways, and group titles into collections for selling, planning or repackaging.
“The whole exploitation lifecycle will be managed in an integrated way. With ‘Flights’ you will not need more than one scheduling action for multiple publication windows on multiple services and platforms,” he says. “Ultra-dynamic publication with one click of the mouse.”
For IBC, Arvato is focusing on “programmatic advertising” and bringing the best of online and digital advertising to traditional TV. “With linear still providing strong revenues for broadcasters and brand safety for their advertisers, by combining big data on audience insights with smart, automated placement optimisation, our customers are able to offer advertisers accurate targeting and reduce waste by controlling reach and frequency on linear channels for the first time,” explains Billhardt. “Alongside metadata-driven automated scheduling, we’ll also be demonstrating how the placement optimisation algorithms in our S4AdOpt application can now also be applied to promos, increasing viewer numbers and providing further revenue opportunities for our customers.”
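As a rough illustration of what “controlling reach and frequency” can mean in practice (and emphatically not Arvato's S4AdOpt algorithm, whose workings are not described here), the following hypothetical sketch greedily fills a campaign's spots while respecting a per-segment frequency cap.

```python
# Hypothetical sketch of frequency-capped spot placement: choose break positions
# so that no predicted audience segment sees the spot more than freq_cap times.
def place_spots(breaks, spots_needed, freq_cap=3):
    """breaks: list of (break_id, {segment: predicted_impressions}); greedy fill under a cap."""
    exposures = {}
    chosen = []
    # consider the highest-audience breaks first
    for break_id, audience in sorted(breaks, key=lambda b: -sum(b[1].values())):
        if len(chosen) == spots_needed:
            break
        # skip this break if it would push any segment past the frequency cap
        if any(exposures.get(seg, 0) + 1 > freq_cap for seg in audience):
            continue
        chosen.append(break_id)
        for seg in audience:
            exposures[seg] = exposures.get(seg, 0) + 1
    return chosen, exposures

breaks = [("b1", {"adults_16_34": 120_000}),
          ("b2", {"adults_16_34": 90_000, "housewives": 40_000})]
# With a cap of 1, only one of the two breaks is usable for the same segment.
print(place_spots(breaks, spots_needed=2, freq_cap=1))
```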


Looking beyond the Game’s end

Broadcast



The Northern Ireland film and television sector is looking to a future beyond the final episode of the hit HBO show by planning to compete on the global stage.
In the decade between HBO agreeing to shoot the pilot for Game Of Thrones in Belfast and the final day of filming last month, it is no exaggeration to say that the industry in Northern Ireland has been revolutionised.
Previously renowned for documentaries but lacking network commissions – described to Broadcast by Green Inc Film & Television owner Stephen Stewart as “chronically under-achieving” and with no history of large, incoming productions – the region has been transformed.
“The supply chain infrastructure is unrecognisable from what it was then,” says Richard Williams, chief executive, Northern Ireland Screen.
“We have two new studios both effectively built on the optimism and value proposition of Game Of Thrones and a depth and breadth of skilled resource from crew to post-production that is giving a generation of talent the feeling that anything can be made here.
“Arguably the most significant change lies in the perception and credibility of Northern Ireland, in London and particularly in Los Angeles,” he adds.
Succession plans
Williams’ screen agency is widely credited among indies for its ambassadorial and practical support. It has been ambitiously planning for Game Of Thrones’ succession, with the drama ending after the eighth series.
“It was strategically extremely important to have Belfast Harbour Studios open before Game Of Thrones came to an end for the simple reason that we aimed to shift from an ecosystem that broadly supported one largescale, inward investment project to one that supported two such projects,” Williams explains.
The privately funded, £20m Harbour Studios comprises 64,000sq ft of soundstage. It is busy with the second season of Warner Horizon’s Krypton, “meaning a large chunk of supply chain companies had a degree of business no matter what happens”, says Williams.
Post-production houses Yellow Moon in Holywood, County Down, and Ka-Boom in Belfast, have both benefited from Game Of Thrones’ location in the nation.
Yellow Moon has employed more permanent staff, leased several buildings for the HBO team and installed new kit and editing suites, while Ka-Boom has expanded into wider production services, including being CAA-approved drone pilots.
Demand for craft and crew facilities is being further shored up by a growing number of UK-anchored TV dramas.
These include 3 × 60-minute period drama Death And Nightingales from Imaginarium and Soho Moon for BBC2 and three-parter Mrs Wilson, starring Ruth Wilson (and based on the memoir of her grandmother), co-produced by the BBC and PBS’s Masterpiece.
Romantic indie feature Normal People (co-produced by Canderblinks Films and Out Of Orbit) starring Liam Neeson and Lesley Manville (Phantom Thread) is currently filming based on a script from Irish playwright Owen McCafferty.
Later this year,  BBC1’s 8 x 60-minute The Dublin Murders from Euston Films, Element Pictures and Veritas Films will shoot in Dublin and Belfast, while the fifth series of World Productions’ Line Of Duty will return to Belfast.
HBO’s confirmation of a pilot for new Westeros saga (w/t The Long Night) at Titanic Studios’ Paint Hall in October is more good news. “Our hopes and expectations are that HBO will remain in Northern Ireland for many years yet,” says Williams.
“We are still keen on studioscale feature projects and certainly when we ramp up to three inward investing projects over the next four years we expect at least one of those to be a feature.”
The estimated value of HBO’s investment to date in the region is £206m – not a bad return on £16m in Northern Ireland funds.

The Game Of Thrones halo is less tangible outside of drama but has had an impact nonetheless, not least in raising the profile, skill levels and workload of everyone from location scouts to costume designers.
“Everyone knows they can come here and make high-end shows,” says Kieran Doherty, the joint managing director of producer Stellify Media. “They know our crews are world class.”
Sony joint-venture Stellify is riding high on multiple wins, including Channel 5’s revivals of Blind Date and quiz Gino’s Win Your Wish List, plus ITV’s resurrection of Who Wants To Be A Millionaire?.
While the company has made entertainment formats like Can’t Touch This for the BBC and is making social experiment show Celebrity In Solitary for C5 in warehouse spaces in Belfast, Doherty says the region lacks suitable studios for larger-scale shiny floor shows.
Who Wants To Be A Millionaire?, for example, is housed at Dock10 in Manchester.
“One benefit of the Game Of Thrones crossover is that we can draw on set design and construction or make-up talent to make shows here, but high-end Saturday night shiny floors are harder to make without a dedicated TV studio,” says Doherty.
The keys to a burgeoning entertainment and fact ent sector in the region, however, lie with network commissioners. Locally this is known as ‘the Sean Doyle effect’ after the impact made by the London-based, Belfast resident commissioning editor at C5.
“He doesn’t have a remit to look to the regions but because he knows the sector here there’s an immediate trust and understanding of what we can all deliver,” says fellow Stellify managing director Matthew Worthy.
Doyle recently ordered a pilot for Celebrity Meltdown, about Britney Spears, from Waddell Media.
“A big turning point for all Northern Ireland indies would be if the BBC and Channel 4 could find someone who could fit Sean’s mould,” says the indie’s managing director Jannine Waddell.
“This is still a relationships business. There is more engagement from those broadcasters, but the difference is that I can meet Sean for a coffee today, whereas I’d need a day, spend £500 and arrange other appointments, in order to catch-up with execs in England.”
Green Inc’s Stewart adds: “Sean has been a very successful commissioner for the community here but that’s a direct result of him knowing who is on the ground. A lot of executives simply don’t have that knowledge. C4 and the BBC are doing a lot of good work to get more local commissions but there is more work to be done.”
Unfortunately, C4 recently struck Belfast off the shortlists for its new national HQ and creative hubs, which would likely have propelled production in the city and surrounding regions into overdrive.
Some indies are launching satellite offices in Belfast. Initially, perhaps, this was in anticipation of an increased C4 presence, but it is also in order to tap network quotas. Endemol Shine’s Darlow Smithson Productions launched a Belfast base in April to expand its factual output.
Headed by producer Anne Stirling, who was hired from running her own production outfit, the indie is up and running with series three and four (40 eps) of Ill Gotten Gains for BBC Daytime.
Working together
For Green Inc and Waddell one answer lies in increased co-pro alliances. “It’s down to finance – broadcasters want more bang for buck,” says Waddell. “Americans tend to move faster than broadcasters in the UK but it’s always a slow process trying to get everything together.”
Waddell has half a dozen returning series, including Find Me A Home, Francis Brennan’s Grand Tour and At Your Service, all for RTÉ.
“The expectation of broadcasters in terms of development is so high and so expensive that you can’t compete with the big guns who have massive budgets unless you join forces,” says Stewart.
Green Inc co-pros include BBC4’s Hive Minds with Saltbeef and Ireland’s Got Talent with Dublin’s Kite Entertainment.
Northern Ireland Screen’s most recent funding incentive aims to boost co-finance deals with Canadian producers. Around £330,000 over three years is being made available to support development of digital media and TV projects.
Everyone is searching for a long-running returnable series such as 24 Hours In A&E or Bargain Hunt. “Once you have that volume of hours you can build an industry around it,” says Stellify’s Doherty.
Waddell adds: “A few years ago productions shooting here would have had to bring a lot of people over here, while talent growing up in Northern Ireland would have felt the need to move away to find work. That has changed. Now our talent can see that there is a consistent volume of fantastic work to build their careers on their doorstep.”


IP adoption gathers momentum


IBC

IP is steadily gaining traction around the world with IBC2018 a forum to boost education and discussion about interoperability and the ramifications of SMPTE 2110.
For many people, the most iconic IP project in the world is Australia’s Andrews Production Hub. Based in Sydney and Melbourne, the Hubs have been producing live broadcasts of Aussie rules football, rugby, tennis and cricket for Fox Sports Australia since March direct from cities that are thousands of miles apart.
Remote, centralised production on this scale is delivering immediate logistical savings on freelance crews for the broadcaster. More significantly, it is using SMPTE ST 2110 for the IP transport of the video and audio signals.
“This project basically redefines sports broadcast production for a whole continent,” says Andreas Hilmer, Director Marketing and Communications, Lawo, which with Sony, EVS and Ross was one of several vendors that worked with NEP Australia on delivery. “It’s seen as the blueprint for building a distributed infrastructure that makes use of IP networking in combination with the data centre principles.”
The -10 (system timing), -20 (video), -30 (audio) and -21 (traffic shaping) portions of the SMPTE ST 2110 suite cover the most essential parts of television infrastructure and specify core capabilities for moving separate essence streams across IP networks.
ST 2110-40, the most recent part to be published, in May, enables ancillary data packets (such as captions and timecode) to be moved synchronously with the associated video and audio essence streams. This advance means that every element that has been part of the traditional SDI studio can now be put into an IP studio.
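The practical consequence is that a receiver re-marries the separate essences by timestamps derived from a common PTP-locked clock rather than by their position in a multiplex. The sketch below is only a conceptual illustration of that alignment step, using simplified stand-in structures rather than real RTP or SDP data.

```python
# Conceptual illustration only: aligning separate essence streams by a shared
# clock. Field names are simplified stand-ins, not actual ST 2110/RTP structures.
from collections import defaultdict

def align_essences(packets, tolerance=0.001):
    """Group packets from separate streams that share (approximately) one capture time."""
    buckets = defaultdict(dict)
    for p in packets:                      # p: {"stream": "video"|"audio"|"anc", "ts": seconds}
        key = round(p["ts"] / tolerance)   # quantise the common-clock timestamp
        buckets[key][p["stream"]] = p
    # keep only instants where at least video and audio are both present
    return [b for b in buckets.values() if {"video", "audio"} <= b.keys()]

packets = [
    {"stream": "video", "ts": 10.000},
    {"stream": "audio", "ts": 10.0003},
    {"stream": "anc",   "ts": 10.0004},   # e.g. captions carried per ST 2110-40
    {"stream": "audio", "ts": 10.020},    # next audio packet, no matching frame yet
]
print(align_essences(packets))
```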
At Wimbledon, IP technologies were deployed to create the equivalent of a 3000 x 3000 router, which meant an operator could call up any signal needed from any of the courts. This simply would not have been possible with SDI where, for instance, signals would have had to be carefully selected and delivered via SDI router tie-lines. An additional benefit is that the system is also able to support UHD production without requiring a separate routing infrastructure.
“TV is going to evolve into something more immersive, more pervasive, more interactive and more personalised,” BBC technology chief Matthew Postgate predicts. “For the BBC the question about our IP future becomes not just when, but also how.”
A clue can be found in Cardiff where the BBC is shortly to open a regional hub based around the latest IP protocols for incoming and outgoing feeds including 2110-40. It is seen as the template for its future development.
The end goal of an all-IP environment is widely agreed across the industry, but the journey is as important as the destination. People come to IBC not only because it showcases proven, standards-based IP solutions, but also to encounter the expertise to help them navigate the transition.
“A handful of IP implementations have already proven their worth in terms of flexible routing, switching, remote production and more where circumstances can be tightly controlled,” says Bryce Button, Director of Product Marketing at AJA Video Systems.
Vice News in New York was one of the world’s first news organisations to implement a pure IP-based technology infrastructure. And the Recife facility of Brazil’s TV Globo is the first Latin American broadcaster to implement IP for live production.
“SMPTE ST 2110 is speeding the transition by making it simple and safe to create best-of-breed solutions through standardised communications,” says Steve Reynolds, Imagine’s President Playout & Networking.
“Momentum is building for IP adoption,” concurs Mike Cronk, Chairman of Alliance for IP Media Solutions (AIMS). “Operations everywhere are recognising the key benefits of IP, such as much greater systems scalability than was ever possible with SDI, the ability to build UHD and HDR systems without having to compromise on signal counts, and the ability to share resources to improve equipment utilisation.”
However, the Andrews Hubs are outliers in an industry that remains a long way from full IP transition. Even AIMS admit that rollout is in its infancy.
“The truth of the matter is that this is an industry in transition, and that is going to continue for some time to come,” says Reynolds.
Imagine’s own research suggests 6% of media enterprises worldwide are already all-IP, and a further 8% just have the odd SDI island. Set that against 40% which are still all SDI.
“Our survey says that in five years’ time, as many as 10% of businesses will still be all SDI. No prudent business is going to throw out viable technology, so IP connectivity will take over as legacy hardware needs to be replaced.”
Button agrees. “A number of broadcasters are reluctant to make the leap to an all-IP workflow until remaining auto-discovery and security challenges are resolved. As a result, a majority of deployments we see in current broadcast efforts will continue to be hybrid SDI/IP implementations for a while yet.”
Even if, for most broadcasters, the question of ‘why IP?’ has basically turned into a question of ‘how?’ and ‘when?’, “there’s still a vast amount of confusion out there,” says Hilmer. “Many customers are seeking qualified advice.”
This view is backed by audio codec specialist Digigram. “IP has not yet completely won over everyone and we still have to face some fears from certain CTOs,” says Raphael Triomphe, Product Director. “To counter this reluctance, the future of IP lies in our ability to make it easier to use and deploy.”
With 98% of technology users demanding interoperable solutions, the IABM urges vendors to adopt a more open approach to product interoperability.
Most live production is wedded to legacy architectures. Liberty Media may have driven digital innovation into Formula One, including the launch of OTT service F1TV, but the host broadcast feed which is channelled online is produced from flyaways based on SDI.
Likewise, the two sporting organisations with the most financial firepower have yet to invest wholeheartedly in IP. FIFA’s production workflow for this summer’s World Cup was an all-UHD and HDR affair but routed all audio, video and metadata signals around its OB trucks and to and from the Moscow IBC using conventional circuits. The Winter Olympics in South Korea at the start of the year moved more of its networks than ever before towards IP for video transport to accommodate increased broadcaster needs for remote operations, but as with its 8K, 5G and VR innovations this was a side order to the main event.
IP Showcase at IBC2018
A record number of vendors – more than 60 – will be presenting at the IP Showcase, sponsored by AES, AMWA, AIMS, EBU, SMPTE and VSF. Product categories cover just about everything under the sun, including cameras, test and measurement equipment, replay servers, encoders, asset management systems with native ST 2110 ingest capability, and router control systems capable of using AMWA IS-04 and IS-05 for discovery/registration and connection management.
An important goal of the IP Showcase at IBC2018 is education. There will be wall displays and in-theatre demonstrations (from Pebble Beach among others), that give further insight into why people are moving to IP-based systems. In terms of protocols the focus is on open standards and specifications including the ST 2110 standards suite, AMWA IS-04 and IS-05.


Friday, 3 August 2018

Pressure grows for tighter digital regulation, but public opinion may force change anyway

Videonet
The early days of the Internet were often dubbed a ‘wild west’ to portray a frontier where rules were bent, broken or non-existent and content was often unregulated, harmful and illegal. Those concerns have bubbled under the surface and recently erupted as part of the fallout from the alleged illicit mining of Facebook’s user data to fuel propaganda. This has led to louder calls for digital media to be made as accountable for the dissemination of content as traditional publishers are.
Last month, for example, a British parliamentary committee told the UK government it should hold technology companies responsible and liable for “harmful and illegal content on their platforms,” and that misinformation and “fake news” are threatening democracy. Facebook, the subject of most ire, has long maintained that it is a tech platform, not a media company, yet defended itself against external regulation in a US court in March by arguing that it makes editorial decisions, which are protected by the First Amendment.
Mark Zuckerberg’s later testimonies indicate his company is ready to yield to change – if only as a necessity to shore up its brand reputation. “I think it inevitable that there will need to be some regulation,” he told the US House of Representatives in April. “We need to now take a more active view in policing the ecosystem…At the end of the day, this is going to be something where people will measure us by our results.”
Governments around the world are considering options. In Europe this is putting strain on the legal framework established in the E-Commerce Directive. The UK government has committed to bring forward online safety legislation as part of its Digital Charter. UK regulator Ofcom has wavered on the topic, wary of regulation’s fuzzy boundary with censorship on the one hand, while on the other protesting that digital giants ought to be doing more to ensure their content can be trusted.
Unsurprisingly, media publishers are also weighing in. It is in their interests to trim the advertising power and reach of large digital rivals. “A consensus is growing that further intervention is needed to address platforms’ role in governing online content, given its importance to the public interest in a host of areas,” stated the report ‘Keeping Consumers Safe Online’ published in July and funded by Sky.
Calling it “the single biggest gap in Internet regulation,” the report recommends a Code of Practice, an oversight body plus incentives and sanctions. Conscious of Sky’s own digital independence, the report caveats, “oversight needs to be cautious, and limited in statute, to mitigate potential risks to openness, innovation, competition and free speech.”
The growing clamour for intervention masks the efforts digital intermediaries are already making to regulate the content uploaded to their networks. Google reports that its automated technology detected about 80% of the 8.28 million videos removed from YouTube in the last quarter of 2017 (working alongside thousands of human moderators).
Facebook acted against a record 1.9 million pieces of content on its platform in the first quarter of 2018, detected as fake accounts and fake news by its AI system. Even the wild west was not lawless. Its citizens acted to protect and uphold societal values. The weight of public opinion alone, without state intervention, may be sufficient for digital platforms to govern themselves.

Wednesday, 1 August 2018

The science behind the fiction - in conversation with Joe Letteri

IBC


He doesn’t draw or paint, nor does he animate in the traditional sense, but give Joe Letteri a computer and he can figure out how to create any image in the world – or a world yet unknown. The four-time visual effects Oscar and BAFTA winner has shaped three decades of visual effects and become a master of creature and character creation.

Working with Steven Spielberg, Peter Jackson and James Cameron, Joe Letteri has built the astonishing photoreal worlds and iconic CG characters of Jurassic Park, Lord of the Rings, King Kong, the Planet of the Apes series and Avatar and is currently working on the highly anticipated Avatar sequels.
“Understanding how and why something happens, moves or works in the real world gives you a deeper understanding of what makes something look real in a virtual world,” he explains. “That has been the driving force behind the way I look at things. I want to see how far you can stretch the techniques and the technology in pursuit of realism.”
He grew up outside Pittsburgh, Pennsylvania, and excelled at science, maths, astronomy and physics. Attending the University of California, Berkeley, he gravitated toward media and film.
“I didn’t see a computer until I was at university,” he says. “But when I began using them to evaluate complex maths problems I became interested in graphics and how you could use computers to create organic shapes like clouds and mountains.”
Around that time mathematician Benoit Mandelbrot began using computer graphics to create and display fractal geometric images. “The infinite possibilities of visualising things with equations and making pictures out of data fascinated me.”
Letteri’s first job, at LA post house Metrolight, was designing logos and graphics for commercials and TV stations. It was enough to attract legendary VFX house ILM to hire him in 1990.
“This was my breakthrough. ILM were working on the opening shot for Star Trek VI: The Undiscovered Country and needed an explosion of the Klingon moon Praxis. They’d tried to do it as a practical effect but couldn’t get it to work.”
Letteri deployed his knowledge of fractals and the simulation of natural phenomena to design and animate an exploding ring of fire; “I was a huge Star Trek fan. I couldn’t believe my luck that this was the first shot I ever produced.”
Up next was Jurassic Park, the seminal picture that ushered in a new age of CGI. Spielberg had intended to use stop motion animation for the dinosaurs but changed his mind when shown a test of a CG T-Rex for which Letteri, working under ILM’s VFX supervisor Dennis Muren, helped establish a new level of realism.
“The idea that you could actually write code [in Pixar’s 3D software RenderMan] to get the skin to look exactly how you wanted and how it reflects light was a revelation. But it wasn’t just about how real we could make [the dinosaurs] it was also about fitting them into the photography. We now had questions to answer about how these CG characters work dramatically within the context of the story.”
Computer graphics was emerging as a tool that combined photography and optics with disciplines like biology, allowing artists to construct increasingly complex 3D creations. Letteri learnt cinematographic lighting techniques, drawing on his understanding of the physics of light transport, to make characters “believable for a story or setting that had never been told or seen before.”
His credits at ILM include Casper, Mission: Impossible and the 1997 special edition of Star Wars.
Pixel by pixel
“Unlike special effects, where you built a model and moved it and photographed it, now you could manipulate the pixels directly,” he says. “It opened the door to a whole different way of thinking about the art, in that anything you could describe you could capture pixel by pixel.”
Jurassic Park’s success in making CG characters integral to the story had far-reaching effects, ultimately inspiring director Peter Jackson to begin planning The Lord of the Rings series. In pre-production for the trilogy in 2001, Jackson hired Letteri to join his fledgling studio Weta Digital in Wellington and create Gollum.
“Gollum was the big attraction for me,” says Letteri. “He is one of literature’s great characters. What we needed to do was to make his skin look soft and translucent like human skin, something that hadn’t been done before.”
He devised a subsurface scattering technique to create the effect and also helped pioneer advanced performance capture with actor Andy Serkis, making Gollum the first character to appear alongside live-action characters in a way that didn’t differentiate between the two.
“Creating digital characters with that emotional range and subtlety has been the basis for all of the characters we’ve created since,” he says.
Performance capture evolution
The original contract with Weta Digital was for two years but Letteri never left. Now 61, he is its director, a shareholder, and responsible for all the work the studio generates, including X-Men: The Last Stand, The Day the Earth Stood Still and I, Robot.
Under his supervision, performance capture has evolved from a post-production process to a real-time one, allowing a director to shoot actors and digital characters and environments as if they are filming a live-action movie.
A highpoint for this was Avatar (2009), the landmark stereo film of the modern era. “Jim [Cameron] gave us the opportunity to take stock of everything we were doing and question all the assumptions of what worked and what didn’t,” Letteri recalls. “We built digital assets and environments, then combined these with the digital puppets being driven by the motion capture. It wasn’t just making believable CG characters; it was the scale, detail and complexity. We had to create an entire planetary ecosystem, with waterfalls and animals, so that anywhere Jim wanted to point the camera he would see Pandora come to life in the viewfinder.”
Weta’s virtual production pipeline was refined over successive films. It was used extensively on The Adventures of Tintin: The Secret of the Unicorn and The BFG, both for Spielberg, and advanced again with the Planet of the Apes films, which took performance capture out of the studio into a forest and then into the even harsher conditions of snow and rain.
Under Letteri’s direction, Weta wrote in-house renderers, Manuka and Gazebo, which take their cue from physics to calculate how light interacts with each surface – down to the level of calculating wavelengths of light separately.
The facility’s artists and programmers have built on this to create a suite of tools that aim to exactly match CG renders with the actual lighting on set.
“On Avatar we used pretty unsophisticated lighting which was just good enough to allow Jim to sketch out his intent for composition,” Letteri explains. “What we really wanted was for it to be physically correct so that what the director sees on set translates in sync with the virtual lighting.”
PhysLight uses measurements of lumens and nits, colour temperature and ISOs taken from the camera. “We know the precise ND filters, the specific iris, shutter angle, sensor, framerate, even the serial number of the camera and lens. All this data is used to simulate and synchronise with the cinematography.” 
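As a back-of-envelope illustration of why that camera metadata matters (and not Weta's actual PhysLight code, whose internals are not described here), the sketch below combines ISO, shutter angle, frame rate, T-stop and ND into a relative exposure scale so a render can be compared with the plate in stops.

```python
# Back-of-envelope sketch only: relative exposure from camera metadata, so CG can
# be brightened/darkened into the same photometric ballpark as the plate.
import math

def exposure_scale(iso, shutter_angle_deg, fps, t_stop, nd_stops=0.0):
    """Relative exposure: proportional to ISO * shutter_time / (T-stop^2 * 2^ND_stops)."""
    shutter_time = shutter_angle_deg / (360.0 * fps)   # e.g. 180 deg @ 24 fps = 1/48 s
    return iso * shutter_time / (t_stop ** 2 * 2.0 ** nd_stops)

# How many stops apart are two setups? Useful when matching CG lights to the plate.
plate = exposure_scale(iso=800, shutter_angle_deg=180, fps=24, t_stop=2.8, nd_stops=0)
cg    = exposure_scale(iso=800, shutter_angle_deg=180, fps=48, t_stop=2.0, nd_stops=0)
print(f"difference: {math.log2(cg / plate):+.2f} stops")
```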
The Avatar sequels
All of these cutting-edge developments are being trained on Avatar 2, 3, 4 and 5, which are in production in New Zealand and LA for release from 2020 to 2025. Everyone expects the sequels to break new ground – and in multiple ways. Cameron has reportedly test-shot sequences underwater. Yet with technology moving so fast, striking a balance between taking risks on innovation and using tried and tested routes is something the VFX supervisor must weigh.
 “The risk lies in not doing it,” declares Letteri. “We’ve never been conservative about what’s required of us to figure out a way of doing what hasn’t been done before. If you don’t try you are never going to get there.”
While not strictly a visual effects technique, high frame rates can be another tool to accentuate realism. Letteri worked on The Hobbit which was filmed at 48 frames a second although the hyper-vivid quality was not to everyone’s taste.
“We are testing [HFR] for Avatar,” Letteri confirms. “Audiences are used to seeing 24 frames per second, where the intermediate frames go blank and their mind fills the space with an image. It engages the imagination in a certain way. There’s a case for high frame rates with stereo 3D and lots of fast motion, when you want to erase motion blur. It depends on the intent of the filmmaker. They might experiment and see if it can be used to the advantage of certain scenes within a film.”
He feels that audiences are more likely to suspend their disbelief seeing some physical effects like miniatures and stop motion “because there’s a physical reality to it even though they know it’s not real” but that CGI is unforgiving. “We have to get everything absolutely right and not slip up otherwise the fact that it’s not real will strike a more discordant note in the audience’s mind.”
That said, he says it’s often easier to solve a problem in a computer “because you quickly run into limitations when you do it physically.
“I think we’ll be able to reach a point where we won’t be able to tell what’s real and what’s not,” he says. “You will hit the limit of what you need every pixel to look like to do exactly what it would have done if you photographed that character. For some kinds of effects, I think we are already there.”
A keen astronomer who has long harboured fantasies about travelling to outer space, Letteri sees Avatar as a dream come true. “When Jim said ‘let’s go to Alpha Centauri and make a whole new world’ I couldn’t wait,” he says. “That’s where I’m going.”

Wacky Races: Rogue Films jumps aboard The Grand Tour Rally


               
Content marketing for VMI

In celebration of Season 2 of Amazon Prime’s The Grand Tour, social media stars took to roads across the world in The Grand Tour Rally cars. To fuel their favourite driver to victory, fans were encouraged to Tweet or Re-Tweet hashtags in support of the rally.
Rogue Films had to fasten their seat belts too. Working for Niche, Twitter’s in-house agency, and ultimately on behalf of Amazon Prime, the production company followed a similar format to an episode of The Grand Tour albeit with Twitter creators in place of the three familiar hosts.
“Visually, it couldn't be too much of a departure so we aimed to make our episodes feel as close as possible to the show - overly stylised segments included,” explains cinematographer Tom Welsh who partnered with producer Rob Jelley on the job.
The pair accompanied the social media influencers racing from Edinburgh to Monza in Italy (home of the Italian Grand Prix) via Silverstone in electric-powered rally cars.
They then all flew to Mexico for a final, thrilling street race through Mexico City. Along the way the drivers had to perform challenges – such as receiving an electric shock to their legs if their heart rate dipped below a certain level.
Explains Jelley, “The plan was to create a short 30 second film each day focussed on the story of the journey and the particular challenges but these crept up to a minute and by the time we got to Mexico we made a 2.5 minute piece since we couldn’t pack it all in without losing a sense of narrative.”
“We placed six Panasonic GH4s as minicams inside the hosts’ vehicles and took a DJI Inspire 2 and a DJI Mavic, plus a number of GoPros,” explains Welsh. “We took an A7S II as B-cam on a gimbal although, in hindsight, a GH4/5 would’ve been better for this as we used it more than expected.”
The minicam rigs were the same ones (with the same crew) as the team use on the main series - so these were paramount to achieving the same look and feel.
“Since we were cutting and delivering content on the road I picked the Panasonic EVA1 to match easily when grading,” says Welsh, who turned to VMI to supply the camera, which was hot off the production line. “That was the first and foremost reason, but I’ve been wanting to shoot with a Varicam LT for a while too - so that was up as an option. The EVA won in terms of portability with a pretty close feature set too. Often, I’d go for an Alexa Mini in this situation, but the nature of our small travel crew meant I didn’t have a dedicated camera assistant and switching from car rig to mobile could’ve proved time-consuming.”
They shot 4K for hero shots and HD when on a tight deadline, for ease of working through selects. In terms of lenses, Welsh used the Canon 30-105mm/T2.8 cine zoom exclusively. “This setup worked excellently for the tracking car, which I used the EVA1 for primarily,” he says. “The new V-Log of the sensor took a bit of getting used to in terms of exposure, often reminding myself to keep checking the waveform on the camera monitor rather than relying solely on my SmallHD 5.
“I used to shoot a lot of FS7 and the EVA1 is very close to the FS5 ergonomically. I like working handheld with a camera at chest level, similar to a TLR (twin lens reflex) stills camera. Annoyingly the side handle rosette on the EVA1 couldn't support the weight of the body with the Canon cine-zoom on it, however, I believe Panasonic have since remedied this issue with an upgraded replacement part.” [Note that this has now been upgraded on all EVA 1 cameras].
The nature of delivering for Twitter in the week of Season 2’s release meant social media was intrinsic to the project. The project’s DIT/editor, Taylor, spent most of the two weeks in the back of a splitter van, ingesting and cutting as footage was shot.
“One downside here is that, because of the new camera and codec, we were forced to use the (then brand new) Adobe Premiere 2018 on similarly brand new MacBook Pros - so we inevitably encountered a few bugs with first-gen software and new hardware,” says Welsh.
A short film of The Grand Tour Rally was finished at Envy.

Dare the Unknown with the F55 in neon-lit Hong Kong


Content marketing for VMI

Fashion brand Palladium wants you to ‘Dare the Unknown’ in a new promo featuring rule-breaking New York-based supermodel Jazzelle (Zanaughtti), London grime artist Octavian and Arthur Bray, a Hong Kong-based DJ and a founding member of the Yeti Out collective.

Directed by Vivek Vadoliya, the short film challenges us to be brave, forthcoming and courageous in everything we do and for Benjamin Thomas that applied to his cinematography for the project too.
It was producer Shimmy Ahmed’s friend, the director Lucy Luscombe, who suggested Thomas for the London, Paris and Hong Kong legs of the shoot.
“The NYC portion of the project with Jazzelle had already been shot on the Sony FS7 with vintage Leica lenses,” explains Thomas. “They wanted to use a similar camera package for continuity.
“Balanced against this my task was to film in an ad hoc documentary basis, often without a recce, in challenging shooting locations where additional lighting would often not be possible. They wanted to film mostly at night as Hong Kong tends to be lit by eye-popping neon that would add to the visual language of the film.”
Thomas’ go-to camera when working on a digital project is the Alexa Mini or - if resolution is of significance - the RED Helium: “as I aspire for all my work to hold up if shown in a cinema,” says Thomas.
“However, the budget we had available to us was a limiting factor, so the challenge was to get the best camera package we could afford while matching what had already been shot.”
He continues, “For several years now I have been working with Barry at VMI, as his knowledge and stock of the latest film equipment available is all-encompassing. He suggested the Sony PMW-F55 for a number of reasons: a global shutter, internal NDs and a fast native ISO.
“A global shutter is handy for handheld shoots with fast camera movements and internal NDs are time saving to allow you to maintain the aperture you want while moving quickly between different lighting scenarios at will. The F55 has a base sensitivity to light of ISO 1250 making it favourable for documentary filmmaking when working with available light.”
Thomas knew that the Sony PMW-F55 shared the extended colour gamut of its bigger brother, the F65, so the F55 (installed with the latest v9.0 firmware) won his vote over the FS7.
The project’s 1st AC, Michael Hobdell, echoed this decision as the F55 VMI provided came complete with an Anton Bauer battery plate with D-tap (“powering our wireless video and follow focus would not add unnecessary bulk”), and a PL lens mount to “provide maximum flexibility when choosing which vintage lenses I wanted to go for,” explains Thomas.
In addition, the option for DNxHD recording meant the project’s editor (David Graham) could cut in Avid without having to transcode all the footage if time was in short supply.
For lenses, Thomas opted for Cooke S2/3 Speed Panchros.
 “They have a unique look and they save me the extra faff of using creative filters,” he says. “The Speed Panchros naturally produce a slightly softened, diffused image which means when I’m looking for a flare or to minimise the size of the camera when in a tight space, I can lose the lightweight clip-on matte box without changing the look of the footage.”
Since VMI has had its set of the lenses professionally rehoused by TLS, there would be no issue when using them in conjunction with a wireless LCS.
“I had considered both Canon K35s and the Bausch & Lomb Super Baltar lenses, having used them both on feature films with great success, but I was seeking a different aesthetic, and having tested and compared all these vintage spherical lenses - and several sets of vintage anamorphic lenses - I knew that the Cookes would most closely match the Leicas already used.”
Footage was recorded internally in S-Log3/S-Gamut3.Cine at ISO 1250, XAVC 2K (2048 x 1080) 25/50p, with the exception of some product and driving shots that were captured in XAVC 4K (4096 x 2160) 50p.
“Since we had to bottleneck our grade with the FS7 footage, we opted against RAW capture as this would have necessitated the use of an external recorder,” he says. “Although VMI stock both the AXS-R5 and AXS-R7 external RAW recorders, this is an optional extra for the F55 and would have increased the footprint and form factor of the camera, as well as increasing our hire cost and post budget.”
The production was able to save further on costs by using a Petrol carry-on bag and camera coffin when travelling to Hong Kong, a portability “which took the pressure off significantly,” Thomas says.
The shoot was split into two sections: with three days in London and Paris beforehand, and some days later, flying to Hong Kong for an additional three shoot days spread over a week.
“Barry was able to let us hang on to all the kit for the entire three week duration at a keen rate to make the project work. In short, we could not have made this film without the generous support of Barry and the entire team at VMI.”
The film was graded by Caroline Morin at Cheat in Hackney and can be viewed on Vimeo.