Thursday 28 April 2016

Nocturnal activity caught on camera for the first time

VMI

The biggest prize in natural history filmmaking is to capture animal behaviour in a way that's never been done before. Offspring Films achieved this with its eye-opening footage of tarsiers in their natural habitat.

The sequence is part of Monkeys: An Amazing Animal Family, a three-hour series produced by the Bristol-based indie for Sky 1. Through intimate monkey encounters, sumptuous photography, field demos and a CGI family tree, biologist Patrick Aryee shows how key members of this primate family evolved bizarre body types, intelligent behaviours and tight-knit social groups to thrive in some of the planet’s wildest landscapes.
One member of the family the team wanted to film was the tarsier, a tiny nocturnal hunter identifiable by its large, bright eyes.
“Tarsiers have been filmed before using very bright lights which not only disturbs their behaviour but causes their pupils to contract as if they are blinded,” explains director Clare Dornan. “They've also been filmed using infrared which does capture their natural behaviour but only in a black and white image. Filming them in colour, undisturbed, in their natural environment had not been achieved before.”
Advances in technology meant it could be done with the right camera. Wildlife cinematographer Mark Payne-Gill worked with Offspring to test the production's principal acquisition format, the Sony PMW-F55, alongside the specialist low-light Canon ME20F-SH, hired from VMI. Payne-Gill had first used the model on BBC show Stargazing Live.
“The Canon is not only a much more suitable rig for run-and-gun shooting than a DSLR-style camera; it was also clear, after tests, that the Canon's output held up much better in low light,” he says. “Because we didn't know quite how far we'd have to push the sensor on location, we didn't want to risk not being able to capture the footage, so we decided to take it.”
Since the Tarsier would be filmed in the dense and remote Indonesian jungle, Payne-Gill wanted to strip back the camera's form to make it easier to operate under tricky conditions.
“I asked VMI's help in minimising the amount and weight of batteries and other peripherals to make it functional, lightweight and easier to operate – essentially to try and make it into a camcorder rig,” he explains. “The communication with VMI was brilliant. I got exactly what I wanted with expert advice.”
The set-up devised by VMI included a Vocas support plate mounted on 15mm bars to the front and back of the ME20F-SH and a small Hawkwood battery mount. The kit also included a TVLogic Alphatron viewfinder and a Convergent Design Odyssey 7Q recorder with its own power.
At the Tangkoko National Park in Sulawesi, Indonesia, local guides helped the team locate the tarsiers in their tree-dwelling habitat. Payne-Gill began shooting with a Nikon 70-200mm lens at f2.8 and found that, because of the Canon's extraordinary low-light performance, he didn't need the faster 85mm and 50mm lenses at f1.5 and f1.4. Seeking even closer shots of the animals, he selected 600mm Canon glass shooting at f5.6.
The ME20F-SH has an ISO rating of over 4 million, so the camera technically had a wide headroom of sensitivity to play with, but Payne-Gill was concerned with achieving a signal-to-noise ratio acceptable for TV.
“The canopy was so dense it acted like a massive ND filter over the forest floor, blocking out much of the light even in the daytime,” he says. “Even when the light had gone we were still shooting with the Canon 600mm and it gave us another hour of filming. Only when it was properly dark did we use a small ARRI LoCaster LED panel, which really only gave a mild moonlight illumination. I pushed the Canon to 45dB, the equivalent of 144,000 ISO, and we were all blown away by the results.”
“It was stunning that we could shoot at those high ISO ranges using a slow lens that you wouldn't normally consider using in that light,” he says. “We got close-ups of the animal's faces with their massive pupils and in colour so we could properly tell their story.”
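The arithmetic behind that ISO figure is simple to check, assuming the usual rule of thumb that sensitivity doubles with every 6dB of gain and a native ISO of around 800 (an assumption; the article doesn't state the camera's base sensitivity):

```python
# Sensitivity doubles roughly every 6dB of gain. A base ISO of 800 is an
# assumed native sensitivity, not a figure quoted in the article.
def gain_db_to_iso(gain_db, base_iso=800):
    return base_iso * 2 ** (gain_db / 6.0)

print(round(gain_db_to_iso(45)))  # ~144,815, in line with the quoted 144,000 ISO
```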
Tarsiers are most active at dusk, so the team began filming on the F55 before switching to the Canon. The results were so remarkable that the producers decided to make the change of camera part of the editorial.
“To help explain to the audience that these creatures are nocturnal we showed the presenter explaining to camera that we can't follow them after dark unless we use a new high tech camera that can film at night without disturbing them,” says Dornan. “When we did so the effect was impressive. The results looked as if we were filming in daylight again.”

Wednesday 20 April 2016

Light-field Advances Virtual Production

IBC
The essential mechanics of recording light as it enters a camera haven't changed for 150 years but recent advances in light field technology could have profound consequences for everything we understand about moving image content creation.
Rather than recording a flat 2D picture, light-field captures all the light falling on the camera in five dimensions, including the direction and intensity of each ray. Research took off in the 1990s and centred on two means of acquisition: arranging multiple cameras to simultaneously record different angles of the same scene, or using a single camera fitted with a micro-lens array (MLA) comprising hundreds of thousands, even millions, of miniature lenses.
You then need software to compute the data; massive processing power, compression, or both to crunch the data in anything like realtime; a means of manipulating it in post; and a means of presenting it.
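To give a sense of the data involved, each sample in such a capture pairs a spatial position with a ray direction and an intensity. This is a hypothetical sketch of that five-dimensional record; real systems use more compact parameterisations:

```python
from typing import NamedTuple

class LightFieldSample(NamedTuple):
    """One sample of the 5D light field: position, ray direction, intensity."""
    x: float          # spatial position
    y: float
    z: float
    theta: float      # the two angular dimensions: ray direction
    phi: float
    intensity: float  # radiance carried along the ray

ray = LightFieldSample(x=0.0, y=0.0, z=0.0, theta=0.10, phi=-0.05, intensity=0.8)
print(ray.intensity)  # 0.8
```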
While much of this chain is rudimentary, there are solutions coming to market, notably at the front end. Indeed, the technology is moving rapidly from experimentation to commercialisation. Light-field has made it onto the agenda at the SMPTE and at the MPEG and JPEG committees, where a new working group on the topic has been established. Hollywood has taken note too.
“Light-field is the future of imaging,” declares Jon Karafin, Head of Light-field Video at Lytro which has just announced a prototype light-field cinema camera. “It is not a matter of 'if' but 'when'.”
The benefits of being able to manipulate such unprecedented scene data range from a more cost-effective path to creating visual effects, to refocusing any part of any frame, or even relighting a scene. Shutter angle and frame rate can be computationally altered in post, making different release masters from the same original material almost as simple as pressing a button.
The tech is also believed superior to current means of shooting 360-video for Virtual Reality. “Conventional VR cameras give you a left and right eye flat stereo view akin to 3D in cinema, whereas a light-field would give you a spatial sense of actually being present,” says Simon Robinson, Chief Scientist, The Foundry, which has helped devise plug-ins for both Fraunhofer and Lytro systems.
One significant implication of light-field is that it moves the onus of creative decision-making away from the set and into post production. If any optical parameter can be changed, including exposure and camera position, does light-field mean the end of shooting for post and the beginning of shooting in post?
“Ultimately, this is a disruptive technology and we need the feedback of the studio, cinematographer and VFX community to make sure what we are creating meets their needs as well as helping them understand the ultimate creative control this unleashes,” Karafin acknowledges.
While some developers are working on a single-camera solution, researchers at Germany's Fraunhofer IIS prefer a multiple-camera array, claiming that an MLA will not provide sufficient parallax to deliver a rounded impression of a scene. Lytro counters that obtaining parallax from a single MLA is far easier and far more accurate than any number of separate cameras can ever be.
Others, such as Raytrix, a maker of light-field optics for industrial applications, say all existing light-field systems are limited by the laws of physics. “They are workable with close-up subjects like a face but if you want to extract depth information for scenes ten metres away you might as well use standard stereo 3D cameras,” says co-founder Christian Perwass.
In the short term, light-field data will be used for manipulation of imagery and output to standard 2D screens, but in the longer term the technique might be used to create live action holographs. Several companies, including Zebra, Holografika and Leia3D, have designs on projecting a three-dimensional interactive image.
“Imagine looking out of a window in your home. Now imagine that as a holographic picture," says Robinson. "That is where we are headed.”

Tuesday 19 April 2016

Playing in the same Sandbox



Broadcast 
When multiple vendors came together to deliver a pair of world-first video over IP projects this year, their collaborative approach stood in stark contrast to the wider picture in the sector.

p 43 http://edition.pagesuite-professional.co.uk//launch.aspx?eid=0896b2e9-6be1-48b5-9eec-8c56c9ef9524
The most publicised IP project was led by the EBU and conducted at Belgian broadcaster VRT as part of its Sandbox technology programme. The remote production of a live music concert in January was claimed to be the first to use IP across the entire chain.
While the project has achieved its goal of showcasing how multiple vendors' systems could deliver an SDI style of reliability for mixing a live event, there are discussions about continuing it, perhaps to focus on virtualisation or moving software to the cloud.
Other avenues to explore include marrying the two timing mechanisms for audio and for video over IP, which are currently distinct. “The dream is to have only one PTP (Precision Time Protocol) clock,” says Koen Meyskens, open innovation manager, VRT Sandbox, of SMPTE 2059-2 and AES67. “It looks possible but we need it to be a standard.”
Sandbox only worked with HD 1080i, since that was the practical requirement for VRT, and was consequently uncompressed, but as broadcasters look to Ultra HD a form of compression might be tested by the project participants. The favourite codec in this regard is TICO, a scheme devised by IntoPix. However, Sandbox consultant Michel de Wolf says compression may not be needed if bandwidth connections higher than 10GigE are used.
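The bandwidth argument can be checked with back-of-envelope numbers: uncompressed 1080i fits comfortably in a 10GigE pipe, while a single Ultra HD stream gets close to filling it even before blanking, audio and IP packet overhead are counted. The bit depth and sampling here are typical assumptions, not figures from the Sandbox project:

```python
# Active-picture bandwidth for 10-bit 4:2:2 video, which averages 20 bits
# per pixel. Blanking, audio and packet overhead are ignored, so real
# stream rates (e.g. SMPTE 2022-6) come out somewhat higher.
def uncompressed_gbps(width, height, fps, bits_per_pixel=20):
    return width * height * fps * bits_per_pixel / 1e9

print(uncompressed_gbps(1920, 540, 50))   # 1080i50 (540-line fields): ~1.0 Gbps
print(uncompressed_gbps(3840, 2160, 50))  # UHD 50p: ~8.3 Gbps, near the 10GigE ceiling
```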
Yet another course of action would see the EBU collaborators examine splitting the audio, video and metadata into separate streams rather than the single combined stream demonstrated to date. This approach, dubbed TR-03, is favoured by lobbying group AIMS, which has come out in support of Sandbox.
While the emphasis of Sandbox was on a practical solution to live IP there are cracks in the make-up. In its conclusions to the project, the EBU admitted that the project made it “very clear that more collaboration was essential”.
IABM director of technology and strategic insight, John Ive comments, “Many people talk about live IP production as if it were already a done deal. In reality there are many issues to be solved and hurdles to be overcome.”
Jan Eveleens, CEO at project participant Axon, agrees: “We are still in a phase where some big companies are trying to set the market with their proprietary systems and the standardisation is trying to catch up. We have multiple solutions for one single problem which, for a manufacturer, is unsustainable.”
While some aspects have been ratified into international standards others only carry the status of recommendations. Even where standards have been agreed they are not being universally adopted.
ASPEN, a rival proposal to EBU / AIMS, is led by router systems manufacturer Evertz. Its video over IP pipeline is based on MPEG-2 and its roadmap includes a separation of audio, video and metadata similar to TR-03.
Indeed, Evertz can point to live IP deployments since 2014. “ESPN is the first real facility built using an IP core and running the ASPEN framework,” says Evertz's Mo Goyal. “This year the Super Bowl was the first major event to be produced using an IP infrastructure based on an ASPEN framework (from NEP's facilities for CBS Sports).” CBS made similar use of the tech for coverage of the Masters golf and NBC Sports is trialling ASPEN for production of the Olympics.
Yet another system, proposed by server vendor NewTek, promotes compressed IP video over 1GigE pipes while Sony wants customers to use its own codec.
Opinions differ as to whether these seemingly incompatible approaches risk splintering the industry's migration to IP.
“We have a strong belief in open standards not linked to any vendor and broadcasters are not helped with a mess of different protocols,” says Meyskens. “ASPEN dictates its standards and then makes them free to use which is not the same as an open standard.” 
AIMS member Imagine Communications also takes aim at ASPEN. “Evertz [system] has compromises that are not aligned with the longer term vision of the industry,” says CTO Steve Reynolds. “It’s evident that ASPEN was a short term solution to a short term problem.”
Mo Goyal, product marketing director at Evertz, hits back: “The big difference between AIMS and ASPEN is that we’ve used a proven standard … while AIMS is promoting TR-03 as a possible path for IP [and] it’s not proven.”
Understandably, kit vendor trade body IABM, takes a diplomatic tack. “Industry standards remain important and should continue to be one benchmark of stability,” says Ive. “However, looking for a single standard is no longer tenable for every aspect of the industry. The value of standards is increasingly in the open documentation of important parameters, ensuring that more than one supplier can produce systems that will be compatible or operate in a consistent way. If one standard leads others, this will come from popular use.”
It would be wrong to paint this as a VHS versus Betamax argument. Evertz in particular is conciliatory. “It is easy for us and our customers to deploy ASPEN today but that's not to say if and when something down the pipe is proven and has commercial value that we can't make adjustments to it,” says Goyal. “It's all software and we have the flexibility to adapt.”
Benelux leads the way
The second major recent live IP first was a multi-camera production of magazine show Carlo's TV Café aired by Dutch broadcaster RTL4. All production operations were centralised in data centres at the Media Park in Hilversum, fed by IP-streamed media over a fibre network from the event, studio and centralised galleries, and played out from DutchView Infostrada's cloud system. The demonstration illustrated how IP connectivity can be used to reduce on-location and in-studio resources.
That these pioneering activities have taken place in the Benelux raises a question as to whether this is by coincidence, or perhaps proximity to funding in Brussels, or if there's anything special in the region's waters. 
High-tech industries in the Netherlands can take advantage of Amsterdam's location on top of one of the world’s largest internet hubs: the Amsterdam Internet Exchange (AMS-IX). One of them, Dutch encoding specialist Mobile Viewpoint, has had its technology exclusively chosen by the EBU as part of EBU Flex, an all-IP delivery network marketed as a lower-cost alternative to satellite or fibre for contributing live links. Already deployed by the EBU for coverage of the US Presidential campaigns, the system will provide live links for the voting of all 28 countries in the upcoming Eurovision Song Contest from Stockholm.
Belgian physicist Ingrid Daubechies is credited with inventing the wavelet compression that is core to the JPEG2000 codec, work that Belgium's IntoPix has advanced with TICO. There are strong image sensor developers in the country too, including super-motion outfit I-Movix and projection manufacturer Barco. The sports and news server solutions firm EVS is headquartered near Liege.
While the BBC conducted IP tests from the Commonwealth Games in 2014, the Low Countries' reputation for innovation could simply rest on their inhabitants' ability to collaborate.

“It is in our DNA to work together as a small community with different nationalities,” says Michel de Wolf, former EVS CTO and consultant to the EBU. “US companies are less used to working with people from Europe, while the Japanese have to overcome the language barrier.”

Monday 18 April 2016

Ad Fatigue is Deterring Viewers; What's to Be Done?

Streaming Media

As part of a move away from QoS to QoE, Conviva is introducing Ad Insight, which will apply the capabilities of its video playback experience monitoring to the ad experience.


The performance of online ads should be considered a valuable business metric as analytics firms say Quality of Experience, not content, is the new king.
We all know online video content is subject to buffering and delivery issues but should that also include the ads? While VOD and live event programming is routinely passed under the microscope of realtime network and storage resource planning, the technical performance of the ad playback is often glossed over. It's a glaring gap that analytics providers are drawing attention to.
"Publishers have relied on decades-old models for selection and placement of ads based on methods which worked for linear broadcast television, but these are broken and not suitable for internet television," says Keith Zubchevich, chief strategy officer at video optimization firm Conviva. "There is data available today which enables online publishers to monitor the impact of ads on viewer behaviour but no one is using it. They are just guessing. Publishers have never had to think this way and have never seen a need for this data. If an operator continues to overload with commercials when the data is telling them that this is adverse to a viewer's experience they will go out of business."
"The response to ads is not well understood," says Andy Hooper, VP cloud solutions, ARRIS EMEA. "If I'm a customer of an online video site and I have to watch an ad to get to the content perhaps I can click away after 5 seconds or perhaps I have to watch the whole roll. Either kind of strategy will deter some people, but service providers are not fully aware of this. In another instance, when playing streams on my app from my TV service provider, I sometimes get freezing ads that stop me getting to the content. This is a big problem and one that reflects back on the service provider, yet too often the service provider does not know enough about the totality of the customer experience to do anything about it."
Conviva says its studies have shown as much as a 58% viewer churn based on poor online ad experiences. Compounding the relatively unsophisticated targeting of online ad delivery, experience issues with online ads result in ad breakage, make-goods, potential fraud, and lost availability of ad inventory, not to mention a loss of goodwill from viewers, says the firm.
"The impact on a viewer's experience from ads is massive," says Zubchevich. "If I watch a show and it's riddled with ads I'm getting ad fatigue and I'm beginning to look for content elsewhere. The fundamental next step for service providers is to monitor ad impact."
Not coincidentally, Conviva is launching an Ad Insight product that essentially expands the capabilities of its video playback experience monitoring. It is in beta and launching at the end of April.
Ad Insight, explains the company, enables publishers to run a more profitable streaming advertising business by optimizing ad strategies, giving them a real-time look at how ad placement, duration, and frequency correlate with video engagement. This, it says, provides companies with the right tools to reduce ad fatigue and viewer churn while increasing revenues.
"Service providers could do more in this area," asserts Johan Bolin, VP of Products, Edgeware. "Today, very little is being done, partly due to data being sourced from two different parts of the business. There is data from the ad insertion server about the actions of customers interacting with an ad, and data from the ad streaming server about the rendering of that ad. There is a need to cross-correlate this data. As ad personalisation (targeting) grows, I would assume service providers would see these reports as much more of a requirement."
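The cross-correlation Bolin describes amounts to joining records from the two servers on a shared session key. A minimal sketch, with all field names and values hypothetical rather than drawn from any real system:

```python
# Ad-decision records from the insertion server...
insertion_log = [
    {"session": "s1", "ad_id": "a9", "slot": "preroll"},
    {"session": "s2", "ad_id": "a9", "slot": "midroll"},
]
# ...and rendering records from the ad streaming server.
streaming_log = [
    {"session": "s1", "ad_id": "a9", "buffering_ms": 40, "completed": True},
    {"session": "s2", "ad_id": "a9", "buffering_ms": 3200, "completed": False},
]

# Join the two sources on (session, ad_id) to see how each decided ad played out.
rendered = {(r["session"], r["ad_id"]): r for r in streaming_log}
joined = [{**ins, **rendered[(ins["session"], ins["ad_id"])]} for ins in insertion_log]

abandoned = [j for j in joined if not j["completed"]]
print(len(abandoned))  # 1: the heavily buffering midroll was not watched to the end
```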
All of this plays into a wider argument that the customer's experience with a service provider now spans multiple traditional data silos. This ranges from the broadband call centre to the TV call centre, across to operational teams examining video player and session data to digital marketing teams interrogating intent to purchase. It probably also includes the emerging field of home network management.
"This is not being done at anywhere near enough scale by service providers," says Hooper. "Customer experience management crosses organisational boundaries. Some service providers are addressing this by installing a chief digital officer or customer experience executive, even at board level, but they need to do more."
While service providers have tracked and reacted to technical Quality of Service (QoS) issues for some time, executives point up a trend toward tracking the subscriber's end to end quality of experience.
In its OTT: Beyond Entertainment Survey published in November, Conviva concluded that one in five viewers will abandon poor experiences immediately and that 90% of viewers choose services that deliver a superior experience.
This leads Zubchevich to argue that content is no longer king. "For the first time you can forget quality of content as being the single most valuable metric," he says. "Anything you put online has to be marketed and monitored for experience. It must be better than TV."
Publishers still care about the quality of content, of course. Zubchevich's contention is that with so many sources to which a consumer can turn to get the same content, the defining factor will be the experience they receive from the site serving it.
"The number one currency is QoE," he says. "We are just starting to see operators look at marketing the experience of viewing as important as the content itself. This is not something they've ever collected in the past and it's a fundamental shift."
"We need to redefine what QoS is. QoS has meant basics such as 'is the stream available at all?', 'is it available most of the time?', 'can I watch it largely uninterrupted?'," Zubchevich says. "QoE takes a zero-tolerance approach to starting the stream. QoE is also about resolution. If I watch an HD or UHD TV broadcast and move to an IP-based provider I am not expecting a poor experience in comparison. What used to be acceptable is no longer acceptable, and failure to address buffering or bitrate issues in that moment means you lose the consumer.
"Everything we see tells us that QoE has a very direct and quantifiable impact on engagement, and engagement subsequently has a very direct correlation to the financial stability of the service."
Netflix and other pure-play streamers have built their business on a data-centric strategy and are not hidebound by the legacy of separate technical and consumer-facing parts of the business that telco and satellite pay TV providers face.
Nonetheless, ARRIS' Hooper contends, "the mantra 'content is king' has been a fallback excuse for some service providers for not getting this right. As the market fragments into different content sources I think you'll find that consumers will gravitate over time to the site where they're having least friction. That means pay TV doesn't just have their business threatened by OTT but by new data pipes which can provide a greater customer experience journey through the content lifecycle. Having exclusive content deals is a defensive mechanism. Enabling a better customer experience will deliver more positive brand benefits in the longer term."

Faster than a ray of light

Broadcast 
Lytro’s light-field cinema camera could revolutionise film-making, while developments in holographic display are bringing Star Wars-style technology closer to reality.
Despite swapping celluloid for silicon, the fundamentals of recording images haven’t changed since the invention of photography.
New advances in light-field technology, first conceived in the 1840s, are about to change all that. Light-field is a way to capture or project rays of light in a scene.
Technically, a light field is 5D and includes three spatial dimensions (X, Y, Z) plus two angular dimensions, describing the direction and intensity of the light ray.
Record all of this with enough fidelity and in theory you have a holograph. While there is currently no commercially available display capable of showing a holographic video (see box), there are potential applications in using light-field data flattened into 2D to create visual effects and other image manipulations in post.
Moving rapidly from science-fiction to experimentation, light-field has made it onto the agenda at the SMPTE Cinema Conference at NAB and at the MPEG and JPEG committees, where a new working group on the topic, plus sound fields, has just been established.
What’s more, the company that brought light-field to the mainstream in 2012, with a consumer stills camera that enables users to refocus after capture, claims to have developed the world’s first light-field cinema camera.
“Light-field is the future of imaging,” declares Jon Karafin, head of light-field video product management at Lytro.
“It is not a matter of ‘if’ but ‘when’.” Light fields can be captured in two ways. One is to synchronise an array of cameras, each recording a different point within the same space; the other is to place a microlens array (MLA), comprising hundreds of thousands of tiny lenses, in front of a single sensor. Lytro has chosen the latter approach, building on a decade of software R&D, 18 months of intensive hardware design and $50m (£35m) in funding.
The company earlier released the stills camera Lytro Illum, and last November announced Lytro Immerge, a video camera designed for virtual reality, which it has not yet released. These products are stepping stones to the company’s new system, the specifications of which are astounding.

Currently in alpha test and due to launch during the third quarter of this year, the Lytro Cinema Camera carries the highest resolution video sensor ever made at 755 megapixels (MP), capable of recording 300 frames per second (fps) through an MLA comprising more than 2 million lenslets.
The next highest resolution sensor publicly announced is a 250MP model in development by Canon.
By contrast, HD equates to 2MP and 4K to 9MP. The resolution needs to be vast for the system to process the unprecedented volume of information. According to Lytro, the effective output resolution will be 4K. “We are leapfrogging the leapfrog, if you will,” says Karafin. “We are massively oversampling the 2D to be able to change the optical parameters in post.
Everything that makes the image unique is retained but you can re-photograph every single point in the field of view.”
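The scale of that oversampling follows directly from the quoted figures:

```python
sensor_pixels = 755e6    # the 755MP light-field sensor
lenslets = 2e6           # MLA of more than 2 million lenslets
output_4k = 3840 * 2160  # ~8.3MP effective 4K output

print(sensor_pixels / lenslets)          # ~378 photosites behind each lenslet
print(round(sensor_pixels / output_4k))  # ~91x oversampling versus the 4K output
```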
For example, the shutter angle and frame rate can be computationally altered in post. As studios move towards higher-frame-rate cinema (Ang Lee’s Billy Lynn’s Long Halftime Walk was shot at 120fps), Lytro thinks there’s a market for being able to render the same project at 24fps or 120fps for theatrical release and 60fps for broadcast, at the touch of a button.
With plug-ins for The Foundry’s Nuke available on launch, the system has a route into facilities where calibrations including depth of field refocus, tracking, stereo 3D and Matrix-style VFX can be created from the same raw information. Lytro admits the technology is “very challenging” and that work needs to be done on the software algorithms, compression system and hardware.
Nor will it be cheap. Too large to be handheld, the camera needs to be physically mounted to a dolly or crane and be cabled to a server up to 100 metres away.
Video village
The server itself is powerful enough to crunch data up to 300GB/s. Even then, the server will sit in a ‘video village’ supervised by technicians. The camera requires “a non-standard optical format” and Lytro will offer the unit with custom focal lengths.
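That throughput figure squares with the sensor specification, assuming roughly 10 bits per photosite (the article does not state the bit depth):

```python
photosites = 755e6  # sensor resolution in pixels
fps = 300           # maximum frame rate
bit_depth = 10      # assumed bits per photosite, not a quoted figure

gb_per_second = photosites * fps * bit_depth / 8 / 1e9
print(round(gb_per_second))  # ~283 GB/s of raw data, close to the quoted 300GB/s
```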
The whole system, including operators, is being offered for rental. Cinematographers, already wary of virtual production technologies eating into their core crafts of camera position and light exposure, are likely to feel threatened.
Anticipating this, Lytro has incorporated a physical focus control and aperture check on the camera to give DoPs reassurance that the decisions they make on set are preserved down the line.
“There are those who swear by film as a capture medium, but for most cinematographers there is no right or wrong, just a tool that best meets the creative requirements,” says Karafin.
“Ultimately, this is a disruptive technology and we need the feedback of the studio, cinematographer and VFX community to make sure what we are creating meets their needs, as well as helping them understand the creative control this unleashes.”
Light-field-captured video may appear little different to the viewer until there are holographic screens capable of projecting a three-dimensional image.
Unsurprisingly, there are major developments here too, with companies including Zebra, Holografika and Leia3D among those with designs.
“We are not there today but we will cross that threshold,” says Karafin. “Holographic is the next generation of display technology and will truly change the way we all think about capturing native imagery.”

LIGHT-FIELD RIVALS AND HOLOGRAPHIC DISPLAY

Fraunhofer IIS
Fraunhofer researchers believe light-field technology could be a more efficient means of creating VFX in post. It has test-shot a live-action film from a 16-HD-camera array, but its main work has been in devising software to compute the data and route it to a conventional post environment.
A plug-in for Nuke will be available soon. “An MLA will not work so well for movies because the baseline difference between the images captured by one sensor is too small,” says Fraunhofer’s Siegfried Foessel.
“It’s good for near distance but not larger distances, and for this reason we work on camera arrays.”
Raytrix
Raytrix has designed a microlens array, with three lens types in front of a 42MP sensor, to capture 2D video plus 3D depth information.
This outputs 10MP at 7fps. Raytrix markets to the robotics and science industries because co-founder Christian Perwass believes light-field systems are not suitable for capturing most film scenes.

“They are workable with close-up subjects like a face, but if you want to extract depth information for scenes 10-20 metres away, you might as well use standard stereo 3D cameras,” he says.
Visby Camera
The US start-up is in stealth mode on a light-field VR system until 2017, the first part of which will be a codec.
Founder Ryan Damm explains: “The potential data stream with light-field is enormous since you need to encode and decode a 5D image at every possible angle.
To make this workable, these should not be much larger than today’s large video files.”
Leia3D
A team of former HP researchers is developing a holographic display using diffraction patterns.
Due for launch next year, the prototype is able to process 64 views with a depth that lets a viewer move their head around the image. Samsung and Apple have patents filed on similar technology. “Our patent has priority,” says co-founder David Fattal. “We are talking with them.”

It has also invented ‘hover touch’, the ability to interact with a holograph without touching the screen, and may combine this with technology that gives a physical sensation of touching a hologram, based on miniature ultrasonic microphones developed by the UK’s Ultrahaptics. “Holographic reality will be on every screen, everywhere,” says Fattal.

Friday 15 April 2016

TV music: what’s the score?

Broadcast 

The much improved range offered by production music libraries is a bonus for programmemakers, but commissioned music can take a show to another level, says Adrian Pennington.

http://www.broadcastnow.co.uk/features/tv-music-whats-the-score/5102599.article?blocktitle=Features&contentID=42957

The natural home of bespoke composition is drama, but programme-makers of all stripes want to give their production a distinctive voice and identity, a big part of which is carried aurally.
“Composed music works really well for formatted entertainment shows such as The Apprentice or The Singles Project, where you can have it timed perfectly for a reveal or a story beat,” says Lime Pictures head of non-scripted Derek McLean. “We try to make reality shows feel more like drama.”
Ironically, music in drama has tended to become less overt as a result. “For documentaries and reality, there is often a need to create suspense and mood using a type and tempo of music traditionally associated with narrative drama, because there may be relatively little actual drama happening on screen,” says composer John Hardy.
While off-the-peg music has its place, there is a risk that it could feel generic. “You can never get the subtlety and sophistication from production music that you can get with a specially composed piece of music,” says Anne Dudley, whose credits include BBC1 family comedy Billionaire Boy.
It is the flexibility of being able to tailor, dissemble and time a bespoke track to the story that often attracts producers. “In documentaries, you need quite specific music that lifts the production but lets the footage speak for itself, rather than overpowering it,” says Grace Reynolds, executive producer at Twofour, who has used composed music on shows including Channel 4’s Royal Navy School and Educating Yorkshire. “As much as we may need a full dynamic tune, we are often crying out for simpler tracks,” she adds.
Philip Guyler scored BBC1’s National Lottery Stars and Hollywood movie Her, while his library compositions were used in Ricky Gervais’ Channel 4 comedy Derek and AMC’s zombie drama The Walking Dead.
“With library music, you can probably catch the mood and atmosphere, but you can’t react very closely to the emotion on screen,” he says. “You can move a library track around a bit, but you don’t have the flexibility to tailor it to the picture.”
The right fit
The fast-turnaround nature of some shows can make fresh compositions impractical, though a decision on musical direction will take the content into account.
“Composed music doesn’t work for TOWIE,” McLean says, because of the show’s tight editorial deadlines and the decision to feature a lot of commercial music to keep the storyline current (it also had a “very successful” product placement deal with Ministry of Sound’s Marbella Anthems album series).
For ITV Be spin-off Life On Marbs, Lime had a particular Balearic soundscape in mind, for which it commissioned composer Dobs Vye.
The two routes are not mutually exclusive. For Hollyoaks’ recent serial-killer plot, regular partner Audio Network supplied catalogue sounds and created bespoke music.
“It can be more expensive, but it’s worth the investment,” says McLean. “If you can negotiate a share of the rights, specially composed music becomes a revenue stream for the producer and the library gets to retain the music for reuse in other productions.”
Composed music is perceived as being expensive but, as Guyler points out, a good composer will adapt to the budget. “Electronic music can produce realistic acoustic-sounding scores and you can overdub the sample with live music to give more realism. This sort of production can be done cheaply.”
Paul Farrer (1000 Heartbeats, The Chase, Judge Rinder, The Weakest Link) adds: “Obviously it depends on the kind of score you want, but it’s always much less expensive than you think. As with other TV craft disciplines, the key is communication. Be up-front and trusting by sharing your concerns about time or budget – 99.99% of the time the composer will respect that and work with you to provide their best work for the budget.”
Working to a brief
A brief can be as open as half a dozen adjectives or fully finished pilot episodes with temporary music. “I’ve had people send me hours of music on CD to soak in and even had 3D set models delivered to give me the sense of scale,” reports Farrer. “The worst jobs are when producers have become welded to a piece of temp music that they don’t have the rights to use.”
Most discussions tend to be about tone. “‘Too cheesy’, ‘not catchy enough’, ‘too scary for afternoon audiences’,” says Farrer. “Once you’ve figured that out, the rest falls into place. Generally speaking, the more creative freedom you can allow, the better it goes for everyone.”
Highlighting specific music or elements of a track is often a better guide than a verbal description. “People don’t tend to know what they want until they hear it, so if a producer has only a vague idea, I’ll request that they identify some existing tracks that they like as a starting point,” says Guyler.
Producers will issue a pitch to half a dozen – sometimes more – composers and production music services. This puts added pressure on the composer to devise the right sound. “It’s a bit of a lottery,” says Guyler. “You might not win the pitch, but then hear music used in the final show that is nothing like the initial brief.”
A pitch can be anything from a sample of previously produced music or a 30-second taster to a full-blown demo. Demos are supposed to command a fee but there are reports that producers increasingly expect them to be done without payment.
“When you get called for a face-to-face pitch, you are aware that lots of others are going for the same job and sometimes you have to articulate coherent ideas before you’ve seen a frame of a picture,” says Dudley.
Composers increasingly find themselves working without sight of the locked-off picture but, much like colourists, it is their skill at interpreting a brief and collaborating with a producer and editor from script to post that is in demand.
“When you are working at speed, then any information the producer can give you is a help,” says Hardy, who scored BBC1 drama Hinterland using instruments as diverse as a bowed psaltery and wind chimes recorded forwards and backwards. “Lots of late changes to the script can be a nightmare if the music is no longer of the right duration.”
Reynolds adds: “I try to contact composers early on and want them to understand the project, with lots of meetings for them to share their thoughts on how music will fit in. We’ll show them footage as soon as we have it and the dialogue continues through the edit as they send tracks and we provide more detailed feedback about specific scenes. Relationships and trust are really important.”
On a returning series, producers will want to retain the musical continuity of the programme brand while moving the show forward. “Labelling and cross-referencing every audio version can come in handy if, for example, the production reintroduces a character from an earlier episode and you can quickly find and insert elements of their musical signature,” says Hardy.
He has used stems (audio elements) in a novel way to create an online game accompanying the third series of Hinterland, in which users can create their own soundtrack to a clip from the drama using a mix of separate audio pieces.
“The days of production music being seen as the poorer cousin to commissioned music are long gone,” says Farrer. “People will always want new music, just as audiences will continue to demand innovative film and TV content. There are parallels with photography. Image libraries are huge and instantly accessible. Does this mean we no longer need new photographs? Of course not.”



Case study: Scoring Poldark

Anne Dudley is working on the 10-part second run of BBC1’s Poldark after scoring the first run of eight. “With a long-running series, there’s more opportunity to develop music than with a feature film,” she says. “The overall aim is to hide the music from the viewer while heightening the emotional content.”
She researched Cornish folk music and the history of the drama’s late 18th-century setting before meeting executives at producer Mammoth Screen to win the pitch.
“I went the extra mile and trawled through some film and TV soundtracks that I thought were in the right vein, to give us some references,” she says.
Dudley explored the idea further with Mammoth managing director Damien Timmer before replacing the temp music on a couple of scenes. “At this point, I began to understand the vocabulary the producers were using,” she says.
After a spotting session with the producer and editor, detailing the dramatic drive of each scene, Dudley spends two to three weeks on each episode, typically working to a final cut. “We’re scoring about 35 minutes per hour. Starting a series from scratch is always hard, but as you go on, you can return to certain ideas for characters or emotions.”
The series’ composition will combine elements of the original score with fresh music IDs to accompany new plot threads, themes and characters. Dudley writes on piano before orchestrating for violin and harp soloists, then oversees recording with an 18-piece string orchestra at Angel Studios.

Ang Lee's war drama 'Billy Lynn' faces projection challenge

Screen Daily 
EXCLUSIVE: Only a handful of exhibitors will be able to screen Ang Lee’s anticipated drama as the director intends.
When Ang Lee’s Billy Lynn’s Long Halftime Walk goes on release in November only a handful of exhibitors will be able to screen it as the director intends.
The TriStar Pictures war comedy-drama, starring Kristen Stewart, Vin Diesel, Steve Martin and Screen Star of Tomorrow Joe Alwyn in the title role, is the first to be shot in a combination of 4K resolution in stereo 3D and at 120 frames a second (fps) - a bold specification that exceeds all DCI compliant presentation systems.
The Oscar-winning director, who previously pushed the boundaries of 3D with Life Of Pi, spoke last year at CinemaCon about shooting at 120fps, which was in part chosen as a means to immerse viewers in Billy Lynn’s combat scenes.
“What Ang Lee is aiming for cannot be done on any DCI-compliant equipment today,” said David Hancock, director, head of film and cinema, IHS Technology, who was speaking to Screen as part of an upcoming feature on film specifications.
“The latest [projectors] can show 4K 60fps but if you wanted to show Lee’s movie in 3D you would need two of them [one for each eye of the 3D view].”
Digital Cinema Packages (DCPs) can be made in 4K 3D 120fps but they are non-compliant and there is not a DCP player or Integrated Media Block (IMB) - a server necessary to throughput the data - capable of handling that right now.
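Some back-of-envelope arithmetic shows the scale of the problem. Assuming uncompressed 12-bit-per-channel RGB in a 4K DCI container (an illustrative assumption — real DCPs are JPEG 2000-compressed down to a few hundred megabits per second), the raw picture bandwidth of 4K 3D 120fps is:

```python
# Rough bandwidth estimate for 4K stereo 3D at 120fps, assuming
# uncompressed 12-bit RGB (an assumption, not a published spec).
WIDTH, HEIGHT = 4096, 2160      # 4K DCI container
FPS = 120                       # frames per second, per eye
EYES = 2                        # stereo 3D
BITS_PER_PIXEL = 12 * 3         # 12-bit RGB

def uncompressed_gbps(width, height, fps, eyes, bpp):
    """Raw video bandwidth in gigabits per second."""
    return width * height * fps * eyes * bpp / 1e9

rate = uncompressed_gbps(WIDTH, HEIGHT, FPS, EYES, BITS_PER_PIXEL)
print(f"{rate:.1f} Gbit/s uncompressed")
```

Roughly 76 Gbit/s of raw picture has to be compressed into, and played out through, systems specified for a tiny fraction of that, which is why specially modified IMB arrangements are currently required.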
Preview
Lee is set to screen a 15-minute preview of the film in its fullest specification on Saturday (April 16) at the SMPTE Future of Cinema Conference in Las Vegas. The screening is made possible by a specially modified arrangement of an IMB with dual Christie Mirage projectors.
“This is the only projector capable of doing this and you need two to make it happen,” confirmed Christie Digital’s senior director, product management, Don Shaw.
“Our solution is not intended for cinemas but for theme parks.”
“No current technology in the market can do 4K 3D at 120fps per eye,” added Oliver Pasch, sales director, Digital Cinema Europe, Sony Professional Solutions Europe. “No system has the necessary bandwidth.”
There are reports that Texas Instruments is developing technology to upgrade projectors to play the format and there is work being done on more efficient compression algorithms to improve the efficiency of systems without damaging the overall image quality.
“The bottom line is that we will need adoption of these better encoders and upgrades to the projection systems to get to the point where 120fps 3D 4K can be distributed as a playable DCP,” commented Richard Welsh, chair of the SMPTE Future of Cinema Conference.
High frame rate
High frame rate (HFR) filmmaking gained a high profile push with New Line Cinema / MGM’s The Hobbit franchise, directed by Peter Jackson in 2012.
The effect, which in the case of The Hobbit doubled the traditional 24fps at which film is shot and presented to 48fps, eradicates motion blur to deliver a crisp hyper-realistic look.
Ahead of that film, and anticipating a glut of HFR content, exhibitors upgraded select screens at a cost of $6,000-$10,000 per screen with HFR technology.
In figures supplied by IHS, at the end of 2015 there were 162,000 cinema screens worldwide, of which 149,000 were DCI-compliant digital screens and 74,000 were 3D.
However, the number of installed projectors capable of showing HFR content is harder to quantify since there is no data source on this, according to Hancock.
Estimates range from as low as 3,000 screens, which upgraded to show The Hobbit, to as many as 60,000 screens worldwide today which are equipped with the latest projection systems.
However, the maximum that Sony digital cinema projectors and the DLP systems of NEC, Christie and Barco can playback is 2K 3D 60fps or 4K 3D.
Released in several formats
The likelihood is that Billy Lynn’s Long Halftime Walk will be released in several formats including 2K 3D 48fps, the same specification as The Hobbit, with just a handful of specialist venues outfitted with non-DCI compliant technology to screen Lee’s ultimate vision.
“Lee is pushing the boundaries by trying to show what could be done. But there is no business model to get the film [at its maximum creative intent] into the hands of exhibitors,” said Shaw.
Lee’s comedy war drama, adapted by Simon Beaufoy and Jean-Christophe Castelli from the novel by Ben Fountain, centres on a heroic infantryman’s final hours before he and his fellow soldiers return to Iraq.
In 2014, filmmaker and VFX artist Douglas Trumbull presented his short sci-fi film UFOTOG at the IBC trade show in 4K 3D 120fps on dual Christie projectors.
Trumbull is developing a system called MAGI for processing content to that high specification and is seeking to license the technique to studios.
TriStar parent Sony Pictures declined to comment for this article.
Screen International will explore this issue in greater detail in an upcoming ScreenTech feature on film specifications.

Lytro debuts holographic video camera for cinema

Screen Daily 
The prototype is being revealed at the Future of Cinema conference (part of the NAB show) in Las Vegas this week.
Camera maker Lytro is launching a first of its kind light-field video system intended for feature film with significant implications for the future of production.
The system combines the highest resolution sensor ever developed with specialised software to enable frame rate, aperture and focal length to be determined in post-production.
The prototype is being unveiled at the Society of Motion Picture and Television Engineers’ (SMPTE) Future of Cinema conference in Las Vegas on April 16, part of the NAB Show (Apr 16-21), along with a short film showcasing the technology, and will be launched commercially in Q3 this year.
While conventional cameras record light as a two dimensional image - a mechanism that hasn’t changed since the invention of photography - Lytro’s camera is able to capture multiple viewpoints on set in a single shot, including the direction and intensity of the light rays.
This data can be used to rebuild detailed pictures of the depth and colour of objects in a scene enabling traditional craft decisions such as focal point, camera position and even lighting to be reframed after the event.
“Currently, key creative decisions such as the frame rate are ‘baked in’ at the time of capture. Light-field means you can computationally change the shutter time, frame rate and aperture as a virtual camera process,” said Jon Karafin, head of light-field video, product management, Lytro. “This is a disruptive technology and we need the feedback of the studio, cinematographer and VFX community to make sure that what we are creating meets their needs.”
One advantage of the technology, according to Lytro, is that studios will be able to create versions of the same film at 24 frames per second (fps), 48 fps or 120 fps as desired for theatrical releases and at 60 fps for broadcast transmission all from the same source material, rather than limiting such decisions to the time of shooting.
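The frame-rate flexibility Karafin describes can be pictured with a toy retiming sketch. This is not Lytro's algorithm (its processing is proprietary); it simply illustrates the principle of synthesizing a lower frame rate, and a virtual shutter, by averaging groups of frames from a high-rate capture:

```python
import numpy as np

def retime(frames, src_fps, dst_fps, shutter=0.5):
    """Synthesize a lower-frame-rate clip from a high-rate capture by
    averaging the source frames that fall inside each virtual shutter
    interval. `shutter` is the fraction of the frame period the virtual
    shutter stays open (0.5 is roughly a 180-degree shutter)."""
    n_out = int(len(frames) * dst_fps / src_fps)
    out = []
    for i in range(n_out):
        t0 = i / dst_fps                      # virtual frame start (seconds)
        t1 = t0 + shutter / dst_fps           # virtual shutter close
        lo = int(round(t0 * src_fps))
        hi = max(lo + 1, int(round(t1 * src_fps)))
        out.append(frames[lo:hi].mean(axis=0))
    return np.stack(out)

# One second of toy 300fps "footage": a single scalar "pixel" per frame
clip = np.arange(300, dtype=float).reshape(300, 1)
print(retime(clip, 300, 24).shape)   # 24 synthesized frames
```

The same 300fps master could be rendered at 24, 48, 60 or 120fps simply by changing the `dst_fps` argument, with the virtual shutter controlling how much motion blur each version carries.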
Other benefits of light-field recording include making Virtual Reality and VFX production less resource-intensive and enabling live action to be treated in the same way as computer generated imagery. Lytro has developed a plug-in for The Foundry’s software Nuke giving the raw data a route into post-production suites where image manipulations such as a change in the depth of field, a shift in zoom or matting for virtual backlots is possible.
The unit will output 4K resolution but in order to do that, alongside computing information such as the depth of objects in a scene, Lytro has developed the highest resolution video sensor ever made at 755 Megapixels (MP) shooting 300 frames a second.
This is more than three times the resolution of Canon’s highest rated 250 MP sensor, which is still in development. In comparison, High Definition is the equivalent of 2 MP.
Before light hits the sensor it will pass through a micro-lens array comprising over two million hexagonal lenses each with a slightly different perspective on the scene. The captured information is otherwise known as a hologram.
“We are leapfrogging the leapfrog, if you will,” said Karafin. “We are massively oversampling the 2D image to be able to change the optical parameters in post. Everything that makes the image unique is retained, but you are able to rephotograph every single point in the field of view.”
Challenges
He admitted that the technology is “very challenging” and that work needs to be done on the software algorithms, compression system and hardware.
Too large to be handheld, the studio camera needs to be physically mounted to a dolly or crane and be cabled to a server up to 100 metres away. The server itself is being designed to throughput data at speeds up to 300GB/s (it currently works at 100GB/s).
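The quoted figures are at least self-consistent. Assuming roughly 10 bits per photosite (an assumption; Lytro has not published the sensor's bit depth), a 755MP sensor running at 300fps produces raw data on the order of the server speeds mentioned:

```python
SENSOR_MEGAPIXELS = 755      # from the article
FPS = 300                    # capture rate
BITS_PER_SAMPLE = 10         # assumed photosite bit depth (not published)

def raw_gigabytes_per_second(megapixels, fps, bits):
    """Uncompressed sensor data rate in GB/s."""
    return megapixels * 1e6 * fps * bits / 8 / 1e9

rate = raw_gigabytes_per_second(SENSOR_MEGAPIXELS, FPS, BITS_PER_SAMPLE)
print(f"{rate:.0f} GB/s raw")
```

Under that assumption the raw feed comes to roughly 283GB/s, sitting between the server's current 100GB/s and its 300GB/s design target.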
The server will sit in a “video village” off-set and be supervised by Lytro technicians. The camera requires “a non-standard optical format” and Lytro will offer the unit with custom focal lengths. The whole system, including operators, will be offered for rental. Lytro says some studios are running trials.
Anticipating negative reaction from cinematographers, Lytro has incorporated a physical focus control and aperture check on the camera plus keyframe metadata to provide reassurance to the DoP that the decisions they make are preserved into post.
“When the software is opened in post then whatever decision the DP or assistant or focus puller has made on set will be the same,” explained Karafin. “We are trying to be sensitive to that workflow.”
Advancements in tech
Light-field was first theorised 125 years ago but research only took off in the 1990s. While Lytro uses a single camera, German researcher Fraunhofer IIS has employed a multiple camera array to achieve a similar effect.
Developments such as these have put the technology on the industry’s map. Light-field will be discussed by SMPTE at the Future of Cinema Conference. In addition, a new working group exploring light-fields and soundfields (the audio equivalent) has been established by the MPEG and JPEG committees, chaired by Fraunhofer’s Head of Moving Picture Technologies, Dr Siegfried Foessel.
The Lytro Cinema Camera builds on a decade of R&D at the company which brought the first consumer light-field stills camera to market in 2012. Lytro made a move into light-field video last year after receiving a $50 million investment led by GSV Capital. It also made a quarter of its staff redundant in order to hire specialists in video and virtual reality. In November 2015, it announced Lytro Immerge, a light-field video system designed for VR acquisition which is yet to be released. The Cinema Camera is the company’s debut professional acquisition equipment.
Lytro was founded by Stanford-trained computer scientist Ren Ng in 2006 and has been helmed by CEO Jason Rosenthal since 2013. The company received a total of $150 million in funding from backers including Netscape co-founder Marc Andreessen.
Audiences will not notice any difference in content shot with a light-field video camera while presentations are made on flat screens. However several companies, including Leia3D and Zebra as well as tech giants like Microsoft and Samsung, are working on holographic or augmented reality displays.
“I have no doubt that light-field is the future of imaging,” added Karafin.

Thursday 7 April 2016

Sky Goes After Millennials With Live Streaming on Facebook

Streaming Media Europe

In a continuing effort to reach younger viewers on mobile devices, the pay TV powerhouse will start delivering sports coverage and live news on Facebook Live.
http://www.streamingmediaglobal.com/Articles/News/Featured-News/Sky-Goes-After-Millennials-With-Live-Streaming-on-Facebook-110295.aspx

Pan-European pay TV broadcaster Sky has revealed its intention to stream live news and sports coverage on Facebook Live this summer.
This does not mean live streaming of English Premier League matches, Formula One, Ryder Cup golf, or other sports to which Sky has broadcast rights. Rather, it means ancillary coverage of those events, in a bid to reach millennial audiences who are deserting the more conservative studio-bound presentation of traditional sports for coverage more in tune with the informal fan discussion they access on smartphones.
It's a strategy that has led Sky to invest $7m in sports streamer Whistle Sports and to partner with them on a social media channel in conjunction with Sky chat show brand Soccer AM last October.
Sky says live streaming to Facebook will offer a unique behind-the-scenes look at a number of major events and breaking news stories.
In a blog post Sky digital director Lucien Bowater said Sky had already been experimenting with live streams and that the results have been "impressive."
"In less than 24 hours since [Facebook Live's] release, more than 100,000 people viewed [former cricketer] Michael Atherton’s exclusive interview with [cricketers] Freddie Flintoff and Kumar Sangakkara at the T20 World Cup," Bowater wrote. "In news, similar numbers watched [Sky News journalist] Mark Stone’s live report on the migrant crisis in Calais."
He added, "The platform works for us because it allows our news and sports teams to connect with the audience in a slightly different way—offering a different perspective, one that might not otherwise be possible in such a fast moving environment."
Sky previously partnered with Facebook on 'Instant Articles' enabling Sky to host articles from news and sports teams on the social network, as they’d appear on the web.
"By working with them, we've seen more views, more shares, and more interaction," said Bowater. "Increasingly though, our digital strategy has been geared towards video. We’ve seen an incredible response to some of the videos we've uploaded to our own social media channels."
The Sky News, Sky Sports, and Soccer AM Facebook pages have amassed 14 million followers. In the past three months, the broadcaster says it has attracted almost 370 million video views, and in January, Sky News was one of the most watched Facebook video publishers in the world, Bowater noted.
This week Sky said it would stream some exclusive content ahead of boxer Anthony Joshua’s bid for the world heavyweight title, as well as coverage of the junior doctors' strike from Sky News.
Facebook is set to replace the Messenger button in its mobile app with a live video button as it ramps up competition with Twitter's Periscope. This will enable any user, not just major content owners, to live stream.
Facebook has also rolled out the ability to 'go live' in Facebook Groups and Facebook Events; 'live reactions' make it easy for viewers to express their feelings in real time during a live broadcast; and a new dedicated place on Facebook's mobile app lets users discover live video.

Wednesday 6 April 2016

Whisper Films On The Starting Grid

TVB Europe

After Channel 4 took over free-to-air F1 coverage for the start of the 2016 season, it put Whisper Films in the driving seat.
https://twitter.com/presteigneuk/status/719825291863056384/photo/1

One of the prized contracts in the outside broadcast calendar went to Whisper Films earlier this year, an award that caused some controversy.

The indie, which formed in 2010, is backed by Channel 4 as part of its Growth Fund, through which the broadcaster took a small stake in the company. When the BBC divested itself early of terrestrial UK rights to cover Formula 1 at the end of the 2015 season, Channel 4 took up the reins and put the presentation of it out to tender.

Established by Sunil Patel, who oversaw the BBC’s F1 output before leaving to launch the indie, alongside BT Sport presenter Jake Humphrey and ex-F1 driver David Coulthard, Whisper beat more seasoned sports producers like North One – which produced ITV's recent coverage – to the chequered flag.

Patel is unfazed by suggestions in the press of favouritism. “The pressure to succeed because we had this high profile win doesn't come into it. The pressure comes from all those on our team to deliver on our own high expectations. We are duty bound to keep fans entertained and to improve coverage where we can – there is no added pressure from any other party.”

Because of the tight nine week turnaround between landing the contract in mid-January and the start of the 2016 season in Australia – nearly two weeks of which was required for shipping equipment to Melbourne – Whisper wisely decided to rehire Presteigne Broadcast Hire as its OB partner.

Presteigne had designed and supplied F1 flypacks complete with air conditioning and power distribution systems for the previous seven years of BBC broadcasts and had the kit ready to go at its HQ in Crawley. It also supplies up to 15 crew including sound ops and engineers.

“By and large we are using the same kit as the BBC operation with one major uplift in editing,” explains Patel, who will executive produce C4’s coverage. BBC Sport had made a fateful decision to base its editing on Final Cut Pro 7 in 2011, just as Apple decided it would no longer support a professional version of the software. It was a sensible decision, then, for Patel to swap the suites out for four new Adobe Premiere systems.

On site, these are linked with EVS IP Director logging and search tools, themselves integrated with a trio of EVS XT3 servers and further hooked into an EditShare rack of collaborative storage. The rest of the kit contained in two flypacks remains the same and includes a Ross switcher, Lawo sound desk and Riedel Artist for talkback, with the only other significant addition being a Sony PMW-F5 with Canon Cine lenses to work alongside conventional RF cams.

“This will give us a real cinematic look for feature making, content we are familiar with given our heritage of branded high-end content,” says Patel.

Whisper has created a range of brand-funded sports content in association with companies such as Red Bull, UBS, Shell and Hugo Boss. It has also won conventional TV commissions such as BBC1 doc Racing With The Hamiltons: Nic In The Driving Seat, and produced highlights for ITV4’s coverage of DTM German Touring Cars. Whisper also produced BBC2’s studio-presented NFL highlights in the run-up to Super Bowl 50, introducing a touchscreen for pundit analysis.

Patel said he decided to apply for Growth Fund investment when TV commissions began to “dry up”, and he felt that C4’s backing would give Whisper “credibility” and better access to commissioners.

Formula One Management (FOM) runs a strict and well-oiled machine. Rights holders have to join the host feed at the F1 sting five minutes prior to race start and can only leave it once the podium ceremony is over. In between, they can only tailor the presentation with their own commentary.

“There is limited opportunity to do anything within the sport itself but the real difference is around the presentation aspect, hence our commitment to our talent line-up,” says Patel. “The difference will be in the insight we can give to viewers from the people we have in the pit lane and paddock.”

The FOM set-up is deliberately formulaic across the world. “There are new places – such as Baku and Mexico for 2016 - which we will be keeping a watch over this year but each venue has its own unique challenges,” comments Presteigne head of technology, David O'Carroll.

All the opening sequences and feature material are stored and played off the EVS. A catalogue of historic race material is also held there. “For example, if Lewis Hamilton does something special in practice or on race day and he refers to a previous race, we do have the ability to find that moment he is talking about and play that incident as live,” says Patel. “It's just a question of searching for the clip logged in IP Director.”

Much of the chatter in outside broadcast circles is about how technologies such as IP can be used to cut the costs of sending crew and kit to events around the world. The F1 flypack and its dozen or so engineers and technicians are already a slim-line production.

“The only aspect I can see coming back to the UK would be editing, which won't happen until the cost of fibre reduces and internet speeds increase,” says Patel. “The next generation of IP-enabled kit might hold the key to saving costs instead of transporting kit and the cost of hotels.”

Perhaps more than any other sport, Grand Prix racing would seem to lend itself to a higher resolution yet despite dabbling in stereo 3D and making the most of advanced wireless technology, FOM's coverage remains resolutely HD for this season at least.

“In principle we could supply 4K in the pods with some minor alterations,” says O'Carroll. “What is more challenging is the reliance on RF across the site. There's not a viable 4K link that would allow us to acquire 4K. That said, we can upconvert 50p from the camera, which would look pretty good, if not true 4K.”

“Given the detail and design of the cars, 4K would be amazing for Formula One, but until a platform like Sky (which is broadcasting the rest of the F1 schedule in the UK) offers 4K to consumers, I can't see it happening,” says Patel. “It will take a year or two.”


With a contract for ten races a year until 2018, the indie will be in the best place to anticipate an upgrade should FOM – or Channel 4 – decide to up the ante.