Thursday, 3 October 2019

Evolving the visuals - Si Bell shoots Peaky Blinders 5


British Cinematographer 

The period gangster epic Peaky Blinders is an established global hit drama and not one that showrunner Steven Knight is going to alter lightly. Yet as season five pushes the saga into darker territory, the show’s lead creatives felt the time was ripe for a subtle shift in visual storytelling.
“There was a lot of pressure we put on ourselves to make this as good as it’s ever been and also to try and elevate it,” says DP Si Bell, who previously worked with series director Anthony Byrne on TV projects such as Ripper Street and the feature thriller In Darkness. “We both wanted to stay true to what has made Peaky such a success but also bring a different energy to the story.”
Principally, this involved moving away from fast cuts in favour of devising a number of single developing shots in which the camera is choreographed to move around and with the actors according to the motivation in the scene. Bell likens it to a ballet – which was literally the case in one set-piece scene.
Season 5 picks up in the aftermath of the 1929 stock market crash. The world is thrown into turmoil with opportunity and misfortune – perfect for the Shelby family led by Tommy Shelby (Cillian Murphy) who is now an MP, but also for the forces of fascism, represented by the leader of the British fascist movement Oswald Mosley (Sam Claflin). Mainly set in 1930s Birmingham, the six-hour drama was shot in studios in Manchester with location work in Manchester, Stoke, Rochdale, Bradford and Liverpool.
“The story has moved on, so we needed to reflect that,” says Bell, “but we also wanted our version of Peaky to retain the stylised lighting set by George Steel in the original series. We watched all the previous episodes and worked out the lighting choices, including the slightly OTT style of large beams of light, lots of atmosphere, fire, explosions, and ramped-up speed. We weren’t going to change those elements.
“We did, though, want to cover more scenes in a progressive shot,” he continues. “It’s all about the energy of camera movement and having the ability to move the camera with the actors to try and tell the story in the most economical and interesting way. Plus, we wanted to use zooms and anamorphic lenses which are both new stylistic choices for this show.”

All previous Peaky Blinders were shot with ARRI cameras but Netflix was asking for “a true 4K deliverable,” reports Bell. “To me that meant only one thing: Red.”
Bell sought the advice of Sam McCurdy BSC, who at the time was shooting the Netflix drama Lost in Space on Red’s Monstro sensor.
“I totally respect Sam. He’s a friend and we discuss things quite a bit. He said his experience with Helium, which I first asked about, was good but he urged me to test the Monstro. I was sceptical, since I didn’t think that shooting Anamorphic would work with such a large sensor, but I trust Sam’s judgement, so I set up some tests.”
Bell went back to the lab – actually, his home – and shot side-by-side tests with the same lens on Red camera bodies housing Helium, Monstro and Gemini sensors.
“I was already impressed with the Helium sensor from previous tests and knew that the Red camera body would give us the flexibility we needed for operating. What I didn’t know, until I did some super low-light tests with candles, was just how much I could push the Monstro sensor and see how it would hold up. To be honest, I was blown away. It was really impressive how clean the image was in low light. Its skin tones and colour detail are really mind-blowing.”
The conundrum remained: how to fit an anamorphic lens onto a full-frame (40.96mm x 21.60mm) sensor. “We tested this further and found a mode perfect for us. If we shot 4K at 1:1 aspect ratio, we could use the maximum amount of the image circle of the lens. We shot the full height of the sensor and this allowed us to get a square anamorphic picture from the sensor. We’d de-squeeze the anamorphic to the 2:1 delivery aspect ratio and output an 8K x 4000 image.”
He adds, “This was perfect for us in every way, creatively and operationally.”
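The arithmetic behind that workflow can be sketched quickly. This is a minimal illustration only, assuming a 2x anamorphic squeeze and rounding the capture dimensions for clarity:

```python
# Illustrative de-squeeze maths for a 2x anamorphic lens on a square capture.
SQUEEZE = 2.0  # assumed horizontal squeeze factor of the anamorphic glass

def desqueeze(width_px, height_px, squeeze=SQUEEZE):
    """Return the de-squeezed (display) width, height and aspect ratio."""
    out_w = int(width_px * squeeze)
    return out_w, height_px, out_w / height_px

# A roughly 4K x 4K square capture de-squeezes to an 8K x 4K, 2:1 frame,
# matching the "8K x 4000" output Bell describes.
print(desqueeze(4000, 4000))  # (8000, 4000, 2.0)
```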

Initially, Bell was less keen than the director about switching from spherical to anamorphic glass. “I’ve never shot anything on anamorphic before and I was a little worried how this would work in combination with developing shots. We went through a crazy amount of lens tests in pre-production testing Canon K35, Leica, Lumilux, Panavision C Series and old Cooke lenses. Once we’d done that, I was much more confident that we could get the look we wanted and shoot it on schedule.”
The crunch was finding a macro lens that could capture an extreme close-up moving to a wide and back to a close-up in a single shot. In the end, Bell landed on a Cooke anamorphic /i Look 65mm T2.6 along with a full set of Cooke primes (25, 32, 40, 50, 75, 100 and 135mm).
“We used the 25mm for certain sequences but the lenses we lived on were 32 and 40, with the 65 macro for developing shots,” notes the DP. For more stylised moments using contra zooms and slow motion he selected an Angénieux Optimo 24-290mm spherical glass and Angénieux Optimo Anamorphic 56-152mm Zoom.
Bell teamed with regular Peaky Blinders colourist Simone Grattarola at Time Based Arts to maintain the show’s continuity of look with the Red rushes. He worked with DIT James Shovlar to test-shoot different looks for the series’ various locations and used these as reference for Grattarola to work from in the grade. Colour space was managed from camera to post within Red’s image processing pipeline (IPP2) recorded in RedCode 5:1.
“IPP2 worked extremely smoothly, protecting the highlights and allowing us more latitude to tweak the image in post,” says Bell. “I really enjoyed the workflow with Simone. We were all blown away by how great it looked.”

A Tiffen 1/4 black satin filter was deployed most of the time with NDs and rotating polarisers “just to take the shine and reflections off certain things.”
For Bell and Byrne, the proof of what they were trying to achieve lay in creating the developing shots. Many of these were executed on ARRI’s Trinity rig, on a Garfield mount or dolly, by Steadicam operator Andrew Fletcher with Bell on the wheels controlling the tilt.
“With (grip) Paul Kemp we worked a way of getting the camera from floor level to 11 feet in the air, moving the rig around the set without needing to put down any boards. Basically, we could move the camera where we wanted to. I’d control tilt from the monitor, Andrew controlled the movement up and down, and Paul moved the dolly and jib arm so it was a real team effort,” explains Bell.
Audiences won’t have to wait long for one of the series’ signature shots since the opening sequence of episode one does just this by establishing the Shelby clan in the iconic Garrison Pub.
“We enter the bar from the point of view of a messenger boy and move past the bar up and over the top of the snug (where the gang wheels and deals),” describes Bell. “We see people scattering from the snug and then we see Tommy approach from below. We move up to his eyeline, then to his hand and back to Tommy’s face, then out to a wide, and the shot develops through 180 degrees to follow Tommy moving toward the windows of the pub where we end on a group shot of all the family. It’s one shot covering eight pages of script and it was really challenging to choreograph and particularly to light. There was nowhere to hide lights apart from the ceiling where we put softboxes and flags.
“Since I wasn’t used to the Monstro, I was cautious at first about how well it would really capture the details in the shadows of locations like the Garrison Pub,” Bell adds. “As we saw dailies, I started to push more and more, and push the ISO more and eventually I got really confident with it and became braver in my choices. I pushed it as far as I dared, and I wasn’t let down.”


Monday, 30 September 2019

Facebook now wants your thoughts, literally

RedShark News
When Facebook made a two billion dollar play for Oculus Rift it triggered the first rush to mainstream VR. Now it’s spent half that amount buying a company that makes mind reading technology. The brain-computer interface race is on.
New York start-up CTRL-Labs is being incorporated into the social media giant’s augmented and virtual reality division, with the idea of bringing its mind-reading technology into consumer products.
Similar technology (not least a system already in development at Facebook) is worn on the head. CTRL-Labs’ twist has been to create a wristband that can decode ‘musculoneural’ signals into machine-interpretable commands. By wearing the wristband you'll be able to control a computer using just your mind.
“You have neurons in your spinal cord that send electrical signals to your hand muscles telling them to move in specific ways such as to click a mouse or press a button," Facebook VP Andrew Bosworth said in a Facebook post. "The wristband will decode those signals and translate them into a digital signal your device can understand, empowering you with control over your digital life."
Quicker than a reflex, critics carped, predictably, that Mark Zuckerberg was making one more land grab for our collective digital soul.
Erin Biba’s Tweet to Bosworth was typical: "I mean honestly who do y'all think you are. Don't you own enough of our data already? Holy s--- this is gross."
Facebook is already up to its ears in invasion-of-privacy issues, but the idea of a company like it owning your thoughts, potentially being able to pre-cog your own intellectual property, remains, for now, far-fetched.
A more profitable train of thought is why Facebook – and Elon Musk is another – is turning its attention to solving the brain-computer interface at all.
“Being able to recognise even a handful of imagined commands, like ‘home’, ‘select’, and ‘delete’ would provide entirely new ways of interacting with today’s VR systems – and tomorrow’s AR glasses,” Facebook blogged a year ago in relation to the prototype headset that can transfer one’s thoughts directly onto a computer screen.
The intent is hands-free communication without saying anything.
“Imagine a world where all the knowledge, fun, and utility of today’s smartphones were instantly accessible and completely hands-free... A decade from now, the ability to type directly from our brains may be accepted as a given. Not long ago, it sounded like science fiction. Now, it feels within plausible reach.”
The CTRL-Labs acquisition builds on this by targeting hands-free communication without physical movement.
“It’s the intention [to move], not the movement” itself that controls the avatar, explained Thomas Reardon, CEO of CTRL-Labs, at an industry conference last December.
In June, CTRL-Labs itself was in the acquisition business when it bought patents related to a wearable that enables control of robotics and PCs via gestures and motion.
If something can measure brainwaves to determine a movement a person is merely thinking about, even if they aren’t physically moving, and then translate that into movement on a digital screen, it would truly be something out of the movie Avatar, in which the lead character is disabled.
According to CTRL-Labs, whose newly minted multi-million-dollar founders both hold PhDs in neuroscience from Columbia University, measuring signals on the wrist rather than the head makes development of a commercial device easier. It would seem that determining the intent of neurons in the brain is harder to decipher than doing so from the hand, where the signals show less interference.
“There’s consistency across people due to the layout of the muscles [in the wrist],” said co-founder Patrick Kaifosh.
The next step will be to tie this into the core operating system for Oculus and pave the way for computer games played at the speed of thought.
In this scenario even the mixed reality futures imagined in Ready Player One seem anachronistic since the video game players still need full body haptic suits to participate. They are still tethered in rather ungainly real world fashion to the machine.
When the starting point is to conceive the human brain as a computer processor then the ultimate brain-computer interface is so transparent that you’d hardly notice.
CTRL-Labs’ aim is also to short-circuit what it believes to be the brakes on the brain’s system – the body itself. We can think faster than any AI (currently), they suggest, but it takes an age for those signals of intent to route to our hands or legs or eyes and therefore into action.
The only people ahead of Facebook, Google or Apple in this area will be the military who will no doubt be plotting thought-powered war games and techniques to hack the cortex of opponents.
If our thoughts can be inferred into actions for the playing of games, they can surely be altered for nefarious intent. See The Matrix for details.

Forget 8K, are you ready for 32K?



RedShark News
With 8K suddenly all the rage and as specifications for 8K tellies have just been agreed, there are already concrete steps to double and even quadruple the number of pixels for display.
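To put “double and even quadruple” in pixel terms: each step from 8K to 16K to 32K doubles both screen dimensions and so quadruples the total pixel count. A back-of-envelope sketch, assuming the common 7,680 x 4,320 8K baseline (real screens vary in shape and size):

```python
# Pixel arithmetic: each 'K' generation doubles width and height.
BASE_W, BASE_H = 7680, 4320  # common 8K baseline

def pixels(generation):
    """generation 0 = 8K, 1 = 16K, 2 = 32K."""
    w = BASE_W * 2 ** generation
    h = BASE_H * 2 ** generation
    return w, h, w * h

for gen, label in enumerate(("8K", "16K", "32K")):
    w, h, total = pixels(gen)
    print(f"{label}: {w} x {h} = {total / 1e6:.0f} megapixels")
# 8K is ~33 megapixels; 32K is ~531, sixteen times as many.
```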
Earlier this year Sony installed a 16K screen in the front of a cosmetics store in Yokohama, south of Tokyo. The 63ft-wide (19.2m) screen is believed to be the largest 16K display yet.
Sony has plans to make the product available, in custom sizes, for well-heeled consumers.
The screen is based on micro-LEDs, the same technology behind Sony and Samsung LED cinema screens, which use tiny, non-organic LEDs – three per pixel – to deliver colour and contrast quality, with extremely high brightness, on par with OLED, but seemingly without the manufacturing issues or lifespan deterioration of their organically emissive cousin.
While noting that 16K screens are likely to be adopted as a corporate niche for many, many years, respected pundit David Mercer told the BBC that “even 8K on a big display is almost mesmerising.
"When you get to this resolution it delivers almost a quasi-virtual reality experience as your eyes perceive there to be depth to the content."
That’s exactly the impact that we are promised with new technologies such as light field. Developer Light Field Lab is targeting large-scale theatrical and experiential venues first with its holographic displays.
It’s also the target for a new music and entertainment venue proposed for the East End of London by the Madison Square Garden group. If plans for the MSG Sphere get the nod from the London mayor later this year, it will house the “largest and highest resolution LED screen in the world” at a resolution which this reporter understands to be 32K.
The screen will also be curved to fit the structure’s golf ball design and allow for “immersive” performances (think Black Mirror’s ‘Rachel, Jack and Ashley Too’ with a giant holographic Miley Cyrus) and augmented reality.
There will also be an adaptive acoustics system that delivers “crystal-clear audio to every guest”, a haptic system that will convey bass so the audience can “feel” the experience and wireless connectivity that delivers 25 megabits per second for every guest (presumably based on 5G).
What’s more, the 90ft-high sphere will be clad in LED panels to project ultra-high-def footage, perhaps live from the event inside, from over 150 metres away.

The arena

The 4.7-acre site near Stratford’s Westfield shopping centre is currently inaccessible to the public as it is surrounded on all sides by active railway lines. It was last used as a temporary coach park during the 2012 Paralympic Games.
The 21,500 capacity will make it the largest concert arena in the UK.
“London has a population of nearly 8.8 million, which is almost identical to New York City at 8.5 million, but while the London market has two large-scale capacity venues, the New York market has seven,” MSG explained.
Technically, there is no upper limit to content resolution. If money were no object then content of 50K could be produced – NHK, the Japanese broadcaster now expanding into 8K drama production, has admitted as much.
There are also stills cameras capable of recording triple the megapixels of the ‘regular’ 8K 7680×4320 pixel count.
Quite whether there is any point, given that 8K is widely understood to be at the limit of human perception, is another matter.
32K, though, in an experiential setting, especially if offset by some form of giant-scale AR or mixed reality animation, could make for such a wildly different experience that it would pull in punters from passively grazing Netflix at home.
Worth noting that it’s not just music that MSG promotes either. It also has stakes in a couple of sports teams – basketball and ice hockey – and in esports.
What’s more I understand that a Soho post house is already prepping content for display at 32K.
8K is coming. The Consumer Technology Association this week laid out the official specs for 8K TV. This includes at least 7,680 x 4,320 resolution, support for up to 60 frames per second, 10-bit colour and HDR function. Displays will also have to upscale any video to 8K. Official 8K logos will start appearing on sets from January.
A word of caution in all of this, though: according to the latest figures from analyst Screen Digest, barely a fifth of Japanese homes will have a 4K UHD TV by next year and only 62,000 homes in Japan (less than 0.1%) will watch the Olympics in 8K.
“There is zero correlation between content and product,” said Maria Rua Aguete, executive director at IHS Markit, speaking at IBC 2019. In other words, the industry can pump as many 8K screens as it likes into the ether but content producers are not taking the bait.

Friday, 27 September 2019

Politics live: Hansard of the airwaves or reality TV car crash?

IBC
The live broadcast of the UK’s Parliament is proving increasingly popular with British viewers as debates over Brexit and legal challenges intensify. 
https://www.ibc.org/publish/politics-live-hansard-of-the-airwaves-or-reality-tv-car-crash/4982.article
After heated debate in the Commons on Wednesday night, speaker John Bercow said the atmosphere in the Chamber was “worse than any I’ve known” and called it “toxic”.
Such drama is one of the reasons that Freeview channel 232 is suddenly the hottest button on the box.
BBC Parliament’s broadcasts of proceedings from the House of Commons, House of Lords and various committees are usually watched by only a few hundred thousand die-hard political junkies, but Brexit has sparked a ratings spike for the channel and its complementary website Parliamentlive.tv.
A number of video clips from the chamber have also gone viral, recorded by MPs in contravention of Parliament’s own rules.
A record audience of 2.6 million viewers tuned in during the week of 2-6 September, peaking on Tuesday 3 September when 727,000 viewers (contributing to 1.5 million for the day) watched as more than a dozen Tories defied their leader by voting in favour of seizing control of the Commons timetable.
That broke the record set earlier this year, when the week beginning January 14 drew 2.2 million viewers.
It is the element of unpredictability, the political point-scoring and the ticking clock of deadlines which has made recent broadcasts as appointment-to-view as a major league football match or UFC bout. Speaker John Bercow plays the part of on-field referee and VAR.
Politicians had refused cameras entry into Parliament for years, afraid of what the public would learn.
“My concern is for the good reputation of this House,” Prime Minister Margaret Thatcher told the Commons during a 1988 debate on the topic. In response, Labour MPs chanted “frit, frit, frit” (“frightened”).
Those fears have come to pass. BBC Parliament is either democracy laid bare or a reality TV car crash.
Peter Knowles, the BBC Parliament controller has a more nuanced view. “People like the idea that this quirky TV channel is showing this drama and that it’s something they have got in common,” he recently told BBC Radio 4's Media Show. “People may hold extreme opposing views but love to find common ground in sharing their experience of watching it.”
Staying neutral
While news programmes and channels have long dipped into the feed from Westminster, adding commentators, opinion and analysis, the Parliament channel has to tread a studiously neutral line.
Aside from Westminster it also covers proceedings from the Welsh and Northern Ireland Assemblies, the Scottish Parliament and the European parliament, and offers a continuous largely unmediated feed.
“There is constant mediation on screen with captions telling you who’s speaking and what they are talking about,” Knowles says. “At special moments (such as Black Rod’s ceremonial proroguing of Parliament) or in gaps in proceedings we do try and guide people.
“One or two of the audience maybe think they know it all - they’ve got Erskine May [treatise on parliamentary practice] costing a few hundred pounds sitting on their knee at home - but most are jolly grateful for guidance.”
He stresses: “We are absolutely squeaky clean as far as commentary [is concerned]. We have to be impartial through and through.”
Knowles, a former managing editor of BBC TV News, has been with BBC Parliament since 2001.
The Commons sets strict rules for what can and cannot be shown.
“There used to be a lot more [restrictions]. Mercifully that list has reduced over the years. At the start of parliament broadcasts (in November 1989) what you got was a close-up of somebody speaking (with occasional wide shot) but you couldn’t see anything going on around them. There were no reaction shots (of MPs listening).”
Parliament has since relaxed its stance on that and more recent changes have helped give audiences a far greater sense of being there, he says.
“Eye level cameras next to the speaker’s chair have transformed things so you can see when the tellers approach you or see the PM on his feet opposite the leader of the opposition. It’s much, much closer to a feeling of being there.”
Nonetheless, when the session officially ends so too must the channel’s live coverage. “It goes to a colour picture of a clock,” is how Knowles describes the end of session place holder.
That’s tricky, because often the action continues on the green seats and spills out into the lobby, where his cameras can’t go.
For example, immediately after parliament was prorogued the SNP filmed a group of Plaid Cymru MPs singing Calon Lan and posted it on social media.
“It wasn’t part of the feed,” says Knowles. “It does raise the interesting and difficult question which is why do the MPs – who have set the rules – go off and break them?”
Parliamentary rules
“The immediate change [we want] is that if an MP or the PM themselves brings in a guest - which may be another country’s PM - into the viewing gallery and the MPs are clapping and pointing and talking about them, the audience cannot see it. It’s not filmed and that is a real frustration.
“The bigger question is whether we can get some glimpse of the voting lobbies. At the moment the only chance we get is if MPs tweet a picture out. But we can’t see them.”
Protesters in the public gallery are not allowed to be shown either.
“That’s common practice across all parliaments so as not to give much attention to protestors. That said the Scottish parliament might give you one shot, briefly. So there’s a balance they are striking between openness and being seen to encourage troubled behaviour in the house.”
Photos from the voting lobbies are banned – but Speaker Bercow has not rapped any knuckles.
The ability for MPs to reach their constituents or wider audiences online, even by clipping Parliament channel coverage, is a significant development.
Video of Sikh Labour MP Tanmanjeet Singh Dhesi demanding that Boris Johnson apologise for controversial comments he made about the appearance of Muslim women has been viewed over 2.5 million times on Twitter.
MPs are well practised at creating a sound bite targeting TV bulletins, but the more media-savvy MPs (or their teams) are now versed in making montages to share online. The PM or leader of the opposition will have a ready-made question or quote to publish instantly on their Facebook page or Twitter feed to target their electorate.
Even before the cameras were allowed into Chamber, then Prime Minister Thatcher was taking advice on where to look, how to stand, to convey her arguments not to the sitting MPs but to the televisual audience.
MPs’ behaviour continues to be modified and amplified by social media. It’s unlikely we will catch Jacob Rees-Mogg or anyone else reclining in quite the decadent way he was snapped doing.
The Prime Minister has even begun arranging curated Q&As with the public on Facebook Live – literally putting into practice the narrative of a government speaking direct to voters over the heads of parliament.
One side effect of the heated debates has been a growing fanbase on the gaming platform Twitch. The site was streaming the UKparliament channel to 200,000 viewers on the Thursday afternoon I checked and has attracted almost 27,000 followers.
One reason given is the real-time chat interaction which the Twitch platform offers participants alongside the live feed. Viewers can pepper their contributions with emotes (Twitch’s term for emojis) of Johnson, Corbyn and the Speaker.
Broadcasting and streaming
Although Parliamentary proceedings have been broadcast since 1989, the BBC only took over the channel and its production in 1998. Bow Tie Television’s involvement with the broadcasts began in 1989 and it’s been the principal contractor since 2001.
It supplies multi-camera coverage of Chambers and Committees, operating over 170 camera channels concurrently to deliver over 7,000 hours of manned unscripted content each year. Now part of the NEP Group, Bow Tie’s staff operate mic switching and sound mixing across 50 room set ups using over 1250 microphones every day, providing the sound reinforcement in meeting rooms for all capture and relay. The teams produce the broadcast output, both directing and producing content and adding metadata markers to facilitate search and discovery.
On the Parliamentary website there are sometimes up to 20 live streams of Select Committees available (operated by Bow Tie).
One direction the channel could take was outlined last year by John Grogan, the Labour MP for Keighley and vice-chair of the All-Party Parliamentary BBC Group.
“Is it possible to imagine that, using the latest digital technology, the BBC could do for Parliamentary coverage what they have done for the Olympics or Wimbledon? As Tony Hall himself told the Select Committee: ‘Could we work with the parliamentary website to allow people to search more easily by topic, to have notifications when things are being brought up in the House?’”
BBC Parliament’s annual content budget is £1.6m and its transmission costs are reportedly around £7m, a cost that could be slashed if the channel went totally OTT, although given the massive public service remit this is not likely soon.

Thursday, 26 September 2019

Google claims era of quantum supremacy

RedShark News
Google claims to have built the first quantum system capable of a calculation that cannot be done by any normal computer. This means that it’s passed some sort of threshold called ‘quantum supremacy’ and while we don’t need to worry about transcending to the digitalverse just yet, it’s a milestone that’s worth reporting.
Simply put, quantum computing might actually be able to do stuff that even supercomputers with a brain the size of a planet just cannot.
We’re not going to get desktop quantum PCs soon, but the theory is out of the lab. Schrödinger’s cat is out of the bag.
“As a result of these developments, quantum computing is transitioning from a research topic to a technology that unlocks new computational capabilities,” state the mathematicians behind the breakthrough. “We are only one creative algorithm away from valuable near-term applications.”
Blimey.
Google, Microsoft, Intel, IBM and others including Alibaba are in a race to build a reliable quantum system that can vastly outperform the bricks and mortar of silicon-chip based processors.
Quantum computers work at the atomic scale, using the power of atoms and molecules to perform memory and processing tasks.
Unlike digital computing’s requirement that data be binary, qubits (quantum bits) can be in multiple states at the same time. In theory, circuits can be programmed into a state known as a superposition, where they are equal to neither 1 nor 0, but some combination of the two. This fluidity means an unfeasibly large number of calculations can be done on unfeasibly large numbers extremely quickly to supercharge developments in artificial intelligence.
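That exponential blow-up is the whole point: an n-qubit register is described by 2^n complex amplitudes, so the state space doubles with every qubit added. A toy sketch of the idea (a classical simulation for illustration only, nothing like real quantum hardware):

```python
import math

# Toy model: an n-qubit register is a vector of 2**n complex amplitudes;
# the probability of measuring each basis state is |amplitude|**2.
def uniform_superposition(n_qubits):
    dim = 2 ** n_qubits
    amp = 1 / math.sqrt(dim)  # equal weight across every basis state
    return [amp] * dim

def measurement_probabilities(state):
    return [abs(a) ** 2 for a in state]

state = uniform_superposition(3)            # 3 qubits -> 8 amplitudes
probs = measurement_probabilities(state)
print(len(state), round(sum(probs), 6))     # 8 1.0
# At Sycamore's scale, 54 qubits would mean 2**54 (~1.8e16) amplitudes,
# which is why classical simulation runs out of road so quickly.
```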
Decades ago, the physicist and computing visionary Richard Feynman laid down two tasks to kickstart the quantum era: one, to engineer a quantum system that can perform a computation in a large enough computational space and with low enough errors to provide a quantum speedup; and two, to formulate a problem that is hard for a classical computer but easy for a quantum computer.
Google reckons it has done both.
Using a quantum processor named Sycamore with 54 programmable superconducting qubits, the Google AI Quantum team measured repeat experiments of a maths problem (something to do with proving the randomness of numbers, which we won’t delve into here).
This took about 200 seconds, trumping the world’s fastest traditional supercomputer, Summit, by about, oh… 10,000 years.
Summit, which is capable of 200 petaflops, is built by IBM.
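Taking those figures at face value, the claimed speedup works out to somewhere north of a billion-fold. Simple arithmetic on the numbers as reported:

```python
# Back-of-envelope speedup from the reported figures.
SYCAMORE_SECONDS = 200          # Sycamore's reported runtime
SUMMIT_YEARS = 10_000           # Summit's estimated runtime for the same task
SECONDS_PER_YEAR = 365.25 * 24 * 3600   # ~3.16e7

speedup = (SUMMIT_YEARS * SECONDS_PER_YEAR) / SYCAMORE_SECONDS
print(f"~{speedup:.2e}x faster")  # ~1.58e+09x faster
```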
What’s more, Google’s researchers used an algorithm called Schrödinger, which simulates the evolution of the full quantum state.
“Quantum processors based on superconducting qubits can now perform computations beyond the reach of the fastest classical supercomputers available today,” they declare. “To our knowledge, this experiment marks the first computation that can only be performed on a quantum processor. Quantum processors have thus reached the regime of quantum supremacy.”
Scientists have been trying to make a quantum computer for years. Part of the issue is that - surprise, surprise - quantum systems are very sensitive to any kind of disturbance from the environment, such as heat, radiation and magnetic fields. Quantum computer chips must be protected by several levels of shielding and cooled down almost to absolute zero, and that still makes them tricky systems to keep ticking over.
Even Google’s boffins agree: “Realising the full promise of quantum computing still requires technical leaps to engineer fault-tolerant logical qubits.” In fact, the experiment was performed in part at a Nano-fabrication facility in California.
Nonetheless, “In reaching this milestone, we show that quantum speedup is achievable in a real-world system and is not precluded by any hidden physical laws,” the mathematicians state.
With Moore’s Law, that remarkably consistent predictor of computing power, about to hit its physical shelf life, the hope is that quantum compute power will grow even faster.
But while quantum computing has great potential, the field is in its infancy. And it will take many generations of qubit increases for quantum computers to begin solving the world’s challenges. It will likely be a decade before the industry gets to tackling engineering-scale problems, and it will likely require 1 million or more qubits to achieve commercial relevance.
One wrinkle: the Google paper ‘Quantum Supremacy Using a Programmable Superconducting Processor’ was posted on the website of a NASA research centre and subsequently removed, though not before the eagle-eyed guys at Spaceref had downloaded it.
Whether or not the results of this particular test stack up, there’s no doubt that the breakthrough into quantum computing is just around the corner.
For one thing, the similarity between IBM’s own quantum computer design and the time-dimension travelling power console of the Tardis can’t be dismissed out of hand.

Tuesday, 24 September 2019

Behind the Scenes: The Goldfinch

IBC
Complex, dense and 800 pages long, the 2014 Pulitzer Prize-winning bestseller The Goldfinch by Donna Tartt has become this year’s prestige cinematic literary adaptation.
John Crowley, who directed the adaptation of Colm TĂłibĂ­n’s novel Brooklyn, has taken on the task and given himself the best possible shot at pulling it off by teaming with Roger Deakins BSC ASC. It’s the British cinematographer’s first film after winning the Oscar for Blade Runner 2049.
“I was attracted to the story because it’s character driven,” Deakins tells IBC365. “I’ve never been into the action genre or superhero movies and it’s getting increasingly hard to find movies that have something that’s really motivated by people in normal situations.”
The coming-of-age adventure focuses on Theo Decker (played at different ages by Oakes Fegley and Ansel Elgort) and the painting of a chained goldfinch he takes from a museum after a terrorist bomb attack that kills his mother (Nicole Kidman). The painting becomes a symbol of hope as Theo grieves, grows into adulthood, turns to crime, and then tries to right his life.
Tartt’s writing is rich with description about even the most circumstantial detail. The Guardian called her approach ‘cinematic’ observing that the narrator of The Goldfinch “catalogues the world’s visual clutter as greedily as any unblinking movie camera.”
“The danger is that details can overwhelm the main story points when you begin to visualise it,” Deakins says. “In a way, you’ve got to start from scratch. I’ve not done many films that have been adaptations. No Country for Old Men comes to mind, where to me it felt important to have Cormac McCarthy’s image of that world in mind when shooting. Similarly, here, it’s more important for me to get a feeling of the book, something that’s not in the script. It’s nothing concrete, more a feeling of place and character.”
The script by Peter Straughan (who adapted the novels Tinker Tailor Soldier Spy and Wolf Hall) handles the sprawling story by omitting a chunk of time to concentrate on two periods: the hero aged 13, then jumping to his twenties.
“The script changed the book from being a linear narrative into something more fractured. That was what I talked about with John in terms of treatment. It’s one big flashback in a way bookended by the same traumatic event.”
Creating the atmosphere
Some DPs may have opted for a different lighting or colour scheme to depict the stages of the central character’s life. Not Deakins. “I don’t think the past is any different to the present. I don’t like that technique where you use a specific lens or treatment on the image to make it look like a different time. It should be more about capturing a mood.”
Crowley presented him with a whole series of images at the outset of the film to describe the atmosphere he wanted to create.
“These were his feelings of the darkness and the light which we used as talking points really. We didn’t reference other movies, just this collage of different images.”
Location scouting, which Deakins made with the director and production designer KK Barrett, proved invaluable in translating these ideas to the screen.
“I wanted to use natural light where possible but due to shooting 10-12-hour days, including in New York in winter, there was no way we could do that and shoot a full day’s work. It was frustrating but we found work arounds.”
One was to find locations in the ground floors of apartments or houses with access to the street so that Deakins could ring a building with a lighting rig. “Then I had total control.”
Scenes set in a suburb of Las Vegas were shot in Albuquerque, New Mexico “on the edge of the desert where the sunlight was hot, harsh and hostile.”

The only set was the Metropolitan Museum, out of necessity since the building is blown up. Despite this major action scene, Deakins dislikes multi-cam, favouring shooting on an ARRI Alexa XT with a set of Zeiss Master Primes in 3.4K Open Gate and an aspect ratio of 1.85:1.
By contrast, night interiors of scenes in New York were coloured warm while those in Amsterdam were more classically designed “with a whole range of colour based on the time of day and of the city.”
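As a sanity check on those numbers, extracting a 1.85:1 frame from the Open Gate sensor area is simple arithmetic. In the sketch below, the 3424 × 2202 photosite figure is an assumption based on commonly quoted ARRIRAW Open Gate dimensions, so treat it as illustrative rather than the production’s actual framing chart:

```python
# Assumed ARRIRAW 3.4K Open Gate dimensions (commonly quoted; illustrative only).
open_gate_w, open_gate_h = 3424, 2202

# A 1.85:1 extraction at full sensor width crops the height:
target_ar = 1.85
extract_h = round(open_gate_w / target_ar)
print(extract_h)  # lines of the delivered 1.85:1 frame
```

The unused lines above and below the extraction give room to reframe vertically in post, one reason DPs often record the full Open Gate area.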
He persuaded the production to visit Amsterdam on location rather than, as planned, shoot scenes set there in a studio against blue screen.
“It was very important that we conveyed a feeling of time passing and the darkness of Theo’s struggle with himself in this hotel room,” he says. At this point in the film Theo is contemplating suicide. “We have to have the reality of him seeing the world from inside this box.”
Painting a shot
The painting of the film’s title is real. It’s by Carel Fabritius, a pupil of Rembrandt, made in 1654, and hangs in The Hague.
“We went to see the painting. The museum made fantastic reproductions of it, then KK had them painted over to make them even better.”
The painting in itself, Deakins feels, is fairly insignificant. “It’s just a bird,” he says. “What’s important is the idea that the painting is Theo’s physical connection with his mother.”
The director of photography on any film is in a privileged position when it comes to being up close and personal with actors. Deakins has photographed dozens of stars giving arguably their best screen roles including Russell Crowe, Javier Bardem, Frances McDormand, Sean Penn and Tim Robbins, but he still gets a thrill from seeing a performance take shape in front of his eyes.
“I love operating the camera mainly because I am seeing the performances for the first time,” he says. “I am the closest person to the performance, and I know when I’m watching something pretty remarkable.”
In the case of The Goldfinch he singles out Fegley, Sarah Paulson and Luke Wilson, who plays Theo’s emotionally abusive father. “Luke was quite a revelation. I didn’t know him very much from his previous work but just seeing his performance was remarkable.
“What I don’t get to see, until a preview or premiere, is how the performances jig-saw together with each other. For obvious reasons we never film Oakes and Ansel in the same shot. The character development from child to man is, I think, a pretty brilliant translation of a book that many said was unfilmable.”

BT Sport discusses creative benefits of remote production over 5G

SVG Europe
BT Sport bagged another ‘world first’ at IBC last week by successfully demonstrating 5G-enabled multi-location live remote production — but the application throws up as many questions as it does opportunities.
The advantages of remote production – reduction in cost, better work-life balance for employees, reduced carbon footprint and the ability to produce more games – are all good for fans, but not game-changing.
“The real gamechanger is when you combine remote production with producers who understand how to exploit the new technology; how to make the most of the new freedom, including the creative flexibility of wireless cameras,” said Matt Stagg, director of mobile strategy at the pay TV broadcaster.
Jamie Hindhaugh, BT Sport COO, added: “I am excited about the creativity. When you start untethering cameras then you don’t need to book RF points and the perception is that you have 14 cams when you’re only using ten. We are very interested in how 5G’s high bandwidth network can be integrated with the classic OB van.”
“Rights owners should be looking at remote production more closely,” urged Paolo Pescatore, analyst at PP Foresight who chaired a Q&A at the demonstration. “It represents a great way for them to improve the quality of their asset, reach sports fans and keep them engaged in a much more efficient way.”
The showcase followed BT Sport’s and EE’s ‘world first’ two-way remote broadcast over 5G of the EE Wembley Cup between Wembley Stadium and the Excel exhibition centre last November.
This time, live feeds from three stadiums in the UK, where matches in the FA Women’s Super League were playing, were connected over EE’s 5G consumer network to the broadcaster’s production hub at Stratford and routed on to the OB park at the RAI. Andy Beale, BT Sport chief engineer, switched the feeds live.
Reporters at the venues – Stamford Bridge (Chelsea), The Emirates (Arsenal) and the Etihad (Manchester City) – carried 5G HTC dongles and conducted a live four-way broadcast between each other and the RAI.
“5G allows broadcasters to pay for what you need with agile specifications that don’t exist with 4G,” Beale said.
Even 4G cellular bonded links had been a “best effort” from a network perspective, Beale said, especially in congested places like stadia – even with relatively low attendance figures of 6,000, let alone 60,000 fans.
“You couldn’t throw much at it; it was never your priority feed,” said Stagg. “5G will be a rock solid service. It will guarantee performance.”
Network slicing guarantees broadcasters a minimum standard of speed and throughput – in the region of 100 megabits per second – with lower latency. As a result, OBs will become far more efficient.
“5G is superior to satellite for connectivity,” asserted Stagg. “The ability to network slice — to take a part of the network and put a service wrapper around it – will enable a broadcast grade network.”
Provisioning the network for something like breaking news should be as quick and easy as going to a web front end and entering the postcode, specifying an amount of bandwidth and an amount of time. “That’s our aim,” Stagg said.
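A booking of that kind would amount to little more than a small structured request. Below is a hypothetical sketch of the payload such a web front end might submit – the field names and values are all invented for illustration, since no such public API yet exists:

```python
import json

# Hypothetical slice-booking payload. Every field name here is illustrative,
# not a real EE/BT API: it simply mirrors the inputs Stagg describes
# (location, bandwidth, duration).
booking = {
    "postcode": "E20 1EJ",            # venue location
    "bandwidth_mbps": 100,            # guaranteed uplink throughput
    "start": "2019-09-14T11:00:00Z",  # when the slice goes live
    "duration_minutes": 180,          # length of the OB window
}

payload = json.dumps(booking)  # what would be POSTed to the provisioning service
print(payload)
```

The point of the sketch is the shape of the request, not the plumbing: three user-facing inputs standing in for what today takes weeks of circuit booking.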
Gemma Knight, football match director for BT Sport, expounded on the benefits. “For lower tier sports like the National League we might start with the manager at home, then travel on board the team bus, film arrivals and interviews in the changing room. Then you can reposition the cameras for the match itself. It makes it look like you have a lot more facilities.”
She added that she felt trust in the reliability of the feed. “You want to know that when you cut, it is going to be stable. That’s huge for a director. Plus, it means I don’t have to travel long distances or overnight at a venue and I can spend more time with my son. That opens up a world of opportunity for me.”
BT Sport’s pitch-side reporters were able to follow video of all live games on their mobile devices, enabling them to deliver more up-to-date and informed goal updates to viewers.
The broadcaster is developing apps including in-stadia AR, from instant playback of a penalty goal for spectators to a holographic view of the match “with the ability to walk around the field of play and see replays from every conceivable angle.”
It envisages that stadia could be ringed with a vast array of remote operated cameras catching every possible angle and giving a total field of view.
Such exotic consumer applications are being lined up for introduction of the full next generation 5G core network, enhanced device chipset capabilities, and increased availability of 5G-ready spectrum from 2022.
Amid all the excitement there are multiple challenges and wrinkles. For a start, 5G is not likely to be the main connectivity solution for every event for some time, possibly ever.
“We want to bring pictures back in the best quality we can, so if there’s fibre to the venue we will use it. If we want more flexibility with cameras we will mix in 5G, and if we need to use 4G we will bond it.”
Switching between two or more cameras at one location over 5G is another hurdle, as is cutting between 5G enabled and other contributed feeds in synch.
“One discussion with the standards bodies is getting the ability to time-signal to within 1 millisecond, so over a mobile network you will be able to get a PTP clock,” said Ian Wagdin, BBC senior technology transfer manager. “That means I can genlock cameras over a mobile network. That breakthrough will be of massive importance.”
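The genlock ambition rests on the standard IEEE 1588 (PTP) exchange, in which four timestamps from a sync/delay-request round trip yield the slave clock’s offset from the master and the network path delay. A minimal sketch of that calculation, with purely illustrative timestamp values:

```python
def ptp_offset_and_delay(t1, t2, t3, t4):
    """Standard IEEE 1588 offset/delay calculation.

    t1: master sends Sync; t2: slave receives it.
    t3: slave sends Delay_Req; t4: master receives it.
    Assumes a symmetric network path in both directions.
    """
    offset = ((t2 - t1) - (t4 - t3)) / 2.0  # slave clock minus master clock
    delay = ((t2 - t1) + (t4 - t3)) / 2.0   # mean one-way path delay
    return offset, delay

# Illustrative timestamps in milliseconds: a slave clock running 3 ms fast
# across a 5 ms one-way path.
offset, delay = ptp_offset_and_delay(t1=100.0, t2=108.0, t3=200.0, t4=202.0)
print(offset, delay)  # 3.0 5.0
```

The symmetric-path assumption is exactly what makes cellular networks hard here: uplink and downlink latencies differ, which is why the standards work Wagdin describes matters.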
BT Sport and the BBC, in the guise of Stagg and Wagdin, represent the UK’s creative industries on 5G to mobile standards body 3GPP.
“Among other things we are discussing is how do you tell a story when handing off a 5G video feed? We need to have an integrated workflow,” Wagdin explained.
There were calls for bonded-device vendors to start working with 5G chipset manufacturers to bring 5G link gear to market.
There is also the contentious issue of spectrum use for media specific applications like remote production.
“Traditional broadcasters are under an immense amount of pressure,” said Pescatore. “Valuable spectrum has been, and will continue to be, taken away for reuse in rolling out mobile networks.
“Freeing up more spectrum for super fast 5G networks will mean better TV experiences for everybody, as well as helping fulfil obligations for rural coverage and meeting mobile demand. In essence, it solves a lot of problems.”
Asked whether 5G would be the de facto standard for remote production by 2030, BT M&B lead propositions manager Alison Hutchins said, “No”; Matt Stagg said it would depend on the sport and the tier of sport; and Wagdin said, “Yes. There will be no other way.”