Thursday, 23 February 2017

Basketmouth plays Wembley with multi-camera production values

VMI

The sell-out show Basketmouth Live @ The SSE Arena, Wembley was not only a high-octane gig for the Nigerian comedian but a night of great success for live video producer Johnsons Creative.
http://vmi.tv/case-studies/article/129

Produced by Whytelion, Cokobar.com and Barons World, Basketmouth Live has become a fixture on Valentine’s Day, proving a massive hit among the Afro-urban community in Britain.

Previously held at the IndigO2 and Hammersmith Apollo, the show’s move to the 10,000-capacity SSE Arena also triggered a change of live video producer.
When Whytelion approached Johnsons Creative’s Uchenna Johnson, who had handled production management at previous comedy and concert events, she didn’t let them down.
“I’ve done live directing and mixing before, as well as handling camera crews, but this event was a step up in every department,” she says. “I really wanted to push myself and do something of high quality for Whytelion. That’s when I called VMI.” Familiar with the hire company from previous experience, Johnson sought a solution that would pack considerable bang for buck.
“I went along to the VMI New Products Day and saw the new Libec Swift Jib 50 being demoed. It seemed perfect but I didn’t think we could afford it, particularly with the robotic head. Nor did I think we could get it in time since this was very last minute, so I was delighted when VMI upgraded us.”
The Libec Swift Jib 50 is a telescopic jib arm which, paired with the Remo 30 robotic head, reaches a maximum height of 2.8m/9ft. VMI complemented it with three Sony PMW-200 XDCAMs and a PMW-300 with a longer zoom, and Johnson returned to VMI for a personal demonstration of the Sony Anycast mixer.
“I’ve used mixers before but not this model so the demo was really handy for me to familiarise myself with the control surface and its functions. It gave me the confidence to work with the mixer on the day.”
Johnson coordinated the whole production. One camera operator worked behind the scenes to record artist interviews and another worked front of house to film vox pops of audience guests giving their reasons for attending and their reactions afterwards.
She directed the live mix, shared on screens around the venue, using three cameras including the jib-mounted camera on the stage.
“I made sure we got to the Arena several hours before the show so I could walk the venue with the camera operators and do pre-runs,” she reports. “We did some kiss-cams which went down well. Everything went very smoothly.”
So much so that Whytelion have already indicated that they will work with Johnson on the production of next year’s Valentine’s Day show. “They said they were impressed with what we were able to achieve. I was under a lot of pressure on the day and I couldn’t have done this without VMI’s support.”

Wednesday, 22 February 2017

Sony claims world’s fastest flash memory

RedShark News

As camera makers introduce models capable of higher burst rates and 4K video capture, memory card technology must move at a similar pace to take full advantage of these features. Step forward Sony’s new 300MB/s entrant.

http://www.redsharknews.com/production/item/4378-sony-claims-world’s-fastest-flash-memory

Sony claims to have done just that with the SF-G, a new SD card series in the UHS-II class that leapfrogs all comers with write speeds of up to 299MB/s and read speeds of up to 300MB/s for file transfer.
By comparison, SanDisk’s Extreme Pro UHS-II card – holder of the throne since early 2014 – is rated by its manufacturer at 250MB/s write and 280MB/s read. Rivals include the Lexar Professional 2000x UHS-II U3 and Toshiba Exceria Pro UHS-II U3.
All marketing claims need substantiating against actual performance, a caveat Sony itself makes in stating that “transfer speeds vary and are dependent on host devices, operating system and usage conditions.”

However, we’ll take it as read that Sony’s new card is indeed the best performer of its class for your high-performance DSLR or mirrorless camera. Write speeds of this order are essential for continuous shooting of high-resolution stills, such as RAW and JPEG, as well as sustained 4K video capture.
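As a rough back-of-envelope check of what write speeds of this order buy – the 25MB RAW frame size and 600Mbps 4K bitrate below are illustrative assumptions, not Sony’s figures – a few lines of Python make the point:

    # Back-of-envelope: what does a 299MB/s sustained write speed buy?
    # Assumed figures (not Sony's): a ~25MB RAW still, a 600Mbps 4K stream.
    CARD_WRITE_MB_S = 299        # Sony's claimed write speed
    RAW_FRAME_MB = 25            # assumed RAW frame size
    VIDEO_MB_S = 600 / 8         # assumed 4K bitrate, converted to MB/s

    print(f"Sustained RAW burst: ~{CARD_WRITE_MB_S / RAW_FRAME_MB:.0f} frames/s")
    print(f"4K video headroom: {CARD_WRITE_MB_S / VIDEO_MB_S:.1f}x the assumed bitrate")

On those assumptions the card sustains roughly a dozen RAW frames per second and carries around four times the headroom a high-bitrate 4K stream needs.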

Sony says it wrote an algorithm to prevent loss of writing speed even after repeated burst shooting, and that this contributes to the camera’s burst shooting performance.
The cards will be available in 32GB, 64GB and 128GB capacities from next month and can be used with a new memory card reader Sony is introducing in April. The MRW-S1 has a SuperSpeed USB (USB 3.1 Gen 1) standard-A connector for cable-free PC connection, so files can be copied faster than through a PC’s built-in SD slot.
It’s handy to know there’s free downloadable file rescue software for recovering deleted images and videos, including RAW images and 4K XAVC S video files.
The UHS (Ultra High Speed) II bus has a maximum transfer rate of 312MB/s – so we’re pretty close to needing an even higher-performing standard from the SD Association.
Cameras supporting UHS-II include Fujifilm’s X-T1, Panasonic’s Lumix GH4 and GH5, and the Samsung NX1.

Tuesday, 21 February 2017

Pretzel Films mixes a zany cocktail with Alexa Mini and Superspeeds

VMI

An online commercial for drinks brand Island Oasis achieves a zingy yet retro flavour thanks to a smart creative brief which mixes high-end digital capture and super-slow-motion laced with old-style optics.
http://vmi.tv/case-studies/article/128

The script, from creative agency Vayner Media, features a Spanish barman intoxicated with the idea of mixing cocktails with the natural fruit-based ingredient. Executed by Pretzel Films, the series of spots is shot on a set which has the feel of a typical Spanish bar.

“In accordance with the brief, we needed the decor to feel a bit tacky and a bit dated with cues from the eighties, and also to make the whole design really bright with poppy neon colours,” says Pretzel Films’ Production Manager, Leonie Marzecki.

The obvious choice of camera might have been a RED or a full-size ARRI Alexa, but DoP Charlie Goodger, working with director Gille Klabin, elected to shoot with the Alexa Mini and older Zeiss Superspeed lenses.

“I mainly shoot music videos and smaller commercials without a huge crew so probably my camera of choice is the Alexa Mini since it’s lighter and more convenient for me to handle,” says Goodger. “Other considerations included its ability to shoot 200fps and because I knew it could capture great images with unusual lighting conditions.”

Lighting was arguably the biggest challenge on set, since Goodger had to balance lighting the actor against the jet-black Island Oasis drinks machine beside him, with both set against a busy bar background.

“The aim was to retain saturation in the image whilst letting the background fall off into darkness. We put a backlight on the actor while keeping the Island Oasis machine neutral in terms of saturation and making sure we could clearly see the ice inside it.”

Goodger selected a set of Zeiss Superspeed MkII PL primes including 18mm, 25mm, 35mm, 50mm and 85mm. “The treatment was meant to feel retro so we wanted it not too crisp and sharp, with an element of something nostalgic to it without going too far. The older lenses, which can catch the odd flare, worked well for that.”

Other kit for the production included a 17" Sony PVM-1741 OLED and a 9" TV Logic LVM-095W1 monitor, an ARRI FF5 follow focus kit and MB20 matte box for the Mini, and a set of Tiffen Soft FX filters.

The higher frame rate was deployed for more surreal images of ice cubes and strawberries flying around.

“We shot for a day as scheduled and nothing went wrong,” says Marzecki. “Everything from VMI went to order.”

The spot, ‘Slow it down’, was produced by Pretzel’s Mike Facey for online distribution, along with a cut-down version.

Monday, 20 February 2017

How cable companies should be using big data

Kinect365 for Cable Congress

Cable service providers should be embedding best-practice big data processes to improve performance.
https://knect365.com/cable-congress/article/5732ecc9-e52a-4ac9-bb89-be9ab1f0461d/how-cable-companies-should-be-using-big-data?_ga=1.62156526.310922387.1485974885

With the Netflixes of the world extensively using data science to drive engagement and revenues by improving the relevance of their content catalogues, as well as the performance of the user experience, it is clear that cable companies need to make use of similar technologies and tools to stay competitive.
Arguably, pay-TV providers are only now realising what players in other sectors, such as banking and retail, have learned the hard way – that in a highly saturated market consumers expect operators to know them and be able to cater to their circumstances, preferences and needs in real time.
A recent Paywizard survey found that 84% of customers said they would cancel a pay-TV service as a result of poor customer experience – and 24% have done so in the past year. What’s more, 46% said they retained a subscription they might otherwise have cancelled due to positive customer experience.
“Providers that do not address subscriber demands for more personalised, relevant and timely experiences by tapping into data-fuelled customer insight will see high churn rates and defections to more CRM-savvy services,” says Paywizard chief marketing officer Bhavesh Vaghela.
“Operators offering on-demand video services, for instance, need to be able to identify when a subscriber is nearly finished the series they’ve been watching religiously and offer some spot-on recommendations for new content – and maybe a deal for some content they previously did not have access to – to keep the customer on board.”
Focus group research run by Paywizard last year showed that subscribers have a very negative view of telcos and operators primarily offering broadband, “which spilled over to their perceptions of these companies as pay-TV providers,” suggests Vaghela. In contrast, the feedback on OTT services was that “they are much more on-the-ball and far more responsive when it comes to customer experience”.
In Germany, for instance, Amazon Prime Instant Video has emerged as the market leader with 46% of consumers surveyed taking the service compared to 41% subscribing to runner-up Sky Deutschland. Similarly, in the UK, OTT players such as Netflix, Amazon Prime and, more recently, Now TV are hugely outpacing more traditional multi-service operators in terms of subscriber growth.
“Cable and satellite providers need to raise their game when it comes to interaction with subscribers,” says Vaghela. “OTT providers are using all the tools at their disposal to gain insight into their customers and to make sure each interaction – from subscription uptake to service disruption queries to content recommendations – produces a positive customer experience. Broadband and cable providers need to emulate this data-driven, subscriber-focused approach if they hope to stem churn and firm up customer loyalty.”
Cable companies have used spreadsheets and business intelligence (BI) products for years, looking at ways to improve operational efficiency, reduce churn and optimise revenues. Through recent investments in DOCSIS and server virtualisation, MSOs have been evolving their technology stacks to become more IP-centric when it comes to delivering subscriber services.
The game-changing factor, according to Nagra senior product marketing manager Simon Trudelle, is the possibility to leverage a broad set of real-time data sources, including non-structured data from multiple systems and applications, and combine these with libraries of smart data analytics algorithms.
“This allows data scientists and managers to quickly identify key data correlations, tune algorithms on the spot, and turn these data insights into repeatable actions that deliver tangible business results over time,” he says.
Service providers use such techniques to improve STB and multiscreen app usage, understand viewer behaviour and interest in specific TV programmes, drive up OTT VOD store consumption, or analyse the root causes of churn by customer segment, among other use cases.
Going a step further, industry pioneers like Altice Group’s Cablevision in the US or Sky in Europe have used data techniques, coupled with a smart ad engine, to deliver more targeted TV advertising and increase ad revenues.
“It is clear that a fully connected, two-way TV platform is the foundation to a data-driven business strategy, so there is an imperative to make sure that all TV screens, from STBs to player apps on consumer devices, run applications that collect data,” says Trudelle.
Yet leveraging big data and analytics goes beyond traditional dashboarding and data mining. It needs to use a new set of tools and technologies that can rapidly aggregate multiple real-time data sources of any format, while allowing business users to quickly identify correlations, and more importantly drive repeatable actions that fix business issues.
“Finding out that 20% of a VOD catalogue generates 80% of the consumption for a given user age group, while 30% of the assets are never watched, is helpful insight and can help reduce content acquisition costs,” says Trudelle. “On top of that, making sure that movies that are getting significant attention on social networks are then systematically promoted within the VOD service matters too, contributing to the overall business performance of the service. It’s this kind of link between data insight and day-to-day operations that is key to driving business excellence for cable broadband providers in the long run.”
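A minimal sketch of that kind of catalogue analysis, with synthetic view counts standing in for real consumption logs (the skewed distribution and the 20% cut-off are illustrative assumptions):

    # Synthetic catalogue: how much consumption does the top 20% of assets
    # drive, and what share of assets is never watched at all?
    import random

    random.seed(1)
    views = sorted((int(random.paretovariate(1.2)) - 1 for _ in range(1000)),
                   reverse=True)

    total = sum(views)
    top_share = sum(views[:len(views) // 5]) / total
    never_watched = sum(1 for v in views if v == 0) / len(views)

    print(f"Top 20% of assets drive {top_share:.0%} of consumption")
    print(f"{never_watched:.0%} of assets were never watched")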
Nagra’s user experience and content protection TV platform products are now ‘cloud-ready’ to deliver real-time data streams that, along with other data sources, can feed its Insight data analytics platform to empower service providers.
One secret to mining accurate data-driven customer insight, and reacting to it successfully in real time, is being able to access all the relevant information on customer behaviour patterns.
This requires operators to have solutions that don’t just capture information – such as sign-up forms, viewing data, billing records, marketing interactions and even complaints – but also store that information in what Paywizard dubs a single customer view (SCV) database.
The lack of a unified database can be a particular problem with cable and broadband players, many of which hold phone, mobile, broadband and TV customer data in different places. A telco provider that has introduced a television offering has to be sure to bring this TV subscriber data together with that of the original parts of the business.
“This SCV data can be analysed, segmented and used to more effectively communicate, service and engage customers,” says Vaghela. “The result is a better overall customer experience and a stronger brand for the operator.”
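As an illustrative sketch only – the field names and churn rule below are hypothetical, not Paywizard’s schema – unifying per-service records into a single customer view might look like this:

    # Records held separately for TV, broadband and billing are keyed on
    # one customer ID and merged into a single customer view (SCV).
    import pandas as pd

    tv = pd.DataFrame({"customer_id": [1, 2], "hours_viewed": [40, 5]})
    broadband = pd.DataFrame({"customer_id": [1, 2], "avg_mbps": [60, 18]})
    billing = pd.DataFrame({"customer_id": [1, 2], "late_payments": [0, 3]})

    scv = tv.merge(broadband, on="customer_id").merge(billing, on="customer_id")

    # With everything in one place, simple segmentation becomes possible,
    # e.g. flagging low engagement combined with payment trouble.
    scv["churn_risk"] = (scv["hours_viewed"] < 10) & (scv["late_payments"] > 1)
    print(scv)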
There may, though, be a cultural inertia in cable broadband companies inhibiting best data practices. “The most successful are not afraid of what data tells them and not afraid to act,” contends ThinkAnalytics CTO and founder Peter Docherty. “It’s not about criticising the way a job is done before but thinking of a better way to reach the result.”
There are also technical constraints. “It’s not enough to have the right information – you’ve also got to be able to do something with it, and that requires flexibility in the systems you’ve deployed,” says Docherty. “With some middleware or back-end systems it is not easy for operators to effect change. It might involve changing something in the app and then getting users to download the new app, which might take weeks just to get certified and pushed to an app store.
“Being competitive and improving engagement is not just about another data lake or another graph telling you what you need to know,” he insists. “You need to be able to close that loop in minutes, not months.”

Friday, 17 February 2017

Michael Cioni - The Technative

British Cinematographer

It takes a special kind of bloody mindedness and vision to succeed when all about you are predicting failure, but then few people possess the passion and relentless determination of Michael Cioni.

The president and co-founder of digital post-production pioneer Light Iron is about to make another splash when Panavision’s digital cinematography camera, the Millennium DXL, launches commercially this winter.
As significant as this product could be, it is likely far from the pinnacle of a career forged by two watershed moments.
Growing up in Chicago, Cioni was intrigued by his father’s work as an animator. He recalls how, after school, he watched his father create animated cel drawings for commercials featuring the Ninja Turtles or Care Bears. With his mother he used to visit the local Eastman Kodak plant to collect newly developed rolls of film and deliver them to his father's company, Cioni Artworks.
“Sometimes he would let me expose one frame of a film for a commercial on his two-story animation stand,” says Cioni. “As a kid I got introduced to technology and creativity working hand-in-hand. That’s a powerful combination. The process of cel animation – which is akin to stop-motion – is hyper-technical, but drawing cartoon characters and giving them personality is hyper-creative. I began then to recognise the benefits of merging the two, and later saw a changing ecosystem in which the technical and creative worlds drew even closer together. Defining this is critical for the next generation, and is why I created a word to describe those of us who understand today’s balance between technology and creativity: ‘technatives’.”

He adds, “I believe the next generation of greats in the motion picture industry are going to be in equal parts, technically-minded and creatively-minded. In the past, these disciplines have traditionally been separated, but this will not be the case in a heavily automated, highly-complex and competitive future. The most important advice I can give future filmmakers is to approach the creative process with a technologically open mind.”

Cioni is proof of a new generation for which a grounding in art and science is essential. Armed with a belief in the power of the ‘technative’, Cioni embarked on a degree in Mass Communications and Media Arts at Southern Illinois University, and soon ran up against his second formative experience.
“I was very fortunate to be at college studying film at a time when digital cinema started to emerge,” he says. “My friends and I were hungry to get our hands on new digital technology, and did everything we could to learn about it and manipulate it to our advantage.”
And this they did, using student loans and money gifted from their families to purchase a complete digital filmmaking kit including an Apple G3, Final Cut Pro, a Sony PD150 DVCam camcorder and FireWire infrastructure to tie it all together.
The university, however, was unimpressed. “We were producing short films out of our apartment of better quality than anything made on the school's expensive equipment. Our editing systems were better, our cameras were better and we authored our first of many advanced digital workflows. It was new, organic and grassroots. Just a bunch of film students discovering an entirely new way to create stories, with professors who couldn't comprehend what was happening. I remember being told on more than one occasion that if I used Final Cut Pro and my PD150 to do a class project I would be failed. I, of course, did it anyway.”
As college juniors, Cioni and friend Ian Vertovec won a Student Emmy and Kodak’s Emerging Filmmaker program in 2000 using their DIY digital kit. When they were invited to the Cannes Film Festival, they had to choose whether to go or miss all-important end-of-year exams.
“Some of our film school teachers were so troubled with our renegade success they didn’t consider the Cannes Film Festival an excuse to postpone the exam, and so when we went, we failed most of our junior year classes,” says Cioni.
Undeterred, the duo and their team won the same awards the following year, with the same outcome: a trip to Cannes met with failed final exams. Soon after, they quit college without having graduated.
“As a twenty-year-old, I learned a great lesson: that people generally prefer things to change slowly or not at all,” reflects Cioni. “To those dependent on the status quo, rapid change is like the friction on an active fault line. Seeing that Luddite mentality at an early age prepared me to push for innovation even when others resist it. I learned that my technative instincts can have a significant impact on telling stories, and that there were other filmmakers interested in that too."


So successful was the tyro filmmaker that the news entertainment programme he directed with Vertovec and friends while at SIU, ‘alt.news 26:46’, won five Emmys in its first two seasons (the show still airs on PBS and has gone on to win 13 more Emmys).
Naively perhaps, in 2001 Cioni thought taking his passion for digital cinema from a small town in America's midwest to the Hollywood elite would be easy. “We figured we'd go to Hollywood and get jobs with everyone else who is excited about this new technology! But to our surprise, Hollywood's leaders were a lot like the faculty at SIU; they didn’t like the idea of digital cinema either. Instead of being welcomed with lots of opportunity, we had to work on our own to find like-minded people who recognised that a digital tipping point was approaching.”
They first partnered with director Christopher Coppola and together formed PlasterCITY Digital Post in 2003. It opened with an office on Hollywood and Vine and grew over six years to be one of the most significant digital desktop-based post houses in the world.
“When we started there were only a few filmmakers who embraced an end-to-end digital workflow,” he remarks. “George Lucas set the stage with Star Wars Episode II using a Panavised Sony F900. Then filmmakers like David Fincher, Michael Mann and Robert Rodriguez started building their own pipelines for digital experimentation. For our business, it was indie filmmakers who loved digital because they were willing to try anything to save money and still have a professional-looking product. But we knew the time when mainstream cinema would start adopting digital was just around the corner, and in 2006 RED Digital Cinema became the catalyst we needed.”
“When RED started, there were many who publicly argued their camera had no professional merit,” he says. “But it wasn’t necessarily the actual camera they had issues with, it was that the product represented a change in the market’s pre-established power and control. When powerful people are met with disruptive innovations, they tend to resist significant changes that have the potential to negatively affect their interests. The bad news for entrepreneurs is that there will always be credible critics in a changing market (which makes it hard to know who to trust if you're on the fence). But the good news is that everyone gets exposed to these new ideas. Those who once said they’d never shoot digital are now preaching to others to convert. It's because of the companies who willingly took a lot of the risk that shooting digital is no longer taboo.”

In 2009 Cioni and Vertovec, along with Cioni’s brother Peter, struck out on their own and set up a fresh file-based 4K facility. “The concept was to marry the desktop efficiencies of the indie world with the industrial needs of massive studio productions. Merging the light with the heavy iron if you will.
“Even though the financial climate made it a terrible time to start a new business, we were seeing more traction toward file-based filmmaking, so we offered a house built on support for tapeless workflows like RED and ARRI Alexa.”
The facility quickly grew, prompting an expansion into New York with the acquisition of OffHollywood’s Manhattan assets, and taking the facilities’ expertise on location with a series of international mobile laboratories called Outpost.
“Suddenly we were servicing a dozen major tentpole releases because the incumbent facilities were simply not as well-equipped to handle 3D or high-resolution data in the field. We were doing head-to-head tests with major competitors, whose army of engineers would still be downloading while our footage was already on the cinematographer’s iPad. We thought to ourselves, ‘we were born and bred to do this stuff.’”
Over the years, Cioni has amassed credits on over 200 digital shows including Gone Girl, Pirates of the Caribbean IV, The Walk and The Amazing Spider-Man.
When Panavision came looking to diversify its business, Cioni saw a prime opportunity to expand. The acquisition in December 2014 brought Light Iron’s archiving workflow, creative services and colour science into the Panavision fold. It also sparked an idea for a new camera system built for both production and post-production.
The Millennium DXL, in terms of resolution, pixel pitch, lensing, ergonomics, workflow, accessories and electronics, is designed to be the most advanced digital cinema camera ever made. It’s a super-computer with modular accessories and superior RED electronics. But that’s not the only reason this camera is so special.
“It’s the most advanced camera ever because it is not built for today's needs; rather we are concerned about what the needs of tomorrow are going to look like,” says Cioni. “It’s also not just designed for the cinematographer, but also for the camera assistant, VFX supervisors, editors, and even for downstream delivery to OTT platforms like Netflix, Amazon, Hulu and HBO. When you get up close and personal with the DXL, you'll see how many people were considered when building this product.
“DXL’s 8K Weapon sensor provides incredible resolution and dynamic range. In addition, Panavision's custom toolset and Light Iron’s colour science and end-to-end workflow, including a wider colour gamut, contribute to the expectation of high rental demand.
“Some people like high resolution for the ability to re-zoom and reframe, or better tracking. Some like HDR to make an image look more three-dimensional. And other people’s preference is for wider colour to help the picture seem more realistic. But the truth is, it’s the combination of the three that make a picture truly fantastic.
“Very few people have seen all three of these powerful elements put together in a single image because the cameras, displays and projection systems aren’t totally available yet. These technologies are still emerging. But the DXL already allows you to shoot content ready for that market change. And our future DXL roadmap is even more exciting. We're building a camera system that has the ability to make pictures as unique as your own fingerprints. That’s where the DXL is going and when you look at the big picture, the DXL is something no one has ever done before.”
"Thinking back, my interest in innovation has only increased since I was a teenager. I love to focus on possibility, regardless of the size of the challenge before me. The fuel for innovation is different for everyone. For me, it's a deep desire to keep an open mind and always being willing to share what I've found with the world.”

Thursday, 16 February 2017

The realities of making VR and AR streaming a reality

Knect365 for TV Connect

360 video is a stepping stone to the merger of the physical and digital worlds, the virtual and the real. All it takes is a lot of processing power and bandwidth.

https://knect365.com/media-networks/article/98d3564c-2e6b-43f7-8728-dda3038f818a/the-realities-of-making-vr-and-ar-streaming-a-reality

From a technology and business standpoint, live 360-degree video streaming remains one of the acute challenges on the path to a fully immersive, premium TV experience on head-mounted displays (HMD) and flat screens. This is nothing new. Delivering high-resolution video (whether 4K today, 8K tomorrow, or 12K in future) on a mass scale has been an industry struggle for years.
HMDs are a long way from penetrating the mass market, production equipment and workflows are yet to be standardised, editorial grammar is a work in progress and budgets for content emanate largely from marketing.
“2017 will be the year where immersive VR will disappear as 3D did a few years ago, or it will be adopted by fans,” forecasts Carlo DeMarchis, chief product and marketing officer at sports media services company Deltatre.
Unleashing 360-degree video hinges on the ability to monetise it for the widest possible audience. “Given the exorbitant cost involved with producing high-res content, this cannot be practically achieved by implementing legacy adaptive bitrate technologies,” says Alain Nochimowski, EVP of Innovation at Viaccess-Orca.
As an example, standard DASH-based streaming of 4K-captured 360-degree video content not only results in much lower resolutions within a user’s field of view, it may require up to 20Mbps bandwidth. “Who would want to watch a 90-minute soccer game in VR 360 if the ball can barely be seen?” poses Nochimowski.
The technical challenges associated with VR video are not trivial. Most people agree that frame rates have to be higher, colour needs to be 10bit (HDR), compression artefacts need reducing and resolution has to be greater.
The lowest usable quality VR video is streamed at 1440p/30fps, needing at least a 10Mbps ‘up’ from the event and ideally 6Mbps download to the consumer’s device. For live streams with multiple camera rigs and points of view, simply multiply the 'up' requirement by the number of rigs. This can quickly turn into a very high bandwidth requirement.
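A trivial sketch of that multiplication, using the article’s minimum figures:

    # Each additional camera rig adds its own contribution stream, so the
    # uplink requirement scales linearly with rig count.
    UP_PER_RIG_MBPS = 10   # minimum 'up' for one 1440p/30fps rig

    for rigs in (1, 4, 8):
        print(f"{rigs} rig(s): {rigs * UP_PER_RIG_MBPS} Mbps uplink needed")

An eight-rig production is already contending for 80Mbps of guaranteed uplink from the venue before any redundancy is factored in.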
“Data and bandwidth constraints are substantial,” admits DeMarchis. “However, we can expect this to be solved in the medium term as compression techniques and network topologies advance.”
Several sports rights-holders and broadcasters have partnered with VR production specialists to trial and commercialise live 360 events. Deltatre is using Nokia’s OZO VR system for clients including UEFA; Fox Sports is hosting live VR streams of US Open golf, NASCAR and more with NextVR. Sky has focused mostly on VOD sports content, such as coverage of boxer Anthony Joshua’s bouts, but has a stake in VR developer Jaunt.
Some technical partners offer an end-to-end VR platform, which critics might call proprietary and which therefore risks lock-in. Others prefer to offer best-of-breed tools. In either case, the most crucial aspect is the encoding solution that optimises bandwidth.
Viaccess-Orca’s Virtual Arena uses tiling technology to increase bandwidth efficiency while “significantly improving streamed video resolution in the user’s field of view,” Nochimowski says. “What’s more, it paves the way to video monetisation through advanced content protection and video/advertising analytics.”
VR specialist Focal Point VR tested live 360 coverage of Champions Tennis from the Royal Albert Hall in December. It streamed 6K VR video from Blackmagic Design cameras using its own technology that packs the stream into a standard 4K video format.
“Packing allows us to have native effective 6K resolution in the main areas of interest, such as the field of play, while areas of less interest – such as the view behind the VR viewer’s head – receive fewer pixels,” explains head of production Paul James.
This requires an up connection of 18Mbps with the viewer needing better than 10Mbps down (and ideally close to 18Mbps) to receive the full resolution. “Below that we transcode the stream so viewers can still enjoy the VR stream down to around 2Mbps,” explains FPVR CEO Jonathan Newth.
By the end of this year FPVR aims to support up to 16K 360 streaming (“close to retina resolution”) with down speeds of less than 20Mbps. “Without aggressive optimisation this would require greater than 300Mbps, which is generally not available and certainly not commercially viable,” says Newth.
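The arithmetic behind such packing and tiling schemes is easy to sketch: a headset shows only a window of the full sphere at any moment, so a uniform stream wastes most of its pixels. A crude approximation – it ignores projection distortion, and the 90-degree field of view is an assumption, not FPVR’s figure – runs as follows:

    # Fraction of a 360x180-degree sphere visible in an assumed 90x90-degree
    # field of view - the rest can be carried at reduced resolution.
    FOV_H, FOV_V = 90, 90
    SPHERE_H, SPHERE_V = 360, 180

    visible = (FOV_H / SPHERE_H) * (FOV_V / SPHERE_V)
    print(f"In view at any moment: {visible:.0%} of the sphere")
    print(f"Candidate saving from packing/tiling: up to {1 - visible:.0%}")

With barely an eighth of the sphere in view at once, there is clearly a great deal of bandwidth to be reclaimed from the areas behind and above the viewer’s head.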

Production issues

Rapidly evolving technology, although essential to 360-degree production, can make editorial and planning very difficult.
Most cameras are either not designed for VR (such as any GoPro rig), or are prototypes (and therefore unreliable), or are very high-end and out of the financial reach of most filmmakers.
“Fundamentally, the tools necessary to produce a live virtual reality video are provided by different companies meaning there is no common workflow,” says Newth.
Stitching software, for example, is essential to VR production: standalone software, plugins and hacks are available from many different providers, all competing for market share. These tools and techniques need to work in real-time to be useful for live streamed VR.
Producers also remain unsure about the importance of 3D (stereo) VR video. “Many have avoided it because of its high production overhead, others because it is hard to do well,” suggests Newth, although FPVR believes stereo 360 is essential to the quality of the experience.

Who is in control?

The jury’s out on whether audiences prefer live VR as a more familiar directed (editorialised) experience or want to control the gaze themselves. Sky and Deltatre are separately exploring a solution in which user-selectable directed experiences, 360 video and traditional feeds are available within the same app.
Others think live VR appeals to millennials keen for greater control over the experience. FPVR enables real-time, gaze-controlled camera selection and viewer-selectable resolutions/bitrates to allow for bandwidth variation.
“You lose some advantages of editing and action replay and the ability to zoom in and out of a picture,” says Newth. “But what you get in return is a feeling of authenticity and presence.”
Other storytelling difficulties include avoiding motion sickness for viewers watching fast action; Sky’s tests flagged up putting VR cameras on F1 cars as they corner at 150mph.
“Each event has its own set of specificities that need to be reflected in the production work,” says Nochimowski. “In other words, the production rules for basketball differ greatly from soccer.”
VR offers great intimacy, but rig positions at football or NFL venues suffer from being a long way from the action. Basketball can be much more effective because of the proximity to the court. Indeed, the NBA has made weekly live paid VR streams available via the association’s League Pass program, partnered with NextVR.
“Beware of placing a rig courtside and another behind the net because of the jump in a viewer’s eyeline when viewing angle is switched,” warns DeMarchis. He advises insertion of a blank frame when swapping between camera angles of different heights “to tell the brain to expect a change”.
An issue directly related to getting return feeds from the camera back out to the cloud is that the majority of camera positions are cabled, and therefore limited in movement. No robust wireless live solution for VR has yet been invented.
The weight of the current crop of HMDs is also problematic and a contributing factor in the preference for videos of two to five minutes. As designs develop as anticipated and VR gear becomes as light and user-friendly as a mobile phone, users are more likely to want to spend longer in the experience.

Toward mixed reality

The ability to add contextual overlays (stats, advertising) on top of 360 video is just a glimpse at what an AR-enabled experience could look like.
There are two distinct flavours of AR: mobiles/tablets and the more immersive experience of headsets such as Microsoft HoloLens. The latter have a very limited installed base and substantially lower resolutions than their VR counterparts, making them unsuitable for video-based applications as it stands, according to Newth. The mobile/tablet form of AR, typified by Pokemon GO, doesn't naturally support live VR video, which is best as an immersive experience.
Closely related to AR is Mixed Reality (MR) which merges real and virtual worlds to produce new environments and visualizations. Ultimately, physical and digital objects will interact in MR in real time.
“Imagine a VR live stream of the Hollywood premiere of Avatar 2,” says Newth. “The remote viewer has the chance to get close to the stars with the perfect red carpet view, but we could also add CGI characters from the movie into the scene, sharing the space with the flesh-and-blood stars.”
Integrating high-quality CGI with video in real time was pioneered as a production aid by mega-budget feature films like Avatar. Similar technology has now made its way into live TV.
The X Factor producer Fremantle Media and Norway’s The Future Group (TFG) have developed entertainment format Lost in Time which combines live action filmed on a green screen with audience participation and real-time graphics.
By genlocking live studio footage with virtual images rendered in Epic’s Unreal games engine, viewers at home can participate live in the game alongside contestants, using mobile and VR apps.
Most of the 60 minutes of each show being aired in Norway from March is pre-recorded, but through the companion app viewers can interact with live elements incorporated into the broadcast.
“Nothing has failed yet but we decided to remove one element of risk, which is live production,” says Bård Anders Kasin, co-founder of TFG. “However, this is possible and will likely happen from season two.”
VR/AR/MR is rapidly moving toward a world where the boundary between the digital and the physical is eroding. The next step is to add human-like senses and artificial intelligence to VR headsets to unlock even more immersive applications.
Later this year, Intel will debut consumer technology that does just that. Project Alloy is a wearable computer that features an Intel seventh generation Core processor, a fisheye lens, two RealSense cameras and other body worn sensors.
The system will be linked to a VR rig developed by Voke, a live VR specialist Intel acquired in November. Voke’s rigs are typically loaded with 12 (or more) cameras that capture 360 video and, crucially, information about scene depth. Intel says the system processes 40 to 50GB/s of data from an event to the viewer in real time.
But Intel is going further. To create an emotional connection with merged reality experiences, Alloy headsets use RealSense cameras to record a viewer’s spatial and contextual awareness (basically so you can move without colliding into real-world stuff). Achieving this at high fidelity requires a superfast data capture, which Intel pins at more than 50 million 3D points per second.
At this year’s Super Bowl, Intel and Fox Labs produced panoramic POV replays from an array of 38 5K cameras ringing Houston's NRG Stadium. With the entire field, including the players, digitised, the viewer could theoretically watch from any position, from any point of view, and with an enhanced ability to interact. It is exciting to imagine viewers with Alloy headsets, perhaps as soon as next year, being able to appreciate the point of view of a player in the Super Bowl.
“This is a game changer for the entire category of virtual and augmented reality,” believes Achin Bhowmik, Intel’s VP, Perceptual Computing Group. “You choose the experience, and you get to navigate real-world content in new ways.”
Bhowmik goes on to point out that it took billions of years of evolution to develop sophisticated human perception comprising 3D vision, binaural hearing, smell and taste connected to a powerful brain with incredible processing capabilities.
“It took only a decade for digital devices to sense like humans, due to the rapid pace of perceptual computing innovation,” he says. “The ability to learn and adapt to what devices sense is right around the corner.”

Realtime Matrix-like effects from Robic the robot

RedShark News

Robic is a new robotic arm that moves faster than anything the industry has yet seen, the result being that productions can capture things in-camera that were previously very much a post-only experience.

http://www.redsharknews.com/production/item/4356-realtime-matrix-like-effects-from-robic-the-robot

Befitting the world’s highest speed robot camera, Robic rather flew under the radar on debut at IBC last year. Now its distributor, Triptent, wants to take the tech to the wider market in tandem with big name rental houses.

The robotic arm system is claimed by its New York-based developer, Robic Team, to move up to twice as fast as any robot arm in the industry.

It features a high-torque follow focus capable of racking from zero to infinity in just 0.25 seconds, and claims precise focus during pre-programmed moves travelling at up to 170kph while shooting at 2,500 frames per second. The system is compatible with pro cameras including the Phantom HD and Flex 4K, the RED Epic and the ARRI Alexa Mini.

This type of technology has been around since The Matrix's bullet-time sequence and is now available off the shelf for commercials and specialist shots, potentially saving a lot of money in post because the effect can be achieved live.

“We filmed a fire eater and, by the time the fire had exited her throat two seconds later, we had shot her head-on and moved the camera around her through 360 degrees,” says Triptent founder Joe Masi. “Those types of angles and moves have never been seen before with the human eye, let alone a regular camera, and would take months to render in post.”

With a Phantom camera it can run at 2,500fps at 1920 x 1080 or 1,050fps at 4K. In both cases the Phantom captures direct from the sensor to its internal memory in bursts of 4 to 20 seconds at maximum speeds. Data is recorded to a Phantom magazine solid-state drive or to an external computer.

One 4K file can be as heavy as 126GB, and around 4TB can be recorded in a day, plus back-up copies.
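Those figures tally with a quick uncompressed data-rate estimate (the 12-bit raw depth and 4096 x 2160 raster below are assumptions, not quoted specs):

    # Rough sanity check on the Phantom numbers quoted above.
    WIDTH, HEIGHT = 4096, 2160   # assumed 4K raster
    BITS_PER_PIXEL = 12          # assumed raw bit depth
    FPS = 1050                   # quoted 4K frame rate

    gb_per_second = WIDTH * HEIGHT * BITS_PER_PIXEL / 8 * FPS / 1e9
    print(f"Uncompressed 4K data rate: ~{gb_per_second:.0f} GB/s")
    print(f"Time to fill a 126GB take: ~{126 / gb_per_second:.0f} s")

At roughly 14GB/s, a 126GB take fills in about nine seconds, comfortably inside the 4-to-20-second burst window quoted above.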
For inspection, the camera can output an HD-SDI feed to play the footage back on a conventional monitor.

The hardware comprises a set of four motion-control arms developed by Stäubli, custom-engineered with software, called Nando 7, to service high-speed cinematography workflows.

Two of the robotic arms, together with the custom Robic software, are designed to control cameras, and two are designed to control objects. The camera control arms can be used alone, or programmed to follow the movements of an object mounted on another Robic arm for synced capture. The camera arms can be controlled and programmed via a Sony PS4 controller, “dramatically reducing prep time,” says Masi. “What makes us different is that all other robotic camera systems run on the same motion control software, which is kind of limiting,” he says. “We created our own program from scratch.”

The software controls the robotic arm, follow focus, air pistons, motors, solenoids, valves and relays. It also reads IR sensors to trigger recording of the action in a scene.

“The IR light (or laser, depending on distance) emits a continuous signal,” explains software engineer Fernando Kocking. “When an object breaks the beam to the IR sensor, the interruption ‘tells’ the computer to run the scene. The trigger system can be attached to many types of sensors – even something as simple as a switch.”
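In the spirit of that description, a toy sketch of such a trigger loop (the sensor read is a stand-in; a real system would poll a GPIO pin or react to a hardware interrupt):

    # Poll a continuous IR beam; the moment it is broken, run the scene.
    import itertools
    import time

    # Stand-in sensor: beam 'present' for the first few polls, then broken.
    _reads = itertools.chain([True] * 5, itertools.repeat(False))

    def beam_present() -> bool:
        return next(_reads)

    def run_scene():
        print("Beam broken - running programmed camera move")

    def watch_trigger(poll_interval_s: float = 0.001):
        while beam_present():
            time.sleep(poll_interval_s)
        run_scene()

    watch_trigger()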

An initial set-up will typically take 45 minutes and, depending on the complexity of the shot, 30 minutes or less between setups for programming and adjustments.

Masi says he’s in talks with larger TV and film rental companies, and also with industries outside film, such as aerospace and automotive, for filming crash tests.