AI and ML tools are already having an impact on post-production, largely in the service of more efficient workflows. This article takes a closer look at the applications, which include logging and discovery, color grading and high frame rate smoothing.
Broadly speaking, we’re talking about machine learning rather than artificial intelligence, though many tools get marketed as AI.
As reported by British tech journalist Jonny Elwyn, writing at the MediaSilo blog, ML is a branch of AI and far more constrained in its scope and ability:
Given a large
enough dataset of images (e.g. people smiling), a computer program would be
able to iteratively learn what a smile looks like, based on patterns of pixels,
and so eventually, predictively and accurately find and tag photos of people
smiling.
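To make Elwyn’s description concrete, here is a toy sketch of that iterative learning loop in Python: fit a classifier on labelled example images, then “tag” a new photo with a prediction. The folder layout and file names are hypothetical, and this is of course nothing like how the commercial tools are actually built.

```python
# A toy illustration of learning "smile" from labelled pixels.
# Dataset layout is hypothetical: dataset/smiling/*.jpg, dataset/not_smiling/*.jpg
from pathlib import Path

import numpy as np
from PIL import Image
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

def load_images(folder: Path, label: int, size=(64, 64)):
    """Load every image in a folder as a flat pixel vector plus a label."""
    data = []
    for path in folder.glob("*.jpg"):
        pixels = np.asarray(Image.open(path).convert("L").resize(size))
        data.append((pixels.ravel() / 255.0, label))
    return data

samples = load_images(Path("dataset/smiling"), 1) + load_images(
    Path("dataset/not_smiling"), 0
)
X = np.stack([s[0] for s in samples])
y = np.array([s[1] for s in samples])

# Fit on one split of the data, measure on the held-out remainder.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(f"held-out accuracy: {model.score(X_test, y_test):.2f}")

# "Tagging" a new photo is just a prediction over the same pixel features.
new_photo = np.asarray(
    Image.open("new_photo.jpg").convert("L").resize((64, 64))
).ravel() / 255.0
print("smiling" if model.predict([new_photo])[0] else "not smiling")
```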
This same approach
allows the ‘Neural Engine’ in DaVinci Resolve Studio to power such features as
“facial recognition, object detection, smart reframing, speed warp retiming,
super scale upscaling, auto color, color matching and more,” according to Blackmagic
Design.
Similarly, Adobe’s Sensei powers features across the Creative Cloud suite of applications, including Content-Aware Fill in After Effects and Auto Ducking, Morph Cut, Color Match and Auto Reframe in Premiere Pro.
[Video: Adobe Sensei in action]
Elwyn highlights
the painstaking technique of rotoscoping as ripe for an intelligently automated
overhaul.
Runway is a browser-based video editing tool which offers AI-powered rotoscoping as well as other useful tools, such as automatic removal of backgrounds, removal of objects from a shot with smart ‘in-painting’ and automatic Beat Detection, described as ‘Pre-edit to the beat.’
In Elwyn’s estimation, Runway is pushing the boundaries of what is possible in an online video editor that you can use for free, and its tools “are definitely ones to watch.”
Other AI/ML-driven tools can generate synthetic voices. The text-to-speech engine of Editingtools.io can make creating and re-creating temp voiceovers quick and easy.
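As an illustration of how simple scripted temp voiceover can be, the sketch below batch-renders one audio file per line of a script using the offline pyttsx3 library. This is a stand-in, not Editingtools.io’s engine, and the script file name is hypothetical.

```python
import pyttsx3

engine = pyttsx3.init()
engine.setProperty("rate", 160)  # words per minute; a comfortable VO pace

# Render one temp VO clip per non-empty line of the script (file name hypothetical).
with open("vo_script.txt", encoding="utf-8") as f:
    for number, line in enumerate(f, start=1):
        text = line.strip()
        if text:
            engine.save_to_file(text, f"temp_vo_{number:03d}.wav")

engine.runAndWait()  # pyttsx3 renders all queued files at this point
```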
Color Grading in
the Cloud
Making every shot in a final sequence share the same look and feel, regardless of how varied their starting points are, is one of the reasons professional colorists exist. It takes real skill and a practiced eye to make everything look consistent, Elwyn writes, though colorists might take issue with his summation that this “also tends to be the less creative part of the process compared to the stylistic look creation.”
That’s because the colorist will often work with a DP to set the look in prep, and is often credited with saving many scenes in post that perhaps didn’t benefit from the correct lighting conditions on set.
That said, AI is making inroads into what is often a time-consuming part of the post process, where a DP will want to supervise the grade but typically lacks the time, having moved on to another project.
Elwyn picks
out fylm.ai, which is an in-browser grading application equipped with the
ability to replicate the look of any reference image and apply it to footage.
It can also match the sensor characteristics of any two cameras and recreate
LUTs based on matching a before and after image.
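For a sense of what reference-based look matching involves under the hood, here is a minimal sketch of the classic Reinhard mean-and-variance transfer in Lab color space, a well-known baseline rather than fylm.ai’s actual algorithm. File names are hypothetical.

```python
import cv2
import numpy as np

def match_look(source_path: str, reference_path: str, out_path: str) -> None:
    """Shift a frame's color statistics toward those of a reference image."""
    src = cv2.imread(source_path).astype(np.float32) / 255.0
    ref = cv2.imread(reference_path).astype(np.float32) / 255.0

    # Work in Lab, where luminance and chroma are roughly decoupled.
    src_lab = cv2.cvtColor(src, cv2.COLOR_BGR2Lab)
    ref_lab = cv2.cvtColor(ref, cv2.COLOR_BGR2Lab)

    # Scale and shift each channel so its mean/std match the reference.
    src_mean, src_std = src_lab.mean(axis=(0, 1)), src_lab.std(axis=(0, 1))
    ref_mean, ref_std = ref_lab.mean(axis=(0, 1)), ref_lab.std(axis=(0, 1))
    matched = (src_lab - src_mean) / (src_std + 1e-6) * ref_std + ref_mean

    out = cv2.cvtColor(matched, cv2.COLOR_Lab2BGR)
    cv2.imwrite(out_path, np.clip(out * 255.0, 0, 255).astype(np.uint8))

# Hypothetical file names: grade a frame toward a reference still.
match_look("shot_frame.png", "reference_still.png", "graded_frame.png")
```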
Another tool
is Colourlab Ai, co-developed by colorist Dado Valentic, the founder of
post house Mytherapy. The software is capable of automatically matching the
grade of an entire sequence of shots to a reference shot or image to a quality
described in early reviews as equivalent to the first pass by a human
colorist.
Asset Management
One of the traditional tasks for an assistant editor is to do much of the “grunt work” of ingesting, organizing, categorizing, labeling and generally preparing the footage for the creative editorial phase.
This usually takes a long time and involves a lot of repetitive actions, both of which can be eliminated thanks to machine learning techniques for recognizing objects and adding the associated metadata.
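As a rough sketch of what that automated logging can look like, the snippet below samples about one frame per second from a clip, runs a pretrained object detector over each sample, and writes the hits out as timestamped, searchable metadata. The detector, clip name and confidence threshold are illustrative choices, not any vendor’s actual pipeline.

```python
import json

import cv2
from PIL import Image
from transformers import pipeline

# A general-purpose pretrained detector (an illustrative choice).
detector = pipeline("object-detection", model="facebook/detr-resnet-50")

video = cv2.VideoCapture("dailies_clip.mov")  # hypothetical source clip
fps = video.get(cv2.CAP_PROP_FPS) or 24.0
step = max(1, int(fps))  # sample roughly one frame per second
log, frame_index = [], 0

while True:
    ok, frame = video.read()
    if not ok:
        break
    if frame_index % step == 0:
        image = Image.fromarray(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        for hit in detector(image):
            if hit["score"] > 0.8:
                log.append({"time_s": round(frame_index / fps, 2),
                            "tag": hit["label"]})
    frame_index += 1

video.release()
with open("dailies_clip_tags.json", "w") as f:
    json.dump(log, f, indent=2)  # metadata an assistant could search, not scrub
```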
Nova.ai has a huge range of AI-driven post-production tools that can do everything from visual searches of your footage to automatic transcription and subtitle translation into 30 other languages.
For example, you
can search a video to find and tag a range of facial emotions, objects and
activities as well as celebrity faces. You can also upload an image of a person
and then get Nova to search your footage for that person.
“This could make
traditionally time consuming editorial tasks much faster such as finding all of
the close ups of a particular actor for a trailer or all of the goals in a
football match for a highlights reel,” says Elwyn. “The days of trawling
through archives for shots of particular people or places would be a thing of
the past.”
Motion Grading
You should also take a look at the motion grading technology TrueCut Motion from Pixelworks. The software allows filmmakers to dial in judder and motion blur, at any source frame rate, shot by shot in post-production. It then ensures that these creative choices are delivered consistently across every screen, whether in the theatre or the home.
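TrueCut’s processing is proprietary, but one ingredient of any such motion grade is synthesizing heavier motion blur at a lower effective frame rate. The toy sketch below averages adjacent pairs of 48fps frames into a 24fps output, approximating the blur of a wide shutter; file names are hypothetical and this is nothing like the real product’s sophistication.

```python
import cv2
import numpy as np

src = cv2.VideoCapture("clip_48fps.mp4")  # hypothetical HFR source
width = int(src.get(cv2.CAP_PROP_FRAME_WIDTH))
height = int(src.get(cv2.CAP_PROP_FRAME_HEIGHT))
out = cv2.VideoWriter("clip_24fps_blurred.mp4",
                      cv2.VideoWriter_fourcc(*"mp4v"), 24.0, (width, height))

while True:
    ok_a, frame_a = src.read()
    ok_b, frame_b = src.read()
    if not (ok_a and ok_b):
        break
    # Blend each pair of 48 fps frames into one 24 fps frame,
    # approximating the motion blur of a wide (360-degree) shutter.
    blended = (frame_a.astype(np.float32) + frame_b.astype(np.float32)) / 2.0
    out.write(blended.astype(np.uint8))

src.release()
out.release()
```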
James Cameron is using the technique to re-release Avatar and Titanic in 48fps and 4K HDR. High frame rate titles like The Hobbit: An Unexpected Journey were not universally well received, but TrueCut Motion applies the conversion in a way that maintains the cinematic look and feel of the director’s intent.
As an example of how to work with HFR capture, at CinemaCon earlier this year Pixelworks demonstrated a clip from The Hobbit in its original 48fps release, contrasted with a TrueCut motion grade, still at 48fps but now with a cinematic 24fps look.
“Motion grading is
a creative choice,” explains Miguel Casillas, senior director of ecosystem
marketing at Pixelworks. “The process is led by the director and/or DP and can
be as granular as they like. We create a series of motion grades for review and
the filmmaker selects which they prefer shot by shot. After a while, the
process becomes intuitive.”