When Gary Oldman walks away with the best actor Oscar next March, spare a thought for fellow thesp Andy Serkis. Oldman probably deserves the gong for his powerful incarnation of Churchill in Darkest Hour, as well as for his back catalogue of crazies (Sid & Nancy, Léon, True Romance), but his former Planet of the Apes co-star can point to an arguably more impressive recent CV, albeit behind the mask of CGI.
Source: https://www.redsharknews.com/post/item/4973-should-cgi-performance-capture-be-recognised-as-an-acting-art
In 2014, when Oldman was paired with Serkis’s Caesar for Dawn of the Planet of the Apes, studio 20th Century Fox mounted a campaign to get the latter Oscar-nominated. It failed, not because of any lack of admiration for the actor’s art but because of confusion as to whether or not Serkis alone deserved to be honoured for the animation that overlaid his original performance.
Performance capture is still a relatively young creative process, despite rapid advancement since Robert Zemeckis’s efforts in 2004 to animate Tom Hanks for The Polar Express. Many in the industry remain dubious about the blurred line between performance and technology, and about how much the artists and animators, usually from New Zealand’s VFX shop Weta, contribute to the digital character. The matter has not been helped by Serkis’s own shorthand for what he does: “digital make-up”.
The well-received finale to the trilogy, War for the Planet of the Apes, should change things, at least as far as recognition by the various critics’ circles is concerned, and by BAFTA, which tends to be less conservative in these matters than bodies like the Golden Globes or the Academy. As with ‘Dawn’, much of the performance capture in ‘War’ was shot outside the studio on Canadian locations, helping to make the look of the picture more naturalistic and, essentially, helping the audience suspend their disbelief at watching a talking ape. Ironically perhaps, the animation is so good that Serkis’s performance, and consequently the emotion of the character, comes through more than ever before.
“What Andy is doing is acting and performance capture is recording it,” director Matt Reeves told IndieWire. “In this story, we pushed Caesar to a place where you’re able to empathise with his desire for revenge and then question how you’ve been provoked and implicated. And what these effects represent is a high water mark. It takes tremendous artistry on both sides (actor and animator).”
Speaking to the Independent, Serkis claimed performance capture is “no different” from any other kind of acting.
“An actor in a performance-capture role receives a script, works on psychology, emotions and motivation, and goes on set to be shot in exactly the same way as any other character,” he said. “That performance is used to cut the movie and it’s that performance that creates emotion, pace and drama. The visual effects render the character, just like putting on makeup, except here it happens after the fact.”
No discrimination
Serkis emphasised that he doesn’t want to deny the “brilliance” of the visual effects team. “But the awarding bodies should not discriminate about this being different,” he says. “If they don’t think Caesar is good, that’s fine, but it’s a different issue.”

The performance capture pipeline has much in common with the workflow of an everyday film set. What matters most, though, is still the motion capture system and its ability to capture the highest quality data possible.
Serkis is more likely to receive some kind of special achievement award from the Academy than a best actor nod any time soon, but performance-capture roles stand a better chance of winning acting awards in future as the technique becomes more familiar.
JJ Abrams’ Star Wars reboot and the forthcoming The Last Jedi feature Serkis’ work as Supreme Leader Snoke. Pretty much every recent superhero film from Marvel and DC features it. Spielberg is a convert, using it extensively (and dubiously) on The Adventures of Tintin (with Serkis as Captain Haddock), with Mark Rylance on The BFG and for 2018 release Ready Player One. Serkis is in production on a version of Animal Farm at his Ealing-based performance-capture studio, The Imaginarium. James Cameron can be counted on to push the possibilities further still with the Avatar sequels, which began shooting last month and for which the director has reportedly experimented with performance capture underwater.
Such ‘cyber-thespianism’ or ‘post-human acting’ goes hand in hand with production techniques for visualising the animated characters and CG backgrounds in realtime on set, processes Cameron pioneered while making Avatar. The virtual production process was showcased by the stunning CGI work on last year’s VFX Oscar winner, The Jungle Book.
According to The Foundry, maker of the VFX tool Nuke, which was used to composite The Jungle Book, the most immediate future scenario is “that we manage to create CG characters so realistic we can’t tell which performances are given by a real-life human and which by their digital replica.”
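For readers unfamiliar with the term, ‘compositing’ is the layering of rendered elements over filmed plates using per-pixel transparency. As a rough illustration only (this is not Nuke’s API, and the function name is hypothetical), the core merge operation in any compositor is the Porter–Duff ‘over’ operator, sketched here in Python with NumPy:

```python
import numpy as np

def over(fg_rgb, fg_alpha, bg_rgb, bg_alpha):
    """Porter-Duff 'over': layer a premultiplied foreground onto a background.

    fg_rgb, bg_rgb: float arrays of shape (H, W, 3), colours premultiplied by alpha.
    fg_alpha, bg_alpha: float arrays of shape (H, W, 1), values in [0, 1].
    """
    out_rgb = fg_rgb + bg_rgb * (1.0 - fg_alpha)
    out_alpha = fg_alpha + bg_alpha * (1.0 - fg_alpha)
    return out_rgb, out_alpha

# Example: a 2x2 CG element at 50% opacity laid over an opaque background plate.
fg = np.full((2, 2, 3), 0.4)   # premultiplied foreground colour
fa = np.full((2, 2, 1), 0.5)   # foreground alpha
bg = np.full((2, 2, 3), 0.8)   # background plate colour
ba = np.ones((2, 2, 1))        # background is fully opaque
rgb, alpha = over(fg, fa, bg, ba)
print(rgb[0, 0], alpha[0, 0])
```

Production compositing layers colour management, deep data and hundreds of further operations on top of this, but a blend of this kind is the arithmetic at the heart of putting a CG character into a live-action plate.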
Although, as Serkis himself has observed, what would be the point of trying to replicate humans with human performance capture?
To which one might add: no point at all. Nonetheless, what if one day we could do without human actors altogether by melding visual effects with AI? Might Hollywood create its own believable, fully digital actors?
As The Foundry points out, in large part, this will come down to how well emotion can be realistically digitised. Currently, we’re not quite there yet. Every CGI character you see in a film or a game that gives a truly realistic emotional performance does so because there was a real actor who gave that performance.
And even then, we’re often led into the ‘Uncanny Valley’ — the place where human replicas which appear almost (but not exactly) like real human beings elicit feelings of eeriness and revulsion. Cracking this has been a perennial challenge for CGI artists.
Encouragingly, reckons The Foundry, the last three years have seen huge progress in this field. The reception — particularly by younger viewers — of a digital Peter Cushing in Rogue One: A Star Wars Story indicates VFX technology has reached a stage where we can create a human likeness to a compelling degree of accuracy.
“Truly conquering the Uncanny Valley will mean mastering human emotion to the point where we can create fully digital actors who can give convincing pathos-laden performances, indistinguishable from the real thing,” states The Foundry.
It believes a key to this will be rendering. The VFX industry has made huge strides in rendering surfaces and lighting, which is why digitally created humans are looking more and more realistic: “As rendering improves in the future, it will become even easier to make things ‘look right’, which in turn will make it even more difficult for us to distinguish between digitally created human faces and the real thing.”
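To give a sense of what ‘rendering surfaces and lighting’ involves at its very simplest, consider classic Lambertian shading, which lights a surface point in proportion to the cosine of the angle between its normal and the direction to the light. This is a toy model, nowhere near the physically based path tracing used on modern VFX work, and the function name below is purely illustrative:

```python
import numpy as np

def lambert_shade(normal, light_dir, albedo, light_colour):
    """Classic Lambertian (diffuse) shading for a single surface point.

    normal: surface normal; light_dir: direction from the surface towards the light.
    albedo, light_colour: RGB triples with components in [0, 1].
    """
    n = np.asarray(normal, dtype=float)
    l = np.asarray(light_dir, dtype=float)
    n /= np.linalg.norm(n)
    l /= np.linalg.norm(l)
    cos_theta = max(float(np.dot(n, l)), 0.0)   # surfaces facing away from the light stay dark
    colour = np.asarray(albedo, dtype=float) * np.asarray(light_colour, dtype=float) * cos_theta
    return np.clip(colour, 0.0, 1.0)

# A grey surface lit by a warm light from above and slightly to one side.
print(lambert_shade([0, 1, 0], [0.3, 1.0, 0.2], [0.5, 0.5, 0.5], [1.0, 0.95, 0.9]))
```

Everything that makes a digital face read as real, such as subsurface scattering in skin or the way light filters through fur, is built from vastly more sophisticated answers to the same question: how much light leaves this surface towards the camera?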