Avatar: The Way of Water – how gaming technology helped bring the Oscar nominee to life and could change filmmaking forever | Science and technology news
From the Indiana Jones-esque adventures of Lara Croft to the increasingly Pixar-quality animated visuals of Super Mario, video games have long looked to Hollywood for inspiration.
But the last few years have shown that the relationship is becoming increasingly transactional.
While these days you don’t have to search far for a movie or series based on a popular video game (The Last of Us and Sonic the Hedgehog are just two, with Mario himself soon in cinemas), the relationship goes much deeper than you might think.
“These worlds have been converging for a decade,” said Allan Poore, senior vice president at Unity, a video game development platform that’s increasingly turning to movies.
“And for the most part, the basic principles are actually the same.”
In fact, modern video games look so good that the technology behind them is literally changing the way blockbusters are made – including the biggest of them all.
Avatar: The Way of Water was easily the highest-grossing film of 2022 – fitting given that it is the sequel to the highest-grossing film ever made.
James Cameron’s latest blockbuster is nominated for Best Picture at Sunday’s Oscar ceremony – and success in technical categories like visual effects seems all but certain.
The technology behind Avatar
Many of the tools used to bring The Way Of Water to life came from Unity’s Weta Digital division.
Unity bought the tech assets of Weta Digital, the New Zealand-based visual effects company founded by Lord of the Rings director Peter Jackson, for about $1.6 billion in 2021 (he still owns a now-separate company called WetaFX, a more traditional visual effects house that – somewhat confusingly – also worked on Avatar).
But Unity’s deal brought a team of talented engineers used to working on films under the umbrella of a company best known for its accessible video game engine. Think of a game engine like a cooking recipe – it contains everything you need to create a game. Some are designed to help create specific types of games – like a shooter or a sports title – while others are more broad-based.
Unity has been used for everything from indie titles to entries in the Call of Duty and Pokemon franchises.
Jackson said the amalgamation of expertise, known as Weta Digital, would be “game changing” for developers.
What defines video games is that the rendering of the worlds players explore is done in real time. That’s because a game can play out differently depending on what the player does – it’s not fixed like a film or TV show. Just think of the scene in The Wrong Trousers where Gromit lays the train track as he rides along and you’ll get the idea.
This is very different from how films have traditionally approached visual effects, where the rendering is all happening during post-production — which is why you see behind-the-scenes footage of actors standing in large green spaces or talking with tennis balls at the ends of sticks. All the computer magic was done after the fact.
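The distinction between the two approaches can be sketched in a few lines of code. This is a minimal illustration only – the function names, frame format and numbers are invented, not drawn from any real engine – but it captures why a game must render each frame on the spot while a film can render everything after the fact.

```python
def render_frame(world_state):
    """Stand-in for the expensive work a renderer does per frame."""
    return f"frame showing player at {world_state['player_pos']}"

# Offline (film-style) rendering: the sequence is fixed in advance,
# so every frame can be computed in post-production, however long it takes.
def render_offline(keyframes):
    return [render_frame(state) for state in keyframes]

# Real-time (game-style) rendering: the next frame depends on input
# that does not exist yet, so each frame must be produced immediately.
def game_loop(get_input, frames=3):
    world_state = {"player_pos": 0}
    shown = []
    for _ in range(frames):
        world_state["player_pos"] += get_input()  # the player acts *now*
        shown.append(render_frame(world_state))
    return shown

fixed = render_offline([{"player_pos": p} for p in (0, 1, 2)])
live = game_loop(lambda: 1)
print(fixed[-1])  # frame showing player at 2
print(live[-1])   # frame showing player at 3
```

Virtual production borrows the second loop for the first job: the set is rendered live, game-style, even though the final film will still be finished frame by frame in post.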
“How do you speed up filmmaking?”
And while The Way Of Water still relied heavily on those techniques, parts of the production were fueled by new real-time techniques that allowed Cameron and his cast and crew to paint a picture of the finished product while they worked on set.
“How do you speed up film production? You do that by showing artists and directors a representation of what that frame will look like as soon as possible,” says Poore, who worked on the hit animated films Ratatouille, Incredibles 2 and Coco during his time at Pixar.
“Directors will use a screen that actually shows real-time components so they can see what the scene and environment will be like as they film.
“Hopefully they help make film production smoother, easier and faster.”
With Avatar 3 less than two years away – not another 13-year gap like the one between the first two films – that assessment could well prove correct.
A galaxy far, far away…
Unity’s competitors have also attempted to take advantage of real-time photorealistic visualization to get into filmmaking and, in some cases, go even further.
The Mandalorian, the hit Star Wars series which returned this month for its third season, uses an immersive soundstage called The Volume to place its actors in any fantastical scenario its writers can dream up.
Rather than relying solely on green screens, with effects added during post-production, The Volume features a massive wall of screens showing real-time digital environments created with Epic’s Unreal Engine (which powers the popular shooter Fortnite).
This means the actors know where they want their characters to be and changes can be made on the fly.
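The key trick behind an LED wall like The Volume is that the background is re-rendered every frame from the tracked position of the film camera, so distant scenery shifts with correct parallax as the camera moves. The toy sketch below illustrates just that idea – all names and numbers are invented for illustration, using simple 1D parallax (apparent shift = camera offset / depth), not any production system’s actual maths.

```python
def project_background(camera_x, scenery):
    """Re-render the wall for the camera's current position.

    scenery maps each object name to (position, depth); nearer objects
    (smaller depth) shift more than distant ones as the camera moves.
    """
    return {name: round(pos - camera_x / depth, 2)
            for name, (pos, depth) in scenery.items()}

scenery = {"dune": (10.0, 2.0), "mountain": (50.0, 10.0)}

# As the camera dollies right, the wall updates instantly, frame by frame:
for cam_x in (0.0, 1.0, 2.0):
    frame = project_background(cam_x, scenery)
    print(frame)
```

Because the wall responds immediately to camera movement, a director can reframe a shot or revise the environment without waiting for an overnight render – the “changes on the fly” described above.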
Two recent comic book adaptations have also used it: last year’s The Batman and last month’s third Ant-Man film.
Meanwhile, Star Wars actor Ewan McGregor worked on The Volume for his return to the franchise last year, and hailed its transformative impact compared with the films he worked on 20 years ago.
“There was so much bluescreen and greenscreen, and it’s just really hard to make something believable when there’s nothing there,” he said. “And here we were [on Obi-Wan Kenobi] in this amazing set where, when you’re shooting in the desert, the desert is everywhere, and when you fly through space, the stars fly past you. So cool.”
“It’s a huge change”
While Poore doesn’t see the need for traditional digital effects techniques waning anytime soon, the idea of a “virtual production room” where visuals can be generated on the fly will only grow.
At the UK’s National Film and Television School, there is already a whole course dedicated to the subject.
Ian Murphy, director of the school’s master’s course in visual effects, says: “The main change, which is really exciting, is that post-production, which sits at the end of the process, now involves us from the very beginning.
“VFX people are pretty technical, but that forces them to have conversations with production designers and cinematographers on set – and that’s a big shift.
“When you’re shooting on green screen, you’re having some pretty weird, nebulous conversations. The idea of this technology is that the changes are fairly instantaneous. It may not be the finished images – there’s still visual effects work to be done – but something from that process acts as a blueprint that takes you into full production.
“And with the images that you’re now getting from a game engine, certainly the trajectory is that eventually people will see the actual images in the cinema.”
We’ve certainly come a long way from Pong.
You can watch the Academy Awards exclusively on Sky News and Sky Showcase in the UK from 11pm on Sunday 12 March. Also, starting Monday morning, catch all the info from our Oscars special backstage podcast, available wherever you get your podcasts.