The five highest-grossing films of 2019 had one thing in common: they all relied heavily on visual effects and 3D animation.
The top five (Avengers: Endgame, The Lion King, Toy Story 4, Captain Marvel, and Frozen II) all used the cutting edge of what computer imaging had to offer in 2019. So as we dive headfirst into the new year, New York Film Academy (NYFA) surveyed our instructors and alumni, who worked on dozens of movies, games, and television shows this year, to find out what 2019's biggest trends were and how they will lead to the big trends of 2020.
Paradigm Shift in Buyers
If you want to pitch an animated show, you are in luck: there have never been more producers buying animated works. In the recent past, the main purveyors of animated series were Nickelodeon, Cartoon Network, Fox, and Comedy Central, and a good number of their shows were developed internally from libraries of existing IP.
But with streaming services like Netflix and Amazon now major players in the industry, more series than ever are being made. Hulu, TBS, Apple, Disney+, HBO Max, and Quibi have also thrown their hats in the ring, and more major and minor services are right on their heels. Because the streaming competition is international, and because animation (generally) travels well overseas and is not ballooning in cost the way live-action productions are, animated series are becoming a staple of streaming services.
Virtual Production
Visualizing the final film before it shoots has never been more difficult. Modern tentpole films rely on ever more VFX, digital sets, and CG characters, which means what you capture on set is bits and pieces of plates and green screens that will need to be stitched together in post. This makes it hard for directors and other creatives to ensure that what they are getting in camera is right for what they want.
That’s where virtual production comes in. Virtual production uses real-time 3D tracking and visualization, on set, to approximate what the final set extension or 3D VFX will look like in post, while capturing actors reacting to it in real time. A rudimentary form of this technology has been used in live broadcast for decades, like the first-down line on a football field that is keyed to the ground and matches perspective across multi-cam cutting, or the real-time weather graphics that respond to the meteorologist’s movements. However, advancements in game engines and real-time rendering have allowed franchises like The Lion King to use VR technology, like that which NYFA Game Design alum Guillermo Quesada helped develop, to visualize what a fully CG set looks like when captured using conventional directorial and lensing techniques.
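The perspective matching that keeps a first-down line (or a virtual set extension) "locked" to the real world comes down to projecting 3D points through the tracked camera. A minimal pinhole-camera sketch illustrates the idea; the camera position, focal length, and point coordinates below are invented for illustration, not taken from any real tracking system:

```python
# Minimal pinhole-camera projection: once the real camera's position and
# focal length are tracked, a virtual 3D point can be projected into the
# frame so the graphic matches the live plate's perspective.
# All numbers below are illustrative assumptions.

def project(point, cam_pos, focal_px):
    """Project a world-space point into pixel offsets from the image center."""
    # Translate the point into camera space (camera looks down +Z, no rotation).
    x = point[0] - cam_pos[0]
    y = point[1] - cam_pos[1]
    z = point[2] - cam_pos[2]
    # Classic perspective divide: farther points fall closer to the center.
    return (focal_px * x / z, focal_px * y / z)

# A virtual marker 1 m right of the lens axis, 5 m away:
print(project((1.0, 0.0, 5.0), (0.0, 0.0, 0.0), 1000.0))  # (200.0, 0.0)

# Dolly the camera 2.5 m closer and the same marker projects wider --
# exactly how a keyed graphic must update to stay pinned to the ground:
print(project((1.0, 0.0, 5.0), (0.0, 0.0, 2.5), 1000.0))  # (400.0, 0.0)
```

A real virtual-production pipeline adds camera rotation, lens distortion, and full rendering, but this perspective divide is the core of why a tracked camera lets CG elements sit convincingly in a live frame.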
Work Stations in the Cloud
Despite decreases in GPU costs, a modern top-of-the-line workstation can still set an artist back $5,000. Spread across hundreds of artists, this is quite a costly investment for traditional VFX companies, which is why some artists and VFX houses are turning to “cloud” computing.
The most resource-intensive part of most shows is rendering. This is where cloud computing comes in: if a company does not need to own a render farm, or even a RAM farm for generating previews, it can save hundreds of thousands of dollars and spend more time on the art rather than the computing. Artists, companies, and supervisors can “rent” time to calculate the preview or render of the shot they are working on, paying only for what they need from more advanced machines, switching back to their local workstations for tweaks, and then sending the shot to a cloud farm for finishing. This process allows boutique houses to compete with bigger players while keeping most of the money on the screen.
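The economics of renting versus owning can be sketched with a back-of-the-envelope calculation. Every figure here (farm size, cloud rate, workload) is a hypothetical assumption for illustration; real hardware and cloud prices vary widely:

```python
# Back-of-the-envelope: buying a render farm vs. renting cloud render time.
# All figures are hypothetical assumptions, not real market prices.

NODE_COST = 5_000     # assumed up-front cost per render node ($)
OWNED_NODES = 100     # size of a hypothetical in-house farm
CLOUD_RATE = 1.50     # assumed cloud price per node-hour ($)

def owned_farm_cost(nodes=OWNED_NODES, node_cost=NODE_COST):
    """Up-front capital cost of buying the farm outright."""
    return nodes * node_cost

def cloud_cost(node_hours, rate=CLOUD_RATE):
    """Pay-as-you-go cost for the same amount of computing."""
    return node_hours * rate

# A boutique house that only needs ~50,000 node-hours for a season:
print(owned_farm_cost())   # 500000 -- capital tied up in hardware
print(cloud_cost(50_000))  # 75000.0 -- rent only what the show needs
```

Under these assumed rates, the boutique house would need well over 300,000 node-hours of rendering before owning the farm beat renting it, which is why pay-as-you-go makes sense for bursty, show-driven workloads.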
Real-Time Rendering
One of the most time-consuming parts of the VFX and animation process is rendering. On the first Frozen film, a single frame could take 30 hours to render, and at 24 frames per second, the render times add up fast. Video games, on the other hand, have been rendering at 60 fps for decades, though not quite at the quality expected for a broadcast or theatrical experience. The Unreal game engine is changing this. With Epic Games (Fortnite) pouring resources into real-time rendering for use in animation and VFX, it is now possible to render in seconds media that previously took hours. For those looking to learn the tools of this future, Unreal is the software for you.
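The 30-hours-per-frame figure implies staggering totals for a feature. A quick sanity check of the arithmetic (the runtime and farm size are assumptions for illustration):

```python
# How render times add up at 30 hours per frame, 24 fps.
# The runtime and farm size below are illustrative assumptions.

HOURS_PER_FRAME = 30   # reported per-frame render time on the first Frozen
FPS = 24               # frames per second of finished film
RUNTIME_MIN = 100      # assumed feature runtime, in minutes

frames = RUNTIME_MIN * 60 * FPS        # total frames in the film
node_hours = frames * HOURS_PER_FRAME  # render time if done serially
print(frames)                          # 144000 frames
print(node_hours)                      # 4320000 node-hours

# Even a (hypothetical) 4,000-node render farm needs weeks of wall-clock time:
days = node_hours / 4_000 / 24
print(round(days, 1))                  # 45.0 days
```

Numbers like these are why shaving render time per frame, or rendering in real time, changes what a production can afford to attempt.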
AI and Machine Learning
“Deepfakes” and “machine learning” have become daily terms in our newsfeeds, and they are affecting the VFX industry as much as (if not more than) any other. Head replacement, digital doubles, de-aging, or having an actor give a posthumous performance, as seen with Peter Cushing in Rogue One: A Star Wars Story, requires a tremendous amount of frame-by-frame, pixel-perfect work across dozens of software packages. This year, a deepfake plugin was released for After Effects, allowing artists to use machine learning to create “photo-realistic” deepfakes with little to no coding knowledge, training the algorithm themselves on their home machines.
Written by Matt Galuppo, Associate Chair of NYFA 3D Animation & VFX