One of the blights of the Internet in general and social media in particular is that self-professed experts on any given subject have access to a treasure trove of information but more often than not choose to ignore it in favor of their preconceived notions. In the realm of film, this generally translates to certain topics popping up on a regular basis, backed by the same fallacies each time. One such topic: the use of CGI (Computer Generated Imagery) in films.
For a few years now, the existence of movies – usually four-quadrant blockbusters made by the major studios – with admittedly poor visual effects has led to widespread online condemnation of CGI in general, as though the mere use of it were a crime against the art of filmmaking. The Marvel Cinematic Universe is the go-to scapegoat for this, the shorthand for CGI as a lazy substitute for practical effects and sets.
In particular, what has been making the rounds this week is a behind-the-scenes clip from Spider-Man: No Way Home, and what irked certain fans is the fact that Spidey’s mask was added digitally in some shots where he’s putting it on or taking it off. Now, even setting aside the film’s production circumstances (it was shot during the pandemic, and some scenes had to be readjusted, with varying results), that’s something the franchise has been doing since the Sam Raimi films: for scenes where the actor is wearing the full suit and mask, the mask is placed over a specially designed helmet – essentially a face shield – that keeps it looking smooth. For shots where the hero puts it on or takes it off, CGI is required; otherwise, the only option is to cut away at the crucial moment.
The anti-CGI crusade has birthed some truly baffling takes, not least the notion that more established directors shy away from digital trickery. This is far from the truth: Martin Scorsese has used it to some extent in many of his films over the past two decades, as has David Fincher, who mentioned in one interview that his movies contain, on average, more CGI shots than a Marvel movie (the snow in The Girl with the Dragon Tattoo, for example, was added in post-production because there wasn’t enough precipitation during the location shoot in Sweden).
The real issue isn’t so much the CGI in and of itself, but how it’s implemented and, more importantly, how much time the teams in charge of it are given. The average blockbuster typically schedules its most effects-heavy scenes first, so the post-production team has more time to work on them even while the rest of the film is still shooting. Compare that to something like Ant-Man and the Wasp: Quantumania, whose post-production schedule was cut by almost five months after the release date changed, and which had scenes added late in the process, with even less time to work on them properly (that said, there is no version of the film in which the character M.O.D.O.K. would not have looked ridiculous while retaining a comic book-accurate design).
Like most other items at a filmmaker’s disposal, CGI is a tool, and it needs to be used well. There has been no marked downgrade from 2008 to today, despite what one particularly disingenuous tweet tried to suggest (primarily because one of the “2008” images was actually from a 2017 film): for every Pirates of the Caribbean movie, with its impeccable effects work, there was an Indiana Jones and the Kingdom of the Crystal Skull and its wonky digital ants; and for every Ant-Man 3 there’s an Avatar 2 or a Kingdom of the Planet of the Apes, pushing photorealism to new, dizzying levels.
Digital effects are not going anywhere, much to the chagrin of certain corners of the Internet. What needs to disappear, on the other hand (and apparently that is happening over at Marvel, for example), is the “we’ll fix it in post” mentality. Because not everything is fixable if you haven’t planned it properly.