Two terms often trotted out to describe movies are “wonder” and “magic.” The notion that watching a Hollywood film is a transporting experience as much as it is just a brief diversion is fed to us everywhere, from self-congratulatory Oscar ceremonies to those occasional AFI primetime list specials to those ridiculously overdramatized multiplex prologues that whisk you away to Movie World, whether via a space coaster or, my favorite, by having your movie theater seat bloom into a mess of neon flowers and tentacled vines that rises up into a glorious celestial night-time. If only the movies were this exciting. They certainly used to feel that way, or so says every grown person, thinking back fondly on the supposedly more innocent spectacles of their youths.
In my particular generation, a specific problem emerges that fudges the gap between the subjective tinge of nostalgia and the industrial facts of film history. In terms of visual wonder, things really did change as we passed through the tender age of childhood awe. After the breakthrough of Tron in 1982, the transition from practical to computer-generated effects was gradually and definitively accomplished, spearheaded by the company Industrial Light and Magic—from around 1985 forward, when the first CGI-created character appeared against a live-action background in Young Sherlock Holmes, to the culmination that was Jurassic Park in 1993. We saw the evolution as more and more previously unthinkable things took shape on-screen, from Holmes’s stained-glass window-man come to life to The Abyss’s snake-like water creature and Terminator 2’s liquid metal alchemies. Or those prehistoric animals we previously could only dream about seeing move and breathe in such an authentic-looking fashion. With CGI now the industry standard, there is seemingly no limit to what can be accomplished, or to the jadedness of most viewers. We never wonder, “How did they do that?” anymore, since the answer is readily apparent. Technicians are less fettered than ever, but the same isn’t necessarily true of imagination, whether the moviemakers’ or the audience’s.
One year before Jurassic Park, there still seemed to be discernible, almost poignant limits on what movies could accomplish. In 1992, even the most rudimentary computer graphics registered as somehow impressive, as best exemplified by Brett Leonard’s The Lawnmower Man. By any measure this is an asinine film, and even at the time it looked simultaneously cutting-edge and bargain-basement. If you missed this upon its first release, no generosity of spirit could enable you to understand how effective it once was. But for the sake of argument, enter the mind of a thirteen-year-old for whom the notion of an alternative cyber universe was both adorable and terrifying. The effects always kind of looked as though some pimply, hairy-palmed kid fashioned them in his basement on his dad’s Power Macintosh; yet at the same time it was clear some kind of new portal had been opened.
This cautionary tale of science run amok—its target: virtual reality!—is crude enough that it might have made Michael Crichton balk, but its visuals of a CG purgatory are memorably despairing. Here, in some vast computer mainframe, the animated avatar of the film’s antihero—a former simpleton-turned-genius (straight-to-video star Jeff Fahey), the test subject for a series of government virtual reality experiments—is swallowed up whole, and somehow able to enter a universal network. This is not a good thing, you see, because becoming a virtual being has given him limitless power and therefore corrupted him. Even before he announces his arrival on the world stage by making all the phones on earth ring simultaneously in this pre-cell phone, dial-up age, the lawnmower man (thusly named because he mows lawns, in case you were wondering, but also because it was the title of a 1975 Stephen King short story to which the film bears absolutely zero relation) is one scary cyber-menace, opening wide his pixelated maw and screaming things like “The universe is mine!” and, famously due to the film’s many TV trailer spots, “I am God here!” Slightly lessening the character’s impact are his weirdly spindly limbs and rotating joints that look like they were stuck on with the ’toon equivalent of metal fasteners.
Speaking of human bodies recontextualized, later that year saw the release of Death Becomes Her, directed by filmmaker and technology envelope-pusher Robert Zemeckis. The movie received scads of attention and an eventual Oscar for its groundbreaking digital money shots: Goldie Hawn’s reawakening after being pumped full of lead with a hole clean through her stomach; or Meryl Streep walking and talking with her head screwed on backwards—prompting the immortal Streepian line, “I can see my ass!” (With Wes Craven’s Music of the Heart still to come, this wasn’t even the actress’s lowest moment.) Such grotesque images were all the more remarkable for the fact that they were visited upon such respectable, award-winning actresses.
These miracles of modern science were so heavily discussed that they overshadowed the real offense, which was the film itself, a satire of Hollywood vanity that treated women over a certain age as competitive, shrewish, gladly self-mutilating egomaniacs. (It’s also, along with The First Wives Club, one of two satires in which Hawn gave audiences a preview of her own plastic surgery wreckage.) Death Becomes Her is essentially an epic catfight in which Streep’s fading has-been actress Madeleine and her vengeful childhood friend Helen (Hawn), bitter that Madeleine stole her fiancé years earlier, are seduced into taking eternal-youth potions and then proceed to beat the living shit out of each other for well over an hour. In a film without a charitable bone in its body, there is no humanity to Madeleine and Helen (Mad vs. Hel . . . cute?); they are simply the worst that Hollywood can offer, repulsive Norma Desmonds for the digital age.
But the most pleasantly overloaded effects movie of that year was Francis Ford Coppola’s Bram Stoker’s Dracula. In order to evoke the feel of the early Hollywood monster movies that inspired him, Coppola refused to take advantage of any of the latest in digital technology, instead only wishing to employ in-camera techniques that would have been possible even in the silent era. The result is a film that has dated better than any other effects-driven entertainment of 1992. It’s a genuinely unsettling movie out of time, tactile and purposefully distorted. There’s a choppiness to the film’s many wondrous images—a man hoisted into the air by a clawed hand with the lightest of touches; the Count scaling the side of the building like a slithering salamander—that registers to the eye beautifully, avoiding entirely the strange haziness and unconvincing fluidity of CGI effects, especially those of the period.
Rear projection was used instead of compositing, forced perspectives instead of postproduction trickery, miniatures and mattes in the place of digital landscapes. Film is occasionally sped up or run backwards. Characters look monstrous due to good old-fashioned makeup and latex. At the time of the film’s release, these were not necessarily atypical techniques, but after the latest Zemeckian miracles, such bold aesthetic choices undoubtedly encouraged a subliminal emotional recalibration. We didn’t say “wow” when watching Coppola’s film; we were too immersed in the world it created to be taken out of it for such trivialities.
Michael Koresky is a staff writer at the Criterion Collection, as well as the co-founder and editor of Reverse Shot. His writing has appeared in Film Comment, Cinema Scope, indieWIRE, Moving Image Source, and The Village Voice.