This is part three of a three-part series about the movie industry’s switch to digital cameras and what is lost, and gained, in the process. Part one, on the traditional approach to filming movies and the birth of digital, ran on Wednesday; part two, explaining why digital images look different from filmed images, ran yesterday.
As digital technology continued its rapid improvement at the end of the last century and the beginning of this one, even the most fervent opponents of the newer formats came to recognize that digital images could capture details that the human eye, and the film camera, could not see. Furthermore, the new formats have posed aesthetic challenges that some filmmakers have eagerly embraced. Thomas Vinterberg’s 1998 international art-house hit The Celebration (made according to the low-tech strictures of the Dogme 95 movement) was a watershed moment. It was shot by British pioneer Anthony Dod Mantle, perhaps the most important single figure in the history of DV as an artistic medium. He used a now-antique, handheld miniDV camera with the resolution of pre-HD television, but brought a distinctly cinematic sensibility to this traumatic family reunion.
By the early 2000s, more than a few filmmakers began to see the rapidly improving digital technology as a replacement for film. Foremost among them was George Lucas, who drove the development of a new Sony high-resolution digital camera designed for cinematic use, and used it to shoot the first all-digital Hollywood movie, Star Wars: Episode II—Attack of the Clones, in 2002. That was the moment when many people in the film world saw the writing on the wall: If digital filmmaking was not initially cheaper than the photochemical variety, it would soon become so, by eliminating the cost of buying, processing, and transporting thousands of feet of fragile and sensitive material. Furthermore, with digital formats the director and cinematographer can see exactly what they’re getting as they shoot. Editing and effects work, which have mostly been digital enterprises for years, become simpler and more direct, since the entire project exists solely as a data file stored on hard drives, never as a physical object. The venerable movie concept of “dailies”—raw footage processed overnight and viewed a day or two later—no longer pertains, since whatever the camera has recorded can be viewed immediately on a computer, a TV monitor, or even a mobile device.
Lucas, James Cameron, and Peter Jackson, by any measure three of the most successful filmmakers in history, have embraced the digital future with ideological fervor, essentially decreeing that new technologies dictate a new sensibility and new kinds of movies. Given that their movies are grandiose constructed universes loaded with CG (computer-generated) effects, which bear only a limited relationship to photographing the physical world, this seems both logical and consistent.
Many other filmmakers, from Martin Scorsese to David Fincher to David Lynch, have shifted toward digital with varying degrees of uncertainty, largely because of financial pressures from studios. A generation ago, every large and medium-sized American city had at least one lab that processed motion-picture film. Today there are only a handful, most of them clustered in Los Angeles and New York City. Fuji stopped manufacturing motion-picture film in 2013 (as the German firm Agfa had done a few years earlier), leaving Kodak, a company struggling back from bankruptcy, with a virtual monopoly in a shrinking market. A studio executive at Fox recently predicted that film would disappear as a practical production option by about 2017.
In what many observers saw as a death knell for celluloid, 11-time Oscar nominee Roger Deakins, perhaps the most respected cinematographer in the business, shot his last feature on 35mm film in 2010 (the Coen brothers’ True Grit). He announced he was switching to the Arri Alexa, a high-resolution digital camera designed for cinematic use and meant to deliver a high-resolution “film-like” image and a rich, natural color palette. “This camera has brought us to a point where digital is simply better,” he has said. “This is not to say that I don’t still love film—I do. I love its texture and grain, but in terms of speed, resolution, and clarity of image, there is no question in my mind that the Alexa produces a better image.” Interestingly, the Alexa is far from the highest-resolution camera on the market, suggesting that it’s the proprietary image codec (coder-decoder software) inside the Alexa—which movie-makers did not program, and cannot control—that delivers pictures that match or surpass those recorded on photochemical film.
Independent-film pioneer Steven Soderbergh has compared the birth of digital cinema to the birth of modern art, implying that those who want to cling to the photochemical medium are aesthetic reactionaries, trying to expel Picasso from the salon. After shooting his four-hour epic Che with the Red One, a 4K digital camera (meaning it captures images approximately 4,000 dots wide, with about four times as many pixels as standard HD screens) that is the Alexa’s principal competitor, Soderbergh joked that he felt he should call film up on the phone and tell it that he’d met someone. Lena Dunham, the creator of HBO’s epoch-shaping hit show Girls, has specifically suggested that the ease and affordability of digital technology has opened the doors of creativity for young female filmmakers who lack technical savvy: “I thought you gotta be a dude who operates machines to do this job,” she says.
There can be no doubt that the advent of digital cinema has greatly reduced the price of admission for aspiring directors, and in that sense has democratized the medium. Shane Carruth’s critically acclaimed experimental indie Upstream Color, a peculiar blend of romance, thriller, and scientific allegory, which he made on a shoestring budget and distributed himself, was shot with a Panasonic DSLR camera you could order online right now for about $1,000. Its image quality is nowhere close to that of the Alexa or the Red, but it’s vastly superior to consumer-grade DV of only a few years ago, and Carruth uses the camera’s slightly washed-out color palette as an effective aesthetic signature.
The truth is, it’s an oversimplification to divide the film world starkly into pro- and anti-digital camps, as most people in the business have worked in both formats and will happily concede that each has its advantages. Phedon Papamichael, the Oscar-nominated cinematographer of Alexander Payne’s black-and-white rural drama Nebraska, worked in 35mm for most of his career. He feels some sympathy for the pro-film rebels, but suggests they’re making a big deal over what is fundamentally a minor technical matter. He shot Payne’s film in a manner that would have seemed completely backward a few years ago. Nebraska was originally shot digitally (and in color) on the Alexa, then rendered into black-and-white during the digital intermediate process—and then actual images of black-and-white film grain were scanned into the file to lend what Papamichael calls an “organic feeling.”
While digital formats will increasingly dominate the movie industry, reports of film’s death may have been exaggerated. “Hybrid cinema,” meaning films that combine digital and photochemical work, has become more the norm than the exception, at least in prestige Hollywood productions. Even in Alfonso Cuarón’s Gravity, a breakthrough work of digital cinematography and CGI animation (largely shot on the Alexa), cinematographer Emmanuel Lubezki shot the final scene, when Sandra Bullock’s astronaut finally makes it back to Earth, on 65mm photographic film. The quality of light in that scene—direct sunlight, refracted through water and reflected off the amber sands of a beach—is spectacular, and the film grain feels immensely welcoming. It’s as if not just the character, but the film and the audience, have come home. Cuarón and Lubezki evidently sought a visual contrast to the photography in the rest of the movie; even in this digital age, film still delivers a naturalness that can’t be simulated.
Papamichael says he doubts that film can survive in the long run—ultimately, there may not be enough work to keep any labs open—but Mantle, the artistic godfather of digital cinema, is far more optimistic. It was a mistake to view one format as a substitute for the other, he observes, and the competition between film and digital is “a silly old horse race” that should end in a truce.
He suggests we are nearing a point of equanimity, where “enough people understand the value of both,” and that film will not disappear anytime soon. Indeed, he hopes to shoot in 35mm on several upcoming projects. “A competent and responsible image-maker should be able to make their own judgments as to what fits where and why,” he says. “The canvas is still our choice, and thank God for that.”
Andrew O’Hehir is a senior writer at Salon.com.