Avatar won the Best Cinematography Oscar, which brings up an interesting question. As one of my movie friends so aptly put it: "C'mon, that is not cinematography. That is 3D visual effects." The movie is probably about one-third live action footage and about two-thirds computer generated imagery, so which are we counting? If we're just talking about the footage that was run through an actual camera -- the footage of actors sitting and chatting in rooms in front of scientific machines -- it's really not all that impressive. If we're talking about the much more impressive footage that was generated in a computer, then we've entered a strange new realm. And it's something that the Academy is going to have to address.
If computer-generated imagery counts, then movies like Up should also be considered for Best Cinematography, and taken shot-for-shot, that movie is much more beautiful and visually effective than Avatar. Computer artists use different tools from cinematographers to achieve the same ends. They use light, and control the way the light falls into the picture. The light can be strong or soft, originating from all different angles. It can be a harsh spotlight or deep shadows or a bright sky. Computer artists also arrange their frames in much the same way, placing a figure in an empty landscape to communicate isolation, or surrounding the figure with buildings or trees to convey confinement or community.
Both kinds of artists can create moving shots as well. In a computer, there is literally no limit to the kinds of moving shots that may be created, but a real camera is limited by physical space, gravity, the availability of equipment, weather, and a million other factors. Thus, ironically, a live-action movie may choose to use computer effects to enhance or complete a shot that could not be made live (such as a tracking shot moving from outer space, through the stratosphere, and in through a character's window). And it goes without saying that both kinds of cinematography require a certain kind of artistry, a trained eye, which not just anyone possesses.
Those are the similarities, but there are major differences. If an image is created in a computer, it can be completely controlled. Actors always hit their marks, and weather is never a problem. A digital artist can create any kind of shot in a comfy room, wearing his jammies and bunny slippers, while a traditional cinematographer must brave the elements. Certainly live cinematographers control much of their work, sometimes taking hours to set up certain shots, waiting for the right light, or creating it with a myriad of electric lights. But there's a certain risk in shooting something from life, and actually exposing film to do it. Sometimes it doesn't work out, and sometimes accidents happen that make things better. Sometimes a lens flare will make a scene more beautiful, and I've seen artificial lens flares created for digital animation, imitating live cinematography without actually capturing it.
Then there is the definition of the word "cinematography" itself, which implies film exposed through a camera -- though the other half of the definition, "the art of," applies equally to computer artists. But if an image is created in a computer, is it cinematography, or is it special effects? I'm not bemoaning the use of computers to make movies. Certainly no camera alone could create something as gorgeous as Avatar or Up, or at least not without a ton of money and imagination. But at the same time, no computer could create something quite as gorgeous as previous Oscar winners for Best Cinematography like Sunrise (1927), Black Narcissus (1947), The Third Man (1949), The Hustler (1961), Barry Lyndon (1975), Days of Heaven (1978), Apocalypse Now (1979) or a dozen others.
In other words, there are enough differences between the two approaches that I want to propose a solution. From 1939 to 1967, the Academy gave separate awards for Black and White and Color cinematography. It was a period of change as color film became more and more sophisticated and cheaper to produce, while black and white became more of a specialty item, and no longer the cheaper alternative. Since this is also a period of change, I propose that the Academy implement two Cinematography categories, one for Best Digital Cinematography and one for Best Film Cinematography, or something to that effect.
The names would be important. You couldn't use "Best Analog Cinematography," because that would imply a complete non-reliance on computers, and traditional film cinematography uses digital enhancing as well. We'd also have to consider another of this year's nominees, Harry Potter and the Half-Blood Prince; though it was shot mainly with live actors and on actual film, I'd argue that it would belong in the digital/computer category, since a large portion of it was shot on "green screens." But when you look at a more traditional nominee like the beautiful black-and-white The White Ribbon, which likely used little or no computer enhancing, it seems pretty clear that it's in an entirely different category.
Your thoughts, dear readers? A good solution, or more trouble than it's worth?