NEW YORK "Digital video seems cool," says Robert Mastronardi, a Kodak account manager for the New York region, "but when you think about it, film is really cool too. It has been perfected for over a hundred years. Modern color film has over nine different color-sensitive layers, attached to a plastic base, all in a width that's less than a human hair." He sounds just a bit pleading, like a guidance counselor trying to convince kids that sexual abstinence is really hip.
I was at the "Ultimate NFL Films Shoot-out: Film vs. Video" screening, held in New York at Kodak's private theater to convince industry professionals that film is still king.
And it is. While everyone can see that the future of theatrical cinema is in video projection, and the future of film post-production is clearly video/computer technology, the future of image acquisition remains film. At least for the really slick, high-quality, high-gloss, high-budget productions. No one argues that video looks better; the argument is always that it's cheaper.
According to Kodak, sales of 16mm film stock, which were in decline all through the 1990s and earlier this decade as film schools switched in droves from 16mm to DV, are now on the rise again. The professional community, it seems, has come to appreciate the widescreen Super-16 format as the ideal camera format for shooting HDTV-compatible television programs.
The list of TV programs shot on 16mm film includes Sex and the City, Scrubs, Gilmore Girls, Malcolm in the Middle, The O.C., The Shield, Jonny Zero and Law & Order: Trial by Jury.
NFL Films has been shooting pro football on 16mm since the 1960s, and its extensive use of slow motion, often shot at 120 frames per second, makes all current HDTV cameras inadequate for its needs (including the professional Sony HDW-F900/3 HDCAM used in the comparison shoot). Then too, there's the "film look," something video cameras can't easily mimic, even with extensive post-production trickery.
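To put a number on the slow-motion requirement: the slowdown factor is simply the capture rate divided by the playback rate. A back-of-envelope sketch in Python (the 120 fps figure is NFL Films'; the playback rates are just the common standards):

    # Slow-motion factor = capture frame rate / playback frame rate.
    capture_fps = 120  # NFL Films' typical overcranked rate

    for playback_fps in (24, 30, 60):  # common playback standards
        factor = capture_fps / playback_fps
        print(f"{capture_fps} fps footage played at {playback_fps} fps: {factor:g}x slower")

    # 120 fps footage played at 24 fps: 5x slower
    # 120 fps footage played at 30 fps: 4x slower
    # 120 fps footage played at 60 fps: 2x slower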
"I think of videotape like Formica," says Steve Sabol, president of NFL Films, in one of Kodak's promotional brochures. "It's smooth and has no depth. Film is like wood with subtle textures. We're romantic and we're storytellers."
Everyone in attendance -- mostly independent producers and wannabes, plus several DPs (directors of photography, aka cinematographers) for feature films -- agreed the film look is superior. Several of the DPs had worked with state-of-the-art professional HDTV cameras as well as with 35mm and 16mm film cameras.
The technical arguments for film, as presented here, boiled down to:
*VASTLY BETTER DYNAMIC RANGE, or contrast range (In cinematographers' terms, film has a 12 to 15 stop range, while professional video is typically 7 to 8 stops -- see the sketch after this list)
*BETTER DETAIL THAN HD, even from 16mm (When film is scanned into video, the quality of the scan is described as a "2K" or "4K" scan, based roughly on the horizontal pixel resolution. Kodak says 16mm makes a very nice 2K scan, meaning it can produce a pleasing 1920 x 1080 HDTV image.)
*SMOOTHER SLOW MOTION (due to variable frame rate cameras)
*MORE ACCURATE FOCUS with optical viewfinder
*SHALLOWER DEPTH OF FIELD (due to lower sensitivity of film -- see footnote below)
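To make the dynamic range numbers concrete: each stop is a doubling of light, so an N-stop range corresponds to a contrast ratio of 2^N to 1. A quick sketch using the figures quoted above (the stop counts are the ones presented at the screening; the conversion is just arithmetic):

    # Each photographic stop doubles the light, so an N-stop dynamic
    # range corresponds to a scene contrast ratio of 2**N : 1.
    ranges = {"film, low estimate": 12,
              "film, high estimate": 15,
              "pro video, low estimate": 7,
              "pro video, high estimate": 8}

    for label, stops in ranges.items():
        print(f"{label}: {stops} stops = {2**stops:,}:1")

    # film, low estimate: 12 stops = 4,096:1
    # film, high estimate: 15 stops = 32,768:1
    # pro video, low estimate: 7 stops = 128:1
    # pro video, high estimate: 8 stops = 256:1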
From an image sensor engineer's perspective, these items would all appear to be challenges. Go down the checklist, one by one. Over time, each can be addressed.
But we all know that even after the issues in this "basic" checklist are addressed, a new checklist will appear with more subtle issues. And then there will be another list. And another.
So I pose the question: Will it ever end? Will the image coming out of a video camera ever truly match the image from a film camera, to the satisfaction of the most demanding cinematographers' eyes?
Sure, it's agreed video will take over as an intermediate-processing format (what the film industry calls a "DI," for digital intermediate), and for distribution.
But what about image "acquisition"? This is where film remains king, and I suspect this will continue for a long, long time.
A bit of experience from the world of professional audio may be informative here. By now, practically all professional recording studios have replaced their old analog mixing consoles and recording equipment with digital gear. But you may be surprised to learn that on the very front end of the whole recording process -- the microphone -- some of the best, most expensive (over a thousand dollars) microphone models continue to use vacuum tubes for their built-in pre-amps. That's right, vacuum tubes. To the highly discerning ears of many studio-recording engineers, the ultra-low noise MOSFET amplifiers used in other microphones just don't match the purity of tubes.
Think about that as you ponder the film versus video dichotomy.
And, on a related topic, check out this article about whether film grain should be included in new standards for high-definition DVDs: "Go with the grain, film R&D chief urges, for art's sake."
The argument that video is too sensitive to light, producing greater depth of field, seemed almost laughable to me at first, yet it was a significant part of NFL Films' argument for 16mm film. Obviously, I thought, one could always reduce depth of field by cutting the amount of light entering the camera, forcing a wider aperture setting. But when I suggested out loud that a neutral density filter could do the trick for video, there was a sense that I just didn't get it. Cinematographers are purists. Why put another piece of glass between the image and the image sensor when, with film, you don't have to?
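For the record, here's the arithmetic behind my neutral density suggestion. Each stop of ND halves the light, f-numbers advance by a factor of the square root of 2 per stop, and depth of field scales roughly with the f-number (focal length and subject distance held constant). A minimal sketch, assuming the standard ND labeling of 0.3 optical density per stop:

    import math

    # Each stop of neutral density halves the light; f-numbers advance
    # by sqrt(2) per stop.  Opening the aperture by the same number of
    # stops restores exposure while shrinking depth of field, which
    # scales roughly with the f-number, other factors held constant.
    def compensated_f_number(f_number, nd_stops):
        """Wider f-number needed to offset an ND filter of nd_stops."""
        return f_number / math.sqrt(2) ** nd_stops

    for nd_stops in (1, 2, 3):
        density = 0.3 * nd_stops  # standard ND labeling: 0.3 per stop
        print(f"ND {density:.1f} ({nd_stops}-stop): "
              f"f/8 opens to f/{compensated_f_number(8, nd_stops):.1f}")

    # ND 0.3 (1-stop): f/8 opens to f/5.7
    # ND 0.6 (2-stop): f/8 opens to f/4.0
    # ND 0.9 (3-stop): f/8 opens to f/2.8

So a three-stop ND filter turns an f/8 exposure into an f/2.8 exposure, with the shallower depth of field that implies. The cinematographers' objection wasn't to the math; it was to the extra piece of glass.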