Thursday, December 6, 2007

Digital vs. Film: The Real Low-Down


Digital technology can never match, let alone surpass, the unassailable superiority of film. Or so say the photo purists. Silver halide crystals, so goes the argument, are inherently superior and will always produce images with higher resolution, better tonality, and greater color fidelity than pixels. This article of faith is just as vigorously challenged by digital photographers, who swear that pixel-based images can capture more detail, higher dynamic range, and more accurate color.
Digital images can be counted and compared by the numbers: pixels, color bit depth, and image file size. Film can be measured and analyzed by densitometers and photospectrometers. While it's easy to compare different digital cameras to each other according to the numbers, as well as discover the relative merits and attributes of different films by their measurements, there's no direct way to do a head-to-head, point-by-point comparison between these two technologies. However, looking closely at some of the physical and chemical characteristics of film will give us clues as to what a realistic equivalence with digital might be.
Just as pixels are represented as tiny squares on monitors and output as round rosettes by printers, images on film are recorded on tiny silver halide crystals, which in photographic parlance are called grain, or granules. This is why film is often referred to as being fine, medium, or coarse grain film. The finer the grain, the more detail it can record. The tradeoff, however, is that fine grain isn't as photo-sensitive as medium or coarse grain, so it is said to have a lower ISO rating. Conversely, coarse grain film, also called high speed film, is much more light sensitive, but like a pixelated image, the granules may be so large that they are annoyingly visible to the eye. Furthermore, they record far less detail.
There are other differences. Fine grain film, because it can record so much more detail, has inherently better tonality and gradation. Coarse grain film, by contrast, has very large granules that cannot resolve fine detail, and is high in contrast because it captures very little tonality.

Which Argument is Right? And Which is Better, Film or Digital?
While it's possible to directly compare a film's grain count with a pixel count, the comparison would be largely meaningless because of the nature of the granules. Pixels have associated bits, called bit depth, that define color and grayscale. In a professional digital camera, anywhere from 36 to 48 bits of data describe the state of each pixel. For instance, if a 1x1" CCD has a density of 2000x2000 pixels and a color bit depth of 42 bits, the total comes to 168 million bits of data, or 21 megabytes.
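To see where that 21 megabyte figure comes from, here is the arithmetic as a quick Python sketch (the 2000x2000 resolution and 42-bit depth are simply the hypothetical values from the example above):

# Back-of-the-envelope data size for the hypothetical 1x1" CCD above.
pixels = 2000 * 2000                  # 4 million pixels
bits_per_pixel = 42                   # color bit depth
total_bits = pixels * bits_per_pixel
total_megabytes = total_bits / 8 / 1_000_000
print(f"{total_bits:,} bits")         # 168,000,000 bits
print(f"{total_megabytes:.0f} MB")    # 21 MB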
A single frame of fine grain 1x1.5" 35mm color film has an estimated 13 to 15 million individual silver halide granules. However, film is analog, not digital, so the state of each granule is, theoretically, infinitely variable. The combination of more tightly packed grains and infinite variability in each grain allows film to capture true continuous-tone images, while digital always has a fixed number of steps or grayscale levels limited by the bit-depth of the pixel.
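The "fixed number of steps" that digital is limited to follows directly from bit depth. Here is a small illustrative sketch, assuming the pixel's bits are split evenly across the three color channels:

# Discrete tonal levels per channel for a few common color bit depths,
# assuming the bits are divided evenly among R, G, and B.
for bits_per_pixel in (24, 36, 42, 48):
    bits_per_channel = bits_per_pixel // 3
    levels = 2 ** bits_per_channel
    print(f"{bits_per_pixel}-bit color -> {levels:,} levels per channel")
# 24-bit color -> 256 levels per channel
# 36-bit color -> 4,096 levels per channel
# 42-bit color -> 16,384 levels per channel
# 48-bit color -> 65,536 levels per channel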
To set the scale of detail, a digital camera's resolution is measured by the total number of lines it can resolve before they begin to run together. A typical 3 megapixel digital camera has a resolving power of about 1,000 lines over the entire image sensor. So, if the CCD is 1/2" in size, that amounts to a total resolution of 2,000 lines per inch. By contrast, the resolution of fine grain 35mm Kodachrome film is about 2,200 lines per millimeter. That's more than 50 times better raw resolution than digital. Using this for comparison, film scientists sometimes peg Kodachrome's digital equivalent as a 100 megabyte file. Of course, larger film formats (2 1/4 x 3 1/4", 4x5", etc.) will yield correspondingly more data and overall resolution than a 35mm frame.
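For readers who want to put those resolution figures into common units, here is a quick conversion sketch. It is purely illustrative, using only the numbers quoted above; actual resolving power varies with lens, film stock, and test method:

# Converting the quoted resolution figures into comparable units.
MM_PER_INCH = 25.4

digital_lines = 1_000                   # typical 3 MP camera, whole sensor
sensor_width_inches = 0.5               # 1/2" CCD
digital_lines_per_inch = digital_lines / sensor_width_inches    # 2,000

film_lines_per_mm = 2_200               # fine grain Kodachrome, as quoted
film_lines_per_inch = film_lines_per_mm * MM_PER_INCH           # ~55,900
film_lines_per_frame = film_lines_per_mm * 36                   # ~79,200 across a 36mm-wide frame

print(digital_lines_per_inch, film_lines_per_inch, film_lines_per_frame)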
In real life, however, the resolving powers of these respective technologies are actually much closer than the numbers would indicate. That is because of differences in gamuts, or the range of data capable of being represented and reproduced by various input and output devices such as image sensors, film, printers, and monitors. Film's gamut is much wider than digital's, but both come up against a seemingly unavoidable bottleneck: the output device's gamut. For film, that's how many lines photographic paper is capable of resolving; for pixels, it's the number of lines that can be reproduced by an inkjet printer or computer monitor. Both paper technologies are similar in terms of clay coating, brightness, opacity, and gamut. The net result is that, for all the extra resolution film captures, it will produce about the same amount of detail in print as an image sensor. The only way to make use of film's much greater resolution is either by enlargement (a 35mm frame can be blown up to 14x17" or even 16x20" without an appreciable loss of quality) or by viewing it via transmitted rather than reflected light, as with a slide projector.
Digital vs. Film

[Image comparison: the same scene, taken with 20mm, 35mm, and 100mm lenses, pitting the Canon D30 (digital) against the Canon EOS-1V (film). While it's not possible to truly demonstrate the differences between film and digital on your computer monitor, you can at least get an idea that, for monitor output, it's hard to tell them apart.]
Tonality is also a tossup, since analog film can, by its very nature, produce greater tonality, while some digital technology, by design and engineering, can capture an appreciably higher dynamic range.
Because the photosensitivity of each granule is set by its chemical composition and size, there are limits on how much light it can register and absorb. Too little light, and no photochemical reaction takes place. Too much light, and the photons overwhelm the granule and blow it out to a uniform monochromatic black (which, upon reversal, turns all white). The range of normal sensitivity from light to dark is called a film's dynamic range, and it is expressed not in decibels, as an image sensor's is, but on a logarithmic density scale of 0 to 5. A medium grain film averages a dynamic range of about 4.0, depending upon the brand and emulsion. That translates into about 6-7 f-stops on a camera lens. Anything above or below is rendered as monochromatic black or white, with no detail whatever.
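To make that clipping behavior concrete, here is a toy sketch. It is not a real film response model; the usable range is a hypothetical value loosely matching the 6-7 stop figure above:

def record_exposure(stops_from_mid_gray, usable_range_stops=6.5):
    # Toy model: detail registers only within the usable exposure range.
    # The range value is hypothetical, loosely matching the 6-7 stop
    # figure quoted above.
    half_range = usable_range_stops / 2
    if stops_from_mid_gray > half_range:
        return "pure white, no detail (blown out)"
    if stops_from_mid_gray < -half_range:
        return "pure black, no detail (blocked up)"
    return f"usable detail at {stops_from_mid_gray:+.1f} stops"

for ev in (-5, -3, 0, 3, 5):
    print(ev, "->", record_exposure(ev))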
Some digital camera image sensors are capable of capturing significantly more detail than film in the highlights and shadows. A typical high-end image sensor, such as the Philips 2K x 2K CCD found in many professional camera backs, can capture 11 or more stops of data, roughly double that of a comparable film stock. But again, the extra dynamic range is useful only if the gamut of the output technology allows it to be accurately reproduced.
There are other technical differences that tip the balance toward digital or film, such as color fidelity, saturation, and so on. But at the end of the day, a good professional photographer can get comparable quality from either technology. And yes, it's entirely possible to use a digital camera to produce images that can be output as Ansel Adams-like museum-quality prints.
