Digital intermediate (typically abbreviated to DI) is a motion picture finishing process that classically involves digitizing a motion picture and manipulating its color and other image characteristics.
It often replaces or augments the photochemical timing process and is usually the final creative adjustment to a movie before distribution in theaters. It is distinguished from the telecine process, in which film is scanned and color is manipulated early in post-production to facilitate editing. However, the line between telecine and DI is continually blurring: the two processes are often carried out on the same hardware by colorists of similar background, and both form part of a motion picture's overall color management at different points in time. A digital intermediate is also customarily done at higher resolution and with greater color fidelity than a telecine transfer.
Although originally used to describe a process that started with film scanning and ended with film recording, digital intermediate is also used to describe color correction, color grading, and even final mastering when a digital camera is used as the image source and/or when the final movie is not output to film. This is due to recent advances in digital cinematography and digital projection technologies that strive to match film origination and film projection.
In traditional photochemical film finishing, an intermediate is produced by exposing film to the original camera negative. The intermediate is then used to mass-produce the prints that are distributed to theaters. Color grading is done by varying the amount of red, green, and blue light used to expose the intermediate. The digital intermediate process seeks to replace or augment this photochemical approach to creating the intermediate.
The digital intermediate process uses digital tools to color grade, which allows for much finer control of individual colors and areas of the image, and allows for the adjustment of image structure (grain, sharpness, etc.). The intermediate for film reproduction can then be produced by means of a film recorder. The physical intermediate film that results from the recording process is sometimes also called a digital intermediate, and is usually recorded to internegative (IN) stock, which is inherently finer-grained than original camera negative (OCN).
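To illustrate the per-channel control that digital grading provides, the following Python sketch applies a simple lift/gamma/gain adjustment to a normalized RGB pixel. The function name and parameterization are hypothetical, chosen only to echo the red/green/blue printer-light adjustments of photochemical timing; real grading systems (for example, those implementing the ASC CDL) use more elaborate transfer functions.

```python
def grade_pixel(rgb, lift, gamma, gain):
    """Apply a simple per-channel lift/gamma/gain grade.

    All values are normalized to [0, 1]: lift raises the blacks,
    gain scales the whites, and gamma bends the midtones.
    """
    graded = []
    for value, lf, gm, gn in zip(rgb, lift, gamma, gain):
        v = value * (gn - lf) + lf               # remap black/white points
        v = max(0.0, min(1.0, v)) ** (1.0 / gm)  # midtone adjustment
        graded.append(v)
    return tuple(graded)

# Warm up a neutral mid-grey by slightly boosting the red gain:
print(grade_pixel((0.5, 0.5, 0.5),
                  lift=(0.0, 0.0, 0.0),
                  gamma=(1.0, 1.0, 1.0),
                  gain=(1.1, 1.0, 1.0)))
```

Because each channel carries its own lift, gamma, and gain, individual colors can be adjusted independently, which is precisely the kind of fine control that photochemical timing, limited to overall exposure through red, green, and blue light, cannot offer.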
One of the key technical achievements that made the transition to DI possible was the use of 3D look-up tables ("3D LUTs"), which could be used to mimic how the digital image would look once it was printed onto release print stock. This removed a large amount of skilled guesswork from the film-making process, and allowed greater freedom in the color grading process while reducing risk.
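A 3D LUT can be thought of as a cubic lattice of measured output colors, with inputs that fall between lattice points reconstructed by trilinear interpolation. The following Python sketch shows that core interpolation step; the function and variable names are illustrative, and production systems use dedicated LUT formats and libraries (such as those handled by OpenColorIO) rather than nested lists.

```python
def apply_3d_lut(rgb, lut, size):
    """Trilinearly interpolate an RGB triple through a 3D LUT.

    `lut[r][g][b]` holds an output (r, g, b) triple on a cubic grid
    of `size` samples per axis; inputs are normalized to [0, 1].
    """
    out = [0.0, 0.0, 0.0]
    # Position within the LUT grid.
    pos = [min(max(c, 0.0), 1.0) * (size - 1) for c in rgb]
    base = [min(int(p), size - 2) for p in pos]
    frac = [p - b for p, b in zip(pos, base)]
    r0, g0, b0 = base
    fr, fg, fb = frac
    # Blend the 8 surrounding lattice points.
    for dr in (0, 1):
        for dg in (0, 1):
            for db in (0, 1):
                w = ((fr if dr else 1 - fr) *
                     (fg if dg else 1 - fg) *
                     (fb if db else 1 - fb))
                sample = lut[r0 + dr][g0 + dg][b0 + db]
                for i in range(3):
                    out[i] += w * sample[i]
    return tuple(out)

# A trivial 2x2x2 identity LUT leaves colors unchanged:
identity = [[[(r, g, b) for b in (0.0, 1.0)]
             for g in (0.0, 1.0)]
            for r in (0.0, 1.0)]
print(apply_3d_lut((0.25, 0.5, 0.75), identity, 2))
```

In the DI context, the lattice values would be measured from actual release print stock, so that applying the LUT on a calibrated monitor previews how the graded image will look after film-out, removing the guesswork described above.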
The digital master is often used as a source for a DCI-compliant distribution of the motion picture for digital projection. For archival purposes, the digital master created during the digital intermediate process can still be recorded to very stable high dynamic range yellow-cyan-magenta (YCM) separations on black-and-white film with an expected 100-year or longer life. This archival format, long used in the industry prior to the invention of DI, still provides an archival medium that is independent of changes in digital data recording technologies and file formats that might otherwise render digitally archived material unreadable in the long term.
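Conceptually, making separations amounts to writing each color record to its own monochrome image, one per reconstruction dye, so that the full-color image can later be rebuilt without loss. The following Python sketch illustrates the split and recombination; the names and the nested-list image representation are illustrative only, and the conventional pairing of the red, green, and blue records with the cyan, magenta, and yellow dyes is assumed here.

```python
def make_separations(image):
    """Split an RGB image into three monochrome separation records.

    Each record is a 2D grid of single-channel values, named after the
    dye used in reconstruction: cyan from the red record, magenta from
    the green record, yellow from the blue record.
    """
    records = {"cyan": [], "magenta": [], "yellow": []}
    for row in image:
        records["cyan"].append([px[0] for px in row])     # red record
        records["magenta"].append([px[1] for px in row])  # green record
        records["yellow"].append([px[2] for px in row])   # blue record
    return records

def recombine(records):
    """Rebuild the RGB image by stacking the three records."""
    return [[(c, m, y) for c, m, y in zip(cr, mr, yr)]
            for cr, mr, yr in zip(records["cyan"],
                                  records["magenta"],
                                  records["yellow"])]
```

Because each record is a plain monochrome image on highly stable black-and-white stock, readability does not depend on any particular digital file format, which is what gives the separations their long-term archival value.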
Telecine tools to electronically capture film images are nearly as old as broadcast television, but the resulting images were widely considered unsuitable for exposing back onto film for theatrical distribution. Film scanners and recorders with quality sufficient to produce images that could be inter-cut with regular film began appearing in the 1970s, with significant improvements in the late 1980s and early 1990s. During this time, digitally processing an entire feature-length film was impractical because the scanners and recorders were extremely slow and the image files were too large for the computing power available. Instead, individual shots or short sequences were processed for special visual effects.
In 1992, Visual Effects Supervisor/Producer Chris F. Woods broke through several "techno-barriers" in creating a digital studio to produce the visual effects for the 1993 release Super Mario Bros. It was the first feature film project to digitally scan a large number of VFX plates (over 700) at 2K resolution. It was also the first film scanned and recorded at Kodak's newly launched Cinesite facility in Hollywood. The production was also the first feature film to use Discreet Logic's (now Autodesk) Flame and Inferno systems, which enjoyed early dominance as high-resolution, high-performance digital compositing systems.
Digital film compositing for visual effects was immediately embraced, while optical printer use for VFX declined just as quickly. Chris Watts further revolutionized the process on the 1998 feature film Pleasantville, becoming the first visual effects supervisor for New Line Cinema to scan, process, and record the majority of a feature-length, live-action Hollywood film digitally. The first Hollywood film to utilize a digital intermediate process from beginning to end was O Brother, Where Art Thou? in 2000; in Europe, the first was Chicken Run, released the same year.
The process rapidly caught on in the mid-2000s. Around 50% of Hollywood films went through a digital intermediate in 2005, increasing to around 70% by mid-2007. This is due not only to the extra creative options the process affords filmmakers but also to the need for high-quality scanning and color adjustments to produce movies for digital cinema.