Hollywood, The 4K Way

Eleven years ago, in the fall of 2000, the Sunday Arts & Leisure section of The New York Times published a long freelance article I wrote announcing the birth of digital cinema. Digital projection for large venues was mostly a dream at the time, but the technology existed and had been proven to provide satisfying images for the average moviegoer. Meanwhile, digital cinema’s biggest booster, filmmaker George Lucas, had just finished shooting Star Wars: Episode II—Attack of the Clones in 1080p/24-frame-per-second digital using a cutting-edge camera developed by Sony and Panavision. It was the first major motion picture to be shot entirely in video.

Although the article was mostly a rehash of developments reported in the trades, it carried the weight of my interview with Lucas on the benefits of shooting digital and discussed the improved viewing experience and the huge cost savings digital projection promised the studios. There were economic challenges to an industry-wide rollout, but the studios were scheming to secure the funding that would encourage mass adoption by theater chains. For his part, Lucas argued that, with the advent of 1080p/24 cameras, it was time for filmmakers to give up their beloved celluloid and join the digital age.

The story clearly stated that the technology was in its infancy and predicted many years would pass before film could be fully replaced for either image capture or projection, though my editors at the Times wrote a provocative headline and deck that suggested the changeover was more imminent. In any event, the reaction to the article among Hollywood creatives was quick and thunderous. For delivering the blasphemy that film was in its twilight, I actually received hate mail at the Times and was vilified on cinematographer bulletin boards in the online world’s equivalent of a hanging in effigy.

Needless to say, I was taken aback by this Luddite response, but looking back now with a decade of hindsight, it's easy to see that the filmmakers were right to be concerned. Despite those early pronouncements that the digital age had arrived, and those who later suggested that 2K resolution was more than enough for film capture, archival, and exhibition, we now know film is far more than an equivalent 2K medium. That point was driven home for me recently when a small team from Home Theater was invited to tour Sony Pictures' Colorworks post-production facility in Los Angeles. Tom Norton, our senior editor and video technical editor, Web editor Scott Wilkinson, and I got a close-up look at several 4K film projects currently under way and were treated to some demonstrations designed to show how regular high def compares with 4K on both the big screen and small monitors. What we saw astonished us. But I'll get back to that shortly.

Why Digital Cinema?
Before digital cinema, remarkably little had changed in a hundred years about the way movies were made and shown. Although there were advancements in screen formats, sound, color, and editing technology, movies were still captured on film. The raw footage was edited and eventually spliced together to make the master negative. Typically, that negative was used to make a positive print (called an interpositive) to check the final work and/or dupe negatives from which film prints could be made for distribution to movie theaters.

Consequently, by the time the movie got to your local theater, the print might be three optical generations down from the original negative, with a commensurate loss in detail and who knows what effect on color from the various chemical baths used to develop it. This print was then shown in a theater by a mechanical projector that, depending on its design and maintenance, might or might not align each frame in the gate and move the filmstrip along at the right speed to provide a stable, jitter-free image. Assuming all went well to this point, only the very first viewers got to see the print in its freshest form, because with each showing, it picked up more wear and dirt, until finally, by the tail end of a run, the scratches and specks would become obvious to the viewer. If you were unfortunate enough to live in a rural town, this was the only way you got to see the movie, as the secondary markets typically got a title only when the theaters in the big cities were done with their prints and passed them along secondhand.

Digital cinema changes all that, beginning with the cinematography. When capturing a movie digitally, you can shoot hundreds or even thousands of hours of raw footage cheaply, greatly freeing up the director and cast creatively. Digital capture eliminates the costs of rush-developing film dailies and lets you check a shot on the spot before moving to the next scene, making for a more efficient shoot. Editing and the melding of CGI effects are streamlined when the picture originates in the digital domain. Then, assuming a rigid exhibition standard is in place, if you watch a digitally captured movie in a theater with a digital projector, you will always see an exact replica of what the director signed off on, down to the pixel. No generational losses, no print wear. It is his or her vision brought to you in its most pristine form.

It doesn’t stop there, either. With digital projection, movie houses acquire the ability to promote alternative content. Have you been to the cinema lately for a high-def opera showing or a sporting event? Digital projection has largely enabled today’s resurgence of 3D. And most critically from the studio standpoint, digital projection allows the elimination of bulky film prints, which cost approximately $2,000 each to make and distribute. Going wholesale digital puts millions of dollars back in the studios’ pockets each year, and it benefits the environment by eliminating the chemical processing and disposal associated with movie prints.
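The scale of those savings is easy to sanity-check with rough arithmetic. The roughly $2,000-per-print figure comes from the article; the screen count and the per-site digital delivery cost below are illustrative assumptions, not reported numbers:

```python
# Back-of-the-envelope print-cost savings for one wide release.
# The ~$2,000 per-print figure is from the article; the screen
# count and digital delivery cost are illustrative guesses only.
COST_PER_FILM_PRINT = 2_000    # manufacture + distribution, USD
WIDE_RELEASE_SCREENS = 3_000   # hypothetical wide-release print run
DIGITAL_DELIVERY_COST = 150    # assumed cost to deliver one digital copy

film_cost = COST_PER_FILM_PRINT * WIDE_RELEASE_SCREENS
digital_cost = DIGITAL_DELIVERY_COST * WIDE_RELEASE_SCREENS
savings = film_cost - digital_cost

print(f"Film prints: ${film_cost:,}")    # $6,000,000
print(f"Digital:     ${digital_cost:,}") # $450,000
print(f"Savings:     ${savings:,}")      # $5,550,000
```

Even with generous assumptions about digital delivery costs, a single wide release recovers millions, which is consistent with the studios' enthusiasm for footing the conversion bill.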

From 2K to 4K
No one can argue with these benefits for either capture or exhibition. But as my opening anecdote implies, what has been a subject of debate since the early days of digital cinema—and remains controversial today—is whether the technology, on both fronts, is really ready to replace film. Modern film stock represents an ultra-high-resolution medium with the ability to capture anything from the deepest blacks to the brightest highlights, and there's a century of knowledge about how to best use it. George Lucas' pronouncements notwithstanding, today's Star Wars fans bemoan the fact that the last two movies in the series, Episodes II and III, were shot entirely in early-generation 1920 x 1080 high definition and therefore will never enjoy the benefit of a higher-resolution scan from film.

Nonetheless, 2K digital cinematography has taken steady hold in the last few years. It is widely accepted for shooting television dramas, and many successful and critically acclaimed movies have been captured digitally, including Slumdog Millionaire, the 2008 Academy Award winner for Best Picture.

The adoption of digital capture will likely quicken now with the advent of 4K resolution cameras and the rapid penetration of digital projection in theaters brought about by a studio program to bankroll new installations (more on this below). Manufacturers that make 4K motion picture cameras today notably include the Red Digital Cinema Camera Company, launched by Oakley founder Jim Jannard, and now Sony, which has just begun shipping its CineAlta F65. On our visit to Colorworks, we were shown a 4K projection of a short film-noir promotional piece called The Arrival, shot by cinematographer Curtis Clark to showcase the F65's capabilities. Although the images were strictly 2D, they exhibited a stunning clarity that tended to make them pop off the screen, along with tremendously wide contrast. Blacks were rich and deep, with no obvious crushing of shadow detail, and bright highlights had lifelike punch. Recent major motion pictures photographed in 4K include District 9 and The Social Network, both shot with the Red One camera. These newer cameras' ability to reproduce something akin to the full dynamic range of film could begin winning over any remaining film holdouts.
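The resolution labels tossed around here can be made concrete with pixel counts. A minimal sketch, using the standard DCI container sizes for 2K and 4K and the familiar 1080p HD frame:

```python
# Pixel counts for the formats discussed: 1080p HD capture
# (as used on Episodes II and III), DCI 2K, and DCI 4K.
formats = {
    "1080p HD": (1920, 1080),
    "DCI 2K": (2048, 1080),
    "DCI 4K": (4096, 2160),
}

for name, (w, h) in formats.items():
    print(f"{name}: {w} x {h} = {w * h:,} pixels")

# DCI 4K doubles both dimensions of DCI 2K, so it carries
# exactly four times the pixels:
ratio = (4096 * 2160) / (2048 * 1080)
print(f"4K / 2K pixel ratio: {ratio:.0f}x")
```

The jump from 2K to 4K is thus a fourfold increase in picture information, not a doubling, which helps explain why the difference was so visible in the Colorworks demonstrations.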

COMMENTS
loop7:

One of the more interesting and newsworthy of recent.

Rob Sabin:
Matt, your points are well taken. Resolution aside, one of the biggest differences between the 2K cameras and the new generation of 4K cameras is said to be the dynamic range, which is more related to bit depth. At least, I know Sony is claiming a greater ability in their new camera to capture shadow detail and deep blacks, and I saw some (admittedly color corrected) samples of live footage that looked exceptionally rich in this regard. It'll be up to the cinematographers and directors to decide if these new cameras get close enough to film to actually replace it, at least in most instances. We're in for some very interesting times, especially considering the somewhat rapid rate of adoption now of digital projection in theaters.
Jarod:

This was a very interesting and enlightening article. I learned a heap from it. I really enjoy reading about film and digital cinema. Love all you have brought to both this site and the magazine Rob! Cheers

jmedarts:

Read these articles last week, thought them very interesting and well done, but suspected it would be a while before I saw any benefits. I was wrong.

I saw "Hugo" yesterday in Bridgeport, CT. The FedEx commercial in the trailers was the usual washed out, soft garbage I have come to expect. Nowhere near as good as a Blu-ray in my home. But the Sony 4K logo came up before the feature, and WOW! What a picture! If you have not seen a 4K presentation, go find one, you owe it to yourself.

I know where I am going to see movies from now on, and I think I will be going to see more than I have in the past.
