Gordon’s World: A Video Premonition

Recently I completed one of my all too infrequent efforts to cull the herd of old magazines more than three years old; anything older than that worth finding is almost certainly available online. A few key older issues were kept for various reasons, and a complete set of the long-defunct Stereophile Guide to Home Theater is still hiding somewhere in the garage. But by accident I ran across my only remaining issue of Video Theater. Never heard of it? It was a magazine begun in the late ‘80s by J. Gordon Holt. Holt is best known as the founder of Stereophile magazine, which inspired a whole raft of competitors anxious to fill a pent-up demand for information on how equipment actually sounds, not just how it measures.

But Gordon was not only an audiophile. He was passionate about video as well, in an era when home video hadn’t yet moved much beyond the 21-inch CRT color TV. Video Theater was short-lived, but it was well served by Gordon’s unique observations and take-no-prisoners prose. The issue I found also had some pithy editorial observations about the road home audio had travelled, in Gordon’s estimation, and how that history might predict the future of video beyond mid-1990, when these words were written. Here they are:

Soapbox: Not Again, Please! If, as is claimed, history repeats itself, home video is primed for a sorry phase of contemporary evolution.

To date, video has been pretty well tracking the past course of audio, in whose early days professional studio equipment was the standard by which all else was judged. A mid-1940s audio system might have consisted of an RCA “postage-stamp” studio magnetic cartridge and arm, a Rek-O-Kut studio turntable, an Altec Lansing public-address amplifier, and an Altec Lansing, Stevens, or RCA studio monitor loudspeaker. (Only years later did home audio equipment start to surpass the professional stuff in performance.) Today’s reference video components are professional monitors from Tektronix, industrial laserdisc players from Pioneer and Sony, and pro VHS cassette machines from JVC and Panasonic.

In 1949, when the LP record was introduced, there was no such thing as “high-end” audio. There was high fidelity and there was low fidelity, but shortly thereafter the twain began to meet. A series of articles in the popular magazines—Life, Time, the Saturday Evening Post, and the now-defunct Collier’s weekly—reported on a burgeoning hobby called high fidelity, which had to be Good because its practitioners included much of the nation’s upper crust: doctors, lawyers, engineers, orchestra conductors, showbiz glitterati, and Mafia capos. Suddenly, the Great Unwashed wanted in too, but at no more than a tenth of the price the Aristocracy was paying. Of course, American enterprise rose to the challenge.

First off, all the cheap portable phonographs that had been knocking around for years were plastered with new labels declaring them to be “Genuine High Fidelity.” Firms that had been making dreadful little amplifiers for high-school “public address” applications slapped “High Fidelity” labels on their last year’s models, and magazines that reviewed new records sent letters to all their critics requesting that, in future, they comment on the fi of all the records they reviewed. Since the critics’ record players had all been made before anyone had even heard of high fidelity, they had no highs and no lows, so the record companies set about producing discs that “compensated” for the shortcomings of the “average record player.” This involved boosting the mid-treble and the mid-bass to add brilliance and body, and limiting the range of musical dynamics so the soft parts didn’t disappear under the surface noise and the loud parts didn’t cause the pickup to jump grooves.

Then there were the record producers whose 6-figure salaries obliged them to be creative, which meant not only using 84 microphones to record a 96-piece orchestra, but playing diddly-games with their balances so that, when the first oboist had a six-bar passage, it was “spotlighted” in order to draw it to the attention of listeners who might otherwise not have noticed that important musical event. And that was just for starters.

Shortly after, we were assailed by CBS’s “pre-distortion” system, which was supposed to cancel the tracking distortion of “average” pickups, RCA’s “Dynagroove” system which squoze all the dynamic range out of the music and added “dynamic equalization” to give the illusion of dynamic range, CBS’s CX noise reduction which was supposed to be compatible with systems lacking CX encoding but wasn’t, and Angel’s egregious re-mastering of EMI’s gorgeous recordings to add the “sparkle” (spelled “brilliance” or “screech”) demanded by American critics and record buyers. The result was music recording’s darkest hour, which happened instead to be a quarter century, during which some of the best performances by the world’s finest orchestras were reduced to utter dross, sonically unsalvageable by even the most sophisticated modern technology.

But this was all in the past, and it involved only desecration of sound recordings. What bearing could it possibly have on video? Simply, that it could not only happen again in video, it is already starting to.

The thing that ruined sound recording for 25 years was the popularity of “high fidelity,” which in those days was defined by the masses as “all the highs and all the lows.” The implication was, since highs and lows were a good thing, you couldn’t have too much of them. (Just like sex, money and power.) The more consumers that showed an interest in hi-fi, the more its quality standards were compromised in the name of LCD—the Lowest Common Denominator. No one responsible for making records gave two hoots up a bullfrog’s butt about quality by any existing standards; the name of the game was to produce records that had more highs and lows than the competition’s on the most average record players out there in consumerland. And who suffered most from it? The person who had shelled out the most money for an audio system that unabashedly reproduced what was on the record, that’s who.

In video today, manufacturers are just about where audio entrepreneurs were in 1960. Little light bulbs are going on above their heads, as they begin to realize that there’s a growing public out there that cares about color saturation, video signal-to-noise ratio, shadow detail, and resolution, but has little idea of and even less interest in how they are legitimately obtained. All it will take is a few market surveys to prove to the manufacturers’ satisfaction that the average viewer’s system is (1) weak in color, (2) noisy as hell, (3) totally unhinged in black level, and (4) so defocused you can barely see the scanning lines.

If history repeats itself here, the next step will be video equipment and software incorporating (undefeatable) corrections intended to make the picture look “better” to people who don’t know quality from quaalude. It’s already starting to happen.

Gordon went on for a few more paragraphs about how video could also jump off the rails. In the years just after he wrote this, it almost did, but it ultimately avoided audio’s rocky path. Audio’s early years were much like the Wild West, with few constraints beyond what worked. With video, however, production standards were established that were far more stringent than the few that had governed early audio. True, many consumers remain addicted to the over-bright, over-saturated images still available from today’s TVs through their Standard and Vivid picture modes. But video displays now also offer more accurate picture modes, exceptional adjustability, and controls to correct most errors for those willing to make the adjustments themselves or hire someone who can.

Today’s video landscape is more rewarding than might have been imagined when Gordon wrote his concerns. He lived long enough to see video tapes and laserdiscs replaced by DVDs and the earliest Blu-rays. But he passed on in 2009, so he never experienced 4K, HDR, or the most recent developments in flat-screen TVs such as quantum dots, OLEDs, and relatively affordable 65-inch and larger screen sizes. I suspect he might have resisted HDR at first, arguing that it violated the original vision of the filmmaker. But he likely would have relented eventually, aware that consumer video had largely bypassed the potholes he had feared.
