Calibration Training, AVR Connections, Frame Rates

Job Training
While looking for a good part-time job (hopefully one that will grow into full time), I ran into a guy who advised me to take a video-calibration technician course. Who would be best to teach this? I've heard that an ISF course is available through a school in Colorado. I've also heard you and Leo Laporte chat about THX in glowing tones, and I see online that they hold courses in Las Vegas and Dallas.

At this point, I'm as green as the grass, but I do have the interest. So where would you suggest I start? How much should I expect to pay, and how long does the training take? Do I need any additional background before attempting a calibration course?

Rich Ireland

Both THX and ISF offer video-calibration courses, and both are excellent. However, I've often likened them to drinking from a fire hose—the amount of information imparted in three days is staggering. Thus, I wouldn't recommend them for anyone who is "green as the grass."

Before taking either one, you need some grounding in video basics. Start by reading as much as you can find on the subject here on HomeTheaterMag.com and elsewhere online. In particular, read our TV reviews, and pay special attention to the Measurements section of each one, which talks about the technical aspects of setting up and measuring the set's video performance.

To get some practical experience, optimize the basic picture controls on as many TVs as you can—yours, your family's, your friends', etc. To do this, you need a good setup disc, such as High-Definition Benchmark on Blu-ray, which also includes some tutorial information, or Digital Video Essentials or Avia on DVD, though both of those are somewhat more complicated. You can also use any THX-certified DVD, which includes a section called THX Optimizer. Be sure to read my detailed description of the entire procedure here.

Otherwise, see if a local community college offers classes on video calibration, which will probably be longer and less intense than the THX and ISF courses. Perhaps you can hire a local calibrator to give you some private lessons, or maybe you can intern with him, doing some grunt work in exchange for watching him do calibrations.

As for cost, the THX Video 1 and 2 courses (three days total) are $1995, while ISF Level 1 and 2 (also three days total) are $1800. Neither set of courses is offered in a fixed location as far as I know; check the THX and ISF websites for their training locations and schedules. Then there's the cost of the equipment you'll need to do full calibrations, which can range from a couple of thousand to tens of thousands of dollars. (You don't need any of this equipment to do basic setup as discussed above.) It's expensive to become a full-fledged video calibrator, but I find it very rewarding to help people get the most out of their TVs.

The Best Connection
I can't see why I'd want to connect a source device's video output to my A/V receiver. The audio is a no-brainer, but as an old IT guy, I was always taught that the best connection is a direct connection. So cabling the video to my A/V receiver makes no sense to me, since it adds an unneeded device to the configuration. Also, I can't see how any A/V receiver would enhance the picture quality.

Steve Soricelli

Your concern is valid, but not because of any inherent problem in passing a video signal through another device. Many AVRs pass video signals without any degradation at all, but some do degrade the picture because their internal video processing is poorly implemented. In particular, some AVRs clip "above white" and "below black" (the signal headroom beyond reference white and below reference black), which is a deal breaker in my book.

I wouldn't say that an AVR can significantly enhance the picture quality, though many modern models can upconvert standard def to high def. Whether the AVR does this better than the player or display is another question; if another device does a better job, the AVR's video processor can often be bypassed.

How do you know if an AVR degrades the video or processes video poorly? Read our reviews, which include a section that tests these functions explicitly.
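For the technically curious, here's a minimal sketch in Python of what the above-white/below-black part of such a test boils down to. In 8-bit video levels, reference black is code 16 and reference white is code 235, so codes 1-15 and 236-254 are headroom that a transparent AVR should pass untouched. The bad_avr stand-in below is hypothetical, not any particular product, and this isn't how our lab actually measures it:

    # Minimal above-white/below-black clipping check (8-bit video levels:
    # reference black = 16, reference white = 235; 1-15 and 236-254 are headroom).
    def clips_headroom(process):
        """Return True if `process` destroys below-black or above-white codes."""
        ramp = range(1, 255)                     # all legal 8-bit video codes
        out = [process(v) for v in ramp]
        return min(out) >= 16 or max(out) <= 235

    bad_avr = lambda v: max(16, min(235, v))     # hypothetical AVR that clamps to 16-235
    print(clips_headroom(bad_avr))               # True -- a deal breaker
    print(clips_headroom(lambda v: v))           # False -- transparent pass-through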

Assuming that an AVR doesn't do anything untoward, the main reason to pass the video through it is convenience. An AVR is designed to be a central switcher for both audio and video, and it's a lot easier to select what you want to watch and listen to from one device rather than having to switch the AVR for audio and the TV for video. Granted, your universal remote can do that for you, but if it's not pointed in the right direction, one or the other device might miss the command, leading to much gnashing of teeth.

Update: As Dave Anderson points out in the comments below, HDMI connections carry both audio and video in one cable, so you have no choice but to send audio and video to the AVR in that case, unless the source device has two HDMI outputs. And other than using multiple analog cables, HDMI is the only way to get the highest-quality audio into the AVR, which is a good reason to choose this approach.

It's So Refreshing
If television signals in the US are either 30fps (okay, 29.97fps) or 24fps in the case of Blu-ray movies, why the fixation on flat-panel refresh rates? I understand that 3D needs 240Hz to essentially interleave the left and right perspectives in an otherwise "normal" 120Hz display, but what's the science and/or math behind needing to display a frame four times in 1/30th of a second (120Hz display rate / 30fps)?

David Shorrosh

First of all, while it is technically correct that US broadcast TV can be characterized as 30fps (frames per second), it is more accurate to say it is sent at a rate of 60 interlaced fields per second, and when the TV deinterlaces that signal, the result is 60 frames per second. Why isn't the result 30 frames per second? There are several reasons that are too complicated to get into here.
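If you're curious about the mechanics, here's a toy Python sketch of the simplest deinterlacing method, line-doubling (often called "bob"), just to show why 60 fields per second in yields 60 frames per second out. Real TVs use far more sophisticated motion-adaptive processing:

    # Toy "bob" deinterlacer: each field (half the scan lines) becomes a
    # full frame by repeating every line, so 60 fields/sec -> 60 frames/sec.
    def bob_deinterlace(fields):
        frames = []
        for field in fields:          # field = list of scan lines
            frame = []
            for line in field:
                frame.append(line)    # original scan line
                frame.append(line)    # doubled line fills the gap
            frames.append(frame)
        return frames

    # One second of 1080i: 60 fields of 540 lines each
    fields = [["line"] * 540 for _ in range(60)]
    print(len(bob_deinterlace(fields)), "frames")   # 60 frames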

Now, on to your question. The high frame rates to which you refer apply only to LCD-based flat panels, not plasmas. These high rates are designed to address the problem of motion blur in LCD TVs, and they are accompanied by a process called frame interpolation. Between each consecutive pair of deinterlaced frames, the TV creates new frames in which moving objects are placed where they would be if the camera had caught them at the higher frame rate to begin with.

With an interlaced input, a 120Hz TV creates one new frame between each pair of deinterlaced frames, while a 240Hz TV creates three new frames between each pair of deinterlaced frames. (Remember that there are 60 deinterlaced frames per second, not 30.) With a Blu-ray movie at 24p (24fps progressive), a 120Hz TV creates four new frames and a 240Hz set creates nine new frames between each pair of incoming frames.
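For those who like the arithmetic spelled out, the rule is simply (refresh rate / input frame rate) - 1 new frames between each incoming pair. Here's a quick Python sketch that reproduces the numbers above:

    # New frames an interpolating panel creates between each pair of
    # incoming frames: (refresh rate / input frame rate) - 1.
    def new_frames(refresh_hz, input_fps):
        return refresh_hz // input_fps - 1

    for refresh in (120, 240):
        for fps in (60, 24):    # 60 = deinterlaced broadcast, 24 = Blu-ray film
            print(refresh, "Hz at", fps, "fps:", new_frames(refresh, fps), "new frame(s)")
    # Prints: 120 Hz at 60 fps: 1    120 Hz at 24 fps: 4
    #         240 Hz at 60 fps: 3    240 Hz at 24 fps: 9

With interpolation switched off, the set instead simply repeats each incoming frame refresh_hz // input_fps times, which is the behavior described next.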

Of course, nothing is free, and frame interpolation causes its own problems. Chief among them is that it imparts a "video" look to movie content, to which many viewers object. Fortunately, frame interpolation can be disabled, but then each incoming frame is simply repeated as you note, which does nothing to improve motion blur. (In this case, each deinterlaced broadcast frame is displayed twice on a 120Hz set, not four times as you state.) Plasmas have no need for frame interpolation because their pixels respond to changes in brightness and color much faster than LCD pixels.

If you have a home-theater question, please send it to scott.wilkinson@sorc.com.

COMMENTS
Dave Anderson

Re: AVR connections, I disagree with the statement "..., the main reason to pass the video through it is convenience." Because audio and video are now carried on the same cable (using HDMI), the primary reason for passing through the AVR is to get the highest-quality sound. The lossless codecs need the bandwidth of HDMI (or an analog cable for each channel), so unless there are two HDMI outputs from the source (so one could go directly to the display and one to the AVR), the HDMI cable needs to be routed through the AVR to get the highest-quality audio formats.

CJLA

Hi Rich Ireland, I'm assuming that you want to take a video-calibration course so you can offer your services, right? I would consider going to CEDIA.org, the governing body of the audio/video industry. From there, I would search for a local CEDIA-certified custom integrator by zip code, then call a couple of these companies and ask if you could go in the field with them to see firsthand what it is like. Calibration is a small subset of A/V and home automation, and you might get turned on to other things that you may not be aware of. In any case, I think it's a great way to get exposure in this industry.

Chris

Chad

My question is: why is frame interpolation required at all? Plasmas do not have motion blur, and LCDs should now be able to avoid it in a similar way. What I mean is that LCD pixels can now refresh at 120Hz or 240Hz. LCD TVs should be able to flash an image, go dark by turning off the backlight, and then flash the next image. This should produce the same effect as a plasma flashing its image, without frame-interpolation algorithms. I know that Samsung has a system on some of its LED TVs that flashes the LED backlights off and on in between images, but viewing their TVs, it doesn't seem to fix the issue completely enough to make the image look as clear as plasma. I just want to be able to watch hockey without blur--is that too much to ask? And I don't want to buy a plasma because they use too much energy in my opinion.

Erik in Wisconsin

I don't think the answers are lining up with Steve's question. He is talking about connecting his TV OUTPUT to his receiver. The answers are about running source output through the receiver with the video output then going to the TV. I am set up this way myself (everything through the receiver and then to the TV) and totally understand that. However, that is not what Steve asked. I don't know why one would connect TV OUTPUT to the receiver. Is there a reason to do that? Maybe the question just wasn't stated that clearly and he means what the answers are addressing.

Scott Wilkinson

Dave, good point—as long as you're using HDMI, in which case I agree with you.

Scott Wilkinson

CJLA, great suggestion! Though I wouldn't call CEDIA the governing body of the audio/video industry; rather, I'd call it the trade association of the custom-installation industry.

Scott Wilkinson

Chad, many LCDs do flash the backlight as you describe to decrease motion blur, but I don't know of any that do it like plasmas, which flash each 60Hz frame up to 10 times (this is what Panasonic's 600Hz subfield rate is about). It's an interesting idea that I'll talk to some LCD companies about. In any event, plasma pixels can change their state much faster than LCD pixels, which is why plasmas generally have better motion detail.

Scott Wilkinson

Erik, I see your point. I was operating under the assumption that Steve was talking about the video signal from a source device to the TV, but I didn't edit the question sufficiently to reflect that. I know of no TVs with a video output, so trying to connect a TV's video output to an AVR makes no sense. I'll re-edit the question. Thanks for pointing that out!

Zap Andersson

Note, though, that "blur" on an LCD is just as much a psychological effect as an "actual" effect, due to the temporal nature of the signal. Most LCDs simply show the entire image for the entire temporal duration of the frame and then switch, whereas an old CRT actually flashes the image at your eyes for 1/50th (1/60th) of a second. The human mind *perceives this differently*, because the human mind likes to fill in gaps. The CRT has the gap that the mind can fill, whereas the LCD holds the image over the entire frame interval, not giving the mind the ability to create the "filled-in" frames.

Unfortunately, people have bought the myth that higher frame rates are "better," which is just crazytalk. My opinion as a film nut: true 24fps FTW. Frame interpolation, though, is hellspawned evil, and honestly, anyone implementing this in their TVs should be keelhauled... this is NOT a solution.

/Z

Steve Soricelli

I'm currently using the 3 optical connections I have available for the audio from my Blu-ray, DirecTV, & cable sources. I'm under the impression that these are digital, not analog, connections, so I would think this is no better or worse than HDMI. Anyway, my current receiver is an older model that does not have any HDMI connectivity. My original question was to hopefully justify replacing my older receiver, but I am not convinced, since I didn't see any reference to optical vs. HDMI.

Scott Wilkinson

Steve, if your question is the difference between optical and HDMI, that's a different story. Optical TosLink is limited to 2-channel PCM and multichannel Dolby Digital and DTS (lossy, not the new lossless codecs), whereas HDMI can carry multichannel PCM as well as the Dolby TrueHD and DTS-HD Master Audio lossless codecs. (I've seen some references that suggest TosLink has been upgraded to carry multichannel PCM, Dolby TrueHD, and DTS-HD MA, but I suspect that the protocol it uses, S/PDIF, is still limited as described above.)

As far as I know, cable and satellite use regular Dolby Digital for the most part, so for those sources, there is no difference between optical and HDMI. However, Blu-ray uses high-resolution multichannel PCM, Dolby TrueHD, and/or DTS-HD MA, so HDMI offers a significant benefit in that case.
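As a quick sketch, the compatibility logic boils down to a simple lookup; this Python snippet just restates the format limits described above (the names and structure here are my own shorthand, not any real API):

    # Which audio formats each connection can carry, per the limits above.
    CARRIES = {
        "toslink": {"2ch PCM", "Dolby Digital", "DTS"},
        "hdmi":    {"2ch PCM", "Dolby Digital", "DTS",
                    "multichannel PCM", "Dolby TrueHD", "DTS-HD MA"},
    }

    def needs_hdmi(fmt):
        return fmt in CARRIES["hdmi"] and fmt not in CARRIES["toslink"]

    print(needs_hdmi("Dolby Digital"))   # False -- optical is fine for cable/satellite
    print(needs_hdmi("Dolby TrueHD"))    # True  -- lossless Blu-ray audio needs HDMI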

Steve Soricelli

If I understand this correctly, it sounds like the only benefit I'd get presently is the enhanced audio from my Blu-ray player. Since I'm happy with the audio today, I'll probably just wait until my receiver dies before replacing it with a 21st-century model. I appreciate your responses & taking the time to answer my questions.

Chad

Scott & Zap, I understand both of your responses, and I agree. I think maybe I did not state my solution to LCD blurring clearly. Like Zap said, CRTs and plasmas flash an image, whereas LCDs hold the image until the next frame is refreshed, so the brain cannot fill in the gap like it does for CRTs and plasmas. To remedy this, LCD makers have been upping their refresh rates to 120Hz, 240Hz, etc., but no matter how fast the refresh, the image is not flashed. As Scott said, some LCD makers do flash the backlights, but I've seen that the blurring issue persists. Maybe it persists because the backlight flashing isn't long enough to leave space for the brain to fill in the gap. LCD makers are employing interpolation algorithms when the brain is perfectly able to interpolate the motion itself. The easy fix is to have the LCD pixels refresh an image, flash the backlight so the eyes can see it, then go black for enough time before the next image refresh for the brain to do its interpolation (say, at 60fps). Am I making sense?

Luke

I have an Integra DHC 9.9, and my question is: do I calibrate the TV and the processor, or do I just calibrate one and set the other to halfway? Thanks! Newbie here. Love your mag. Luke
