New Display Measurement: Input Lag

Starting with the September issue (and now, online), we're adding a new measurement to our objective TV/projector tests. It's called "input lag," and while it's not as important as contrast ratio or color accuracy (which we already test for), it's a key metric for gamers and for anyone who notices issues with "lip sync."

So here's what it is, how we test for it, and what, if anything, you can do about it.

The What

To see an image on your TV/projector's screen, a number of steps must occur. Let's skip over the source parts (Blu-ray player, cable/satellite box, etc.) and start with the moment the signal hits the television. To create the image, there are multiple steps of processing, including (but not limited to) deinterlacing, scaling, noise reduction, color and gamma management, and in some cases, frame interpolation. Though most of these steps can happen fairly quickly, none are instant. If the processing power inside the TV isn't sufficient, there can be many milliseconds of delay between the moment the image is received by the television and the moment it actually gets out to your eyeballs.
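To make the "none are instant" point concrete, you can think of the signal chain as a series of small delays that stack up. Here's a toy sketch; the stage names come from the list above, but the millisecond figures are invented purely for illustration:

```python
# Toy model of a TV's processing pipeline: each stage adds a little delay.
# The per-stage timings below are invented for illustration, not measurements.
pipeline_ms = {
    "deinterlacing": 8,
    "scaling": 5,
    "noise reduction": 10,
    "color/gamma management": 4,
    "frame interpolation": 33,  # interpolation may wait on a full future frame
}

total_ms = sum(pipeline_ms.values())
print(f"Total processing delay: {total_ms} ms")  # -> Total processing delay: 60 ms
```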

Most TVs do a good enough job that with movies and TV shows, the delay is hard or impossible to notice. Even in cases where it's bad, many receivers offer lip-sync adjustments that delay the audio to match up with the video.

However, this band-aid won't work with games. Many games rely on precisely timed actions by the player. Take a first-person shooter, for example. The player notices an enemy on screen, presses the trigger button, and expects the weapon to fire where the enemy appears on-screen at that moment. But if the display is delayed by even a few dozen milliseconds, the enemy is in a different location than what the player sees. This lag can result in far more missed shots than the player's skill would dictate. The problem is exacerbated in multiplayer games. There's already a delay between the player and the game server (50 ms, if you're lucky). Add 50-100 ms of "lag" between what your computer is doing and what your eyeballs see, and the total delay gets significant. It may not sound like much, but it's very noticeable.
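To put those numbers in perspective, here's a quick back-of-the-envelope calculation. The figures are illustrative, not measurements of any particular setup:

```python
# How the delays stack up for an online game (illustrative figures only).
network_latency_ms = 50       # round trip to the game server, if you're lucky
display_lag_ms = 75           # middle of the 50-100 ms range mentioned above
frame_time_ms = 1000 / 60     # one frame at 60 fps is about 16.7 ms

total_ms = network_latency_ms + display_lag_ms
frames_behind = total_ms / frame_time_ms
print(f"Total delay: {total_ms} ms (~{frames_behind:.1f} frames at 60 fps)")
# -> Total delay: 125 ms (~7.5 frames at 60 fps)
```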

I've noticed this personally. Until a few years ago, I steadfastly refused to change my computer monitor from an aging 24-inch Sony CRT. The near-instantaneous response between my PC and the monitor was a never-considered given. Then I switched to doing all my gaming in my theater: much bigger screen, more comfortable seating.

However, I immediately started to suck. In multiplayer games, I dropped from "among the leaders" in a given game, to "mid-pack, if I was lucky." I attributed this to a softening of my skills due to lack of practice and/or age.

Yeah... not so much. Sure, part of it is less playtime, but a much larger part was adding significant (and unnoticed) additional lag to the game. I could have been at my best, and I'd still be at a disadvantage because other players were getting a 50-100 ms head start on anything I was doing.

Having figured out the issue, and having done "testing" using faster displays, I can definitely say input lag can be a real issue.

The Measurement

The problem is, since this is such a new measurement, how do we come up with an objective way to test it? Previously, there were two methods: eyeballing a spinning test pattern from a Blu-ray (inherently subjective), or an expensive and time-consuming process involving a camera and a CRT.

Then this guy Leo Bodnar came up with a simple and inexpensive lag tester. For a little over $100, the appropriately-named LagTester is a battery-operated, handheld measurement device:

It has two main parts: a photo sensor and a pattern generator attached to an HDMI output. The LagTester creates a flashing pattern on the display's screen (image at top). You hold the LagTester in front of one of these flashing blocks, and it tells you how much lag there is between the image it's creating and the image it's sensing; the device generates this number itself and displays it in real time. Easy. As you can see, there are three boxes at different places on the screen. For our measurement numbers, we average these together.
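Note that the three boxes won't read identically: most displays draw the picture from top to bottom within each refresh, so the bottom box typically reads a bit higher than the top one, which is why we average them. Here's a minimal sketch of that arithmetic, with made-up readings:

```python
# Average the LagTester's three readings (top, middle, bottom boxes).
# The sample readings here are hypothetical, for illustration only.
readings_ms = {"top": 38.2, "middle": 41.5, "bottom": 44.8}

average_ms = sum(readings_ms.values()) / len(readings_ms)
print(f"Reported input lag: {average_ms:.1f} ms")  # -> Reported input lag: 41.5 ms
```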

We, along with CNET's David Katzmaier, HDTVTest.co.uk's David Mackenzie, and Chris Heinonen of Secrets of Home Theater and HiFi (and other publications), have been collaborating for several months to come up with some baseline measurements, to make sure we don't get wacky numbers or, worse, random outliers due to technology (or testing) differences.

Another member of our braintrust is Adeel Soomro of DisplayLag.com, whose entire website is dedicated to input lag and the testing of TVs. It's worth checking out, as it's quite a resource.

To give you an idea of the numbers we're talking about, here are a few measurements from some displays we've reviewed (or are reviewing) here at S+V:
Panasonic ST60: 74.6 ms
Panasonic ZT60 (upcoming review): 63.9 ms; 41.5 ms in Game Mode
Samsung F8500 (upcoming review): 131.1 ms; 100.7 ms in Game Mode

JVC DLA-X35: 82 ms
Epson 750 HD: 38.4 ms

We sent these review samples back before we had a chance to test them, so here are numbers for the same displays as measured by the crew mentioned above:
Epson 5020: 95.1 ms
BenQ W1070: 33.67 ms
Sony KDL-55HX850: 119.83 ms; 44.27 ms in Game Mode

Lower is better, but how good is good? That's a bit more subjective. Soomro considers 0-20 ms "Excellent," 21-41 ms "Great," 42-62 ms "Okay," and anything higher "Bad." For us, it's not so clear-cut. Nearly all the TVs and projectors we review here at S+V are going to fall in the Okay and Bad categories, so we're on a different scale. That's because most of the displays with really low input lag are computer monitors or small TVs, not the big, high-end, feature-rich TVs and projectors we review. It's a tradeoff between picture quality and input lag, and how much you want to trade either way depends mostly on what you're going to do with the TV.
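For what it's worth, Soomro's scale maps directly to a simple lookup. Here's a minimal sketch using the thresholds above (the function name is ours, just for illustration):

```python
def rate_input_lag(lag_ms: float) -> str:
    """Map a measured input lag to Adeel Soomro's rating scale."""
    if lag_ms <= 20:
        return "Excellent"
    elif lag_ms <= 41:
        return "Great"
    elif lag_ms <= 62:
        return "Okay"
    return "Bad"

# A couple of the measurements from above:
print(rate_input_lag(38.4))   # Epson 750 HD -> Great
print(rate_input_lag(131.1))  # Samsung F8500 -> Bad
```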

Hopefully, TVs will improve now that multiple publications are holding manufacturers' feet to the fire.

Can you do anything?

Generally speaking, the more processing a TV has to do, the higher its input lag will be. This isn't to say that one technology or feature (like 240 Hz) necessarily makes one TV faster or slower than another. Turning off any extra processing can help, but not always. Only testing will show that.

Almost universally, though, Game Mode helps a lot.

Game Mode shuts off many of the more time-consuming processes. The cost of this lower input lag, however, is worse picture quality. This can range from inaccurate colors to more video noise to any number of other issues. Is it worth the trade-off? Maybe. That largely depends on how much input lag matters to you compared to absolute picture quality. Check out "What is Game Mode" for more info.

Bottom Line

For most people, input lag isn't going to be an important metric of a TV's overall performance. For certain gamers, though, it's far more important. I say "certain" because many games don't require twitch-level accuracy. First-person shooters and fighting games certainly do, but real-time strategy games, MMORPGs, and even racing games don't require instantaneous response times.

Going forward, if we see any interesting trends with this new measurement, we'll let you know.

The input lag measurement will be part of the Test Bench section of each review, starting with the ZT60 and F8500 plasma TV reviews in the September issue (and all displays reviewed online from this point on).
