Sony's Flat Screen Page 3
Color temperature (Custom mode/Warm 2 color temperature):
20 IRE: 5,871 K
30 IRE: 6,477 K
40 IRE: 6,422 K
50 IRE: 6,436 K
60 IRE: 6,419 K
70 IRE: 6,407 K
80 IRE: 6,361 K
90 IRE: 6,307 K
100 IRE: 6,294 K
Brightness (100-IRE window): 35.5 ftL
Primary Color Point Accuracy vs. SMPTE HD Standard
Color | Target X | Measured X | Target Y | Measured Y
Of the Sony's three picture presets, Custom delivered the most accurate color reproduction when the Warm 2 color temperature was also selected. After I tweaked the user picture adjustments, the set's grayscale tracking was within ±193 K of the 6,500-K standard from 30 to 90 IRE, and brightness was just above 35 ftL - an impressive showing for an 11-inch tabletop TV! Color-decoder tests revealed a relatively severe -25% red error, while green was -5%. Compared with the SMPTE HD spec for digital TV colors, the set's red, green, and blue color points all showed moderate levels of oversaturation.
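As a quick sanity check on the grayscale numbers above, here is a short Python sketch (illustrative only, not part of the review's test procedure) that computes the worst-case deviation of the measured color-temperature readings from the 6,500-K reference:

```python
# Measured color-temperature readings (IRE level -> kelvin),
# taken from the measurement table above.
READINGS = {
    20: 5871, 30: 6477, 40: 6422, 50: 6436, 60: 6419,
    70: 6407, 80: 6361, 90: 6307, 100: 6294,
}

TARGET_K = 6500  # D65-referenced 6,500-K grayscale standard


def max_deviation(readings, low_ire, high_ire, target=TARGET_K):
    """Largest absolute deviation from the target over [low_ire, high_ire]."""
    return max(
        abs(kelvin - target)
        for ire, kelvin in readings.items()
        if low_ire <= ire <= high_ire
    )


print(max_deviation(READINGS, 30, 90))   # -> 193
print(max_deviation(READINGS, 20, 90))   # -> 629 (the 20-IRE reading is the outlier)
```

The 20-IRE reading sits well below the others, which is why low-end grayscale error is usually reported separately from the midrange tracking figure.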
Overscan measured 0% for 1080i-format high-def signals with the +1 Display Area setting selected, and 4% with the Normal setting. The Sony displayed 480-line DVD test patterns with full resolution, and there was no sign of edge enhancement. Picture uniformity overall was excellent: Both black and white full-field test patterns showed no sign of brightness drop-off at the corners of the screen, and gray full-field patterns were free of color tinting at all IRE levels. The set delivered the most accurate gamma with its Gamma mode set to Off.
With its CineMotion mode set to Auto, the Sony passed the film-detail test on the Silicon Optix HQV test DVD when displaying 480i-format signals via HDMI. It did fail several of the "jaggies" tests on that disc, however. The XEL-1's standard and MPEG noise-reduction modes worked well, smoothing out background noise without reducing picture detail on both standard- and high-def programs.