Full on/full off contrast ratio, at least as we measure it, uses a 100-IRE (full white) field and a 0-IRE (full black) field, and only after the display has been calibrated. The most notable reason contrast ratio is a useless statistic is that you can only compare this number to other displays we've measured. There is no standard way to measure contrast ratio, so everyone else's numbers are going to be different. But take even our own numbers: when are you ever going to see, or need to see, the difference between a full white screen and a completely black screen? Oh, right. Isn't that why there's ANSI contrast ratio?
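The arithmetic here is simple division. A sketch, using hypothetical meter readings (the luminance values below are made up for illustration, not measurements of any real display):

```python
def contrast_ratio(white_luminance, black_luminance):
    """On/off contrast ratio: full-white field luminance divided by
    full-black field luminance (both in the same units, e.g. cd/m^2)."""
    return white_luminance / black_luminance

# Hypothetical post-calibration readings:
white = 120.0   # 100-IRE full field, cd/m^2
black = 0.04    # 0-IRE full field, cd/m^2

print(round(contrast_ratio(white, black)))  # 3000, reported as 3000:1
```

Note that the ratio hides the two numbers that produced it: a dim projector with a very low black level can post the same 3000:1 as a bright one with an elevated black level.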
ANSI contrast ratio is measured with eight black boxes and eight white boxes arranged on screen in a checkerboard pattern. Even this, though, is flawed. The average picture level (APL) of a TV show or DVD movie isn't anywhere near the 50% that the ANSI pattern tests. Some research has shown that most TV shows average less than 20% APL.
So how about a test that uses the ANSI checkerboard, but with white boxes that are, say, 40-IRE? This would average out to an overall picture level of 20%, around what most television programming is: a realistic representation of the contrast ratio you're likely to see when actually watching the display. We could still print the black level and the light output (of course).
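The APL arithmetic above is just a weighted average of the signal levels: the checkerboard is half "white" boxes and half black, so the pattern's APL is half whatever level the bright boxes are set to. A minimal sketch (treating APL as the average of the signal levels in IRE, which ignores display gamma):

```python
def checkerboard_apl(white_ire, white_fraction=0.5):
    """Average picture level of a two-level checkerboard: the bright
    boxes' signal level weighted by the fraction of screen they cover.
    The remaining area is 0-IRE black and contributes nothing."""
    return white_ire * white_fraction

print(checkerboard_apl(100))  # standard ANSI pattern: 50.0 (% APL)
print(checkerboard_apl(40))   # proposed 40-IRE pattern: 20.0 (% APL)
```

Dropping the bright boxes to 40 IRE is what pulls the pattern down to the ~20% APL that real programming tends to sit at.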
A good idea? Maybe. It would be only marginally more useful than any of the other contrast ratio specs, as you'd still only be able to compare it to other displays we've measured. Also, how different would it be from the numbers we already print? If you take the ANSI contrast ratio as a worst-case scenario, and the full on/full off contrast ratio as a best case, you get a pretty good picture of what's going on with the projector. The black level and light output numbers are more useful anyway.
Of course, this doesn't solve the problem of not being able to compare anyone else's numbers to ours or ours to theirs. That seems to be a problem for another time.