[Tig] Monitor calibration

Martin Euredjian ecinema
Tue May 31 10:24:34 BST 2005


Over the last few weeks I've had discussions with a number of people on this
subject.  Some were formal (taking measurements as part of the discussion)
and some less formal.  After all of this I've concluded that there's such a
range of opinion on what is "correct" that I'm beginning to wonder how we
got here.

I thought it might be interesting to put this out and see what develops.
How do you setup your monitors and why?  What are you looking for?  Why?

Here are some data points:

  Units are fL, with cd/m2 (which is equal to nits) in parentheses.
  [TV] means that the end product will be for TV transmission.
  [MP] means that the end product will be for theatrical release.
  The notes are just smart-ass remarks to provide some context.

[TV] Old SMPTE?:     32 to 35 (109.63 to 119.91) 
[TV] The "standard"?: 30 (102.78)
[TV] Better resolution?: 26 to 28 (89.08 to 95.93)
[TV] If you use the Sony calibration probe:  20 (68.52)
[MP] Theaters are not bright: 12 to 16 (41.11 to 54.82)
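
For reference, the conversion between the two units is a fixed constant:
1 fL = 3.4262591 cd/m2 (nits).  A minimal sketch in Python, just to make
the arithmetic explicit (the exact constant is the only assumption here):

    # 1 foot-lambert (fL) = 3.4262591 cd/m^2 (nits)
    FL_TO_NITS = 3.4262591

    def fl_to_nits(fl):
        """Convert foot-lamberts to cd/m^2 (nits)."""
        return fl * FL_TO_NITS

    def nits_to_fl(nits):
        """Convert cd/m^2 (nits) back to foot-lamberts."""
        return nits / FL_TO_NITS

    # fl_to_nits(30)    -> ~102.8  (the "standard" white point above)
    # nits_to_fl(68.52) -> ~20.0   (the Sony probe setting)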

From one end of this scale to the other there is a huge factor-of-two spread
in white-point light output.  The [MP] case is understandable, of course,
but how about the other cases?

Notice I didn't mention black levels above.  The situation is even more
confusing there.  Some seem to think that you need to set black to a
beam-off condition, which, I'll stick my neck out and say, is absolutely
wrong.  We can get into that later.

Others like to use the famous PLUGE (Picture Line-Up Generating Equipment)
pulse.  The problem with this, of course, is that it is just about the worst
in terms of repeatability.  I think it is fair to say that no two people
will set the PLUGE pulse the same way.  Everyone can repeat the theory, but,
in practice, it just doesn't work as well as an instrument.  Then, of
course, there are those who will set a level at 10 or 15% video and let
blacks fall where they may.  That level may be on the order of 0.1 fL to
0.2 fL (0.343 to 0.685 nits) or so, regardless of whether it is set at 10%
or 15%.

Yet another issue is that many don't know it is time to throw away their
colorimeters and purchase spectrophotometers.  A colorimeter cannot be used
for metrology across different display technologies (say, a CRT against a
DLP projector) and will invariably produce significant measurement errors
there.  A spectral instrument is of vital importance here.
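
To make the "why" concrete, here is a minimal sketch of what a spectral
instrument actually does: it measures the display's spectral power
distribution and integrates it against the CIE 1931 color matching
functions to get XYZ.  A filter-based colorimeter only approximates those
functions with physical filters, so its error depends on the spectrum of
the display under test, which is exactly why it falls apart across
dissimilar technologies.  The array contents and the 5 nm sampling below
are placeholders, not real data; the CMF tables would come from the
published CIE data.

    import numpy as np

    # Wavelength grid, 380-780 nm in 5 nm steps (81 samples)
    wavelengths = np.arange(380.0, 785.0, 5.0)

    # Measured spectral power distribution of the display.
    # Placeholder: a flat spectrum; a real probe supplies this.
    spd = np.ones_like(wavelengths)

    # CIE 1931 2-degree color matching functions on the same grid.
    # Placeholders here; load the published CIE tables in practice.
    xbar = np.zeros_like(wavelengths)
    ybar = np.zeros_like(wavelengths)
    zbar = np.zeros_like(wavelengths)

    # Tristimulus values: integrate the SPD against each CMF.
    # k = 683 lm/W makes Y an absolute luminance in cd/m^2 (nits).
    k = 683.0
    X = k * np.trapz(spd * xbar, wavelengths)
    Y = k * np.trapz(spd * ybar, wavelengths)
    Z = k * np.trapz(spd * zbar, wavelengths)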

Finally, to exacerbate the black-level problem and color perception in
general, viewing environments are not consistent from location to location.
I've seen $30,000 instruments used to calibrate monitors where a $15 light
bulb provides the reference viewing environment.

From a color-perception standpoint, none of the above calibration
methodologies is necessarily wrong.  Each viewing environment requires a
different approach to image calibration.  This is also a place where color
science doesn't have all the answers yet.  There are lots of equations
available for everything except some of these perceptually-driven issues.
The more likely case is that there simply isn't a single "correct" (whatever
that means) answer to this question.

Any thoughts on this, perhaps to add to the confusion?

None of the above is critical of the very smart folks with whom I had these
conversations.  In fact, as more of this unfolded, I found that they wanted
an answer as much as I did.  Let's see what comes out of this post.

I've cross-posted to CML and TIG to see what the two worlds opine on this
matter.

BTW, please don't turn this into yet-another-CRT-vs-LCD debate.  That's not
even remotely the intent here.  There is, however, room for a
CRT-vs-Projector calibration debate.


Thanks,

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Martin Euredjian
eCinema Systems, Inc.
voice: 661-305-9320
fax: 661-775-4876
martin at ecinemasys.com
www.ecinemasys.com
 
 





