Does anything special need to be done to linear-intensity images before displaying them on a DICOM calibrated display?

StackOverflow: https://stackoverflow.com/questions/5040387

Question

I have some code which renders RGB images from a physical simulation. These images have a linear intensity scale, so they must be gamma corrected before display on a normal PC monitor. It's easy enough for my application to apply the necessary power law at some point in its display pipeline (I generally use an exponent between 1.6 and 2.2, chosen on a fairly ad hoc basis: whatever I think looks best).
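
For concreteness, here is a minimal sketch of that power-law step in NumPy (the function name and default exponent are illustrative, not taken from the original code):

    import numpy as np

    def gamma_encode(linear_rgb, gamma=2.2):
        """Apply the power-law (gamma) encoding step for a conventional monitor.

        linear_rgb: float array of linear intensities in [0, 1].
        gamma: assumed display exponent; the encode applies the inverse power 1/gamma.
        """
        clipped = np.clip(linear_rgb, 0.0, 1.0)  # guard against out-of-range values
        return clipped ** (1.0 / gamma)

    # Example: linear mid-gray (0.5) encodes to about 0.73 at gamma 2.2.
    print(gamma_encode(np.array([0.5])))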

Now it is likely that in future the application may be run by users with DICOM calibrated displays. It is entirely unclear to me how these differ from a normal PC monitor (other than being in some way "more accurate"). Is there a particular gamma value that should be used, or some completely different response function, to reproduce the original linear-intensity image reasonably accurately on such a display?

Solution

The definitive reference on the topic is the DICOM standard itself: PS 3.14, the Grayscale Standard Display Function (GSDF), which defines the response a DICOM calibrated display is expected to follow.
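
As a sketch of what that standard specifies: PS 3.14 defines luminance as a function of a Just-Noticeable-Difference (JND) index via a rational polynomial fit to the Barten model, and a calibrated display's LUT is built so that equal steps in input value produce equal numbers of JNDs. The coefficients below are the ones published in PS 3.14; the helper name is my own:

    import numpy as np

    def gsdf_luminance(j):
        """DICOM Grayscale Standard Display Function (PS 3.14).

        Maps a JND index j (1 <= j <= 1023) to luminance in cd/m^2,
        using the published rational-polynomial fit to the Barten model.
        """
        a, b = -1.3011877, -2.5840191e-2
        c, d = 8.0242636e-2, -1.0320229e-1
        e, f = 1.3646699e-1, 2.8745620e-2
        g, h = -2.5468404e-2, -3.1978977e-3
        k, m = 1.2992634e-4, 1.3635334e-3
        x = np.log(j)
        num = a + c * x + e * x**2 + g * x**3 + m * x**4
        den = 1 + b * x + d * x**2 + f * x**3 + h * x**4 + k * x**5
        return 10.0 ** (num / den)

    # The full JND range spans roughly 0.05 to 4000 cd/m^2:
    print(gsdf_luminance(1), gsdf_luminance(1023))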

OTHER TIPS

Looking at this document:

http://www.docstoc.com/docs/6460598/White-Paper-DICOM-Display-calibration

it appears that at least some brands of display are calibrated (using a LUT) to an effective gamma of 1, i.e. the calibrated display responds linearly to its input values.
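
To make "calibrated using a LUT" concrete, here is a hedged sketch of building a LUT that linearizes a display. The pure power-law native response assumed here is illustrative; a real calibration inverts measured luminances, and a DICOM calibration targets the GSDF rather than gamma 1:

    import numpy as np

    def linearizing_lut(native_gamma=2.2, levels=256):
        """Build an 8-bit LUT that makes a power-law display behave as gamma 1.

        native_gamma: assumed native response of the panel (illustrative).
        Sending value v through the LUT yields luminance proportional to v,
        because (v ** (1/native_gamma)) ** native_gamma == v.
        """
        v = np.linspace(0.0, 1.0, levels)
        # Pre-distort by the inverse response so LUT followed by the
        # panel's native gamma is linear end to end.
        return np.round((v ** (1.0 / native_gamma)) * (levels - 1)).astype(np.uint8)

    lut = linearizing_lut()

If the display really is linearized end to end like this, the consequence for the rendering pipeline is simply that the application-side power-law step becomes the identity (exponent 1.0) and the linear-intensity image can be sent as-is.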

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow