So I have a question for those with experience in color management: As I understand it, the color calibration utility built into Mac OS X (and similar tools) first measures the display's response curve, i.e. the luminance displayed for a given value output by the graphics card. This is done separately for the three colors red, green, and blue. In the next step, one picks a target color temperature and a target gamma.

Leaving the color part aside for a moment and concentrating only on luminance: as I understand it, the measured response curves and the target gamma together define a mathematical transformation. The color management system uses this transformation to ensure that, for the luminance value of a pixel stored in a file, the appropriate luminance with respect to the selected gamma is displayed on screen.

If the file to be displayed has a color profile embedded, that profile can itself contain a gamma (2.2 for sRGB, for example). If this gamma differs from the monitor's target gamma, the color management system should correct for that as well. If that is the case, the picture should look the same on the calibrated monitor regardless of the selected target gamma.

From my observation, this seems to hold in Photoshop: the picture is displayed identically on the same monitor when two color profiles differing only in their target gamma are used. In Mac OS X Preview, however, the way the picture is displayed changes with the selected target gamma. So there is obviously a difference between the color management implementations of the two programs.

Is one of them wrong? Is color management simply incomplete in Preview? Or does Apple assume that OS X users always use a fixed target gamma (probably 1.8), although in that case it's strange that they offer a choice in their calibration utility? Or am I missing a point somewhere here?
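To make my understanding concrete, here is a minimal sketch of the math I have in mind. This is not Apple's actual implementation; the function names, the assumed raw display gamma of 2.4, and the specific values are all my own illustrative assumptions. The point is that if a color-managed application decodes with the file's gamma and re-encodes with the display's target gamma, the final luminance should not depend on which target gamma was chosen during calibration.

```python
def measured_response(v):
    """Pretend measured display response: luminance for an input v in [0, 1].
    Assumption for illustration: the raw display behaves like gamma 2.4."""
    return v ** 2.4

def calibration_lut(target_gamma, steps=256):
    """Step 1 of calibration as I understand it: build a lookup table so the
    calibrated display follows target_gamma. For input v we want luminance
    v ** target_gamma, so we invert the measured response curve."""
    return [(i / (steps - 1)) ** (target_gamma / 2.4) for i in range(steps)]

def display_value(file_value, file_gamma, display_gamma):
    """What a fully color-managed app (Photoshop, per my observation) should
    send to the calibrated display: decode with the file profile's gamma,
    re-encode with the display profile's gamma."""
    linear = file_value ** file_gamma        # e.g. sRGB-like gamma 2.2
    return linear ** (1.0 / display_gamma)   # e.g. target gamma 1.8 or 2.2

# Regardless of the chosen target gamma, the luminance reaching the eye
# from a correctly calibrated display comes out the same:
for target in (1.8, 2.2):
    v = display_value(0.5, file_gamma=2.2, display_gamma=target)
    luminance = v ** target  # the calibrated display applies target gamma
    print(round(luminance, 6))
```

Both iterations print the same luminance, which is exactly the gamma-independent behavior I see in Photoshop but not in Preview, hence my question.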