
VirtualRain

macrumors 603
Original poster
Aug 1, 2008
6,304
118
Vancouver, BC
I understand the histogram... it's a visual representation of the distribution of pixels at different brightness levels.

In this photo example (borrowed from another forum), the histogram accurately represents what is being displayed, and therefore, you would assume, what the camera's sensor captured. You would think the highlights are blown and unrecoverable - they're all pegged to the right, at the brightest white your screen can display.

However, after adjusting the exposure etc., you can easily get a picture and histogram that display nicely (see the other image). So information you would think was pegged or clipped was not, in fact, clipped.

So while the histogram is an accurate representation of what is displayed, it's not an accurate representation of what data is there and what is actually clipped... is it?

If not, then why don't we have the tools at our disposal to tell when image data is truly clipped vs being displayed as clipped?

Or am I confused? :confused:
 

Attachments

  • blown.png (190.5 KB)
  • recovered.png (329.2 KB)

fitshaced

macrumors 68000
Jul 2, 2011
1,741
3,632
I understand the histogram... it's a visual representation of the distribution of pixels at different brightness levels.

In this photo example (borrowed from another forum), the histogram accurately represents what is being displayed, and therefore, you would assume, what the camera's sensor captured. You would think the highlights are blown and unrecoverable - they're all pegged to the right, at the brightest white your screen can display.

However, after adjusting the exposure etc., you can easily get a picture and histogram that display nicely (see the other image). So information you would think was pegged or clipped was not, in fact, clipped.

So while the histogram is an accurate representation of what is displayed, it's not an accurate representation of what data is there and what is actually clipped... is it?

If not, then why don't we have the tools at our disposal to tell when image data is truly clipped vs being displayed as clipped?

Or am I confused? :confused:

My understanding is that an exposure is simply data gathering. If you overexpose, you are gathering more data than if you underexpose. If you shoot in raw, the data collected is not simply what you see, but what can be decoded by your photo editing software. I like to think of raw files like onions: you can peel layers off and still have an onion.
 

VirtualRain

macrumors 603
Original poster
Aug 1, 2008
6,304
118
Vancouver, BC
To help better illustrate my point, I believe this is the situation, as I've illustrated below. The normal histogram we see is misleading in that it implies data is clipped when it is not. I guess I'm wondering why we can't have a proper RAW histogram that actually shows when data is really being clipped. Is this technically impossible for some reason, are camera and software companies lazy, or am I the only one who sees a need for this? :D
 

Attachments

  • Histograms.png (23.2 KB)

fitshaced

macrumors 68000
Jul 2, 2011
1,741
3,632
To help better illustrate my point, I believe this is the situation, as I've illustrated below. The normal histogram we see is misleading in that it implies data is clipped when it is not. I guess I'm wondering why we can't have a proper RAW histogram that actually shows when data is really being clipped. Is this technically impossible for some reason, are camera and software companies lazy, or am I the only one who sees a need for this? :D

I think I misunderstood your question
 

paolo-

macrumors 6502a
Aug 24, 2008
831
1
I by no means know the answer and am by no means competent in the matter, but I think it's because of the way RAW files are made. Current Canon cameras use 14-bit raw files, yet a camera like a 7D spreads its dynamic range over something like 12.x bits (I'm not sure where I read that; citation needed). So if the RAW converter showed you the full 14 bits, you could have a clipped image that doesn't look clipped on the histogram. Also, the dynamic range of the camera isn't set in stone: you know how you can still get data from an overexposed image, but it's ugly and noisy? Where do you draw that line? That line also tends to move with the ISO at which the image was shot, and probably with temperature (not to mention between cameras of the same model).

So I guess the manufacturers have the choice of showing you a histogram with an arbitrarily set limit, or just showing you what the developed file looks like.
 

wonderspark

macrumors 68040
Feb 4, 2010
3,048
102
Oregon
I totally understand the question, but totally don't know the answer. I'm totally dying to find out, though! Proceed, all! :p
 

TheReef

macrumors 68000
Sep 30, 2007
1,888
167
NSW, Australia.
If not, then why don't we have the tools at our disposal to tell when image data is truly clipped vs being displayed as clipped?

Or am I confused? :confused:

Correct; dragging the exposure slider moves the histogram, which represents only the visible pixels.

Many (most?) DSLRs also display a JPEG histogram... not very useful when shooting RAW. Some people work around this, to squeeze out more usable dynamic range, by playing with their white balance settings. The downside is that you end up with green RAW files that need to be corrected in post. See UniWB.

I see what Aperture does as more useful, as it's the final result I'm concerned about: fitting all that RAW dynamic range into a range which is more easily visible.

You can (kind of) make it act like a RAW histogram, though, by moving the slider around to bring the invisible RAW data into view of the histogram.
You'll see a vertical drop-off at the clip point; as you drag the exposure left (in the case of overexposure), that drop-off moves left, indicating the RAW exposure's upper limit.
The same idea applies, in reverse, for the RAW file's lower clip point.

Extreme example, I guess I let a little too much light in with that f/0 lens ;) :p

[Image: screenshot20130107at528.png]
 

ChrisA

macrumors G5
Jan 5, 2006
12,581
1,697
Redondo Beach, California
The problem is more complex, and there are a few different cases:

1) The JPG file is "clipped" and has some pure white areas while the raw file is not clipped

2) The raw file itself has clipped pixels because the charge in the pixel saturated the A/D converter

3) Both cases above can have clipping in one, two, or all three color channels

I think the most common case is to have one channel in the raw file "blown out" and the JPG clipped to white in that area.

My (rather old) Nikon D200 actually shows both the JPG histogram and three color histograms, so I can see if just one of the colors is blown out.

The real problem is caused by the need to move between different color spaces. What do we do if the raw file has a color that is not in the sRGB color space? Do we "warp" the colors into sRGB with some kind of non-linear function? That is what film did. Negative film did this very well; slides simply blew out, just like digital.
 

VirtualRain

macrumors 603
Original poster
Aug 1, 2008
6,304
118
Vancouver, BC
The problem is more complex, and there are a few different cases:

1) The JPG file is "clipped" and has some pure white areas while the raw file is not clipped

2) The raw file itself has clipped pixels because the charge in the pixel saturated the A/D converter

3) Both cases above can have clipping in one, two, or all three color channels

I think the most common case is to have one channel in the raw file "blown out" and the JPG clipped to white in that area.

My (rather old) Nikon D200 actually shows both the JPG histogram and three color histograms, so I can see if just one of the colors is blown out.

The real problem is caused by the need to move between different color spaces. What do we do if the raw file has a color that is not in the sRGB color space? Do we "warp" the colors into sRGB with some kind of non-linear function? That is what film did. Negative film did this very well; slides simply blew out, just like digital.

Agreed, but it doesn't explain why we're only given a JPEG histogram to work with in any and all tools.

Correct; dragging the exposure slider moves the histogram, which represents only the visible pixels.

Many (most?) DSLRs also display a JPEG histogram... not very useful when shooting RAW.

Yeah... My question now is... why do you think we only get to see the visible pixels on a histogram?

I can obviously understand why you would want to see the distribution of visible pixels, but it would seem equally important when shooting or working with RAW to see the whole range of pixels.
 

Edge100

macrumors 68000
May 14, 2002
1,562
13
Where am I???
The histogram represents the in-camera JPEG conversion - which has a contrast curve applied to it, not the Raw file - which has linear gamma.

The camera is "squashing" your highlights into a JPEG.
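Edge100's point can be sketched numerically. The toy function below uses a plain 2.2 gamma as a stand-in for a real in-camera picture style (which also adds an S-shaped contrast curve, so the real effect is even stronger); all values here are illustrative, not any manufacturer's actual math.

```python
# Toy sketch: how a gamma/contrast encoding can make unclipped linear
# sensor data crowd the top of an 8-bit JPEG histogram. The plain 2.2
# gamma here is an illustrative stand-in for a real camera tone curve.

def linear_to_jpeg(raw_value, raw_max=16383, gamma=2.2):
    """Map a linear 14-bit sensor value to an 8-bit JPEG level."""
    linear = raw_value / raw_max          # 0.0 .. 1.0, linear light
    encoded = linear ** (1.0 / gamma)     # gamma encoding lifts highlights
    return round(encoded * 255)

# A bright but unclipped value, one stop below sensor saturation:
print(linear_to_jpeg(8192))               # -> 186

# Values within roughly the top 1/8 stop all land within a few JPEG
# levels of pure white, so the JPEG histogram piles them at the right:
for raw in (15000, 16000, 16383):
    print(raw, linear_to_jpeg(raw))       # -> 245, 252, 255
```

The upshot: a value a full stop below saturation already sits at 186/255 after encoding, which is why the displayed histogram looks far more "right-heavy" than the linear data really is.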
 

VirtualRain

macrumors 603
Original poster
Aug 1, 2008
6,304
118
Vancouver, BC
The histogram represents the in-camera JPEG conversion - which has a contrast curve applied to it, not the Raw file - which has linear gamma.

The camera is "squashing" your highlights into a JPEG.

Yeah, thanks.

My question now is, why can't we get a histogram for the RAW file? Is there some technical issue or is it just a lack of demand, need, or desire?
 

kevinfulton.ca

macrumors 6502
Aug 29, 2011
284
1
Yeah, thanks.

My question now is, why can't we get a histogram for the RAW file? Is there some technical issue or is it just a lack of demand, need, or desire?

There might be some technical issues, like processing speed, but I tend to side with lack of demand. Most photographers (even pros) only use the camera preview as a quick reference (it's a 3-inch screen, after all). Most (myself included) will just give it a quick look to double-check white balance, and check the histogram to avoid clipping.
 

neutrino23

macrumors 68000
Feb 14, 2003
1,881
391
SF Bay area
Maybe you are seeing the RAW histogram. Think of it this way: the RAW data is, strictly speaking, not an image. It is a database of numbers from which an image is extracted. That image is then displayed on screen using some values for exposure, brightness, contrast, etc.

Maybe the image in question had the display values set too high for some reason. Try this out: take an image which has good exposure everywhere but which includes bright and dark areas. Now run up the exposure to view the dark areas nicely. At this point the histogram will show data rammed up to the right, and the bright areas will be blown out, but the data is still there to recover the highlights.

So maybe the question is not why you can't see a RAW histogram, but why your camera chose a default set of presentation values that seemed to overexpose some areas.
 

Laird Knox

macrumors 68000
Jun 18, 2010
1,956
1,343
There are several factors to consider here.

JPEGs are 8 bits per channel and RAW may be 12 or 14 bits per channel. That means as few as 256 shades per channel (8-bit) or as many as 16,384 (14-bit). So for any given channel, a 14-bit file gives you 64 times the information to work with.

So a single step in 8-bit (say, from 254 to 255) actually spans 64 steps in 14-bit. This is still going to look blown out on the back of the camera or on the monitor. Essentially everything in this range will show as white (combining all three channels) because the values are so close to each other. Even high-end LCDs do not support 14-bit color.

What you are doing when you recover those highlights is spreading out the values. They are all close to pure white but not actually the same value, so as you increase the recovery you are widening the step between shades. In an extreme case this will show up as banding or posterization as you spread the values too far.

Another factor is that the shades are not evenly distributed across the capture: each stop in the shadows contains fewer levels than each stop in the highlights. This is why it is often easier to recover the highlights than the shadows, and why you frequently hear people recommend that you "expose to the right." In other words, an image that is slightly overexposed is easier to correct than one that is slightly underexposed.

This also demonstrates why a calibrated workflow is needed to get truly accurate results.
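The arithmetic behind the post above can be sketched quickly (a simplified model that assumes a perfectly linear sensor and ignores noise):

```python
# Simplified arithmetic: 14-bit raw vs 8-bit JPEG, and why a linear
# sensor records far more levels per stop in the highlights than in
# the shadows (noise ignored).

RAW_LEVELS = 2 ** 14    # 16384 levels in a 14-bit file
JPEG_LEVELS = 2 ** 8    # 256 levels in an 8-bit JPEG

# Raw levels hiding inside a single 8-bit step, on average:
print(RAW_LEVELS // JPEG_LEVELS)    # -> 64

# Levels captured by each of the top five stops of a linear sensor:
upper = RAW_LEVELS
for stop in range(1, 6):
    lower = upper // 2
    print(f"stop {stop} below saturation: {upper - lower} levels")
    upper = lower                    # each stop down halves the count
```

The loop prints 8192, 4096, 2048, 1024, 512: the single brightest stop holds half of all recordable levels, which is the whole case for "expose to the right."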
 

VirtualRain

macrumors 603
Original poster
Aug 1, 2008
6,304
118
Vancouver, BC
There might be some technical issues, like processing speed, but I tend to side with lack of demand. Most photographers (even pros) only use the camera preview as a quick reference (it's a 3-inch screen, after all). Most (myself included) will just give it a quick look to double-check white balance, and check the histogram to avoid clipping.

I suspect you're right, but we should start demanding better tools. Wouldn't it be helpful to have the choice of histograms presented in post #3? So you could tell what is clipped in the JPEG, what is recoverable from RAW data, and what is clipped in RAW?

Maybe you are seeing the RAW histogram. Think of it this way: the RAW data is, strictly speaking, not an image. It is a database of numbers from which an image is extracted. That image is then displayed on screen using some values for exposure, brightness, contrast, etc.

Exactly. What you are seeing on a normal histogram, like the one of the elephant in the first post, is: "If we display this data on your screen, this is how it will look" (i.e. mostly blown out). BUT what it doesn't tell us, or what's hidden, is that there is a lot of usable data there that is not actually clipped at all.

So maybe the question is not why you can't see a RAW histogram, but why your camera chose a default set of presentation values that seemed to overexpose some areas.

Well, the camera is presenting the data just as you asked it to be captured (i.e. overexposed), but what the camera doesn't tell you is that there is still perfectly good data there. That's where a RAW histogram with the full dynamic range would be helpful.

There are several factors to consider here.

...

Yeah, thanks. I understand now why and how there's more usable data there than the histogram shows... I'm just wondering why there's no way to know it. We have a histogram for the displayed data, but not for the captured data.
 

VirtualRain

macrumors 603
Original poster
Aug 1, 2008
6,304
118
Vancouver, BC
Let me summarize what I think is missing now...

I've annotated this illustration from post #3...

The top histogram is the tool provided to us by our cameras and photo editing software. It represents how the image data will be displayed, or is currently being displayed. This is useful, as it shows that information is getting clipped for display.

I think what a lot of us would find very helpful, however, is the bottom histogram, which represents how the image was captured by the sensor. This would be particularly useful to know in the field, so we could tell whether we needed filters to reduce the dynamic range hitting the sensor or could do that work in post.
 

Attachments

  • Histograms.png (31.1 KB)

acearchie

macrumors 68040
Jan 15, 2006
3,264
104
Let me summarize what I think is missing now...

I've annotated this illustration from post #3...

The top histogram is the tool provided to us by our cameras and photo editing software. It represents how the image will be displayed.

I think what a lot of us would find very helpful, however, is the bottom histogram, which represents how the image was captured by the sensor...

A good example of this is in the film industry.

High-end video is normally captured as flat as possible to allow the most latitude in post. A lot of directors' monitors have the option to view the data as Rec. 709, which looks much more like the colours and contrast we are used to on TV. The reason they do this is that the shots the camera is recording are ridiculously flat, and it can be difficult to really take in the image.

Here is a rubbish example; my search terms aren't working so well as it's almost 1am!

[Image: alexalooks-comparison.png]


I would think this isn't implemented in digital cameras because probably less than 1% of users would want it, and it would just be a feature that a lot of people would find confusing.

For me it's much easier to monitor using the current histogram, make sure nothing is clipping, and know I have a little leeway anyway. It also shows me what colours pop and by how much, something you wouldn't really get from the flat image.

Also, don't forget that the histogram is actually completely correct as it stands: it is the histogram of the preview JPEG generated from the main raw file. If it were to show a raw histogram, it would then have to show the completely untouched flat photo as well.
 

Laird Knox

macrumors 68000
Jun 18, 2010
1,956
1,343
This is not correct. There isn't more information outside (to the left and right of) the displayed histogram; there is more information contained within what is displayed. The curve on your "normal histogram" is trying to represent 256 values in 8-bit, or 16,384 in 14-bit.

Yeah, thanks. I understand now why and how there's more usable data there than the histogram shows... I'm just wondering why there's no way to know it. We have a histogram for the displayed data, but not for the captured data.
I don't think you do.

Say the histogram is 100 pixels wide on your screen. That means each column on the histogram must represent about 2.5 (8-bit) or about 164 (14-bit) shades. Each of those 164 shades has its own value, but the single-pixel-wide column must average all of them somehow. Think about it: the rightmost 10 columns actually cover more than 1,600 shades. To get a fully accurate histogram it would have to be 16,384 pixels wide. :eek: I would need seven 30" (2560x1600) monitors to display that.

While the histogram may be based on the JPEG, it is still pretty accurate. The 8-bit JPEG displays the 14-bit raw values 16,382 and 16,383 both as white (255). The difference between those two values is about 0.006%: pretty much white in both cases. When you pull the highlights back, you are changing the difference between them from 0.006% to, say, 1%. Now you can start to see the difference between each level of white.

Of course, you have to throw away some of the data to make room for the "recovered" values. That is why you can only do so much to recover an image that has not been exposed properly.
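The binning described here is easy to demonstrate. A quick sketch with made-up sizes (a 100-column on-screen histogram over 14-bit data):

```python
# Sketch of on-screen histogram binning: a 100-column display of 14-bit
# data folds about 164 distinct raw levels into every column, so raw
# values that differ measurably all land in the same "white" column.

COLUMNS = 100
RAW_LEVELS = 16384   # 14-bit: values 0..16383

def column_for(raw_value):
    """On-screen histogram column that a raw level falls into."""
    return min(raw_value * COLUMNS // RAW_LEVELS, COLUMNS - 1)

print(RAW_LEVELS / COLUMNS)       # -> 163.84 raw levels per column

# Distinct near-saturation values are indistinguishable on screen:
for raw in (16250, 16300, 16383):
    print(raw, column_for(raw))   # all land in column 99
```

So a "spike at the right edge" of the display can stand for over a hundred different raw levels, only the topmost of which is truly at sensor saturation.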
 

VirtualRain

macrumors 603
Original poster
Aug 1, 2008
6,304
118
Vancouver, BC
This is not correct. There isn't more information outside (to the left and right of) the displayed histogram; there is more information contained within what is displayed. The curve on your "normal histogram" is trying to represent 256 values in 8-bit, or 16,384 in 14-bit.


I don't think you do.

Say the histogram is 100 pixels wide on your screen. That means each column on the histogram must represent about 2.5 (8-bit) or about 164 (14-bit) shades. Each of those 164 shades has its own value, but the single-pixel-wide column must average all of them somehow. Think about it: the rightmost 10 columns actually cover more than 1,600 shades. To get a fully accurate histogram it would have to be 16,384 pixels wide. :eek: I would need seven 30" (2560x1600) monitors to display that.

While the histogram may be based on the JPEG, it is still pretty accurate. The 8-bit JPEG displays the 14-bit raw values 16,382 and 16,383 both as white (255). The difference between those two values is about 0.006%: pretty much white in both cases. When you pull the highlights back, you are changing the difference between them from 0.006% to, say, 1%. Now you can start to see the difference between each level of white.

Of course, you have to throw away some of the data to make room for the "recovered" values. That is why you can only do so much to recover an image that has not been exposed properly.

OK, but the fact is there's a difference between "clipped by the sensor" and "clipped for display", and that gap represents useful data that COULD be displayed, as I've illustrated.

EDIT: The RAW histogram I'm talking about provides the room to stretch out those values, as you say, so you can see what's there.
 

Laird Knox

macrumors 68000
Jun 18, 2010
1,956
1,343
But those values don't exist outside of the histogram - they are compressed within the histogram.
 

VirtualRain

macrumors 603
Original poster
Aug 1, 2008
6,304
118
Vancouver, BC
So I've done a bit of digging around and came up with a few applications that offer a RAW histogram in addition to a regular (or mapped) histogram...

Rawnalyze
UFRaw
Rawtherapee

Rawnalyze is no longer around, but the docs are still there and they make for an interesting read...
http://www.oitregor.com/numeric/Rawnalyze/Rawnalyse_doc/datas/Histogram.htm

[Image: HistogramLegendRaw.gif]


I actually downloaded RawTherapee in my Windows VM and ran it on the elephant shot I was playing with in post #1 (see attachments). As expected, it was very revealing: the RAW histogram shows no clipping of the sensor data, whereas the mapped histogram is a totally different story.

So, a RAW histogram does exist, it is useful, and it would be great if cameras implemented this (IMHO).
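For anyone curious, the core of what these tools compute is simple: histogram the sensor values straight out of the raw file, before any tone curve, and count how many sit at the sensor's saturation point. Here's a rough pure-Python sketch using synthetic data (real tools read the actual values via LibRaw or similar; the numbers below are invented):

```python
# Rough sketch of what a RAW-histogram tool computes: bin the linear
# sensor values before any tone mapping, and count true sensor clipping.
# The samples here are synthetic; real tools get them from the raw file.

def raw_histogram(data, white_level, bins=256):
    """Histogram of linear sensor values, plus a count of saturated pixels."""
    counts = [0] * bins
    for value in data:
        bin_index = min(value * bins // (white_level + 1), bins - 1)
        counts[bin_index] += 1
    clipped = sum(1 for v in data if v >= white_level)
    return counts, clipped

samples = [1000, 8000, 16000, 16383]    # 14-bit samples, one at saturation
counts, clipped = raw_histogram(samples, white_level=16383)
print(clipped)        # -> 1 truly clipped pixel
print(sum(counts))    # -> 4 (every sample binned)
```

The key difference from a JPEG histogram is that the x-axis here is the sensor's own linear scale, so "pegged right" means actual saturation, not just display white.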
 

Attachments

  • Regularhistogram.png (267.9 KB)
  • RAWhistogram.png (297.9 KB)

kallisti

macrumors 68000
Apr 22, 2003
1,751
6,670
I think there are several issues at play.

(1) Displayed histograms are generated from an in-camera JPG and not the RAW data.
(2) The "combined" or "white" histogram actually comes only from the green channel in most DSLRs. So it is possible that information is present in the red or blue channels that you aren't seeing in the "clipped" channel (which is really only the green channel). In the reverse of the scenario you are presenting, this can cause exposure problems: the histogram shows "adequate" exposure while another channel is actually blown out (most commonly the red channel, in my experience).

To my knowledge, the only body that shows a histogram reflective of the RAW data and not a JPG produced in-camera is the Leica M Monochrom. As an added bonus, the Leica MM histogram also displays the histogram with lines reflective of the zone system, which is probably more important given it is a B&W only sensor. On the plus side, the histogram is actually reflective of the RAW data seen by the sensor. On the minus side, there is zero tolerance if you blow out the data on either side. Shooting with the Leica MM (from experience): what you see on the histogram is what you get. If you clip the highlights or the shadows they are truly gone.

Not sure why this feature isn't present on all digital cameras.
 

VirtualRain

macrumors 603
Original poster
Aug 1, 2008
6,304
118
Vancouver, BC
Update... I found another RAW program, called RAW Digger, that runs on the Mac and offers a nice high-fidelity RAW histogram. It shows that some very small parts of this elephant photo are clipping in the green channel (likely the white fixtures on the wall).

It seems to show a full stop more data than a regular mapped histogram would.

I think this could be a great tool for learning where your camera sensor starts to clip.

Am I out in left field here or is this at all interesting to anyone else? :D :eek: :confused:
 

Attachments

  • rawdigger.png (443.1 KB)

VirtualRain

macrumors 603
Original poster
Aug 1, 2008
6,304
118
Vancouver, BC
To my knowledge, the only body that shows a histogram reflective of the RAW data and not a JPG produced in camera is the Leica M Monochrom. As an added bonus, the Leica MM histogram also displays the histogram with lines reflective of the zone system, which is probably more important given it is a B&W only sensor. On the plus side, the histogram is actually reflective of the RAW data seen by the sensor. On the minus side, there is zero tolerance if you blow out the data on either side. Shooting with the Leica MM (from experience): what you see on the histogram is what you get. If you clip the highlights or the shadows they are truly gone.

Not sure why this feature isn't present on all digital cameras.

Interesting... and I totally agree!

EDIT: I read that the Leica M offers an 11EV histogram (which makes sense if it covers the full range of a modern sensor). This compares to 8EV in the RAW Digger application and, as I understand it, 5-6EV (?) for a normal JPEG histogram.
 