I understand what you are saying. If I recall correctly, the CRTs started decaying almost from day one. It was a small change each month (perhaps barely noticeable even with precise measurements), but it was steady and constant, more or less a straight line. So one calibrated often to bring the display back toward the target.
Then the LCDs came along. They stay stable for a much longer time, most of their designed lifespan. But when they start to go, they go quickly. Frequent calibration is supposed to catch an LCD as it starts its death throes.
I don't know about the LED backlights. Maybe the backlight doesn't matter.
But as I said above, it's not just the monitor that needs calibration, it's the light in the environment as well. Frequent calibration will catch those changes too. I have a Spyder that is used to calibrate the monitor, but mostly it reminds me to set the lighting in my office to the "medium" level it was last calibrated at.
Cheers
Well, the decay of a CRT depended on its usage: if you used the red pixels a lot, those phosphors would decay faster than the others, and regular calibration helped keep everything on the same level. An LCD does not really experience such usage-specific decay, so I think the display as a whole is more stable.
Your comment about ambient light is understood, but I am a little up in the air about it. Given that a monitor essentially produces its own light (via the backlight, or CRT illumination, etc.), how much does ambient light really interfere with the actual output of the display? I can see there being some effect, but how much? Would it be noticeable to the naked eye, or even measurable by a device? I would be curious to know. Ambient light can affect your eye's interpretation of the monitor, but I think calibrating is meant to set your monitor's output to a known standard, not to compensate for your eye's subjective interpretation.
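For what it's worth, you can get a rough back-of-the-envelope answer to the "is it measurable?" question. A matte screen coating behaves roughly like a Lambertian reflector, so the luminance it adds from room light is approximately illuminance × reflectance / π. The numbers below are illustrative assumptions, not measurements (typical office lighting around 300 lux, a matte LCD reflecting on the order of 4%, and a plausible LCD black level):

```python
import math

def reflected_luminance(ambient_lux, reflectance):
    """Approximate luminance (cd/m^2) added by a matte (Lambertian)
    screen surface reflecting ambient room light."""
    return ambient_lux * reflectance / math.pi

# Assumed, illustrative values -- not measurements of any real display:
office_lux = 300     # typical office lighting level
reflectance = 0.04   # ~4%, plausible for a matte LCD coating
black_level = 0.30   # cd/m^2, plausible LCD black

added = reflected_luminance(office_lux, reflectance)
print(f"Reflected ambient light: {added:.2f} cd/m^2")   # -> about 3.82
print(f"Display black level:     {black_level:.2f} cd/m^2")
```

Under those assumptions the reflected room light is several cd/m² sitting on top of a black level of a fraction of a cd/m², i.e. easily measurable by an instrument and visible in shadow detail, even though it's negligible next to a 100+ cd/m² white. So ambient light does physically land on the faceplate, not just on your retina.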
IMHO though, a good foundation for a color-reliable environment is being able to recreate a consistent, repeatable lighting situation whenever you are working, and then calibrating to that. So, for example, close the blinds and use interior light only when you work on pictures, so the ambient lighting is not changing with the time of day. I don't think the right solution is to re-calibrate your monitor every evening because it's darker outside, then re-calibrate again the next morning when the sun is up.
I'm not sure what you're referring to when you say that when LCDs go, they go quickly. Since the "colored" elements of an LCD don't really change over the lifespan of the device (they are just color filters tinting the backlight coming through), the only real element that changes is the backlight. This is why I say that CRTs needed more constant calibration due to their variable decay (the same problem would exist in an OLED display, at least with current technology), whereas LCDs stay more stable over time because everything is decaying at a more equal rate. There may be a drop in brightness over time, but the nature of fluorescent light is that the spectral output is pretty stable too, AFAIK.
I guess I can only add my anecdote: since I have two monitors, I can qualitatively judge the difference re-calibrating makes by doing it to one monitor and not the other. Assuming both are equal to start with, if I calibrate my external display, I can see whether it is visibly different from the laptop display (which is still using the older calibration). Even after a few months, I see little to no difference.
I think in the end 90% of the battle is getting your display calibrated up front; the other 10% is in using higher-end spectrophotometers vs. colorimeters, or doing regular calibrations to keep up with display aging.
Ruahrc