Actually...
In the professional print world you are adjusting your output device to match the proof the customer supplied (it's the customer you are trying to please, not your ego). That proof is then sent to the press room, where they use the keys on the press to match the customer's copy.
Ergo: your monitor really doesn't matter unless you are using a display on press, and even then it's what comes off the press that matters, and that's handled on press.
For video professionals.
Unless you are supplying final output for the exact monitor you're producing your video on (or one calibrated to match it), the monitor you're using is a moot point.
If it's for TV display, it's going to be shown on everything from tube to plasma. If it's for the Web, it's going to be viewed 99% of the time on cheap monitors.
So, please...
If anyone in this room can show any logic that proves this wrong, please explain it. I've been doing this in both arenas for years.
Your logic is irrefutable, MacOldTimer--but do you offer it to support the argument that the new ACDs are fine for "pro" graphic designers and videographers, or do you side with those who think the cheaper Dell monitors mentioned here are better options? Not sure which way your "logic" swings on this.
I worked for a time with a publishing company and did some graphic design work for 4-color publications (brochures and catalogs). We always ran proofs with an Epson printer and included those with our electronic submissions for press people to adjust color to. We'd okay their proofs before actually going to press.
All of our monitors were complete HP Pavilion pieces of crappola.
I also did event videography for a while, where the final product was DVD and streaming web. When doing web stuff, you more or less have to play things down the middle, as monitor profiles vary so widely. Output for DVD was a little more standard, but color was not so much about dead-on accuracy as it was a series of cinematic, creative choices. I would review the final output on several different systems (conventional TVs, LCD screens, laptop monitors, etc). A decently calibrated monitor gets you into the ballpark.
I suppose the only time you really need super-accurate calibrated monitors is when doing high-end video editing for film output. Then you are dealing with more universal standards of final display (projection).
I do not mean to say, from this or my previous posts, that Apple Cinema Displays (LED/IPS versions) are the best deal on the planet. Certainly people can have personal tastes leaning towards or away from the new design. But the claim that "glossy screens = Apple HATES professionals" is pretty much hogwash.
Plenty of rich 'n' trendy graphic design professionals are going to put the new ACDs to productive use and make a lot of money with them. Apple does not hate "professionals." They love them--especially the rich 'n' trendy ones! Those are the people who LIKE a new look and eco-friendly designs and will gladly "overpay" for them on a feature-per-$ comparison basis.
I'm NOT a "Kool-Aid drinker" and I'll NOT be buying one for myself anytime soon unless I win the lottery. But some of the whining about the current LED/IPS ACDs' features is getting a little out of hand, IMHO.