Having owned an iMac and, in the past, a 30" ACD, I found that some of the non-Apple monitors mentioned above worked as well as, if not better than, the Apple displays for serious graphics/photo work. I would suggest the OP not take your word or mine for it but do some exploring on the internet for reviews and "on paper" measurements of the specs of some offerings. Again, we'll find that the Apple displays are not without merit, but they certainly have competition, including offerings that are hands down superior to the ACD and iMacs. This is not opinion but a summation based on facts.
Here is an excerpt from the Digilloyd site that some of "us snobs" might find amusing -
Faux calibration
The term calibration is abused: true calibration means bringing the hardware device (the display) to a specified target state. The actual behavior vs. the specified target is then measured, and a profile is generated that describes the differences.
With most so-called calibration, the display itself cannot and does not change. In short, it is not calibrated at all. Instead, 8-bit video card data is altered in an attempt to produce an image that approximates the proper intensity and color, using repeated measurements. This is impossible to do well in 8 bits, particularly in darker tones, where there are only a few bits to work with (not even 8 bits!). But even 8 bits is woefully inadequate, certainly so in a 3D color space.
Thus calibrating an Apple display or other brand is actually not calibration at all, but a laughably crude lipstick-on-a-pig effort that is often worse than the stock profile supplied by Apple. Whether the calibration is done with a $100 or a $5000 calibrator doesn't matter: no actual calibration occurs (the display does not actually change, other than perhaps contrast and brightness). Rather, 8 bits on the video card approximate...
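To put a number on the 8-bit point, here's a quick Python sketch of my own (not from the article): it remaps the 256 video card levels through a mild gamma tweak, the way a software "calibration" profile does when the display hardware can't change, and counts how many distinct output levels survive. The gamma_shift value of 1.2 is just an arbitrary illustrative correction.

```python
# Simulate a software "calibration" that remaps 8-bit video card levels
# through a gamma tweak, since the display hardware itself cannot change.
# gamma_shift = 1.2 is an arbitrary illustrative correction curve.

gamma_shift = 1.2
lut = [round(255 * ((v / 255) ** gamma_shift)) for v in range(256)]

print("distinct output levels:", len(set(lut)))              # fewer than 256
print("darkest 32 inputs ->", len(set(lut[:32])), "levels")  # shadows collapse hardest
```

With this curve, the 256 input levels collapse to roughly 240 distinct outputs, and the darkest 32 inputs squeeze into only about 20. That lost precision in the shadows is exactly the banding the excerpt complains about, and it's why true hardware calibration (adjusting the monitor's own internal LUT, typically at higher bit depth) is a different animal from profiling the video card.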