Citation needed.

I have yet to find any article besides the one you posted, and the only [H]ardforum thread I've located on the subject mostly discusses how half-baked 10-bit color support is.

Interestingly enough, AnandTech couldn't get 10-bit output to work with their Dell U2711 sample. Had AnandTech or Ars Technica posted a detailed write-up, I'd consider that proof enough, as they are possibly the best tech journalism sites on the web and tend to be at the forefront of new advances. But neither has published anything yet about 10-bit support on widely available consumer hardware or OSes.

I'm not saying that you're wrong, but that I want some kind of evidence to back up your claims. I'll settle for screenshots, though links would be preferable.
 
I'll settle for screenshots, though links would be preferable.

Screenshots of what? You can't view a 10 bit screenshot on your 8 bit display...

With regard to links, what exactly are you missing? There is a link on here stating that 10-bit output is officially supported in Photoshop CS5 in OpenGL mode (which also means that Apple/ATI added 10-bit color support to the developer-facing headers, i.e., it is supported for developers). There has also already been a link posted from someone who has it working, detailing that Apple both offered user support for 10-bit color and worked with a monitor manufacturer to fix the 10-bit support on their displays.

I'd like to see something posted from you as to why this is not supported... It's an officially supported OpenGL flag, just like all the other ones Apple maintains but doesn't make big flashing bullet points on their site.

Occlusion queries were also added recently to the graphics drivers in OS X. I don't see any public advertising of that feature from Apple. Should I also assume that it is not officially supported?
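For reference, using an occlusion query only takes a few calls. A rough sketch against the core GL 1.5 entry points (untested; assumes a current context, and draw_object is just a placeholder for whatever you render):

    /* Rough occlusion-query sketch using the OpenGL 1.5 core entry points.
       Assumes a current GL context; error handling omitted. */
    #include <OpenGL/gl.h>

    GLuint count_visible_samples(void (*draw_object)(void))
    {
        GLuint query;
        GLuint samples = 0;

        glGenQueries(1, &query);

        /* Count the samples that pass the depth test while the object draws. */
        glBeginQuery(GL_SAMPLES_PASSED, query);
        draw_object();
        glEndQuery(GL_SAMPLES_PASSED);

        /* This blocks until the GPU has an answer; real code would poll
           GL_QUERY_RESULT_AVAILABLE instead of stalling the pipeline. */
        glGetQueryObjectuiv(query, GL_QUERY_RESULT, &samples);

        glDeleteQueries(1, &query);
        return samples;
    }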
 
Screenshots of any indication on the system - such as in System Profiler - that 10-bit color is active would be nice. It does report pixel depth, and it currently tells you that you've got 32-bit color. But I mainly meant screenshots of said OpenGL flags, as I can find no mention of them anywhere. Perhaps they're locked behind Apple's developer pages, which is why I said I would settle for screenshots if links were not possible.

The link stated that AMD supports 10-bit color output and, in fact, that Adobe does not, which I find rather telling. I don't find that one article as convincing as you do. It's heartening news, certainly, but it just doesn't seem very substantial to me.

Occlusion queries are not as big a deal for the professionals who use Apple hardware as 10-bit color is. 10-bit color is of interest to the related industries - photography and film - which would probably like to do more directly in OS X. Occlusion queries are neat, I suppose - I don't know a ton about them - but they're not a marketing point the way 10-bit color is. I find it relevant that Apple bothered to list native 2.2 gamma in the Snow Leopard marketing. Not on the front page, but it was listed on the website. This still isn't.

And even if none of this were true, asking for some kind of citation is perfectly reasonable. I really do want to see more substantial information, and my own searches have revealed nothing.
 
Screenshots of any indication on the system - such as in System Profiler - that 10-bit color is active would be nice. It does report pixel depth, and it currently tells you that you've got 32-bit color. But I mainly meant screenshots of said OpenGL flags, as I can find no mention of them anywhere. Perhaps they're locked behind Apple's developer pages, which is why I said I would settle for screenshots if links were not possible.

System Profiler won't report 10-bit, iirc, because it's not a system-wide setting...

As for the API, it's actually part of the OpenGL spec. Here ya go:
http://www.opengl.org/registry/specs/EXT/packed_pixels.txt

Looks like the extension was written against the original OpenGL 1.0 spec, and packed pixels were promoted to core in OpenGL 1.2. So for any OS to support OpenGL 1.2, it has to support 10-bit pixel formats. As far as Apple's support goes, do a Spotlight search for the 10-bit color constant; it's in the developer headers. As far as I can tell, it's not even a new feature - I see it defined in the headers as far back as 10.4, which is as far back as I can go.
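For what it's worth, actually using the constant is a one-liner at texture upload time. A rough sketch (assumes a current GL context; pixels is just a placeholder for your packed data):

    /* Upload a 10-bits-per-channel texture via the packed-pixels formats.
       GL_UNSIGNED_INT_10_10_10_2 is the constant from the packed_pixels
       extension; pixels holds width*height packed 32-bit values
       (10 bits each of R, G, B plus 2 bits of alpha). */
    #include <OpenGL/gl.h>

    void upload_10bit_texture(GLsizei width, GLsizei height, const GLuint *pixels)
    {
        GLuint tex;
        glGenTextures(1, &tex);
        glBindTexture(GL_TEXTURE_2D, tex);
        glTexImage2D(GL_TEXTURE_2D, 0,
                     GL_RGB10_A2,                /* 10/10/10/2 internal format */
                     width, height, 0,
                     GL_RGBA,
                     GL_UNSIGNED_INT_10_10_10_2, /* packed-pixels type constant */
                     pixels);
    }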

The link stated that AMD supports 10-bit color output and, in fact, that Adobe does not, which I find rather telling. I don't find that one article as convincing as you do. It's heartening news, certainly, but it just doesn't seem very substantial to me.

Whether or not Adobe supports it has absolutely no bearing on whether or not Apple supports it. Adobe's entire OpenGL mode should be entirely unsupported, imo; it's very poorly written.

Technically, it's not even Apple's call. As long as the hardware supports it, you can load a 10 bit texture into VRAM and draw it. But Apple is supporting it.

And even if none of this were true, asking for some kind of citation is perfectly reasonable. I really do want to see more substantial information, and my own searches have revealed nothing.

My searches very quickly revealed things.
 
Please, do provide links then - clearly I'm not searching for the right things, and I would like some more information. Thanks for the OpenGL spec reference.

Part of OpenGL - yes. That doesn't mean the functionality is guaranteed to be working properly. It certainly should be, but uptake has been so slow that if it were broken, not a lot of people would know or necessarily care.

From what I read, enabling it in Windows broke a lot of things. Not to say that it will here, but I'm expecting some print ads, or at least a passing mention on Apple's website, when they consider it ready.
 
Please, do provide links then - clearly I'm not searching for the right things, and I would like some more information. Thanks for the OpenGL spec reference.

Part of OpenGL - yes. That doesn't mean the functionality is guaranteed to be working properly. It certainly should be, but uptake has been so slow that if it were broken, not a lot of people would know or necessarily care.

If it's in the developer-facing headers, it's guaranteed to be working properly. I'm telling you this as a developer. If it isn't working, then as a developer you open bugs and DTS cases on it, which means it gets fixed.

From what I read, enabling it in Windows broke a lot of things. Not to say that it will here, but I'm expecting some print ads, or at least a passing mention on Apple's website, when they consider it ready.

10-bit output has been supported since Mac OS X 10.0. I don't think you're going to see it on Apple's website anymore.

Not to mention, it's really a function implemented by the graphics drivers.
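On the request side, a Mac app asks for a deep drawable through CGL's pixel-format attributes, and it's then up to the driver whether to grant it - which is the point. A rough sketch (assumes the renderer accepts a 30-bit color size; if it doesn't, you simply get no pixel format back):

    /* Ask CGL for a 10-bits-per-channel drawable. Whether you actually get
       one depends on the driver; CGLChoosePixelFormat may find no match. */
    #include <OpenGL/OpenGL.h>
    #include <stdio.h>

    int make_deep_context(CGLContextObj *ctx)
    {
        CGLPixelFormatAttribute attrs[] = {
            kCGLPFAColorSize, (CGLPixelFormatAttribute)30, /* 10+10+10 */
            kCGLPFAAlphaSize, (CGLPixelFormatAttribute)2,
            kCGLPFADoubleBuffer,
            kCGLPFAAccelerated,
            (CGLPixelFormatAttribute)0
        };
        CGLPixelFormatObj pix = NULL;
        GLint npix = 0;

        if (CGLChoosePixelFormat(attrs, &pix, &npix) != kCGLNoError || pix == NULL) {
            fprintf(stderr, "no 30-bit pixel format available\n");
            return -1;
        }
        CGLError err = CGLCreateContext(pix, NULL, ctx);
        CGLDestroyPixelFormat(pix);
        return (err == kCGLNoError) ? 0 : -1;
    }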

Apple's official documentation on enabling 10 bit is here...
http://developer.apple.com/graphicsimaging/opengl/extensions/apple_packed_pixels.html

The compatibility table ( http://developer.apple.com/graphicsimaging/opengl/capabilities/ ) indicates that the function is fully supported since at least 10.2.8 in the ATI, NVidia, and software renderers.
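And if you'd rather not take the table's word for it, you can check the extension string on your own machine (a sketch; uses the classic pre-GL3 extension string and assumes a current context):

    /* Check whether the current renderer advertises packed-pixels support. */
    #include <OpenGL/gl.h>
    #include <stdio.h>
    #include <string.h>

    void report_packed_pixels(void)
    {
        const char *exts = (const char *)glGetString(GL_EXTENSIONS);
        if (exts != NULL && strstr(exts, "GL_APPLE_packed_pixels") != NULL)
            printf("GL_APPLE_packed_pixels: supported\n");
        else
            printf("GL_APPLE_packed_pixels: not advertised\n");
    }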

It's fully supported. I don't need to supply any more info.
 
So, following up: we received the MDP>DP cable, hooked it up, checked the config, and opened ramp.psd - there was still banding in the image.
Result: NO 10-bit, unless we have overlooked something in the config & setup.

1. ATI 5870 card (Mac)
2. NEC PA271W (or other 10-bit-compatible monitor with a DisplayPort input)
3. Mini DisplayPort > DisplayPort cable
4. OS X running in 64-bit mode
5. Photoshop CS5 running in 64-bit mode
6. PS OpenGL on and Advanced mode selected
7. Open ramp.psd
8. Banding in the image - result: FAIL

anyone have any ideas?
 

There are some people on Apple's forum running the same test on a 4890:
http://discussions.apple.com/thread.jspa?threadID=2130243&start=120&tstart=0

It seems like they may be talking about the 5770 as well.

It looks like the 4870 has its own driver, separate from the 5770/5870. So it's possible that there is a configuration issue, or that ATI didn't flag 10-bit as a supported feature in their 5X70 Mac drivers. The drivers are basically responsible for telling Apple's OpenGL implementation what features are available on the card, and the ATI engineers may not have added the 10-bit flag to theirs. If the drivers claim the 5870 isn't a 10-bit card, Apple's OpenGL stack won't enable 10-bit output. (This isn't necessarily an issue on Apple's end, since ATI writes their own drivers.)
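In the meantime, one quick way to see what the stack actually granted is to query the context's per-channel bit depths along with what the driver reports about itself. A rough sketch (assumes a current GL context on the display in question; GL_RED_BITS and friends are valid queries in this GL 1.x/2.x era):

    /* Print the driver's identity strings plus the per-channel depth the
       context actually got. */
    #include <OpenGL/gl.h>
    #include <stdio.h>

    void print_context_info(void)
    {
        GLint r, g, b, a;

        printf("Vendor:   %s\n", (const char *)glGetString(GL_VENDOR));
        printf("Renderer: %s\n", (const char *)glGetString(GL_RENDERER));
        printf("Version:  %s\n", (const char *)glGetString(GL_VERSION));

        glGetIntegerv(GL_RED_BITS,   &r);
        glGetIntegerv(GL_GREEN_BITS, &g);
        glGetIntegerv(GL_BLUE_BITS,  &b);
        glGetIntegerv(GL_ALPHA_BITS, &a);

        /* 10/10/10/2 means a deep buffer was granted; 8/8/8/8 means banding
           is expected no matter what the monitor can do. */
        printf("Bits per channel: R%d G%d B%d A%d\n", r, g, b, a);
    }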

I might know someone I could ask about the 5X70 drivers specifically. I'll see what I can find out.
 
2010 Mac Pro with 5870, connected to an HP DreamColor LP2480zx monitor via DisplayPort (capable of 10 bits per channel).

Photoshop CS5, OpenGL advanced mode.

If I make a black-to-white gradient and look closely at the monitor, I can see banding.
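If you want to take Photoshop out of the equation entirely, you can generate the ramp yourself. Packing pixels for GL_UNSIGNED_INT_10_10_10_2 looks roughly like this (a sketch; red sits in the top 10 bits, then green, then blue, then 2 bits of alpha):

    /* Fill one row of a black-to-white ramp as packed 10/10/10/2 pixels,
       suitable for a GL_UNSIGNED_INT_10_10_10_2 texture upload. */
    #include <stddef.h>
    #include <stdint.h>

    void fill_ramp_row(uint32_t *row, size_t width)
    {
        for (size_t x = 0; x < width; x++) {
            uint32_t v = (uint32_t)((1023 * x) / (width - 1)); /* 0..1023 */
            row[x] = (v << 22) | (v << 12) | (v << 2) | 0x3;   /* R,G,B, A=3 */
        }
    }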
 
Eizo ColorEdge CG303W & ATI Radeon HD 5870

I am looking at an Eizo ColorEdge CG303W to go with a 5870. The monitor has dual DVI-D inputs.

Any ideas on how this will work with a hex-core Mac Pro and Final Cut Studio? I'll also use it for photography. What cables will I need to get the best out of the monitor? Oh yes, I forgot (sorry): I shall also be using an Apple 20" ACD alongside it.


Thanks

 