
William Payne

macrumors 6502a
Original poster
Jan 10, 2017
931
360
Wanganui, New Zealand.
I have heard about software to control displays and such. Most of the time it is used with high-resolution displays paired with a powerful GPU.

I got rather curious about how this software works and what it does, so I thought I would mess around with it in a situation where you would never normally use it, just to see what happens.

I decided to try it out on a 2009 Mac Pro flashed to 5,1 firmware, running High Sierra with the stock GT 120 GPU hooked up to an old 22" Acer display. I will show my results and you can form your own opinions.

I used cell phone photos of my screen so that people could see exactly what I was seeing, with nothing photoshopped.

The weird thing is that even after going back to standard settings in RESXTREME, the system still said it was running 10-bit and did not revert to the stock 8-bit until after I deleted the software.
 

Attachments

  • Before RESXTREME INSTAL-1.JPG
  • Telling It To Output 10 Bit Per Channel -1.JPG
  • Semi Awesome Side Effect Of It Going 10 Bit. -1.JPG
  • After 10 Bit-1.JPG

h9826790

macrumors P6
Apr 3, 2014
16,614
8,546
Hong Kong
William Payne said:
I have heard about software to control displays and such. Most of the time it is used with high-resolution displays paired with a powerful GPU.

I decided to try it out on a 2009 Mac Pro flashed to 5,1 firmware, running High Sierra with the stock GT 120 GPU hooked up to an old 22" Acer display. I will show my results and you can form your own opinions.

I used cell phone photos of my screen so that people could see exactly what I was seeing, with nothing photoshopped.

The weird thing is that even after going back to standard settings in RESXTREME, the system still said it was running 10-bit and did not revert to the stock 8-bit until after I deleted the software.

Use the following option to go back to 8 bit.
Screen Shot 2017-11-11 at 13.47.43.jpg

Anyway, I am sure the software won't give you any magic; you need a 10-bit GPU and a 10-bit monitor to make it work. If you force the software to go 10-bit with 8-bit hardware, you should expect something to go wrong, right?
 

William Payne

macrumors 6502a
Original poster
Jan 10, 2017
931
360
Wanganui, New Zealand.
h9826790 said:
Use the following option to go back to 8 bit.
View attachment 734548
Anyway, I am sure the software won't give you any magic; you need a 10-bit GPU and a 10-bit monitor to make it work. If you force the software to go 10-bit with 8-bit hardware, you should expect something to go wrong, right?

Believe me, I was not expecting this to work; I was just playing around and wanted to share my experience. There was no serious expectation of a result here. My point was more that if the software can show a 10-bit depth on hardware that has nowhere near the capability to output depths like that, then how accurate is it on other GPUs?

Think of this more as a review-ish type of thing.

For example, I switched it back to 8-bit using the software, and the colour and the screen all looked normal, yet the frame buffer in the System Report was still showing ARGB2101010.
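For anyone who wants to pull that same value without taking screenshots, here is a minimal Python sketch (the helper name is just for illustration) that shells out to macOS's standard system_profiler tool and prints the lines it reports. The exact label, e.g. "Framebuffer Depth" or "Pixel Depth", differs between macOS versions, so treat the output format as illustrative.

[CODE]
#!/usr/bin/env python3
# Illustrative sketch: read the depth macOS itself reports, i.e. the same
# value System Report shows (e.g. ARGB8888 vs ARGB2101010).
# Assumes the standard system_profiler tool is present.
import subprocess

def reported_depths():
    text = subprocess.run(
        ["system_profiler", "SPDisplaysDataType"],
        capture_output=True, text=True, check=True,
    ).stdout
    # Keep any line mentioning a depth or the ARGB pixel-format string;
    # the exact label differs between macOS versions.
    return [line.strip() for line in text.splitlines()
            if "Depth" in line or "ARGB" in line]

if __name__ == "__main__":
    for line in reported_depths():
        print(line)  # e.g. "Framebuffer Depth: 30-Bit Color (ARGB2101010)"
[/CODE]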
 

goMac

Contributor
Apr 15, 2004
7,662
1,694
William Payne said:
Believe me, I was not expecting this to work; I was just playing around and wanted to share my experience. There was no serious expectation of a result here. My point was more that if the software can show a 10-bit depth on hardware that has nowhere near the capability to output depths like that, then how accurate is it on other GPUs?

It doesn't. There is no difference.

All the data will be converted back to 8-bit before it goes to your display.

Almost any GPU can process 10-bit data. But that's different from what's being output. And that will just get converted back to 8-bit.

You could do the same thing with black-and-white Macs too. You could load a color image on a black-and-white Mac; it would just display in black and white. Having a color window open didn't mean your Mac magically turned into a color Mac.
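As a rough illustration of that conversion (an illustrative sketch only; the exact scaling the driver performs may differ), neighbouring 10-bit codes collapse onto the same 8-bit code once the output stage rescales to 8 bits per channel:

[CODE]
# Illustrative sketch: why a 10-bit framebuffer gains nothing on an 8-bit
# output path. The scale-and-round mapping below is one plausible
# conversion, not necessarily the exact one the driver performs.
def to_8bit(code_10bit: int) -> int:
    """Map a 10-bit code (0-1023) onto an 8-bit code (0-255)."""
    return round(code_10bit * 255 / 1023)

if __name__ == "__main__":
    samples = [512, 513, 514, 515]           # four neighbouring 10-bit codes
    print({v: to_8bit(v) for v in samples})  # -> {512: 128, 513: 128, 514: 128, 515: 128}
    # All four collapse onto a single 8-bit value, so the extra precision
    # never makes it to an 8-bit display.
[/CODE]

With roughly four 10-bit codes per 8-bit code, whatever the System Report says, an 8-bit panel never sees the extra precision.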
 
  • Like
Reactions: h9826790

William Payne

macrumors 6502a
Original poster
Jan 10, 2017
931
360
Wanganui, New Zealand.
goMac said:
It doesn't. There is no difference.

All the data will be converted back to 8-bit before it goes to your display.

Almost any GPU can process 10-bit data. But that's different from what's being output. And that will just get converted back to 8-bit.

You could do the same thing with black-and-white Macs too. You could load a color image on a black-and-white Mac; it would just display in black and white. Having a color window open didn't mean your Mac magically turned into a color Mac.

I agree; that is sort of what I was implying with my observation: even though it showed a 10-bit frame buffer in the System Report, I knew that due to the hardware limitation I definitely wasn't getting 10-bit. My point was that while the System Report shows one thing with this software, the reality from the hardware is very different.

What I'm trying to say is that there is no software magic going on. I had zero expectation of it actually doing anything. I was just amused that even though I knew I had an 8-bit hardware limitation, it still showed 10-bit in the System Report, which I knew was not correct.
 
  • Like
Reactions: dabotsonline