Am I being a bit slow here...? On 10.11.1 GM my 2014 retina iMac shows 30-bit pixel depth, but on 10.11.2 b1 my 2014 retina MacBook Pro shows 32-bit pixel depth. So is my rMBP better in this respect?

My 12" retina MacBook is 32-bit also


So I noticed my 2013 iMac also shows 32-bit... One thing I noticed in the OP's picture is that it says ARGB2101010, whereas both of my computers show ARGB8888. I'm guessing the numbers after ARGB are the 10-bit and 8-bit designations? I'm not sure. The OP also shows 30-bit while mine show 32-bit. So what are the differences here: 32-bit ARGB8888 vs. 30-bit ARGB2101010?
 
They are now selling a computer which truly is garbage. Their 5400 disk iMacs are so frustrating to use that the experience damages their brand and reputation.

Oh, you own the new 21" Retina iMac with the 5400rpm disk do you? No. No I didn't think so.
 
An SSD wouldn't heat up anywhere near as much as an HDD would, right?

What? I don't really know what you're asking or saying here. But no, SSDs don't heat up anywhere near as much as a spinning disk, and they're much easier to fit in slim systems.

And you are wrong; for lots of us, the basic configurations are the only choice. I live in a country without an Apple Store.

To be fair, that must be horrible, but unfortunately it's not "lots" of people in that situation; 99% of Apple's customer base has access to the online store to spec whatever they want.
 
Are you on a Retina iMac? I have a 2011 iMac at work and the 5K at home and I can definitely see the stripes on my screen. I guess it's possible that your eye is not sensitive to the difference.
I downloaded that file and all I see is a smooth black image.

No lines. No stripes.

I also looked at that file and was disappointed that the gradient looked smooth to me as well. I consider my vision to be quite good. Then tonight I happened to notice that I hadn't closed that window yet, and the bands jumped out at me. They were very obvious.

The difference was the lighting. So, if you try this test, I suggest looking at the image in a darkened room.
 
Can someone clear this up for those of us with less knowledge than you? Why do our Mac displays say they're 30- or 32-bit if OS X doesn't support it?
 
I'm guessing the numbers after ARGB are the 10-bit and 8-bit designations? I'm not sure. The OP also shows 30-bit while mine show 32-bit. So what are the differences here: 32-bit ARGB8888 vs. 30-bit ARGB2101010?

Both are showing how many bits are allocated to each channel. The channels are Alpha (A), Red (R), Green (G), and Blue (B). So ARGB8888 means 8 bits are allocated to each channel. ARGB2101010 means 2 bits are allocated to alpha, and 10 bits to each color.

So both cases are making use of 32 bits (8+8+8+8 = 2+10+10+10 = 32). Alpha controls transparency; it is not a color. On the OP screen, it appears alpha was not counted, and so it displayed as 30-bit, whereas for the other, alpha was counted, so it displayed as 32-bit.
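
To make that concrete, here's a minimal sketch in plain Python (just illustrative bit arithmetic, not any OS X API) showing how both layouts pack into the same 32 bits:

```python
def pack_argb8888(a, r, g, b):
    # 8 bits per channel: each value is 0-255
    return (a << 24) | (r << 16) | (g << 8) | b

def pack_argb2101010(a, r, g, b):
    # 2 bits for alpha (0-3) and 10 bits per color channel (0-1023)
    return (a << 30) | (r << 20) | (g << 10) | b

# An opaque mid-grey in both layouts; each fits exactly 32 bits
# (8+8+8+8 = 2+10+10+10 = 32).
print(f"ARGB8888:    {pack_argb8888(0xFF, 128, 128, 128):032b}")
print(f"ARGB2101010: {pack_argb2101010(0b11, 512, 512, 512):032b}")
```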

In case you're wondering why red, green, and blue, when most of us are taught that the primary colors are red, blue, and yellow: the latter are the primary colors when you work with paint (the subtractive primary colors). For light, the primary colors are red, green, and blue (the additive primary colors), which are the three colors the cones in our retinas specialize in detecting.

Does this answer your question?
 
Can someone clear this up for those of us with less knowledge than you? Why do our Mac displays say they're 30- or 32-bit if OS X doesn't support it?

A 32-bit display is needed whether the computer drives 8 bits per red/green/blue (the old way), or 10 bits per red/green/blue (the new way). The remaining bits are given to the alpha channel (see previous post).
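
As a quick back-of-the-envelope comparison (plain Python, just arithmetic), this is what the two modes buy you per channel and in total colors:

```python
# Levels per channel and total colors for 8-bit vs 10-bit per channel
for bits in (8, 10):
    levels = 2 ** bits
    print(f"{bits} bits/channel: {levels} levels, {levels ** 3:,} colors")
# 8 bits/channel: 256 levels, 16,777,216 colors
# 10 bits/channel: 1024 levels, 1,073,741,824 colors
```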
 
The 2014 5K iMac also supports 10-bit color depth on OS X El Capitan, according to these reports.
That's good news; it might be time I try El Capitan on my iMac.

------
The new 10-bit color depth reportedly only works within the Preview and Photos applications for now, but other third-party software should eventually take advantage of the technology.
I'd like to see it used to reduce the banding on the Yosemite blur effect. It can be quite noticeable in certain situations.

------
Update: I had to see it for myself, so I installed El Capitan on a separate partition and confirmed 10-bit support is present in Preview.app! The difference is noticeable with the gradient I tested, compared to my 13" rMBP (Pixel Depth: 30-Bit Color (ARGB2101010) on the riMac vs Pixel Depth: 32-Bit Color (ARGB8888) on the rMBP for those wondering).

I can't get Photos to display it in 10-bit color, only Preview.
 
I'm able to use SwitchResX to force my external Dell display to 30-Bit Color. However, the test ramp still shows all steps in Preview and in Photoshop, so something still isn't working properly.
 
The average human eye registers... 7 bits of color. By using ten bit color you go from one bit of color you can't see to three bits of color you can't see (per channel, per pixel). So I am asking my graphics processor to process 9 bits per pixel that make no discernible difference in the quality of the image. Why not leave it at 8 bits and maybe OpenCl can make use of the unused graphics card clock cycles/processing?
Wrong. While it's true that 8 bits of color might be enough to avoid color banding artifacts, that would be assuming LINEAR color. But every end-user display is non-linear (including TVs, cinema, computers, phones, tablets, etc.). So 8 bits produce banding. If you've never seen banding on 8-bit displays, it's because you didn't pay attention to gradations in dark areas, or because it was a dithered image, or because your eyes can't perceive it.

That said, I doubt 10 bits is enough for all non-linear displays. Otherwise, SGI workstations wouldn't have supported 12-bit-per-component color.
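
If you want to see the banding argument as numbers rather than take it on faith, here's a rough sketch (plain Python; the 2.2 gamma is an assumption standing in for a real display transfer curve) that counts how many distinct codes an 8-bit versus a 10-bit encoding has available for the darkest tenth of a smooth ramp:

```python
GAMMA = 1 / 2.2
N = 100_000  # samples across the darkest 10% of linear light

def codes(bits):
    # Gamma-encode a linear ramp over [0, 0.1] and quantize it to `bits` bits
    levels = (1 << bits) - 1
    return {round(((i / N) * 0.1) ** GAMMA * levels) for i in range(N + 1)}

for bits in (8, 10):
    print(f"{bits}-bit: {len(codes(bits))} distinct codes for the darkest 10%")
# Roughly four times as many steps at 10 bits, so each visible jump is far smaller.
```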

Also, I feel really disappointed (and I mean REALLY) that this news isn't on the MacRumors front page, which is full of watches, TVs, and related wastes of time, while REAL news like this gets ignored.
 
Oh no, the Bit wars again.

Stop counting bits. Please. And stop comparing Photoshop files.

These numbers are always misleading. Even professional photographers are fooled over and over again by such statements.

The bit number is just one tiny fragment of how precisely color is reproduced. There are so many things that must work right before you can even think about using 10 bits. Also, you don't know where the 10 bits actually apply. Is it in the system? The profile? The driver? The cable? The LUT? The LCD matrix? What encoding? RGB? YUV? XYZ? You don't know.
 
The display does, but that doesn't say anything about the graphics card, drivers and OS support.

You can test this by downloading the ZIP file at the bottom of this webpage, unzipping it, and opening the PSD file in Preview. If the gradient is smooth, it's 10-bit; if you see stripes, it's not.

Indeed, I just realized the screenshot is about the display, not the graphics. In order to confirm whether your Mac really supports 30-bit color, it's necessary to look at the OpenGL capabilities. There are apps that show this. Alternatively, if you use XQuartz, just execute glxinfo from Terminal and you'll see whether any 30-bit visuals are supported.

Can anybody confirm whether the new iMacs' graphics really support 30-bit color? I seriously doubt integrated Intel graphics support it.

Note that this requirement applies to simple viewers too: Preview won't display in 30-bit color if there aren't any OpenGL 30-bit configurations on your graphics card.
 
Can anybody confirm whether the new iMacs' graphics really support 30-bit color? I seriously doubt integrated Intel graphics support it.

Note that this requirement applies to simple viewers too: Preview won't display in 30-bit color if there aren't any OpenGL 30-bit configurations on your graphics card.

The Adobe Photoshop file (.psd) I mentioned does this. 10-bit allows for seamless gradients in grey tones. The blog post mentions this specifically; they tested it with two different iMacs.
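
If you'd rather generate a test ramp yourself instead of downloading the PSD, here's a hypothetical sketch (plain Python, standard library only; the file name, the ramp range, and the assumption that your image viewer opens 16-bit PGM files are mine, not from the blog post) that writes a shallow dark gradient of the kind that bands visibly on an 8-bit pipeline:

```python
WIDTH, HEIGHT = 1920, 400
MAX = 65535                   # 16-bit samples
TOP = int(MAX * 0.08)         # only sweep the darkest ~8% to provoke banding

with open("gradient_test.pgm", "wb") as f:
    f.write(f"P5 {WIDTH} {HEIGHT} {MAX}\n".encode("ascii"))
    row = bytearray()
    for x in range(WIDTH):
        value = TOP * x // (WIDTH - 1)
        row += value.to_bytes(2, "big")   # PGM stores >8-bit samples big-endian
    f.write(bytes(row) * HEIGHT)          # every row is the same horizontal ramp
```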
 
Actually, you are wrong about that. Check it out:
https://appleappraisal.files.wordpress.com/2010/02/countries_and_official_apple_websites.png

But nevertheless, Apple claims to be a premium PC maker. If that were true, they would use premium parts as well.

That graph just proves my point. 99% of Apple's sales are from China, America, the UK and Canada, which is why everything launches there first. It doesn't have to cover 99% of the planet, just the areas where it makes financial sense, which it doesn't in the greyed-out ones or Apple would be there.
 
Why Apple goes to all this trouble with amazing technology and then cripples it by shoving in a 5400RPM drive is beyond me. You've made 11 billion dollars of profit in 3 months. Just make SSDs standard in all your computers already.
Um. Surely you're aware that the spinning drive is an *option*? That people who want performance -- as anyone interested in 10-bit color would -- can simply *choose* an SSD model?

It's bizarre that people can be infuriated by the simple *existence* of an option that they are not interested in themselves.

Really, what is going through your head?
 
Um. Surely you're aware that the spinning drive is an *option*? That people who want performance -- as anyone interested in 10-bit color would -- can simply *choose* an SSD model?

It's bizarre that people can be infuriated by the simple *existence* of an option that they are not interested in themselves.

Really, what is going through your head?
It's the fact that they deliberately put in the 5400RPM option to shout out the lower price and drive sales to the SSDs that's the issue. Unfortunately, some people will buy it anyway and wonder why their iMac is sluggish and a bad user experience. It's bad for Apple and bad for those of us who are called upon to fix our friends' and families' issues, especially since it cannot be easily upgraded.

I think we're mainly disappointed that Apple would stoop to these marketing tactics and are already fed up with them doing it with the 16GB iPhones. The iMac is just more telling of where they are going as a company (trying to squeeze blood from a stone). Apple used to be known for great experiences. Lately that isn't the case.
 

Also, these 5400rpm drives are only in the 21" iMac. I really wouldn't like to know the heat a 7200rpm 3.5" drive would kick out in those things; you'd have to cripple one anyway, in which case you'd be better off running a 5400 WD Green. The perceivable difference between the two these days is negligible; it's not like it would be running a WD Black 7200 even if there was one. I wouldn't run a system without forking out for an SSD nowadays anyway, which is available.

They're not 3.5" drives though, they're 2.5", which have even slower performance than their desktop counterparts. The reason they're laptop hard drives is that they made the iMac so thin and light, which is the most important thing to consider for an all-in-one that would seldom be moved.
 
No. That's not it at all. The point is that the Apple brand is associated with a certain amount of quality and prestige. They refused to jump on the netbook bandwagon way back when because netbooks, whether $200 or $2, were garbage. At any price, the user experience was so terrible that it would be cruel to subject a person to them. Apple recognized this and held themselves to a higher standard.

They no longer hold themselves to that higher standard. They are now selling a computer which truly is garbage. Their 5400 disk iMacs are so frustrating to use that the experience damages their brand and reputation. And non-tech consumers just don't know enough to avoid this trap. They'll buy the base model because it's cheapest or simply in stock, and assume that no matter what, they can rely on Apple to give them a solid product. That's what Apple is known for, after all. But they will feel swindled as soon as they start using it. It offers a horrible user experience right out of the box.
If you have never used a computer with an SSD, you won't consider the experience horrible. Before you switched to a computer with an SSD, did you think your computer had a terrible user experience? No, you just accepted it as the way things are. People upgrading from an HDD computer to this iMac won't consider the experience to be horrible. In fact, they will be delighted by the display, in terms of resolution but also colour, and likely also by the CPU, GPU and memory performance if their previous computer was several years old.

If you want to beat up Apple, do it over still selling the 2012 non-retina 13" MBP with a Sandy Bridge CPU, an iGPU and a 5400 rpm drive. All other laptops released since then have SSDs. But there are apparently enough people still buying it, otherwise they wouldn't keep selling it, despite Apple hardly promoting it. Some people just prefer storage size over speed (for a given price) or low price over speed (for a given storage size), even if that means they have to get 2012 tech. And I'm pretty sure the vast majority of them have never used a computer with an SSD and thus simply don't know what they are missing.

Yes, the HDD-only iMac might mainly be there to hit a price point to get people in the door, but it might at the same time also make the vast majority of its users happy.
 
Oh, you own the new 21" Retina iMac with the 5400rpm disk do you? No. No I didn't think so.

Oh get off it.

1) anybody with a snippet of tech knowledge would know how bad these things are. People don't have to personally use each iteration to verify this.

2) luckily there are things called reviews found online, all of which confirm the performance is an absolute joke from something so expensive.

A lot of apologists in this thread. You happily pay through the nose for second-rate products they're making a fortune on, and then cheer when Apple announces they've made so much profit, pointing to this as evidence against the Apple naysayers.

That is almost the perfect example of what it is to be in a cult.

Take a step back and look at the bigger picture.
 
Update: I had to see it for myself, so I installed El Capitan on a separate partition and confirmed 10-bit support is present in Preview.app! The difference is noticeable with the gradient I tested, compared to my 13" rMBP (Pixel Depth: 30-Bit Color (ARGB2101010) on the riMac vs Pixel Depth: 32-Bit Color (ARGB8888) on the rMBP for those wondering).
Just to clarify: on which hardware do you see 10-bit support? Only on riMacs, or also on other computers?
 
The average human eye registers... 7 bits of color. By using ten bit color you go from one bit of color you can't see to three bits of color you can't see (per channel, per pixel). So I am asking my graphics processor to process 9 bits per pixel that make no discernible difference in the quality of the image. Why not leave it at 8 bits and maybe OpenCl can make use of the unused graphics card clock cycles/processing?

We can also only see up to 30 fps, right? Which is why tests with quickly flashing lights show that the human eye can actually register upwards of hundreds of fps.

I hate it when people keep trying to use computer jargon to put limits on the human eye, which is only limited by how good your vision currently is.
 
Oh, you own the new 21" Retina iMac with the 5400rpm disk do you? No. No I didn't think so.

LEL, you either haven't tried a PC with an HDD as the main drive in a long time, or you just mindlessly defend Apple. You realize, first of all, that hard drives are basically '80s tech? All that has changed is some added cache and a slightly faster RPM.

Try booting up an old machine with a regular HDD; everything hangs.
 
If you have never used a computer with an SSD, you won't consider the experience horrible. Before you switched to a computer with an SSD, did you think your computer had a terrible user experience? No, you just accepted it as the way things are. People upgrading from an HDD computer to this iMac won't consider the experience to be horrible. In fact, they will be delighted by the display, in terms of resolution but also colour, and likely also by the CPU, GPU and memory performance if their previous computer was several years old.

If you want to beat up Apple, do it over still selling the 2012 non-retina 13" MBP with a Sandy Bridge CPU, an iGPU and a 5400 rpm drive. All other laptops released since then have SSDs. But there are apparently enough people still buying it, otherwise they wouldn't keep selling it, despite Apple hardly promoting it. Some people just prefer storage size over speed (for a given price) or low price over speed (for a given storage size), even if that means they have to get 2012 tech. And I'm pretty sure the vast majority of them have never used a computer with an SSD and thus simply don't know what they are missing.

Yes, the HDD-only iMac might mainly be there to hit a price point to get people in the door, but it might at the same time also make the vast majority of its users happy.

If you've only used a Pentium 4 during your life, then you would have no problem upgrading to a Core 2 Duo you can find on eBay.

What the hell is your point here besides defending Apple? That people upgrading should be happy to have things hang on boot? I thought the point of UPGRADING was to upgrade, not to be happy to still use '80s tech. If people can be happy accepting old tech, then they don't need Apple products or even a new PC; they can find something on eBay for cheap.
 
Oh, you own the new 21" Retina iMac with the 5400rpm disk do you? No. No I didn't think so.
I have a 2013 iMac with a spinning disk, which replaced a 2009 MacBook Pro that had a spinning disk. I didn't get a Fusion Drive in it at the time because it was an emergency upgrade for work. I needed to walk out of the store that very day with a computer, and stores only stock base models, so there was no option to upgrade to a Fusion Drive or SSD without waiting on the mail. I have also since acquired a 2013 15" MacBook Pro with an SSD, and a 2015 13" MacBook Pro with the newer SSD. I also have a couple of older minis with spinning disks.

All have 8GB or 16GB of RAM. Two of them are even 2013 models with comparable processors. The only real factor differentiating the performance of these machines is the storage device, and it's a night-and-day difference. The SSD machines are impeccable computers; they uphold the Apple prestige, no question. The spinning-disk iMac is a piece of crap. Waking it from sleep takes forever. Starting any app takes forever. Saving files takes forever. Loading web pages takes forever. All of these actions hit the disk, and it bottlenecks the whole computer. It makes using it very frustrating and painful. And it's been noticeably slow since day 1.

What's really sad is that it's not even entirely the fault of the disk. I've been using computers casually and professionally since the mid-90s. They've all had spinning disks until these most recent machines. But not all those old computers were slow garbage. The most telling example was when I replaced my 2009 MBP with the 2013 27" iMac. The iMac was actually slower overall... because of the operating system. Newer versions of OS X hit the disk a lot more. They actually need SSDs just to equal the overall user experience of a years-older machine running Snow Leopard and a spinning disk.

If you have never used a computer with an SSD, you won't consider the experience horrible. Before you switched to a computer with an SSD, did you think your computer had a terrible user experience?
I get what you're saying about needing an established expectation in order to perceive the relative slowness. For me, having used tons of previous computers, the slowness of newer OS X versions running on a spinning disk was glaringly obvious compared to Snow Leopard or Windows XP or 7 on a spinning disk. I instantly felt the difference going from Snow Leopard to Mountain Lion, and later upgrading to Mavericks made me practically stop using that iMac. Yosemite was no better. The only thing that stops me from saying that Apple makes terrible computers is having gotten SSD machines in this last year. Apple's SSD-based computers are great. It's unfortunate that OS X effectively requires an SSD not to be a piece of crap, particularly since Apple will sell you one.

As for a Joe Blow buying his first computer, will he realize it's crappy? To your point, I don't know. I feel like he would, at least on some level. How could anyone not get frustrated staring at the beach ball all the time, particularly when they forked over big bucks for an Apple machine because of its reputation? There's still an expectation there, even if it's not based on computer experience.
 