Does the Air 3 from 2019 already have a laminated display? I have one here lying next to an iPad 9 from 2021, and the Air somehow looks better, like the display sits directly under the glass with less reflection, while the 9 has thicker glass over the display.
Yup. What you’re describing is the difference between a laminated display and a non-laminated one. Lamination makes the image look like it’s right on the surface.

I don't really have a clue about iPads because I almost never need one. The one I used before was the first 9.7" iPad Pro from 2016, and when it died in April I bought those two cheap because I couldn't decide. I wanted the last model with a Home Button, which might get updates for the longest time.

I can't feel a weight difference between the old Pro and the Air, but the 9 is really heavy. The Pro is a little smaller overall, though, and I just looked it up: it should actually be even lighter. The second Pro, the 10.5", seems almost identical to the Air 3, but I gave that one to my mum, so I can't compare the displays while they're on. Maybe I can bring my Pro back to life now that it's been resting for almost a year.

I have a 2nd Gen. 12.9" Pro from 2017 too, but that thing is just too heavy for me to use, and it also didn't get iOS 18.
Well, it depends on what you’re using it for. It doesn’t sound like you use it very much, so buying anything more than the regular iPad would probably be a waste of money. If you’re occasionally using it to look up cooking recipes or the weather, does it really matter if the display looks slightly different? If you’re watching movies all the time or playing games, then it would make a difference.
 
I barely notice a 60Hz display. I’ve been working on a Pro since the beginning and am used to 120Hz, but when checking out the Air, I don’t even notice it. The only time I really notice anything different is scrolling through a page that has a lot of images.
I’m the same way. I suspect if I had one side by side, I might notice the difference, but that’s not how I use an iPad. In the real world I’m going to use one at a time. If it looks the same without having the other one right next to it then does it make a difference? If I switch back and I’m thinking wow this looks like crap, then that’s a problem. I don’t get that from OLED or high refresh rate versus LCD. I get that when I go from a budget PC monitor to an Apple display. This is why I can’t go back to budget monitors.
 
I’m the same way. I suspect if I had one side by side, I might notice the difference, but that’s not how I use an iPad. [...]
The only time I’ve really noticed things being faster with a higher refresh rate is with OLED screens.
 
People think the new iPad Air M3 has the same chip as the M3 MacBook Air/Pro, but it actually has a 9-core GPU like the iPad Air M2. Ok, that might be fine for the iPad Air, but things get worse with the iPad A16…

Everyone is angry that the iPad A16 doesn’t have Apple Intelligence, but it gets worse:

- It has one binned CPU core

- It has one binned GPU core

So yeah, the iPad A16 is slower than the iPhone 14 Pro/15… I don’t understand why they did that with the iPad A16.

PS: The iPad 10 even has 6 CPU cores… against 5 on this iPad… bruh, though I guess the difference won’t be very large…
The M2 iPad Air was also “binned” compared to the M2 MacBook Air: one GPU core less. On the other hand, the M1 iPad Air had one GPU core more than the M1 MacBook Air.

Tim is a master of supply chain optimization, so sometimes you get a bit more, sometimes a bit less.
 
Anytime you see different core counts in products, it's usually because the lower core counts are binned chips.
As myself and some other forum members have pointed out, in that case all chips are binned. Some are put in the higher bin where every core physically present in the chip is functional, others are put in a lower bin where some parts are disabled and/or defective.
 
Binned chips are normal and have been since forever. We'd have an insane amount of waste otherwise, and prices for tech would be through the roof. Anytime you see different core counts in products, it's usually because the lower core counts are binned chips.
Agreed, there was talk about this back in the days when I started messing with PCs, the i386 and i486 chips.
Some got the cheaper 25MHz version, but most of them could run at a higher clock.

The same talk happened back then: Intel tested for the higher frequencies, and the chips that didn't pass were labeled with lower ones.

Better to do that than turn them into waste.
 
People think the new iPad Air M3 has the same chip as the M3 MacBook Air/Pro, but it actually has a 9-core GPU like the iPad Air M2. [...]
Sigh....

It's an education-focused product with a super low price and no expectation of performance beyond opening Freeform, Miro, Zoom, Teams, the office suite, Safari, Pages, Keynote, and Numbers, and a binned A16 is the end of the world?

Maybe Apple should name the chips like Intel or AMD?
Apple A16-100, 150, 200? Then you'd be satisfied?

People complain for the sake of complaining...
 
If only those search results agreed with each other or with these comments!
Why would you expect search results to all agree with each other? If someone somewhere misuses a term, it gets posted, and your Internet search turns up that misuse alongside proper uses of the term (which people here have given you), wouldn't you conclude that some people just misuse the term, and that you should ignore the misuse and focus on the proper uses?

The same applies to just about anything.
 
Maybe Apple should name the chips like Intel or AMD?
Apple A16-100, 150, 200? Then you'd be satisfied?
I don't think that would help assuage the OP's misplaced feelings about this.

I get the impression that the only thing he'd accept is changing industry practices: either every chip that doesn't meet full design specs gets thrown away, so that all products made with those chips have identical specs, or the industry perfects its wafer fab processes so that there are never enough lower-performing chips to use in lower-tier products. Either way, prices would rise a lot.
 
People think the new iPad Air M3 has the same chip as the M3 MacBook Air/Pro, but it actually has a 9-core GPU like the iPad Air M2. [...]
So, if true, does that make them bad chips?
Answer: No.

Did you know that Intel "fuses" its chips too? For example, an i9 gets "fused" down to become an i5. It doesn't make a difference to the consumer...
 
For whatever it’s worth, I’ll reiterate what others here have pointed out, and add one or two observations:

Binning is a normal part of chip manufacturing, and all the chips in question get put into different bins, including those that meet all the original design specs. It allows manufacturers to save money by using chips that testing finds don’t entirely meet the specs they were designed for, instead of throwing them away. In some CPU and GPU chips, one or two cores might not work properly when clocked at full speed, so the manufacturer disables them. You wouldn’t want them to remain enabled, since you wouldn’t get reliable performance. Those chips go into products that have other lower specs as well (display, number of ports, etc.), so that the manufacturer can more easily include lower-priced models in their lineup that still perform well.

Some chips that get put into the lower-performing bins are useful for lower-powered devices that run at slower clock speeds for better power/battery performance over raw speed, while still delivering the performance that the manufacturer of the end product specifies.

When you’re choosing a computing device from a tier of products, you look at the specs for the number of cores, etc. that you want, which the manufacturer openly specifies in the end product’s specs, and you choose accordingly. The buyer of a lower-spec'd product isn’t being fooled into thinking they’re buying a product that contains chips that meet the original full specs for it.

The OP is suggesting that chips that don’t meet the original, full design specs should be thrown away. That would raise prices significantly.
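The sorting logic this post describes can be sketched in a few lines of Python. To be clear, this is a hypothetical illustration, not any manufacturer's actual process: the tier names and core counts are invented.

```python
# Hypothetical sketch of core-count binning: a die designed with 6 CPU
# cores is tested after fabrication, failing cores are fused off, and
# the die goes to whichever product tier accepts its remaining core
# count. Tier names and thresholds are invented for illustration.
from dataclasses import dataclass

# Minimum working cores required by each (invented) product tier,
# checked from the highest tier down.
TIERS = [("full-spec tablet", 6), ("entry tablet", 5)]

@dataclass
class Die:
    working_cores: int  # result of post-fab testing

def bin_die(die: Die) -> str:
    """Every tested die is binned, perfect ones included. Only dies
    below the lowest tier's requirement are scrapped."""
    for tier, min_cores in TIERS:
        if die.working_cores >= min_cores:
            return tier
    return "scrap"

assert bin_die(Die(6)) == "full-spec tablet"  # perfect die: still "binned"
assert bin_die(Die(5)) == "entry tablet"      # one core fused off
assert bin_die(Die(3)) == "scrap"             # the only waste case
```

Note that the perfect die lands in a bin too, which is the point made above: "binned" describes the sorting step itself, not a defect.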
 
Everyone is angry that the iPad A16 doesn’t have Apple Intelligence, but things are getting worse
Hopefully none of the new iPhone 17/18/19 models have it either. I've been a devout Apple fanboy since they bought NeXT, and Apple Intelligence is making me hate their products so much. Ironically, I use various real AI products daily for work and elsewhere and am a big fan of the tech. The complete crap Apple has put out is mind-bogglingly bad. They even tried to co-opt the term AI, so simply talking about it is a pain because you have to make clear whether you mean Apple Intelligence (AI) or artificial intelligence (AI).
 
I wouldn't. You're not responding to the person who posted the search results.
I’m not sure what you’re saying. Who posted what search results, and where? I’m responding to your questions and comments here. If you think I should be responding to someone or something else, can you point me in the right direction?

Maybe 10-15 years ago, when the term “binned” began ramping up in how frequently it was used in articles and comments online (or at least that's about when I began seeing it more; it started in earnest in the 1990s), I hadn’t yet heard the term often enough to take the time to find out what it meant, even though I’ve been an electronics technician with an interest in just about all aspects of electronics since the mid-1970s.

Once I decided to look into it and did some searches online, I was, like you, confused by what seemed like multiple interpretations, and I asked pretty much all the same questions you’ve been asking, including wondering why there were conflicting ideas about it from people who seemed to know what they were talking about. I weighed all the explanations, separating wheat from chaff, and though it took a day or two for things to sink in, I figured out what it actually meant. I came to realize it was just a matter of not everyone describing the term entirely correctly, so I stopped paying attention to the wrong, or even half-right, explanations. As you say, “based on the responses, even the people who think they know are not in agreement with each other.” So I stopped expecting that everyone commenting about binning, past, present, or future, would always get it right.

The short explanation: when people disparagingly say manufacturers use binned chips in their lower-tier products, that’s a misuse of the term, since “binning” happens even to the perfect chips; they’re placed in their own bin. It’s likewise wrong to dismiss the lower-performing parts that end up in lower-priced, more power-efficient products as mere “binned parts.” Manufacturers can’t be expected to run one fab run that somehow produces nothing but perfect, high-end parts for the pricier, more power-hungry products, and a separate run, with a different die/mask layout, that produces perfect parts designed to lower specs.
 
You responded to the wrong person in the ongoing thread in this post.
No, I was responding to the questions you posted in this thread, wondering what "binning" means and why you're hearing conflicting explanations of it, specifically "I'm noting the confusing use of a term, which continues to confuse the more people offer up various conflicting explanations", and "I don't know what the "correct" usage is, which is why I asked. But based on the responses, even the people who think they know are not in agreement with each other. ¯\_(ツ)_/¯".
 