I read somewhere that the A10 will also be 16nm, not 10nm. At least the A10 from TSMC will be 16nm. I'm not sure whether Samsung will also get an A10 order from Apple, but now I have a feeling they won't.

I mean the generation of the silicon manufacturing process, not the generation of the A-series processor. It's unlikely the A10 will be 10nm. As far as I know, neither Samsung nor TSMC will be ready for 10nm mass production until late next year, which would miss the A-series production schedule.
 
There is a standard spec and test methodology for all silicon chips called the 'process corner'. The test procedure varies the supply voltage and the ambient temperature. When you outsource silicon chips, you must design the specs and set thresholds for each corner. If the test results hit the designated targets, the shipment is acceptable.

Of course, chips vary from chip to chip. There are superior and inferior individuals even within the same shipment. So chances are you'll get one chip that runs well in the high-temperature corner while another from the same shipment performs poorly there. To cover this, designers tend to over-spec the design. This is why you may sometimes be able to 'overclock' your computer while someone else's fails, on the same model.
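The acceptance logic described above can be sketched roughly like this. All corner names, voltages, temperatures, and frequency targets below are made-up illustrations, not any real foundry spec:

```python
# Hypothetical sketch of a process-corner acceptance check.
# Each corner is a (supply voltage, ambient temperature) test condition
# with a minimum frequency target the chip must meet.
CORNERS = {
    "slow":    {"vdd": 0.90, "temp": 125, "min_freq_mhz": 1600},
    "typical": {"vdd": 1.00, "temp": 25,  "min_freq_mhz": 1800},
    "fast":    {"vdd": 1.10, "temp": -40, "min_freq_mhz": 1900},
}

def shipment_acceptable(measured: dict) -> bool:
    """A chip passes only if it meets the frequency target at every corner."""
    return all(
        measured[name] >= spec["min_freq_mhz"]
        for name, spec in CORNERS.items()
    )

# A "superior" chip clears every corner with margin; an "inferior" one
# may pass the typical corner but fail the hot/slow corner.
good_chip = {"slow": 1650, "typical": 1850, "fast": 1950}
weak_chip = {"slow": 1550, "typical": 1820, "fast": 1900}

print(shipment_acceptable(good_chip))  # True
print(shipment_acceptable(weak_chip))  # False
```

Both chips above would ship if only the typical corner were tested, which is why the spec has to cover the worst-case corners too.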

The truth is: Apple chose Samsung as the supplier in the beginning. The reason they chose Samsung is quite simple: TSMC's 16nm production line was not ready in early 2014. I know that because I participated in TSMC's planning. Apple designed the spec, and Samsung met the standard. But Samsung failed to meet the quantity due to low yield, so Apple transferred some of the order to TSMC. However, it turns out the TSMC chip outperforms the Samsung chip in some circumstances.

In other words, the performance of the iPhone 6s with the Samsung chip should be considered the 'standard' model. The iPhone 6s with the TSMC chip is 'superior' to the original design.

This is exactly what happened here.

Apple Inc. May Have Made a Huge Mistake in Having Samsung Build the A9

"the TSMC 16nm FinFET Plus process features better electrical characteristics than the Samsung 14-nanometer process."

"In fact, that same source informed me that TSMC's A9 yields are twice those of Samsung's"

"It's no surprise that Apple is going all TSMC with the A10"

http://www.fool.com/investing/gener...may-have-made-a-huge-mistake-in-having-s.aspx
 
To put it simply, the Samsung chip is the inferior version.

You'll hear more "interesting" rumors about TSMC vs. Samsung if you're in this industry. All I can say is that the A9 is not the only incident like this in recent years. If you compare these two companies side by side, Samsung will always show you spectacular numbers, on paper.

However, I can't say I like TSMC better, either. As I said, I participated in their planning. And I can swear in the name of any spirit that I'll never take a job from that blood-sucking enterprise again in my life.
 
The only thing that sucks is the absolute lunacy displayed here. Even if there is a difference, it won't be significant for 99% of users. For the 1% of you here, obsessing and driving everyone crazy over your concerns, the difference will be 10 minutes of battery life as you run 14 high-CPU games at once.

The article says that over 50% CPU load causes serious battery drain on the Samsung chip. Do 99% of users really never exceed 50% CPU load? I doubt it.
 
"According to my sources"

BS

I haven't heard Sammy's yield numbers, but I can confirm that they were indeed in trouble. That's why TSMC got the order back during the second half of 2014.

However, this doesn't change the fact that TSMC lost the A9 order to Samsung in the first place, simply because they weren't ready. As a result, my project was abruptly cancelled, and we got nothing at all.
 

With your experience: is the Samsung chip really only 1-2% different in battery life compared to the TSMC chip in this A9 case, for most normal users?
 
The Ars Technica article has been updated:

"Update: To clarify exactly what Xcode's Activity Monitor is telling us, remember that every logical CPU core is tracked individually, so for a dual-core CPU like the A9, 'full utilization' is 200 percent, 100 percent for each core. The Geekbench test is putting about 30 percent load on each core for a total of 60 percent. For comparison, the relatively light but modern iOS game Shooty Skies oscillates between 30 and 70 percent depending on how many objects are being drawn on screen."
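The per-core accounting in that update is just a sum over logical cores. A minimal sketch of how Xcode's Activity Monitor counts load, using the figures quoted above:

```python
def total_load_percent(per_core_loads):
    """Xcode-style accounting: each logical core contributes up to 100%,
    so a dual-core chip like the A9 tops out at 200%."""
    return sum(per_core_loads)

# Geekbench: ~30% on each of the A9's two cores -> 60 out of a 200 ceiling.
print(total_load_percent([30, 30]))    # 60
# Both cores fully loaded:
print(total_load_percent([100, 100]))  # 200
```

So Geekbench's "60 percent" is 60 out of 200, i.e. less than a third of the chip's total capacity, which is why a light game like Shooty Skies can land in the same range.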
 
EXACTLY! Most people are power users of their smartphones; that's what the freaking things are made for, and I don't know anyone who lets their phone sit in standby most of the day, only occasionally checking email and texting. Most people watch videos, play games, and use other CPU/GPU-intensive apps. So for people to say that's not an important factor, or that most people don't stress the hardware in their phone that way, is absurd.

Furthermore, Apple used two different chip manufacturers to meet anticipated demand for this phone. Do you really think they won't do the same for next year's model?
Yeah, and apparently Apple is increasing processor performance for exactly that kind of use... Claiming that it isn't normal usage strikes me as inconsistent: on one hand they keep increasing performance, on the other they say normal usage is reading email, which the first-generation iPhone's processor already handled fine.
 
Apple apologists will always defend them and keep getting ripped off.

Or Apple alarmists will always spread FUD based on non-issues, and panic for no reason.

Really, some people in this thread are hilarious, looking for the tiniest hair in the soup. Like every year.

If you think Apple is doing a miserable job, go and look elsewhere. The other 98% will just enjoy their iPhones.
 
Lol, pitchforks down everyone.

Apple has said a 2-3% difference and now benchmarks show it.

Just use the device and stop worrying about such stupid things.
 
Although the difference is small, it appears that in every. single. test. the TSMC chip comes out on top; it's just a matter of how narrow the victories are. And for gamers or media content creators, the difference could certainly be noticeable, because those use cases max out the processor. I submit that there is a measurable minority of customers who game a lot and are out making videos frequently. The fact that it's possible to reliably demonstrate a 1 hour 50 minute difference between the chips is, to me, enough to warrant a return if the customer requests it within the 14-day window. This is the compromise that has evolved: Apple builds the cost of expected returns into the price, and we in turn get the right to exercise the return privilege if we so desire... and 99% of people won't exercise it.
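For scale, here is what a 1 hour 50 minute gap works out to as a percentage. The 8-hour baseline runtime below is an assumption for illustration only, not a figure from any of the tests:

```python
# How large is a runtime gap relative to a baseline runtime?
# gap of 1 h 50 min = 110 minutes; baseline of 8 h is assumed, not measured.
def gap_percent(gap_minutes: float, baseline_minutes: float) -> float:
    """Gap expressed as a percentage of the baseline runtime."""
    return 100.0 * gap_minutes / baseline_minutes

gap = 1 * 60 + 50   # 110 minutes
baseline = 8 * 60   # assumed 8-hour baseline runtime

print(round(gap_percent(gap, baseline), 1))  # 22.9
```

Under that assumed baseline the gap is over 20%, which is why the worst-case benchmark numbers look so different from Apple's 2-3% real-world claim.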

There is a real difference between the chips, and many on this board care about it. Those who don't would never visit a forum like this. All told, I think the increase in the return rate from this "chipgate" is less than 1%.
 
So Apple's unproven claim that real-life usage would differ by only 1-2% is not true. As long as the user runs any intensive app, you can expect the battery to drain 20-35% faster with the Samsung chip. The claim doesn't hold up even if you only use light apps or the basic functions of the iPhone.
 
Everyone saying there is a vast difference is ignoring that all the tests except Geekbench exercise the phone in normal use: web browsing, gaming, graphics-intensive apps, and scripting apps. Geekbench is not a battery benchmark; it pushes the chip to 60-70% usage, which will never happen in real-world usage, even during games and movies, because its benchmark is designed for short bursts of CPU performance, not four hours of the CPU maxed out. It's an unrealistic test.

The only workload close to the Geekbench benchmark is editing an hour-long 4K file that takes four hours to process at high CPU usage; and even then, both chips will process the whole edited file.


Anyone who thinks otherwise either has no idea how usage works or is simply trolling. Opening Instagram, Maps, Twitter, and email along with an hour of Candy Crush is not the definition of power usage.

The Ars article has been updated:
"Update: To clarify exactly what Xcode's Activity Monitor is telling us, remember that every logical CPU core is tracked individually, so for a dual-core CPU like the A9, 'full utilization' is 200 percent, 100 percent for each core. The Geekbench test is putting about 30 percent load on each core for a total of 60 percent. For comparison, the relatively light but modern iOS game Shooty Skies oscillates between 30 and 70 percent depending on how many objects are being drawn on screen."
 

So what does this mean?
 