As someone who spent considerable time (19 years) in semiconductor manufacturing, here is my analysis:
Duh.

Chips aren't a "design it in our garage" product like the Apple I/II were. The pipelines are very long, and development time is 3-4 years from inception to delivery of the first chip.

So, they are already in the initial design of the A14/A15 chips now, and they are looking at new materials now in design/development. The holy grail in the pre-1990s was copper on chips. Aluminum (aluminium, Jony) was used for the vias and interconnects, but it had issues with voids in the plugs, which would kill the device. Using Cu on the layers above the Si (Cu attacks the silicon on the wafer, so it can't touch it directly) solved that problem, along with a bunch more (e.g., dissimilar metals at the plug/interconnect with W/Al).

Anyway, back to what I was saying: people don't just jump in with a chip design and make it happen overnight, or ship a new chip next week. This isn't 1959.
 
This should put to rest any of the theories about Apple making last-minute changes to the iPhone X because of Face ID and Touch ID. The design and features of the X were locked in at least two years ago. And for those who think last-minute changes are even possible: you know nothing about the manufacturing process.
 
My theory is that the hardware and software were there, but they weren't originally intended for authentication; probably for the Animoji stuff and AR. Auth was probably put together "last minute" (in software-development terms) after Touch ID wasn't possible.
That makes a lot of sense. I'm sure Apple spent hundreds of millions of R&D dollars on a 3D infrared optical sensor so we could all have animated emoji. The neural engine necessary for authentication was probably just purchased from Radio Shack at the last minute.
 
The Siri voice was recorded in 2006, while the application was released in 2011, so in that case Apple was working on technology five years into the future.

No. Besides the fact that Apple bought Siri...

The Siri voices were originally recorded for "a US company called ScanSoft, who were then bought by Nuance. Apple simply licensed it." - ref

Amazing they are so far ahead. It's kind of weird how all these companies then come out with the same kind of tech, even though a lot of it takes up to three years to reach market. Is that a mole sharing secrets, or some weird tech serendipity?

It's largely the fact that certain tech becomes feasible at a certain point in time.

As for Apple being "far ahead": Google, Microsoft, IBM, and others have been publishing papers for years and demonstrating/selling neural network chips.

Apple just doesn't share what they're doing, while using what others do. Apple fans who don't pay attention outside of Apple are surprised when Apple announces something.
 

I have also spent quite a bit of time in the chip/computer industry, 29 years in January to be precise.
People don't get it. The complexity of chips is not what it was years ago. This is not just a chip with some stuff on it; it's a true system on a chip (SoC). These chips are more complex than complete computers were a decade ago.
There's an on-chip interconnect that looks like, and is, a network, moving data from one side of the chip to all the peripherals.

You have both L1 and L2 caches on the chip, along with all the other subsystems.

Looking forward three years is nothing new and has been the standard for years. You need to develop a high-level architecture. You need to design all the components (RTL, Verilog), connect up the subsystems, and then do extensive verification, synthesis, power and clock-domain analysis, timing analysis, silicon implementation, and finally manufacturing.
You need all the IP, memory, etc. in place and tested before you can even tape out a shuttle. Assume there are at least two tapeouts, which take 3-4 months each.
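To put rough numbers on that, here's a back-of-the-envelope timeline. The phase durations below are illustrative guesses on my part, not any company's actual schedule:

```swift
// Illustrative phase durations only; real schedules vary by company.
let phases: [(name: String, months: Int)] = [
    ("High-level architecture",                  6),
    ("RTL design (Verilog) and verification",   12),
    ("Synthesis, timing, power/clock analysis",  8),
    ("Tapeout #1 and silicon bring-up",          4),
    ("Tapeout #2 (respin) and qualification",    4),
    ("Production ramp",                          4),
]
let totalMonths = phases.reduce(0) { $0 + $1.months }
print("\(totalMonths) months, about \(totalMonths / 12) years")
// Prints: 38 months, about 3 years
```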

This also means you need at least two teams working staggered schedules to deliver one project and then the next. As people roll off the first, they move to the next.

Anyway, there is nothing out of the ordinary here; it's what companies that make silicon and IP (intellectual property) do anyway.

This is not unique to Apple.
Qualcomm, Intel, AMD and a variety of others are all doing the same type of development cycle.
 
That makes a lot of sense. I'm sure Apple spent hundreds of millions of R&D dollars on a 3D infrared optical sensor so we could all have animated emoji. The neural engine necessary for authentication was probably just purchased from Radio Shack at the last minute.

Emoji is the big thing. It, with stickers, has been a highlight of the last two iOS releases. Like I said: authentication, not the whole neural and face stuff. But nice try.
 
The 1970s just called. Lee Majors wants his tech name back. But seriously, the problem with multi-core tech like this is that 90% of current apps won't even use the extra cores, so you won't even notice the difference.

Perhaps, but the smaller cores can be used for background tasks and such, freeing up the big cores for CPU-intensive activities.
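For a concrete sense of how software steers work onto the right cores, here's a minimal Swift sketch using Grand Central Dispatch quality-of-service classes. The OS, not the app, makes the final core-placement decision, so treat this as a hint mechanism rather than a guarantee:

```swift
import Dispatch

let group = DispatchGroup()

// Low-QoS work: the scheduler is free to keep this on the
// high-efficiency cores, leaving the performance cores free.
DispatchQueue.global(qos: .background).async(group: group) {
    print("housekeeping: indexing, sync, cleanup")
}

// High-QoS work: latency-sensitive, so the big cores are preferred.
DispatchQueue.global(qos: .userInteractive).async(group: group) {
    print("UI-critical work")
}

group.wait()  // keep the script alive until both blocks finish
```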
 
It's probably referring to the neural network inside the chip, used for recognizing things.

I like the name. But hey, I grew up in the '80s, and that word was thrown around a lot back then.

Remember when we thought bionic humans were going to be the future?
 
Yeah... I forgot about that. If something is changed, unless they hand-walk the lot through the fab (not so much with FOUPs these days), it takes a while to see the results of the change.

I remember one of those hand-carry, "the future of the factory is riding on this lot" runs, and it got stuck in a machine I was new to. I had to have the robot, under vacuum, pull the wafers out of the machine with the fab manager on the phone.

I asked the fab worker next to me to handle the call and to respectfully let the fab manager know that I needed to concentrate on getting those wafers out, or they would be destroyed. I heard the manager yelling on the phone, telling the guy to "get me on the ^%^$% phone, now!" (I was a contractor for the semiconductor manufacturing equipment company.) I shook my head and got the wafers out.

When I apologized to the fab manager for putting him off, he told me that the president of the company had wanted an update right then; he apologized for his temper and said I did the right thing. After that, whenever I was around them and needed to make a phone call, I'd tell him I needed to get that person on the '$%^#$' phone, to which we'd both smirk.

Anyway, I got out of the semiconductor cycle (18 months between "holy crap, I need to look for a new job" and "holy crap, what's with all of the overtime?") and went into aerospace.
 



Shortly after Apple's iPhone X event this week, the company's silicon chief Johny Srouji and marketing chief Phil Schiller sat down for an interview about its new A11 Bionic chip with Mashable's editor-at-large Lance Ulanoff.


One interesting tidbit mentioned was that Apple began exploring and developing the core technologies in the A11 chip at least three years ago, when the iPhone 6 and iPhone 6 Plus launched with A8 chips.

Apple's three-year roadmap can change if new features are planned, like the Super Retina HD Display in iPhone X.
Apple senior executives Phil Schiller, left, and Johny Srouji

In fact, Schiller praised Srouji's team for its ability to "move heaven and earth" when the roadmap suddenly changes.

The A11 Bionic's six-core design has two performance cores that are 25 percent faster, and four high-efficiency cores that are 70 percent faster, than the A10 chip in iPhone 7 and iPhone 7 Plus. Early benchmarks suggest the A11 Bionic is even on par with the performance of Apple's latest 13-inch MacBook Pro models.

The A11 chip is more efficient at multi-threaded tasks thanks to a second-generation performance controller that is able to access all six of the cores simultaneously if a particular task demands it.

The A11 chip also has an Apple-designed neural engine that handles facial recognition for Face ID and Animoji, and other machine learning algorithms. The dual-core engine recognizes people, places, and objects, and processes machine learning tasks at up to 600 billion operations per second, according to Apple. Apple's new iPhone 8, iPhone 8 Plus, and iPhone X are all equipped with an A11 chip.
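As a rough sanity check on that 600-billion figure: with an assumed per-inference cost (purely illustrative; Apple has not published one), the headroom for real-time recognition is clear:

```swift
// The per-inference cost below is a hypothetical number for scale only.
let engineOpsPerSecond = 600e9      // Apple's quoted figure
let assumedOpsPerInference = 2e9    // illustrative network cost
print(engineOpsPerSecond / assumedOpsPerInference)
// Prints 300.0: hundreds of recognitions per second of headroom
```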

In related news, Carnegie Mellon University's School of Computer Science has announced that Srouji will take part in a distinguished industry lecture on Monday, September 18 from 5:00 p.m. to 6:30 p.m. local time.

Full Interview: The Inside Story of the iPhone X 'Brain,' the A11 Bionic Chip

Article Link: Apple Started Developing A11 Bionic Chip When A8 Chip Was Released Three Years Ago



 
Marketing BS.

All of 3 years to design facial animated poo-emojis and a perma-turd on the display. Whatever Apple spews out these days in marketing material is so offensive, yet the cult following refuses to put away their wallets.

The tragedy of iOS since iOS 7.0: year after year they stack and change the UX, from something usable to something that resembles Android 2.0.

I mean, all of 3 years to design an A11 Bionic CPU, while the iPad Pros all have wasted real estate and there are no touchscreen MacBooks. From removing the headphone jack to no USB-C on their flagship device. And worse yet, more dongles and cables to sell. Fast charging isn't right out of the box, either.

Everything about Apple, Inc. these days is profit margins. Sure, it looks good on the books, but don't even compare the Apple of 2017 to the Apple of the Steve Jobs era. At least the man cared more about aesthetics, UX, and overall packaging than profits, because he knew the money would follow with an Apple product.

The same design since iPhone 6, and no matter how you spin it, it's the same. Worse yet, you've got that even larger protrusion on the back and a perma-turd notch on the front. The next X2 model will most likely feature a perma-turd on the bottom, just to even out the look of the front.

iOS's biggest failure is iOS itself. All that power for what? iOS isn't exactly an OS that is great at doing work. For consumption, an iPhone 6 is more than fine.
 
Please don't tell me that tech's un-grammar culture has now turned "architect" into a verb. Such poor usage disgusts me.
 
Ah, you don't say! I appreciate the input; I wasn't aware. Are most of the mainstream Qualcomm chips already using this approach? My current phone runs an eight-core Snapdragon 810.

Yes, half of the cores in the 810 are low-perf, low-power.

Are Apple processors just rebranded ARM chips? Apple has a lot of input on the design, no?

---Edit---
After further investigation, I see that ARM is a licensed technology. Even Snapdragon processors use it. Interesting.

No, Apple does much more than rebrand.

ARM offers two licenses: you can license their designs, or the architecture.

When you license a design, such as the ARM Cortex series, you essentially get a finished CPU. Most companies do this, because it's far less effort.

When you license the architecture, such as ARMv8-A, you essentially get a specification of how your chip should behave; you do not get a chip.

Your Snapdragon 810 is an example of a design license — it just combines eight ARM Cortex cores with other chips of Qualcomm's. Newer Snapdragons like the 820, however, only use an architecture license; instead of the Cortex, Qualcomm implemented their own Kryo design.

It's the same with Apple. Up until the A5 (and A5X), Apple used ARM designs such as the ARM11 and ARM Cortex-A9. Starting with the A6, they implemented their own design (with weather-themed codenames such as Twister).

(Also, ARM was originally a joint venture co-founded by Apple, but that was a quarter century ago.)
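A toy Swift model of the two routes, using the chip-to-license mapping from this post (a simplified sketch, not ARM's official terminology):

```swift
// Two ways to ship an ARM-compatible CPU, per the explanation above.
enum ARMLicense {
    case design(core: String)       // you get a finished CPU core
    case architecture(isa: String)  // you get the ISA spec; you build the core
}

let snapdragon810 = ARMLicense.design(core: "Cortex-A57/A53")
let snapdragon820 = ARMLicense.architecture(isa: "ARMv8-A")  // Kryo
let appleA5       = ARMLicense.design(core: "Cortex-A9")
let appleA6       = ARMLicense.architecture(isa: "ARMv7")    // custom "Swift" core
```
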
Please don't tell me that tech's un-grammar culture has now turned "architect" into a verb. Such poor usage disgusts me.

'To architect' has been a verb since at least 1857.
 
I love how the chip got years of research and development, but implementing the phone and its supply process got three months beforehand. Lol
 
The first way is to license an ARM design, which is effectively a blueprint that will allow you to fabricate a chip that ARM designed down to the transistor level. This is what Apple did up until it released its first 64-bit SoC: it took ARM cores that had already been designed by ARM and incorporated them into its own Apple A4, A5, etc., obviously adding some Apple-specific bits as well (interfaces, etc.).

Picking nits: the A6 was a custom design and wasn't yet 64-bit.

The quintessential MacRumors post, best read with a laugh track.

Which part of your post, in particular, is even remotely relevant to the topic of chip design?

Do you believe the people who design "animated poo-emojis", the iOS 7 UX, or the iPhone 6 exterior got shuffled around and now do hardware instead? Or vice versa? Do you believe lower profit margins would have led to better chips? Are you arguing the chips aren't good enough? Or that Apple should license chips instead?
Right, it is definitely conceptually similar to big.LITTLE, and Apple is "late" to adopting it. Funny thing, though: Apple was killing the rest of the mobile industry even before they adopted a big.LITTLE-like setup. They were doing just fine with two powerful, wide, deep cores, beating 4- and 8-core arrangements from other vendors.

Yup. They were late to this party because it didn't matter; their single-core performance destroys competing ARM designs.
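Amdahl's law makes the point concrete: if most of a workload is serial, piling on cores barely helps, so per-core speed dominates. A small sketch (the parallel fraction below is an assumed example value):

```swift
// speedup = 1 / ((1 - p) + p / n), p = parallel fraction, n = cores
func speedup(parallelFraction p: Double, cores n: Double) -> Double {
    1.0 / ((1.0 - p) + p / n)
}

// A mostly-serial app: eight cores give less than 1.4x...
print(speedup(parallelFraction: 0.3, cores: 8))  // ~1.36
// ...so two cores that are each 2x faster per thread win easily.
```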
 

Relax. It is just a phone.

Go outside and enjoy your family and friends.
 
I'm pretty sure they are done with Touch ID in iPhones forever.

Apple states Face ID is the future.

At first I thought they would slip it back in next year, but I don't think they are looking back, not on flagship iPhone models.
Hence the Wayne Gretzky reference at the keynote...

I skate to where the puck is going to be, not where it has been.
 