
MacRumors



Apple is investigating generative AI to accelerate its custom chip design, according to remarks by hardware chief Johny Srouji last month.


Srouji outlined the company's interest in AI-assisted chip design during a speech in Belgium, where he received an award from semiconductor research group Imec. Reuters was able to review a recording of his remarks.

"Generative AI techniques have a high potential in getting more design work in less time, and it can be a huge productivity boost," Srouji said. He also talked about the role of electronic design automation companies like Cadence and Synopsys, which he said are "critical in supporting our chip design complexities." Both companies are said to be racing to integrate AI into their chip design software.

The news comes as Apple faces scrutiny over its perceived lag in consumer-facing AI. While competitors like Google and OpenAI have dominated headlines, Apple Intelligence has struggled to match rival offerings. Srouji's comments suggest Apple may be taking a more holistic approach – leveraging AI behind the scenes for chip development rather than focusing solely on user-facing features.

During his speech, Srouji traced Apple's silicon journey from the first A4 chip in 2010 to today's processors, and took time to underline Apple's bold decision-making during the 2020 Mac transition from Intel to Apple Silicon.

"Moving the Mac to Apple Silicon was a huge bet for us," he explained. "There was no backup plan, no split-the lineup plan, so we went all in, including a monumental software effort."

By all accounts, the transition was an unqualified success, with Apple's M-series chips delivering dramatic gains in performance per watt, battery life, and thermal efficiency while maintaining broad software compatibility.


Article Link: Apple Considers Using Generative AI for Faster Apple Silicon Design
 
I'm so done with everything being called "AI". I'd be absolutely shocked if Apple wasn't already using some form of algorithmic machine learning to help accelerate the pace of technical designs.

Then again, the next hype word might be even more annoying.

As soon as "AI" is no longer able to perform the magical trick of thinning our wallets, there'll be something else that's different, but the same.
 
The longer LLMs have been around, the more convinced I am that Apple is using them correctly: not as an end-all standalone app, but as a capability for existing workflows. General LLMs have huge issues when specific output is needed.

I'm sure Apple can fine-tune a model for chips, and it'll be helpful in their chip design workflow, even if it's just to iterate and find designs that won't work.
 
The funny thing about the perception of Apple being behind in AI is that the only reason they're "behind" is that they're trying to come up with privacy-respecting features that actually help people, without people having to learn how to use AI. To me, that sounds like they're refined and ahead in AI. At least they aren't bombarding me with annoying advertising to shove it in my face like Microsoft did.
 
The funny thing about the perception of Apple being behind in AI is that the only reason they're "behind" is that they're trying to come up with privacy-respecting features that actually help people, without people having to learn how to use AI. To me, that sounds like they're refined and ahead in AI. At least they aren't bombarding me with annoying advertising to shove it in my face like Microsoft did.

You're only ahead if you're actually producing something. Apple, as far as anyone can tell, doesn't have anything functional.
 
I used AI to write hundreds of lines of code yesterday for my job. It's fairly annoying. It writes a huge volume of code that looks right - like, I'm shocked at how many things it gets right - but then it face plants on some things where I'd be thinking about putting someone into a Performance Improvement Plan if I saw an employee making similar mistakes.

I think... the mistakes expose AI as exactly what some have been saying it is: sophisticated pattern matching/autocomplete. It doesn't actually understand anything about what it's doing.
 
I used AI to write hundreds of lines of code yesterday for my job. It's fairly annoying. It writes a huge volume of code that looks right - like, I'm shocked at how many things it gets right - but then it face plants on some things where I'd be thinking about putting someone into a Performance Improvement Plan if I saw an employee making similar mistakes.

I think... the mistakes expose AI as exactly what some have been saying it is: sophisticated pattern matching/autocomplete. It doesn't actually understand anything about what it's doing.

I agree with this 100%. When the first GPTs came out I was stoked, but I ended up wasting a tremendous amount of time trying to get them to really write code. Over the last couple of years I've just used them as a helper: they can stub files and help with some methods iteratively, but they can't do it all at once. Usually it's just faster to write it yourself. I do use Claude to discuss design patterns and system API usage, things like that, which is helpful just to make sure I'm on the right track and haven't overlooked anything.

I've had good success setting up a test file, creating all the mocked dependencies, and then getting it to write tests to verify each code path in the SUT. It's not perfect; I'll usually have to fix things just to get it to compile, but it does save time, and every so often it finds a bug in the system it's testing, which is helpful.
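As a rough sketch of that workflow in Swift/XCTest (the PriceService, MockPriceService, and PriceFormatter names are hypothetical examples, not anything from the poster's actual project), the hand-written part is the mock and the SUT wiring, and the per-code-path tests are the mechanical part an LLM can fill in:

import XCTest

// Hypothetical dependency protocol the SUT relies on.
protocol PriceService {
    func fetchPrice(for symbol: String) throws -> Double
}

// Hand-rolled mock: each test controls the dependency's behavior via `result`.
final class MockPriceService: PriceService {
    var result: Result<Double, Error> = .success(0)
    func fetchPrice(for symbol: String) throws -> Double {
        try result.get()
    }
}

// The SUT: formats a fetched price, falling back to "N/A" on failure.
struct PriceFormatter {
    let service: PriceService
    func displayPrice(for symbol: String) -> String {
        guard let price = try? service.fetchPrice(for: symbol) else {
            return "N/A"
        }
        return String(format: "$%.2f", price)
    }
}

final class PriceFormatterTests: XCTestCase {
    // Code path 1: the dependency succeeds.
    func testDisplayPriceFormatsSuccessfulFetch() {
        let mock = MockPriceService()
        mock.result = .success(123.456)
        let sut = PriceFormatter(service: mock)
        XCTAssertEqual(sut.displayPrice(for: "AAPL"), "$123.46")
    }

    // Code path 2: the dependency throws.
    func testDisplayPriceFallsBackOnError() {
        struct FetchError: Error {}
        let mock = MockPriceService()
        mock.result = .failure(FetchError())
        let sut = PriceFormatter(service: mock)
        XCTAssertEqual(sut.displayPrice(for: "AAPL"), "N/A")
    }
}

The point of structuring the mock around a single Result is that covering each branch of the SUT is just a matter of swapping .success for .failure, which is exactly the kind of repetitive enumeration an LLM tends to handle well.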
 
I'm so done with everything being called "AI". I'd be absolutely shocked if Apple wasn't already using some form of algorithmic machine learning to help accelerate the pace of technical designs.

Then again, the next hype might be even more annoying.

As soon as AI is no longer able to perform the magical trick of thinning our wallets, there'll be something else that's different, but the same.
This reminds me of when people used to make fun of the internet and call it "hype" in the early to mid '90s.
 