
MacRumors

macrumors bot
Original poster
Apr 12, 2001
52,111
13,741



Steven Levy has published an in-depth article about Apple's artificial intelligence and machine learning efforts, after meeting with senior executives Craig Federighi, Eddy Cue, Phil Schiller, and two Siri scientists at the company's headquarters.


Apple provided Levy with a closer look at how machine learning is deeply integrated into Apple software and services, led by Siri, which the article reveals has been powered by a neural-net based system since 2014. Apple said the backend change greatly improved the personal assistant's accuracy.
"This was one of those things where the jump was so significant that you do the test again to make sure that somebody didn't drop a decimal place," says Eddy Cue, Apple's senior vice president of internet software and services.
Alex Acero, who leads the Siri speech team at Apple, said Siri's error rate has been lowered by more than a factor of two in many cases.
"The error rate has been cut by a factor of two in all the languages, more than a factor of two in many cases," says Acero. "That's mostly due to deep learning and the way we have optimized it -- not just the algorithm itself but in the context of the whole end-to-end product."
Acero told Levy he was able to work directly with Apple's silicon design team and the engineers who write the firmware for iOS devices to maximize performance of the neural network, and Federighi added that Apple building both hardware and software gives it an "incredible advantage" in the space.
"It's not just the silicon," adds Federighi. "It's how many microphones we put on the device, where we place the microphones. How we tune the hardware and those mics and the software stack that does the audio processing. It's all of those pieces in concert. It's an incredible advantage versus those who have to build some software and then just see what happens."
Apple's machine learning efforts extend far beyond Siri, as evidenced by several examples shared by Levy:
You see it when the phone identifies a caller who isn't in your contact list (but did email you recently). Or when you swipe on your screen to get a shortlist of the apps that you are most likely to open next. Or when you get a reminder of an appointment that you never got around to putting into your calendar. Or when a map location pops up for the hotel you've reserved, before you type it in. Or when the phone points you to where you parked your car, even though you never asked it to. These are all techniques either made possible or greatly enhanced by Apple's adoption of deep learning and neural nets.
Another product born out of machine learning is the Apple Pencil, which can detect the difference between a swipe, a touch, and a pencil input:
In order for Apple to include its version of a high-tech stylus, it had to deal with the fact that when people wrote on the device, the bottom of their hand would invariably brush the touch screen, causing all sorts of digital havoc. Using a machine learning model for "palm rejection" enabled the screen sensor to detect the difference between a swipe, a touch, and a pencil input with a very high degree of accuracy. "If this doesn't work rock solid, this is not a good piece of paper for me to write on anymore -- and Pencil is not a good product," says Federighi. If you love your Pencil, thank machine learning.
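Palm rejection is, at its core, a per-touch classification problem. A minimal sketch of that framing follows; the features, thresholds, and hand-written rules here are invented for illustration, standing in for the trained model and sensor inputs Apple hasn't disclosed:

```python
# Toy palm-rejection classifier. A real system feeds touch-sensor
# features into a trained machine learning model; the features and
# cutoffs below are invented purely to illustrate the framing.
from dataclasses import dataclass

@dataclass
class Touch:
    contact_area_mm2: float   # palms make much larger contact patches
    major_minor_ratio: float  # elongated blobs suggest a resting hand
    pressure: float           # normalized 0..1

def classify(t: Touch) -> str:
    """Label a touch event as 'pencil', 'palm', or 'finger'."""
    if t.contact_area_mm2 < 3 and t.pressure > 0.2:
        return "pencil"       # tiny, firm contact: a stylus tip
    if t.contact_area_mm2 > 80 or t.major_minor_ratio > 2.5:
        return "palm"         # large or elongated: reject it
    return "finger"           # everything else: an ordinary tap

print(classify(Touch(1.5, 1.0, 0.6)))   # pencil
print(classify(Touch(120, 3.1, 0.1)))   # palm
```

The advantage of a learned model over rules like these is exactly the "rock solid" accuracy Federighi describes: hand-tuned thresholds break down on edge cases that training data covers.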
On the iPhone, machine learning is enabled by a localized dynamic cache or "knowledge base" that Apple says is around 200MB in size, depending on how much personal information is stored.
This includes information about app usage, interactions with other people, neural net processing, a speech modeler, and "natural language event modeling." It also has data used for the neural nets that power object recognition, face recognition, and scene classification.

"It's a compact, but quite thorough knowledge base, with hundreds of thousands of locations and entities. We localize it because we know where you are," says Federighi. This knowledge base is tapped by all of Apple's apps, including the Spotlight search app, Maps, and Safari. It helps on auto-correct. "And it's working continuously in the background," he says.
Apple, for example, uses its neural network to capture the words iPhone users type using the standard QuickType keyboard.
Other information Apple stores on devices includes probably the most personal data that Apple captures: the words people type using the standard iPhone QuickType keyboard. By using a neural network-trained system that watches while you type, Apple can detect key events and items like flight information, contacts, and appointments -- but information itself stays on your phone.
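As a rough illustration of on-device event detection, the sketch below uses simple regular expressions as a stand-in for the neural model Apple describes; the patterns are invented, and the point is only that everything runs locally:

```python
# Toy on-device event detector. Apple describes a neural model that
# spots items like flight numbers and appointment times in typed text;
# this sketch substitutes simple regexes, and nothing leaves the device.
import re

PATTERNS = {
    "flight": re.compile(r"\b([A-Z]{2})\s?(\d{2,4})\b"),
    "time":   re.compile(r"\b(\d{1,2}:\d{2}\s?(?:am|pm)?)\b", re.I),
}

def detect_events(text: str):
    """Return pattern-name -> matched substrings, all computed locally."""
    return {name: ["".join(m) if isinstance(m, tuple) else m
                   for m in rx.findall(text)]
            for name, rx in PATTERNS.items()}

hits = detect_events("My flight UA 1123 lands at 6:45 pm")
print(hits["flight"], hits["time"])   # ['UA1123'] ['6:45 pm']
```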
Apple insists that much of the machine learning occurs entirely local to the device, without personal information being sent back to its servers.
"Some people perceive that we can't do these things with AI because we don't have the data," says Cue. "But we have found ways to get that data we need while still maintaining privacy. That's the bottom line."

"We keep some of the most sensitive things where the ML is occurring entirely local to the device," Federighi says. As an example, he cites app suggestions, the icons that appear when you swipe right.
The full-length article on Backchannel provides several more details about how machine learning and artificial intelligence work at Apple.

Article Link: Apple's Machine Learning Has Cut Siri's Error Rate by a Factor of Two
 

Relentless Power

macrumors Nehalem
Jul 12, 2016
35,305
38,012
It's only my opinion, but Siri still struggles with the most basic tasks. Overall Siri has improved over the years, and I do appreciate her, but I'm hoping the advancements keep coming to make her more efficient.
 
Comment

LordQ

Suspended
Sep 22, 2012
3,582
5,651
I would love more functionality and fewer jokes. Yesterday I asked her to show me Foxes music on iTunes and got the common "I don't understand what you mean by blabla...".

If they added translations like Cortana I'd be a very happy chap. I watch a lot of video recipes and would love to ask Siri stuff like "How do you say shallots in Spanish?"
 
Comment

840quadra

Moderator
Staff member
Feb 1, 2005
8,284
3,582
Twin Cities Minnesota
I have a different experience. Siri is better at understanding difficult-to-pronounce names, doesn't require corrections as often, and makes far fewer errors when I am using Bluetooth in my car. I rarely have to talk in a robot voice for it to understand me, whereas when it first came out that was standard practice.
 
Comment

keysofanxiety

macrumors G3
Nov 23, 2011
9,534
25,273
Apple insists that much of the machine learning occurs entirely local to the device, without personal information being sent back to its servers.

And yet it still has no offline functionality, even if the learning is localised; well, from what I can see, anyway. An iPhone 3GS managed offline voice functions like "call Tom". Why the heck can't Siri? Phone, text, open an app -- these are not commands that need to be bounced off a server.

It's fine if you're at Apple HQ, testing Siri on a 10Gb/s Internet connection. But when you're on the road with flaky 3G signal, it's the last thing you need.
 
Comment

Oblivious.Robot

macrumors 6502a
Sep 15, 2014
760
2,018
Yeah, no.
Siri is still pretty much useless for me. Better than before, but still meh with her current capabilities.


Edit - For the screen brightness part, after some retries I found out that it does work, but you have to say it in one sentence as Siri forgets about it in the next. :confused:
 
Last edited:
Comment

ike1707

macrumors 6502
Jan 20, 2009
404
830
I take bigger issue with things like the "play the rest of this album" command shuffling my entire catalog for that artist instead, or saying "audiobook" to play my audiobook one day and then needing "play audiobook" the next day for it to even understand my command.

Speech recognition is great, important really, but I would appreciate them fixing all the other sh*t she doesn't do right too.
 
  • Like
Reactions: LankyNibbs
Comment

npmacuser5

macrumors 65816
Apr 10, 2015
1,314
1,380
Others, like Amazon, figured out that six microphones and noise canceling are essential for good voice recognition. It's like talking to a person who has limited hearing: "What did you say?" The old garbage in, garbage out. A suggestion for Apple: have Siri bring up a text screen of what she heard after so many tries. Amazon has a feature that lets the user see what Alexa heard, make a correction, and feed it back to Amazon. Nice to see more research and creative thinking going on. Voice recognition has a good future.
 
Comment

wizard

macrumors 68040
May 29, 2003
3,854
571
Well, the hype machine is in full gear. Honestly I haven't used Siri in months, but it was no better then than it had been for me a few months before. Simple things like driving along and asking for a map to destination xx just fail miserably. It's like Siri doesn't comprehend that asking for a map to xx means popping up the Maps app with the directions ready to go. Siri has simply been terrible for me.

The interesting thing here is that they are claiming machine learning techniques run on the device. This makes me wonder how far away we are from an A-series chip from Apple with built-in neural network hardware. That would do more to advance the platform than another core, as it would dramatically improve machine learning. So the question is, Apple: how soon will the A-series chips have built-in neural network hardware?
 
Comment

themachugger

macrumors member
Aug 26, 2010
64
171
If Siri is learning, she's learning disabled. She constantly thinks I said "Skip 32nd" and goes looking for a movie called "Skip 32nd" on my Apple TV when it should be OBVIOUS I want it to SKIP THIRTY SECONDS. I could forgive her making the mistake once or twice but she never seems to figure it out.
 
Comment

EdT

macrumors 68000
Mar 11, 2007
1,881
1,593
Omaha, NE
Siri on my phone and autocorrect on any Apple device (iPad, MacBook, or desktop) have gone from somewhat useful to irritating. When it gets something wrong it doesn't accept corrections, usually insisting on changing it back to the wrong word even if I go back and type in the correct word or spelling. My niece Caitlin is not Kaitlyn, but Siri insists on spelling it that way, and autocorrect will change it back if I manually change it. I have put 'Caitlin' as a shortcut for 'Kaitlyn' but that doesn't fix the problem. And typing this message on an iPhone 6 meant retyping Kaitlyn several times; the last, misspelled instance above I didn't go back and correct. There are other words, spellings, and phrases that Siri seldom if ever gets right, and typing corrections do not seem to improve her accuracy.
 
Last edited:
Comment

thisisnotmyname

macrumors 68020
Oct 22, 2014
2,394
5,036
known but velocity indeterminate
I like Siri on the iPhone when it works, but at times it totally gets things very wrong that it previously got right.

What really gets me ticked off is how Siri on ATV is worse than useless almost every time I try to make use of it.

I agree. On the phone she works well for me. On the AppleTV it seems like she goes to sleep and the first time I try to interact with her such as "what did he just say?" it lags so long that the ten second rewind is well past whatever I wanted to hear. And then there's the lack of Siri search through home sharing content...
 
Last edited by a moderator:
Comment

vp719

macrumors regular
Jan 13, 2007
132
30
If Siri is learning, she's learning disabled. She constantly thinks I said "Skip 32nd" and goes looking for a movie called "Skip 32nd" on my Apple TV when it should be OBVIOUS I want it to SKIP THIRTY SECONDS. I could forgive her making the mistake once or twice but she never seems to figure it out.

This happens all the time, drives me nuts.
 
Comment

wigby

macrumors 68020
Jun 7, 2007
2,080
1,685
Siri still sucks for me... still doesn't work well enough to be useful.
This story is a great example of Apple's scale problem. Since they do so much international business, when they say they've improved Siri by 2X, it's true. The problem is that anecdotally, for most of us, those improvements are not perceived. I only use Siri in English and for a very small set of tasks so I see no improvement and I assume most of us are in the same boat. If you surveyed other Siri users across 100 countries, I bet we would see a bigger pattern of improvement.
 
  • Like
Reactions: Ghost31
Comment

Rogifan

macrumors Core
Nov 14, 2011
22,312
28,059
This all seems a little defensive to me. The flavor of the year according to the tech press is AI/ML so now Apple's out there saying we do this too!
 
  • Like
Reactions: samcraig
Comment

samcraig

macrumors P6
Jun 22, 2009
16,637
41,619
USA
This all seems a little defensive to me. The flavor of the year according to the tech press is AI/ML so now Apple's out there saying we do this too!

Indeed. I also think the terms AI/ML have been diluted. They're becoming the new "cloud" and "big data".
 
Comment