I miss the days when Apple was the best at doing everything.
That was when they did very, very few things, though. I mean, they had the first commercially successful GUI on a PC; that was the point when they really were the best at everything they did. But after that, they’ve always had one product line that dominates while the rest kind of dangle. Right now, everything but their mini PC line and maybe the watch – although that bar is very low – kind of dangles.
 
That was when they did very, very few things, though. I mean, they had the first commercially successful GUI on a PC; that was the point when they really were the best at everything they did. But after that, they’ve always had one product line that dominates while the rest kind of dangle. Right now, everything but their mini PC line and maybe the watch – although that bar is very low – kind of dangles.
Probably the worst dangle is the grouping of two processors (Max and Ultra) in the Mac Studio, which is still M2-based. They should have sold the models separately because of the delays in Ultra production. Yes, they’re treating it as badly as the Mac Pro. Then we turn to AI: where are the AI query summaries for an enhanced Safari that would actually be useful? Macs can work around this, but not iOS/iPadOS. We certainly all have mixed opinions about AI up to 18.3. We hope that after the WWDC sales pitch, 18.4 is worth it. ;)
 
Wasn't bringing in any revenue, so it got no investment, basically.
Excellent point. Thinking back, that's the first time I've seen that business logic spelled out, but yes: Apple is so fixated on subscriptions lately that they literally limited Siri improvements to mere language-accuracy tasks instead of actually improving Siri, all because of revenue. Makes one question management's foresight.
 
Dude I literally JUST TRIED YOUR EXAMPLE ABOVE, and Siri correctly recognized the command, put a reminder in one hour saying “pick out a shirt for tomorrow”.
Since MY Siri cannot be different from yours, I think the problem here is the user, not Siri…
I've had the same command fail or produce different results on different occasions. You really think the command set just remains perfectly static at all times, and isn't influenced at all by context or changes on the back end? OK guy.
 
Dude I literally JUST TRIED YOUR EXAMPLE ABOVE, and Siri correctly recognized the command, put a reminder in one hour saying “pick out a shirt for tomorrow”.

That's a big part of the problem -- it's wildly inconsistent

I use Siri in the exact same way, doing the same commands, in the same settings -- day after day

And it randomly doesn't work right every so often

Frustrating as heck
 
That's a big part of the problem -- it's wildly inconsistent

I use Siri in the exact same way, doing the same commands, in the same settings -- day after day

And it randomly doesn't work right every so often

Frustrating as heck
Part of that is misinterpretation caused by external noise. It's way too sensitive; Siri even responds to TV audio at times.
 
I agree. I wouldn't assume that failure to meet deadlines is the issue with Siri. It has improved so little in the last ten years that I would think that, painful and expensive as it would be, wiping the whole team and starting over might be in order.
I just assume there is no team working on Siri at all, given the way she hasn't improved since introduction.

The complaints haven't changed at all in the 14 years since the rollout. The only thing that's changed is that we went from comparing her to Alexa and the voice-command UI that predates Siri (which... I think is still available today if you just disable Siri... and generally does as well as Siri, if not better) to comparing her against ChatGPT. And the fact that we compare them at all really just gives Siri way too much credit.
 
Classic Apple behavior that goes way back: make a top-selling product but miss or neglect something important and glaringly obvious to users and competitors for years…
 
Gee, thanks for calling me an idiot. Should I call you one too? Despite your accusations otherwise, I know exactly what I said, and exactly what Siri answered.

That example was from some weeks ago. They constantly make changes to the backend that processes requests. Sometimes things get fixed. Sometimes things get broken. Sometimes things that got broken get fixed again. Perhaps you aren't aware of this? Now, wanna rethink your accusation?
No, I don’t. I find it amusing how lucky I am with Siri when some people are not. Incidentally, most of them are well-known complainers on the forum.
BTW, I never called you an idiot: I just tried the same exact command and it worked flawlessly.
 
I've had the same command fail or produce different results on different occasions. You really think the command set just remains perfectly static at all times, and isn't influenced at all by context or changes on the back end? OK guy.
I’m not the one making assumptions. Siri is producing consistent results to my queries, most of the time. Is it perfect? Far from it. But it isn’t the mess described here, in my experience.
That's a big part of the problem -- it's wildly inconsistent

I use Siri in the exact same way, doing the same commands, in the same settings -- day after day

And it randomly doesn't work right every so often

Frustrating as heck
In my experience it’s not like that. It is quite consistent.
I’m not using Siri too extensively: I usually prefer speaking with people rather than with a smartphone, but for the tasks I’m requesting, Siri does its job.
Concerning AI, I cannot express any opinion since it is not yet available in the EU.
 
Siri has gotten so awful. The week before last, I was driving and some rain showers arrived a bit earlier than I'd expected. I didn't want to check the radar while driving, so I wanted to ask Siri when the rain would stop, thinking it'd pull from the minute-by-minute precipitation forecasts that Apple highlights prominently in Apple Weather, which were likely a major reason they acquired Dark Sky over other apps.

The following ensued that afternoon:

Me: "When will it stop raining?"
Siri: "Yes, it appears to be raining."
Me: "How long will it be raining?"
Siri: "It'll probably stop raining in the afternoon."

Again, it already was afternoon. It was only ever going to rain in the afternoon that day. I wanted to know if it was going to rain for, say, 5 minutes vs. an hour or longer so that I could rearrange my errands accordingly — maybe hold off on that trek from the car to the grocery store entrance until after it stopped raining, if I could.

Alas, Siri somehow managed to be worse than unhelpful, almost like malicious compliance. By the end of that exchange I wished I'd gotten a "here's what I found on the web for..." response.
 
"introduce an LLM version of Siri that will be comparable to ChatGPT and Google's Gemini"

This requires an absurd amount of data, together with insane computing power: an insane number of NVIDIA Blackwell chips plus a nuclear power plant. Beyond the hardware, Apple needs just a model and experts who know their business…

Apple should buy, e.g., Aleph Alpha, since Apple can't make that jump with Tim as CEO.
 
I’m not the one making assumptions. Siri is producing consistent results to my queries, most of the time. Is it perfect? Far from it. But it isn’t the mess described here, in my experience.
Love that for you, but your experience with Siri is definitely not everyone's. It's good for barking at it to start a timer or whatever where the intent is abundantly obvious, but it falls apart quickly past that.

ETA: Recall that the ideal case of Siri is to not force you to provide it with specifically structured queries so that it'll do what you want. It's supposed to understand a natural-language query and do what you ask correctly the first time. Anything short of that is falling short of the ideal, and in my experience it has gotten demonstrably worse at understanding over the past several years.
 
Love that for you, but your experience with Siri is definitely not everyone's. It's good for barking at it to start a timer or whatever where the intent is abundantly obvious, but it falls apart quickly past that.

ETA: Recall that the ideal case of Siri is to not force you to provide it with specifically structured queries so that it'll do what you want. It's supposed to understand a natural-language query and do what you ask correctly the first time. Anything short of that is falling short of the ideal, and in my experience it has gotten demonstrably worse at understanding over the past several years.
Siri getting worse is just complete nonsense. I really can’t see how a developer could make it worse. Worst-case scenario, it is not improving at the expected pace.

When I’m speaking to Siri, I’m using natural language, but I’m trying to use correct phrases.
So I’m not asking “when will it stop raining” most of the time, but “give me the weather forecast for this afternoon/morning/whenever”. Maybe the way I’m using it is the reason I’m more satisfied than some of you.
 
Siri getting worse is just complete nonsense. I really can’t see how a developer could make it worse. Worst-case scenario, it is not improving at the expected pace.
As a developer, I'm well aware of how you can make anything worse because I've unfortunately done it myself. It's very easy to fix a bug or make a seemingly safe change and unwittingly create new bugs in the process — collateral damage, more or less.

Maybe you removed a bit of code that seemed superfluous, only to later find out that it was there to handle some edge case that presented itself a while after you deployed your "fix." Maybe your fix broke another ostensibly unrelated part of the project that quietly calls the same code you just touched. Maybe you were in such a rush that your fix just wasn't a fix at all because you neglected to consider all cases.

It's easy to do, especially with projects of Siri's age and scope.
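To make that concrete, here's a toy Swift sketch of the kind of regression I mean. The function and phrases are entirely hypothetical (nothing to do with Siri's actual code); it just shows how deleting "superfluous" code quietly breaks a distant caller:

```swift
import Foundation

// Version 1: parses "remind me in N minutes/hours" into seconds.
func parseDelay(_ phrase: String) -> TimeInterval? {
    let words = phrase.lowercased().split(separator: " ").map(String.init)
    guard let idx = words.firstIndex(where: { Int($0) != nil }),
          let amount = Int(words[idx]),
          idx + 1 < words.count
    else { return nil }

    switch words[idx + 1] {
    case "minute", "minutes": return TimeInterval(amount * 60)
    case "hour", "hours":     return TimeInterval(amount * 3600)
    // Looks "superfluous" -- only one obscure caller abbreviates.
    case "hr", "hrs":         return TimeInterval(amount * 3600)
    default:                  return nil
    }
}

print(parseDelay("remind me in 2 hrs") as Any)  // Optional(7200.0)

// Version 2's "cleanup" deletes the "hr"/"hrs" case. Every test that
// says "in 2 hours" still passes, but the abbreviating caller now gets
// nil and silently drops the reminder: a regression nobody meant to ship.
```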
 
As a developer, I'm well aware of how you can make anything worse because I've unfortunately done it myself. It's very easy to fix a bug or make a seemingly safe change and unwittingly create new bugs in the process — collateral damage, more or less.

Maybe you removed a bit of code that seemed superfluous, only to later find out that it was there to handle some edge case that presented itself a while after you deployed your "fix." Maybe your fix broke another ostensibly unrelated part of the project that quietly calls the same code you just touched. Maybe you were in such a rush that your fix just wasn't a fix at all because you neglected to consider all cases.
Precisely this. To this developer, Siri gives off the feel of a huge pile of spaghetti code that contains all sorts of unwarranted assumptions that individual developers made at various times, assuming "if I find these two words in the sentence, that definitely means the intent of the sentence is X", when that's just one of many things it could mean.

Way too many special cases, trying to pretend that there’s an actual understanding of language when there isn’t. It’s like every time they’re faced with a thing Siri doesn’t understand, they add yet another patch to handle that specific case, breaking several others in the process.
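For what it’s worth, here’s a hypothetical Swift sketch of exactly that anti-pattern: keyword hits standing in for understanding. The intent names and rules are mine, not Apple’s, but it reproduces the weather exchange described earlier in the thread:

```swift
// Hypothetical keyword-driven intent matching -- the "these words in the
// sentence definitely mean intent X" assumption, and how it misfires.
enum Intent { case isItRainingNow, precipitationForecast, unknown }

func guessIntent(_ utterance: String) -> Intent {
    let text = utterance.lowercased()
    // Unwarranted assumption: any mention of rain = "is it raining now?"
    if text.contains("rain") { return .isItRainingNow }
    if text.contains("forecast") { return .precipitationForecast }
    return .unknown
}

// A forecast question, but the first keyword rule wins, so the user
// hears "Yes, it appears to be raining."
print(guessIntent("When will it stop raining?"))  // isItRainingNow
```

Patching the "stop raining" case with yet another keyword rule just moves the breakage somewhere else, which is the whole problem.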
 
Siri getting worse is just complete nonsense. I really can’t see how a developer could make it worse. Worst-case scenario, it is not improving at the expected pace.
Tell us you don't know anything about developing complicated software systems without telling us. There's a word for it: a regression. Breaking thing B in the process of fixing thing A, because you made incorrect assumptions about the side effects the changes for thing A would have on other use cases for the code. This happens all the time.

When I’m speaking to Siri, I’m using natural language, but I’m trying to use correct phrases.
That's the problem, though. Apple has not published documentation of what constitutes "correct phrases," so you're not using objectively correct phrases; you're using your guess at what correct phrasing might be.

It's like walking through a minefield and not getting blown up and then asserting that that proves there are no mines in the field because you made it through safely.
 
Siri getting worse is just complete nonsense. I really can’t see how a developer could make it worse. Worst-case scenario, it is not improving at the expected pace.

When I’m speaking to Siri, I’m using natural language, but I’m trying to use correct phrases.
So I’m not asking “when will it stop raining” most of the time, but “give me the weather forecast for this afternoon/morning/whenever”. Maybe the way I’m using it is the reason I’m more satisfied than some of you.
Maybe, but the goal of Siri should be to work for everyone, shouldn’t it?
 
When I’m speaking to Siri, I’m using natural language, but I’m trying to use correct phrases.
So I’m not asking “when will it stop raining” most of the time, but “give me the weather forecast for this afternoon/morning/whenever”. Maybe the way I’m using it is the reason I’m more satisfied than some of you.
Also, this paragraph boils down to “You’re talking to it wrong.” There should be no concept of “trying to use correct phrases.”

Much like Face ID: You look at the device and it recognizes you. You don’t have to look at it a certain way or blink twice or anything… it just works. In fact, often it just works so well that you don’t even think about it. These are the types of software experiences that Apple has rightfully trained us over the years to expect from them.

I didn’t ask anything unreasonable of Siri. It pulls its weather data from Apple Weather, which has minute-by-minute precipitation forecasts. Upon opening the Apple Weather app once I was able to come to a brief stop, it told me that rain would stop in about 25 minutes. This information is also made available in iOS widgets, and given how WidgetKit works on iOS, there shouldn’t be a huge leap to make it available to Siri as well, if it isn’t already.
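For context, that minute-by-minute data isn’t even private to Apple’s own apps: any third-party developer can pull it through WeatherKit. A rough sketch, assuming the standard WeatherKit minute query (the function name is mine, and the minute forecast comes back nil where Apple has no minute-level data):

```swift
import CoreLocation
import WeatherKit

// Rough sketch: estimate minutes until rain stops using WeatherKit's
// minute-scale forecast (requires the WeatherKit app capability).
func minutesUntilRainStops(at location: CLLocation) async throws -> Int? {
    guard let minuteForecast = try await WeatherService.shared
        .weather(for: location, including: .minute) else { return nil }

    let now = Date()
    // First upcoming minute with a (near) zero chance of precipitation.
    guard let firstDryMinute = minuteForecast.forecast.first(where: {
        $0.date > now && $0.precipitationChance < 0.1
    }) else { return nil }

    return Int(firstDryMinute.date.timeIntervalSince(now) / 60)
}
```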

And yet “when will it stop raining” and “how long will it be raining” somehow weren’t sufficient for Siri to glean that I was looking for something more specific than “this afternoon,” especially when, once again, it was already afternoon. Does that mean 5 minutes from now or 2 hours from now? Completely unhelpful.

In fact, for the first query it couldn’t even tell I was asking for a forecast at all! It just told me it was raining…when my query should have implied that I was aware it was already raining.

That is absolutely a Siri problem, and a serious one. Maybe I just need to find the “correct phrase,” but I’m not sure what it would be. It shouldn’t be my problem, either.
 