We have all seen the real state of Siri and AI at Apple over the years, so these departures may not be a big issue in themselves. The deeper problem may be that Apple's privacy policies conflict with harnessing customer data for AI models, which makes it hard to come up with a proper ML/AI strategy. Companies like Meta and Google are experts at generating customer behavioural data, using it in research, and trying it out in real-world scenarios without much downside by labelling products as betas or offering them for free. Apple, by contrast, tried to monetise something it had never demonstrated with great success, which is also a problem. With a confused strategy and a vision that conflicts with its own policies, it is not easy for Apple to break the ice.
 
This. Anybody who thinks AI isn’t useful is a complete idiot. LLMs are game-changing for me.
I wouldn’t call them idiots; LLMs aren’t useful for everything.

For those doing work, whether personal or professional where they would benefit, I’d say ignorant is a better word.

A whole lot of smart tech people tried ChatGPT a year or two ago and have never touched it since. As you know if you use current models, the gains in the last 6 months have massively transformed their usefulness.

Claude Code wasn’t even released 6 months ago (it was 5!), things are moving faster in this space than any computing technology I’ve ever witnessed.

GPT-5 is probably launching in a couple of hours and will change the interaction model again, minimizing the need for most people to manually choose models.

OpenAI is on track for 1 billion users by the end of the year, with a planned path to profitability around 2029. It’s dubious whether that will pan out (will they retain their massive consumer advantage? will they execute on the plan?), but if they do, they’re going to be a massive concentration of value given how much mind share they have. Probably a top-10 tech company by revenue.



I get not wanting to use these tools from a privacy perspective, and also a utility one – it’s tempting to want to just shut off technology as a whole for a while; I do it too – but with all respect to the people who choose not to use them, writing them off as a mere hype cycle or fad is ludicrous.

If you’re interested in this field you must stay current and update your domain knowledge in a pretty substantial way every 2-3 months, otherwise your opinions are ossified and mostly just noise that others have to wade through.



Private Cloud Compute is a clever implementation but consumers at least right now won’t value it. That could change down the road, but I think it’ll be similar to how many people don’t use Google or Meta products. They’re out there, but they aren’t the majority.

I actually do think Apple has a long-term strategy now, but it’s centered around a hardware product we won’t see for 5-7 years. That’s a long time to wait for an in-demand researcher who wants to pursue work that requires immense scale and collaboration with peers at their level, which Apple doesn’t offer at the moment.
 
Apple's issue was Siri long before the AI hype. I'm not sure whether the issue was in architecture or leadership, but it's lagged its peers for years.

I'm not at all convinced that this is the downfall of Apple. I do think they overpromised and underdelivered, and that's concerning, but it seems to me they've jumped into the issue with big boots and key employees.

I'm pleased with the positive posts in this thread.
 
Some of us need to step out of the Apple Reality Distortion Field here..

"Losing talent" ≠ "maybe that's actually good!"
"Good .. getting rid of dead weight!"
"Do they even need to be working on this stuff!?"

😮‍💨 😵‍💫
 
Apple will continue making colorful glass icons and new emojis/Memojis.

No need to hurry with the personal assistant and AI.
It didn’t fully work for 10 years, so let’s just wait a few more.

In the meantime, thanks to the partnership with OpenAI, just pass your private documents and screenshots to the competition (“Pls Siri, ask ChatGPT about the current month”).
Apple has at least two developers, maybe three...
 
I think the problem with AI and LLM training in general is that it requires infringing on private and copyrighted data. I suspect most AI companies don’t care about this, but I suspect Apple does, and that their hands are tied because of it. This could also be linked to their Siri woes in general.
 
I can't use AI at work (clinical healthcare). We have a zero tolerance no-AI rule here, and all the domains are blocked by IT.

I don't do a lot of things on my personal devices that AI could help with.
That's a shame, AI has huge potential in healthcare. Like so many other places, those that ignore it or aren't quick enough to adapt will be left behind.
 
People are leaving due to the huge sums of money being offered to them. Seriously, no matter how much you like your current position, a $200M offer is a lot to walk away from.
Yup, agreed, but if their current company were worth staying with... they'd stay. That includes finances, but it isn't always just about the money.

I am sure a lot of training is involved with AI in general, so a quick turnaround in a super-corporate infrastructure doesn't jibe well. That explains why Apple wants to buy a firm. Quick progress is the name of the game.
 
Siri is weird. Apple could have improved it even before they jumped on the AI bandwagon (and renamed Artificial to Apple for some reason).

Ask Siri to add a calendar event to a certain calendar, as a very simple example. It can't do it.

Ask Siri to activate a Focus mode that isn't one of its default names. It can't do it. Create your own focus mode named one of its default names, and Siri still won't do it.

Siri needs to understand the basics of interacting with Apple's stock apps, and then with third-party apps (with the right hooks) and then, finally, maybe stick some intelligence in there. They're working backwards, in my opinion.

If it were me (and obviously, it isn't!) I'd split Siri from AI. Put a team on making Siri actually work on the phone, and then either spend the money to (re-)hire a decent AI team or give up and collaborate with a third party.
 
This is on Apple—one of the world's richest companies—not wanting to pay their staff, who are being offered more to join other companies. Being wanted and well paid makes them feel valued, so it's a no-brainer.

Apple should see the damage; it can certainly afford to keep staff instead of making them feel replaceable.

Pony up Apple.
 
Siri is a legitimate disgrace. It's completely useless. People are in sheer denial if they think Apple aren't scrambling to catch up, which is why they were so quick to partner with OpenAI when they introduced "Apple (non-)Intelligence". They're quite literally almost a decade behind, and now they're losing their top LLM researchers to competitors.

If they weren't in a bad place, they wouldn't have engaged in false advertising to promote sales of the iPhone 16 line, promising AI features that didn't exist at the time; even their own teams were surprised to see them promoted at WWDC. Apple were known to never promote features that weren't ready, let alone ones that didn't exist, at their worldwide conferences, and this recent shenanigan eroded the trust of many of their users, including myself.
 
Spot the boomers or blue collar workers.

LLMs are insanely useful in my profession.

Anyone who still thinks LLMs are a gimmick or cute little features must be absolutely insane.
Absolutely agree!

I honestly cannot recall a single day when I have not used some form of AI. It has boosted my productivity by more than 100% and played a major part in helping me get a rather generous promotion that made my wallet smile.

Many people have no clue how to use AI properly. They poke at it, get confused, then blame it for not being magic. It is a bit like when cars first appeared: people refused to buy them because there were no roads, and they were convinced that a breakneck speed of 12 mph would make your head fall off.
 
I wouldn’t call them idiots; LLMs aren’t useful for everything.

For those doing work, whether personal or professional where they would benefit, I’d say ignorant is a better word.

A whole lot of smart tech people tried ChatGPT a year or two ago and have never touched it since. As you know if you use current models, the gains in the last 6 months have massively transformed their usefulness.

Claude Code wasn’t even released 6 months ago (it was 5!), things are moving faster in this space than any computing technology I’ve ever witnessed.

GPT-5 is probably launching in a couple of hours and will change the interaction model again, minimizing the need for most people to manually choose models.
As someone in the data science/analytics space, 100% agreed that the improvements made over the past 6 months have been huge.

I can actually understand some of the skepticism here: if the last time you tried ChatGPT or another LLM was a year or more ago, it was still "gimmicky" (using the word loosely, as there were still uses a year ago). Even a year ago, hallucinations were quite common, and many models lacked real-time data sources, so they couldn't give you the latest information.

But today I use LLMs all the time to help me debug, write more efficient code, quickly summarize information, etc. I still come at it with a skeptical lens: hallucinations still happen, so your awareness level has to be high and you should be quick to challenge and verify. But it's way better than it was even 6-12 months ago.
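That challenge-and-verify habit can be sketched in a few lines. A minimal, hypothetical example: the helper below stands in for code an LLM might suggest (nothing here calls a real model), and the assertions are the human check against inputs whose answers are already known.

```python
# Hypothetical LLM-suggested helper: deduplicate a list while keeping order.
# Before trusting generated code, exercise it against known inputs and edge cases.
def dedupe_preserve_order(items):
    seen = set()
    out = []
    for item in items:
        if item not in seen:
            seen.add(item)
            out.append(item)
    return out

# Verification step: challenge the suggestion before it goes anywhere near real work.
assert dedupe_preserve_order([3, 1, 3, 2, 1]) == [3, 1, 2]
assert dedupe_preserve_order([]) == []
assert dedupe_preserve_order(["a", "a", "a"]) == ["a"]
print("all checks passed")
```

The point is the workflow, not the function: generated output is treated as a draft, and the cheap assertions are what earn it trust.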

My view is that you can either work with AI or against it, but the tide is turning toward AI being a legitimate tool and source of efficiency. The debate around the human cost is legit (i.e., tech companies hiring fewer or no entry-level engineers because of AI), as is the debate over how well AI can replace lower-level work, but the fact that it's a legitimate debate now shows how far AI has gotten.
 
I can't use AI at work (clinical healthcare). We have a zero tolerance no-AI rule here, and all the domains are blocked by IT.

I don't do a lot of things on my personal devices that AI could help with.
That’s too bad. I trust AI over 90% of bonehead doctors in the medical field.
They’re just parrots.
 
Money is a very powerful pull factor. But I wonder just how many people in the AI division feel stifled by management and if that is the case it is a push factor for them. They've got some real talent over there. So why haven't they been able to execute this the right way?
In the old days, Apple supposedly didn't even pay that well, but people liked to work there because they thought they were making a difference.

When it becomes another job there's really only the money.
 