Not going to happen.

OpenAI will not deliver the supposed gains they are selling. Microsoft will have realised their Copilot pyramid scheme isn't playing out and is actually a net financial loss that has detracted from the value of other products and from existing customers; the board will throw Satya out and, in a mass panic, switch focus to security to retain government cloud revenue. Nvidia's hyped valuation will crash when the market reaches saturation. About this time, everyone will discover they don't actually need the current fad of "AI", which does nothing but hallucinate within a limited window of credibility. And government regulation will catch up and kick the market in the balls.

(these are actual market positions that people are playing at the moment)

Edit: Apple are playing the usual conservative engineering game they always do, which is adopting the mature outcomes of the hype, you know, the ones that don't make you drive off a cliff or replace the moon with a donut on Thursdays in your camera.
Big difference between fantasy and what's really playing out.
 
A couple of nights ago I was on my way to pick up my son at a railway station in another city. I asked Siri, in Swedish, to navigate to the railway station in that city. I tried at least five times, and Siri kept showing waypoints to a place that has nothing to do with a railway station. I had to take my phone, start Google Maps and ask for directions. Google didn't pick the shortest route, but it did understand my question and showed the correct destination.

I prefer to use Siri to call someone or get directions and so on. I hate it when it doesn't even answer.
 
It feels like big Siri improvements have been promised every year since about 2014, but they never quite materialise and always get pushed back for another year.

Business as usual, I guess.
 
Our current AI fad, and it makes me feel dirty calling it AI, but thanks Claude for that one, is not intelligence to any degree. It has no capability to reason about what it produces. It's just a statistically feasible-sounding outcome with no proof or confidence measure attached to it.

It's artificial intelligence. It doesn't have to work like human intelligence or reason at all. What we want AI to do is to mimic the results of human intelligence.

Thus the only cases where AI matters are those where the output doesn't have to be correct, or where you're too stupid to realise it isn't. The market relies on the latter and hopes that the former isn't going to get discovered.

Even humans are error prone. And so much of what we do as humans is in domains where there is subjectivity, or no need to be 100% correct.

I would even argue that all human activities, except maybe mathematics and logic, involve dealing with errors and incorrectness, bad data, falsehoods, sources not to be trusted and so on.

You treat AI tools just as you do other humans, Google search, Wikipedia, newspapers and the rest: you find out where they're useful and trustworthy based on experience.
 
So Siri stays the Forrest Gump of voice assistants for another year.

At least there's still some consistency from a company that used to champion creatives, but now crushes them in its ads.
 
hahahahahaha.

look, i buy apple stuff. but if you think they're not just blindly fumbling around in panicked reaction mode right now, you're being VERY charitable.

on the flip side, hopefully by the time they actually rally around some AI strategy, the whole hype cycle will have gone the way of NFTs/metaverses and we can be spared 'AI-everything' clogging all their devices.
I like this community because, contrary to what many people believe, we Apple consumers (or at least the geeks who look beyond the Memojis and the fancy logo) are the most critical of the company. And I think that's good.

On the other hand, regarding your second paragraph, I'm afraid AI technologies aren't a fad like 3D televisions, the metaverse or NFTs… I think LLMs, like the Internet or social media, are here to stay and will change the way we do many things. For better or worse, we still don't know.
 


Apple is planning a major AI overhaul for Siri in iOS 18, and Bloomberg's Mark Gurman says that the update will let Siri control all individual features in apps for the first time, expanding the range of functions the personal assistant can perform.

[Image: iOS 18 Siri integrated feature]

Siri will be able to do things like open specific documents, move a note from one folder to another, delete an email, summarize an article, email a web link, and open a particular news site in Apple News. Apple plans to use AI to analyze what people are doing on their devices, automatically enabling Siri features.

To make this happen, Apple engineers had to rearchitect Siri's underlying software around large language models, or LLMs, the same technology behind chatbots like ChatGPT. Apple has been working on a deal with OpenAI to integrate OpenAI's ChatGPT technology into iOS 18, and it is also in talks with Google about incorporating Gemini, but the Siri functionality likely relies on Apple's own LLM work.

At launch, the new Siri functionality will be limited to Apple apps, and Siri will only be able to respond to one command at a time. Eventually, Apple wants Siri to be able to respond to multiple commands, such as capturing a photo and then sending it to someone in a message.
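
The report doesn't say which mechanism apps would use to expose these actions to Siri. Apple's existing App Intents framework is the obvious candidate, so here is a minimal sketch of what such an action could look like; the intent, its parameters and the dialog are invented for illustration, not taken from the article:

import AppIntents

// Hypothetical example: an action a notes app could expose so Siri can
// "move a note from one folder to another". All names below are invented.
struct MoveNoteIntent: AppIntent {
    static var title: LocalizedStringResource = "Move Note"

    @Parameter(title: "Note Title")
    var noteTitle: String

    @Parameter(title: "Destination Folder")
    var folder: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // The app's own move logic would run here (omitted in this sketch).
        return .result(dialog: "Moved \(noteTitle) to \(folder).")
    }
}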

While the Siri features will be introduced at WWDC 2024, Apple reportedly does not plan to launch them in September when iOS 18 sees an initial release. Instead, Siri will be overhauled in a future iOS 18 update that's set to be introduced in 2025.

Basic AI tasks in iOS 18 will be processed on device, but more advanced capabilities will rely on Apple's cloud servers. Gurman previously said that Apple would power all of the initial iOS 18 features on-device without relying on cloud technology in order to preserve privacy, but rumors have shifted in recent weeks. Part of Apple's new Siri technology will include code for determining whether a request can be processed on device or requires Apple's servers. On-device iOS 18 AI capabilities will largely require an iPhone 15 Pro or later to work, and an M1 or later for iPadOS 18 and macOS 15.
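
Nothing has been published about how that on-device-versus-server decision would actually be made, so the following is purely an illustrative sketch: every type, field and threshold here is invented, and only the hardware cutoffs echo the rumors above.

enum ExecutionTarget {
    case onDevice
    case appleCloud
}

struct SiriRequest {
    let estimatedContextTokens: Int   // rough size of the request plus its context
    let needsWorldKnowledge: Bool     // e.g. summarising an arbitrary web article
}

// Invented routing logic: small, self-contained requests stay local on supported
// hardware (iPhone 15 Pro or later, M1 or later per the rumors); everything else
// falls back to Apple's servers.
func route(_ request: SiriRequest, deviceSupportsLocalModel: Bool) -> ExecutionTarget {
    guard deviceSupportsLocalModel else { return .appleCloud }
    if request.needsWorldKnowledge || request.estimatedContextTokens > 4_096 {
        return .appleCloud
    }
    return .onDevice
}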

According to The Information, Apple's AI servers will be powered by M2 Ultra and M4 chips, with Apple planning to use the Secure Enclave "to help isolate the data being processed on its servers so that it can't be seen by the wider system or Apple." Gurman says that Apple will also provide customers with an "intelligent report" that explains how information is kept safe.

We'll hear all about the AI functionality coming to Siri in just over 10 days. WWDC 2024 is set to begin on Monday, June 10.

Article Link: More Advanced AI Siri Functionality Not Coming to iOS 18 Until 2025
A little late to the party here. Actually, no: late to the party would be a Siri advancement in 2024. Advancements in 2025? By then the party will have moved on to another venue.
 
No, it's terrible. Throw some group theory at it.

If H is a group and G is a subgroup of H and A is a subgroup of G and H then what is F?

Eventually it makes some assumption that A = F after rambling on about possible interpretations of what F is.

ChatGPT:
Given the context, we have:

  • H is a group.
  • G is a subgroup of H.
  • A is a subgroup of both G and H.
The question "What is F?" is ambiguous because F is not defined in the provided context.

That's the exact same answer I would have given.

I also think this isn't the math most people will feed ChatGPT with. It will be algebra and calculus up to early university level.
 
It's artificial intelligence. It doesn't have to work like human intelligence or reason at all. What we want AI to do is to mimic the results of human intelligence.
Even humans are error prone. And so much of what we do as humans is in domains where there is subjectivity, or no need to be 100% correct.

I would even argue that all human activities, except maybe mathematics and logic, involve dealing with errors and incorrectness, bad data, falsehoods, sources not to be trusted and so on.

You treat AI tools just as you do other humans, Google search, Wikipedia, newspapers and the rest: you find out where they're useful and trustworthy based on experience.

This is a fairly large straw man.

As a human I lack objectivity and time, so I would like to contract those things out. What I need is someone to do objective research quickly and deliver information that is trusted and citable. I don't need a statistical mimic of a human; I need something different.

I definitely don't need someone who might deliver me garbage on a whim. I can get that from all the other meat sacks around me.
 
[Attachment 2383719]

It doesn't understand the difference between kelvin and Celsius, and if you feed it a bad prompt, it'll just carry on rather than tell you you're an idiot.

0 K = −273.15 °C

Yes, but I also think that anyone who deals with diodes will know how to ask the question correctly and not try to confuse it with kelvin versus Celsius.

Sometimes, artificial intelligence gets better when it's helped by human knowledge.
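
For reference, the conversion the model tripped over is just a fixed offset:

T_{^\circ\mathrm{C}} = T_{\mathrm{K}} - 273.15, \qquad \text{so } 0\,\mathrm{K} = -273.15\,^\circ\mathrm{C}.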
 
Weak and pathetic, but not surprising. And here come the people who gaslight themselves and say "I'd rather Apple be late and do it right." Not when they are VERY, VERY late and still don't do it right.

Millions of people are already enjoying AI features from other companies. They are imperfect but usable, and ever improving. Meanwhile, Apple has nothing, and seemingly plans to have almost nothing.
If Apple achieves what this article claims, we have something potentially revolutionary. Siri being able to handle multi-command requests in third-party apps is not nothing. Apple is the only company in the world that can do this in a privacy-oriented manner.
 
ChatGPT:


That's the exact same answer I would have given.

I also think this isn't the math most people will feed ChatGPT with. It will be algebra and calculus up to early university level.

Did you quote the whole thing? No.

Eventually it rambles on and concludes F = A, which is objectively wrong. F is intentionally undefined in the prompt, but it is statistically required to be represented in the output. The end result is that it produces garbage, because it can't reason about the correctness of the input.

If you take this to your mathematics professor, he will say F is undefined, then one of his hands will cover his face and his head will shake slowly.
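
Spelled out, everything the prompt actually defines is one subgroup chain, and F appears nowhere in it:

A \le G \le H \quad (\text{and hence } A \le H \text{ automatically}),

with F occurring in none of the hypotheses, so "F is undefined" is the only defensible answer and any identification such as F = A is unjustified.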
 
That doesn't sound too promising. They kind of dropped the ball here. iOS with deeper Siri integration and a better Siri was always a promise they never delivered on, and now they are playing catch-up. I kind of hope that the "reworking Siri" part is more than that and also lays a good foundation for things to come, but we'll see.
 
Yes, but I also think that anyone who deals with diodes will know how to ask the question correctly and not try to confuse it with kelvin versus Celsius.

Sometimes, artificial intelligence gets better when it's helped by human knowledge.

No. The weighting and significance were forced towards diodes and away from the conversion intentionally. It hallucinated the rest.

The point is that it understands neither what you are asking it nor the answer.
 
As a human I lack objectivity and time, so I would like to contract those things out. What I need is someone to do objective research quickly and deliver information that is trusted and citable. I don't need a statistical mimic of a human; I need something different.

If those are your requirements, I don't think any AI tool in existence will meet your needs. And you might never get one. You seem to want something that is superior to human intelligence and shows no weaknesses.

I have a much lower bar and only need these tools to be useful in certain areas. I don't mind if they sometimes make huge mistakes, just like humans.
 
I am hanging onto my iPhone 12 Pro Max until both my battery and Siri die. Then next year I can buy Super Siri v2.0.
 
Did you quote the whole thing? No.

Eventually it rambles on and concludes F = A, which is objectively wrong. F is intentionally undefined in the prompt, but it is statistically required to be represented in the output. The end result is that it produces garbage, because it can't reason about the correctness of the input.

If you take this to your mathematics professor, he will say F is undefined, then one of his hands will cover his face and his head will shake slowly.

Because F isn't defined, it suggests three interpretations of what F could be defined as. Yes, I know you want it to stop and not do that.
 
Before AI:
User: “Siri give me directions to RH Rooftop.”
Siri: “Getting directions to RH Rooftop.”

With AI:
User: “Siri give me directions to RH Rooftop.”
Siri: “Getting directions to RH Rooftop.”

I ❤️ Apple’s AI!
 
I like this community because, contrary to what many people believe, we Apple consumers (or at least the geeks who look beyond the Memojis and the fancy logo) are the most critical of the company. And I think that's good.

On the other hand, regarding your second paragraph, I'm afraid AI technologies aren't a fad like 3D televisions, the metaverse or NFTs… I think LLMs, like the Internet or social media, are here to stay and will change the way we do many things. For better or worse, we still don't know.
I agree there is a core element of the technology that can be useful, but there's a mad scramble to bolt it onto everything as a marketing term, like they did with "smart toothbrushes" and "smart fridges", regardless of whether it makes anything better. That's the part I can't wait to blow over. Hopefully a few more Rabbits and Humane Pins will help expedite that :p
 