I really wonder if any of the older iPhones will get any meaningful AI software tools with iOS 18 that are actually useful! I pretty much see the iPhone 16s getting all the best AI features.
 
Good news. Siri is useless for anything other than setting alarms, timers and controlling music playback.

:cool:
With a Scottish accent, Siri is useless for everything :D

I don't have issues with speech recognition on ChatGPT/Copilot so this is good news if Apple is going to use OpenAI. We'll see.
 
  • Like
Reactions: gstevie
Soon they will utilize every Apple device for AI training/computing. It would be the world’s largest computer, and no one could come close to them.
What do you mean?

If Apple were to build an AI network out of all our iPhones/iPads/Macs, that wouldn't make it any better than if Microsoft/Google did the same. There are loads of Android phones and Windows PCs out there.
 
What do you mean?

If Apple were to build an AI network out of all our iPhones/iPads/Macs, that wouldn't make it any better than if Microsoft/Google did the same. There are loads of Android phones and Windows PCs out there.

Windows and Android are built by two different companies with different objectives.

They would also have to collaborate with two other companies that manufacture their processors: Intel and AMD.
 
Last edited:
I think people are underestimating LLMs at the OS level.

Imagine LLMs being able to control apps for you. You can tell an LLM to automatically set up Shortcuts for you, since no one knows how to create them. You can tell an LLM to install the best ad blocker for Safari for you and configure it to work properly. Even mundane things like asking an LLM to disable notifications for an annoying app should work well too.

Think of it like a highly capable digital assistant on your phone that can control your phone on your behalf. Don't think of it as a chat app.
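
For what it's worth, Apple already ships a plumbing layer that points in this direction: App Intents, where apps expose discrete actions that Siri and Shortcuts can invoke. As a rough sketch only (the MuteNotificationsIntent and NotificationPreferences store below are hypothetical examples, not a real Apple API for muting another app's notifications), here is roughly what an app-exposed action that an LLM-driven assistant could pick and fill in might look like in Swift:

import AppIntents
import Foundation

// Hypothetical stand-in for an app's own settings store (not an Apple API).
final class NotificationPreferences {
    static let shared = NotificationPreferences()
    private(set) var mutedUntil: Date?

    func muteFor(hours: Int) {
        mutedUntil = Calendar.current.date(byAdding: .hour, value: hours, to: Date())
    }
}

// Hypothetical App Intent an app could expose so an assistant (LLM-driven
// or otherwise) can act on a request like "disable notifications for this
// app for two hours" by choosing the intent and filling in its parameter.
struct MuteNotificationsIntent: AppIntent {
    static var title: LocalizedStringResource = "Mute Notifications"
    static var description = IntentDescription("Silences this app's notifications for a while.")

    @Parameter(title: "Hours", default: 1)
    var hours: Int

    func perform() async throws -> some IntentResult & ProvidesDialog {
        NotificationPreferences.shared.muteFor(hours: hours)
        return .result(dialog: "Okay, notifications are muted for \(hours) hour(s).")
    }
}

The interesting part is that the model never touches app internals; it only selects from actions the app has already declared, which fits the "assistant that controls your phone on your behalf" framing.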
If I need an “AI” to do those things for me, then that’s a symptom of an underlying problem that needs to be fixed - not an indicator that “AI” is needed.
Completely agree with @BanjoDudeAhoy. The whole Shortcuts reference is a great example. Apple did not invest in making sure Siri could handle lots of useful requests to its own built-in functions, so “Hey! Let the users deal with that!” Boom! Shortcuts! And then make it complex enough that, as you said, “nobody knows how to create them.” This is not an “AI” problem.

What constitutes the “best” ad blocker for Safari? What are the criteria? What constitutes working “properly”? Does AI magically know how much filtering you want the blocker to do? Again, AI is not a good fit.

Manage notifications by app? See my first paragraph. Apple should have ensured Siri could do this from day 1. It’s a simple toggle in a settings screen already.

I get the sense that Apple is now on the “buzzwords are cool” bandwagon like everyone else. They used to focus on day-to-day user experience a lot more than they do now and it shows with Siri.
 
GPT is overrated. I have found you can get it to give you the answer you want, and even when you just let it give its own answer, it can often be wrong.

View attachment 2377396
How old is this conversation? This is not even remotely similar to what I just got when I tried to replicate it; my result was far more accurate and did not get tricked into error:

First turn:

Chat GPT: To give you the distances in trillions of miles, we can use the conversion that 1 light-year is approximately 5.88 trillion miles. Let's calculate the distances for the main stars in Orion:

  1. Betelgeuse: 548 light-years ≈ 548 × 5.88 trillion miles = 3221.84 trillion miles.
  2. Rigel: 860 light-years ≈ 860 × 5.88 trillion miles = 5056.8 trillion miles.
  3. Bellatrix: 250 light-years ≈ 250 × 5.88 trillion miles = 1470 trillion miles.
  4. Mintaka: 1200 light-years ≈ 1200 × 5.88 trillion miles = 7056 trillion miles.
  5. Alnilam: 2000 light-years ≈ 2000 × 5.88 trillion miles = 11760 trillion miles.
  6. Alnitak: 800 light-years ≈ 800 × 5.88 trillion miles = 4704 trillion miles.
These figures represent the approximate distances of the stars from Earth in trillions of miles.

Second Turn:

Me: I thought it was 1300 trillion miles?


ChatGPT: Let's clarify the distance for the entire constellation of Orion in terms of trillions of miles. Orion as a whole doesn't have a single distance because it's composed of many stars, each at varying distances from Earth. However, if we consider a rough average distance to the core of the constellation, we might get a general idea.

The stars of Orion's Belt—Mintaka, Alnilam, and Alnitak—are often used to represent the core of Orion. Alnilam, the middle star of the belt and roughly 2000 light-years away, can serve as a good reference for this central region. Using the conversion that 1 light-year is approximately 5.88 trillion miles, the distance to Alnilam is:
2000 light-years × 5.88 trillion miles per light-year = 11760 trillion miles

This is the distance to Alnilam, not the entire constellation. If you've heard the figure of around 1300 trillion miles, it might have been referring to a specific star or an average for a particular group of stars within Orion that's closer to us than Alnilam. If you'd like, I can help clarify or calculate the distance to a specific star or part of Orion.
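
For anyone who wants to sanity-check these figures rather than take either chatbot's word for it, the arithmetic is trivial to run yourself. A minimal sketch, assuming the same rounded factor of 5.88 trillion miles per light-year that ChatGPT used (the more precise value is about 5.879) and the star distances quoted above:

import Foundation

// Rounded conversion factor used in the ChatGPT answer above.
let trillionMilesPerLightYear = 5.88

// Approximate distances (in light-years) quoted for Orion's main stars.
let stars: [(name: String, lightYears: Double)] = [
    ("Betelgeuse", 548), ("Rigel", 860), ("Bellatrix", 250),
    ("Mintaka", 1200), ("Alnilam", 2000), ("Alnitak", 800),
]

for star in stars {
    let trillionMiles = star.lightYears * trillionMilesPerLightYear
    print("\(star.name): \(star.lightYears) ly ≈ \(trillionMiles) trillion miles")
}

Running it makes the underlying point obvious: the stars span roughly 1,500 to 12,000 trillion miles from Earth, so no single figure like "1300 trillion miles" can describe the whole constellation.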
 
Last edited:
I’m using GPT-4 a lot, and in my humble opinion, it is the best LLM out there, and OpenAI is the best AI company Apple could have arrived at a deal with. I don’t want Google powering Siri or other iOS services.

Now the question is whether this iOS 18-integrated GPT will allow me to stop paying for GPT Plus, or whether the services provided for free on iOS won’t cut it compared to ChatGPT.
My guess is free and paid. Free for the basic tasks ChatGPT lets you do today in 3.5, but paid for any other requests. This should help offset the costs Apple will accrue building data centres and assist with future development of their own AI infrastructure. I’ve referenced something I call Siri+ previously, and I believe the paid version will be part of Apple’s services. I can foresee future versions of their OS offerings helping with putting together videos and edits, and auto-correcting photos and images based on what you ask them to look like. These might be part of future iterations of their software packages, but under a monthly subscription.
 
  • Like
Reactions: Populus
I can already imagine it: “In conclusion, I found some web results. It is important to note that I can show them if you ask again from your iPhone.”
 
Last edited:
My guess is free and paid. Free for the basic tasks ChatGPT lets you do today in 3.5, but paid for any other requests. This should help offset the costs Apple will accrue building data centres and assist with future development of their own AI infrastructure. I’ve referenced something I call Siri+ previously, and I believe the paid version will be part of Apple’s services. I can foresee future versions of their OS offerings helping with putting together videos and edits, and auto-correcting photos and images based on what you ask them to look like. These might be part of future iterations of their software packages, but under a monthly subscription.
Yeah, I fully agree, that’s the most likely outcome. And if it’s fairly priced (like $9.99 or even $14.99) I’m totally in, perpetually. If it’s more expensive, I’ll do just like I do with GPT Plus: buy a month whenever I need it.
 
  • Haha
Reactions: maxoakland
C'mon Apple, give us "SharePlay" for Maps so that the "co-pilot" can organize the route, stops, and destination on the CarPlay device and let the driver focus on driving.
 
GPT is overrated. I have found you can get it to give you the answer you want, and even when you just let it give its own answer, it can often be wrong.

View attachment 2377396
That’s pretty funny. Not only does it demonstrate that ridiculous response cycle you’ve shown, it fails to recognize that it’s impossible to have a single distance to an entire constellation unless we’re talking about an average distance of all the stars in it. That is to say, each star’s distance from Earth varies.

Why would I want this flagrant language model on my phone, feeding me wrong information? Turn it off.
 
It's barely functional even for those TBH
Maybe in your experience, but I've been using Siri regularly for years to do those things (and others) and have issues no more than 5% of the time. Siri is almost always solid for me and for what I ask it to do.
 
  • Like
Reactions: shadowboi
GPT is overrated. I have found you can get it to give you the answer you want, and even when you just let it give its own answer, it can often be wrong.

View attachment 2377396
Yes, since day one. Same for Gemini. They basically spit out a shuffle of the words they were trained on and make it sound like the definitive answer. As soon as you challenge that answer, it gives you another one and apologises. Even if the first answer was correct.
It never tells you that you are wrong and proves it. Never.
At the current stage I don’t see a real utility in those LLMs. They might be good for some coding, but that’s about it in my opinion.
 
GPT is overrated. I have found you can get it to give you the answer you want, and even when you just let it give its own answer, it can often be wrong.

View attachment 2377396
What version of GPT is that? Here is GPT 4.

First, it's not a great question because, as GPT 4 pointed out, the constellation is not a single distance from Earth. Second, the question also does not specify Earth as the reference point. That's a logical assumption (which GPT 4 correctly makes), but the question is best framed with more detail and specificity. In any case, GPT 4 answered the question well, even when I gave it your prompt to try to get it to change the answer.

My recommendation is to look at where GPTs are and where they are going, not where they were. If you don’t pay for the newest ChatGPT model, you can at least put the prompts into Microsoft’s Copilot GPT 4 (which you can use for free) and see what that can produce. That also does a good job.
 

Attachments

  • Orion_GPT.jpg (204.8 KB)
Last edited:
I think people are underestimating LLMs at the OS level.

Imagine LLMs being able to control apps for you. You can tell an LLM to automatically set up Shortcuts for you, since no one knows how to create them. You can tell an LLM to install the best ad blocker for Safari for you and configure it to work properly. Even mundane things like asking an LLM to disable notifications for an annoying app should work well too.

Think of it like a highly capable digital assistant on your phone that can control your phone on your behalf. Don't think of it as a chat app.
The issue is that LLMs tend to get things slightly wrong and occasionally completely wrong, and there’s no way to tell when they do. You always have to check the results, e.g. the Shortcut definitions and configuration settings in your examples. No way will Apple let GPT control apps.

I fully agree that it would be incredibly useful if it could work reliably, and also give reliable feedback when something exceeds its abilities. But that’s not the case with LLMs, by construction.
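
That "always check the results" point is probably how any OS-level LLM control would have to be built anyway: the model proposes, the user confirms, and nothing runs until then. A minimal sketch of that pattern in Swift, with ProposedAction and the stubbed confirmation step entirely hypothetical (not an Apple or OpenAI API):

import Foundation

// Hypothetical shape of what a model might propose in response to
// "disable notifications for this app" (not a real API).
struct ProposedAction {
    let summary: String      // human-readable description shown to the user
    let apply: () -> Void    // the actual change, run only after approval
}

// Nothing the model proposes executes until the user has reviewed it,
// precisely because the proposal itself may be slightly or completely wrong.
func run(_ action: ProposedAction, userConfirms: (String) -> Bool) {
    if userConfirms(action.summary) {
        action.apply()
    } else {
        print("Discarded: \(action.summary)")
    }
}

// Usage sketch with a stand-in for a real confirmation UI.
let proposal = ProposedAction(summary: "Mute notifications for 'ExampleApp' for 2 hours") {
    print("Applying change…")
}
run(proposal) { summary in
    print("Proposed: \(summary)  [Confirm / Cancel]")
    return true // a real implementation would wait on the user's tap
}

It doesn't solve the reliability problem, but it keeps the failure mode at "the user rejects a bad suggestion" rather than "the phone silently did the wrong thing."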
 