Gemini doesn’t run on devices. I would be just as careful if Apple sends data to the cloud to run models. If the model runs on device, and Apple doesn’t train on your data, that’s different. I can see Apple keeping models open, but making them run on devices (iPhone, iPad, and Mac).
That’s great to know. Wouldn’t Google still have those data requests for their own use? Same as with Google Search?
 
That’s great to know. Wouldn’t Google still have those data requests for their own use? Same as with Google Search?
Great question. It really depends on whether the model runs locally or in the cloud, and to what extent. Requests are inference calls to models that were already trained on huge datasets. If the model is running on device and inference is local, there is no need to send data to the cloud. The concern about your requests being used for training is a big one. OpenAI uses ChatGPT and ChatGPT Plus user chats to train its models. OpenAI claims it doesn’t do so for Business, Enterprise, or API users. Same with Google.
If Apple makes Gemini the AI engine on the iPhone, the way Google is the default search engine, I would disable it if it isn’t running locally and data is sent to the cloud for training.
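To make the distinction concrete, here is a rough Python sketch. The mlx-lm package and its load/generate calls are real; the model name is just an example, and the cloud endpoint is purely hypothetical, only there to show where your prompt ends up:

# Local inference: the prompt never leaves the device.
# Requires: pip install mlx-lm (Apple silicon only).
from mlx_lm import load, generate

model, tokenizer = load("mlx-community/Mistral-7B-Instruct-v0.2-4bit")  # example model
reply = generate(model, tokenizer, prompt="Draft a short email to my doctor.", max_tokens=128)

# Cloud inference: the same prompt is shipped off-device, where the
# provider's retention and training policies apply.
import requests

resp = requests.post(
    "https://api.example-ai-provider.com/v1/generate",  # hypothetical endpoint
    json={"prompt": "Draft a short email to my doctor.", "max_tokens": 128},
)

Same request, very different privacy story depending on which path it takes.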
 
Does anyone expect a reasonable update to AI and Siri capabilities in iOS 18?
All this is 2 years away at least.
 
It will be extremely funny to read all this PR from Apple about the big, important ML research they are doing, and then find out at WWDC that they are just going to outsource Siri to Microsoft, lmao


I suspect there are two different models here. Apple has Spotlight, and they also have an API for web search. Both fall into the very broad category of "search", but one is more locally oriented and the other is more externally oriented.

Apple having some centralized, security/privacy-limited API for third-party web-service "chatbots" / "search summarizer" bots to plug into would not be surprising. But that would have limited interaction with, say, Apple's classification of your Photos library, fixing your photos with a "magic eraser", editing your audio tracks, etc.

On iOS/iPadOS, there are a bunch of toggles in Settings that let Siri look at your individual apps' data to contribute to Siri responses. What if Apple opened that up to a designated "AI assistant"? If you choose to let Gemini/Copilot/Alexa/etc. suck in your whole Pages document library, that is up to you ... or not. You can keep Siri out as well.

One single "AI assistant" to rule them all is the WRONG approach here in the current context.

The quite dubious assumption here is that for Apple's assistant to "win", the others have to lose, or vice versa. All of them have very substantive problems. "Ahead" / "behind" is somewhat the wrong framing, because they all have trade-offs and quirks. Right now it is more a matter of which quirks you are willing to ignore for the task at hand.


"AI assistants" as apps as a delivery vehicle are coming whether Apple does an assistant or not. There is nothing really to "outsource" here especially in the context of the hyper large models which only run in the cloud. Those are cloud services. The "availability" is going to be like the web is a generic cloud service. If you can get TCP/IP packets to it ... it is just available on the device.

[Remember when Safari was trying to be multi-platform (i.e., on Windows), because if you didn't suck the whole world into your proprietary web browser platform you were doomed ... until that wasn't true.

Basically the same thing is going on here with the AI hype train.]


Apple only really needs to come out with a Siri 2.0 (or 3.0), something better than what they've got. They don't have to "kill" every other competitor and be everything for everybody. Just substantively better than the last version.
 
Does anyone expect a reasonable update to AI and Siri capabilities in iOS 18?
All this is 2 years away at least.

Apple doesn't have to roll out "better than GPT-4" to produce a substantively improved Siri. The hurdle the current Siri sets isn't all that high. It just has to be better than the current Siri.

Chasing the Nvidia stock-market hype train isn't going to do anything for Apple. If Apple delivers better products than the last iteration, they will sell lots of new product. Apple doesn't really need a hype train.
 
I had an idea to see what people would like to use AI for.

So let's run a test and use emoji reactions as votes. What would you use AI for on an iPhone most regularly?

👍 - text generation
❤️ - text editing/manipulation
🤣 - smart search
😲 - image/video generation
☹️ - image/video editing/manipulation
😡 - controlling your devices/home
👎 - creating automation/shortcuts

There are no more reactions available, but I haven't come up with any other ideas for general use anyway. You can choose only one ...

Nah, I want Jarvis from Iron Man! We are already starting to see glimpses of that!
 
"Can't innovate my ass"
Hey, can we pay to use your system, please, Google?

LOL, you honestly could not make it up, especially after Apple was recently being all smug and cocky about its amazing AI behind the scenes.

Really makes me wonder if Apple is simply too big a lumbering giant for its own good now.
One could hardly call it a dynamic company with quick reactions and products anymore.

Just plod along and tweak what you have, and every once in a while try something new.

It's a shame it has come to this, compared to the amazing Apple II and first Mac days.
 
Apple has recently been killing it in the deep learning space. Apple released MM1, and they have published more information than most open-source LLM companies. I have been testing MLX for some of my workflows; it's probably the fastest among comparable Python libraries. It runs open-source LLM models on my iPad Pro. It's going to be interesting once this gets to the iPhone and other devices. With recent updates, I can run Falcon 180B on my M1 Max, something my Nvidia RTX 4090 GPU can only dream of. I hope Apple keeps up the pace of releases.
Thanks for this perspective, @TechnoMonk! Can you provide some insight into how you're using MLX to run LLMs on your iPad Pro?
 
It'll be way ahead of Gemini if it simply acknowledges the existence of White people. The bar is pretty low, Apple. Let's see what you've got.
That’s not what happened, and you either know that and don’t care, or you didn’t read enough about what happened. The intention was to have the results better reflect the diversity of the world and to address the fact that much of the models’ training data consisted of white people (as is also the case with facial recognition technology). The problem was that those biases were substituted without adding additional context.
 
If Apple gets in bed with Google for core "AI" functionality and it can't be disabled, that would be a serious mistake. There is a reason I don't use anything Google puts out besides YouTube: Google is not to be trusted.

I would seriously start considering alternatives to Apple at that point, and no, Android is not an alternative.
 
Thanks for this perspective, @TechnoMonk! Can you provide some insight into how you're using MLX to run LLMs on your iPad Pro?
I have an Apple silicon iPad Pro and developed a simple app that runs inference using Swift MLX. Apple does say it isn't for production use. You need a developer account to play around, and probably a pipeline set up for deploying changes to models. The iPad Pro does get hot and is limited by memory, but it works very well for some of my fine-tuned models and RAG setups.
Here is the link to Apple's MLX Swift examples repo. You can tweak it and add your own code to make it work with custom models or other LLMs.
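If you want to kick the tires on a Mac before touching the Swift side, the Python mlx-lm package is the quickest route. A minimal sketch, assuming you pick a quantized model from the mlx-community Hugging Face org that fits your RAM (the repo name below is just an example):

# Quick local LLM inference with the mlx-lm Python package.
# pip install mlx-lm  (Apple silicon required)
from mlx_lm import load, generate

# Example 4-bit quantized checkpoint; swap in your own fine-tune or any
# MLX-converted model that fits in memory.
model, tokenizer = load("mlx-community/Mistral-7B-Instruct-v0.2-4bit")

prompt = "Explain retrieval-augmented generation in two sentences."
print(generate(model, tokenizer, prompt=prompt, max_tokens=100))

The Swift examples wrap essentially the same load-then-generate flow, so a working Python prototype ports over fairly mechanically.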
 
Can anyone seriously state what useful thing Siri can do today that it could not do in 2011?

Serious question.

Not some obscure thing but a functionality that you would use daily, like “wake me at…”

As far as I’m concerned, Siri is the butt of AI jokes.
 
💩 What a marketing 🐂
Apple is late to the game, as usual.
Apple’s scientists are moving the paper through peer review, but this is just “marketing”? This is science. Apple isn’t late. Apple is helping move the broader field of LLMs forward and has been for years. Apple’s public efforts are behind OpenAI’s, but so are every other company’s. We are still in the early days of LLMs. Being first doesn’t mean long-term success.
 
Apple has had a very successful business practice of standing in the background as some technologies emerge. It's as if they intentionally survey the market's demand, usefulness, and practicality, all while planning the next iteration/improvement of said technology. I wouldn't be surprised if at WWDC 2024 or 2025 we see an announcement of that next-level use of the technology, as well as some massive unexpected improvement over others (like when Apple Silicon was announced). Patiently waiting and cheering as we watch this time-proven strategy unfold time and time again (amidst all the negative comments, hate, and sideline commentary). Have my popcorn ready.
 
I really think Apple was caught off guard last year by the hype curve and the real capability of OpenAI with Microsoft's partnership. I'm sure they'll get into the mix soon, and frankly, if it makes Siri more effective, that's the biggest win for me.
 