I think they are buying H100s and talked up Nvidia back at WWDC. The Scary Fast event really got into machine learning on the Mac, so I wonder if there will be a pivot to Apple silicon. Maybe those M3 Ultras that will likely be twice as fast as M2 Ultras will get some use in Apple data centers.
Would be nice if “those M3 Ultras” were likely to “be twice as fast as M2 Ultras”, but there's not a chance of that. Looks more like 20% faster than M2 Ultras, 40% faster than M1 Ultras.
 
Well, ChatGPT-4 does not "understand" things. At least not in the way a real AI would (if one ever comes about).

These generative machines are extremely elaborate mimeograph machines.

Whether or not Apple plans on competing with ChatGPT remains to be seen. Tim Cook is rather good at being nebulous.
That's a pedantic comment if I ever saw one. AI, in the context of the article and of the wider discourse, refers to large language model based chatbots. It's not artificial intelligence in the way it is in science fiction, and stopping to point that out every time there's any discussion about AI is not helpful, informative or clever. From a user perspective, language models like ChatGPT can 'understand' a user's input to a level that feels astonishingly realistic, and that is what is important.

When people talk about wanting a computer that can understand them, they mean one that can understand conversational queries like "Would it be quicker to take the train or to drive to get to my sister's right now?"; they don't care whether it can understand their query on an emotional level, or whether the computer can grasp the philosophical concept of what a question means.

Right now, Siri's level of understanding is pretty awful. When I tell Siri to turn the lights down in our children's room, I have to say something along the lines of 'set scene boys evening', and I have to pre-configure the scene in the Home app. I've had to set up a bunch of scenes for different rooms, and I have to pronounce each scene name correctly. If Siri had the conversational ability of ChatGPT-4, I wouldn't have to say the perfect phrase every time. I could, for example, say something like "Hey Siri, could you turn the lights in the boys' room to their bedtime setting and could you put on the heating downstairs as it's a bit chilly tonight". Even without my saying the scene names, it would know what I meant: it would know that I want the downstairs thermostat raised to a level that activates my boiler, without my needing to set a temperature, and it could put together a whole string of actions from simple conversational statements, without pre-set scenes or anything like that.
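Purely to illustrate the missing piece (this is not Apple's API; the device names and the ask_llm stub below are made up): the step Siri lacks is a model that turns a free-form sentence into structured actions. A minimal sketch in Python:

```python
# Minimal sketch, not Apple's implementation: a language model turns a
# conversational request into a JSON list of home actions, which a dispatcher
# could then hand to real accessories. `ask_llm` is a placeholder.
import json

PROMPT = """Convert the user's request into a JSON list of actions.
Allowed actions: set_lights(room, level), set_thermostat(room, mode).
User: {request}
JSON:"""

def ask_llm(prompt: str) -> str:
    # Stand-in for a real model call; hard-coded to show the expected shape.
    return json.dumps([
        {"action": "set_lights", "room": "boys room", "level": "bedtime"},
        {"action": "set_thermostat", "room": "downstairs", "mode": "heat"},
    ])

def handle(request: str) -> None:
    for a in json.loads(ask_llm(PROMPT.format(request=request))):
        # A real assistant would dispatch these to HomeKit accessories.
        print(f"-> {a['action']} ({a['room']}): {a.get('level') or a.get('mode')}")

handle("Turn the lights in the boys room to their bedtime setting "
       "and put the heating on downstairs, it's a bit chilly tonight")
```

The hard part isn't the dispatching, it's getting a model reliable enough that the actions it emits actually match what was asked.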
 
AI should be something everyone is entitled to, which means allowing every citizen to have access to their own AI, one they can keep as a sort of digital companion, privately and across time.

I hope what Apple may be offering in the future is a personalised AI subscription, one per Apple account: a personal assistant of sorts, far beyond Siri but maybe sharing her name.

An AI bot that is personalised to you, remembers every conversation you've ever had, knows all your files and history across devices, and serves as a valuable personal assistant. Unfiltered, uncensored, and fully programmable by each customer through use.

Tie that in with potential real-time image generation experiences through visionOS: you describe environments and actions taking place, and the AI generates immersive, animated scenes in real time.
 
Well, ChatGPT-4 does not "understand" things. At least not in the way a real AI would (if one ever comes about).

These generative machines are extremely elaborate mimeograph machines.

That's where you get it wrong.

GPT has a neural model, meaning it simulates human neurons. Of course, it is a more simplistic model, but in it each object has a group of "qualia" associated with it, which is a tridimensional model of how an object should look.

Speaking in a simplistic way, each object is a group of associations:

Car -> is red
    -> is blue
    -> is Toyota
    -> is useful

So, imagine you ask: "GPT, what is a car?"

It will go through this model, process its associations, and say something like:

"A car can be red or blue. Cars are usually from the Toyota brand, and they are useful."

That's different from simply holding in its memory a copy of the sentence "a car is red". A classic computer can only reproduce that sentence; it can't transform it into any form other than the ones it was designed for, unless a human writes a specific program to transform it in a different way.

But GPT can. And it can transform sentences into things it wasn't designed for beforehand.

By the way, this also happens to be how WE humans associate and process information. We're complicated predictive / associative models.
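To make that point concrete (keeping in mind that a real model stores these associations as learned weights over billions of parameters, not as an explicit lookup table like this toy does), here is roughly the difference between reciting a stored sentence and assembling one from associations:

```python
# Toy sketch of the "group of associations" idea above. Nothing here resembles
# how GPT is actually implemented; it only illustrates that the output sentence
# is assembled from associations rather than copied from stored text.
associations = {
    "car": {
        "colours": ["red", "blue"],
        "brand": "Toyota",
        "property": "useful",
    }
}

def describe(concept: str) -> str:
    links = associations[concept]
    return (f"A {concept} can be {' or '.join(links['colours'])}. "
            f"{concept.capitalize()}s are often made by {links['brand']}, "
            f"and they are {links['property']}.")

print(describe("car"))
# A car can be red or blue. Cars are often made by Toyota, and they are useful.
```

No sentence like that output was ever stored; it falls out of the associations. The real disagreement is whether assembling text that way deserves the word "understand".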
 
You are correct in theory, but the point is that Apple hasn't been doing it. The issue is not generative AI in general, but APPLE making generative AI when they've made Siri a moron. So it's useless to state that it can be "damn reliable" when Apple hasn't been making it reliable.

Siri often doesn't even understand very regular commands for tasks it's specifically designed and marketed to do. This is common knowledge. We keep assuming that Apple will fix things, but they haven't fixed Siri. In fact, by many accounts Siri has gotten worse over these past several years.

Last night I told it to play a song in three different ways. Siri did three different things, none of which was playing the song. I tried naming the song; then the song and the album; then the song and which playlist it's in. Finally I had to pick up my iPad and tap into the app to do it myself.

Part of the issue is probably in how Siri recognizes speech. It's using an ancient recognition engine; just compare it to OpenAI's results.

The other part of the issue is that it doesn't understand sentences too well. But that's a more complicated problem to solve. Still, Siri's reliability would improve a lot if it recognized words more reliably.
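If "OpenAI's results" means their open-source Whisper model (my assumption), the gap is easy to check for yourself; modern speech recognition fits in a few lines (pip install openai-whisper, and the audio filename here is just an example):

```python
# Transcribe a local audio clip with OpenAI's open-source Whisper model.
# Illustrative only: it says nothing about what Apple actually ships in Siri.
import whisper

model = whisper.load_model("base")        # small model, runs fine on a laptop
result = model.transcribe("request.m4a")  # path to any local audio file
print(result["text"])
```

Run that on the same phrase Siri mangles and the difference in word-level accuracy is usually obvious.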
 
If Apple's AI were carbon neutral it would be a game changer and would differentiate it from the rest. In Tim Cook we trust 🌏
 
Microsoft, Google, et al. have left Apple for dead when it comes to AI.

Cook's problem is that he only sees the $, and forgets he has to produce a compelling, competitive product first.
 
Siri has to improve by a huge margin. Expecting to see iOS 18 bring some improvements.
 
I want Siri to understand different languages properly. When I used CarPlay, Siri couldn’t understand the street name I said in Swedish, but Google and Waze could understand me. When Siri was just an app in the beginning, it was mind-blowing. Apple bought it and did nothing with it other than integrating it with Apple devices. Everyone else followed after and became better than it.
 
I want Siri to understand different languages properly. When I used CarPlay, Siri couldn’t understand the street name I said in Swedish, but Google and Waze could understand me. When Siri was just an app in the beginning, it was mind-blowing. Apple bought it and did nothing with it other than integrating it with Apple devices. Everyone else followed after and became better than it.
Siri can’t even get a grip on Australian.
 
The weird thing at the moment is that «AI» is at the same time a terrible fad and yet it isn't. Like with other technologies before, it is over-hyped: magazines, consultants, every effing piece of software suddenly goes on about AI. Like with cryptocurrency, über-expensive JPGs and the hula hoop. It's super annoying and in many ways what our attention-economy society has been all about these last years.

On the other side, the technology, which is not artificial intelligence but only a very good simulacrum by means of complex neural learning patterns and vast databases (the mimeograph comment above is gold), is no ********, even if some applications, like generating the same manga-esque image iterations of childwomen like Mia Gezelig over and over again, might be; although that is also very Gibson-esque and very human.

Where Siri is a simplistic hand-coded database (groundbreaking at the time) that must become worse the larger it grows, a GPT-driven personal assistant would get better with the size of what it learns. Just imagine if OpenAI's ChatGPT would remember your sessions, learn from you personally, adapt to your prompts and evolve around the things it knows about you. Now transfer that to a system that knows you intimately: your schedules, your contacts, your to-dos, your health, your moods, your favorite music, food, books, interests … all of it.

Combining the almost terrifying 360-degree knowledge the Apple ecosphere has about its users with a robust learning system would potentially lead to a whole different kind of OS, to a real personal computer. Whether that will feel like a person or work in more discreet ways, we will see (at the moment the almost-invisible applications seem to work well for Apple), but if they are smart they will quietly rework Siri behind the scenes, keep the brand name, seamlessly feed what their own ecosystem offers into a GPT model, personalize it and make it seem easy and friendly and smooth, as they always (well, mostly) manage to give tech a friendly face.
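Mechanically, the "remembers your sessions" part is not exotic; the crude version is just persisting every exchange and feeding recent history back in with each prompt (a sketch under my own assumptions, with ask_llm again a placeholder; a real system would retrieve from a much larger store rather than replay raw history):

```python
# Crude sketch of a session-remembering assistant: persist every exchange and
# prepend recent history to each new prompt. Purely illustrative; this is not
# how Apple (or anyone shipping at scale) would actually build it.
import json
from pathlib import Path

MEMORY = Path("assistant_memory.json")

def load_history() -> list[dict]:
    return json.loads(MEMORY.read_text()) if MEMORY.exists() else []

def ask_llm(prompt: str) -> str:
    return "(model reply would go here)"   # placeholder for a real model call

def chat(user_message: str) -> str:
    history = load_history()
    context = "\n".join(f"{m['role']}: {m['text']}" for m in history[-20:])
    reply = ask_llm(f"{context}\nuser: {user_message}\nassistant:")
    history += [{"role": "user", "text": user_message},
                {"role": "assistant", "text": reply}]
    MEMORY.write_text(json.dumps(history, indent=2))
    return reply
```

The interesting questions are less about the code and more about where that memory lives and who gets to read it.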

Then again, as Apple does a lot of things and then stops at 70% for some reason, never realizing the full potential of many products and ideas, it might still suck 😂
 
No one has been able to monetize "AI" yet because no one wants to pay for it. OpenAI is still burning through cash. The only one making money is Nvidia, because it's selling shovels during a gold rush.

Generative AI is a fad, and Apple had better start looking for opportunities somewhere else.

I just hope you don’t have kids.
 
No one has been able to monetize "AI" yet because no one wants to pay for it.
Microsoft have monetised AI quite well, actually. Read their latest results. Customers are paying $400k/month for GPT-4 PTUs and $100k/month for GPT-3 PTUs. There's obviously a large investment gone into that, but people are certainly willing and able to pay for it.
Then you have M365 Copilot just launching at $30/user/month, which customers have been asking for for months but couldn't buy until now. GitHub Copilot is one of the fastest-growing services of all time.

Then you have Siri setting timers, so I can see why you'd think AI is failing if Apple is your whole tech world!
 
I hope what Apple may be offering in the future is a personalised AI subscription, one per Apple account: a personal assistant of sorts, far beyond Siri but maybe sharing her name.
Oh, that's a hard no from me. The only way to do this would be to monetise the data collected by the AI, so effectively this would end up like Facebook, but with an apparently friendly AI coercing you into spending money on behalf of the highest bidder.
Trust Apple you say? With Tim at the helm? HELL NO
 
Siri is the worst AI I have ever used in my entire life. I'm also not impressed by Apple Music's AI at recommending me music.

So I don't think anything good will come out of this.

But I get Tim Cook. AI is the buzzword right now, and if you want to increase your stock price, you just mention AI.
 
No one has been able to monetize "AI" yet because no one wants to pay for it. OpenAI is still burning through cash. The only one making money is Nvidia, because it's selling shovels during a gold rush.

Generative AI is a fad, and Apple had better start looking for opportunities somewhere else.

I doubt Apple sees AI as a profit generator in itself, but rather as an elixir to continue selling their super-profitable products such as iPhones, iPads, Watches and Macs far into the future.
 