Hahaha. I wish we could freeze this and revisit in 15 years to see what has actually happened.
The fact that you used no paragraph breaks in that is funny as hell.
AI will soon transform the world beyond your wildest little dreams, of which you have none, of course, because you have zero vision for what it is capable of.
Siri is delayed because Apple is garbage and has no capabilities with AI. They are being horrifically outpaced, and if OpenAI makes hardware, they’re done.
Also the energy costs of AI are of ZERO consequence. We need to use ALL the energy necessary for AI because AI will be the only thing that can stop climate change. Humans cannot do it alone in time. AGI and ASI will build out fusion plants astronomically faster. They’ll solve innumerable climate issues clear across the board.
These are things that will be happening. Oh, by the way, AI will cure all human diseases within 15 years. Enjoy. And by the way, the very reason you know AI is the real deal is specifically because the cheaters in finance and on Wall Street are pouring literally hundreds of billions into it. Including the US government. Guess what buddy—these people do their due diligence.
They don’t.
All the phantom braking begs to differ.
Do you understand the context of this?
The fact they’re now talking about having to do hardware updates begs to differ.
I have one. However, like many people I don’t carry my camera with me, which is why improvements in camera quality are welcome.
Why not buy a real camera if photography is important to you?
No. But I do know how to use YouTube, X, etc.
They don’t.
Do you understand the context of this?
I understand the context and it’s not a hardware upgrade to get rid of phantom braking.
You drive a Tesla?
I understand the context and it’s not a hardware upgrade to get rid of phantom braking.
It’s to increase the processing power of hardware 3 to be able to get close to robotaxi.
So… no?
It’s to increase the processing power of hardware 3 to be able to get close to robotaxi.
HW3 isn’t up to the level of HW4 and beyond, which can do more computations/sec than HW3. Tesla already committed to upgrading HW3 chips for those who bought FSD.
So you don’t drive a Tesla, got it.
Is HW3 not capable of FSD?
That sounds like Cameron Crowe.
If one lowers their expectations to ZERO, then they are never disappointed. That is becoming a mindset for many Apple users.....
That’s absolutely hilarious. ChatGPT Advanced Voice is so far beyond anything Siri ever has been or ever will become.
It honestly makes Siri seem like the exact same thing it has been since it launched in 2011, which is basically what Siri is. Siri is almost entirely useless. The only things you can ask for are extremely simplistic tasks, and you have to hope it hears you right: ‘What’s the temperature’, ‘set a timer for X’, ‘do I have any messages?’, ‘send a message to Y’, and even then it misunderstands you 85% of the time.
This report is even MORE comical because by the time iOS 20 releases in Fall 2026, OpenAI’s Advanced Voice mode will be near-human level entirely, and its knowledge will be so vastly beyond Siri’s it’s incredible. It already is so vastly beyond Siri it’s incredible, and it will only get farther and farther ahead. Moreover, all the major AI companies will also be miles and miles ahead of anything Siri will ever become. xAI’s voice mode in Grok 3 is very good now too. Anthropic will be doing more with voice too, and Amazon is using Anthropic for their new Alexa, which is launching within the next several months. The new Alexa using Claude will make Siri utterly pathetic in terms of actionable requests too, because while OpenAI’s voice mode won’t be passed, the new Alexa using Claude will be extremely close and right on par with it, and with all of Alexa’s device integrations, Alexa will be able to take tons of actions for the user.
But, how could they have "slept on AI"? Does Apple's leadership not have a finger on the pulse of the industry?
I disagree. Apple is a hardware AND software company.
Their devices are only great because the hardware and software work hand in hand.
They just slept on AI and are now scrambling to put something out in that area...
I respectfully disagree, but I think Apple put the cart before the horse. They should have developed an LLM on servers first, because they knew they were already behind with a substandard product, and then modified it to work on-device. It was interesting to watch Elon Musk's presentation on Grok 3, because he compared the full LLM to the mini model (I have no insider knowledge as to how much smaller it is), which performed only slightly worse.
You understand that on-device is code for ‘absolute garbage’. Yes, it has more privacy, but it is simply too bad to be worth using.
OpenAI will literally have AGI voice in a few years, and it will be something so absolutely nuts to behold that no one would dream of using on-device models. It will literally be indistinguishable from talking to another human, aside from the fact that it knows almost everything.
But, isn't that Apple's fault? Apple chose to put the cart before the horse. Since Apple was behind, it should have developed an LLM on servers first to catch up and have more computing resources available to make Siri more capable. The end user isn't going to give Apple credit for having it run on devices. They will look to see which AI performs best. Currently, it is a triage process to see which AI survives into the next round. At this point, Siri will only survive because it is pushed by Apple, but Siri may not survive for long in this rapidly developing field. And, if Siri doesn't survive, HomeKit doesn't survive.
I feel like it's unfair to compare this upcoming Siri to OpenAI's offerings, since presumably LLM Siri will be running on-device. OpenAI doesn't offer anything local or that can be used offline. Very different beasts.