
MacRumors

macrumors bot
Original poster
Apr 12, 2001


Google has launched its dedicated Gemini artificial intelligence app for iPhone users, expanding beyond the previous limited integration within the main Google app. The standalone app offers enhanced functionality, including support for Gemini Live and iOS-specific features like Dynamic Island integration.

google-gemini.jpg

The new app allows iPhone users to interact with Google's AI through text or voice queries and includes support for Gemini Extensions. A key feature is Gemini Live, which wasn't available in the previous Google app implementation. When engaged in a conversation, Gemini Live appears in both the Dynamic Island and Lock Screen, letting you control your AI interactions without returning to the main app.

The app is free to download, and Google offers premium features through Gemini Advanced subscriptions available as in-app purchases. Gemini Advanced is part of the Google One AI Premium plan, which costs $18.99 per month. In addition to Gemini in Gmail, Docs, and more, it includes access to Google's next-generation model, Gemini 1.5 Pro, priority access to new features, and a one million token context window. Users need to sign in with a Google account to access the service.

google-gemini-app.jpg

The rollout follows an initial soft launch in the Philippines earlier this week, with availability now extending to additional regions including Australia, India, the US, and the UK.

Previously, iOS users could only access Gemini through a dedicated tab within the main Google app, which offered a more limited experience compared to the Android version. This standalone release, available on the App Store, brings feature parity closer between the two mobile platforms.

Article Link: Google Releases Standalone Gemini AI App for iPhone
 
And Google doesn’t restrict it by your iPhone model.
That’s a very valid point. Folks like myself who are unwilling to give up a physical SIM tray in the US don’t get AI features. When I try to “look up” a photo on the internet by long-pressing, there’s a greyed-out AI icon showing I’m not cool enough to have it, ha.
 
Funny fail screenshot on the right: the Geekbench multi-core score for the M4 Pro seems incorrect; it should be above 20k.

Not the first time Gemini has failed in a demo.
 
That’s a very valid point. Folks like myself who are unwilling to give up a physical SIM tray in the US don’t get AI features. When I try to “look up” a photo on the internet by long-pressing, there’s a greyed-out AI icon showing I’m not cool enough to have it, ha.
So you bought an iPhone from outside the US to get a SIM tray? I purchased mine in Sweden and have it set to US with a US Apple account, and I'm getting all the Apple Intelligence features over here. So if you're not getting them, it's not because of where you purchased your iPhone. Unless you were referring to something else?
 
Hold the phone here. I demand Google store all their app data locally on the iPhone and on Apple's secure servers, with Google banned from scraping my usage patterns globally. /s
 
So you bought an iPhone from outside the US to get a SIM tray? I purchased mine in Sweden and have it set to US with a US Apple account, and I'm getting all the Apple Intelligence features over here. So if you're not getting them, it's not because of where you purchased your iPhone. Unless you were referring to something else?
I still have a 13 Pro, the last US model with a physical SIM.
 
I don't know the average age of a commenter on this forum, but I'm gonna take a guess and say I am the oldest. I have no problem giving my information when talking to an AI. For me, this is something I've been waiting for since I was reading sci-fi back in the '50s… the real "future" is gonna be a blast. It's going to be a nightmare, it's going to be enjoyable. The idea that I can talk with the information of the world, have wide-ranging, deep, and philosophical conversations with an AI, I mean, seriously, open my pod bay doors.
 
AKA: we hope you divulge your deepest secrets to our AI, so we can cult-style blackmail you later!

In this digital age, privacy is surely a relic of the past, if it wasn’t already. It’s not fair, the way our data is treated.

Great - now they will also know what you are thinking, or what you're not able to think through and need help with.
Full behavioral and psych profile.
I guess it pays well 🤔

They went past full behavioral models a while ago.

I warned folks here a few years back about what Google was up to. They're going to replicate every person in software. First to predict behavior and then to modify it. It started with browser behavior. Having a local AI is pretty much the last step. It's conversational and adaptive, so people won't worry about syntax, they'll just talk to it naturally. There are so many nuances to conversation and people are going to just give that away to this company. Wearables will offer biofeedback as another plank in the psych profile.

Even worse, there will be no getting away from it for the saner people out there who refuse to participate. All those Google-ites who don't physically cover their cameras (a Spy-Fi case, or at least a piece of tape or Wite-Out) will be providing loads of visual data on everyone and everything. Mics are live 24/7.
 
And it does not run on-device like Apple's AI. 😊
Neither does Apple's. More serious questions are sent to OpenAI. No mobile processor is or will be able to handle everything on the device. Of course, basic queries may be processed on the device, but these will be trivial: simple questions, smart home operation, weather, looking something up.
 