
MacRumors

macrumors bot
Original poster
Apr 12, 2001


Apple doesn't currently allow iPhone users to change the Side button's Siri functionality to another assistant, but owners of iPhone 15 Pro and newer models can assign ChatGPT to the Action button instead. Keep reading to learn how.


OpenAI's free ChatGPT app for iPhone lets you interact with the chatbot using text or voice. What's more, if you assign voice mode directly to the Action button, you can use it to jump straight into a spoken conversation, giving you quick, hands-free access to a far more capable assistant.
  1. Install the ChatGPT app, then sign in to your account or create a new one.
  2. Next, open the Settings app.
  3. Tap Action Button.
  4. Swipe to Controls, then tap the two chevrons beside the currently assigned control.
  5. Using the control search field, type "ChatGPT."
  6. Select the control Open ChatGPT Voice.


A long press of the Action button will now open ChatGPT's voice mode. The first time you activate it, the app may request microphone access. Tap Allow to proceed. After that, you can begin speaking immediately.

A recent update means voice conversations now take place inside the same chat window as text-based prompts, instead of switching to a separate voice-only interface. Responses appear in real time, combining spoken output with on-screen text and any visuals the model generates. This keeps your conversation's context intact and makes switching between typing and speaking smoother.

Continue the Conversation Anywhere

You can also leave the ChatGPT app during a voice session without ending it. When you swipe away, the conversation continues and appears in the Dynamic Island as long as the assistant is listening and preparing a response. To stop, tap the Dynamic Island to return to the app, then tap End.

Note that while this setup gives you a fast, physical shortcut to a richer, more context-aware assistant, ChatGPT can't perform Siri-like system actions such as accessing your calendar or setting an alarm.

Article Link: Use ChatGPT as Your iPhone's Action Button Assistant
 
Mine is just Mute all the way.
I’d love to have a dedicated physical silent mode slider come back so we could have both. I know it’s simple to just mute in control center but maintaining the tactile feel/option is a plus IMO.
 
It won’t. Siri, though, might. Well, not in name perhaps, but it’s very likely it will just be a skin for Google Gemini.

Siri isn't going away.

The report on the street is that Apple is going to license a separate installation of Gemini that will reside on Apple's PCC (Private Cloud Compute) servers to power Siri. Google would have zero access to this installation, so no data scraping will occur.

Apple has been wise to not rush into the LLM-powered AI Assistant era until the technology has matured and the compute costs come down. That's now happening.

What Apple should've done is gone full bore into the SLM (Small Language Model) architecture instead of "one model to rule them all". Have specialized models for different tasks, with Siri being a "conductor", but this technology also didn't exist in a usable form until recently.

While the frontier labs are releasing new models at breakneck speeds, we're only just now, this week in fact, starting to see models that would be able to meet the needs of Siri.

Great time to be alive.
 
Why would I want some antiquated ChatGPT model? Give me Gemini 3.0 now with some Nano Banana Pro!
 

Wow, amazing. Are you in tech, if you don’t mind my asking?

You seem so down to earth and level-headed… what’s your take on Android/Google? Do you do your best to avoid them like the plague? It’s still unfortunate they bought YouTube, smh…

Do you think it’s hypocritical for Apple to tout privacy as much as they do but then do all these side deals with Google? Most ppl love Google search and use it without giving it a second thought… how unfortunate :(

So for most it’s no big deal, and Apple prob knows this, smh.

Here’s to hoping Google can’t access any data with Siri integration, but I still don’t trust it. I won’t be using Siri as long as Google is associated with it. It’s very simple, just like how I always change my default search engine to anything other than Google search…
 