I think we can all see where this is going. A ton of people aren't just assigning their Action Button to a static shortcut; they're programming their shortcut with conditional statements so the button does different things depending on context, all without ever manually reconfiguring it.
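For anyone who hasn't built one of these, the pattern is just a context switch: gather a few signals, branch on them, run one action. Here's a minimal Swift sketch of that logic; the enums, Context struct, and actionButtonPressed function are all hypothetical illustration, not real iOS or Shortcuts APIs:

```swift
import Foundation

// Hypothetical context signals a shortcut might branch on.
enum FocusMode { case work, fitness, sleep, off }
enum Location { case home, gym, trail, elsewhere }

struct Context {
    let focus: FocusMode
    let location: Location
}

// One Action Button press, many outcomes: pick the action that fits
// the current context, mirroring the if/otherwise blocks people
// build in the Shortcuts app today.
func actionButtonPressed(in context: Context) -> String {
    switch (context.location, context.focus) {
    case (.gym, _):       return "Start treadmill workout"
    case (.trail, _):     return "Start outdoor run"
    case (.home, .sleep): return "Run Goodnight scene"
    case (.home, _):      return "Open the garage"
    case (_, .work):      return "Toggle Do Not Disturb"
    default:              return "Open camera"
    }
}

print(actionButtonPressed(in: Context(focus: .fitness, location: .gym)))
// -> Start treadmill workout
```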
I believe this kind of function is what Apple will introduce as a first-class, automatic "Magic Button" of sorts in the next version of iOS, with one of the headline demos being contextual "one click" control of Home scenes and accessories (why do you think the iPhone has native Thread support now?). This may be part of a Siri overhaul in which Siri bifurcates into two modes: one dedicated to traditional conversational Siri (triggered by holding the lock button), and another dedicated to contextual intelligence based on which Focus mode you're in, where you are (GPS, yes, but especially ultra-wideband; note the new UWB chip this year), and what's on screen. This Super Siri / Magic Button mode would effectively be the next version of the Shortcuts app, which hasn't received major attention since its introduction. Carl Pei touched on this in his reaction to the iPhone 15, suggesting the Action Button may eventually be used to trigger a "Super Siri" that rivals ChatGPT's intelligence.
Today's rumor about the iPhone 16 introducing a dedicated camera button in addition to the Action Button bolsters this idea. I'm not sure how conflicts between the "Magic Button" behavior and the "Super Siri" on-screen intelligence mode would be managed. Hell, I'm not even sure about any of the above; it's loose theory. But what I'm fairly confident about is this: Apple didn't introduce the Action Button as a mere programmable button. That's not Apple. Their long-term goal is for that button to trigger a behavior so essential to the future of their platform that it's worth a dedicated hardware trigger, much like they did with Siri. Of course this will also come to Apple Watch: at the gym, you may want your Action Button to start a treadmill workout; on the trail, to start an outdoor run; and when you've just arrived home, to open your garage.
Another point to consider: "Magic Button" behavior is likely coming to visionOS down the line, especially once we reach the always-on AR glasses paradigm. You look at a lightbulb, tap your fingers, and it turns on. You look at your stove, tap your fingers, and your cooking workflow opens. You look at your desk, tap your fingers, and the visionOS virtual Mac displays pop up in front of you. This is the logical endpoint for visionOS + AI: using all available sensor data to predict intent so that everything becomes one tap. Why not bring this to iPhone too? If anything, it makes sense to debut it on iPhone before visionOS.