Apple wants very much for Siri to work with "natural language", but there are cases where it would be REALLY helpful if Siri could expose some more structured syntax, a sort of meta-language. The most frequent case where this would help me is when Siri obstinately misspells something no matter how you pronounce it. It would be terrific if you could say: "Siri, you're mishearing the restaurant name, let me spell it for you." "Okay, Carl, spell it for me." "R-U-D-F-O-R-D-S." (This is taken from a recent case where I was on my way to meet some people across the street from "Rudford's Restaurant", and Siri kept giving me directions for an hours-long drive to some restaurant with "Redford's" in its name. The restaurant I meant was 15 minutes away; I just wanted the fastest route considering traffic. Siri was useless, and there was no way to get her to give me what I needed, because Apple insists on having a natural-language conversation.)
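To make the "structured syntax" idea concrete, here is a toy sketch of what a spelling meta-command might look like: the assistant drops out of free-form interpretation and treats the next utterance as letter-by-letter input. To be clear, Siri exposes nothing like this; every name here (AssistantMode, CorrectionSession, and so on) is hypothetical.

```swift
// Hypothetical sketch only: Siri has no such mode or API.
// Shows the idea of a "let me spell it for you" meta-command.

enum AssistantMode {
    case naturalLanguage   // normal free-form interpretation
    case spelling          // next utterance is letter-by-letter input
}

struct CorrectionSession {
    var mode: AssistantMode = .naturalLanguage
    var correctedName = ""

    // "Siri, you're mishearing the name, let me spell it for you"
    mutating func enterSpellingMode() {
        mode = .spelling
        correctedName = ""
    }

    // Handle an utterance like "R U D F O R D S"
    mutating func handle(utterance: String) -> String? {
        guard mode == .spelling else { return nil }
        // Treat each single-character spoken token as a letter; ignore filler words.
        let letters = utterance
            .split(separator: " ")
            .compactMap { $0.count == 1 ? $0.uppercased() : nil }
        correctedName = letters.joined()
        mode = .naturalLanguage
        return correctedName   // e.g. "RUDFORDS", then search for "Rudford's"
    }
}

var session = CorrectionSession()
session.enterSpellingMode()
print(session.handle(utterance: "R U D F O R D S") ?? "")   // RUDFORDS
```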
(As a separate answer to your original point, Apple is working to have your various devices converse with each other, using rules about which one is closest to you - for example, measuring how long the sound took to reach each device - and which one you interacted with most recently, to decide which single device should answer your "Hey Siri" request. It's not entirely clear how well or how often this works, particularly if you don't have the latest devices and OS versions.)
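You can picture that arbitration as a simple scoring rule across the devices that heard the request, combining estimated proximity with recency of use. The sketch below is purely illustrative and based on my own assumptions (Apple doesn't publish the actual logic, and I'm using signal strength as a stand-in proxy for proximity rather than sound travel time); the weights and names are made up.

```swift
// Illustrative only: not Apple's actual "Hey Siri" arbitration.
// Assume each device reports how strongly it heard the request and
// when the user last interacted with it; the group picks one responder.

import Foundation

struct SiriDevice {
    let name: String
    let heardSignalStrength: Double   // 0...1 proxy for proximity (louder = closer)
    let lastInteraction: Date         // when the user last touched this device
}

/// Pick the single device that should answer, preferring the closest one
/// and nudging near-ties toward the most recently used device.
func chooseResponder(from devices: [SiriDevice], now: Date = Date()) -> SiriDevice? {
    devices.max { score($0, now: now) < score($1, now: now) }
}

private func score(_ d: SiriDevice, now: Date) -> Double {
    // Weight proximity heavily, recency of use lightly (weights are invented).
    let recency = max(0, 600 - now.timeIntervalSince(d.lastInteraction)) / 600
    return 0.8 * d.heardSignalStrength + 0.2 * recency
}

let devices = [
    SiriDevice(name: "iPhone",  heardSignalStrength: 0.9, lastInteraction: Date().addingTimeInterval(-30)),
    SiriDevice(name: "HomePod", heardSignalStrength: 0.7, lastInteraction: Date().addingTimeInterval(-3600)),
]
print(chooseResponder(from: devices)?.name ?? "none")   // iPhone
```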