This feature is bloody marvelous.
The head movement is just one more way to interact; there could also be physical switches, like a jaw switch or a big red button.
The thing I see as fantastic is that this feature (switch control) works at the system level, both in iOS and in OS X Mavericks. Until now, I believe, these features have always been third-party plugins. Plugins which conflict with drivers from time to time, and only work in selected apps/programs.
This is also one way to get access to the school market, where some schools have had to say no to Apple because their students couldn't use its products.
And Apple has done this in the years since 2010, while almost every other solution in use today is essentially the same as what was used in the '80s.
The way they solved switch control in OS X looks truly amazing.
What's missing now is eye control. If they could crack that one at a consumer price... well, the potential here is much more than meets the eye.
And with the API to the speech engine, apps can drop their price by $100 on the spot, with no need to license an Acapela voice, for example.
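For the curious: a minimal sketch of what using the built-in speech engine looks like via AVFoundation's speech synthesizer (introduced with iOS 7), shown here in modern Swift syntax. The phrase and language code are just placeholders.

```swift
import AVFoundation

// Create the system speech synthesizer -- no third-party voice license needed.
let synthesizer = AVSpeechSynthesizer()

// Wrap the text to speak in an utterance and pick a system voice.
let utterance = AVSpeechUtterance(string: "Hello, world")
utterance.voice = AVSpeechSynthesisVoice(language: "en-US")

// Hand it to the synthesizer; it speaks asynchronously.
synthesizer.speak(utterance)
```

That's the whole integration for basic text-to-speech, which is why it removes a big chunk of cost for AAC-style apps.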
If the text prediction seen in Mavericks is available on the iPad as well, it will be possible for small companies to make big apps with few resources. Ideas from frustrated users can be realized, since Apple is now unlocking much of the functionality that used to require many experts to develop yourself.
My son has cerebral palsy and uses his iPad for communication; before that he had a Tobii C12. Around $10,000, 2 GB of RAM, a 1.2 GHz Windows computer, 4 hours of battery, and so heavy that my son's little wheelchair almost tipped over. The iPad solved most of those problems, and iOS 7 seems to solve almost all of the problems that remain.
But that "shake to undo" sucks! Every time the wheelchair hits a little bump in the ground, it pops up like a Windows Update dialog. All the time... Is it possible to turn it off in iOS 7?