- Apr 12, 2001
In a patent application filed in May 2008 and published today, Apple discloses that it has been researching methods to allow media devices such as the iPod to "push" their graphical user interfaces (GUIs) to accessory devices for the purpose of controlling the media device remotely. In pursuing this technology, Apple notes that while remote control accessories for media devices are common, the interfaces are generated by the accessory itself, leading to incomplete functionality and inconsistent experiences when multiple devices are used with a single accessory or a single device is used with multiple accessories.
In response to these issues, Apple proposes methods for media devices to "push" their own GUIs to accessory devices, allowing for full functionality and a consistent user experience across compatible devices and accessories. Existing remote GUIs are defined and controlled by the remote control device, and consequently they may bear little resemblance to a GUI supplied by the portable media device itself. Certain functions available on the portable media device (such as browsing or searching a database, adjusting playback settings, etc.) may be unavailable or difficult to find, so a user may not be able to perform desired functions. Further, GUIs provided for the same portable media device by different remote control devices might be quite different, and a user who connects a portable media device to different accessories with remote control may find the inconsistencies frustrating.
The lead inventor on the patent application is William Bull, former iPod User Interface manager at Apple and currently Senior Director of Mobile User Experience at Yahoo. Also among the inventors is former Apple executive Tony Fadell, the "father of the iPod".

Under the proposed approach, the portable media device can provide the accessory with an image to be displayed on the video screen; the image can include various user interface elements that resemble or replicate the "native" GUI provided directly on the portable media device. The accessory can send information to the portable media device indicating a user action taken in response to the displayed image; such information can indicate, for example, that a particular button was pressed or that a particular portion of a touch-sensitive display screen was touched. The portable media device can process this input to identify the action requested by the user and take the appropriate action, which may include providing the accessory with an updated GUI image that reflects the user action.
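The round trip described above (device pushes a rendered frame, accessory reports a raw user action, device interprets it and pushes an updated frame) can be sketched roughly as follows. This is a minimal illustrative model only; the class and message names are hypothetical and do not correspond to any actual Apple API or protocol.

```python
from dataclasses import dataclass

@dataclass
class GuiImage:
    # Stand-in for a rendered GUI frame; real hardware would push pixel data.
    screen: str

@dataclass
class TouchEvent:
    # The accessory reports only raw coordinates; it has no GUI logic of its own.
    x: int
    y: int

class MediaDevice:
    """Owns the GUI: renders frames and interprets input forwarded by the accessory."""
    def __init__(self, tracks):
        self.tracks = tracks
        self.index = 0

    def render(self) -> GuiImage:
        return GuiImage(screen=f"Now Playing: {self.tracks[self.index]}")

    def handle_event(self, event: TouchEvent) -> GuiImage:
        # Device-side hit testing: the accessory never needs to know what the
        # touched region means (here, the right half of a 320px-wide screen
        # is arbitrarily treated as a "next track" control).
        if event.x > 160:
            self.index = (self.index + 1) % len(self.tracks)
        return self.render()

class Accessory:
    """Acts as a thin display: shows whatever image the device pushes to it."""
    def __init__(self, device: MediaDevice):
        self.device = device
        self.display = device.render()  # initial pushed frame

    def user_touch(self, x: int, y: int):
        # Forward the raw action and replace the display with the pushed update.
        self.display = self.device.handle_event(TouchEvent(x, y))

acc = Accessory(MediaDevice(["Track A", "Track B"]))
print(acc.display.screen)   # Now Playing: Track A
acc.user_touch(200, 50)     # touch falls in the hypothetical "next" region
print(acc.display.screen)   # Now Playing: Track B
```

Because all interpretation happens on the media device, every accessory showing this GUI behaves identically, which is the consistency benefit the application emphasizes.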
Article Link: Apple Researching Methods for 'Pushing' User Interfaces to Accessories from Media Devices