Imagine Microsoft coming out with the next version of Windows Mobile, finally integrating multi-touch, only to be met by Apple's iPhone with a full-color, 3D holographic interface.
Not sure about multi-touch, which makes more sense on larger devices. But they've already shown pieces of Windows Mobile 7, and one input option is to use the camera to detect 3D movement of the phone... and your command gestures.
Darn, and I really thought that was novel. But wait, fake doesn't mean it's impossible to program, right?
As I said previously, this is being done today on other devices. Unfortunately, I can't find an example for you right now, but I'll keep looking.
In a similar vein, here's a video where a guy used a Wii remote to track the viewer's head and simulate 3D depth on a flat monitor. (Jump ahead to about 2:35 if you wish.) This would be a cool UI idea to implement in reverse on any phone (iPhone, Nokia, or WM type) that has an accelerometer... let you look "behind" file cabinets, etc.
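The "look behind things" trick is basically parallax: when the phone tilts, UI layers at different depths shift by different amounts, so deeper layers peek out from behind shallow ones. A minimal sketch of that idea (the function name, the pixel depths, and the flat tilt-to-offset model are my own illustration, not from any actual phone SDK):

```python
import math

def parallax_offset(tilt_rad, layer_depth_px):
    """Horizontal on-screen shift for a UI layer at the given
    virtual depth when the device is tilted by tilt_rad radians.
    Deeper layers shift more, creating the illusion that you can
    look "behind" foreground elements."""
    return layer_depth_px * math.tan(tilt_rad)

# Tilt the phone ~6 degrees: a shallow foreground layer barely
# moves, while a deep background layer slides noticeably.
front = parallax_offset(0.1, 20)    # foreground, 20 px "deep"
back = parallax_offset(0.1, 200)    # background, 200 px "deep"
```

Feeding the accelerometer's tilt reading into something like this every frame, and offsetting each layer accordingly, would give the reverse of the Wii head-tracking demo.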
Edit: There's another variation being done as well, where 3D objects are superimposed in realtime on top of the camera image. For example, you could point your camera at a tabletop and little chess pieces would appear on it. As you reach out an arm, the camera and software watch and let you "move" the pieces in realtime... even if you change the camera angle.
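The core of that overlay trick is re-projecting each virtual object's 3D position into camera pixel coordinates every frame, so the piece stays "glued" to the tabletop even as the camera moves. Here's a toy sketch using a pinhole camera model (the focal length and 640x480 image-center values are made up for illustration, not taken from any real AR toolkit):

```python
def project(point, focal=500.0, cx=320.0, cy=240.0):
    """Project a 3D point (in camera coordinates, metres) onto a
    640x480 image plane using a simple pinhole camera model."""
    x, y, z = point
    if z <= 0:
        raise ValueError("point is behind the camera")
    return (focal * x / z + cx, focal * y / z + cy)

# A chess piece 1 m straight ahead lands at the image centre;
# shift it 10 cm to the right and it projects right of centre.
centre = project((0.0, 0.0, 1.0))
shifted = project((0.1, 0.0, 1.0))
```

A real system would first estimate the camera's pose relative to the table (that's the hard computer-vision part) and transform the piece into camera coordinates before projecting, but the per-frame projection step looks like this.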
Edit again: Aha! Here you go for some examples of "augmented reality".
Video here. Most of these use head tracking and goggles, but the techniques are similar to what's being discussed here.