
MacRumors

macrumors bot
Original poster


Apple's patent applications provide a glimpse of what might lie ahead. For example, we saw an explosion of multi-touch patents in the year before the iPhone's release. And while many of these concepts may never come to market, they provide an interesting look at the direction of Apple's research.

One of the most recent patent applications from Apple is titled Multitouch Data Fusion and is authored by Wayne Westerman and John Elias (formerly of Fingerworks). Westerman and Elias have been prolific publishers of multi-touch patent applications and likely helped establish the multi-touch technology behind Apple's iPhone. While many have since hoped to see some sort of advanced multi-touch interface for Apple's Macs, only limited multi-touch support has been included in Apple's notebooks.

In Multitouch Data Fusion, however, Westerman and Elias are already exploring the use of other inputs to improve and augment multi-touch interfaces. These include:

- Voice recognition
- Finger identification
- Gaze vector
- Facial expression
- Handheld device movement
- Biometrics (body temp, heart rate, skin impedance, pupil size)

The patent application gives examples of how each of these could be used in conjunction with multi-touch to provide a better user experience. A few highlights are provided here:

Voice - Some tasks are described as better suited to voice recognition, others to multi-touch. For example, if a user's task is to resize, rotate, and change the color of an object on the screen, multi-touch would be best suited to the resize and rotate steps, but changing the color (or inserting text) may be better handled by a voice command.
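
Purely as an illustration (the names and structure below are ours, not the patent's), a sketch of how one editing task might be split across the two modalities:

```swift
import Foundation

// A sketch of splitting one editing task across modalities, as in the
// resize/rotate/recolor example: continuous manipulation arrives as touch,
// while the discrete attribute change arrives as a spoken command.
// All names here are illustrative, not from the patent.

struct Shape {
    var width = 100.0
    var height = 100.0
    var rotation = 0.0      // degrees
    var color = "blue"
}

// Touch handles the continuous part of the task (pinch to resize, twist to rotate).
func applyPinchAndRotate(_ shape: inout Shape, scale: Double, degrees: Double) {
    shape.width *= scale
    shape.height *= scale
    shape.rotation += degrees
}

// Voice handles the discrete part ("make it red").
func applyVoiceCommand(_ shape: inout Shape, command: String) {
    let prefix = "make it "
    if command.lowercased().hasPrefix(prefix) {
        shape.color = String(command.dropFirst(prefix.count))
    }
}

var shape = Shape()
applyPinchAndRotate(&shape, scale: 1.5, degrees: 45)
applyVoiceCommand(&shape, command: "Make it red")
print(shape)   // Shape(width: 150.0, height: 150.0, rotation: 45.0, color: "red")
```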

Finger identification - The patent application suggests that a built-in camera (such as the iSight) with a swing mirror could provide an over-the-keyboard view of multi-touch gestures. This video information could be used to better distinguish which fingers are being used and in which positions. In conjunction with the touch input, this could enable more specific and accurate gestures.
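
A minimal sketch of what that fusion step could look like, assuming the camera's fingertip positions have already been mapped into touchpad coordinates (an assumption for illustration, not a detail from the filing):

```swift
import Foundation

// Hypothetical fusion of camera-based finger identification with raw touch
// data: each touchpad contact is labeled with the finger the camera sees
// closest to it. Types and coordinate alignment are assumed for illustration.

struct Point {
    var x: Double
    var y: Double
    func distance(to other: Point) -> Double {
        ((x - other.x) * (x - other.x) + (y - other.y) * (y - other.y)).squareRoot()
    }
}

enum Finger: String {
    case thumb, index, middle, ring, pinky
}

// Fingertip positions reported by the (hypothetical) over-the-keyboard camera.
let cameraFingertips: [Finger: Point] = [
    .index:  Point(x: 40, y: 25),
    .middle: Point(x: 55, y: 20),
]

// Raw contacts reported by the multi-touch sensor.
let touchContacts = [Point(x: 42, y: 26), Point(x: 54, y: 22)]

// Label each contact with the nearest camera-identified fingertip.
for contact in touchContacts {
    let nearest = cameraFingertips.min { lhs, rhs in
        contact.distance(to: lhs.value) < contact.distance(to: rhs.value)
    }
    if let nearest = nearest {
        print("Contact at (\(contact.x), \(contact.y)) -> \(nearest.key.rawValue)")
    }
}
```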

Gaze - Tracking where the user is looking could help select windows or objects on the screen. Rather than moving a mouse pointer to the proper window, a user could simply direct their gaze at the particular window and then invoke a touch gesture.
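
Again as an illustration only, with hypothetical window and gesture types rather than any real API, gaze-plus-touch targeting might reduce to a simple hit test:

```swift
import Foundation

// Hypothetical gaze-plus-touch fusion: the gaze vector picks the target
// window, and the touch gesture is then routed to it.

struct Rect {
    var x, y, width, height: Double
    func contains(_ px: Double, _ py: Double) -> Bool {
        px >= x && px <= x + width && py >= y && py <= y + height
    }
}

struct Window {
    let title: String
    let frame: Rect
}

let windows = [
    Window(title: "Mail",   frame: Rect(x: 0,   y: 0, width: 600, height: 400)),
    Window(title: "Safari", frame: Rect(x: 620, y: 0, width: 800, height: 600)),
]

// Gaze point estimated by the (hypothetical) eye tracker, in screen coordinates.
let gaze = (x: 700.0, y: 120.0)

// Instead of mousing over to a window, the gaze chooses the target...
if let target = windows.first(where: { $0.frame.contains(gaze.x, gaze.y) }) {
    // ...and the touch gesture is applied to it.
    print("Two-finger scroll applied to \(target.title)")
} else {
    print("Gaze does not fall on any window; gesture ignored")
}
```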

Facial expression - Detecting frustration on a user's face could trigger help prompts or even alter input behavior. The example given is that if a user incorrectly tries to scroll a window using three fingers instead of two, the computer may detect the frustration and either accept the faulty input or prompt the user.
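
One way to picture that example, with the frustration signal and finger counts treated as given (both are assumptions for illustration, not the patent's actual logic):

```swift
import Foundation

// Hypothetical sketch of the facial-expression example: a scroll normally
// requires exactly two fingers, but if the camera reports frustration the
// system either tolerates the sloppy three-finger input or offers a hint.

enum Response {
    case scroll
    case ignore
    case prompt(String)
}

func interpretScroll(fingerCount: Int, userLooksFrustrated: Bool) -> Response {
    switch (fingerCount, userLooksFrustrated) {
    case (2, _):
        return .scroll                              // correct gesture, no fusion needed
    case (3, true):
        return .scroll                              // accept the faulty input
    case (_, true):
        return .prompt("Scrolling uses two fingers.")
    default:
        return .ignore
    }
}

print(interpretScroll(fingerCount: 3, userLooksFrustrated: false)) // ignore
print(interpretScroll(fingerCount: 3, userLooksFrustrated: true))  // scroll
```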

Many of these technologies are likely years away from the market, but continue to provide an interesting peek into Apple's future.


Patent: http://appft1.uspto.gov/netacgi/nph...&s1=20080211766&OS=20080211766&RS=20080211766


Article Link
 
I can see millions of possible uses for this one. For example, the software can detect vasocongestion in a certain extremity, and subsequent tissue swelling. It can then minimize all immediate windows, and open new ones of the user's favorite porn sites. This could usher in a new era of adult-themed convenience! Although they would need to work out some kinks, as I'm sure this would lead to more than a few complications in the workplace.

:apple::apple:Let's hope this doesn't fall through!!!!!:apple::apple:
 
Cue Clippit. "It looks like you're angry. Would you like to..."

God, I hated that thing. Made me wanna bend it out into a straight line and then bend parts up and down until the metal weakens and it just breaks off.
 
The "What's the story, morning glory?" app?

Whilst some of this isn't ready for prime time, a few of these are already in the labs - Microsoft has done work on voice, and gaze recognition is established, but not so much via a webcam.

Finger recognition, as in fingerprints, has been around for a while, but actually working out which finger via a multi-touch pad has been mentioned in previous patents. Working out which finger via a webcam is more difficult. Same with facial recognition. However, handheld device movement is already nearly to market in some guises.

I think the tech demos of webcams being able to identify objects, then use them as user interfaces (e.g. noticing a colored plastic wheel, and then being able to use it as a steering wheel for a racing game etc) already show there is a lot more out there, but it's not had OS integration really.

Interestingly, this patent is all about how to fuse multi-touch data - and then how secondary data can be combined with touch data - either to refine the touch data, to be interpreted in accordance with the touch data, or to combine with it into a new command. So it's moving on from just multi-touch. Would Apple release some of the gestures prior to next summer's OS update, to give time to get acquainted with them?

Interesting to see they're patenting the future possibilities and what they are.
 
Why can't Apple just concentrate on fixing its OS, and providing quality made hardware before doing something new. :(

Because!

Apple is now Micopple

I want money (that's what I want)
 
Why can't Apple just concentrate on fixing its OS, and providing quality made hardware before doing something new. :(

Because!

Apple is now Micopple

I want money (that's what I want)

Snow Leopard, fool!
 
Ridiculous

This is again a good example of the flaws in the patentability of inventions in the US. Where's the invention in the notion that voice, facial expressions, gestures, gaze, body movement, flip-flops can or could be used to prompt actions from a computer? The invention lies in figuring out a technological solution to make it happen, and that is also what should be patentable.

And no, I didn't read the actual patent application, so I may be somewhat off here :D
 
Interesting, sort of

Everything sounds cool except for facial recognition. That idea is flawed. Why, how many people smile, frown, or squint when they are reading something... or even have a frustrated look when reading what someone wrote in an email that went to an unintended audience.

I can just see a lot of mis-readings on this.

I think shock detection could be good. I mean, people tend to slam their desks, shove their keyboard, flick the screen, throw their mouse when ticked off at the computer. Heck, MS needs that for the frustration. Every day my virtual PCs on my Windows work computers lock up or run slow.

Voice recognition, I am not so sure about. It may have come a long way since I tried it - but it had a hard time with some words, and if the software crashed and you had to re-install it, you had to spend hours retraining it. My nephew (while he was in high school) had a computer project he had to do to graduate. His demonstration was to have his XP machine (yes, XP has been around that long - he is now 22) totally operated by voice command, using Dragon NaturallySpeaking. He said it was a fun project, but it was frustrating as it took hours to train the thing. We are a long way from the old Star Trek episodes ("Computer, two to beam up." "Computer, .....").

Right now I would like to see more work done on touch. On a laptop/tablet I can see not needing a mouse/trackpad anymore. A keyboard is still needed for long typing.
 
While I think there is definitely a world beyond multitouch, I think multitouch will be the norm for decades - outlasting the mouse.

What I don't want:
- Facial expression watcher? My facial expression is usually "Dammit, I'm still at work", so does that mean it would constantly be doing the computer equivalent of the famous girlfriend question: "what are you thinking... baby, what's wrong... why won't you talk to me... we never talk anymore...". And then it has a hardware failure.

- Voice recognition: Never ever ever. While it was kind of cute in the Quadra 840AV days with the knock-knock joke, I would hate to work in a cubicle farm with a bunch of people talking to their computer all day. "change color... rotate... change color ... rotate... Corporate accounts payable, Nina speaking. Just a moment. ..."

No, the real trick is going to be multitouch with tactile feedback, so you can have functional keyboards that are task-specific.
 
In Multitouch Data Fusion, however, Westerman and Elias are already exploring the use of other inputs to improve and augment multi-touch interfaces. These include:

- Voice recognition
- Finger identification
- Gaze vector
- Facial expression
- Handheld device movement
- Biometrics (body temp, heart rate, skin impedance, pupil size)


Many of these technologies are likely years away from the market, but continue to provide an interesting peek into Apple's future.

None of those sound like they're years away from the market; there are already lots of implemented user interfaces using those technologies. SLRs had auto-focusing based on where the user was looking in the '90s, Samsung (I believe) already has cell phones that do "finger identification" through the built-in camera, voice recognition has been around since OS/2, and controlling a handheld by movement is available in Sony and Nokia cell phones. Etc., etc.

Actually, what's left to patent? Doing all this stuff in the iPhone?
 
Makes me think of a Minority Report meets Star Trek type computer. Sounds pretty damn powerful. :)

I think I like where Apple is going with this stuff, but please keep the option of typing for me whether on a keyboard or on a multitouch surface. Nothing more relaxing than being able to do my work while my music is playing and IM'ing my coworkers rather than talking on the phone. :)
 
I certainly think Finger identification for security could arrive soon.

It behooves Apple to do more than just secure logins - that's too pedestrian, even. No, I think Apple will unveil a remarkable product which incorporates all of those things. The Gaze Vector appeals to me most of all.

I can't wait.
 
I think we're finally heading to the point where advancements in technology are starting to change from "cool!" to "creepy."

I find a computer watching where my gaze is or trying to discern my mood a really unsettling thing....
 
While I think there is definitely a world beyond multitouch, I think multitouch will be the norm for decades - outlasting the mouse.

What I don't want:
- Facial expression watcher? My facial expression is usually "Dammit, I'm still at work", so does that mean it would constantly be doing the computer equivalent of the famous girlfriend question: "what are you thinking... baby, what's wrong... why won't you talk to me... we never talk anymore...". And then it has a hardware failure.

- Voice recognition: Never ever ever. While it was kind of cute in the Quadra 840AV days with the knock-knock joke, I would hate to work in a cubicle farm with a bunch of people talking to their computer all day. "change color... rotate... change color ... rotate... Corporate accounts payable, Nina speaking. Just a moment. ..."

No, the real trick is going to be multitouch with tactile feedback, so you can have functional keyboards that are task-specific.

Wow! I never thought of it like that before; that would get pretty annoying. But I think the coolest feature is the possibility of Gaze. If Gaze is done properly (which, it's Apple, so it would be amazing) then the interface on a Mac would change and become a whole new experience. I would love to see Apple come through with this :D.

I love :apple:pple
 
I agree that judging input based on facial expression sounds pretty out there...

[My wife off screen tells me that my dog just chewed up my iPhone]

:eek:

[Computer opens 50 windows, or whatever :eek: signifies]
 
Everything sounds cool except for facial recognition. That idea is flawed. Why, how many people smile, frown, or squint when they are reading something... or even have a frustrated look when reading what someone wrote in an email that went to an unintended audience.

I can just see a lot of mis-readings on this.

Apple would be the only company capable of identifying whether this type of technology will "work" through limitations such as this. If it doesn't work, they won't use it.

Other companies would just use it for the "cool" factor, without thinking through the practicality.
 
I think we're finally heading to the point where advancements in technology are starting to change from "cool!" to "creepy."

I find a computer watching where my gaze is or trying to discern my mood a really unsettling thing....

[Image: HAL 9000]


And if it's as buggy as Leopard has been, it won't recognize your bleary eyed face in the morning and will instruct the toaster to murder you with a burnt slice of raisin cinnamon bread.
 
Gaze tracking would be really cool. Voice recognition was always gimmicky and goofy (and dopey and sneezy). The only way it would be cool is if you could say things like "Move this piece of sh*t over here and delete that piece of sh*t." And Undo would have to be connected to the phrase "Oh, f*ck."
 
Rise of the Machines :eek:

Finally, somebody is taking notice of the success of Nintendo's innovative interfacing concepts. I had my doubts before using a Wii - horrible memories of the Duck Hunt era - but it is surprisingly accurate and very responsive, and it will be interesting to see what developers will be able to push forward now that Nintendo have proved that the world is ready to embrace more innovative and interactive interfaces. In igloos in Illinois it is icy. i. :eek:
 
Cool

Very cool. I like the idea of the fingerprint scanner that comes on those Lenovo notebooks. Perhaps some James Bond-style retina scan is in the works for access to your MacBook? :)
 