I could see air gestures having limited use. Say you're cooking and reading a recipe online and want to scroll. It would be nice to scroll the page without having to touch your device. Or change the music while doing dishes. Beyond some fringe cases, however, it doesn't seem like it would be useful to most people on normal occasions. If Apple were to go down this road, I'm fairly confident it would be a much better implementation than what was described in the article.

Hand ID? That sounds like inventing something new just for the publicity. It feels like a desperate "LOOK AT ME" stunt to get attention in a sea of Android devices. It doesn't really sound useful or practical after reading the article. And I am skeptical about the strength of its security.
 
From Touch Screens to Touchless Screens....??
While it sounds like a good idea, what happens if someone else waves their hand in front of your device?
Or what if you're air gesturing and a fly comes along and you swat it away? Will the device be smart enough to know that you're not gesturing at it?
Touching the screen gives you greater control. It's more precise too.
 
There was a competition in 2006 for building Mac apps, and it had a forum where lots of ideas were thrown around, many of which have since appeared on iOS and elsewhere. People on that forum were pitching gesture-based actions using the webcam way back then.

That's just a recent example. 2006 is also the year the Wii was introduced, and I think gyro-equipped "air" mice predate Wii.

Mice and trackballs exist to convert gestures to computer-readable form. According to Wikipedia, the first trackballs were developed in the 1940s and 1950s. The mouse came along in 1963-1964. The Tom Cruise film "Minority Report", with its prominent use of air gesture-based computing, was released in 2002. So it's really not a matter of "when" gestures will be adopted, but simply when the latest method of detecting and encoding gestures is implemented.
 
TV, Mac, and especially AR glasses all make sense for this.

I can think of one very viable use case for this on a phone, and that is accessibility. It could be an absolute game changer, depending on how it's implemented and on personal circumstances. User-configurable gestures based on the user's capabilities, driven by a TrueDepth-style system, could potentially allow much nicer navigation for people with disabilities that limit their ability to use the touch screen (rough sketch of the idea below).
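To make that concrete, here's a minimal sketch, assuming a camera pipeline that exposes per-frame hand landmarks in the style of the hand-pose detection Apple later added to the Vision framework. The pinch threshold, the AccessibilityAction cases, and the onAction callback are all made-up placeholders for whatever a user would actually configure; this is not any announced Apple feature.

```swift
import Foundation
import CoreGraphics
import CoreVideo
import Vision

// Purely illustrative sketch: map a detected "pinch" (thumb tip near index tip)
// to a user-configurable action. Frame delivery (e.g. from an AVCaptureSession
// on the TrueDepth camera) is assumed to happen elsewhere; the action names and
// threshold below are hypothetical placeholders.
enum AccessibilityAction {
    case scrollDown
    case answerCall
    case goHome
}

struct GestureConfig {
    var pinchAction: AccessibilityAction = .scrollDown  // chosen by the user
    var pinchDistanceThreshold: CGFloat = 0.05          // normalized units, made-up value
}

final class AirGestureRecognizer {
    private let request = VNDetectHumanHandPoseRequest()
    var config = GestureConfig()
    var onAction: ((AccessibilityAction) -> Void)?

    // Call once per camera frame.
    func process(pixelBuffer: CVPixelBuffer) {
        let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, orientation: .up)
        try? handler.perform([request])

        guard let hand = request.results?.first,
              let thumb = try? hand.recognizedPoint(.thumbTip),
              let index = try? hand.recognizedPoint(.indexTip),
              thumb.confidence > 0.5, index.confidence > 0.5
        else { return }

        // Distance between fingertips in normalized image coordinates (0...1).
        let distance = hypot(thumb.location.x - index.location.x,
                             thumb.location.y - index.location.y)
        if distance < config.pinchDistanceThreshold {
            onAction?(config.pinchAction)  // fire the user's configured action
        }
    }
}
```

The point is only that both the gesture itself and the action it maps to could live in user-editable configuration, which is what would make it genuinely useful for accessibility rather than a gimmick.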
 
I'm already making gestures at my Apple devices with my middle finger. It's so annoying, especially when jumping from iPhone X gestures to an iPhone 8.
 
I honestly don't know why they bother with this hand gesture stuff. It may make sense for large displays (Minority Report style), but not for a pocket display that you would rather just touch, which feels much more natural.
 
People like to rag on Apple for not innovating and pushing gimmicks.

Now THIS is a gimmick and not innovation.
 
Using such new technology just to replace what existing technology can already achieve does not bring a lot of value. What would have been the value of a mouse in an 80x24 world? But with a GUI, it became obvious and even necessary. The same applies here. The value is not in present devices, but it could be huge, even essential, for VR devices.
 
I remember my old Galaxy S4 had a few air gestures, like hovering over an image in the gallery would preview it, or swiping over the screen when you had an incoming phone call could answer or decline it (and you could hang up if you were already on the call). I can't remember if there was anything else, since I haven't used that phone in years.
 
This seems pretty pointless on a smartphone, since you need two hands to accomplish what you want to do, BUT I do see this being useful in things like TVs and smart homes, which is where I see them going with this. Despite what the writer said, I don't see this being any better on an iPhone either. I normally don't use this word, but I'm going to say this is a gimmick.
My Galaxy Note 3 had a few air gestures, it's not exactly new.
Yeah I think my S4 had this and it didn't work that great.
 
No. This feature is mostly useless, even on the Apple Watch. All of the issues regarding security and unlocking the phone aside, the gestures work only when your hand is over the phone. This does not make the user experience any better than having your hand on the phone and swiping like we do today. Having to hold your hand in midair for long durations (if you are reading an article, for example) is tiring for the user.

The use case for unlocking your phone when your hand is wet is also unsubstantiated. A better solution is already available: voice assistants. The whole idea of voice is that if your hand is unavailable, you can issue voice commands to get that one-off task done. And if your goal is to launch a YouTube video, well, you're better off finishing your task and using your hand anyway.

It's a gimmick and I would refuse to pay additional money for this feature.
 
Peak phone has landed. Silly Gimmicks R Us. Until Samsung perfects the folding screen. 20 years later, the hologram display goes on sale. "I think we are turning Asian, I really think so."
 
But I already have a phone that turns on, changes my home screen, changes music, starts playing music, makes a phone call - all while I am mowing my yard with the phone in my shirt pocket. And I thought it was OFF. :eek::mad:
 
Call me old fashioned, but I am not waving my hand over a screen. Touch works just fine and they are chasing a new input method to differentiate. It's a gimmick.
 
How does the expression go? This is "a solution in search of a problem"...
 
The concept of gestures goes back quite a while. I still think it has its place, but smartphones may not be that place.

I've said this several times over the years regarding the use of touchscreen on desktop PCs. Moving your arm around a 27" or larger touchscreen display could be wearying, to say the least - large arm movements are required. Smaller hand gestures a la Air Motion make more sense to me. A Mac equipped for Face ID will have all the hardware needed to deliver this capability to desktop computing. From there, it's a matter of quality of execution.

As someone else mentioned, controlling a TV is another example - too far away to touch the screen. And while voice control can accomplish a lot, when it comes to some kinds of point/swipe/drag/drop, gestures (whether directly touching as in touchscreen, or by air gestures) can be more effective.

So, it's coming, it's nice that a manufacturer is bold enough to bring it to market, but this doesn't seem the best product for proof-of-concept.

Agreed, for all the reasons you mentioned, but also for a basic thing that's kept me away from touchscreen laptops and PCs: avoiding fingerprints on the screen. Air gestures are something I'd love to see on PCs and laptops. With the failure of the Kinect, I resigned myself to being in the minority on this, but maybe there's still hope!
 