Plus, speech interaction often pales in comparison to direct manipulation. How do you crop a photo through a speech interface? "A little wider. A little more. No, too wide. Oh, bugger." It's much easier just to reach out and grab the dang thing. So if you can only have one, choose touch. (Not that we necessarily can only have one, but right now we basically have neither, so we have to pick which one to develop first. The one that's more beneficial wins.)

<snip>

My point is simply that we don't have to go back very far in time to see that "this sucks because it's different" isn't a good way to judge how things will evolve in the future. We went from goose quills and ink pots to typewriters to desktop computers to laptop computers, and it's not at all easy to predict where we'll go from here. It's fine to look at a given interaction model and be skeptical, but it's kind of foolish to dismiss anything out of hand. The future has this nasty habit of surprising us.

I notice how you go out of your way to emphasize the verbal without taking the gestural into consideration in context. While I would agree that telling the computer, "put that thing there," is essentially impractical when taken in the verbal-only context, adding the gestural, as in "*touch* put this, *touch2* there," instantly tells the computer the object and the destination, with the verbal telling it what to do with the object.

On the other hand, as you say, trying to tell the computer verbally how to edit a photo, depending on the context, can be both simple and complex. It would be quite simple to touch-select an image and say, "Increase brightness by one f-stop; increase contrast by 20%; adjust white balance to Tungsten; use unsharp mask; export to email." In a matter of moments you have taken an image, edited it for clarity and told the computer to prepare an email with that image attached. Using a mouse alone, or a mouse and keyboard, would typically require dozens of clicks, moving the mouse to find the pointer and hit the proper menu items and adjustment settings. Conceivably, the image could have been imported, adjusted and emailed out using only one touch and a series of spoken commands in less than a minute vs. the five minutes or more by mouse and keyboard alone.
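The touch-select-then-dictate flow described above can be sketched as a tiny command interpreter. Everything here is hypothetical -- the command grammar, the `Image` state, and the parsing rules are assumptions for illustration, not any real speech API -- but it shows how one semicolon-separated utterance could map to a whole batch of edits:

```python
import re
from dataclasses import dataclass, field

@dataclass
class Image:
    # Hypothetical image state; a real editor would adjust pixel data.
    brightness_stops: float = 0.0
    contrast_pct: int = 0
    white_balance: str = "Auto"
    sharpened: bool = False
    exported_to: list = field(default_factory=list)

def apply_spoken_commands(img, utterance):
    """Apply a semicolon-separated batch of spoken edit commands to img."""
    for cmd in (c.strip().lower() for c in utterance.split(";")):
        if m := re.match(r"increase brightness by (\d+(?:\.\d+)?) f-stops?", cmd):
            img.brightness_stops += float(m.group(1))
        elif m := re.match(r"increase contrast by (\d+)%", cmd):
            img.contrast_pct += int(m.group(1))
        elif m := re.match(r"adjust white balance to (\w+)", cmd):
            img.white_balance = m.group(1).capitalize()
        elif cmd == "use unsharp mask":
            img.sharpened = True
        elif m := re.match(r"export to (\w+)", cmd):
            img.exported_to.append(m.group(1))
        else:
            raise ValueError(f"unrecognized command: {cmd!r}")
    return img

# One touch selects the image, one utterance does the rest
# (assuming the recognizer has already normalized "one" to "1"):
img = apply_spoken_commands(
    Image(),
    "increase brightness by 1 f-stop; increase contrast by 20%; "
    "adjust white balance to tungsten; use unsharp mask; export to email",
)
```

The single entry point taking a whole utterance is the point: the user speaks once, and the batch is applied atomically instead of click-by-click.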

People need to get away from the idea that interaction has to be one thing or another--it should be as natural as talking to your partner or your neighbor. In fact, by combining all of the above--speech, gestures, text and even motion sensing (such as what Nintendo has with the Wii and Microsoft is adopting for the Xbox)--computing can be much faster and easier than anything we have today.

However, that concept is still in our future. Were it not for people like Steve Jobs and the many different science fiction writers who can truly imagine how certain technologies can be used, we'd probably still be stuck using the old, heavy, cast-iron typewriters and relying on teletype and messengers for communications.
 
Voice command fantasies

Conceivably, the image could have been imported, adjusted and emailed out using only one touch and a series of spoken commands in less than a minute vs the 5 minutes or more by mouse and keyboard alone

Thing one -- there are times that typing can go MUCH faster than clearly spoken words.

Thing two -- Getting work done through voice commands and touch and without a keyboard is a total fantasy.

Most of the work in the world is data entry -- not mousing for menus, etc.

Also, have you ever tried to navigate a voice-tree menu on the phone when there is background noise or other people talking? Have you ever tried to use voice commands on the computer or your phone when your kid is watching TV or listening to the radio nearby? Lots of luck.

Now imagine whole offices full of people trying to operate computers with voice commands and finger-swipes. HA! It would also put an end to getting work done during commutes, on the train or the bus. It would end getting work done in coffee shops and libraries, too.

Basically -- the idea of a lot of people getting work done with voice + swipes is a rich ball of B.S. -- an isolated techies' wet-dream.

Apple has shot itself in the foot by failing to allow global entry and control of iPod Touches and iPhones with external keyboards. Maybe those devices are not powerful enough for video editing, page layout, and graphic design, but most people just do data entry! Think of all the meeting notes, idea development, novels, etc., that people could accomplish with today's iPhone/iPod if there were external keyboard entry. A fold-up keyboard and an iPhone/iPod, and you are ready to go for 90+% of the real-world work that gets done with computers.

With the iPad--an oversized iPod Touch with no handle and no keyboard--Apple has exactly re-created the mistake it made with the Newton: it has shot for a niche that does not exist, and that is at least a decade away.

Friggin' lame.
 
That's one way to look at it, and it's valid. But it's also valid to say that the computer in question is poorly designed.

(It's also valid to say that anybody who thinks you can determine "right way" and "wrong way" by counting key presses and mouse clicks is a frigging loon, but that's neither here nor there.)

snip
I suppose that my sarcasm didn't come through in my original post. In any event, my point is that computers are designed by engineers, but for the most part are used by non-engineers. The problem is that computers are designed for engineers and techies.

An extreme example of designing for the wrong audience is consumer electronics software. You ever buy a new TV? If it's a 2nd-tier brand, the menu system is probably a mess, and the main reason for that is the menu is designed by an engineer in China and then translated to English (or French, Spanish, etc.). The Chinese way of thinking is different from ours, and then you add in the idiosyncrasies of an engineer and VOILA, you get a menu that has you drilling down 6 levels to change the clock.

Anyway, my use of "right way" and "wrong way" was meant tongue-in-cheek.

If speed is your issue, then Win, D, Enter is the RIGHT way to get to your latest document. Keyboards have always been faster than mice, if the software allows it. Unfortunately, M$ left out using numbers for the system Recent Documents list, so it's too hard to get to the 5th most recent doc you just saved. That's called bad GUI.

I would love to see OS X add touch capability to screens, but not remove any keyboard/mouse functions. There are many times, esp. since I've had an iPod Touch, that I would love to use my finger for certain actions that currently require a mouse. Not all mouse actions, just some, and not every time like on the Touch. The Internet would be one. Moving windows another. And, of course, multitouch adds a dynamic that neither the keyboard nor the mouse can handle.
Of course there are a million ways to do the same task on Macs and PCs. I've grown accustomed to my way, which is the correct way ;). I was mainly commenting on the techniques used by non-techie computer users.

C'mon, I can't be the only one that gets aggravated when working with non-techies ... can I?

While those three keystrokes are definitely faster, they are by no means 'easier'; the user has to learn and memorize the pattern because it is certainly not 'instinctive.' On the other hand, pointing is instinctive; using hand gestures is instinctive; just watch people as they talk and try to describe something or give directions; it's pointing, gesturing and overall moving of the hands--not an object. Simply put, the mouse came out as a means to translate the movement of the hand into a readable input to the computer. You really can't say that a mouse is 'natural.'

Neither are keyboards more natural than mice--in fact, the opposite is true. Yes, they're faster--especially if you're a touch-typist--but not instinctive; not natural. Of course, if we didn't have mice to represent our hand gestures, we'd still be locked into a text-only interface, not a graphical one.

snip
I totally agree. And Apple is way ahead of the curve at this point regarding UI. Imagine the state of smartphone UI if the iPhone hadn't come along when it did.

You recognize this later in your post, but this statement comes from you thinking like an engineer. You see one best way to skin a cat and find it aggravating when people use other techniques.

What Jobs et al recognized with the Mac is that there is one way to skin the cat that is most efficient, but perhaps not everyone has the time, talent or patience to use that technique. Nevertheless, if they can skin the cat some other way, the cat still winds up dead.

Now, an engineer might consider it inelegant and limiting to create a cat-skinning box for those without the know-how to do it manually, especially if the cat-skinning box can't multitask while it's skinning the cat. Nevertheless, if there's a huge market base of people who don't know how to skin cats but want to skin them, then that cat-skinning box could sell well. And if those people don't know how to skin a cat, they probably won't mind the extra step of opening the box to find out whether or not the cat is dead.

That said, I'm not suggesting that iPhone OS is a good idea for a desktop PC -- I find it frustrating that my multitasking abilities are limited if I'm trying to use my iPod touch like a portable PC. I'm just suggesting that there is a new interface technology with some great opportunities to revolutionize computing if it's used creatively, and maybe instead of trying to bolt the tech onto an OS architecture that's lasted more than a decade, it's a good time to rework things from the ground up. I intentionally did NOT suggest that iPhone OS should be a new PC OS, because I would throw that machine off the top of a tall building.
You got me. I have to admit that I am an engineer.

I don't know if you guys have been reading those articles that Arn linked to, but that one about Facebook Login is a prime example as to how many folks think. I like the way Apple is going about it with their devices. If more companies put the amount of thought into their UI, we'd be better off for it.
 
JGowan said:
They spent 5 years doing it to prep us for the switch to Intel.

That whole "just in case" story of having developed it for Intel from the start was bunkum, I reckon; they just hashed it together when they realised they'd headed the wrong way sticking with the PPC chips. The initial keynote was all about how they'd always written it to run on both, but the whole process is two separate things from what they follow up to say, and the need for "Universal" binaries, a.k.a. two different binaries stuck inside a single file, is backing that up.
You're wrong. It wasn't bunk. Steve Jobs said it, not me; and unless you have proof that he's lying, shut the H3LL up.

As far as Universal Binaries backing up your "theory", THAT'S the "bunkum". They had to have a way for software vendors to be able to sell the new Intel version while still selling software that runs for their existing PPC customers, without a huge rewrite for both chips. The fact that Apple could write a compiler that would essentially spit out two versions of a company's software is just Apple genius, not some conspiracy "fact" for your conspiracy theory.
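For what it's worth, a Universal Binary is a single Mach-O "fat" file, not a pair of binaries in a hidden folder: a big-endian header (magic 0xCAFEBABE) lists one slice per architecture, each built separately and glued together with `lipo`. A minimal sketch of reading that header, using the layout from Apple's Mach-O format (the sample bytes in the usage line are synthetic, not from a real binary):

```python
import struct

FAT_MAGIC = 0xCAFEBABE   # big-endian magic of a fat (universal) Mach-O file
CPU_TYPE_POWERPC = 18    # cputype constants from <mach/machine.h>
CPU_TYPE_I386 = 7

def read_fat_slices(data):
    """Return a (cputype, offset, size) tuple for each slice in a fat binary."""
    magic, nfat_arch = struct.unpack_from(">II", data, 0)
    if magic != FAT_MAGIC:
        raise ValueError("not a fat (universal) binary")
    slices = []
    for i in range(nfat_arch):
        # Each fat_arch entry: cputype, cpusubtype, offset, size, align
        cputype, _sub, offset, size, _align = struct.unpack_from(">5I", data, 8 + 20 * i)
        slices.append((cputype, offset, size))
    return slices

# Synthetic header: two slices (PPC and Intel) in one file.
sample = (struct.pack(">II", FAT_MAGIC, 2)
          + struct.pack(">5I", CPU_TYPE_POWERPC, 0, 4096, 1000, 12)
          + struct.pack(">5I", CPU_TYPE_I386, 0, 8192, 900, 12))
slices = read_fat_slices(sample)
```

Running `lipo -info` on a real universal binary reports the same architecture list; the loader simply picks the slice matching the machine it's on, which is why one file serves both PPC and Intel customers.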
 