I disagree... our brains are quite well evolved to use tools like a mouse... one of the things that separates us from animals.

Once again you reply with some braindead response, because you know damn well that it's not possible to have Final Cut work better with a multitouch surface than with a keyboard and mouse. Sorry, but your point is null and void, and all you are doing is grasping at straws trying to save yourself.

Ok, I'm putting my cognitive scientist hat on. There is an issue of metaphors here. I do not mean metaphors in an English literature sense, but rather in a cognitive sense: directed blends from a concrete domain to an abstract one. There is a fair amount of evidence that everything abstract we think about is given its meaning and reasoned about through metaphors.

Every layer of indirection means that the unconscious mind (the vast majority of the mind in any human) has to work that little bit harder, which increases the risk of misreasoning. A finger, or a paint brush, is less indirect than a mouse (the motions of the mouse do not correspond with the observed effect, etc.). The harder the brain has to work, the more likely it is that consciousness (a slow, blocking process, to abuse computer science analogies) has to get involved to sort out error conditions. This leads to an overall slower thinking process and less creativity, and makes the interface mentally more tiring.

Final Cut Pro may seem efficient, but thanks to this indirection it is only efficient up to the maximum the interface method allows. A replacement interface (and I can think of several right now, but then I've been designing multitouch mockups since before the iPhone, and video was one of the examples that interested me as being ideally suited to this sort of UI) may start at a lower efficiency. It would, however, have a higher ceiling than the existing method. It would not, however, be Final Cut Pro. It would be something just as capable, or more so, but with a very different way of looking at the problem. What is wrong with that?
 
Yes, a number of very simple tasks that require very little fidelity. I never said anything about common tasks, did I? I've been talking strictly about Pro apps that require actual precision. Something that you apparently can't wrap your head around.

Look, I'm trying not to get personal, but you're an argumentative ninny. You know what my opinion is? You're one of those people who took a workshop on Photoshop at the local public library and now you fancy yourself a user of "Pro apps". Or you're an arts undergrad of some kind with an inferiority complex, in which case you just need to get over it. Arts are more fun than programming, embrace it. :p

In this case, "common" doesn't mean what you want it to. I mean "common" in the context of your "Pro apps", since that is what we're talking about. :p You can't get pixel-precision with a mouse without zooming unless you have an exceptionally steady hand, eagle eyes, and a large display. Every video editor I've sat in with or observed in a usability study zooms constantly. What makes you think, since you're already zooming anyway, that there's no way around the "accuracy" issue? I think that it doesn't even require very much foresight, since it's already being done for other equivalent actions in a variety of applications. Beyond a failure to see the obvious expansion in the use of zoom (especially when it's available as a simple gesture, rather than something that has to be clicked or typed), you're more broadly insisting on equating "greatest precision I need in this application, such as 1/60th of a second displayed on a visual timeline" and "one pixel". There's no reason for that.



You don't use a tool for it because you don't need any precision for that; this isn't rocket science. It's the exact same principle.

Oh, really? There's no precision needed to move a sheet of paper? I suggest you ask every one of the billions of species on earth that can't do it, after you get done with the three that can. You might also ask the young of those three, who spend years mastering it. No offense, but you very, very clearly have no idea what you're talking about. Why are you so insistent?

Once again you reply with some braindead response, because you know damn well that it's not possible to have Final Cut work better with a multitouch surface than with a keyboard and mouse. Sorry, but your point is null and void, and all you are doing is grasping at straws trying to save yourself.

And, one more time, I'll ask if you have any evidence. I'm serious. What are you basing this on? What? There are a whole lot of things that people have said were "not possible", but rarely on a topic this mundane. :p So far they've all been wrong except (so far as we know, so far) time travel and reanimating the dead.
 
Jesus, is Apple going to stop making computers entirely? WTF, hire more people if you need to divert resources.
 
Future/Features

I question Gruber's logic in the announcement. The last two WWDCs focused primarily on iPhone and app development. This year, the latest iPhone OS has already been previewed, and the iPad has already been launched. What is left for Steve Jobs' keynote aside from something entirely new that we haven't already heard rumors about? I think something involving OSX will be announced in the keynote because nothing else is left. I think there will be major announcements, yet still a 2+ year development project. I imagine the feature set to be the reason, because of the following:

1.) new UI (like iPad, marble or not)
2.) full resolution independence
3.) Blu-ray support
4.) new file system

The file system is a pretty radical overhaul. I imagine it works like the iPhone, where apps create their own databases for info and the filesystem stores the apps, as opposed to the current model where data is spread across the hard drive and an app pulls that data from it. This model is inefficient and not optimized for multicore processors. I hope for some ZFS-like capabilities, meaning cloning, pooling, etc. Apple does have some experience with ZFS, if we remember its development on Mac OS Forge. (Too bad they couldn't get a license with legal protections from the lawsuit involving ZFS.) This also means a Finder overhaul, a FileVault overhaul, a Time Machine overhaul, a networking overhaul, etc. I think a lot of this work has been done for the iPhone/iPad and could be adapted for a more complete OS. I'd hope they would fix smaller issues as well, like QTX. There would need to be a lot of time for developers to adapt their apps, which is why an early announcement would be good.
 
Jesus, is Apple going to stop making computers entirely? WTF, hire more people if you need to divert resources.

Jobs has said many times that it is not as simple as hiring more people if you wish to retain a certain level of quality.

The most important thing is to hire quality people who are going to provide a net gain for the company.

Quantity of employees doesn't guarantee quality of product.

I don't think we need a new version of OSX every 18 months unless there has been some real ground broken in the R&D department. I'd settle for continuous improvement of the existing version until 10.7 is bulletproof.
 
The file system is a pretty radical overhaul. I imagine it works like the iPhone, where apps create their own databases for info and the filesystem stores the apps, as opposed to the current model where data is spread across the hard drive and an app pulls that data from it. This model is inefficient and not optimized for multicore processors. I hope for some ZFS-like capabilities, meaning cloning, pooling, etc. Apple does have some experience with ZFS, if we remember its development on Mac OS Forge. (Too bad they couldn't get a license with legal protections from the lawsuit involving ZFS.) This also means a Finder overhaul, a FileVault overhaul, a Time Machine overhaul, a networking overhaul, etc. I think a lot of this work has been done for the iPhone/iPad and could be adapted for a more complete OS. I'd hope they would fix smaller issues as well, like QTX. There would need to be a lot of time for developers to adapt their apps, which is why an early announcement would be good.

ZFS is even more dead [edit: for Apple, at least] than it ever was, given the Oracle buyout. Not a chance in hell, but the best concepts from there will be permeating other projects moving forward. I think the iPhone model isn't too far off--it isn't true that applications can *only* use their own db for data. The filesystem not being exposed to the user should not be confused with applications not being able to share data stores (either through the filesystem or other mechanisms, both of which are already being done on the iPhone).
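To make the sharing point concrete, here's a minimal sketch in Python (sqlite3 from the standard library; the store path, table, and the two "apps" are invented purely for illustration) of two programs working against one shared data store instead of each hoarding a private database:

import sqlite3

SHARED_STORE = "/tmp/shared_store.db"  # hypothetical shared location

def init_store():
    # Either program can create the schema if it doesn't exist yet.
    with sqlite3.connect(SHARED_STORE) as db:
        db.execute("CREATE TABLE IF NOT EXISTS contacts "
                   "(name TEXT PRIMARY KEY, email TEXT)")

def mail_app_add_contact(name, email):
    # The "mail app" writes a contact into the shared store...
    with sqlite3.connect(SHARED_STORE) as db:
        db.execute("INSERT OR REPLACE INTO contacts VALUES (?, ?)",
                   (name, email))

def address_book_list_contacts():
    # ...and the "address book app" reads the very same store,
    # with no document file ever exposed to the user.
    with sqlite3.connect(SHARED_STORE) as db:
        return db.execute("SELECT name, email FROM contacts").fetchall()

init_store()
mail_app_add_contact("Ada", "ada@example.com")
print(address_book_list_contacts())  # [('Ada', 'ada@example.com')]

Whether the sharing happens through a file like this or some richer mechanism is an implementation detail; the point is that "no user-visible filesystem" does not have to mean "no shared data".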
 
Look, I'm trying not to get personal, but you're an argumentative ninny. You know what my opinion is? You're one of those people who took a workshop on Photoshop at the local public library and now you fancy yourself a user of "Pro apps". Or you're an arts undergrad of some kind with an inferiority complex, in which case you just need to get over it. Arts are more fun than programming, embrace it. :p
Great, you see me as someone who just blew away $1000 on software because I took a workshop on Photoshop at a public library. Makes a ton of sense. I'm sure it's all adding up for you now...

In this case, "common" doesn't mean what you want it to. I mean "common" in the context of your "Pro apps", since that is what we're talking about. :p You can't get pixel-precision with a mouse without zooming unless you have an exceptionally steady hand, eagle eyes, and a large display. Every video editor I've sat in with or observed in a usability study zooms constantly. What makes you think, since you're already zooming anyway, that there's no way around the "accuracy" issue? I think that it doesn't even require very much foresight, since it's already being done for other equivalent actions in a variety of applications. Beyond a failure to see the obvious expansion in the use of zoom (especially when it's available as a simple gesture, rather than something that has to be clicked or typed), you're more broadly insisting on equating "greatest precision I need in this application, such as 1/60th of a second displayed on a visual timeline" and "one pixel". There's no reason for that.
Like I've already stated here, I never said the mouse was perfect. I never said the mouse was the be-all and end-all. I said multitouch is not the answer for a replacement, and it isn't. Oh, and who's to say the only thing I use is Final Cut? I don't just stare at a timeline all day. I could also go into a 3D modeling application if you really want me to.

Oh, really? There's no precision needed to move a sheet of paper? I suggest you ask every one of the billions of species on earth that can't do it, after you get done with the three that can. You might also ask the young of those three, who spend years mastering it. No offense, but you very, very clearly have no idea what you're talking about. Why are you so insistent?
Right, I have no idea what I'm talking about when we are referring to moving a piece of paper on a table. The problem here is that you have no idea what I'm talking about. There's a reason doctors use tools instead of their bare hands: these tools have proven to improve the working environment, much like a mouse has. Now you will ask the same exact question: why don't we use tools for moving around a piece of paper? Well, ask the 3-year-old who only has to place their hand on it and move their arm around.

And, one more time, I'll ask if you have any evidence. I'm serious. What are you basing this on? What? There are a whole lot of things that people have said were "not possible", but rarely on a topic this mundane. :p So far they've all been wrong except (so far as we know, so far) time travel and reanimating the dead.
I'm basing this on the fact that the majority of art can't be done without a paintbrush. I'm basing this on the fact that surgeries can't be done without the proper tools. This isn't science fiction we are talking about here.
 
I don't think we need a new version of OSX every 18 months unless there has been some real ground broken in the R&D department. I'd settle for continuous improvement of the existing version until 10.7 is bulletproof.

Amen to that.
I just want them to make it work properly. Still some way to go in that regard.
I'd rather see some new hardware.
 
ZFS is even more dead [edit: for Apple, at least] than it ever was, given the Oracle buyout. Not a chance in hell, but the best concepts from there will be permeating other projects moving forward. I think the iPhone model isn't too far off--it isn't true that applications can *only* use their own db for data. The filesystem not being exposed to the user should not be confused with applications not being able to share data stores (either through the filesystem or other mechanisms, both of which are already being done on the iPhone).

My bachelor's dissertation was on this topic, actually. I built a shared information layer for GNOME that used the RDF model, then defined some strategies for mapping RDF data to idiomatic Python. The upshot was that two programs could define some objects, and these were mapped to queries over the shared graph. Changes were transactional, and all programs shared information transparently without needing to know anything about document files. Instead of having one or more semantic objects in a file that only a handful of applications can read and change, and only one at a time, the data could be combined ('smushed') and leveraged in new ways without hurting the developer's or the user's experience.
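To sketch the flavor of that mapping (purely illustrative: the Person class and vocabulary below are made up, the calls are rdflib-style, and the transactional machinery is elided):

from rdflib import Graph, Literal, Namespace, URIRef

EX = Namespace("http://example.org/vocab/")  # hypothetical vocabulary
graph = Graph()  # stands in for the shared, cross-process graph

class Person:
    """Maps a Python object onto triples in the shared graph."""
    def __init__(self, uri):
        self.uri = URIRef(uri)

    @property
    def name(self):
        # Attribute reads become queries over the shared graph.
        return graph.value(self.uri, EX.name)

    @name.setter
    def name(self, value):
        # Attribute writes replace the triple for this subject+predicate.
        graph.set((self.uri, EX.name, Literal(value)))

# Program A writes through its object view...
alice = Person("http://example.org/people/alice")
alice.name = "Alice"

# ...and program B, holding its own object over the same graph,
# sees the change without any document file in between.
print(Person("http://example.org/people/alice").name)  # "Alice"

The real system put a transactional, shared graph behind those property accesses; an in-process Graph is only a stand-in here.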

My PhD research demonstrated an interesting point along these lines. I found a statistically significant bias in software process description towards nouns and adjectives. Including verbs in queries and the index actually reduced performance.

Multitouch UIs are perfect, in my opinion, for interfaces based on tangible 'objects' rather than abstract 'applications'. A multitouch interface that uses a shared information model (with a view/controller layer applying lenses that pick out the bits they're interested in) is something I think would improve HCI. Separate the information from the tools used upon it and allow users to work in a noun-y way.
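A toy illustration of the lens idea (all names invented; the "shared model" here is just an in-memory list standing in for the real shared store):

# One shared pool of noun-ish objects, not per-application document files.
shared_model = [
    {"type": "photo", "title": "Beach", "taken": "2009-07-04"},
    {"type": "clip", "title": "Beach pan", "duration": 12.0},
    {"type": "photo", "title": "Dunes", "taken": "2009-07-05"},
]

def lens(predicate):
    # A view/controller layer applies a lens: the subset of the shared
    # model that a given tool is interested in.
    return [obj for obj in shared_model if predicate(obj)]

# A photo tool and a video tool each look at the same information
# through their own lens; neither owns the data.
photo_view = lens(lambda o: o["type"] == "photo")
video_view = lens(lambda o: o["type"] == "clip")
print([o["title"] for o in photo_view])  # ['Beach', 'Dunes']
print([o["title"] for o in video_view])  # ['Beach pan']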
 
Like I've already stated here, I never said the mouse was perfect. I never said the mouse was the be-all and end-all. I said multitouch is not the answer for a replacement, and it isn't.

... he asserts, one more time. Oy.

Right, I have no idea what I'm talking about when we are referring to moving a piece of paper on a table.

We aren't talking about whether you're familiar with moving a piece of paper on a table. I'm confident you've done it a few thousand times. We're talking about whether your assertions about the human brain, tools, and cognition have any validity, and they very clearly don't.

The problem here is that you have no idea what I'm talking about. There's a reason doctors use tools instead of their bare hands: these tools have proven to improve the working environment, much like a mouse has. Now you will ask the same exact question: why don't we use tools for moving around a piece of paper? Well, ask the 3-year-old who only has to place their hand on it and move their arm around.

Yes. So, in some cases a tool is necessary, and in others it only gets in the way.

The question is what makes you so sure that you're right about which is which, or what the best tool is for a job? Maybe what's getting lost here is the idea that a doctor needs tools because of inherent limitations: hands can't cut through skin and bone. Hands can't pierce the skin, so she uses a needle to sew someone up; but as Arnia discussed, that needle is a minimal intermediary. It is piercing the hole on the way through, but the hand itself is in fact doing the rest of the work. Using a mouse to manipulate your 3D model is not equivalent to a doctor using a scalpel; it is equivalent to a doctor using a mouse to control a robot arm that's holding a scalpel. Ironically, for certain kinds of work "they" (for example, the Mayo Institute) are experimenting with this scenario, but the mouse is a non-starter. They're using multi-touch with haptic feedback (and they test it on our 20-foot-wide video display downstairs).

I'm basing this on the fact that the majority of art can't be done without a paintbrush. I'm basing this on the fact that surgeries can't be done without the proper tools. This isn't science fiction we are talking about here.

Yes, and people manipulate that brush with their hands. Your timeline is the "brush" with which you're "painting" a video. The problem is that your video is not tangible, and for decades there was no good technology to allow you to control your "brush" other than an objectively fairly clumsy "robotic arm" that you control from across the room. That's changing, and for most applications it will be a good thing.
 
My bachelor's dissertation was on this topic, actually. I built a shared information layer for GNOME that used the RDF model, then defined some strategies for mapping RDF data to idiomatic Python. The upshot was that two programs could define some objects, and these were mapped to queries over the shared graph. Changes were transactional, and all programs shared information transparently without needing to know anything about document files. Instead of having one or more semantic objects in a file that only a handful of applications can read and change, and only one at a time, the data could be combined ('smushed') and leveraged in new ways without hurting the developer's or the user's experience.

My PhD research demonstrated an interesting point along these lines. I found a statistically significant bias in software process description towards nouns and adjectives. Including verbs in queries and the index actually reduced performance.

Sounds interesting. Is this available online or via interlibrary loan? You can PM me if you don't want to mention it on the forum. I'm finishing up my Master's thesis right now, investigating the correlation between shortcomings of an underlying data model and usability issues. This is using the Logical Data Structures approach to data modeling, so "shortcomings" here essentially means "may or may not be correct in some abstract sense, but differs from what real users mean by the noun Foo and its attribute Bar and its relationship to Blargh." Names are paramount; verbs are implementation details (to oversimplify), except inasmuch as they figure into the names of links. So I'm one of the choir on the noun vs. verb issue.
 
We aren't talking about whether you're familiar with moving a piece of paper on a table. I'm confident you've done it a few thousand times. We're talking about whether your assertions about the human brain, tools, and cognition have any validity, and they very clearly don't.
I'm sorry, but I don't think you are getting anything here. Some things need tools and others don't. Tools make tasks easier than they would be otherwise. Would a tool that helps you move around a piece of paper be necessary when you can already do that without one? No. Would a precise tool be needed for open-heart surgery that you can't achieve otherwise? Yes.

Yes. So, in some cases a tool is necessary, and in others it only gets in the way.

The question is what makes you so sure that you're right about which is which, or what the best tool is for a job? Maybe what's getting lost here is the idea that a doctor needs tools because of inherent limitations: hands can't cut through skin and bone. Hands can't pierce the skin, so she uses a needle to sew someone up; but as Arnia discussed, that needle is a minimal intermediary. It is piercing the hole on the way through, but the hand itself is in fact doing the rest of the work. Using a mouse to manipulate your 3D model is not equivalent to a doctor using a scalpel; it is equivalent to a doctor using a mouse to control a robot arm that's holding a scalpel. Ironically, for certain kinds of work "they" (for example, the Mayo Institute) are experimenting with this scenario, but the mouse is a non-starter. They're using multi-touch with haptic feedback (and they test it on our 20-foot-wide video display downstairs).
Again, you're making it seem like I said the mouse was the be-all and end-all, when I clearly never have. It must be really great that they have to test multitouch on a 20-foot-wide screen just to make it usable, while I can do everything just as precisely, if not more so, on a 10-inch screen with a mouse. You also fail to mention fatigue, which you don't get with a mouse as quickly as you do when, say, editing a video for 5 hours on end. If the only options I have for editing are holding my hands in the air or looking down at a desk, then back pain and arm soreness, here I come!


Yes, and people manipulate that brush with their hands. Your timeline is the "brush" with which you're "painting" a video. The problem is that your video is not tangible, and for decades there was no good technology to allow you to control your "brush" other than an objectively fairly clumsy "robotic arm" that you control from across the room. That's changing, and for most applications it will be a good thing.
Right, I'm sure multitouch will work wonders for most applications, such as surfing the web and checking your mail. But for real video editing and modeling, where you will be selecting menus upon menus and have variable things you must access at all times, why bother with a clumsy multitouch interface where my hands get fatigued when I can slightly move my mouse around for more precision?
 
Just fix the bugs, then the Finder

If Apple would just fix the bugs -- the resource fork stupidity, the open source components that are grossly out of date, the general inattention to security, the de-evolving of some of the graphics drivers -- I'd be feeling a bit better. Fix the Finder and/or provide a decent file browser and I might actually be happy. Frankly, these are update-level things that should have been done long ago; they don't justify a full release. But I fear they'll just be allowed to fester until trouble breaks out, perhaps with a major hacking or virus incident.
 
This isn't aimed at you... I agree with what you're saying. I'm just curious where this strawman is coming from about "just touch". Nobody reasonable (and nobody in this thread, period) has said anything to remotely suggest that touch will always and forever be the only interaction. Just to take the most obvious example, it can never account for heavy text entry (not to say that the keyboard will remain as we know it, but touch as we know it won't be the replacement either).
To be fair "intel" has been talking about "properly designed" touch interfaces being the future, more efficient for pretty much everything. Which i disagree with and took it to the conclusion he was implying.. Keyboards and mice wouldn't be primary inputs, you'd likely be working on a more substantial yet still portable iPad-like form factor, making a plug-in tablet pretty undoable.

I see that as a possibility, but I'd never want the Mac to go there completely. I'd rather have a future where Macs and iPods/iPhones continue with UIs that suit their use and keep proper openness and flexibility on the Mac side.

Touch screens, unless they can react to styli, are just an emulation of fingers. We obviously need emulations of pencils, brushes, etc. if we want to do art in anything like the manner we're accustomed to (and people telling artists to "adapt!" are totally missing the point). Most digital artists don't draw with the mouse, so (some) people's assertion that it's so much better than the mouse is irrelevant to a lot of users.

Kind of makes me think some people in here should just get a Wacom, sounds like it'd blow their minds. :confused: I've been living in the future without even knowing! (accuracy+pressure > multi-touch for MANY activities)

Keyboards really aren't that bad either.

This is pretty much off topic anyway. I don't mind waiting for the next OSX - Snow Leopard is wonderful.
 
I'm sorry, but I don't think you are getting anything here. Some things need tools and others don't. Tools make tasks easier than they would be otherwise. Would a tool that helps you move around a piece of paper be necessary when you can already do that without one? No. Would a precise tool be needed for open-heart surgery that you can't achieve otherwise? Yes.

I think you are misunderstanding him. He is saying that there is a false dichotomy between touch and using tools. Multitouch interfaces can also use tools, just ones which do not require indirection. This is why I prefer the term 'direct manipulation UI' rather than 'multitouch UI', but, unfortunately, the latter is gaining ground as a synonym for the former.

Again, you're making it seem like I said the mouse was the be-all and end-all, when I clearly never have. It must be really great that they have to test multitouch on a 20-foot-wide screen just to make it usable, while I can do everything just as precisely, if not more so, on a 10-inch screen with a mouse. You also fail to mention fatigue, which you don't get with a mouse as quickly as you do when, say, editing a video for 5 hours on end. If the only options I have for editing are holding my hands in the air or looking down at a desk, then back pain and arm soreness, here I come!

He didn't say they were using a 20-foot-wide screen to make multitouch work. That is an overeager inference. Moreover, given the domain, I think the reason for the large screen is immersion, for a higher-fidelity cognitive map of the haptics.

Anyway, I think you are confusing the issues of cognitive properties of UI schemes and the ergonomics of an implementation of that UI scheme. To illustrate, you could use an optical mouse upside down or vertically. You wouldn't because that is a poor ergonomic setup for a WIMP UI. Similarly, you wouldn't choose an ergonomic setup for a direct manipulation UI that required you to have poor posture. It is a strawman to make this claim so broadly.


Right, I'm sure multitouch will work wonders for most applications, such as surfing the web and checking your mail. But for real video editing and modeling, where you will be selecting menus upon menus and have variable things you must access at all times, why bother with a clumsy multitouch interface where my hands get fatigued when I can slightly move my mouse around for more precision?

Why would you be selecting menus upon menus? That is indirection. Worse, verb-y indirection (the worst kind). An efficient DM UI would have a very different layout to an efficient WIMP UI. There is, as they say, more than one way to skin a cat.
 
Kind of makes me think some people in here should just get a Wacom, sounds like it'd blow their minds. :confused: I've been living in the future without even knowing! (accuracy+pressure > multi-touch for MANY activities)

There has been work on pressure-sensitive, responsive multitouch that can respond to inanimate objects such as this:

http://www.intomobile.com/2009/12/3...hscreens-with-multi-touch-coming-in-2010.html

You can have your cake and eat it in this particular case, and do it with a full direct manipulation UI without the messy indirection of a mouse. :)
 
I'm sorry, but I don't think you are getting anything here. Some things need tools and others don't. Tools make tasks easier than they would be otherwise. Would a tool that helps you move around a piece of paper be necessary when you can already do that without one? No. Would a precise tool be needed for open-heart surgery that you can't achieve otherwise? Yes.

1. What makes you so sure that "precision" is always the issue? A person's hands are perfectly capable of the precision of open heart surgery; those hands are what is controlling the motion. The tools are not there for precision, they are there to do things that hands just can't do, like cutting and piercing. They are not an abstract intermediary.

2. What makes you so sure you need this "precision"?

Right, I'm sure multitouch will work wonders for most applications, such as surfing the web and checking your mail. But for real video editing and modeling, where you will be selecting menus upon menus and have variable things you must access at all times, why bother with a clumsy multitouch interface where my hands get fatigued when I can slightly move my mouse around for more precision?

I think you have your causation the wrong way around. It's the perception bias inherent in statements like: "Isn't it amazing that we happened to evolve in the one place that could support an organism like us??!" Well, no, we evolved into this organism exactly because this planet could support it. Similarly, it isn't amazing that you happen to have a mouse that is just perfect for selecting many levels of menus; rather, you have those menus because there was a mouse (or, in fact, a stylus or joystick in the early days). There are alternatives to all those menus, for many sorts of activities. These include 3D modeling. You can scoff that "oh, it takes a 20-foot wall to use multitouch effectively!" but that's nonsense. They do that so they can do deep dives and effectively have a number of people working together at once; for many activities (like what you would do to an AutoCAD model) one person sits in front of a normal desktop LCD and does amazing things, without much of anything that looks like navigating menus the way you're thinking about it.
 
Unbelievable...

I don't know WHY MacRumors keeps posting these "oh no..! iPhone OS diverts resources from Mac OS X!! aarrgghhh!!!" stories.

It only kickstarts the same ol' arguments.... Please stop it.

If Apple chooses to do so, gr8. Who cares if that means Mac OS X 10.7 gets finished 6 months later?
Snow Leopard is brilliant for the Intel users, and the PPC users are fine with 10.5.8

Somehow Win 7 has done wonders with the majority of MacRumors' forum readers, and probably these people are gr8 average users.... :rolleyes:
Vista has been bashed, and Win 7 hailed.... IMHO, the Vista bashing is just as unfair as the Win 7 hailing.

On the Mac, we use Mac OS X and in comparison to all others.... it simply is the best OS... for the users Apple targets.
Want the best gaming OS? Don't get a Mac. Want the easiest-to-administer OS for the corporate über-system-administered, anti-user, OS-strangled, scared-of-the-internet, completely-pwnd environment...? Use Windows in an AD environment.
Want the OS which looks ugly, has the least supported apps, but is cheap...?
Sure you know what to get.

Mac OS X allows me to be free of system administration. I don't need IT specialists making me log in with a non-admin password "for my own safety", telling me which apps I have to use and which are considered "unsafe" or probably witchcraft, as they seem to do something these sysadmins are unaware of.....
It looks good, is the most stable OS, and the apps run well and look gr8.

Please, people.

The iPhone and iPad are signals.
People buy these products assuming they can do whatever they want with them. Work and private stuff combined.
Check mail? Get work and private mail... same goes for your iCal. Don't tell me you can't address both agendas... Want an app to let you navigate from Robe to Adelaide Down Under, or from München to Huizen in NW Europe? Just buy the damn app and use it.

I understand the MCSE-qualified system admins will start bashing me. Sure.

But in reality, it really looks like the way we use IT products is starting to change. The iPhone first, the iPad now... and Mac OS in the future.

Maybe Microsoft can buy a company to help them try to alter this change...
 
I think you are misunderstanding him. He is saying that there is a false dichotomy between touch and using tools. Multitouch interfaces can also use tools, just ones which do not require indirection. This is why I prefer the term 'direct manipulation UI' rather than 'multitouch UI', but, unfortunately, the latter is gaining ground as a synonym for the former.
But in the case of a mouse you get a tool that is more precise than the aforementioned multitouch, no matter what, in this particular situation.


He didn't say they were using a 20 foot wide screen to make multitouch work. That is an overeager inference. Moreover, given the domain, I think the reason for the large screen is immersion for a higher fidelity cognitive map of the haptics.
That still does not change the fatigue issue, no matter what screen size you are working on. With a mouse, the larger the screen, the better for editing and viewability. With multitouch, the larger the screen, the harder it gets to control and the more cumbersome it becomes to use.

1. What makes you so sure that "precision" is always the issue? A person's hands are perfectly capable of the precision of open heart surgery; those hands are what is controlling the motion. The tools are not there for precision, they are there to do things that hands just can't do, like cutting and piercing. They are not an abstract intermediary.

2. What makes you so sure you need this "precision"?
Right, try pulling a piece of hair off of your skin with your hands, then try pulling it off with a set of tweezers. You tell me which one is more accurate and precise. The tool you are using is made specifically for one task. Your hands are not made to do everything well; that's what tools are for. A hand can eventually pull the hair out if you try hard enough, but the tweezers will have it done that much faster.

I think you have your causation the wrong way around. It's the perception bias inherent in statements like: "Isn't it amazing that we happened to evolve in the one place that could support an organism like us??!" Well, no, we evolved into this organism exactly because this planet could support it. Similarly, it isn't amazing that you happen to have a mouse, which is just perfect for selecting many levels of menus; rather you have them because there was a mouse (or, in fact, a stylus or joystick in the early days). There are alternatives to all those menus, for many sorts of activities. These include 3D modeling. You can scoff that "oh, it takes a 20-foot-wall to use multitouch effectively!" but that's nonsense. They do that so they can do deep dives and effectively have a number of people working together at once; for many activities (like what you would do to an AutoCad model) one person sits in front of a normal desktop LCD and does amazing things, without much of anything that looks like navigating menus the way you're thinking about it.
And what exactly does this have to do with my complaints about dealing with a clumsy interface and hand fatigue? If you can show me where multitouch would in any way be superior to a mouse in work such as Final Cut, Motion, etc., then I'll concede defeat. So far all you have done is show shoddy alternatives and very vague ideas that would not work as well.
 
There has been work on pressure-sensitive, responsive multitouch that can respond to inanimate objects such as this:

http://www.intomobile.com/2009/12/3...hscreens-with-multi-touch-coming-in-2010.html

You can have your cake and eat it in this particular case, and do it with a full direct manipulation UI without the messy indirection of a mouse. :)
Well, that would be very good indeed! Nice! :D

You'll have to drag my pen from my cold, dead hands. Hmm. I wonder how long it'll be before that sort of tech rivals the Wacom's thousands of degrees of pressure sensitivity, angle sensing, etc. A long time, I'd imagine.
 
For those who are interested, here are some links I found in a five-minute Google search for UI concepts for direct manipulation interfaces for 3D modelling and video editing.

http://www.youtube.com/watch?v=V-Q-9ztpk14

http://www.youtube.com/watch?v=kdkxqgbXaSI

http://www.youtube.com/watch?v=TAanod1F6bI

(a bit more random) http://www.youtube.com/watch?v=W3dz2xpCJVU&NR=1

These are all very rough, proof-of-concept demos. Yet they still show levels of control that some have been arguing are impossible.

I am confident that the nature of direct manipulation interfaces will improve over time, just as WIMPs have. Right now, though, they offer real benefits.
 
But in the case of a mouse you get a tool that is more precise than the aforementioned multitouch, no matter what, in this particular situation.


That still does not change the fatigue issue, no matter what screen size you are working on. With a mouse, the larger the screen, the better for editing and viewability. With multitouch, the larger the screen, the harder it gets to control and the more cumbersome it becomes to use.

Right, try pulling a piece of hair off of your skin with your hands, then try pulling it off with a set of tweezers. You tell me which one is more accurate and precise. The tool you are using is made specifically for one task. Your hands are not made to do everything well; that's what tools are for. A hand can eventually pull the hair out if you try hard enough, but the tweezers will have it done that much faster.

And what exactly does this have to do with my complaints about dealing with a clumsy interface and hand fatigue? If you can show me where multitouch would in any way be superior to a mouse in work such as Final Cut, Motion, etc., then I'll concede defeat. So far all you have done is show shoddy alternatives and very vague ideas that would not work as well.

I know you guys are talking about multitouch, but a lot of creative professionals prefer a good quality pen tablet to a mouse.

I don't think multi-touch would be able to perform the fiddly movements needed to use a lot of pro apps. Hell, multitouch isn't really that suited to code either. I'd hate using Quartz Composer with only my fingers.
 
For those who are interested, here are some links I found in a five-minute Google search for UI concepts for direct manipulation interfaces for 3D modelling and video editing.

http://www.youtube.com/watch?v=V-Q-9ztpk14

http://www.youtube.com/watch?v=kdkxqgbXaSI

http://www.youtube.com/watch?v=TAanod1F6bI

(a bit more random) http://www.youtube.com/watch?v=W3dz2xpCJVU&NR=1

These are all very rough, proof-of-concept demos. Yet they still show levels of control that some have been arguing are impossible.

I am confident that the nature of direct manipulation interfaces will improve over time, just as WIMPs have. Right now, though, they offer real benefits.
Good god, hand fatigue and back pain are the first that come to mind. Imagine editing on that for 5 hours? Now that would be total hell, but I guess that isn't your main concern, as long as it "just works", as in does the exact same thing that Windows Movie Maker can do, only slower.
 