I saw something very, very similar at a digital-signage trade show in Las Vegas. It is super sweet. They were using it as an interface for people entering a building or shopping. The system was really cool for shopping because you could zoom in, very iPhone-like, to read the details on the packaging, etc. It is very, very sweet!
 
Hasn't anyone heard of the SmartBoard? They have been in school classrooms for a heck of a lot longer than this story even had its beginnings. So nice try on thinking that this is even news! http://smarttech.com

SmartBoards are simply projector tablets. They are not multitouch. That is the big deal here, not that you can draw on the screen. Using a SmartBoard is just like using the current mouse interface except that you use a stylus instead of moving a mouse around. In other words, it is just a tablet -- which, as you say, are very common these days (although not in the Mac world). It seems clear that Apple will eventually address the tablet market by introducing a multitouch tablet and skip the stylus era entirely.
 
I wondered about that. It is great, though, seeing Apple laptops on their desks while they do the news.

I can't see using this much on my iMac or MacBook, unless they replace the MacBook touchpad with a nice, wider, full-color screen. Same with the iMac, except add the screen to a keyboard. Direct multi-touch on the actual screens would require me to invest in iKleer, or is it iClear? Either way, I like my main viewing screen smudge-free --)) :D

To both of you.. Who the hell watches Faux news? Surely you don't actually believe anything that comes from that nonsense network? :)
errr.. oh yeah.. no politics :)
 
I think that this screen-touching will eventually (15 yrs?) evolve towards the UI in Iron Man (which was very cool!). 3D manipulation would be so much more natural that it would be great for design and engineering types. As for multitouch screens, they're definitely the next step towards the future of computing. Very cool.

It's hard to say exactly where HCI (human-computer interaction) research will go. Multi-touch screen-based gesturing is definitely going to be the future of at least some parts of computer interaction. For collaboration, information display, etc., it will be great. However, anyone can see the major problems with the technology in general computer use and productivity -- not to mention that it is incredibly tiring to do for more than a few minutes at a time. Also, the ergonomics just don't work. Imagine sitting at your desk: whether with a vertical screen your arms are outstretched to or a tablet-type screen laid flat on the desk with your neck kinked to look at it, it doesn't seem like anything anyone would want to do all day.
If you replace the touchscreen with a large touchpad or Wacom-type tablet, like the implementation on the MacBook Air, and couple it with a rich gesturing system, I can begin to see much more general applications of the technology, and I can imagine how it would be used to navigate the existing window/context menu/button interface we have now. It would be even better to add a stylus for the times precision is needed alongside gesturing.

Now, how a multi-touch gesture system could be used to completely redesign computer interaction and throw out the two-dimensional desktop/windows/button metaphor, I just don't know. We've all seen various attempts, good and bad, at a true "three-dimensional space" interface, but again, I don't yet see how they could be made practical for anything other than eye candy.

Anyone have original ideas? On a similar note, does anyone know of any 3D interface mock-up contests or galleries somewhere?


Probably not. I've seen those large screens. They're usually straight HDTVs - 1920x1080 or 1920x1200. An Apple 30" display has more resolution. A big 47" screen at 100dpi would be incredible, but nobody makes one. (At least I don't think anyone does.)


I know -- I hate this. 2560×1600 "WQXGA" resolution is all we get, and only in those huge 30" monitors. What pisses me off the most is that, because of the lazy and stupid decision by Microsoft (and Apple) not to make their operating system interfaces completely resolution-independent ("vector-based"), the general public equates a high-resolution display with "everything is too small and I can't read the tiny text". This has created the horrid situation where external monitor resolution actually WENT DOWN at some point. I'm positive you used to be able to find standard consumer 17"-20" monitors with that kind of resolution, although in those days the screens were 4:3 and not 16:10, so it would have been 1600x1200 (UXGA) rather than 1920x1200 (WUXGA). Now you usually have to buy 22"+ monitors to get the same resolution as my 17" laptop screen. It just doesn't make ANY SENSE AT ALL.

Also, does anyone remember when IBM first made 3840x2400 "WQUXGA" monitors back in 2001? They were incredible for their time, at only about 22". That's a HUGE DPI increase: it's FOUR TIMES THE PIXELS OF 1920x1200, more pixels than FOUR HDTV screens, in a 22" panel. Unfortunately, at the time, they had to make a lot of compromises in refresh rate (40Hz), brightness, contrast, color, etc. Someone even recently started selling these old beasts again.
With the rise of HDTV, everyone now has a 1920x1080 TV screen. It's about damn time for the panel manufacturers to get moving on making ultra-high-res large displays; they need to move the DPI WAY UP on large screens. At least they could get the large 30" monitors to the 3840x2400 of that 7-year-old IBM panel.
Also, they should be able to bring the 30" 2560x1600 panels down to 20-22".
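
Just to put rough numbers on this DPI complaint, here's a quick back-of-the-envelope sketch (the diagonal sizes are nominal, and the 22.2" figure for the IBM panel is my assumption for "about 22""):

```python
import math

def dpi(width_px, height_px, diagonal_in):
    """Pixels per inch for a panel of the given resolution and diagonal."""
    return math.hypot(width_px, height_px) / diagonal_in

# A few of the panels mentioned in this thread (nominal diagonals):
panels = [
    ('30" 2560x1600 (WQXGA)',      2560, 1600, 30.0),
    ('22" 3840x2400 (IBM WQUXGA)', 3840, 2400, 22.2),
    ('47" 1920x1080 (HDTV)',       1920, 1080, 47.0),
    ('3.5" 480x320 (iPhone)',       480,  320,  3.5),
]

for name, w, h, d in panels:
    print(f"{name}: {w * h / 1e6:.1f} MP, {dpi(w, h, d):.0f} dpi")
```

The 47" HDTV comes out around 47 dpi, versus roughly 100 dpi for the 30" WQXGA and over 200 dpi for the old IBM panel -- exactly the gap being complained about here.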

Understandably, it's not a simple procedure to scale up the ridiculous DPIs of cellphones with 3" screens running at 800x480, or even the 3.5" 480x320 iPhone. Getting decent manufacturing yields on a 20-30" 3840x2400 panel is probably very difficult, but surely there is a high-end, high-margin market for these in professional CAD/oil/medical fields, in addition to wealthy consumers. Although for professional displays, I'm sure there are custom solutions for high resolution, and many markets won't need color (e.g., digital X-ray viewing) or the fast refresh rates of gaming. But it sure does seem like the display manufacturers have been resting on their laurels of HDTV and cellphone display production and not giving a crap about pushing consumer computer screens farther. I guess another problem would be how to power such a display without needing dual graphics cards for quad DVI or dual HDMI or something. I know "DisplayLink" or whatever the new one is should allow for higher resolution, but IIRC, its total bandwidth isn't that much higher than HDMI or dual-link DVI.

As a final note, I do remember seeing some announcements (although focused on HDTVs, not computer monitors) about "quad-HDTV" becoming a reality soon from at least one of the manufacturers. Other than 4K digital cinema, I don't see a TV/film use for these, and the large 40-50" HDTV size sort of removes any real computing applications; besides, the DPI is still not much better because the screen is so large.
And then of course there is "Super Hi-Vision / Ultra High Definition Video," which the Japanese are developing, at 7680×4320 resolution. Totally crazy! Imagine THAT on a 50" LCD on your wall. I have heard there is a problem in that many people who view the current test setups actually get motion sickness during certain video sequences, since their brains believe the motion because the image is so sharp and clear.

Anyone know anything / seen any announcements about new higher resolution computer displays?
 
My Technology Prediction....

Touching the screen will not work so well on a large monitor on your desk. So what will happen is the computer will use its "webcam" to watch your fingers: you can do all the multi-touch gestures, just without the "touch" part. Moving your hands in the air will be enough.

In the long run, this "no touch" interface will cost less. A couple of webcams cost less to make than a huge touch-sensitive screen. Yes, I think it will take two cameras to make a 3D stereo image of the space in front of the screen. LOTS of computation will be required, so you will not see this until those 8- and 16-core CPUs become commonplace and low-priced.
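
For what it's worth, the "3D stereo image" part boils down to simple geometry once the two cameras are calibrated: a point's distance follows from how far it shifts between the left and right views. A minimal sketch with made-up camera parameters, just to show the idea:

```python
# Pinhole stereo relation: depth Z = f * B / d, where f is the focal
# length in pixels, B the camera baseline, and d the disparity.
# Both parameter values below are made-up, illustrative numbers.
FOCAL_LENGTH_PX = 800.0   # assumed focal length, in pixels
BASELINE_M = 0.06         # assumed distance between the two cameras, meters

def depth_from_disparity(disparity_px: float) -> float:
    if disparity_px <= 0:
        raise ValueError("zero disparity means the point is at infinity")
    return FOCAL_LENGTH_PX * BASELINE_M / disparity_px

# A fingertip seen 40 px apart in the two views is about 1.2 m away:
print(f"{depth_from_disparity(40.0):.2f} m")
```

The real computational load is in finding those matching points across the two images every frame, which is why beefy multi-core CPUs (or dedicated hardware) would help.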

Many companies are working on that type of technology, both for human-computer interfaces and for applications in visual processing / pattern recognition: face recognition, autonomous robotics, etc. And actually, you do NOT need two cameras to get depth information; there are many other methods, including using infrared light reflection. I just read about a company making a depth-capable webcam just like what you are thinking of. I can't find the link, unfortunately, but I will post it as soon as I find it. It was in either Wired News, MIT Technology Review, or Scientific American.


Hasn't anyone heard of the SmartBoard? They have been in school classrooms for a heck of a lot longer than this story even had its beginnings. So nice try on thinking that this is even news! http://smarttech.com

Are you being sarcastic? If you are, then ignore this. If not, realize it is not even in the same league as Jeff Han's device. We are talking about MULTI-TOUCH, as in multiple fingers touching the screen at the same time, which allows complex gesture-based input control. If you are still confused, sorry, I just cannot help any further. Do some research.


I've been an iPhone owner since August, but seeing this screen on CNN, it really hit me. Multi-touch is changing, and will keep changing, the world and the way we interact with technology. User interfaces themselves are changing and becoming more natural, fluid, and intuitive. Really, the iPhone seemed so personal and such a niche until I saw the election coverage on a huge multi-touch screen.

Actually, it really hit me, and I bought my iPhone, when I saw my mom pick up the first iPhone she had ever seen and start doing all the multi-touch gestures, navigating all the iPhone's apps and the web so fluidly I couldn't believe my eyes! :eek: One day (probably soon) we will all be navigating huge multi-touch iMac/media center/who-knows-what machines.

Yep, it's amazing to see Apple's smart interface and otherwise computer-illiterate people collide, whether on the iPhone or in OS X. Something magical happens! It's a great thing to watch.


The technology looks promising. Selling it to visualize political machinations isn't a convincing roll out.
Hey, money is money... They will use whatever sales they can get to push the development forward. I would also have to assume that this type of technology would be used heavily in military planning / intelligence.

Does anyone know how Apple has used multitouch that looks a lot like this on the iPod and iPhone without licensing it from these guys much less having their own patents on multitouch?

Fingerworks acquisition.. see here
 
I know -- I hate this. 2560×1600 "WQXGA" resolution is all we get, and only in those huge 30" monitors. What pisses me off the most is that, because of the lazy and stupid decision by Microsoft (and Apple) not to make their operating system interfaces completely resolution-independent ("vector-based"), the general public equates a high-resolution display with "everything is too small and I can't read the tiny text". This has created the horrid situation where external monitor resolution actually WENT DOWN at some point. I'm positive you used to be able to find standard consumer 17"-20" monitors with that kind of resolution, although in those days the screens were 4:3 and not 16:10, so it would have been 1600x1200 (UXGA) rather than 1920x1200 (WUXGA). Now you usually have to buy 22"+ monitors to get the same resolution as my 17" laptop screen. It just doesn't make ANY SENSE AT ALL.

The reason it makes sense is that they are made in mass quantities now for what amounts to 1/4 the cost (performance-, inflation-, and labor-adjusted). Remember what those monitors used to cost?

If you want cheap you cannot also get good.

Remember the recent flap over Apple portable-display bit depth not being exactly what they implied (not so sure about a promise)? Never mind that a human cannot detect the difference, which Apple used to greatly reduce unit cost, increase unit production capacity, and increase panel size and brightness.

Something has to give, and some team of engineers and marketing guys made a decision; as long as the masses were ignorant of the technicalities, they were quite happy too. Even the smart ones.

Rocketman
 
The reason it makes sense is that they are made in mass quantities now for what amounts to 1/4 the cost (performance-, inflation-, and labor-adjusted). Remember what those monitors used to cost?

If you want cheap you cannot also get good.

Remember the recent flap over Apple portable-display bit depth not being exactly what they implied (not so sure about a promise)? Never mind that a human cannot detect the difference, which Apple used to greatly reduce unit cost, increase unit production capacity, and increase panel size and brightness.

Something has to give, and some team of engineers and marketing guys made a decision; as long as the masses were ignorant of the technicalities, they were quite happy too. Even the smart ones.

Rocketman

I agree with your points, but what does any of that have to do with not pushing resolution higher? I'm not talking about cheap or budget screens; I'm talking about the high end. I mean, we literally haven't seen ANY further DPI/resolution progress for years. 2560x1600 is still only on 30" screens, and 1920x1200 is only on 20-22"+ screens.
 
I'm positive you used to be able to find standard consumer 17"-20" monitors with that kind of resolution, although in those days the screens were 4:3 and not 16:10, so it would have been 1600x1200 (UXGA) rather than 1920x1200 (WUXGA). Now you usually have to buy 22"+ monitors to get the same resolution as my 17" laptop screen. It just doesn't make ANY SENSE AT ALL.
I have one, but it's CRT based. 17" CRTs with 1600x1200 resolution, and 20" CRTs with 2048x1536 were common. They might still be common, except that it's hard to find people selling CRT displays today.

I've never seen that size/resolution on an LCD panel. There may be some manufacturing issues making LCDs with dpis that high that would drive the price up beyond the point of profitability.
With the rise of HDTV, everyone now has a 1920x1080 TV screen. It's about damn time for the panel manufacturers to get moving on making ultra-high-res large displays; they need to move the DPI WAY UP on large screens. At least they could get the large 30" monitors to the 3840x2400 of that 7-year-old IBM panel.
Assuming they can manufacture LCDs at that resolution for a reasonable cost, there's still the matter of interfaces. A single-link DVI interface is pretty much maxed out at 1920x1200. To go beyond that (like the 30" displays) requires a dual-link connector. But even dual link will only double the number of pixels you can get. 1920x1200 is about 2.3MP. 3840x2400 is 9.2MP - about 4 times as many pixels. You'd need four DVI links (meaning two dual-link cables) to drive such a display. I don't think many consumers would want something like that - it would remain strictly a specialty/professional device and probably very expensive as a result (economies of scale, etc.)
I know "DisplayLink" or whatever the new one is should allow for higher resolution, but IIRC, it's total bandwidth isn't that much higher than HDMI or Dual-link DVI.
HDMI's video link is a single-link DVI connection with optional HDCP encryption. Dual-link DVI can push twice as many pixels as HDMI, but that's still half of what you need for 3840x2400.
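
To sanity-check that link math, here's a tiny sketch treating one single-link DVI connection as good for roughly 1920x1200 at 60Hz (an approximation, not an exact spec limit):

```python
import math

SINGLE_LINK_MP = 1920 * 1200 / 1e6   # ~2.3 MP per single DVI link (assumed)

for w, h in [(1920, 1200), (2560, 1600), (3840, 2400)]:
    mp = w * h / 1e6
    links = math.ceil(mp / SINGLE_LINK_MP)   # single links needed
    cables = math.ceil(links / 2)            # dual-link cables needed
    print(f"{w}x{h}: {mp:.1f} MP -> {links} single link(s), {cables} dual-link cable(s)")
```

Which comes out to one dual-link cable for 2560x1600 and two for 3840x2400, as described above.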
 
With the release of the iPhone dev kit, I can't wait to see what kind of multi-touch apps come to market for the iPhone! One of my favorite show-off apps when I was using a jailbroken iPhone is Photoboard. Really good stuff!
 
Jeff Han and Perceptive Pixel only have patents on the use of Frustrated Total Internal Reflection (FTIR) in multi-touch applications. This means IR light is beamed through acrylic sheets and scatters wherever fingers (or other things) touch the surface, and some sort of computer vision software interprets and processes the result.
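
Purely as a hypothetical illustration of that computer-vision step (not Perceptive Pixel's actual code): under FTIR, touches show up to the IR camera as bright blobs, so the core of the detection can be as simple as thresholding plus blob labeling. A minimal numpy/scipy sketch, with an assumed 8-bit camera frame and a made-up threshold:

```python
import numpy as np
from scipy import ndimage

def find_touches(ir_frame: np.ndarray, threshold: int = 200):
    """Return (x, y) centroids of bright blobs in an 8-bit IR frame."""
    mask = ir_frame > threshold                 # fingers light up under FTIR
    labels, n = ndimage.label(mask)             # group adjacent bright pixels
    centers = ndimage.center_of_mass(mask, labels, range(1, n + 1))
    return [(x, y) for y, x in centers]         # center_of_mass gives (row, col)

# Fake 640x480 frame with two "fingers" pressed on the surface:
frame = np.zeros((480, 640), dtype=np.uint8)
frame[100:110, 200:210] = 255
frame[300:310, 400:410] = 255
print(find_touches(frame))                      # two centroids, one per touch
```

A real system would also track blobs across frames to tell a drag from a new touch, but the per-frame detection really is about this straightforward.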

As for whether they are running OS X: pretty sure they aren't. The OS doesn't have much to do with it, because it's all custom software running on top of the OS; none of the features/capabilities are determined by the OS (other than performance).

As for free-floating displays, there really needs to be some sort of eye tracking in conjunction with the finger tracking; otherwise the computer might not have any sense of what you're pointing at. Humans also work better when there is some sort of tactile interface -- a physical object to engage with, rather than a synthesized substitute.

Building your own multi-touch setup is fairly easy, and creating your own FTIR implementation is totally legal as long as you aren't selling it.
 
With the release of the iPhone dev kit, I can't wait to see what kind of multi-touch apps come to market for the iPhone! One of my favorite show-off apps when I was using a jailbroken iPhone is Photoboard. Really good stuff!

Multi-touch on the iPhone is a misnomer; I'm fairly certain it's only bi-touch. If someone can show me an example where three points of interaction are implemented, please let me know.
 
Multi-touch on the iPhone is a misnomer; I'm fairly certain it's only bi-touch. If someone can show me an example where three points of interaction are implemented, please let me know.

I can't think of any three+ point gestures, but the screen is certainly capable of registering more than two points of contact.
 
It's way more complex than that.

I have one, but it's CRT based. 17" CRTs with 1600x1200 resolution, and 20" CRTs with 2048x1536 were common. They might still be common, except that it's hard to find people selling CRT displays today.

I've never seen that size/resolution on an LCD panel. There may be some manufacturing issues making LCDs with dpis that high that would drive the price up beyond the point of profitability.
Assuming they can manufacture LCDs at that resolution for a reasonable cost, there's still the matter of interfaces. A single-link DVI interface is pretty much maxed out at 1920x1200. To go beyond that (like the 30" displays) requires a dual-link connector. But even dual link will only double the number of pixels you can get. 1920x1200 is about 2.3MP. 3840x2400 is 9.2MP - about 4 times as many pixels. You'd need four DVI links (meaning two dual-link cables) to drive such a display. I don't think many consumers would want something like that - it would remain strictly a specialty/professional device and probably very expensive as a result (economies of scale, etc.)
HDMI's video link is a single-link DVI connection with optional HDCP encryption. Dual-link DVI can push twice as many pixels as HDMI, but that's still half of what you need for 3840x2400.


Yes, I'm sure there are yield issues with higher-DPI LCD screens, but I'm not talking about making these for the average budget PC buyer. Also, I just want to see some incremental progress, like moving the 2560x1600 panels down to a 20-24" size or making 18-20" 1920x1200 panels. That's not that large of an improvement over what we have now. Of course, a 30" 3840x2400 would also be welcome ASAP.

As far as interfaces go, you are missing a lot of information. Yes, dual-link DVI maxes out at 2560x1600 if you want decent refresh rates (60Hz+), but in fact 3840×2400 (WQUXGA), with all its 9.2 megapixels, can run over dual-link DVI -- but only at 33Hz. Not exactly optimal for most situations. DisplayPort, the new standard for PC monitors on its way into the market, expands dual-link DVI's 7.92Gbit/s to 10.8Gbit/s. It's an improvement and can run 2560x1600 at 75Hz, but obviously not groundbreaking, especially since it uses a different encoding system that increases overhead.

Now, what it appears you didn't know is that the HDMI spec is more complex and has evolved since the first versions. The single-link DVI video channel (4.9Gbit/s) was only for the very first devices based on HDMI v1.0. The modern HDMI 1.3 specification increases the bandwidth of the video connection from 4.9Gbit/s to 10.2Gbit/s, which is higher than the 7.92Gbit/s of dual-link DVI and almost the same as the upcoming DisplayPort connection.

Even more remarkable, the HDMI specification contains a second connector type, called "Type-B," which has 29 pins instead of the 19 of the smaller, widely used "Type-A" connection. This larger connector (which actually looks like a standard DVI connector) doubles the signaling frequency of HDMI v1.3 and thus doubles the bandwidth from 10.2Gbit/s to 20.4Gbit/s. Although this alternative connection has not been manufactured yet, the spec is ready to go for future high-resolution devices, such as the "quad-HD" 3840x2160 TVs that manufacturers have been showing off. Indeed, since regular dual-link DVI can run a 4-megapixel (2560x1600) display with its 7.92Gbit/s, HDMI Type-B's 20.4Gbit/s of video bandwidth should easily be enough to run the 9-megapixel 3840×2400 (WQUXGA) displays. Because of HDMI's impressive specs, I really hope that DisplayPort has specifications for some type of dual-link connection or a much faster future signaling rate to get above its 10.8Gbit/s. I don't know enough about it yet to say, but I'm sure they have future revisions in the works.
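
To tie all these numbers together, here's a rough sketch comparing required video bandwidth (pixels × refresh × 24 bits, plus a ~5% reduced-blanking allowance, which is my estimate rather than a spec figure) against the link rates cited above:

```python
LINK_GBPS = {                       # link rates as cited in this thread
    "single-link DVI / HDMI 1.0": 4.9,
    "dual-link DVI":              7.92,
    "HDMI 1.3 (Type-A)":          10.2,
    "DisplayPort":                10.8,
    "HDMI 1.3 Type-B":            20.4,
}

def required_gbps(w, h, hz, bpp=24, blanking=1.05):
    """Approximate raw video bandwidth needed, in Gbit/s."""
    return w * h * hz * bpp * blanking / 1e9

for w, h, hz in [(1920, 1200, 60), (2560, 1600, 60), (3840, 2400, 33), (3840, 2400, 60)]:
    need = required_gbps(w, h, hz)
    fits = [name for name, cap in LINK_GBPS.items() if cap >= need]
    print(f"{w}x{h} @ {hz}Hz: ~{need:.1f} Gbit/s -> {', '.join(fits) or 'none listed'}")
```

By this rough math, 3840x2400 at 33Hz just squeaks under dual-link DVI's 7.92Gbit/s, and only the Type-B connector has comfortable headroom for it at 60Hz, which matches the figures above.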
 