Despite what some people here seem to think... this is actually a very good move – we should see a lot more new web-driven applications in the near future, and maybe even some developed by Apple. All Palm needs is one or more killer applications – others can use patents too, you know – and there you have it... a competitor to the iPhone (let the flaming begin).

Also, iPhone developers might want to expand and cash in that extra check. It might – I haven't checked – also be a little easier to write a webOS app.

P.S. I am also wondering about the age of iPhone users, because it must skew younger – I don't know a single person above 25 with an iPhone (all the Palm users I know are a lot older and less willing to install dozens of useless applications).

Now you! I was born in 1939, my daughter was born in 1964 and her daughter in 1996... That's 3 generations of iPhone users.
 
First of all, I am not an Apple fanboy (er, fangirl).

I have both Sprint and AT&T, and probably by the end of the year WILL be doing cross-platform development for both the iPhone and the Pre – and if Sprint or AT&T gets Android, that platform too. You want to maximize your customer base as much as you can, because market conditions change rapidly.

There is so much misinformation about the Pre it's funny.

I think Synergy is cool, but it's not much use unless you're tied to a Palm server – but aren't phones connected something like 98% of the time?

webOS / Mojo

Just started reading a new book on that.

Palm webOS, 1st Edition
By: Mitch Allen
Last Updated on Safari: 2009/07/07
Publisher: O'Reilly Media, Inc.
Pub Date: September 2, 2009 (estimated)
ISBN: 978-0-596-15525-4
Pages: 400


It's interesting, I'll give it that. The Palm Pre uses an embedded Linux operating system; Mojo is a very big class library with a lot of ties into the device.
This is not really new, and it is possible to use Mojo with an iPhone – there is at least one open source project I found on a Google search.

Misconceptions -

1) Mojo apps are nothing more than Web 2.0 apps, similar to what we developers started with for the iPhone.
Answer: Yes and no. Mojo is based on HTML5; it's a really rich library with a lot of native ties.


2) Mojo is a native development environment.

Answer: No, in my opinion. The underlying environment is Linux, much as the underlying environment on the iPhone is OS X, but with a lot of extensions which can be called from the Mojo library. In a sense Mojo is much like Dojo: a very large, full-featured JavaScript library. But it is still ALL interpreted code.
I don't think you are going to be able to create games in this environment comparable to OpenGL ES on the iPhone, even though you do have access to OpenGL; it's highly abstracted.

3) I find this one funny: you can create apps faster on the Pre than on the iPhone.

This is an unfair comparison; you can't compare webOS/Mojo to Objective-C. A fairer comparison would be Mojo development to:

1) a hybrid native/web application created with Dashcode using QuickConnect or PhoneGap, or

2) a server-side Java/Dashcode integrated application.

Using either of the above techniques, you can create a fairly good, fully featured application in a single day.
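For concreteness, here is roughly the shape of the interpreted Mojo code discussed above – a sketch only. The `CounterAssistant` and `MockController` names are mine, and a tiny mock stands in for the controller object that the real webOS framework would supply; actual Mojo widget names and attributes differ.

```javascript
// A minimal stand-in for the controller Mojo hands to an assistant.
function MockController() {
  this.widgets = {};
}
MockController.prototype.setupWidget = function (id, attributes, model) {
  // Real Mojo binds an HTML element to a widget; we just record the binding.
  this.widgets[id] = { attributes: attributes, model: model };
};

// Assistants are plain JavaScript constructor functions; the framework
// calls setup() when the scene is pushed onto the stage.
function CounterAssistant() {
  this.model = { count: 0 };
}
CounterAssistant.prototype.setup = function () {
  this.controller.setupWidget('count-button',
    { label: 'Tap me' },  // widget attributes
    this.model);          // widget model (data binding)
};
CounterAssistant.prototype.handleTap = function () {
  this.model.count += 1;
};

// Simulate the framework driving the assistant:
var assistant = new CounterAssistant();
assistant.controller = new MockController();
assistant.setup();
assistant.handleTap();
```

The point stands either way: it's all ordinary JavaScript objects, interpreted at runtime, with the native ties hidden behind the library.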

I've written tutorials on the very subject.

http://www.jsfcentral.com/listings/A21034


And a lot more on our site at

www.mooncatventures.com/blogs


Personally, I believe the Pre is very comparable to the iPhone. An iPhone killer? Probably not. But most of its key features can be duplicated on the iPhone, and that includes Synergy. Synergy is basically a cloud service; webOS does mashups, and this can be duplicated using a capable backend server or Google App Engine, or a native iPhone app with APNS and a Cometd provider on the server side: Dojo, Andromeda, ICEfaces, etc.



Thanks for the analysis.

The comments about interpreted code and OpenGL ES are particularly interesting.

HTML and CSS tend to be pretty verbose and have to be interpreted, too, to render the page. I assume that this, along with the JavaScript interpretation, would put much more burden on the CPU (and battery) compared to executing native code.

This would also affect other apps running in the background/foreground.



Here's a potential test:

The iPhone has a CoverFlow UI for the iPod app, but the CoverFlow API is not approved (revealed) for developer use.

Using OpenGL ES and Objective-C, it is possible to implement a CoverFlow UI that equals (even outperforms) the Apple implementation (even on a 1G iPhone).

Would this be possible at the Mojo level... at the underlying Linux level?

Dick
 

Summary: iPhone can do more serious gaming and Pre can't. Pre Fail.

Why people keep coming back to this argument is beyond me. I DON'T WANT TO PLAY 3D GAMES ON MY PHONE. There, I said it. I had an iPhone 3G for a year and bought games only to be disappointed with the user interface (no controls). Tilt controls are only tolerable to a degree and ruin many games. Ironically, the games I played the most on my iPhone 3G were Mafia Wars along with Uno, Scrabble, etc., which can EASILY be done on the Pre.

I think that's an oversimplification. OpenGL ES and the accelerometer have many serious (non-gaming) uses. Augmented reality has already been mentioned; some other potential uses of animation: scientific, education, medical... and animated presos in these or any business field.

In a single device the size of a pack of cigarettes, you have more computing, graphics, communications, networking, entertainment capabilities than existed a decade ago in a roomful of hardware/people and software costing millions of dollars. And these are all integrated, at your disposal with some inexpensive apps.

(graphics && animation && control) != games

Dick
 
I think that's an oversimplification. OpenGL ES and the accelerometer have many serious (non-gaming) uses. Augmented reality has already been mentioned; some other potential uses of animation: scientific, education, medical... and animated presos in these or any business field.

In a single device the size of a pack of cigarettes, you have more computing, graphics, communications, networking, entertainment capabilities than existed a decade ago in a roomful of hardware/people and software costing millions of dollars. And these are all integrated, at your disposal with some inexpensive apps.

(graphics && animation && control) != games

Dick

Well put. The convergence of devices. This is where it's all headed. Everything from gaming to conference calls, on a single device, that does everything well. That last bit wasn't really possible up until now. But we're moving away from devices dedicated to a single task. The technology is there to exploit and build on in terms of consolidating a mountain of functionality into a simple, well-designed device.
 
In a single device the size of a pack of cigarettes, you have more computing, graphics, communications, networking, entertainment capabilities than existed a decade ago in a roomful of hardware/people and software costing millions of dollars. And these are all integrated, at your disposal with some inexpensive apps.
While I agree that technology is getting smaller, faster and less expensive, I think you got a little carried away with your comparison. I mean, 10 years ago people were on broadband internet fragging each other in UT or Q3A, and Mac users got to launch FCP for the first time. The 3GS might be comparable to typical desktop machines from 10 years ago, but saying it'll beat out the IBM and Cray supercomputers of the day is a bit of a stretch, don't ya think? ;)


Lethal
 
While I agree that technology is getting smaller, faster and less expensive, I think you got a little carried away with your comparison. I mean, 10 years ago people were on broadband internet fragging each other in UT or Q3A, and Mac users got to launch FCP for the first time. The 3GS might be comparable to typical desktop machines from 10 years ago, but saying it'll beat out the IBM and Cray supercomputers of the day is a bit of a stretch, don't ya think? ;)


Lethal

Mmmm... I didn't mention IBM or Cray supercomputers.

I'll concede the point on timing, though-- more like 15-20 years ago.

I worked for IBM from 1963 to 1980 as a Systems Engineer and later as a Technical Market Support consultant for database/data communications... all on mainframe computers. My responsibilities included assuring that the customers and their IBM representatives did whatever was necessary to successfully install their applications, on time and on budget. So I do have an understanding of a room full of mainframe computers and the people necessary to support them.

I bought an Apple ][ in 1978 and it had roughly the same capability (RAM, OS, etc) as an IBM 360/40 mainframe (except it didn't have a fan).

With the advent of VisiCalc, these "toys" began making inroads into businesses (including the IBM Corporation) to bypass the mainframe logjam.

Over the years, "microcomputers" revolutionized the industry and became the focal point for innovation.

When he was told that Apple Computer had just bought a Cray to help design the next Apple Macintosh, Cray commented that he had just bought a Macintosh to design the next Cray.

Now, it is happening again with the smart phone, which adds anywhere/anytime connectivity.

Hang on, it's going to be one helluva' ride!
 
Mmmm... I didn't mention IBM or Cray supercomputers.

I'll concede the point on timing, though-- more like 15-20 years ago.
I was just taking a stab in the dark as to what would've been a computer the size of a room and costing millions of dollars 10 years ago. With the 40th anniversary of the moon landing upon us, it still dumbfounds me how we were able to operate a successful space program given the technology of the times. Just incredible how much has changed in just 40 years.

Now, it is happening again with the smart phone, which adds anywhere/anytime connectivity.

Hang on, it's going to be one helluva' ride!
Completely agree.


Lethal
 
I was just taking a stab in the dark as to what would've been a computer the size of a room and costing millions of dollars 10 years ago. With the 40th anniversary of the moon landing upon us, it still dumbfounds me how we were able to operate a successful space program given the technology of the times. Just incredible how much has changed in just 40 years.


Completely agree.


Lethal

IBM made the computer that went on the Moon Landing flights. It was called the "Suitcase Computer" because of its small size (for that time).

The computer was quite sophisticated (again, for that time) and had 3 entirely different sets of everything (Power Supply, CPU, RAM, etc.). The threesome would each compute their result to a problem then they would "vote" on the answer.

Sounds scary-- but it brought men to the moon and back!
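The "vote" described above is classic triple modular redundancy. A toy sketch of the idea, assuming each of the three units simply returns its computed result:

```javascript
// Triple modular redundancy: three independent units compute the same
// answer and a majority vote picks the result, masking a single fault.
function majorityVote(a, b, c) {
  // Any value reported by at least two of the three units wins.
  if (a === b || a === c) return a;
  if (b === c) return b;
  return null; // total disagreement: no majority exists
}

// One unit glitches; the other two out-vote it.
var agreed = majorityVote(42, 42, 41); // agreed is 42
```

The scheme tolerates any single faulty unit, which is exactly why it was trusted for flight-critical computation.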
 
IBM made the computer that went on the Moon Landing flights. It was called the "Suitcase Computer" because of its small size (for that time).

The computer was quite sophisticated (again, for that time) and had 3 entirely different sets of everything (Power Supply, CPU, RAM, etc.). The threesome would each compute their result to a problem then they would "vote" on the answer.

Sounds scary-- but it brought men to the moon and back!

As I recall we invested something like 5% of our yearly GDP to get to the Moon.

Just taking 5% today would give NASA roughly $700 Billion towards their space program.

Yes. I'd imagine we could do it.
 
As I recall we invested something like 5% of our yearly GDP to get to the Moon.

Just taking 5% today would give NASA roughly $700 Billion towards their space program.

Yes. I'd imagine we could do it.

It was 4.6% of the federal budget just for NASA at the time, so that is quite a bit less than 5% of GDP, but still a lot of money.

Now NASA's budget is just 0.5% of the federal budget.

These numbers were pulled from an article I read two days ago in the Houston Chronicle.
 
It was 4.6% of the federal budget just for NASA at the time, so that is quite a bit less than 5% of GDP, but still a lot of money.

Now NASA's budget is just 0.5% of the federal budget.

These numbers were pulled from an article I read two days ago in the Houston Chronicle.

Ah, but we had a leader that inspired us to the greater good (though I didn't vote for him).

As an aside, we had a power outage over an hour ago; I have about 40 thousand dollars' worth of computer gear sitting... waiting.

I am in the back yard (in total darkness) on my 3GS touching the world as if nothing happened!
 
IBM made the computer that went on the Moon Landing flights. It was called the "Suitcase Computer" because of its small size (for that time).

The computer was quite sophisticated (again, for that time) and had 3 entirely different sets of everything (Power Supply, CPU, RAM, etc.). The threesome would each compute their result to a problem then they would "vote" on the answer.

Sounds scary-- but it brought men to the moon and back!

Scary? Nah. Now if the three computers had played rock, paper, scissors instead of voting *that* would've been scary. :D


Lethal
 
When he was told that Apple Computer had just bought a Cray to help design the next Apple Macintosh, Cray commented that he had just bought a Macintosh to design the next Cray.

Extremely misleading to drop that into the "progress of computational horsepower" conversation. It is not an example of that.

It is a completely different approach to solving a problem. Apple was trying to model with a computer some things that Seymour Cray would figure out himself (modest brain + supercomputer). Seymour was using the Mac to help him document and tinker with the design (super-brain + modest computer).


There are also limitations on the apps you can put on a mobile device as long as the screen size is around 3 inches, too. For example, you wouldn't want someone reading an X-ray on an iPhone. Anything that needs to be read in a context bigger than the size of the screen isn't going to work.

Similarly, you wouldn't want medical records stored on a mobile (easily misplaced/lost) device either. [Remote kill/wipe is a way of cleaning up after screw-ups, not a way to prevent them in the first place.]

There is medical reference and data entry stuff that has classically run on Palm OS and others that fits the criteria.

Interfacing with medical diagnostics equipment: again, bar charts and graphs aren't an OpenGL problem.

There is a "could you" jam a program into a mobile device aspect, but there is also a "should you" aspect.

Similar with education: split between "game" and reference/knowledge. The latter isn't an OpenGL problem.

etc.



If you want to go back in history:

When minicomputers showed up, mainframes were going to disappear in a decade or so.
When personal computers showed up, mainframes were really going to die now (although it had been about a decade and they were still around), and those minis were on borrowed time too.
Now we're in an era where mobile handhelds have shown up. Mainframes are really, really, really going to die now (it has been over two decades since they were supposed to die). Minis: so doomed. PCs: dead platform walking.

Notice the pattern?

There is some stuff that will migrate down to smaller platforms and other stuff that won't.
Network/data bandwidth to a radio-connected remote handheld will very likely lag way behind hardwired machines.
 
Extremely misleading to drop that into the "progress of computational horsepower" conversation. It is not an example of that.

It was not a conversation about horsepower, but rather about the capabilities integrated into a single small device and the usability of same. The OP was claiming that graphics and the accelerometer were for games and not useful on a smart phone.

It is a completely different approach to solving a problem. Apple was trying to model with a computer some things that Seymour Cray would figure out himself (modest brain + supercomputer). Seymour was using the Mac to help him document and tinker with the design (super-brain + modest computer).

Which is the reason I used the quote-- the convenience/utility, not the horsepower. Borrowing from car racing, we [IBM] mainframe proponents used to claim: "There's no substitute for cubic inches," i.e., horsepower. But in many cases there is: convenience-- you wouldn't need/want to take a mainframe (mini, desktop, laptop) to a baseball game...

There are also limitations on the apps you can put on a mobile device as long as the screen size is around 3 inches, too. For example, you wouldn't want someone reading an X-ray on an iPhone. Anything that needs to be read in a context bigger than the size of the screen isn't going to work.

There are useful programs that do allow sharing of x-ray and other images:

http://abcnews.go.com/Technology/story?id=8054939&page=1


I agree that a small screen may be an issue to some degree. But how big a screen do you need to use Google Earth/Maps? Certainly, a larger screen would allow the display of more data... how large? 21 inches? 30 inches? 200 feet?

Similarly, you wouldn't want medical records stored on a mobile (easily misplaced/lost) device either. [Remote kill/wipe is a way of cleaning up after screw-ups, not a way to prevent them in the first place.]
Like it or not, your medical/financial/personal data are stored on computers accessible from the Internet. Putting them on a smart phone need not expose them to unauthorized access. Encryption is one solution. Another is to not put the data on the device at all, but stream them when needed.

There is medical reference and data entry stuff that has classically run on Palm OS and others that fits the criteria.

Interfacing with medical diagnostics equipment: again, bar charts and graphs aren't an OpenGL problem.

There is a "could you" jam a program into a mobile device aspect, but there is also a "should you" aspect.

Similar with education: split between "game" and reference/knowledge. The latter isn't an OpenGL problem.

etc.

Have a look at:
http://www.youtube.com/watch?v=lnLKJDGE9Dg

You may not enjoy someone poking about your entrails exposed on a smart phone, but it could save your life!

If you want to go back in history:

When minicomputers showed up, mainframes were going to disappear in a decade or so.
When personal computers showed up, mainframes were really going to die now (although it had been about a decade and they were still around), and those minis were on borrowed time too.
Now we're in an era where mobile handhelds have shown up. Mainframes are really, really, really going to die now (it has been over two decades since they were supposed to die). Minis: so doomed. PCs: dead platform walking.

Notice the pattern?

There is some stuff that will migrate down to smaller platforms and other stuff that won't.
Network/data bandwidth to a radio-connected remote handheld will very likely lag way behind hardwired machines.

Those are good points. But if you consider the question to be one of having access to computing power (mainframes, minis, LAN/WAN, desktop, laptop), then the smart phone shows the future. The smart phone gives you much of the [computing power] you need within the device itself. When you need more [computing power], it is a touch away: anytime, anyplace.

With the iPhone and iPod touch, Apple alone has put access to computing power in the hands of over 40 million people. I doubt that one tenth of that number have direct access to mainframes and such.
 
I think this is probably scarier news for RIM than anyone else. As users get accustomed to nice UIs on their smartphones and a decent selection of apps, the iPhone, webOS, and Android will appeal more and more to users, while the BlackBerry's antiquated user interface will turn off more and more buyers.

Just my predictions.
 