Wow - what emotion behind this subject, eh?
For those just tuning in...and with no patience to read through the preceding 100 pages of posts:
There's a whole group of "sky is falling" advocates. Many have the feeling the x86 chips are awful, and the view that PowerPC chips were so technologically advanced that this is what gave Apple part of its superior performance and reliability. Mix in with that the view that Intel has no innovative brains left, and that only PowerPC held the future of computing in hand.
Other threads of thought hinge on portability issues, under the assumption that major roadblocks exist moving code from PowerPC to Intel, along with major performance hits.
In case you missed my modest contribution around page 95 (which, by the way, was entered when there were only 90 pages in this thread), I'm a developer of 25 years. I know this stuff like you know your way home.
CPUs are going to evolve. By the time Apple puts an Intel chip in a machine for consumer purchase, it will not be the P4 you know today. Take that in with a deep breath if you're among those with the belief that all things Intel are junk.
Sure, various chips have "whiz-bang" this and "high speed" that - but do any of you actually have an engineering-level understanding of what that means?
Take, for example, the notion that a 70% performance hit is to be felt between AltiVec and the G3. On paper, in some benchmark article, or an ad, this is just factual enough that it can't be argued per se. However, if your application uses this feature that provides a 70% performance boost ONLY 5% OF THE TIME, then the total performance gain works out to only about 2% overall - that's Amdahl's law at work.
This is the problem with optimization in general. CPU features like these (SSE, SSE2, 3DNow!, whatever) are an optimization concept. They apply to tiny segments of operation, often inside loops or long iterations of some code. There are occasions where these optimizations have significant impact, but only occasions. Encoding video, for example, can get a considerable boost from an optimization of this sort. If you're constantly encoding video, that could matter to you. It will not change the overall personality of the computer.
It also doesn't serve well to compare G3-level performance to that of a machine that hasn't even been built yet. The P4 outperforms a G3 today. Comparing the P4 to the G5 doesn't yield much either, because the P4 isn't a single chip - it's a line of chips that's evolving. Jobs is no idiot. He's not about to select a chip from TODAY'S lineup for a machine to be released in 12 months or more. He's got engineering samples of chips no P4 customer will see for a year, and may just know a thing or two about how Intel's performance will compare to PowerPC's roadmap for a computer released a year out. Consider that before you spend too much energy complaining about Intel's current offerings. Have a little faith in Jobs's judgement here. He's had a good record of "looking ahead" for a while now.
On other points, don't overlook the fact that Unix is the interior of OS X. Unix is THE ORIGINAL portable operating system. Likewise, virtually anything a COMPETENT developer makes for Unix SHOULD have portability in mind. I'm not talking about OS portability here, either - I'm talking about chip independence.
How dependent on the chip do you think code is? I'll tell you outright - only hand-optimized assembler. No matter what application you're thinking of, somewhere underneath it is a build from C or C++. Unix was built using C precisely to give it portability across chips. No matter what "framework" or "runtime" obstacle you think is in the way, it can and likely will be solved by porting that underlying technology. Anything built on top of it gains natural portability when this is done.
The prime example is Unix itself. You don't re-write Unix to move it to a new chip. You port the C compiler it's built with to target the new chip - then use it to generate a Unix kernel for that chip, along with everything else that moves to the new chip.
Now, along the way, I'll bet there ARE a few lazy programmers who let chip-dependent code slip into production in some way. They should no longer retain their jobs. Unix and C breathe chip independence. That MUST have been on the minds of the developers working on OS X and all internal Apple products, and SHOULD have been on the mind of any developer making Mac OS X products.
If you think there's ANY roadblock that makes it impossible to move code from PowerPC to x86 - code that was written for OSX in the first place - you are simply mistaken. None of this stuff is made out of real material. It's all logic, thought, numbers - and all of it flows through a compiler to come out as a functional product. It can and will flow into an x86 build.
I'm not saying it's effortless - I'm saying unless the product is from incompetent hands, it's not only possible, it's practicable.
On the point of "the end of Apple": I think not - here's why. I don't own an Apple (surprised?). I'm a professional developer and engineer. I don't care what's in the box - it's irrelevant as long as it's fast, it works, and it does what I command it to do. Trust me, it does, or I take it apart (and machines don't like that, so they behave around me).
Anyway, until now I've considered an Apple only as a cross-platform development check - but to that end Linux has been fine for me. I've worked on Apple projects, but I worked on clients' Apples when I did.
Now, however, I have an option appearing that never existed. A machine that could run OSX, Windows XP and Linux. That animal never existed before. It just might have me buying an Apple as my development machine - like now!
As a developer, I can get one TODAY, according to news, for about $1000. At this point, if I'm considering any new machine, I must begin to ask myself, well, why not an Apple? I have fewer reasons to say no at this point.
Then, too, how about my clients? I make applications for smaller markets that pay big bucks for office systems (though it's not just 'bag of fields' type things, usually includes imagery, technical 'waveform' data, etc.). Some of these customers may think similarly - wow, a machine that works like an Apple, but can run that 'xxx' app I can only get on XP? Hmmm - how much?
There just might be a new market emerging there. Apple might be able to take advantage.
Now, about OSX hacked for non-Mac machines...uh, why? I'm certain there will be a hack or two - no doubt copy protection will be difficult to enforce. Some nut will spend 4 months of his mental energy dissecting and reverse engineering OSX, and for what? So a few geeks who might still run OS/2, Linux, XP and such can add OSX to their boxes? The underground distribution might be abuzz for a while, but to what end?
Apple's OSX isn't all about just being OSX. Its value comes from Apple itself - upgrades, support, applications. A hacked OSX will have about as much popularity as such hacks have right now. And...if that hacked version crashes even once, do you think the user would want to keep using it? It's just not going to have "market level" influence on Apple.