Some comments and a question.
After going through the phases of denial and anger, I have managed acceptance. From a software point of view, this switch should be a good thing. A handful of smaller software vendors will doubtless be lost. However, all reports say that rewriting for the new chip will be relatively painless. Supposedly Apple's software is already ready to go on Intel chips... and certainly all the apps should be by the time the new hardware goes on sale.
There will probably be a generation of major third-party software that will be installable on both hardware types. Then the version after that will require Intel chips. Which is really no different from most other two-generation intervals out there... some software that was brand-new when the 266MHz G3s were first released doesn't run in Classic on a G5 running Tiger. Another half-dozen years from now, supporting PPC chips will be irrelevant.
Touching on what PaulyPants and Mr. Vene said, this move should encourage vendors to port more software to the Mac, because it will probably be a lot easier with Intel chips underneath. Results will vary, of course, depending on the software. But game makers in particular, who I'm told rely heavily on low-level interaction with the chips, will hopefully find it much easier to make Mac versions with Intel chips underneath. I don't think we'll see OSX go the way of OS/2, for two reasons. One is that Apple makes hardware. You buy an Apple machine that comes with OSX, and no Windows. You'd then have to go out of your way to put Windows on it. The other reason is that OS/2 wasn't superior to Windows the way OSX is. I seriously doubt many folks who buy a Mac are going to say, "Man, this sucks. I'm just gonna use Windows for everything I can."

This assumes that OSX won't run on non-MacIntel machines, which people seem to be generally taking for granted. My guess is that as part of this deal Apple is giving Intel some chip tech involving hyperthreading, Altivec, or some such. Then the MacIntel chips will have features that standard Wintel chips do not. Thus OSX won't run on Wintel chips, while Windows will run on MacIntel, only without being able to fully utilize the chip.

So: Apple ditches a chip maker that's increasingly unable to provide the production and improvements they need to remain competitive. They gain software titles due to low-level porting being easier. In order to create a bigger product base for the online media-purchasing systems they already dominate, they pick up Intel's digital rights management capabilities. Imagine a day when you no longer need cable TV, because you can watch what you want cheaper through Apple's iMedia store.
And what about Microsoft, who you'd think would be unhappy about sharing a chipmaker with their OS enemy? The thing is, Microsoft isn't a hardware maker, but Apple is. How much does Microsoft really care who sold the hardware, as long as the customer is still buying MS Office? This move will probably cut into sales of Windows, as people realize how much better OS X is. However, Windows is already hugely pirated. Office is doubtlessly more profitable. The way Longhorn is (or isn't) coming, Microsoft may have decided that they don't have much hope of coming out with an OS that's superior to OSX. Combine all that with the fact that Windows is losing more market share to Linux than they are to Mac, and I can see M$ deciding that it's better to accept OSX gaining market share in order to reduce the ranks of people running Linux and buying nothing from M$ at all. As Jobs pointed out, Office is the Microsoft product nobody really wants to mess with.

My question is: what is it about Intel chips that's so bad compared to AMD? Unlike Tilmett I'm not particularly convinced of the G5's superiority. I've read that the G5 was a bit of a kludge, with stuff like the Altivec engine just tacked on instead of properly incorporated. It certainly hasn't been improved upon significantly in the last two years, and from what Apple's saying it's not going to be in the future. So, lack of elegance notwithstanding, Intel and AMD are going to make chips that perform better than anything PPC has to offer. To me that is all that matters. Apart from Intel's having the DRM business and better expected production ability, any thoughts on why Intel instead of AMD? I don't know all that much about the hardware differences. I've never had to care before. :|

PS: Having just read the last couple posts, I'd like to point out that many of us haven't been Mac owners for years due to primarily emotional reasons. As far back as 1988 I compared Apple software to M$ software and found that what was written for Apple was simply better. Years later I was agog to see that Win95 had an ugly, blocky 16-color interface when Mac had been 256 for years already. For me this has always been about superior functionality, which is mostly software. And who's to say that MacIntel chips won't have some sexy additions like keeping Altivec? I really doubt Macs will have generic Pentiums inside. I expect there will always be things that "only Macs can do". And finally we'll be able to put to rest some of the arguments over benchmarking: with OSX and Windows on the same machine, the differences will become clearer.
 
The big thing against Intel right now is their power-hungry "NetBurst" architecture (the philosophy behind the Pentium 4). Fortunately, Intel realized NetBurst was a dead end before it was too late, and has started taking steps to purge it from their lineup. Once this happens, Intel and AMD will be on more equal terms.
 
I think my biggest fear wasn't that PPC CPUs were going to be dropped, but the tie-up solely with Intel. I'm all for OS X on x86 if the speed gains are going to be there, but Intel do not make the best chips for desktops and haven't for years. I'm not a real techie, but I know enough. None of my PC-using friends, whether they use them for work in the programming and aerospace fields or just as general-purpose machines, would choose Intel over AMD. Intel are currently the company to go to for mobile technology (I don't think anyone would debate this), but in the high-end workstation and desktop arena AMD rule the roost. The Opteron and new Athlon X2 are world-beating chips. Now, obviously, if supply is going to be an issue then fair enough, but there have been no real complaints from within the resale industry about not being able to get AMD offerings.

I think Apple could have put a lot more people at ease by just announcing a move to x86. Unless Intel will be making customised chips for Apple, why do they have to be tied to a specific company? AMD's Dresden Fab 36 is supposed to enter full volume production by 2006, which should solve any of the supposed supply issues.

When the G5 moved to 90nm, a lot of people said it was IBM's inability to master that new process. I feel that statement needs to be revised: it must surely be IBM's inability to produce the G5 under the new 90nm process. While AMD's new fab has been under construction, it is my understanding that IBM took up some of AMD's production at the Fishkill plant, and AMD suffered no major problems with their transition. AMD are also on course to introduce 65nm next year. It has always intrigued me how Steve Jobs and Apple never really made any mention of AMD. Steve stood on stage and always compared Apple/IBM chips to Intel. Most PC techies laughed at this, knowing full well the performance kings were neither of those parties.

I hope, as Jason has said on this thread, that Apple have left themselves open to the possibility of using AMD further down the line.
 
I'm pretty much assuming that Apple is going with Intel in part because Intel is going to make customizations (or already has some, like DRM) that Apple wants. Having more power-efficient chips is a big deal for Apple due to their desire, as a hardware manufacturer, to make the best laptops they can. The whole Yonah thing is doubtlessly a significant part of Apple's future. I'm just curious what the technical essence of AMD's performance superiority is.
IMHO, absolute power matters less and less as fewer and fewer people need their whole processor to perform their usual tasks. I think Apple's decision was based mostly on factors other than maximum top end power.
 
Mukansa monkey:

In line with your comments:

I’m not yet convinced that Intel will be motivated to alter their line solely for Mac or Apple. My reasoning is simply that Intel has its own interests in mind, and has never demonstrated a willingness to alter its line for a single customer. To Intel, while the Mac news is a PR landslide, it isn’t a major business boost compared to the rest of their business. I think, instead, Mac represents one of a range of high-end customers to which they intend to cater, and for which they likely have technological advancements planned. Whether we’ll see those plans bear fruit is another matter, I must agree.

As to game development, I think the trend will move toward “physics engines” in the future. This takes the form of a drop-in card which performs simulation calculations, similar to the graphics engines we see now, but in the domain of physical simulation, which isn’t handled by the graphics chip at present, and where even dual- or quad-CPU systems could use a boost. Though it might be best if physics simulation features WERE built into the main CPU, I don’t yet get the sense that this is the direction of either Intel or AMD. If the physics engine does take the market, then Apple has, by virtue of the platform, brought the physics-engine card developers a little closer to the Apple platform, without making them develop dual product pipelines (though by the nature of a drop-in card, it might not have been much extra work anyway).


As to your question, “What makes Intel chips so bad?” Well, I too use AMD chips in my systems, and the comparisons I’ve made with Intel-based machines make it clear to me that Intel isn’t the speed king at present. Intel and AMD have leapfrogged each other over the years, but AMD has made rapid comebacks.

Intel based their main thrust on raw clock speed, and sacrificed circuit complexity to do it. Generally, the more complex the circuit, the tougher it is to ramp its raw speed upward. AMD realized that gains in IPC (instructions per cycle), even at the expense of clock speed, would provide more dramatic gains once clock speeds could be raised, and they knew other “physics” were as much at the heart of that issue as simply compromising early rather than late. Intel was shortsighted in this manner.
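The trade-off described above comes down to rough arithmetic: effective throughput is approximately IPC times clock rate. Here is a minimal sketch of that idea; the design styles and all the numbers in it are illustrative assumptions, not measured benchmarks of any real chip.

```python
# Toy model of the IPC-vs-clock trade-off. Effective throughput is
# roughly IPC (instructions per cycle) times clock rate. All figures
# below are illustrative assumptions, not real benchmark results.

def throughput_gips(ipc, clock_ghz):
    """Approximate billions of instructions retired per second."""
    return ipc * clock_ghz

# A hypothetical clock-first design (NetBurst-style): deep pipeline,
# high clock, but fewer instructions completed per cycle.
clock_first = throughput_gips(ipc=1.0, clock_ghz=3.4)

# A hypothetical IPC-first design (Athlon-style): shorter pipeline,
# lower clock, more work done each cycle.
ipc_first = throughput_gips(ipc=1.8, clock_ghz=2.2)

print(clock_first)              # 3.4
print(round(ipc_first, 2))      # 3.96
print(ipc_first > clock_first)  # True
```

The point of the sketch: under these assumed figures, the lower-clocked design still retires more instructions per second, which is exactly the bet described above.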

Hyper-threading hasn’t been much of a gain either; though it seemed a good idea in theory, real dual-core is by far more productive. AMD recognized this as well.

PowerPC, by nature of its RISC roots, pushed IPC over clock speed too. There are compromises in all designs, despite the advertising to the contrary, because physics, electronics, logic and the difficulties of mass production impose them. Intel simply didn’t make great choices on those compromises. Still, I wouldn’t go so far as to say their chips are garbage, or seriously bad.

I also don’t quite agree that there’s any significant mangling of the stream of code and logic moving through the processor. It certainly isn’t as convoluted as the Windows operating system, which really is a mangled mess to a great degree. Windows 95 and its cousins were the worst at this, and while NT is vastly superior to Windows 95/98/ME, it still bears the marks of legacy, which give it warts everywhere. Still, there’s decent potential in there if developers set aside most of the bad, and it’s no excuse for developing bad applications. Photoshop on the PC is one example: it works very well on my system, as do 3DS Max, Steinberg’s Wavelab, and a range of other applications. Adobe’s Premiere, however, can’t match Final Cut, but that’s not the chip or the OS at fault (well, Adobe might have had some complications from the OS, but it still isn’t a good excuse).

What is quite true, though, is that Intel’s offerings aren’t the fastest, and that does make it a puzzle just why Jobs went Intel-only. I’d like to learn whether his contract with them has the “gotchas” that Dell’s has (which is apparently why they’re all-Intel to date).

A lot of people here have attributed unreliable operation to Intel chips. This simply isn’t a factual representation. Reliability problems are strictly in the realm of the motherboard, chipset and drivers, and occasionally the fault of integrators (like Dell) tossing in marginal power supplies, drives or other peripherals. Of all the P4 1.4 through 2.5GHz Dells I’ve known (dozens in various offices, for example), every failure came down to bad disk drives (IBM Deskstars), bad motherboards, or something similar.

Now, to the contrary: I know a manufacturing plant that had, until about a year ago, an IBM PC XT in use since 1981. Yep, the darn thing ran every day, 12 hours a day, for 23+ years! They also have an AT (original 6MHz build) which is still running, from 1986 (or is that ’87?). My own old (old) dual P2-333MHz box is still running in its present home; I originally bought it (as parts) in 1998. The AMD-based 233, 300 and 450 (K6 variety) machines I bought in 1996 are still working. One served my son (when he was 3; now he’s 4, so I moved him up to a P2-450 from 1998). In other words, Apple isn’t the only company that makes a PC capable of running every day, all day (many of these machines are 24/7) for years.

My own Win2k development machine was last booted on May 9th, when I updated it with Windows service updates. It’s not been shut down since then. I typically leave it running for weeks at a time, often with debugging sessions that continue for days.

I would expect OS X to be easier to keep reliable than Windows of any flavor, for years ongoing. My point is that nothing about Intel makes the machine inherently unreliable or flawed, nor is AMD any less capable in this regard.

As to the attitude of preferring the PowerPC with adjectives like elegant, superior, more advanced... well, here’s my thought. I’ve heard that cussing in Italian is better than cussing in English. A character in the Matrix films preferred cussing in French, for reasons that made me grin, but I speak neither language well. My Colombian wife manages a searing retort in Spanish now and then, which I thankfully don’t understand, but when I feel the need to berate my fellow man, I haven’t found the English language lacking, though I admit I must not really know what I’m missing. A few Spanish colloquial insults my wife taught me sound impressive because they’re new to me. Once, when she mentioned at a business party how someone was “kissing arse” to gain favors, I looked at an acquaintance whose ears burned a bit and said, “Well, it sounds prettier in Spanish, I’m sure.”

The fact is, CPUs are all about turning a language of bits and bytes into function. They all use some kind of ALU, bit shifters, comparators and a host of other electronic gizmos which come to us from decades past (and a few from centuries past). Does it matter, really, if a cylinder head has a hemispherical top or a pyramidal one? There may be minor performance differences between one technological approach and another, but what really makes a difference is when performance jumps by orders of magnitude. A mere 20 or 30% difference is hardly noticeable in most situations; double and triple (or that major 10,000x+ jump we’ve made since the old 6502) does make major, meaningful differences.

PowerPC is still used in some of IBM’s own machines, but they also have the RS/6000 line to consider. Since they’re competing for customers in an arena where Intel and now even AMD compete, we see things happening to IBM that have happened to HP, Sun and SGI. Sun’s own SPARC, an interesting concept in its day, is giving way to Opterons. HP has several Opteron servers, and IBM can’t find a way to compete in the price/performance space all that well.

I have no doubt that innovation in the CPU will move forward by bounds in the years to come. I really thought Transmeta had some great ideas, but they fizzled out, licensed to the big guys, and got out entirely. Development in this industry is about genius and cash. John Cocke is gone now. I don’t know where his counterpart is employed, but someone at AMD is no slouch, and I think there may be at least one still at Intel, just moved out of the Itanium group. The rest is up to the physics guys, and some of them are elsewhere (unknown to me).

HP has a bright idea that might make big news in a few years (a new idea to replace the transistor, shrink size to 1/10th of today’s best silicon, and run faster than anything we know at present). So there are at least a few ideas in physics we aren’t yet even able to taste in product. Guess who’s got the most cash to license that gem?
 