View Full Version : AMD edges past Intel in October processors for U.S. computer...
Nov 9, 2005, 09:44 AM
Category: News and Press Releases
Link: AMD edges past Intel in October processors for U.S. computer market. (http://www.macbytes.com/link.php?sid=20051109094459)
Posted on MacBytes.com (http://www.macbytes.com)
Approved by Mudbug
Nov 9, 2005, 09:52 AM
Apple should be using AMD or IBM or Intel
I don't even know who I want Apple to use any more! Just pick one and stick with it!:eek:
Nov 9, 2005, 09:58 AM
No surprise. It was bound to happen. I've been using AMD in PCs for about 5 years now. Better product and better price.
Nov 9, 2005, 10:05 AM
Intel on the road in laptops.
AMD on the desk in PowerMacs.
Can they call an Intel/AMD desktop machine a PowerMac? They are called PowerMacs because they use the PowerPC processor.
Nov 9, 2005, 10:41 AM
It used to mean that :) But the "Power" in "PowerMac" means whatever Apple wants it to. They're not PowerPCMacs after all.
For instance, PowerBooks existed long before they had PowerPC chips. "Power" simply meant power. There's no reason PowerMacs and PowerBooks can't share that meaning in the future.
I'd expect Apple would want to keep getting value from the PowerMac brand.
(And to clarify, AMD didn't sell more chips than Intel, but its chips were in more RETAIL store sales. Intel still has Dell, for instance. Still an achievement. And Apple can use AMD or Intel or both in future if they wish.)
Nov 9, 2005, 10:43 AM
They are called PowerMacs because they use the PowerPC processor.
Didn't Apple use 68k before PPC and still call them PowerMacs?
Nov 9, 2005, 10:48 AM
Didn't Apple use 68k before PPC and still call them PowerMacs
I thought that it was Quadra or Performa before they went PPC.
Well, I guess I'll just have to hop over to apple-history.com and check-ch-check it out.
edit: yeah, looks like they weren't called PowerMacs until they got the PowerPC chips. Quadra, Centris, LC, Performa. Some of the later Performas had PPC chips, but no PowerMacs had pre-PPC chips.
The PowerBook however is another story.
Nov 9, 2005, 11:14 AM
God, people. Wait for 2006. Intel's going to get their collective **** together. If they weren't, Apple would never have gone with Intel. The simple, undeniable truth is that when Intel isn't being led around by the nose by marketing dweebs, they can make some DAMN good products. The P4 and NetBurst were pushed by marketing morons whose sole goal was to propagate the MHz myth and thereby increase sales. This worked for a time, until people actually started catching a clue that more MHz != more performance. At which point someone at Intel said "uh-oh, we're screwed" and started working on a replacement. Said replacement will show itself in fall 2006. Some are saying improvements could be as much as 20% at lower clock speeds. (Think I read that on AnandTech.) Watch as AMD starts taking a few uppercuts from Intel in 2007. By no means is there going to be a single knockout blow, but I think Intel has gotten its second wind.
The new chips + the massive fabrication capabilities available to Intel is putting Intel back into the desktop game.
The only concern I have is Intel's stubborn insistence on keeping the memory controller off the chip. Does anyone know a legitimate reason why they are doing this? I think this one feature alone gives AMD a massive leg up in future designs. The integrated memory controller is part of the overall design that allowed AMD such a smooth transition to dual cores.
I'm hoping to see Intel jump onto the onboard memory controller in the coming years. Maybe 2007? :confused:
Nov 9, 2005, 11:16 AM
AMD and Intel are constantly leapfrogging each other in chip performance and sales. IIRC, AMD has been in front of Intel in terms of sales before.
This means nothing for Apple - it's not an indication that AMD should have been chosen over Intel.
Like Nvidia and ATI, Intel and AMD are leapfrogging each other.
Anyway, this is US sales, not worldwide - the world really doesn't revolve around the States.
Nov 9, 2005, 12:06 PM
I have no desire to see Intel "beat" AMD (or vice versa). I'm just glad that in the long term, Apple will still have TWO potential chip suppliers, both with a better delivery record than IBM and Freescale.
I don't care which one has the fastest chip on a given date any more than I cared whether IBM or Motorola made the chip after the G4. I just want Apple adopting the fastest chips available about as quickly and as regularly as other PC makers do.
Nov 9, 2005, 02:06 PM
WRT performance, as has been mentioned, both AMD and Intel are comparable. For 32-bit code, each company's high-end chips perform similarly, with the companies leapfrogging each other with each release.
For 64-bit code, AMD has a huge head start. Their 64-bit model (by virtue of being backward compatible with 32-bit x86 code) was the first to be adopted by a non-trivial number of PC users. Intel has had to play catch-up, since their 64-bit solution (Itanium) at the time was not backward compatible. Of course, Intel now has 64-bit x86 chips, but, ironically, they have to follow AMD's model, because Microsoft has refused to create yet another incompatible 64-bit x86 release of Windows.
I always recommend AMD to friends looking to build systems. Not because they're more powerful, but because they tend to cost less. In some cases, a lot less.
But in terms of what Apple should use, that doesn't matter as much. Intel's chips are a lot less expensive when you buy millions of them. That's why Dell can sell P4 systems at dirt-cheap prices, when individuals like you and me couldn't come close to building the same system for that price.
I'm sure Apple will get similar pricing. So Apple can choose between two vendors, with comparable power and comparable prices. Which means the decision is now based on other criteria. Fabrication capacity and a very attractive roadmap for low-power chips is almost certainly what drove Apple's decision.
Of course, if Intel doesn't live up to the promises, Apple can still switch to AMD. Since the chips are software compatible, they won't break the OS or applications, which means very few users will ever be impacted. (I suppose there will be some programs that directly make 3DNow! calls without using a portable framework, but any developer with a brain will know better, after having had to port code away from direct AltiVec calls.)