nagromme said:
I'll let you research Intel's upcoming chip platforms yourself (they're not mediocre),

Anything that everyone uses is mediocre...

1) IBM had a history of saying something could be done and then being wrong.

Hmmm, let's see:
Montecito promised for 2004. Then 2005. Now it's 2006. Foxton completely dumped. Tukwila promised to bring eternal bliss and infinite processing power through a complete redesign; now it's just a lame Itanic goodbye version. The 4 GHz P4 cancelled. The new, proper FSB moved to 2007. The whole NetBurst line cancelled prematurely, long before reaching the promised 9-10 GHz. Should I go on, or do you see the fallacy in your argument?

Not a good business partner--unless of course you sell consoles which a) only need a new CPU every few years and b) are expected to be sold at a loss anyway.

Apparently a lot of huge corporations think IBM is a very good business partner. In fact, that's where they make most of their money: big business. The difference from Apple being that their other customers don't beyotch and moan, and frequently pay for IBM's services...

Seriously: if there's one company that's notorious for acting childish and unprofessional because of an ego-tripping CEO, it's Apple. Need I say "manually crossing out Radeons from Cube spec sheets"?

2) If Apple DID pay IBM the extra amounts required to help IBM meet its promises, then Macs would cost huge amounts more. (And STILL be released late when IBM misses its deadlines.) Is that a solution?

Apple has $8 billion in the bank now. 8 friggin billion! What is all that money for, if not R&D? I have always valued Apple as one of the few remaining computer manufacturers that actually still do R&D, with the rest of them just following the Dell "put the bruises on the banana and off it goes" model. Well, not anymore! Besides: CPU design cost is a one-time cost and does not scale at all with the number of CPUs produced.

Can you name any current PC maker that uses custom-made processors designed just for their unique computers, pays the development costs to the chip-maker, and still manages to be priced competitively and sell at the same profits other PC companies make? If not, then who is "everybody but Apple"?

Gosh, that's a lot of qualifiers you're throwing at me there. I'll take the liberty of not accepting all of them: Sony paid them. M$ paid them (that's "everybody but Apple"! But it's also perfectly normal in other fields of electronics; who ever said I was talking specifically about computers?). Besides: Sun still designs its own stuff.

shamino said:
Don't know, but it's certainly not for x86 compatibility. Itanium's total incompatibility with x86 software is what killed it as a viable product.

Wrong. Its totally ridiculous x86 performance is what killed it! Plus it being much, much too late!
Itanic CAN run x86 software. The emulation is even hardwired into the chip (Intel wants to take it out and do it in software though (like FX!32), just like they should have done right from the start!)

And most consumers don't run enterprise web servers on their desktops.

Doesn't matter. Take media encoding: still the same laughable performance! You missed the point; the point was that the credibility of these benchmarks was being questioned. An Apache benchmark, on a server CPU, of all things. Yeah, right! ;-)

The Xeon has always been the G5's direct competitor. So Paxville is the perfect comparison for the 970MP! The 970MP (and the Opteron) seriously kicks its ass, even if you can't stand to hear that shiny-happy-people Intel is just ridiculously outclassed right now...

The applications you're holding up as a benchmark are the kind that are best suited to Linux PCs (or larger UNIX systems from IBM, Sun and SGI) not anything running Mac OS.

Media encoding isn't done on the Mac? Or science work? Wow, I bet there are a few hundred thousand people who beg to differ! ;-)
And what exactly gives you the idea it will be ANY different in any other benchmark? Hell, its Cinebench performance also TOTALLY SUCKS!

As others have said, you're picking and choosing unrealistic benchmarks in order to make your point.

I call BS. Just because you don't like the results doesn't make them "unrealistic". GamePC's tests have been cited all over the web by various respected sources (e.g. Ars Technica). What exactly gives you the right to question their credibility? Bring on YOUR results then! Oh! You don't have any? Then how come you know the ones I'm quoting are "unrealistic"?
...and I'll say it again, just to make sure: media encoding IS ACTUALLY A VERY POPULAR DESKTOP TASK!

minimax said:
No you don't, as there will also be single-core Yonahs. It is very likely, almost to the point of certainty, that Apple will use those for the Mac mini and iBook.

Then they won't come in January. Check the roadmaps: single-core Yonahs are scheduled for mid-2006!

siliconaddict said:
Gah... people are pulling some of these specs out of their [bleep]. Aiden, where are you getting 10-20%?

If you had informed yourself before drivelling on and questioning other people's numbers, you'd know this yourself. Epic said as much when talking about their Unreal version for the Opteron. Cinebench also clearly shows it: 1107 for a dual Opteron 275 in 32-bit, 1381 in 64-bit. A 24.7% gain, hello thar!
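(For the record, that gain is easy to check: (1381 - 1107) / 1107 works out to just under 25%, assuming both runs were done on the same machine with the same settings.)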

You act as if Apple hasn't taken any of this into consideration. Do you really think that Apple wasn't aware of Intel’s roadmap when they made the plunge?

Well - if they did: where's their friggin plan for 64-bit x86? Apple simply does NOT SAY A SINGLE THING on the topic! Which should set alarm bells ringing for anyone with just half a brain!
Wouldn't you think they should actually TELL developers how they're going to proceed with 64-bit when they're unexpectedly introducing x86 right in the middle of the big 32- to 64-bit transition in the PC world?

As such don't you think that XCode is going to be/is ready for the migration between 32-bit and 64-bit?

Well - it can be made ready. Right now it isn't AT ALL! What exactly is the point of moving your code from 32-bit PPC to 32-bit Intel and then moving it AGAIN from 32-bit Intel to 64-bit Intel? How many developers (and customers! Think "have to buy new versions"!) will seriously put up with this everlasting transition crap, especially considering the OS X transition has only just ended?

Fact: in five years, when Apple drops support for the last PPC Macs and the Intel switch is finally over, there will be NO software still being made for 32-bit! Just like there's none anymore that doesn't use SSE2, because Intel's compilers automatically generate it!

Apple has done this type of migration before. They know what they are doing.

Apple has moved to a mainstream CPU before? Wow, I never knew! Jeez, stop with the Steve regurgitation, pretty please: this is NOT like 68k -> PPC! Apple is not just switching CPUs; they are ENTERING THE X86 MARKET this time, and they do NOT have any control over the CPU anymore!

Right now it doesn't look AT ALL like they know what they're doing. See: the missing x86-64 plan!

siliconaddict said:
There is still some contention as to when single cores will be shipping. I've read everything from January '06 to mid-summer '06.

Name one source that says January '06.

Also keep in mind that if it's true that the Mac mini may be going entertainment center on us

"To go entertainment center on us" has to go down in history as one of the most d*mb sounding expressions ev4r....
 
Kai said:
Apple has $8 billion in the bank now. 8 friggin billion! What is all that money for, if not R&D?
This is your argument for saying that Apple could pay more to make IBM get their job done, and yet not have to raise Mac prices as a result? In other words, Apple wouldn't be making a profit on those product lines, but that's OK because they have enough money already? They'd be taking a loss on every Mac, yet not go bankrupt for years! I like it! But it's out of touch with business reality.

Kai said:
Gosh, that's a lot of qualifiers you're throwing at me there. I'll take the liberty of not accepting all of them:
Precisely. You are carefully picking and choosing which "facts" not to ignore--and the ones you are choosing are often irrelevant to the current reality. (Which is troll-like behavior.) So, which of my qualifiers did you decide you could throw out because it's not relevant to Apple's situation? They ALL are. So you can't throw them out or your attempted point about Apple being too stingy to pay IBM is not made.

I'm not seeing a coherent argument to respond to--and that tells me that even YOU don't take your posts too seriously. I would recommend my fellow forum-goers share your attitude in that matter :) You sound deeply angry, but you're really just having fun. And fun is what Mac rumors should be about :)

I am reminded of the methods here:
http://homepage.mac.com/bhoglund/forumFudsters.html
 
I like to look at it this way: Apple isn't going x86, it's going CPU-agnostic. Since Apple has been running OS X on x86 for quite a while, I wouldn't be surprised if they have it running on an ARM CPU in a PDA. Or on Sun's big-iron processors.
 
nagromme said:
This is your argument for saying that Apple could pay more to make IBM get their job done, and yet not have to raise Mac prices as a result?

I didn't say "pay more", I said "pay". As in: pay them any money at all to have custom chips designed!

In other words, Apple wouldn't be making a profit on those product lines, but that's OK because they have enough money already? They'd be taking a loss on every Mac, yet not go bankrupt for years! I like it! But it's out of touch with business reality.

Excuse me, have you looked recently at how long Apple keeps selling a given CPU? The G4 is now six years old and Apple has surely sold some 15 million of them - at least! Are you honestly trying to tell me a one-time development cost would still matter?
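To put some purely hypothetical numbers on that (nothing Apple or IBM has ever disclosed): even a one-time design fee of, say, $50 million spread over 15 million chips comes to barely more than $3 per CPU sold.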
And how exactly would a one-time cost raise the price of EVERY Mac? Price increases only come from rising component costs - if there are price increases at all; it's been a while since I saw one in the IT sector (oh wait - there will be one soon, from an $80 CPU to a $150 CPU, at a certain well-known computer manufacturer! ;-).
Besides: do you even have a glimpse of a clue what Apple's profit margins are? They're in a range PC makers have wet dreams about at night! Apple could EASILY earn a lot less (as they have chosen to do with the Mac mini, for example!) and still have profits way, way beyond anything the PC OEMs are seeing!


Precisely. You are carefully picking and choosing which "facts" not to ignore--and the ones you are choosing are often irrelevant to the current reality.

More BS from you. I was talking about custom chip designs in general. YOU limited it to PC makers! So it's perfectly within my rights to bring it back to where it started and ignore your wrongly inserted qualifiers.
How is the simple fact that you pay someone to build you a custom chip "irrelevant to the current reality"? And what is "the current reality" anyway? Is reality due for a complete overhaul?

(Which is troll-like behavior.)

Nice trick. You eat your own words, and hence the other person is automatically a troll. Hmm, I wonder why you don't say a single thing about all the slips, broken promises and fudge-ups in Intel's roadmaps that I quoted!...

So, which of my qualifiers did you decide you could throw out because it's not relevant to Apple's situation?

The "PC-Makers" one. Because neither the Xbox nor the PS3 is a PC. And because this is also standard practice in other areas of electronics.

They ALL are.

No. Read above.

So you can't throw them out or your attempted point about Apple being too stingy to pay IBM is not made.

BS. If you add qualifiers to my original statement, it is perfectly within my rights to remove them and get back to what I originally said. You may try to steer the discussion into waters that suit you better, but those games won't work with me, dude!...

I'm not seeing a coherent argument to respond to--

Well, I can't help you there then. If you don't want to see it, there's nothing I can do about it.
Here's what I'm saying in bold letters, to help you understand:

Custom chip design is something you pay for.

In fact it's really simple; just _try_ to grasp it!...

and that tells me that even YOU don't take your posts too seriously.

Is that so? That's some impressive logic, really. Honestly: I don't have the slightest idea how you arrive at that conclusion, but it sure is entertaining!

I would recommend my fellow forum-goers share your attitude in that matter :) You sound deeply angry, but you're really just having fun. And fun is what Mac rumors should be about :)

Actually, I'm having a bit of fun with you, because much of what's said on here is so easily refuted that it's a bit like shooting sitting ducks...

I'm not having fun thinking about Apple switching to Intel. See my first post in this thread for why. Until I get answers to all of this, I'll remain furious. Why? Because I just don't like being lied to, and by now it's pretty obvious Steve Jobs is lying!...


Haha, that's nice. Who do you think pays me? Apple? Intel? I'm most curious to see what company you could possibly come up with! ;-)
 
Kai, I am saying this nicely, but you have an attitude problem and need to tone it down.

A FUDer doesn't need to get paid, and in more than 99% of cases probably isn't. The point of the linked article was that you're a "loser" if you don't get paid for it, as you have nothing to win or lose but your own ego.

The rock, paper, scissors comparison was pretty relevant to your case. You pick the artificial benchmarks that suit you best to make your point, but you know damn well that any serious comparison between the Athlon and the Pentium M with real application-based benchmarks covering the whole spectrum (not just FP) doesn't show the figures you are claiming to be representative of their performance.

I hope you take this advice to heart. We have an "ignore user" function on this forum, and if you persist in this behaviour I'm quite sure you will end up on more ignore lists than just mine.
 
minimax said:
Kai, I am saying this nicely, but you have an attitude problem and need to tone it down.

There's only one person here who has an attitude problem. He's usually wearing a turtleneck, and many, many people even keep bolstering his ego to make it worse.
Well, him and maybe the guys who call GamePC's benchmarks "unrealistic" and hallucinate something about "numbers pulled out of your *ss"...

A FUDer doesn't need to get paid, and in more than 99% of cases probably isn't. The point of the linked article was that you're a "loser" if you don't get paid for it, as you have nothing to win or lose but your own ego.

So when I point out serious flaws in Apple's strategy that nobody so far has been able to even remotely refute, I'm spreading FUD? That's an interesting way to avoid actually listening to what the other guy is saying. Pretty lame and childish, too. Think covering your ears with your hands and singing "Lalalala! I'm not hearing you, FUDer!"

I would call people who can't think for themselves and rely on Father Steve to do the thinking for them ("Apple knows what it's doing" - yeah, right: think Cube, Pippin, Newton, Copland, licensing and the whole shebang of wrong Apple decisions throughout its history!) "losers", but you might see things differently...

The rock, paper, scissors comparison was pretty relevant to your case. You pick the artificial benchmarks that suit you best to make your point, but you know damn well that any serious comparison between the Athlon and the Pentium M with real application-based benchmarks covering the whole spectrum (not just FP) doesn't show the figures you are claiming to be representative of their performance.

We're talking about the Paxville dual-core Xeon, which is the 970MP's current competitor, not a notebook CPU. Amazing you haven't even realized that by now... *I* chose Paxville as an example, so I set the rules. If you bring up the Pentium M now, YOU'RE doing the rock-paper-scissors thing to ME!

But: the difference from the rock, paper, scissors analogy on that page is that there are mobile CPUs, desktop CPUs and workstation/server CPUs, and each of them can be compared separately to its AMD and IBM equivalent (except that IBM has nothing in mobile), not to one single imaginary Intel chip (which would be the Honda Accord in the example!). Right now Intel is well behind on the desktop, completely destroyed on the workstation/server, and can only show a lead in notebooks. And we all know the real number crunching is done on notebooks, right... FYI: Apple's portables sell like hotcakes IN SPITE of the G4 being seriously outdated by now...
So summing up we have: 0:1 on the desktop for AMD, 0:2 on the workstation/server, and 1:0 for Intel in the mobile space. I'd say AMD definitely leads, 3:1!

btw: You might want to read this:

"First Dual-Core Pentium 4 a Rush Job, Intel Says - Design rushed out the door to beat AMD, Intel engineer says."

This is also highly interesting:

"Second-class Intel to trail AMD for years
[...]
Nathan Brookwood, an analyst at InSight 64, summed up the issues well in a research note issued this week.
"If Intel is ever to reclaim the performance lead from AMD, it must make the transition to an on-chip memory controller," he wrote.
"Intel slated Whitefield (a Xeon) and Tukwila (an Itanium) as its first processors to incorporate on-chip memory controllers. Tukwila will still use this approach, but Tigerton, Whitefield's replacement, will rely on a memory controller built into the chipset that supports the CPU, Intel's traditional approach, rather than a controller built into the CPU itself. Given the two year cycles that drive Intel's server roadmaps, this means that Intel will not be able to field a server processor with an on-board memory controller until 2009 at the earliest. Between now and then, we see little likelihood that Intel will be able to claim performance leadership."

[...]

As stated, Intel will struggle to match Opteron in the next 18 months but instead of rolling out Opteron killers in 2007, Intel will introduce more processors tied down by its aging architecture dependencies. The company does not have a realistic chance of besting Opteron on typical server benchmarks until the new chips arrive in 2009. By that time, AMD will have new four-core designs of its own and who knows what other innovations."

I hope you take this advice to heart. We have an "ignore user" function on this forum, and if you persist in this behaviour I'm quite sure you will end up on more ignore lists than just mine.

Feel free to ignore me if you choose to. But ignoring won't make the very real, very burning open questions about Apple's switch go away...
 
Kai said:
btw: You might want to read this:

"First Dual-Core Pentium 4 a Rush Job, Intel Says - Design rushed out the door to beat AMD, Intel engineer says."

Thanks for that info, geez, I didn't know that :rolleyes: :rolleyes:
You need to stop beating the dead horse called NetBurst.
 
Rock, paper, scissors is exactly it :D Comparing Apple's processor choice to other PC makers isn't relevant, but comparing it to game consoles that sell at a loss is? Sounds like what's "relevant" might be based more on what he WANTS to talk about. ;) Ah well, it was fun while it lasted.

I guess after the coming transition we'll see if Intel is as bad as he claims to expect :p
 
Please remember to keep things civil. Name-calling and abusive behaviour aren't good for you or the rest of us. Besides, if that's the real you, it might be a good idea to search for a better way of doing things.

If that isn't enough, there is not only an ignore function--there is a ban function--for these forums. :D
 
So where did our 7448 go?

I wish CNET had asked this Mayer guy why Freescale or Apple did not go ahead with the 7448 G4 in the last PowerBook revision. The 7448 G4 would have been a very nice upgrade and a last hurrah for the Moto/Freescale PowerPC on the Mac.

http://www.freescale.com/files/32bit/doc/fact_sheet/MPC7448FACT.pdf

Why did we not get this chip? Was Apple not willing to pay? Was Freescale not willing to commit to production? Would this chip have performed better than the first Intel chips in the first-generation MacIntels? I guess we will never know.
 
Little Endian said:
I wish CNET had asked this Mayer guy why Freescale or Apple did not go ahead with the 7448 G4 in the last PowerBook revision. The 7448 G4 would have been a very nice upgrade and a last hurrah for the Moto/Freescale PowerPC on the Mac.

http://www.freescale.com/files/32bit/doc/fact_sheet/MPC7448FACT.pdf

Why did we not get this chip? Was Apple not willing to pay? Was Freescale not willing to commit to production? Would this chip have performed better than the first Intel chips in the first-generation MacIntels? I guess we will never know.
Though its clock went up, its L3 was removed, meaning not much was gained. Its performance is about that of lower-clocked G4s with an L3 cache, and no, its performance won't touch the Yonah that's coming from Intel. It offered very little, if anything, for Apple.
 
Dont Hurt Me said:
Though its clock went up, its L3 was removed, meaning not much was gained. Its performance is about that of lower-clocked G4s with an L3 cache, and no, its performance won't touch the Yonah that's coming from Intel. It offered very little, if anything, for Apple.

The 7448 doubled the L2 cache to 1 megabyte: twice that of the current 7447 G4s and four times that of the L3-equipped G4s, which only had 256 KB of L2, albeit backed by 1-2 MB of much slower L3. I think you are confusing the 7448 with the 7447. The AltiVec registers are improved, the FSB is faster, the memory is faster, and the chip could have been clocked as high as 2 GHz, all while being even more energy-efficient and cooler.

Yonah is a good chip, but we may not see it until the middle of next year, and the 7448 could have tided us over until then. Also, I would not be surprised if high-end PowerPC chips trounce Yonah and whatever chips are used in the first wave of Intel Macs. While Rosetta supports AltiVec, it still won't be as fast as hardware AltiVec, and in the beginning most apps from Apple, and even more so from third parties, won't be optimized to truly flex x86 muscle, at least in the first year.
 
Aggamemnon said:
On the other hand, why should Apple invest in a platform they wish to move away from?
You've got the chronology backwards. Apple decided to ditch IBM as a result of IBM refusing to manufacture the chips Apple wanted (without a huge "investment" payment above and beyond the cost of buying the chips.)
 
Little Endian said:
The 7448 doubled the L2 cache to 1 megabyte: twice that of the current 7447 G4s and four times that of the L3-equipped G4s, which only had 256 KB of L2, albeit backed by 1-2 MB of much slower L3. I think you are confusing the 7448 with the 7447. The AltiVec registers are improved, the FSB is faster, the memory is faster, and the chip could have been clocked as high as 2 GHz, all while being even more energy-efficient and cooler.

Yonah is a good chip, but we may not see it until the middle of next year, and the 7448 could have tided us over until then. Also, I would not be surprised if high-end PowerPC chips trounce Yonah and whatever chips are used in the first wave of Intel Macs. While Rosetta supports AltiVec, it still won't be as fast as hardware AltiVec, and in the beginning most apps from Apple, and even more so from third parties, won't be optimized to truly flex x86 muscle, at least in the first year.
Where is this 7448? Please don't tell me it's only on paper.
 
Aggamemnon said:
IBM makes the G5.

Why should Apple invest in buying a new G4 when they have decided to abandon it?
OK. One of us has something confused.

As far as I know, Freescale has never asked Apple for "investment" money as a prerequisite to new processor development. If you're talking about the G4, I don't understand what "investment" you're referring to.

If you're merely asking why Apple would put a G4 into a new computer, keep in mind that the Intel transition is not going to happen overnight. Apple will upgrade some models soon, and some a year or more later, ultimately completing the transition before the end of 2007.

Although we are expecting the current G4 systems to be the first to go Intel, Apple has said nothing about this schedule. They might not all transition immediately. They may even keep one or two G4 systems in production until the very end of the Intel transition. If they do this, then a decision to use the most recent G4 variant available would not surprise me.
 
Assuming some G4 models won't go Intel right away, another reason not to use a newer G4 is if (just a possibility) Motorola is unable to deliver them in quantity. Sometimes that has been an issue for Apple: a chip exists, but orders can't be filled. Sticking with the same G4 might remove that question.

In any case, the question is about to become less important: when Apple's delayed by Intel, it will be at the same time as the rest of the industry!
 
shamino said:
OK. One of us has something confused.

As far as I know, Freescale has never asked Apple for "investment" money as a prerequisite to new processor development. If you're talking about the G4, I don't understand what "investment" you're referring to.

There was the AIM alliance (Apple, IBM, Motorola), which jointly developed the PowerPC for the desktop.
 
~Shard~ said:
Very interesting that he sold Jobs the G5 the first time he wanted to move to Intel. ;) With this, and the fact that OS X has been co-developed on Intel platforms since its inception, you can see where Jobs's head was at all this time. The lack of a portable G5 solution was probably the straw that broke the camel's back.
Yep. It looks like Steve was being very cautious while at the same time being willing to take a risk on the newer and better G5 architecture at the time.

Just goes to show that great leaders always have a plan B.

devman said:
Now this also ignores the very large crowd, though, that kept bleating about 64-bit simply because "it's bigger than 32", without really having any idea what they were asking for.
Isn't this the truth!

Buzzwords. It's all about buzzwords! :(

Flynnstone said:
I like to look at it this way: Apple isn't going x86, it's going CPU-agnostic. Since Apple has been running OS X on x86 for quite a while, I wouldn't be surprised if they have it running on an ARM CPU in a PDA. Or on Sun's big-iron processors.
You raise an interesting possibility.

What if Apple was working on more than just the PowerPC and x86 platforms?

Being platform-agnostic would allow Apple to switch easily in the future to whatever hardware option is best at the time. While no small undertaking, it would help secure their future regardless of who the processor/mobo leader is.

Sushi
 
With a "C" compiler and proper coding, the CPU doesn't really matter. Except for performance. One big factor that is essentially outside the realm of the compiler is "Endianness". But it appears that Apple has that licked.

Does OS X on x86 (developer) rely on the video card or GPU heavily? Is most or all done in software with a video buffer? If so, ports to a decent PDA or a Cray would be relatively little work.

If they support too many processors, I can see fat binaries getting really fat :D
 
sushi said:
Yep. It looks like Steve was being very cautious while at the same time being willing to take a risk on the newer and better G5 architecture at the time.

Just goes to show that great leaders always have a plan B.

Very well said. It's for reasons like this that I have a great deal of confidence in Jobs. Hopefully it's not that blasted Steve RDF coming into play again... ;) :cool:
 