
DakotaGuy

macrumors 601
Jan 14, 2002
4,226
3,791
South Dakota, USA
For all the people on this board who wish Apple had never gone with the G5 in the first place, know that we would all be computing with Pentium 4s right now. Is the P4 a better processor than the G5? I don't think so, but the way things go around this place anymore, many would say it is.
 

DeepIn2U

macrumors G5
May 30, 2002
12,824
6,878
Toronto, Ontario, Canada
@AidenShaw;

Whoever said that Apple's switch to Intel would possibly force two transitions on developers - one from OS X to 32-bit x86, the other to 64-bit x86?

This transition kit & announcement was supposed to prepare developers for summer 2006, right? If you're thinking of the first release of dual-core Intel chips in early January/February, Apple still has enough time, and that allows Intel some time to get EM64T (x86-64) chips out for summer, right?

Am I missing something here?
 

AidenShaw

macrumors P6
Feb 8, 2003
18,667
4,676
The Peninsula
nagromme said:
Would x64 run in some way on 32-bit Intel chips too? I hadn't heard that.

Or are you saying that within a few months, all PCs even on the low end (like the Mini) will have 64-bit processors? No more Dothans or Yonahs within a few months?
Even current Celerons are 64-bit today in the Intel world.

Dothan (and Yonah) are about the only 32-bit chips around (except for older 32-bit designs that are still around for compatibility or low end stuff).

P4s, Xeons, and the Merom laptop chip that is due around WWDC are all x64 chips (with fully compatible 32-bit mode).
 

AidenShaw

macrumors P6
Feb 8, 2003
18,667
4,676
The Peninsula
Prom1 said:
@AidenShaw;

Whoever said that Apple's switch to Intel would possibly force two transitions on developers - one from OS X to 32-bit x86, the other to 64-bit x86?

This transition kit & announcement was supposed to prepare developers for summer 2006, right? If you're thinking of the first release of dual-core Intel chips in early January/February, Apple still has enough time, and that allows Intel some time to get EM64T (x86-64) chips out for summer, right?

Am I missing something here?
The Apple Intel Developer's Transition Kit released last WWDC is a 64-bit Pentium system; the hardware to do 64-bit is already in developers' hands.

What's coming next summer are the 64-bit follow-ons to Yonah - 64-bit low-power chips.

I'm just confused by Apple's abandonment of 64-bit, after so much over-the-top, almost dishonest, hype of 64-bit as the future.

I wonder why Apple didn't:

- make the DTK version of OSX pure, true 64-bit - and have only 64-bit tools
- add a "short address" mode to the toolset, so that a program would load in low 32-bit addresses, and addresses could be stored as 32-bits in memory and expanded to 64-bits when used. This could make 32-bit to 64-bit ports much easier for those apps that don't need the expanded address space. (Alpha used this trick to run 32-bit NT on a chip without a 32-bit mode).
- Rosetta could still run 32-bit mode PPC programs (after all, it's an emulator), but use the x64 ISA for added performance
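
The "short address" idea from the second bullet, as a minimal C sketch. The arena, sp_store, and sp_load names are made up here, and a real toolchain would hide all of this behind the compiler - this just shows the trick of storing pointers in 32 bits and widening them on use:

Code:
#include <stdint.h>
#include <stdio.h>

static char arena[1 << 16];     /* all linked data lives in one arena */
static size_t arena_used;

typedef uint32_t shortptr_t;    /* a pointer stored as a 32-bit offset */

static shortptr_t sp_store(void *p)   /* compress on store */
{
    return (shortptr_t)((char *)p - arena);
}

static void *sp_load(shortptr_t sp)   /* expand to a full pointer on use */
{
    return arena + sp;
}

static void *arena_alloc(size_t n)    /* trivial bump allocator */
{
    void *p = arena + arena_used;
    arena_used += n;
    return p;
}

struct node {
    shortptr_t next;            /* 4 bytes in memory instead of 8 */
    int value;
};

int main(void)
{
    struct node *tail = arena_alloc(sizeof *tail);
    struct node *head = arena_alloc(sizeof *head);
    tail->value = 2; tail->next = 0;
    head->value = 1; head->next = sp_store(tail);

    struct node *n = sp_load(head->next);
    printf("%d -> %d\n", head->value, n->value);
    return 0;
}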

As it is, Apple has implicitly disavowed all of its 64-bit advertising and marketing, and will be forced to make its software partners do another port to 64-bit. They'll be giving Microsoft a 20% performance lead on the same hardware for as long as OSX is stuck in 32-bit.

The alternative - delaying the first MacIntels until Merom is out - would cost a few months and would still mostly meet the WWDC deadline. (Too bad The Steve said "By this time next year" again. :eek: )
 

nagromme

macrumors G5
May 2, 2002
12,546
1,196
AidenShaw said:
Even current Celerons are 64-bit today in the Intel world.

Dothan (and Yonah) are about the only 32-bit chips around (except for older 32-bit designs that are still around for compatibility or low end stuff).

P4s, Xeons, and the Merom laptop chip that is due around WWDC are all x64 chips (with fully compatible 32-bit mode).
That sounds like the very EARLIEST possible date for Merom. Do you have any recent info to suggest that Merom will BE that early? I still see recent reports that say they might not ship in quantity until early 2007. Don't get me wrong, I'd like them to be earlier :) I just haven't been seeing those expectations. Much less the kind of certainty that would be needed, if Apple was going to stake its whole laptop line on a G4-to-Merom scenario. If you have good reason to be that certain, then that makes me happy... for the sake of my Conroe PowerMac :)

In any case, WWDC is 7 months off, and I think Apple isn't necessarily wrong to get the G4 out of PowerBooks before then. (I wouldn't want a Celeron PowerBook though! Give me my dual cores--I can make do with 32 bits.)

Re OS X being slower than Windows due to the 64-bit issue: I won't be at ALL surprised if it's slower for OTHER reasons, regardless. It's a very different design, and some things will be slower or faster. I'll take OS X's design even if it IS 20% slower than Windows (on hardware faster than I'm used to anyway). In other words, I don't see raw benchmark speed as something Apple must target above all other factors. Apple CAN'T have ALL factors in their favor at once: best OS, top speed, AND shipping NOW. They have to compromise. A 32-bit compromise (YonahBook instead of waiting another 6-12 months on G4) isn't an automatic mistake in my view. I do understand the merits of your suggested compromise too--stick with G4 for at least another half a year. But for my needs, I hope it's not the compromise Apple makes.
 

bousozoku

Moderator emeritus
Jun 25, 2002
15,716
1,890
Lard
Abercrombieboy said:
For all the people on this board who wish Apple had never gone with the G5 in the first place, know that we would all be computing with Pentium 4s right now. Is the P4 a better processor than the G5? I don't think so, but the way things go around this place anymore, many would say it is.

The G5s are just fine.

I just wish that Apple had never decided to go with the G3 or G4 processors. AltiVec has some great points, but neither the G3 nor the G4 could hold a candle to the 604/604e in terms of double-precision floating point performance. It's there that the original PPC processors were exemplary and, with the exception of the 603/603e/603ev, were quite a bit better than any processors Intel could muster at the time.

The trouble is that Motorola wanted the embedded market to buy PPC processors, so they had to cut power usage to the bone. In the end, the G4 ended up being the worst of both worlds, with terrible power consumption while AltiVec was running and pathetic double-precision floating point performance.

I'm sure the manager on watch at the time has it boldly emblazoned on his resume though.
 

solvs

macrumors 603
Jun 25, 2002
5,684
1
LaLaLand, CA
AidenShaw said:
I wonder why Apple didn't:
They haven't done anything yet. Wait until the models are actually shipping, then you can complain. Plus, I've heard Leopard will completely change things (mostly for the better), so who says Tiger has to be anything other than what it is already? This is just a transitional period. Maybe we'll get 32 bit minis and iBooks, then 64 bit PowerBooks and upgrades across the line from there.

We need Intels as soon as we can get them (never thought I'd say that).
 

BlizzardBomb

macrumors 68030
Jun 15, 2005
2,537
0
England
Kai said:
Complete BS. The new iMacs are much, much quieter and don't even start their fans when you put them under load. Makes you wonder why, when the G5 is operating at its limit, right?

Then why is the new iMac clocked at 2.1GHz? An incredible 0.1GHz faster. If that were the only updated part of the new iMac, you wouldn't be able to tell the difference in real-world apps without a stopwatch or some benchmarking tools.

AidenShaw said:
I think that your response missed the context....

I wasn't saying not to switch - I was saying that waiting a few more months and switching to 64-bit x64 might make more sense.

Oh sorry, my misunderstanding. Well, when you think about it, this January launch date may not happen; instead the event could be used to demo the new Intel Macs, and the summer launch date may be the real one after all? But if it's January after all, I guess it's Apple's decision.
 

curmi

macrumors regular
Jun 5, 2000
150
14
Melbourne, Australia
Apple release G5 Macs.

Microsoft see this and see the momentum. They have to stop the momentum of Apple and OS X. No way can they let this G5 into a laptop - that would make the PowerBook look enticing to businesses.

They talk to IBM. "We want you to concentrate on G5s for consoles. We'll use you exclusively for the 360 - but you have to concentrate on us, not others".

IBM does the maths. Apple are just too small. Ta da.
 

Fukui

macrumors 68000
Jul 19, 2002
1,630
18
shamino said:
A lot of people seem to believe that 64-bit means 32-bit incompatibility. But it's not true. At least not in the AMD-style 64-bit model (which Intel is supporting in their newest chips.)
Thanks. Then maybe apple does know what its doing... :eek:
 

AidenShaw

macrumors P6
Feb 8, 2003
18,667
4,676
The Peninsula
shamino said:
A lot of people seem to believe that 64-bit means 32-bit incompatibility.
Actually, my argument is that 32-bit means 64-bit incompatibility.

(32-bit OSx86 source will need to be ported to 64-bit OSx64 source, and tested - and both 32-bit and 64-bit will need to be supported.)
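
To be concrete about why that's a port and not just a recompile, here's the kind of line that bites (an invented example, not from any particular app): under ILP32, int, long, and pointers are all 32 bits, but under the LP64 model that 64-bit Unixes use, long and pointers grow while int stays put.

Code:
#include <stdio.h>

int main(void)
{
    long x = 0;

    /* Legal-looking on ILP32; on LP64 the cast throws away the top
       32 bits of the address (and the compiler will warn about it). */
    int truncated = (int)&x;

    /* 0xFFFFFFFF, NOT all-ones, on LP64: ~0u is computed as a
       32-bit unsigned int before being widened to long. */
    unsigned long mask = ~0u;

    printf("sizeof(int)=%zu sizeof(long)=%zu sizeof(void*)=%zu\n",
           sizeof(int), sizeof(long), sizeof(void *));
    printf("mask=%lx truncated=%x\n", mask, truncated);
    return 0;
}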

shamino said:
As for making everything 64-bit, it could be done, but there's really little point to it. Unless your program requires large registers or more than 4G of address space, moving to 64-bit is counterproductive.
As has been pointed out - true for the G5, but not true for x64.

x64 mode has twice as many 128-bit SSE registers, and nearly three times as many usable general-purpose integer/address registers. This can give a boost to most programs due to the ability to hold more data in registers and spend less time shuffling between registers and memory.
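
A tiny illustration - the function is invented purely for the comparison; compile it both ways and look at the assembly yourself:

Code:
/* cc -O2 -m32 -S regs.c   vs.   cc -O2 -m64 -S regs.c
   In 32-bit mode there are only 8 general-purpose registers and all
   six arguments arrive on the stack; in x64 mode r8-r15 (and
   xmm8-xmm15) exist, and the SysV ABI passes all six arguments in
   registers (rdi, rsi, rdx, rcx, r8, r9), so this function never
   has to touch memory at all. */
long dot6(long a, long b, long c, long d, long e, long f)
{
    return a * b + c * d + e * f;
}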

shamino said:
Apple's solution with the G5 (support both 32- and 64-bit code) is best. Let the developers choose which model to code their apps to. (But I do hope they eventually ship 64-bit equivalents to all of the APIs, so 64-bit code doesn't need to shunt all UI calls though 32-bit code.)

Sun used a similar solution for Solaris - a 64-bit kernel that supports both 32- and 64-bit apps, so it's not like this is a terribly unusual concept.
Apple's current 64-bit support is a joke. For 64-bit "support", you need to re-architect your application into separate programs using a client-server model (32-bit GUI client, 64-bit compute server).
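
The shape of that split, as a rough POSIX-style sketch - the compute64 helper binary is invented, and a real app would use Mach ports or sockets rather than a bare pipe:

Code:
#include <stdio.h>
#include <sys/types.h>
#include <sys/wait.h>
#include <unistd.h>

int main(void)
{
    int to_child[2];
    if (pipe(to_child) < 0) { perror("pipe"); return 1; }

    pid_t pid = fork();
    if (pid == 0) {                       /* the 64-bit compute server */
        dup2(to_child[0], STDIN_FILENO);
        close(to_child[0]);
        close(to_child[1]);
        execl("./compute64", "compute64", (char *)NULL);
        perror("execl");                  /* reached only on failure */
        _exit(1);
    }

    /* the 32-bit GUI client streams work requests down the pipe */
    close(to_child[0]);
    const char msg[] = "render frame 1\n";
    write(to_child[1], msg, sizeof msg - 1);
    close(to_child[1]);

    waitpid(pid, NULL, 0);
    return 0;
}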

Remember that Apple shipped an OSX update that completely disabled 64-bit support - and nobody noticed for a while (certainly nobody in QA or beta testing noticed)! Really shows how popular it is.

The only extended memory support that is worse than Tiger's is the old AWE (Address Windowing Extensions) on Windows 2000 that forced the programmer to manually remap a section of 32-bit address space to different regions of a larger physical address space.
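
For anyone who never had the pleasure, this is roughly what AWE code looked like - a bare sketch with error handling left out; the process also needs the "Lock pages in memory" privilege before any of it works:

Code:
#include <windows.h>

int main(void)
{
    SYSTEM_INFO si;
    GetSystemInfo(&si);

    /* grab 256 physical pages (1 MiB with 4K pages) */
    ULONG_PTR nPages = 256;
    ULONG_PTR pfns[256];
    AllocateUserPhysicalPages(GetCurrentProcess(), &nPages, pfns);

    /* a small 32-bit virtual window that physical pages rotate through */
    void *window = VirtualAlloc(NULL, nPages * si.dwPageSize,
                                MEM_RESERVE | MEM_PHYSICAL, PAGE_READWRITE);

    MapUserPhysicalPages(window, nPages, pfns);   /* map pages in...   */
    /* ...touch the memory through `window` here...                    */
    MapUserPhysicalPages(window, nPages, NULL);   /* ...and back out   */

    FreeUserPhysicalPages(GetCurrentProcess(), &nPages, pfns);
    return 0;
}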
 

Kai

macrumors member
Mar 13, 2003
39
0
Germany
ksz said:
Yeah, right. There are several pages of benchmarks here published by Tom's Hardware.

That's a game benchmark. We're talking about dual-core CPUs - in particular Paxville, which is not in there. So what are you talking about?

AMD's processors have a nice performance edge almost across the board, but the difference is "not very significant". Your link shows 2x improvement over Intel, but that's not evident in Tom's benchmarks. You can always twist benchmarks just as you can twist a "study" to suit your foregone conclusions.

Well, this _may_ be due to the fact that Tom's Hardware is notoriously very Intel/nVidia-friendly, more than is acceptable, which is also the reason why Anand split off, IIRC!...
It may also be due to the fact that dual dual-core is a totally different ballpark from single dual-core (see Tom's Hardware), because Intel is especially bad at dual dual-core (in contrast to the Opteron or 970MP) - they squeeze EVERYTHING over one FSB! Intel's last NetBurst Xeons (Dempsey), coming mid-2006, have a Cinebench score of 883 on four 3.2 GHz cores. The dual 2.7 GHz G5 does 709, while the quad 2.5 GHz G5 does 1150 and a dual Opteron 280 (2.4 GHz) does 1104!

What I quoted is a friggin' Apache benchmark. If that's not a proper measure of an enterprise-level pro CPU, I don't know what is!
You may also look one page earlier to see some media encoding benchmarks of Paxville, and one page after to see ScienceMark benchmarks. They show the same picture: Intel is getting its butt handed to it by AMD!

Intel developed two generations of the Pentium M as well as an ultra-low voltage version of the chip. They introduced HyperThreading and will have Virtualization Technology on the desktop with Yonah.

Wow, so Yonah is a desktop CPU now? I never knew - when did Intel change this? ;-)
Hypnothreading is NOT, I repeat, NOT in the Pentium M. Neither is it in Yonah. Furthermore, Intel has stayed mum on the topic even with Merom! The word on the street is that the Pentium M uses its resources so efficiently that Hypnothreading yields no improvements, so Intel will probably abandon it...

Intel has also pushed aggressively on new process technologies and have had better success than IBM.

Aha, so that's why Dell had to drop the Prescott from their lineup when it was released, until further notice?
Gosh, I remember the HUGE uproar when Apple downclocked the G4 by 50 MHz at release because Moto couldn't get 500 MHz parts done in volume! Strangely enough, nobody seems to remember Intel's Prescott disaster with Dell!...

To their credit, IBM has produced noteworthy advances (alone or in partnership) in process technology including the copper damascene process, strained silicon, SOI, double-gated FinFETs, improved junction properties with high-k dielectrics, etc. etc.

...and FC-BGA, and SSDOI, and copper, and dual-core. In retrospect, IBM pioneered and is responsible for most innovations in chip manufacturing (and design) over the last 7 years. Intel has only ever been at the forefront of die shrinks. And even that was only for a few months with 90nm (hard-earned months, if you ask Dell!) - let's see how it turns out with 65nm!

The problem the industry as a whole encountered 2-3 years ago was power management and high leakage currents. These two issues brought conventional scaling to a virtual standstill as the average power density increased to about 13 Watts / cm^2. A steam iron, by comparison, dissipates 5 Watts per cm^2. The industry as a whole rode the CMOS power curve up to its very limits, and is actively searching for new materials and techniques to continue to increase both performance and packing density while managing heat dissipation and leakage. This is a very difficult problem, which is a key reason for the paradigm shift away from raw Mhz to increased function.

In effect, if you cannot continue to jam more speed, then you must jam more features. This is the driving force behind dual and multi core processors, VT, additional FPUs, improved vector units, more L1 cache, etc. More features are going on-chip because the customer is not going to pay top dollar without a good reason. Clock speed has been the historical justifier for top dollars, but that's changing.

Here, let me sum up what you're saying in one sentence: "You're right - singling out IBM's problems was wrong of me."

I don't fault IBM for technological incompetence. I do fault them for problems with execution in the time to ramp yield, in the time to introduce more differentiation based on features, and in the time to introduce low-power mobile models.

Low-power models are a CUSTOM DESIGN, because IBM themselves have no use for them! Everyone but Apple knows that if you have a custom design done for you, you have to pay for it. Sony/Toshiba did it. Even M$ did it, and they're friggin' M$! Only El Turtleneck didn't want to pay, so here we are, looking forward to the mediocrity that is Intel!

Btw: IBM isn't a company that thrives on chip sales alone, like Intel or AMD. Back when they introduced the G5, it was already clear that with one single OEM holding 3.5% market share worldwide, we would NOT see new generations and variations of chips released on a quarterly basis, as is the case with Intel and AMD! Be realistic!
 

devman

macrumors 65816
Apr 19, 2004
1,242
8
AU
AidenShaw said:
Actually, my argument is that 32-bit means 64-bit incompatibility.

(32-bit OSx86 source will need to be ported to 64-bit OSx64 source, and tested - and both 32-bit and 64-bit will need to be supported.)


As has been pointed out - true for the G5, but not true for x64.

x64 mode has twice as many 128-bit SSE registers, and nearly three times as many usable general-purpose integer/address registers. This can give a boost to most programs due to the ability to hold more data in registers and spend less time shuffling between registers and memory.

Right, so let's be clear about cause and effect here. This is not about 64-bit; it's really about being register-starved. One implementation has mixed the two things together.

AidenShaw said:
Apple's current 64-bit support is a joke. For 64-bit "support", you need to re-architect your application into separate programs using a client-server model (32-bit GUI client, 64-bit compute server).

I disagree. It's not a joke. As you point out above in your post, on the G5 things are different. On the G5, 64-bit was truly about address space and the very specialised scientific apps. Everyone else would be penalised by 64-bit pointers to windows and widgets and, and, and...
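
To put a number on that penalty - the same (invented) widget struct, built both ways:

Code:
#include <stdio.h>

struct widget {
    struct widget *parent;    /* 4 bytes on ILP32, 8 on LP64 */
    struct widget *next;
    struct widget *children;
    void          *callback;
    int            x, y, w, h;
};

int main(void)
{
    /* typically 32 bytes built 32-bit, 48 bytes built 64-bit -
       so fewer widgets fit in each cache line for the same work */
    printf("sizeof(struct widget) = %zu\n", sizeof(struct widget));
    return 0;
}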

Thus on the G5, Apple's approach to 64-bit was not a joke. It was a way to give the people who had such specialised needs access to 64-bit without penalising everyone else (the overwhelming majority).

Now this also ignores the very large crowd that kept bleating about 64-bit simply because "it's bigger than 32", without really having any idea what it was they were asking for.

AidenShaw said:
Remember that Apple shipped an OSX update that completely disabled 64-bit support - and nobody noticed for a while (certainly nobody in QA or beta testing noticed)! Really shows how popular it is.

Well then maybe it isn't such a big deal... :rolleyes:
 

ksz

macrumors 68000
Oct 28, 2003
1,677
111
USA
Kai,

You're coming across as a man with a lot of emotional baggage to carry. What is the point you're making? That Jobs should have paid IBM handsomely to keep innovating and producing the PPC line? Apparently Intel is too mediocre and IBM is in need of lots of cash (and a lot of good luck in their ability to bring up yields).

Your point?
 

AidenShaw

macrumors P6
Feb 8, 2003
18,667
4,676
The Peninsula
devman said:
Right, so let's be clear about cause and effect here. This is not about 64bit, it's really about being register starved. An implementation has mixed the two things together.
I think that I've been pretty clear that the performance improvements due to enhancements in the x64 ISA are important across the board, even to programs with no need for > 2GiB of virtual address space.

By not having 64-bit support, OSx86 is handicapped by 10-20% (typical) compared to 64-bit Windows/Linux.

In addition, there's the prospect of a disruptive transition to OSx64 in the future.
 

minimax

macrumors 6502
Feb 9, 2005
351
0
ksz said:
Yeah, right. There are several pages of benchmarks here published by Tom's Hardware. AMD's processors have a nice performance edge almost across the board, but the difference is "not very significant". Your link shows 2x improvement over Intel, but that's not evident in Tom's benchmarks. You can always twist benchmarks just as you can twist a "study" to suit your foregone conclusions.

And ALWAYS take artificial benchmarks with a teeny weeny drop of salt.
It's funny how people say the Pentium M is so much faster than the G4 and point to the infamous Cinebench results on Barefeats (how could a benchmarking site be less professional, btw?)

1) the G4 is rather weak in the FP that Cinebench tests, but much stronger in vector and integer
2) artificial benchmarks in general inflate performance differences out of proportion, i.e. a 50% difference might well be 20% or less in reality (and benchmarks based on real applications show this very clearly)
2b) a newer processor is always optimised for existing benchmarks, so you need to use a benchmark that is newer than the processors being tested *
3) both are on completely different platforms

I would not be too surprised if the G4 actually holds up pretty well against the Yonah in real benchmarks on the same platform (OSX PPC vs OSX x86) across the spectrum.

* 2b might explain 2

/off topic
 

nagromme

macrumors G5
May 2, 2002
12,546
1,196
Any word on my question above, re getting Merom-based 64-bit Intel chips into ALL computer models, from the low to the high, inside of a few months (not a best-case time, a certainty Apple can count on), without low-end machines costing more than today?

Because otherwise, the people saying "wait for Merom and stay with G4" don't have a case. I'd love to be convinced of that optimistic timetable for a full range of Merom-based chips, low to high, portable to desktop. A range that includes the Mac Mini and the iBook, not just the PowerBook.


Kai said:
Everyone but Apple knows that if you have a custom design done for you, you have to pay for it. Sony/Toshiba did it. Even M$ did it, and they're friggin M$! Only El Turtleneck didn't want to pay, so here we are looking forward to the mediocrity that is Intel!
I'll let you research Intel's upcoming chip platforms yourself (they're not mediocre), but regarding Apple paying IBM to make the chips they need--like high-speed laptop G5s... Yes, of course if Apple threw enough money at IBM, in theory anything could be done. BUT:

1) IBM had a history of saying something could be done and then being wrong. So then Apple accepts being late AND having to pay more than IBM said? When IBM told Apple 3GHz in a year, I'm sure they didn't say "but we might ask for more money than you can afford to pay." They thought it could be done more cheaply and quickly than it could. Not a good business partner--unless of course you sell consoles which a) only need a new CPU every few years and b) are expected to be sold at a loss anyway.

2) If Apple DID pay IBM the extra amounts required to help IBM meet its promises, then Macs would cost huge amounts more. (And STILL be released late when IBM misses its deadlines.) Is that a solution?

So maybe it's not IBM's "fault" that they can't provide Apple with a solution for the future (only their fault that they incorrectly thought they could). But Apple DOES still have to look for a solution. The Pentium 4 isn't it. Motorola isn't it. But Intel's future Pentium M-derived chips ARE it.

Can you name any current PC maker that uses custom-made processors designed just for their unique computers, pays the development costs to the chip-maker, and still manages to be priced competitively and sell at the same profits other PC companies make? If not, then who is "everybody but Apple"?

It sounds like you're trolling--which I don't think is the case, I think you're just really emotional about this issue. I understand: PowerPC really did have potential, and it's a shame there was no way to realize it without making Macs cost a lot more. Step back and look into all the realities of this transition and you will feel much better.


minimax said:
I would not be too surprised if the G4 actually holds up pretty well against the Yonah in real benchmarks on the same platform (OSX PPC vs OSX x86) across the spectrum.
But you'd have to compare DUAL G4s. Yonah has dual cores, and every OS X app can benefit from that.

Some apps are multithreaded to take best advantage of it, but I'm led to believe that ANY app will get some boost, because the OS will run the foreground app on one CPU and all the system functions--and other current apps--on the other CPU. So even a single-CPU app gets a CPU to itself, which never happens on a single G4 that's also running the OS. Then add the fact that real multi-processing apps will be increasingly common.
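
That's all "multithreaded to take best advantage" really means - in miniature, something like this toy C sketch (not Apple's actual APIs; compile with -pthread):

Code:
#include <pthread.h>
#include <stdio.h>

#define N 100000000LL

struct range { long long lo, hi, sum; };

static void *sum_range(void *arg)     /* each core sums half the work */
{
    struct range *r = arg;
    long long s = 0;
    for (long long i = r->lo; i < r->hi; i++)
        s += i;
    r->sum = s;
    return NULL;
}

int main(void)
{
    struct range a = { 0, N / 2, 0 };
    struct range b = { N / 2, N, 0 };
    pthread_t t;

    pthread_create(&t, NULL, sum_range, &a);  /* runs on the second core */
    sum_range(&b);                            /* this thread takes the rest */
    pthread_join(t, NULL);

    printf("sum = %lld\n", a.sum + b.sum);
    return 0;
}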
 

shamino

macrumors 68040
Jan 7, 2004
3,443
271
Purcellville, VA
Kai said:
Hmm, i kinda wonder what that big chunk on the Itanic-Die labelled "IA32" is for then! ;-)
Don't know, but it's certainly not for x86 compatibility. Itanium's total incompatibility with x86 software is what killed it as a viable product.

Kai said:
What i quoted is a friggin Apache-Benchmark. If that's not a proper measure of an Enterprise-Level Pro-CPU i don't know what is!
And most consumers don't run enterprise web servers on their desktops.

If you're interested in running a server, which will be sitting in a machine room, you don't need a Mac with all of its UI overhead. The applications you're holding up as a benchmark are the kind best suited to Linux PCs (or larger UNIX systems from IBM, Sun, and SGI), not anything running Mac OS.

As others have said, you're picking and choosing unrealistic benchmarks in order to make your point.
 

Hiroshige

macrumors member
Mar 31, 2004
86
0
Michel Mayer says he sold Steve Jobs the G5 - fine.
But he is the president of the company that is still producing an outdated G4 at 130 nanometers when Intel and AMD and IBM have long since stepped down to 90, and Intel is weeks away from going to 65.
As I said the day of the Intel announcement, Jobs should have put Mayer on the video screen and said, "You have failed me for the last time, Michel Mayer," and done a Force choke on him.
 

minimax

macrumors 6502
Feb 9, 2005
351
0
nagromme said:
But you'd have to compare DUAL G4s. Yonah has dual cores, and every OS X app can benefit from that.

Some apps are multithreaded to take best advantage of it, but I'm led to believe that ANY app will get some boost, because the OS will run the foreground app on one CPU and all the system functions--and other current apps--on the other CPU. So even a single-CPU app gets a CPU to itself, which never happens on a single G4 that's also running the OS. Then add the fact that real multi-processing apps will be increasingly common.

No, you don't, as there will also be single-core Yonahs. It is likely to the point of being almost certain that Apple will use those for the Mac mini and iBook.
 

bousozoku

Moderator emeritus
Jun 25, 2002
15,716
1,890
Lard
devman said:
...
I disagree. It's not a joke. As you point out above in your post, on the G5 things are different. On the G5, 64-bit was truly about address space and the very specialised scientific apps. Everyone else would be penalised by 64-bit pointers to windows and widgets and, and, and...

Thus on the G5, Apple's approach to 64-bit was not a joke. It was a way to give the people who had such specialised needs access to 64-bit without penalising everyone else (the overwhelming majority).

Now this also ignores the very large crowd that kept bleating about 64-bit simply because "it's bigger than 32", without really having any idea what it was they were asking for.
...

I don't usually agree with AidenShaw, but Apple went down a haphazard path toward supporting 64-bit computing. Other than the way they enabled 64-bit memory addressing (on a 42-bit address bus), they'll have to re-think their techniques later, as most of them have come and gone within other UNIX-based operating systems.

It was all about disturbing the fewest people and spending the least cash to do it. The choices were many:

  • Downplay the fact that Mac OS X doesn't truly support the G5
  • Create a 64-bit version of Mac OS X to support the G5, abandoning 32-bits
  • Turn on bridge mode by itself
  • Turn on bridge mode and use old techniques to support 64-bits

Obviously, they did the last of these, but they also do a bit of the first along with it. It's not a joke, but remembering how they supported PowerPC processors in System 7, Mac OS 8, and Mac OS 9, it could be a very long time until they take things seriously.
 

nagromme

macrumors G5
May 2, 2002
12,546
1,196
minimax said:
No, you don't, as there will also be single-core Yonahs. It is likely to the point of being almost certain that Apple will use those for the Mac mini and iBook.
I agree, on the low-end that's what Apple is almost certain to use. Very possibly preceded by Dothan Pentium Ms, since Yonah2 is expected before Yonah1. Or Apple might just wait for Yonah1. Not every Mac will get dual cores.

The need to move on from the G4 is more urgent in the PowerBook line, and that's where the early excitement will be: Apple's first dual-core laptops.
 

Sunrunner

macrumors 6502a
Nov 27, 2003
600
2
What this story really indicates is that the long-term co-platform development of OS X was for more than "just in case". I think Steve likely saw the direction he wanted to take Apple's product line, and the direction Intel was taking its processor roadmap had some commonality. Also, with the possibility of multi-OS hardware, what better way to make true inroads into eroding Microsoft's market share?
 

SiliconAddict

macrumors 603
Jun 19, 2003
5,889
0
Chicago, IL
Gah.... people are pulling some of these specs out of their [bleep]. Aiden, where are you getting 10-20%? Seriously. And 32-bit code can sure as hell run in a 64-bit environment, assuming you have OS support. I've been running 32-bit apps on the Athlon 64 at work that is running Windows XP 64-bit edition. No problems at all.
You act as if Apple hasn't taken any of this into consideration. Do you really think that Apple wasn't aware of Intel's roadmap when they made the plunge?
Anyone care to take bets on how Leopard (prob. coming in early spring 2007) will support 32/64-bit software? What's that you say? You have no idea what's in store for Leopard? OK then.
It's a moot point anyway for all but the most hard-core apps that are practically coded on the metal. Most apps (all?) converted over to x86 are being migrated to Xcode. As such, don't you think that Xcode is going to be/is ready for the migration between 32-bit and 64-bit?

Again people are pulling crap out of thin air without giving Apple the benefit of the doubt. Apple has done this type of migration before. They know what they are doing. Sit back and chill for a few months. Guaranteed that some of you will be eating your posts in the end.
 

SiliconAddict

macrumors 603
Jun 19, 2003
5,889
0
Chicago, IL
minimax said:
No you don't, as there will also be single core Yonah's. It is very likely to the point of almost certain that Apple will use those for the mac mini and ibook.

There is still some contention as to when single cores will be shipping. I've read everything from January '06 to mid-summer '06. What I think can be agreed upon is that when Apple's high-end mobile wares go to Merom, there is a good possibility that Apple's low-end systems such as the iBook and Mac mini will go to dual-core Yonahs. (Assuming the price drops, that is.) I think, NOTE: think, Apple's ultimate goal is to get dual cores and/or dual CPUs across the board at some point. Also keep in mind that if it's true that the Mac mini may be going entertainment center on us, that could necessitate a dual core. H.264 isn't exactly processor-friendly even at 720p. Not to mention 1080p, which would chew up and spit out a Mac mini.
 