Analog Kid said:
64bit workstations and full 64bit OS's are only used in very, very specialized applications: ones that need a lot of memory.
Well, Sun ships Solaris as a 64-bit OS that runs on all UltraSPARC machines (which are the only kind they've sold for many years, and all of which are 64-bit). But it does not require you to run 64-bit apps - in fact, Sun recommends that most apps remain 32-bit unless they have a good reason to go 64-bit (like large memory, large-integer arithmetic, ANSI C access to large files, or large system constants).
But I don't know if you consider Solaris to be a full 64-bit OS or not.
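For the curious, the difference between the two builds mostly comes down to the data model. The same C source compiles either way - 32-bit by default, or 64-bit with Sun cc's -xarch=v9 flag, if I remember the flag right - and a trivial program like this shows what actually moves:

    #include <stdio.h>

    int main(void)
    {
        /* Solaris is ILP32 for 32-bit builds and LP64 for 64-bit
           builds: pointers and longs grow to 8 bytes, int stays 4. */
        printf("sizeof(int)    = %u\n", (unsigned)sizeof(int));
        printf("sizeof(long)   = %u\n", (unsigned)sizeof(long));
        printf("sizeof(void *) = %u\n", (unsigned)sizeof(void *));
        return 0;
    }

A 32-bit build prints 4/4/4; a 64-bit build prints 4/8/8. That's really why Sun says to stay 32-bit unless you need the bigger pointers (large memory), the bigger longs (large-integer arithmetic), or the large-file interfaces.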
Analog Kid said:
My work uses a mix of 32bit and 64bit machines. We've got applications that run for weeks at a time processing data. Whenever possible we run those on 32bit machines because they're faster. We only run on 64bit machines when the data model exceeds the memory limit of the 32bit machines.
Just out of curiosity, are these 64-bit machines Sun workstations? And if they are, have you tried simply compiling the code as 32-bit to run on those workstations? Is 32-bit code on a 64-bit Solaris actually slower than 32-bit code on 32-bit Solaris?
Analog Kid said:
Look at what they did with the "fat binaries" to transition from the 68000 to the PowerPC-- talk about jumping through hoops to support outdated hardware!
Actually, fat binaries were trivial. 68K apps store their code in CODE resources in the resource fork. PPC apps (are supposed to) store their code in the data fork (the way apps on most other operating systems do). To make a fat binary, you just compile the code twice - once for 68K and once for PPC - then store the PPC code in the data fork and the 68K code in the resource fork. At launch time, a 68K system will only look in the resource fork, so it finds the 68K code; a PPC system will look in both places, trying the data fork first (via a 'cfrg' resource, if I remember right), so it finds the PPC code.
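In other words, the on-disk layout looks roughly like this (a simplified picture from memory, not an exact dump):

    MyApp                  (one file, two forks)
      data fork:      PPC code (a PEF container)
      resource fork:  CODE 0, CODE 1, ...  (68K code segments)
                      'cfrg' 0             (points at the PPC code)

Each kind of system simply reads the part it understands and ignores the rest.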
OS X's package-based application system makes this even simpler, though. In this model, each application is actually a directory tree of files. Among the files is the executable image. It is possible (even simple) to put multiple executable image files in the package. The OS looks for one (in the directory named "Contents/MacOS") and launches it, ignoring the others.
So, if Apple wants to allow 32-/64-bit fat binaries (as they probably should), they can simply define a new folder name (maybe "MacOS64"). A 32-bit system will ignore this folder. A 64-bit system will look there in addition to "MacOS" to find executables. (The choice of which to check first could be made configurable on a system-wide or per-application basis, although it's probably not necessary. Presumably, developers wouldn't ship a 64-bit executable image in an application unless the app has a legitimate need to be 64-bit.)
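To make it concrete, the launch logic would be something like this little C sketch - hypothetical, of course, since "MacOS64" is my invented folder name and none of this is real launcher code:

    #include <stdio.h>
    #include <sys/stat.h>

    /* Does a file exist at this path? */
    static int file_exists(const char *path)
    {
        struct stat st;
        return stat(path, &st) == 0;
    }

    /* Pick which executable image to launch from an app package.
       "MacOS64" is the hypothetical 64-bit folder proposed above;
       a 32-bit system never looks there, so an old system just
       sees the normal 32-bit binary in Contents/MacOS. */
    static const char *pick_executable(int system_is_64bit)
    {
        static const char *paths_64[] = { "MyApp.app/Contents/MacOS64/MyApp",
                                          "MyApp.app/Contents/MacOS/MyApp" };
        static const char *paths_32[] = { "MyApp.app/Contents/MacOS/MyApp" };

        const char **paths = system_is_64bit ? paths_64 : paths_32;
        int count = system_is_64bit ? 2 : 1;

        for (int i = 0; i < count; i++)
            if (file_exists(paths[i]))
                return paths[i];
        return NULL; /* no launchable image found */
    }

    int main(void)
    {
        const char *exe = pick_executable(1);
        printf("would launch: %s\n", exe ? exe : "(nothing)");
        return 0;
    }

The nice property is the same one the old fat binaries had: a 32-bit system never even looks at the new folder, so nothing breaks for existing users.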
Analog Kid said:
When dual cores are ready, that will be the distinguishing feature between Powerbooks and iBooks.
Until dual-core chips become cheap and abundant, that is. Then the iBooks will get them too, and some other feature will distinguish the two lines.
Note how the dividing line used to be G3/G4. Then the iBook/G4 came out, and the differences became pretty minor. (Today, there isn't much difference between an iBook and a 12" PowerBook, for instance.)
Analog Kid said:
Powerbooks will continue with the dual G4 until memory limitations become a problem for the majority of users-- not the folks who want bragging rights, but the folks who make smart buying decisions based on all the factors in their portable. When that happens we'll see a move to a dual 64bit core. Maybe from IBM, or maybe the e700 series if it materializes.
Or until IBM gets their yields up (and temperatures down). Sometimes you upgrade a machine simply because it doesn't cost that much to do so.
Of course, by then, the PowerMacs will have dual-core G5s, so there will still be a distinction between the lines.