johnnyjibbs said:
For that to happen they need everything to move to G5 first :p :D

That's my point. Slow the OS cycle down so the hardware has time to go all 64 bit thus allowing the OS to go strictly 64 bit. This instead of several dual 32/64 bit OS releases. Of course, this is all based on the assumption that it is difficult or at least storage intensive to develop/deploy a dual 32/64 bit OS.
 
Rower_CPU said:
AS-

I think what SiliconAddict is saying is that Apple will most likely have their 32-bit apps rewritten for 64-bit when they release 64-bit OS X - it's a logical expectation. That's what "ready to go" means.

Exactly. Apple is in the fortuitous position of readying everything for one massive launch. I'm well aware that the primary selling point of a 64-bit system is RAM. But in the case of the G5, simply compiling for the G5 has shown a dramatic increase in performance. (Sorry, I can't remember the exact numbers.) This is what most people should be interested in, not the 64-bitness.

Beyond that, I stand WAY corrected. I've read previews and read Paul T's site, and in every case there has been ZERO mention of a 64-bit version of Longhorn, which I consider odd. Why isn't this fact being marketed?
 
Sped said:
That's my point. Slow the OS cycle down so the hardware has time to go all 64 bit thus allowing the OS to go strictly 64 bit. This instead of several dual 32/64 bit OS releases. Of course, this is all based on the assumption that it is difficult or at least storage intensive to develop/deploy a dual 32/64 bit OS.

Keep in mind that even when they do move to a 64-bit G5 version of the OS, they may very well have a 32-bit G3/G4 version being released as well. There are a lot of people out there still running G3s, and God knows how many on G4s. They aren't going to go out and buy a new system simply because Apple says to.
I'm guessing they are going to have a migration period where Apple offers both 32- and 64-bit versions of the OS, but it's not going to be forever. Maybe... what? 2-3 revisions? :confused:
 
SiliconAddict said:

Ok, then your statement was fine except for saying that Apple "will have" it ready out-of-the-box. I read your wording to say that the current boxes are 64-bit capable.

On the other hand, the majority of applications have no need for more than 4 GiB of RAM. It would be a waste of effort to port those to 64-bit if they'll run full speed in 32-bit mode.


SiliconAddict said:
Beyond that, I stand WAY corrected. I've read previews and read Paul T's site, and in every case there has been ZERO mention of a 64-bit version of Longhorn, which I consider odd. Why isn't this fact being marketed?

Maybe it hasn't been mentioned because it's so obvious.

Microsoft is making 3 versions of XP (and Server) from the same codebase today (32-bit x86, 64-bit x86-64, and 64-bit IA64).

It would be safe to assume that they'll do the same with the next version of the OS, especially since 64-bit will be more likely to cross over into the mainstream during the lifetime of Longhorn.

(By "mainstream" I mean that a desktop system with more than 4 GiB of RAM will be fairly common, and that at least a few important desktop apps will be needing that much RAM for a single instance of the application.)
 
Windowlicker said:
I can't really even think of many reasons to upgrade from Panther... this system fits all my needs and more. So I could pretty much say they've made it. Now, if I get a major upgrade every 2 years or so I'll be glad.
That is easy... SPEED. More gains can be had in software than in a 200MHz speed bump for a processor.

That, coupled with the shift to 64-bit architecture in the next few years.
 
Hector said:
SWC

The PC vendors pay Microsoft for OEM copies of Windows, and they pass that cost on to the customer. Was there anything incorrect in my post? No, so what was with the angry tone?

Yes, I am aware that most self-builders pirate Windows or run Linux.


I don't mean to have an angry tone, but my point was that you say Windows costs so much for self-builders, when you don't even have that option with Apple - you're stuck paying their inflated prices (which I do, and I enjoy their products). But you can't really say that building your own PC and paying retail for the OS is a negative for the PC world when on this side of the fence you don't have the option.

Someone mentioned too that they see no mention of a 64-bit OS from Microsoft. They currently have XP 64-bit in beta testing right now, and it should be released by the end of the year. I am sure Longhorn will follow with a 64-bit release as well, but it just isn't being touted much right now, as even XP is still in beta for the 64-bit version - much like I'm sure Apple has the 64-bit version of OS X in the works but isn't mentioning it much right now.
 
XP 64-bit has been shipping for a year!

SWC said:
...as even XP is still in beta for the 64 bit version.

Windows XP 64-bit was released in March 2003 - over a year ago. It is a shipping product.

http://www.microsoft.com/windowsxp/64bit/evaluation/overview.asp


The "beta" is for the AMD64 (Opteron/Athlon 64) version, not the IA64 version.

This is mostly a recompilation of the 64-bit code, although the support for 32-bit applications within the 64-bit system needs to change (often simpler, but different nonetheless).
 
AidenShaw said:
Windows XP 64-bit was released in March 2003 - over a year ago. It is a shipping product.

http://www.microsoft.com/windowsxp/64bit/evaluation/overview.asp


The "beta" is for the AMD64 (Opteron/Athlon 64) version, not the IA64 version.

This is mostly a recompilation of the 64-bit code, although the support for 32-bit applications within the 64-bit system needs to change (often simpler, but different nonetheless).

I'm not even sure why they bothered to release a version of XP for the Itanium. That is a very niche product, and at $1400 per processor I would expect it to be used strictly in a server environment, considering Intel has said the only thing the Itanium is good for is database servers and such, since it can't run 32-bit apps very well, and most not at all.
 
advantage of common codebase

SWC said:
I'm not even sure why they bothered to release a version of XP for the itanium.


As I said in an earlier post:

Even though Windows supports 64-bit memory, and lets individual programs use more than 4 GiB - there is little demand for that capability from desktop apps. There are many 64-bit server apps, for example, but the strongest reason to get a 64-bit desktop today is for software development of 64-bit server apps - not for running 64-bit desktop apps.


Another factor is that Windows is a common codebase - once you've ported one version/architecture - you've done most of the work for the others. Therefore, once IA64 server was running, it wouldn't be a huge effort to make an IA64 desktop.
 
qubex said:
That said, OS X in its 10.3 incarnation is far from complete. It still has serious flaws. For example, the Finder is still awful. It is badly in need of being rewritten to stop crashing - it panicked my kernel a few days ago when moving a bunch of .jpg images with preview.
I thought that particular problem was fixed in 10.3.3. Are you all up-to-date, with no weird haxies or anything running?

WM
 
Sped said:
I don't know anything about writing OS code, but it seems that slowing down will coincide nicely with 64 bit computing coming online. It will take some time for Apple's entire lineup to become G5 or better, and of course there will still be G4 and older machines being used for a long time. Like I said, I am not a software designer, but it seems supporting 64 bit and 32 bit hardware has the potential to bloat the code. If this assumption is correct, slowing down will allow the user base to gradually transition to newer hardware.

By the way, remember the tremendous bitching going on about how OS X was crap and that many would never stop using OS 9? I was a switcher during that time and have never really used OS 9. I think it's funny now how hardly anyone is griping anymore.

Granted, that was in the time of 10.0 and 10.1, when there was a bigger software library for Mac OS 9 and it ran faster. Don't forget, 10.0 itself was rushed out as a beta. Believe it or not, my iBook ran OS 9 faster than 10.0 and 10.1! Then 10.2 was released and it ran much better.
 
Sped said:
That's my point. Slow the OS cycle down so the hardware has time to go all 64 bit thus allowing the OS to go strictly 64 bit. This instead of several dual 32/64 bit OS releases. Of course, this is all based on the assumption that it is difficult or at least storage intensive to develop/deploy a dual 32/64 bit OS.

Here is the thing, the move to 64-bit on the PPC arch is VERY different from the move to 64-bit on x86. In the case of MacOS X, it simply needs to be made 64-bit clean. Because the PPC spec was always ready for the move to 64 bits, it has the advantage. In this case, a dual 32/64-bit OS would not be difficult, and fat binaries would not be needed (we already have the methods to enable/disable code based on the presence of Altivec; we can do the same for 64-bit in apps at no performance loss if done correctly).
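The runtime enable/disable idea described above can be sketched in C. This is a hypothetical illustration, not Apple's actual mechanism: `cpu_has_64bit_ops` stands in for whatever capability probe the OS exposes (on Mac OS X a `sysctl`-style query), and it is hard-wired here so the sketch runs anywhere. The point is that the dispatch decision is made once at startup, so every later call costs only an ordinary indirect call.

```c
#include <stdbool.h>

/* Hypothetical capability probe -- on a real system this would ask
 * the OS whether the CPU supports the wider instructions; it is
 * hard-wired here so the sketch is self-contained. */
static bool cpu_has_64bit_ops(void) { return false; }

/* Two builds of the same operation: a generic path and a path that
 * would be tuned for the wider architecture. */
static long sum_generic(const long *v, int n) {
    long s = 0;
    for (int i = 0; i < n; i++)
        s += v[i];
    return s;
}
static long sum_tuned(const long *v, int n) {
    return sum_generic(v, n);        /* stand-in for tuned code */
}

/* Resolved once at startup; thereafter each call is a plain
 * indirect call -- the "no performance loss" part of the argument. */
static long (*sum_impl)(const long *, int);

static void init_dispatch(void) {
    sum_impl = cpu_has_64bit_ops() ? sum_tuned : sum_generic;
}
```

After calling `init_dispatch()` once, code simply calls `sum_impl(v, n)` without caring which path was selected - the same trick works whether the selector is Altivec presence or 64-bit capability.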
 
Krevnik said:
Here is the thing, the move to 64-bit on the PPC arch is VERY different from the move to 64-bit on x86. In the case of MacOS X, it simply needs to be made 64-bit clean. Because the PPC spec was always ready for the move to 64 bits, it has the advantage.

Please explain this.

I would think that my C or C++ or Objective C program would determine the issues with being 64-bit clean - not the target architecture that my compiler is using. There isn't a "PPC C++" distinct from an "x86 C++", is there?

A C++ program that is casting an "int" to a "*" is unclean on both PPC and x86.

"Simply needs to be made 64-bit clean" is not always a simple process - particularly when the best coding practices have not always been followed.

I've ported a lot of 64-bit code, and "simply make it 64-bit clean" makes me smile.
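To make the int-to-pointer point concrete, here is a minimal C sketch (names are hypothetical). The clean fix for the cast AidenShaw mentions is to stash the pointer in an `intptr_t`, which C99 guarantees is wide enough for a pointer on both 32-bit and 64-bit targets, instead of an `int`, which truncates addresses on LP64 systems:

```c
#include <stdint.h>

/* Unclean pattern a 64-bit port must hunt down:
 *     int cookie = (int)p;    -- truncates the pointer on LP64
 * Clean pattern: intptr_t is guaranteed wide enough to hold a
 * pointer on both 32-bit and 64-bit targets, so the same source
 * builds for either without losing address bits. */
static void *round_trip(void *p) {
    intptr_t cookie = (intptr_t)p;   /* full width on any target */
    return (void *)cookie;           /* pointer survives intact  */
}
```

With the `int` version, `round_trip(&x) == &x` can silently fail for any address above 4 GiB; with `intptr_t` it holds on both ILP32 and LP64, which is exactly what "64-bit clean" demands of every such cast in the codebase.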
 
AidenShaw said:
Please explain this.

I would think that my C or C++ or Objective C program would determine the issues with being 64-bit clean - not the target architecture that my compiler is using. There isn't a "PPC C++" distinct from an "x86 C++", is there?

A C++ program that is casting an "int" to a "*" is unclean on both PPC and x86.

"Simply needs to be made 64-bit clean" is not always a simple process - particularly when the best coding practices have not always been followed.

I've ported a lot of 64-bit code, and "simply make it 64-bit clean" makes me smile.

I was attempting to point out that the OS need not be 'ported' in the sense the vast majority of this board thinks needs to happen. The difference between making Windows 64-bit, and MacOS X 64-bit is that to make MacOS X 64-bit, it only needs to be expanded to be completely aware of the full environment, rather than modified to the point where it is no longer compatible with older hardware.

While I do agree it isn't a simple matter, assuming we need to dump G3/G4 code simply to become 64-bit friendly is complete BS. So is the idea that the OS would become bloated to support 32-bit and 64-bit PPC chips.
 
a very limited view....

Krevnik said:
The difference between making Windows 64-bit, and MacOS X 64-bit is that to make MacOS X 64-bit, it only needs to be expanded to be completely aware of the full environment, rather than modified to the point where it is no longer compatible with older hardware.


Wow, where to begin.... :eek:


This is a bit confused.

Neither Windows nor OSX is written in assembly language - both are written primarily in higher-level languages. Compatibility with older hardware, on a simple level, is a compiler option - tell it to generate code for the 32-bit or the 64-bit CPU (or use an IA64 compiler instead of the x86 compiler).

The fact that the PPC 64-bit ISA (code) is mostly a superset of the PPC 32-bit ISA is irrelevant - as soon as the compiler generates a single 64-bit only instruction the OS is incompatible with the older hardware.

In some cases you might decide to intermix 32-bit and 64-bit code, and choose between them at runtime - but that's a very significant complication to the system.

*****

But, the kernel is among the least of your worries when migrating to 64-bit. The real problem is that a huge number of the system APIs as well as higher level libraries take memory pointers (addresses) as parameters.

To support both existing 32-bit programs and new 64-bit programs, you have to have duplicate routines for each of these - one supporting the old 32-bit pointers, and one supporting the new 64-bit pointers. It's a lot of work to redefine the APIs and to create and test a parallel set of 64-bit libraries, to create a version of the 32-bit libraries that can run on the 64-bit system, and to devise the mechanisms so that they can co-exist.

Some of the 32-bit APIs can simply be left as 32-bit routines (for example, a floating point math routine). Other 32-bit routines might need to be rewritten as jackets to translate their arguments and call the native 64-bit routines (for example, a file I/O or memory management API).
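The "jacket" pattern just described can be sketched in a few lines of C (all names here are hypothetical, for illustration only): the legacy 32-bit entry point keeps its old signature, widens its arguments, and forwards to the single native 64-bit routine.

```c
#include <stdint.h>

/* Hypothetical native 64-bit API: accepts a 64-bit file offset.
 * The real body is elided; here it just echoes the offset back. */
static uint64_t native_seek64(uint64_t offset) {
    return offset;
}

/* Jacket for legacy 32-bit callers: the old signature is preserved,
 * so existing 32-bit binaries keep working unmodified, while the
 * system only has one native code path to maintain. */
static uint32_t compat_seek32(uint32_t offset) {
    uint64_t wide = (uint64_t)offset;       /* widen the argument      */
    uint64_t result = native_seek64(wide);  /* call the native routine */
    return (uint32_t)result;                /* narrow for the old ABI  */
}
```

The translation itself is trivial; the work AidenShaw describes is doing this - plus testing it - for every pointer- or size-carrying routine in every system library.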

The only "superior" feature of PPC in regards to 64-bit is that the 32-bit system is a proper subset of the 64-bit ISA. This makes it much easier for PPC than Itanium to support both 32-bit and 64-bit concurrently - and easier for the hardware to run 32-bit mode at full speed.

Note that x86-64 (AMD64) has exactly the same advantage as PPC - an Opteron or Xeon-64 can run 32-bit code at full steam because all of the 32-bit instructions are part of the 64-bit architecture.
 
AidenShaw said:
In some cases you might decide to intermix 32-bit and 64-bit code, and choose between them at runtime - but that's a very significant complication to the system.

Not always. A properly designed system can intermix the code quite easily, and Apple has already had the problem of intermixing code with Altivec, so 32/64-bit code is nothing special.

AidenShaw said:
But, the kernel is among the least of your worries when migrating to 64-bit. The real problem is that a huge number of the system APIs as well as higher level libraries take memory pointers (addresses) as parameters.

To support both existing 32-bit programs and new 64-bit programs, you have to have duplicate routines for each of these - one supporting the old 32-bit pointers, and one supporting the new 64-bit pointers. It's a lot of work to redefine the APIs and to create and test a parallel set of 64-bit libraries, to create a version of the 32-bit libraries that can run on the 64-bit system, and to devise the mechanisms so that they can co-exist.

Depending on how the system is crafted, converting the 32-bit libs to be compatible with the new back end is simply grunt work. Co-existing is the least of the worries: as you just said, the routines that normally take 32-bit parameters would need to be updated to handle 64-bit parameters. Either this by itself breaks the usual specification, or there is already a 64/32-bit compatibility specification available for quite a few of the libs MacOS X uses; either way, this is also simply grunt work.

AidenShaw said:
Some of the 32-bit APIs can simply be left as 32-bit routines (for example, a floating point math routine). Other 32-bit routines might need to be rewritten as jackets to translate their arguments and call the native 64-bit routines (for example, a file I/O or memory management API).

And here is the rub, the work required to do this is actually smaller than you think. While it is work, and it does need to be done, it is grunt work. It will not warrant splitting the OS into 32-bit and 64-bit versions that people keep claiming. Show me proof that it is a significant complication to the system to support 32-bit and 64-bit calls within the system (more than the current complication of detecting and adjusting to Altivec at runtime), and I can believe you, otherwise I find it hard to believe that this fairly minor arch change would require such a large split in development.
 
Krevnik said:
Not always. A properly designed system can intermix the code quite easily, and Apple has already had the problem of intermixing code with Altivec, so 32/64-bit code is nothing special.



Depending on how the system is crafted, converting the 32-bit libs to be compatible with the new back end is simply grunt work. Co-existing is the least of the worries, as you just said that the routines that normally take 32-bit parameters would need to be updated to handle 64-bit parameters. Since this by itself breaks the usual specification, or there is a specification already available for quite a few libs MacOS X uses for 64/32-bit compatibility, this is also simply grunt work.



And here is the rub, the work required to do this is actually smaller than you think. While it is work, and it does need to be done, it is grunt work. It will not warrant splitting the OS into 32-bit and 64-bit versions that people keep claiming. Show me proof that it is a significant complication to the system to support 32-bit and 64-bit calls within the system (more than the current complication of detecting and adjusting to Altivec at runtime), and I can believe you, otherwise I find it hard to believe that this fairly minor arch change would require such a large split in development.


Your original comment was "it only needs to be expanded to be completely aware of the full environment". I felt that didn't really describe the magnitude of the job.

We're not too far off - you're now saying that it's a lot of work (albeit not rocket science). It's doable (look at Sun's Solaris - they've been shipping both a 32-bit-only version and a 64-bit version that supports both 32-bit and 64-bit binaries).

*****

The other part of "a lot of grunt work" is that it implies one hell of a lot of quality assurance work. Even the best of grunts make mistakes - both at the typo level and at the design level.

Definitely doable, but a lot of work.
 
mklos said:
QUOTE from nmk
I just hope this doesn't mean the beginning of the end for the Mac. Apple announcing this in the same week as the new iPod division makes me a little uneasy.

QUOTE from mklos
I seriously don't think this is the end of Apple Computer. After all, they have spent over $1 billion on the beta version of Mac OS X, and they have spent hundreds of millions to further develop OS X into what it is today. Also, they are spending hundreds of millions of dollars on Apple Retail Stores around the world. I don't think Apple is building retail stores for the iPod!

Well said. I know it's human nature to be upset when changes happen. Given his track record, I think these moves are more about Steve trying to streamline production efforts than it is about trying to prioritize divisions on an absolute "this-division-is-more-important-than-that-one" basis. That being said, this doesn't mean we might not see shake-ups, but on the whole, I think Steve has this well in hand.

Anyone wanting to argue this point should kindly direct their attention to the pre-Stevian period at Apple, where you had multiple divisions screaming like little spoiled brats, each competing against the other instead of operating as a cohesive whole.

The end of Apple as a company will be the day they stop selling computers. Even though they sell more iPods than computers, they still made about 75% of their profit off computer sales. Apple cannot survive off the iPod or any of its other hardware/software offerings besides its computers.

Again, way to go mklos! Apple is in no position strategically to abandon their PC hardware base.

Keep in mind that they did this a few years ago when they shifted their main software personnel to a separate Applications team, and because of that we've seen some awesome OS X apps come out. Keynote, iLife '04, FCE, Motion, etc. are all examples of cool new apps that have come out of this split.

I think this is great for Apple. It splits specific things into specific groups so that each team can focus on one specific thing. Putting Jon Rubinstein in charge of the iPod operations could have been a bad move, but I'm sure his replacement is fully qualified, and if he isn't, then I'd like to think that Steve Jobs will take care of that.

Apple is running strong, has $4.5 billion in cash, and is STILL consistently making a profit. They aren't going away for quite some time unless they screw up big time like Gateway did.

As I said, it streamlines things.

Mike
 