That's the trouble with using a server chip: you get useless features like multiple-OS support instead of focusing on things like ramping up the megahertz.
If this was implemented on systems that support it, here's how it would work:

JonMaker said: UPDATES!
In the server world (and even at home for me) rebooting to install low level OS updates is a big deal. IT staff can be fired for five minutes of downtime.
Would this multi-OS thingy allow stuff like hot-swapping the kernel?
e.g. Would we be able to boot one OS, update it (after a few months of uptime), then "hot-reboot" into the newly updated OS?
AidenShaw said:This is about as wrong as you could possibly be!!
Right now you can get Windows for:
- x86 32-bit - Windows XP, Windows Server 2003 (Intel/AMD, in 32-bit mode)
- IA64 (Itanium) Windows Server (XP for IA64 was available, but was withdrawn when HP stopped IA64 workstation sales)
- x86_64 (Xeon EM64T, Opteron, Athlon 64) XP-64 - available for a free download
- x86_64 Windows Server - free download, and free upgrade to 64-bit when released if you buy 32-bit today
Did you know that Windows 2000 64-bit on Alpha went to internal beta testing?
Did you know that much of the IA64 support was developed on Alpha systems? (MS could buy 64-bit Alphas from Digital and do development while they waited for IA64 prototype systems.)
Did you know that the "performance critical hardware dependent" parts of Windows are isolated in a component called the HAL (Hardware Abstraction Layer) - so that all the hand-tuned architecture-specific code is in one single module? (In other words, the stuff that really needs to be in assembly language is in one small place.)
Did you know that Windows CE (and the Visual Studio .NET tools needed to build it) supports ARM, PPC, and SH in addition to x86? (http://msdn.microsoft.com/embedded/usewinemb/ce/supproc/default.aspx)
In addition to the three supported architectures for the full NT-based systems, I wouldn't be surprised to hear that MS does regular builds of the PPC, MIPS and Alpha code streams. I know that they built the Alpha stream long after Alpha support was dropped (they didn't have Itanium systems for testing, so they had to use Alpha for 64-bit development). It would be entirely in character for that development group to keep doing regular builds against those targets, just to make sure that no x86 dependencies show up in the high-level code.
To claim that NT is "stuck to x86" is one of the most ignorant statements that one could make. Doing platform independence right was one of the "job 1" goals with NT. The core development team certainly had learned that lesson the hard way...
MacNeXT said:And yes I know about HAL, but if I'm not mistaken, there is no such thing in XP, or at least they would have changed its name because it wouldn't be appropriate anymore.

Does that even matter anymore? The HAL is still around in Windows XP, and it's still called HAL. The purpose it serves now is probably less cross-platform and more with things like allowing the OS to use the graphics card for certain tasks regardless of what kind of graphics card it is.
Macrumors said:CNET News takes a look at IBM's PowerPC 970FX chip and the likely roadmap IBM plans for the POWER line. From the article:
A single-core version of the IBM 970FX chip is currently used in the Power Mac G5 line of desktop computers, as well as the iMac G5 line.
Eric_Z said:MIPS is very much still alive, albeit in the embedded sector.
Tuttle said:I think it has more to do with the fact that the PC market is racing to the bottom. Just look at how the prices of low- to midrange desktop computers have fallen over the past few years. It probably was no longer worth the effort to try to compete in a market where profit margins are drying up.
~Shard~ said:I don't know if I'd want Windows anywhere near my OS X though.
wrldwzrd89 said:Does that even matter anymore? The HAL is still around in Windows XP, and it's still called HAL. The purpose it serves now is probably less cross-platform and more with things like allowing the OS to use the graphics card for certain tasks regardless of what kind of graphics card it is.
MacNeXT said:First of all, I did not claim NT was stuck to x86. I claimed every version AFTER NT got more and more stuck to x86, because they dropped the multi-platform thing. Read again before calling it an ignorant statement.
MacNeXT said:It's a FACT that they dropped MIPS, PPC and Alpha. And why on earth would they waste resources still maintaining MIPS, PPC and Alpha (note: 2 DEAD processors in that list) ports of their products?
MacNeXT said:And yes I know about HAL, but if I'm not mistaken, there is no such thing in XP, or at least they would have changed its name because it wouldn't be appropriate anymore. And yes, I also know about W2K/Alpha; in fact, I own a copy.
MacNeXT said:I think it would be far from trivial to retrofit XP with a non-x86 HAL.
budugu said:You do not need virtualization and 2 CPUs for that! Just a decent motherboard! Look at this:
http://www.tomshardware.com/motherboard/20041119/index.html
MacNeXT said:I don't see how this could be that interesting for users like most of us.
Linux and OS X could be interesting for some, but don't forget OS X is already a pretty useful UNIX-based (FreeBSD / Darwin, to be precise) system, and the Mac (modulo OS X) is not a particularly interesting platform for Linux users.
What I think "Apple's plans to use it" means is something in the server department. Running multiple OSes simultaneously is very useful in critical applications: testing a new application without disrupting an already running system, running multiple servers independently, redundancy, etc.
alfismoney said:Don't forget, Apple released a severely hacked version of the FreeBSD core that is nowhere near as stable as UNIX: there are plenty of power users that lock up their systems all the time with virtual memory traps and hardware freezes that should never (and I do mean NEVER) freeze a UNIX box. All those deck problems you see running Final Cut? Every piece of terrible arbitrary software that Avid/Digidesign releases? Video driver issues? Printer crashes? Network errors? All of those come back down to problems at the core level of the operating system which, if it were truly properly written, OS X could run unaffected by.

As for the Linux users not seeing the Mac OS as an appealing platform, keep in mind that while it's a hacked and buggy version of UNIX, the Mac OS still serves as a version of UNIX that has graphics capabilities. This alone is important enough, since everyone wants to be able to use QuickTime and Windows Media (and sometimes little programs like Photoshop) no matter what platform they run on. Being able to run two computers in one box has serious benefits if it's done with hardware instead of with software, because it removes programming errors from the chart along with providing much faster system performance. Trust me, the only way this could be better news for Apple is if someone starts placing 4 of these in machines instead of only 2...

What reason do I have to believe that the core of Mac OS X is "hacked"? Right now, none at all. I agree that it's buggy for the simple reason that it's next to impossible for even tiny software programs to be bug-free.
MacNeXT said:And like I said before, Mac OS X is a great UNIX-based platform. It would be much better to port (Linux) applications to Mac OS X natively, which is already done a lot.