JonMaker said:
UPDATES :eek:

In the server world (and even at home for me), rebooting to install low-level OS updates is a big deal. IT staff can be fired for five minutes of downtime.

Would this multi-OS thingy allow stuff like hot-swapping the kernel? :confused:

e.g. Would we be able to boot one OS, update it (after a few months of uptime), then "hot-reboot" into the newly updated OS? :cool:
If this were implemented on systems that support it, here's how it would work:
1. You run the old system, then download and install the updater.
2. After the updater installs, the updated OS boots silently in another partition.
3. You restart like you normally would, except you get switched over to the updated OS while the old OS shuts down.
4. Once the old OS has shut down completely, the old copies of the components that were updated get deleted, and you can continue working normally.
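None of the desktop OSes do exactly this today, but Linux's kexec facility gives a rough sketch of the "hot-reboot" part of step 3: stage the new kernel from the running system, then jump straight into it without going back through the firmware. The paths and the use of kexec here are purely illustrative; this isn't how Apple or Microsoft would necessarily implement it.

```c
/* Purely illustrative: a "hot reboot" into an already-staged kernel using
 * Linux's kexec facility (needs root, Linux >= 3.17 with CONFIG_KEXEC_FILE).
 * The file paths below are placeholders for the freshly installed system. */
#include <fcntl.h>
#include <stdio.h>
#include <string.h>
#include <unistd.h>
#include <sys/reboot.h>
#include <sys/syscall.h>

int main(void)
{
    int kernel = open("/boot/vmlinuz-new", O_RDONLY);
    int initrd = open("/boot/initrd-new.img", O_RDONLY);
    const char *cmdline = "root=/dev/sda2 ro";

    if (kernel < 0 || initrd < 0) {
        perror("open");
        return 1;
    }

    /* Step 2/3 analogue: stage the new kernel; nothing visible happens yet. */
    if (syscall(SYS_kexec_file_load, kernel, initrd,
                strlen(cmdline) + 1, cmdline, 0UL) != 0) {
        perror("kexec_file_load");
        return 1;
    }

    sync();               /* flush filesystems, as a normal shutdown would   */
    reboot(RB_KEXEC);     /* jump directly into the staged kernel, skipping
                             the firmware/POST part of a full reboot         */
    return 0;
}
```

What this doesn't capture is step 4 (cleaning up the old system's leftover files), which would be ordinary housekeeping on the next boot.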
 
AidenShaw said:
This is about as wrong as you could possibly be!!

Right now you can get Windows for:


  • x86 32-bit - Windows XP, Windows Server 2003 (Intel/AMD, in 32-bit mode)
  • IA64 (Itanium) Windows Server (XP for IA64 has been available, but withdrawn when HP stopped IA64 workstation sales)
  • x86_64 (Xeon EM64T, Opteron, Athlon 64) XP-64 - available as a free download
  • x86_64 Windows Server - free download, and free upgrade to 64-bit when released if you buy 32-bit today

Did you know that Windows 2000 64-bit on Alpha went to internal beta testing?

Did you know that much of the IA64 support was developed on Alpha systems? (MS could buy 64-bit Alphas from Digital and do development while they waited for IA64 prototype systems.)

Did you know that the "performance critical hardware dependent" parts of Windows are isolated in a component called the HAL (Hardware Abstraction Layer) - so that all the hand-tuned architecture-specific code is in one single module? (In other words, the stuff that really needs to be in assembly language is in one small place.)

Did you know that Windows CE (and the Visual Studio .NET tools needed to build it) supports ARM, PPC, and SH in addition to x86? (http://msdn.microsoft.com/embedded/usewinemb/ce/supproc/default.aspx)

In addition to the three supported architectures for the full NT-based systems, I wouldn't be surprised to hear that MS does regular builds of the PPC, MIPS and Alpha code streams. I know that they built the Alpha stream long after Alpha support was dropped (they didn't have Itanium systems for testing, they had to use Alpha for 64-bit development). It would be entirely in character for that development group to continue to do regular builds against those targets, however, just to make sure that no x86 dependencies showed up in the high level code.

To claim that NT is "stuck to x86" is one of the most ignorant statements that one could make. Doing platform independence right was one of the "job 1" goals with NT. The core development team certainly had learned that lesson the hard way...

First of all, I did not claim NT was stuck to x86. I claimed every version AFTER NT got more and more stuck to x86, because they dropped the multi-platform thing. Read again before calling it an ignorant statement.

It's a FACT that they dropped MIPS, PPC and Alpha. And why on earth would they waste resources still maintaining MIPS, PPC and Alpha (note: 2 DEAD processors in that list) ports of their products???

And yes, I know about the HAL, but if I'm not mistaken, there is no such thing in XP, or at least they would have changed its name because it wouldn't be appropriate anymore. And yes, I also know about W2K/Alpha; in fact, I own a copy.
 
MacNeXT said:
And yes, I know about the HAL, but if I'm not mistaken, there is no such thing in XP, or at least they would have changed its name because it wouldn't be appropriate anymore.
Does that even matter anymore? The HAL is still around in Windows XP, and it's still called the HAL. The purpose it serves now is probably less about cross-platform support and more about things like allowing the OS to use the graphics card for certain tasks regardless of what kind of graphics card it is.
 
@MacNeXT

MIPS is very much still alive, albeit in the embedded sector. For example, there is a dual-core MIPS chip running at 1.8 GHz with DDR2-800, HyperTransport, and PCI-E (plus the kitchen sink) due out from one of the two main MIPS companies in Q2 '05. </nitpick>
 
Mainly Useful for Servers

Macrumors said:
CNET News takes a look at IBM's PowerPC 970FX chip and the likely roadmap IBM plans to take with the POWER line. From the article:
A single-core version of the IBM 970FX chip is currently used in the Power Mac G5 line of desktop computers, as well as the iMac G5 line.

This would mainly be useful for virtual servers, and may not have any relevance to the chips Apple would buy.
 
Future of computing?

This might come in handy for real power users or people wanting to operate a two-in-one Mac simultaneously, with tasks exceeding one core's power, but all in all I don't find it useful for average users, at least not at this moment.
 
My Dell laptop already has a dual-core MIPS chip....

Eric_Z said:
MIPS is very much still alive, albeit in the embedded sector.

The "tigon" series of gigabit ethernet chips use a pair of MIPS R4000 embedded cores to do TCP offloading from the host CPU, and other tasks.

The "tigon3" is manufactured by Broadcom and used in a number of different cards. The chip is the BCM570x series, and is frequently found as the embedded GbE controller on systems and laptops from major and minor companies.
 
Tuttle said:
I think it has to do more with the fact that the pc market is rushing to the bottom. Just look at how the prices of the low to midrange desktop computers have fallen over the past few years. It probably no longer was worth the effort to try to compete in a market where profit margins are drying up.

BS. Their ThinkPad line is a top seller among laptop brands. Many of the Fortune 500 companies of the world use (or at least did use) IBM. Desktops are another matter; I know of no one who uses IBM desktops.
This delusional thinking that the PC world is going to dry up and go belly up is more Apple Kool-Aid being handed around. PC sales are strong; however, they are being cannibalized by companies like Dell that sell their wares at bottom-of-the-barrel prices, with fair-to-inferior quality, so that quality players like IBM can't keep up. Dell makes its money on quantity, corporate sales, accessories, and extended warranties. I suppose if IBM wanted to make cheap crap they could compete against Dell. But why? :confused:
 
~Shard~ said:
I don't know if I'd want Windows anywhere near my OS X though. ;)


I don't know... I think I would love to have the option of running Windows on a Mac.

It would meet all my needs at school (the ones I currently have to use a PC for).

It would thrill my son, who loves to play online games (many of which he can't play well on a Mac), and it would let him download those obscure apps that he and his friends love to play with (that are only available on PC).

From my perspective, this removes issues that people may have with buying a Mac. If you can have the ease of use and elegance of a Mac without missing any of the PC-only apps, it may create a demand for those PC-only apps to be ported.

Can't be closed-minded about this (as much as I would like to be): the world uses Windows.
 
wrldwzrd89 said:
Does that even matter anymore? The HAL is still around in Windows XP, and it's still called the HAL. The purpose it serves now is probably less about cross-platform support and more about things like allowing the OS to use the graphics card for certain tasks regardless of what kind of graphics card it is.

You're right, there is still a HAL, but it only serves to "abstract" uniprocessor/multiprocessor and ACPI/Non-ACPI hardware. I think it would be far from trivial to retrofit XP with a non-x86 HAL.

Eric_Z: point taken about MIPS.
 
MacNeXT said:
First of all, I did not claim NT was stuck to x86. I claimed every version AFTER NT got more and more stuck to x86, because they dropped the multi-platform thing. Read again before calling it an ignorant statement.

I don't need to read it again, because you have again repeated the mistake.

PPC and MIPS dropped out early - no sales. The Alpha port stayed through the Windows 2000 beta and release candidate stages, in both 32-bit (as in Windows NT 4 Alpha) and the new 64-bit Win64 version that was in development.

Windows 2000 was released on x86 and IA64, two very different architectures. Today XP is running on x86, IA64 and x64 - same with Server 2003. How is that "more stuck" on x86? How is that "dropping multi-platform"???


MacNeXT said:
It's a FACT that they dropped MIPS, PPC and Alpha. And why on earth would they waste resources still maintaining MIPS, PPC and Alpha (note: 2 DEAD processors in that list) ports of their products???

The Win64 version continued (after Alpha was dropped from Win2K) to be built and used internally for 64-bit development - MS had a lot of Alpha systems for 64-bit work, and very few IA64 systems at the time.

The reason to "waste resources" is quite simple - compiling the other architectures is a sanity check to ensure that the code does not get "stuck on x86". It's mostly a compute cost, not a major people effort.

Plus, it's insurance. You never know when you might want to run NT on a PowerMac G5 (http://www.gamesindustry.biz/content_page.php?section_name=dev&aid=3039) :eek:


MacNeXT said:
And yes, I know about the HAL, but if I'm not mistaken, there is no such thing in XP, or at least they would have changed its name because it wouldn't be appropriate anymore. And yes, I also know about W2K/Alpha; in fact, I own a copy.

The HAL is the only way to touch the hardware - even device drivers have to go through the HAL. There are different HALs for single-CPU and multi-CPU systems, and for different power-management architectures.

Your Win2K/Alpha is a beta 32-bit version, right? Alpha was dropped as a target before the official release of Win2K, and the 64-bit versions were mainly used by developers inside MS and Digital/Compaq.
 
.done deal.

MacNeXT said:
I think it would be far from trivial to retrofit XP with a non-x86 HAL.

There's already a series of IA64 HALs, and HALs for x64 (EM64T and AMD64).

And they're not "retrofits" - the IA64 HAL was part of Windows 2000 for Itanium, and the x64 one is just a new piece to support the low-level architecture of the Athlon 64/Opteron/Xeon-64 systems.

The HAL is actually a virtualization layer - it deals with hardware differences so that the rest of NT can run on an idealized virtual platform.

Device drivers don't have to deal with different bus layouts - the HAL hides the details behind a common API.
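A toy sketch of that idea (the names here are invented for illustration, not the real Windows DDK interface): everything platform-specific sits behind one small table of functions, and portable kernel code only ever calls through that table.

```c
/* Toy illustration of the HAL idea: all platform-specific operations live
 * behind one small table of function pointers, and portable code calls only
 * through that table.  Names are made up, not the real Windows interfaces. */
#include <stdio.h>

struct hal_ops {
    const char *name;
    void (*enable_interrupts)(void);
    void (*disable_interrupts)(void);
};

/* Stub "x86 uniprocessor" implementation; a real HAL would touch hardware. */
static void x86_sti(void) { puts("sti  (interrupts on)"); }
static void x86_cli(void) { puts("cli  (interrupts off)"); }

static const struct hal_ops hal_x86_up = {
    .name = "x86 uniprocessor",
    .enable_interrupts = x86_sti,
    .disable_interrupts = x86_cli,
};

/* Portable code: it neither knows nor cares which HAL it was handed. */
static void kernel_init(const struct hal_ops *hal)
{
    printf("booting with HAL: %s\n", hal->name);
    hal->disable_interrupts();
    /* ... set up timers, interrupt routing, etc. through hal-> calls ... */
    hal->enable_interrupts();
}

int main(void)
{
    kernel_init(&hal_x86_up);
    return 0;
}
```

Swap in a different table (multiprocessor, ACPI, IA64) and the portable code above it doesn't change, which is exactly the property being described here.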

The HAL does not change the instructions though - all of the other NT code has to be recompiled for IA64 (or PPC or MIPS or Alpha).

x64 is a little different - some things can stay 32-bit (the "notepad" editor, for example) under x64. All 64-bit code and supporting libraries would need to be recompiled.

You'd also want to recompile any performance sensitive code, since x64 64-bit is usually faster than x86 32-bit.
 
Multicore OSX?

I personally think that IF Apple ever uses the multi-OS feature of those chips, they'll use it for their own multicore OS, not for providing multiple OSes on their machines. Maybe an OSX with one Darwin kernel and another Linux kernel (for handling Linux binaries natively).

But it's far more likely that such a feature won't be used.
 
This is nothing new for IBM

There's really nothing that new here. IBM mainframes can already run multiple instances of Linux on the same machine. As another poster already mentioned, the idea is to partition a bigger machine into smaller "virtual" machines. The benefit of having multiple "virtual" machines instead of multiple physical machines would be purely cost and easier maintenance, I think. A desktop user would have almost no use for this.
 
I think that this is very good news. IBM is working hard so that Apple will be able to keep up with the rest of technology. It would seem to be much better to be able to run Windows natively on the Mac. No Virtual PC to slow down the process. If it is done by Apple it will always be better.
 
MacNeXT said:
I don't see how this could be that interesting for users like most of us.

Linux and OSX could be interesting for some, but don't forget OSX is already a pretty useful UNIX-based (FreeBSD / Darwin, to be precise) system, and the Mac (modulo OSX) is not a particularly interesting platform for Linux users.

What I think "Apple's plans to use it" refers to is the server department. Running multiple OSes simultaneously is very useful in critical applications: testing a new application without disrupting an already running system, running multiple servers independently, redundancy, etc.

Don't forget, Apple released a severely hacked version of the FreeBSD core that is nowhere near as stable as UNIX: there are plenty of power users who lock up their systems all the time with virtual memory traps and hardware freezes that should never (and I do mean NEVER) freeze a UNIX box. All those deck problems you see running Final Cut? Every piece of terrible, arbitrary software that Avid/Digidesign releases? Video driver issues? Printer crashes? Network errors? All of those come back down to problems at the core level of the operating system, which OS X could run unaffected by if it were truly properly written.

As for Linux users not seeing the Mac OS as an appealing platform, keep in mind that while it's a hacked and buggy version of UNIX, the Mac OS still serves as a version of UNIX that has graphics capabilities. This alone is important enough, since everyone wants to be able to use QuickTime and Windows Media (and sometimes little programs like Photoshop) no matter what platform they run on. Being able to run two computers in one box has serious benefits if it's done with hardware instead of software, because it removes programming errors from the picture along with providing much faster system performance. Trust me, the only way this could be better news for Apple is if someone starts placing 4 of these chips in machines instead of only 2...
 
alfismoney said:
Don't forget, Apple released a severely hacked version of the FreeBSD core that is nowhere near as stable as UNIX: there are plenty of power users who lock up their systems all the time with virtual memory traps and hardware freezes that should never (and I do mean NEVER) freeze a UNIX box. All those deck problems you see running Final Cut? Every piece of terrible, arbitrary software that Avid/Digidesign releases? Video driver issues? Printer crashes? Network errors? All of those come back down to problems at the core level of the operating system, which OS X could run unaffected by if it were truly properly written.

As for Linux users not seeing the Mac OS as an appealing platform, keep in mind that while it's a hacked and buggy version of UNIX, the Mac OS still serves as a version of UNIX that has graphics capabilities. This alone is important enough, since everyone wants to be able to use QuickTime and Windows Media (and sometimes little programs like Photoshop) no matter what platform they run on. Being able to run two computers in one box has serious benefits if it's done with hardware instead of software, because it removes programming errors from the picture along with providing much faster system performance. Trust me, the only way this could be better news for Apple is if someone starts placing 4 of these chips in machines instead of only 2...
What reason do I have to believe that the core of Mac OS X is "hacked"? Right now, none at all. I agree that it's buggy for the simple reason that it's next to impossible for even tiny software programs to be bug-free.
 
As we all know, Windows boxes outnumber Mac boxes big-time.

Many of us Mac users have bought Virtual PC so that we can run Windows applications when the need arises.

Those of us who run Virtual PC are aware of VPC's massive performance hit on ram and processor speed.

It would be nice to run a Windows application without the performance hit of Virtual PC.

I look forward to a new processor chip that could run Windows and Mac OS at the same time. Eliminate the need for Virtual PC.

Time will tell whether or not the new chip supports Windows and Mac OS, or simply a variety of flavors of the same OS. :rolleyes:
 
how about a realtime version of OSX

A realtime version of OSX would be very useful in science, engineering, and various industrial control and process control systems. A very small market, probably, but it all adds up.
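For what it's worth, OSX is not a hard real-time OS, but the Mach layer underneath it already exposes a soft real-time ("time constraint") scheduling class that audio and video software uses. A minimal, Darwin-only sketch; the 10/2/5 ms figures are made-up example numbers, and there are no hard guarantees:

```c
/* Sketch only: request Mach's soft real-time scheduling class for the current
 * thread on OS X.  The period/computation/constraint values are example
 * numbers, not recommendations. */
#include <mach/mach.h>
#include <mach/mach_time.h>
#include <mach/thread_policy.h>
#include <stdint.h>
#include <stdio.h>

int main(void)
{
    mach_timebase_info_data_t tb;
    mach_timebase_info(&tb);
    /* Conversion factor: milliseconds -> mach_absolute_time() units. */
    double ms = 1e6 * (double)tb.denom / (double)tb.numer;

    struct thread_time_constraint_policy pol = {
        .period      = (uint32_t)(10 * ms), /* we wake up every 10 ms        */
        .computation = (uint32_t)( 2 * ms), /* and need about 2 ms of CPU    */
        .constraint  = (uint32_t)( 5 * ms), /* which must finish within 5 ms */
        .preemptible = 1,
    };

    kern_return_t kr = thread_policy_set(mach_thread_self(),
                                         THREAD_TIME_CONSTRAINT_POLICY,
                                         (thread_policy_t)&pol,
                                         THREAD_TIME_CONSTRAINT_POLICY_COUNT);

    printf("time-constraint policy: %s\n",
           kr == KERN_SUCCESS ? "applied" : "rejected");
    return 0;
}
```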
 
As a long-time Mac user (since 1989), I almost fear being able to use Windows on my Mac machine. I once had the PC card in an old PowerPC I had, and of course it was terrible (slow, DOS only, etc.). Then years later I bought VPC, and last year got VPC 6 - so, like someone else mentioned, I could run some PC apps occasionally. But VPC is such a beast of burden on any Mac, even with 3 GB of RAM and dual processors. I would certainly love to throw it away, but having Windows run directly without emulation, as this processor may provide, makes me wonder whether or not viruses written for M$ would start affecting OS X files, or the HD structure in general. That's what scares me. :eek:

Call me crazy, but didn't Apple once try to create an OS that ran Mac and Windows files simultaneously? Rhapsody, I think, was the CODENAME for the project, not the name of the OS. Anyone remember that, or am I crazy? :confused:
 
Nice way to avoid porting

MacNeXT said:
And like I said before, Mac OS X is a great UNIX based platform. It would be much better to port (Linux) applications to Mac OS X natively, which is already done a lot.


One of the problems with porting Unix/Linux apps to OSX is that the OSX implementation of Unix standards is incomplete or broken in places. This makes porting a real pain for any software that isn't vanilla stdlib, and OSX breaks a surprising amount of Unix software in unpleasant ways. It is one of the most "incomplete" common Unix variants out there right now. On the other hand, it is very easy to port other vaguely Unix-y things to Linux because Linux tends to support a superset of every obscure Unix standard or API ever implemented these days.
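A hypothetical example of the kind of shim such ports accumulate: OSX of this era has no clock_gettime(), so anything that isn't "vanilla stdlib" and needs a monotonic clock grows a Darwin-specific branch (the helper name here is made up):

```c
/* Hypothetical portability shim: a monotonic clock in nanoseconds, with a
 * Darwin branch because OS X of this era lacks clock_gettime().
 * (On older Linux glibc, link with -lrt.) */
#define _POSIX_C_SOURCE 199309L   /* for clock_gettime on POSIX systems */
#include <stdint.h>
#include <stdio.h>

#ifdef __APPLE__
#include <mach/mach_time.h>

static uint64_t monotonic_ns(void)          /* Darwin flavour */
{
    static mach_timebase_info_data_t tb;
    if (tb.denom == 0)
        mach_timebase_info(&tb);
    return mach_absolute_time() * tb.numer / tb.denom;
}
#else
#include <time.h>

static uint64_t monotonic_ns(void)          /* POSIX flavour (Linux, ...) */
{
    struct timespec ts;
    clock_gettime(CLOCK_MONOTONIC, &ts);
    return (uint64_t)ts.tv_sec * 1000000000ull + (uint64_t)ts.tv_nsec;
}
#endif

int main(void)
{
    printf("monotonic clock: %llu ns\n", (unsigned long long)monotonic_ns());
    return 0;
}
```

Multiply that by every missing or half-implemented API and the "real pain" of porting becomes clear.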

I would love to be able to run Linux in parallel on the PPC hardware so that I could get a better and more complete Unix environment than OSX provides. For most things, I don't even bother to port to OSX any more even though the code base is server-side Linux C/C++, as it is not worth the effort (especially since the most common target is Linux on AMD64 anyway).
 
As far as everyone's great speculation that somehow this will enable Windows to run on a Mac at a hardware level: I think those hopes are far-fetched (but something that I think more Mac users would like than dislike). However, the only way I can see this happening is either somehow building a second instruction set (x86) into the chip, or, more probably (and more slowly), some sort of on-chip, hardware-based translation of each x86 instruction into 1 or more PPC instructions. Either way, I doubt this is what they meant by running multiple OS's. I think you're either looking at functionality geared towards servers, or a Linux/OSX combination with possibilities for R&D environments where a variety of computers are used (at work we have every type of desktop you could imagine).
 
Legacy Sound Card

Wow, this is great news: now I will be able to use my legacy Korg Oasys soundcard with its OS 9 drivers whilst using Logic 7 on the same machine!!

Any ideas of a release date??

Don't suppose there's any chance of having this feature in the G5 PBs at some point in the future?
 