Well, dear, if you read my post carefully, I don't think I said anything like "OS X isn't good enough for me". And I don't disagree with you on that.

and cut the whole "dear" thing... $%$^&&*(*&^%
 
It's sometimes hard to balance stability/security against usability when there's massive demand for capabilities that pull your OS in opposite directions. Also, Windows isn't tied to specific hardware, which might be another point.
It's good to read your post, though. Thanks.

I'd differ on that point. There's no reason why an OS cannot be fast, stable, secure and feature-rich ... especially on today's capable hardware.

For example, what are the architectural differences between the OS X I'm running on this MacBook Pro, where I enjoy some World of Warcraft, and the OS X which powers Virginia Tech's System X G5-based supercomputer (at one point the 4th fastest, and one of the cheapest, in the world)? None.

Microsoft's coding was lax. If you're coding a driver architecture, which you know 3rd parties are going to be interfacing with, you write your code absolutely defensively. You write code to ensure that no matter how badly the driver you're talking to misbehaves, you don't let that propagate far enough to bring down the kernel. This stuff isn't impossible, but it requires patience, thoroughness and (above all) a rock-solid design which you've revised and revised until it's as good as you can make it ... and if it's broken-by-design, you throw it away and try again. MS don't like doing this sort of thing. They like adding a 'new' way of doing something, with a kludge back to the old way available. Makes life easier for some, but those kinds of decisions always come back to bite in the end.
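As a sketch of what that defensive style looks like in practice, here's a minimal C example: a kernel-side wrapper that refuses to trust anything a third-party driver hands back. All the names here (`driver_op_t`, `kernel_read_via_driver`, and so on) are hypothetical illustrations, not taken from any real kernel.

```c
#include <stdio.h>
#include <stddef.h>
#include <string.h>

/* Operation table a third-party driver registers with the kernel.
 * The kernel must not trust any field blindly. */
typedef struct {
    int    (*read)(void *buf, size_t len);
    size_t max_transfer;
} driver_op_t;

#define KERNEL_BUF_LEN 256

static int handle_bad_driver(const char *why)
{
    /* Quarantine the misbehaving driver instead of panicking the kernel. */
    fprintf(stderr, "driver fault contained: %s\n", why);
    return -1;
}

int kernel_read_via_driver(driver_op_t *drv, void *buf, size_t len)
{
    /* Defensive checks: never assume the driver registered sane values. */
    if (drv == NULL || drv->read == NULL)
        return handle_bad_driver("missing operation table");
    if (buf == NULL || len == 0 || len > KERNEL_BUF_LEN || len > drv->max_transfer)
        return handle_bad_driver("bogus transfer request");

    int n = drv->read(buf, len);

    /* Validate the result too: a broken driver may claim it wrote more
     * bytes than the buffer holds. Contain that instead of propagating it. */
    if (n < 0 || (size_t)n > len)
        return handle_bad_driver("driver reported an impossible byte count");
    return n;
}

/* A well-behaved demonstration driver. */
static int demo_read(void *buf, size_t len)
{
    memset(buf, 0xAB, len);
    return (int)len;
}
driver_op_t demo_driver = { demo_read, KERNEL_BUF_LEN };
```

The point isn't the specific checks; it's that every value crossing the driver boundary gets validated on the way in and on the way out, so one bad driver can't take the kernel down with it.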

Current opinion seems to be that Vista will be the last of Windows as we know it, with something entirely new replacing it and current Windows apps being run in a 'Classic' environment. Sounds familiar, doesn't it? Apple bit this bullet 6 years ago and Microsoft will have to as well, possibly before the decade is out.
 
I don't doubt you're a good programmer, but I do think you might underestimate how large the Windows project is. Even M$ isn't doing it with ease, lol.
At least Apple borrowed a big, nice chunk of code from BSD to start with.
 
That was Microsoft's admirable intention originally; however, I think they ran into far more obstacles than they ever imagined and began to borrow more and more from their existing Windows model.
 
I don't doubt you're a good programmer, but I do think you might underestimate how large the Windows project is. Even M$ isn't doing it with ease, lol.

Ahh, let me clear up what I meant :D

Microsoft's programmers were lax when much of the architecture of Windows was built because they could afford to be at the time. Software's organic - you make some structural decisions to start with, and grow it from there. During this time, the 'theory' of computing moves on... new practices, tools, architectures arise. Old ways get revised and improved. Goals change. To a significant extent, how easily you can adopt and take advantage of these progressions depends on the flexibility and strength of your initial structure.

Arguably, Windows' initial structure back in the eighties wasn't particularly cutting-edge even at the time. There were already (relatively) secure multi-user Unix systems around. Even OS X's favoured language, Objective-C, was already kicking around, marrying C with the object-oriented ideas of Xerox PARC's Smalltalk, and would soon underpin NeXT's NEXTSTEP platform with its quite splendid object-oriented design patterns. Windows was a monolithic C (or C++) system, and it was already beginning to accumulate the cruft of backwards compatibility at the API level.

The key thing is: you can't mask a weak design by piling more code on top. You can band-aid the thing together so that to all appearances it seems good and stable, but you eventually get mired in the complexity this creates. The long, expensive development period of Vista bears this out. Can you hand-on-heart say that the Vista we're about to receive is what you'd expect from a corporation with all the resources of Microsoft after 6 years of development? For comparison, look at what Apple (a company with far fewer resources, in both wealth and manpower) managed between the OS X Public Beta in 2000 and the upcoming Leopard.

Microsoft's Vista team are going through every developer's nightmare: trying to build something new and exciting on top of millions of lines of existing code, some of it well into its second decade of life, whose original writers may have long since quit, leaving a tangle of ill-documented and interdependent code. Imagine you're trying to implement "New Feature X", which uses some other, much older code that has a bug in it. For your new feature to work, that bug would need to be fixed. But if you change that code, you might break another team's code... and if they fixed their code to match, that could break yet another team's code. Mix that in with the Byzantine bureaucracy that Microsoft seems to have grown, and you're all set up for stagnation.
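That "fix the bug, break the other team" trap can be sketched in a few lines of C. In this invented example, `legacy_count_items` has a long-standing off-by-one bug, an old caller has baked a workaround around it, and the new feature ends up copying the workaround rather than fixing the shared code. All names are hypothetical.

```c
/* Hypothetical legacy routine: counts the items in a comma-separated
 * list, but has a long-standing bug -- it forgets the item after the
 * last comma, so "a,b,c" yields 2 instead of 3. */
int legacy_count_items(const char *s)
{
    int count = 0;
    for (; *s; s++)
        if (*s == ',')
            count++;
    return count;   /* BUG: should be count + 1 for non-empty input */
}

/* An old caller that quietly adapted to the bug by adding the missing
 * item back itself. It now depends on the bug staying exactly as it is:
 * repair legacy_count_items and this starts over-counting by one. */
int old_team_total(const char *s)
{
    return legacy_count_items(s) + 1;
}

/* The new feature needs the *correct* count. The clean fix -- repairing
 * legacy_count_items -- would break old_team_total, so the new code is
 * forced to duplicate the workaround instead. */
int new_feature_total(const char *s)
{
    return legacy_count_items(s) + 1;
}
```

Multiply that by thousands of interdependent functions and dozens of teams, and you get the paralysis described above: the safe fix is never safe for everyone.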

Have a read of this Vista developer's struggle to simply improve Vista's shut-down user interface. Briefly, out of the years he spent on the Vista team, he managed a total of about 100 lines of code on that feature, purely because of organisational bureaucracy and the pain of working with old, intertwined code.
 
Yeah, I understand: the existing code base is large, and that makes it difficult to build new stuff. What I'm wondering now is: is it time for Windows to be completely rebuilt from scratch?
Just to be practical: OS X started because Apple was losing ground fast enough that they couldn't afford not to do it. They made a gamble, which wouldn't have hurt much even if it hadn't ultimately succeeded. M$ Windows is on top now and shows no sign of decline, so it might be hard for M$ to abandon everything and redo it from scratch.
Maybe you're right that OS X in the future will have more games, etc., and be very promising, but for now I see Apple as very restrictive; OS X is a much more closed environment than I expected before I switched to it.
 
Well, how far? What is the most dramatic progress Linux has made in the past 6 years, in your opinion?
Honestly, I think the strength in Linux continues to be its server applications. Closely behind that is the strength of its community.

For example, the PS3 ships with Linux installed. Linux is the first thing to be ported to most of the game consoles out there. Google is built, from its cornerstone, on Linux. Corporations have deployed Linux site-wide in many places. Entire governments have adopted Linux.

Then there's the OLPC project that would be financially impossible w/out a free OS like Linux.

I'm not a primary user of Linux, myself, so I can't speak much of its individual gains over the last six years, but the sheer adoption rate is incredible. I had my first taste of Linux in '98, using an old version of RedHat (5.0, I think) and, since then, it's grown beyond anything I ever expected to see.
 

Well, I don't disagree. It's funny that a while ago some people thought Linux should give up on the desktop and focus on servers, and that's exactly what Red Hat did.

I think the major improvement in Linux that attracts ordinary users is that it now supports more and more hardware, which at least makes it possible to use it for everyday work without worrying about installation and hardware compatibility.

Anyway, off topic.
 
When a new Windows comes out, users always ask "what architectural improvements are there?" rather than "what small new features are there?".
Actually, when a new Windows version comes out, I just think about how I'm practically forced to upgrade. Never have I upgraded Windows for any reason other than I had to. And I highly doubt Windows users think about under-the-hood changes. The average Windows user is using Windows because they don't know that there are alternatives.
 
Yeah, I understand: the existing code base is large, and that makes it difficult to build new stuff. What I'm wondering now is: is it time for Windows to be completely rebuilt from scratch?
Just to be practical: OS X started because Apple was losing ground fast enough that they couldn't afford not to do it. They made a gamble, which wouldn't have hurt much even if it hadn't ultimately succeeded. M$ Windows is on top now and shows no sign of decline, so it might be hard for M$ to abandon everything and redo it from scratch.

They'll have to do it for the same technical reasons that Apple did. Apple were lumbered with an OS that couldn't keep up with the progress of operating system technology. Pre-emptive multitasking, proper memory protection: all par-for-the-course for operating systems in the 1990s. Mac OS 9 couldn't do it. No matter how much time and talent Apple's developers had, it simply wasn't possible. Apple's mistake at the time was that they didn't do it soon enough. By 1995, Windows 95 had multitasking and memory management that the 'Classic' Mac OS couldn't match. Sure, the Mac was arguably the nicer OS to use, but the scope for progression was limited.

A stagnant OS contributes to a bleed-off of developers. Apple suffered massively because their OS made application development tougher. As apps wanted to do new things, the OS wasn't progressing to support them. Could you imagine an application like Aperture running on the Classic MacOS? Perhaps, with a Herculean amount of work, a rough approximation of it could exist for OS9, but it'd be more effort than it's worth.

Unless Windows progresses, the same fate awaits it. In Microsoft's favour, the .NET Framework is already a move to disengage applications from relying on the underlying old operating system, so re-implementing .NET atop a new OS is a distinct possibility, since .NET apps would run just fine.
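The insulation idea can be sketched in C: the application codes against a small, stable interface, and the OS backend underneath can be swapped without the app noticing. This is only an illustration of the principle, not how .NET is actually implemented; all names are hypothetical.

```c
#include <stdio.h>

/* Hypothetical platform-abstraction layer: the application only ever
 * sees this table of operations, never the OS beneath it. */
typedef struct {
    const char *name;
    int (*write_line)(const char *msg);
} os_layer_t;

/* Backend for the 'old' OS. */
static int old_os_write(const char *msg) { return printf("[old] %s\n", msg); }
/* Backend for a hypothetical 'new' OS -- the app never notices the swap. */
static int new_os_write(const char *msg) { return printf("[new] %s\n", msg); }

const os_layer_t old_os = { "old", old_os_write };
const os_layer_t new_os = { "new", new_os_write };

/* The 'application': written once against the abstraction, it runs
 * unchanged on whichever backend it is handed. */
int app_main(const os_layer_t *os)
{
    return os->write_line("hello from the app");
}
```

Apps that talk only to the abstraction layer survive the OS underneath being replaced, which is exactly what would make a clean-slate Windows plausible for .NET software.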

Windows also needs to progress for a couple of 'selfish' reasons for Microsoft. Firstly, other teams at MS want to produce great apps. They want new capabilities from the OS. What could Office turn into if Windows had things like Leopard's CoreAnimation and CoreData? Secondly, they need to keep attracting top-notch developers. If a coder's smart enough to make sense of the tangled mass of the current Windows code, they're going to want to be working on something amazing, not fighting with code complexity for years only to end up coding a couple of hundred lines on a shut-down feature (like the guy I linked to in my other post .... who now works at Google).

The Microsoft insider who writes the Mini-Microsoft blog, campaigning for radical change at MS, mentioned a joke that's going around among some of the Microsoft programmers:

"What's the difference between OS X and Vista?
Microsoft employees are excited about OS X..."

Maybe you're right that OS X in the future will have more games, etc., and be very promising, but for now I see Apple as very restrictive; OS X is a much more closed environment than I expected before I switched to it.

OS X doesn't need games. We can (and, indeed, I do) run XP under Bootcamp for gaming, and I'm perfectly happy with that. Other than that, I can't think of a single thing I'd rather use Windows for. I find there's more solid, useful software that works well on OS X than there is for Windows. The amount of quality 3rd-party software seems higher on OS X than on Windows. It's almost painful wading through the oceans of crappy software for Windows when I'm trying to find something to perform a specific task. If you're interested, I can reel off a list of software I use on my Mac for which there is no Windows equivalent of the same quality, integration and feature-set.

As for restrictions, can you mention anything specific?
 
I avoided the question because I don't feel like answering it. Clear?
I'm glad you think I'm in my teens, which I'm not. Sorry, I don't think it's necessary to talk about that.
About your last question: I can't speak for all users, but in my experience, Windows 98 introduced the FAT32 file system and ActiveX; Windows 2000 introduced the NTFS file system; Windows XP unfortunately really wasn't a big improvement over 2000; Vista, I don't know yet.

If you don't feel it's appropriate to state what kind of scientist you are, you should have never mentioned being one in the first place.

I highly doubt you're a scientist.
 
OS X doesn't need games. We can (and, indeed, I do) run XP under Bootcamp for gaming, and I'm perfectly happy with that.
As for restrictions, can you mention anything specific?
I find it strange that you think OS X can just say to its users, "use Boot Camp and Windows to play games". Just strange. If users spend money on OS X, they shouldn't be asked to spend money on another OS. After all, they're both OSes.
Restrictions: I just have a general feeling, and I wouldn't say my feeling is perfectly right, but let me try to list some.
1. OS X is tied to Apple hardware and not open to 3rd-party hardware producers. Users can't build their own hardware to run OS X, so there's a lack of variety to meet the different demands of different users.
2. The way Apple handled the code of KHTML and WebKit; I heard the KHTML developers weren't happy with it.
I can say nothing about OS X's core, since I really don't know that much about it.
 
If you don't feel it's appropriate to state what kind of scientist you are, you should have never mentioned being one in the first place.

I highly doubt you're a scientist.

Oh, I bet you highly doubt everything I said, since I'm not saying anything you like or enjoy.
 
What I'm wondering now is: is it time for Windows to be completely rebuilt from scratch?

This was Microsoft's plan originally for Longhorn, but it just didn't happen.
It was taking far too long to even get off the ground, so they were forced to go with the old code... and they are STILL 2 or 3 years behind their originally planned release date.
 
No, actually, the original Longhorn was supposed to be a minor transition between XP and "Windows Vienna", although later they changed plans to make it more "major" than a transitional product.
Although M$ stated that Vienna would be a "complete departure from the way we have typically thought about interacting with a computer", they didn't say they would completely rewrite the code. Which is maybe not a bad idea.

Refs:
http://en.wikipedia.org/wiki/Windows_Vista
http://en.wikipedia.org/wiki/Windows_"Vienna"
 
I find it strange that you think OS X can just say to its users, "use Boot Camp and Windows to play games". Just strange. If users spend money on OS X, they shouldn't be asked to spend money on another OS. After all, they're both OSes.

I'm not saying that at all.

As many have said in this thread, there is absolutely, positively no technical reason why OS X isn't suitable for game development. None. The little World of Warcraft icon in my dock is just one proof of this.

I'm simply dealing with the reality: game developers code for Windows and consoles. The only 'quick' way I can see to change this is if Apple somehow licensed DirectX from Microsoft and ported it to run atop OS X. That's something Microsoft wouldn't agree to, and Apple wouldn't see the profit in doing. So, Bootcamp is what I use. I get to use my Mac's hardware for gaming, which is something it's very capable of, and it also means my XP installation can be 100% geared for gaming: all the optimisations I can think of, and not a single extra application installed. It's quite nice how stable XP can be when it's used solely for gaming :)

I bought my copy of XP a few years ago and simply transferred it to the Mac before retiring the PC to the attic. If you buy a retail copy of Windows, it eventually pays for itself after a few years of moving it between machines (assuming Vista's EULA permits this....). Of course, I've had to phone Microsoft a couple of times to explain what I was doing, but they never argued with me exercising my right to use the OS on any one computer. Of course, if you buy a PC with Windows included, you're screwed because your license only allows you to run it on that computer. If the machine dies, your copy of Windows dies with it (woohoo for OEM licenses! :( )

Restrictions: I just have a general feeling, and I wouldn't say my feeling is perfectly right, but let me try to list some.
1. OS X is tied to Apple hardware and not open to 3rd-party hardware producers. Users can't build their own hardware to run OS X, so there's a lack of variety to meet the different demands of different users.

Ah, now, to be fair, you knew that before you switched ;) There will of course be users for whom this is a genuine problem. If you're a 'build-it-from-scratch' kind of person (which I was when I had a PC) it'll be an issue. Personally, I found that having the ability to swap individual components rapidly became less of a benefit than it seemed.

For example, my last PC was a Socket 754 AMD Athlon 64. I bought the motherboard, RAM, CPU and an AGP graphics card. I used that system for about 18 months, and I bought it just when the 754 boards came out. I thought I had a fine upgrade path: plenty of scope for higher-clocked CPUs, the fastest AGP slot I could get, a beautiful motherboard with 5.1 audio and optical out.

Fast-forward 18 months...

I could still put in a much quicker CPU. But for decent gaming I'd need a PCI Express VGA card. But for that, I'd need a new motherboard. And to get the best out of what I'd bought, I'd need faster RAM to go with the new board. Pretty quickly, I was looking at spending a ton of money every year and a bit just to stay at the same relative position compared to the 'state-of-the-art'. The truth is, computer components are so interdependent that the ability to upgrade piecemeal is largely a waste of time.

The Mac has fewer upgradeable components, but the parts where an upgrade would genuinely be useful are still changeable. More RAM, bigger hard disks... My MacBook Pro has an external hard drive powered from the FireWire port, so I keep all the benefits of portability whilst having plenty of space for my downloads from Steam :)

My old Power PC Mac Mini has a USB audio interface for 5.1 audio and optical input and output. Possibly the best £39 I've spent.

2. The way Apple handled the code of KHTML and WebKit; I heard the KHTML developers weren't happy with it.

Yep, there was certainly some friction there; Apple no doubt stepped on some toes. However, there's still plenty of two-way traffic. Check out the Unity project (some info here), where open-source developers are seriously working on integrating WebKit back into the Qt toolkit... 3-4 developers taking 3 days to get a proof-of-concept working suggests that Apple's WebKit code has been kept clean and portable enough for the Open Source community to take advantage of.

I can say nothing about its core, since I really don't know that much about it.

Overall, I'd say it's worth exploring the things your Mac will let you do, rather than worrying about the things it can't if they don't affect your ability to do awesome stuff with it :)
 
Oh, I bet you highly doubt everything I said, since I'm not saying anything you like or enjoy.

You're saying things I don't agree with.

And you really, REALLY sound like a pathetic teenager with your ranting. I've seen people claim to be scientists or experts just to make themselves sound smart, yet once asked to actually back up their claims, they fall flat on their faces. Don't go mentioning **** you refuse to elaborate on, especially if you deem the statement irrelevant when it's challenged.

Oh yeah, and the ad-hoc defense won't work on me. Sorry.
 