I'd agree 100% that CS should be about learning the concepts and philosophies of computing, not how to use Visual Studio (or any other specific toolset). When I did CS at Edinburgh, we were told that our assignments for our object-oriented programming class could be handed in written in any OO language we wanted, as long as it would compile and run on Solaris. Most chose Java, but the class did not teach any specific language...
Why do I feel like we agree, and yet still aren't talking about the same thing?
No school/program/classroom/professor or subject matter can be taught equally well on every platform and in every language, for practical, logistical reasons.
So tools must be chosen, perhaps only a few, in order to learn the art of Computer Science.
My point, and my argument, is that those tools lean more towards a Windows environment (with a larger push towards Linux of late) than they do towards OSX. Am I mistaken?
So while a fresh graduate may be trained in an art that can be applied using any tool out there, that student will inevitably be more familiar with the tools he learned with than with a brand new set (Xcode/Objective-C).
If someone wanted me to whip out a quick calculator program for them, I would not go to Xcode and Objective-C, sadly, because those are not the tools I am currently familiar with. Could that program be written in any number of IDEs, in any number of languages? Most certainly.
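Just to underline that point, here's a minimal sketch of such a calculator in plain Java (the class name QuickCalc and the prompt format are mine, purely for illustration). Nothing about it cares what IDE or OS you use; javac from any command line is enough. The language is interchangeable; the familiarity is not.

    import java.util.Scanner;

    // Minimal sketch: reads "a op b" from stdin and prints the result.
    // Not tied to any IDE or platform; "javac QuickCalc.java" builds it anywhere.
    public class QuickCalc {
        public static void main(String[] args) {
            Scanner in = new Scanner(System.in);
            System.out.print("Enter: <number> <op> <number> : ");
            double a = in.nextDouble();
            String op = in.next();
            double b = in.nextDouble();
            double result;
            switch (op) {
                case "+": result = a + b; break;
                case "-": result = a - b; break;
                case "*": result = a * b; break;
                case "/": result = a / b; break;
                default:
                    System.out.println("Unknown operator: " + op);
                    return;
            }
            System.out.println(a + " " + op + " " + b + " = " + result);
        }
    }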
I think we all agree on the definition of CS and what the ideal education in such a field would involve. But does anyone disagree with my assessment of the situation as it stands today in the Software Engineering career field?
All of that to ask my question yet again: who is starving/getting paid more, the guys working on the Windows side, or the guys on the OSX side?
Where is there a greater demand for competent knowledge of the tools?
~Earendil