There's a big difference between the kernel and the operating system. Windows 7 already has a ton more user features than Vista brought over XP. Don't get me wrong, I thought 10.3 to 10.4 was a pretty decent jump, but 10.6 just doesn't seem so great unless they really update the Finder. Snow Leopard (as its name implies) should be a reduced-price upgrade.
7 brings more to the table. As will Snow Leopard.
10.6 doesn't seem a great jump now to some, but then neither does 7!
Would Microsoft give 7 out as a reduced-price upgrade?
I think the interesting thing about 10.6 will be that it'll keep getting better with better hardware - less of a drop-off in improved performance as you up the kit.
Of note, roughlydrafted has an interesting article
here, which pulls together a broader view of Apple's goals with PA Semi, Imagination Technologies, iPhones, OS X 10.6 and more - in terms of graphics, among other things. We might not see it until the v4 iPhone, as these are job adverts (we're still waiting to see what the job advert for RF engineering on the GPS antenna brings, a year on, once the v3 iPhone launches, for example).
So in context, there is hiring for tech related to getting something very akin to Snow Leopard doing GPGPU. On a phone. Potentially. All caveats apply, but Apple may push ahead with some of the concurrency stuff, with OpenCL, Grand Central etc. alongside. Let's not spin off into a Microsoft versus Apple debate too much - I think the comparisons between 7 and Snow Leopard are valid, and I'd imagine it'll only get louder on this front - neither is out yet, and both are really only in the hands of developers at this point, with key vaunted technologies missing.
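To make the concurrency angle concrete: the Grand Central idea is roughly "chop the work into small tasks, hand them to a queue, and let the runtime spread them over however many cores you have". Here's a minimal sketch of that pattern using Python's concurrent.futures as a stand-in - an analogy only, not Apple's actual API, and the work function is made up for illustration.

```python
# Rough stand-in for the Grand Central model: submit small tasks to a
# pool and let the runtime schedule them across available cores.
# concurrent.futures is the Python analogue here, not Apple's API.
from concurrent.futures import ThreadPoolExecutor

def work(chunk):
    # Stand-in kernel: sum of squares over one chunk of the data.
    return sum(x * x for x in chunk)

data = list(range(1000))
# Split the job into independent chunks - the part the programmer does;
# the runtime decides where and when each chunk actually runs.
chunks = [data[i:i + 100] for i in range(0, len(data), 100)]

with ThreadPoolExecutor() as pool:
    total = sum(pool.map(work, chunks))

print(total)  # same answer as the serial loop, whatever the core count
```

The point of the model is that the same code scales with the hardware - more cores (or, with OpenCL, GPU compute units) just means more chunks in flight at once, which is why 10.6 might "keep getting better" as you up the kit.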
Edit - I think we need to see the Finder in terms of the times. Apple had a decent patent out around 2000, I believe. And some reviews of it were
fairly complimentary
NVIDIA's Supercomputer in a box
http://www.nvidia.com/object/personal_supercomputing.html
"The NVIDIA® Tesla™ Personal Supercomputer is based on the revolutionary NVIDIA® CUDA™ parallel computing architecture and powered by up to 960 parallel processing cores."
"250x the compute performance of a PC" (though the price is nearing £10k)
Plug a GPU in
Plug a deskside GPU system into your desktop
Plug a 1U server size unit in
Apple working with NVIDIA. NVIDIA making big claims. Apple not making big claims. Wery wery qwuiet
(That is one big-ass protein to be working with - having played with Sybil a bit on a decent desktop by today's standards.)
Could the XServe get a boost? Or have a linkable NVIDIA unit?
NVIDIA's 1U is a 4 Teraflop unit.
Put another way: an HPC that's nearby is ~850 processors, with a peak of ~7 teraflops. It's a few racks of blades for the XC. The GPU route takes less space, though it concentrates the heat issues.
This initial jump to GPGPU might be huge (it could redefine Moore's Law for at least a cycle, as now we're dealing with Moore's Law for two things, CPU and GPU, which are potentially additive).
That could rival a current HPC, e.g. an HP XC cluster.
(Double-precision vs single-precision flops aren't given, so it's a harder comparison than it first looks - I'd imagine the HP XC is used for double precision, given the work done on it. A 96-Opteron HPC would be about equivalent to one of the NVIDIA products in GFLOPS, i.e. 4 Tesla GPUs - a rough 20x reduction in processing units.)
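Spelling out the rough arithmetic above - every input here is one of the post's own estimates (quoted peaks and vendor figures, not benchmarks), so treat the outputs as back-of-envelope only:

```python
# All numbers are the rough figures quoted in the post, not measurements,
# and single- vs double-precision peaks aren't directly comparable.
cluster_procs = 850      # nearby HPC, ~850 processors
cluster_tflops = 7.0     # its quoted peak, in teraflops
tesla_gpus = 4           # GPUs inside NVIDIA's 1U Tesla unit
opteron_procs = 96       # the post's "96 Opteron HPC" comparison point

# Per-processor peak for the cluster, in GFLOPS
per_proc_gflops = cluster_tflops * 1000 / cluster_procs  # ~8.2 GFLOPS

# Processing-unit reduction in the post's 96-Opteron comparison
reduction = opteron_procs / tesla_gpus  # 24x - "a rough 20x" in the post

print(round(per_proc_gflops, 1), reduction)
```

So the "rough 20x" is really 96 / 4 = 24x on the post's own figures; the order of magnitude is the point, not the exact ratio.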
Now I'm not knocking current HPC centres (the above was in the top 200 worldwide), I'm just saying GPU-based HPC is potentially quite close. A place like the EBI has looked at using GPUs, but apparently has no plans yet. And they have a decent HPC going on. Who's going to overtake whom? HP would presumably buy the NVIDIA product and brand it. What's Apple's angle?
As Clay Shirky says, more is different. If you have the money, you could potentially have a supercomputer now, without the hassle of current generations of HPCs. Look at the XServe - people actually moan about how easy it is to use. How they get offended by the simplicity, and ease of use. They moan for complicatedness, in some Calvinistic sado-masochistic way.
Think Apple XServe/Mac Pro, meets the NVIDIA Tesla Supercomputer. That's what Snow Leopard might unleash. It'd shut up some detractors, that's for sure.
Apple might just have a lot of boom for your buck coming up. These HPCs are running Xeons. And those Xeons are about to get a tasty Nehalem boost.
Power - what's the power consumption of the 1U NVIDIA unit? 800W. And it fits on your desk. It would keep your toes warm. And you could hoard its power, rather than rely on distributed power, if you wanted.
Performance per buck? They're saying 100x better.
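Putting the power and price figures quoted above together - again the post's own numbers (4 teraflops peak, 800W, "nearing £10k"), single-precision vendor peaks, so a best case:

```python
# Performance per watt and per pound for the 1U Tesla unit,
# using only the rough figures from the post.
tesla_tflops = 4.0         # quoted peak
tesla_watts = 800.0        # quoted power draw
tesla_price_gbp = 10_000   # "nearing £10k"

gflops_per_watt = tesla_tflops * 1000 / tesla_watts        # 5.0
gflops_per_pound = tesla_tflops * 1000 / tesla_price_gbp   # 0.4

print(gflops_per_watt, gflops_per_pound)
```

5 GFLOPS per watt and 0.4 GFLOPS per pound sterling, on paper - which is the sort of ratio behind the "100x better" performance-per-buck claim, though real workloads will land well below peak.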
And what was the difference for Intel vs PowerPC? Wasn't that large...
Not to get too carried away - there are issues: it's new tech, new implementations, etc. Just a quick shout out to an OP on another thread in the Mac OS forum - sorry if I was a little harsh. For certain things, certain programs, it may well be getting to the levels your article looked at. I think the NVIDIA site may well be testament to that. And to think Apple's bringing this tech to iPhones and other Macs soon... As previously said, some desktop PCs now are better than an original Cray. The next lot might go one further, and have a stab at rivalling current HPC cluster set-ups.
If you crunch numbers, it might just be big.