I'm a bloody programmer. But have you got the GCD choo-choo stamped on your hand yet?

Uh oh, got to go! There's a train coming in, and I've got a thread-pool to catch... To Infinity!

2002cbr600f4i said:
It's not that the app has gotten faster just because of GCD. It's that the OS support the app makes use of has gotten faster because it's been rewritten to use GCD. So, yes, SL can run some things faster without apps having to be rewritten. There is SOME benefit.
Some people wouldn't call this GCD (people other than marketing, that is). Some people would call this fixing your longstanding crappy kernel.

wizard said:
I will have to continue to object to this idea; well-written apps already benefit from SL from what I can see. In some cases I would seriously doubt the developers will do anything more to optimize specifically for SL.
Yep, it is just a marketing invention. OpenCL is the only meat on this carcass.

wizard said:
Well this is certainly true! It will be interesting to see which apps adopt GCD heavily, and more so which apps go one step further and adopt OpenCL.
You'll hardly be able to tell. Any app that would have seriously benefited would already be doing multi-threading. GCD is just the latest Apple gimmick, selling people multi-threading for the 6th or 7th time, just like 10.6 is selling 64-bit for the 4th or 5th time. The GCD technology is so inconsequential that the Windows equivalent, ConcRT, is hardly even talked about.

If Apple really felt GCD was important, and really wanted it adopted by as many developers as possible, they would have backported the Objective-C 2.1 runtime to 10.5, and possibly even 10.4. Then app developers would have no reason NOT to use it. Currently, if you want to write apps that can be used by the entire Mac community and have hopes for portability to Win or X, you don't touch GCD.
 
Things to consider.

I'd like to see figures for the independent contributions of:
- Snow Leopard running non-optimised code
I'm not sure this would be all that valid, because SL so improves threading support that many apps are running on top of brand-new code. So old, unoptimized programs are in many ways running optimized code from libraries and are being served by improved thread management.
- SL with OpenCL
There are significant limitations on what OpenCL can accelerate. Unless you have a specific need, there is little reason to focus on OpenCL. Plus, the numbers that you get are highly dependent on the actual GPU in the system.
- SL with GCD
SL really doesn't come without GCD. In fact, libdispatch and the kernel work together very tightly. From what I can see, the major reason to get SL out the door was to get GCD out into the wild; GCD is the whole point of the SL release.
- SL with both optimisations
Both GCD and OpenCL ship with SL. About the only thing you could do would be to test hardware with and without OpenCL-capable hardware installed. The thing is, certain parts of SL, such as Core Image, do make use of OpenCL and compatible GPUs.
Wonder if there's more detailed information anywhere?

Well, not exactly what you are asking for, but there is some interesting benchmarking out there. Some apps are showing impressive speed-ups from simply running under SL. There are also regressions, so Apple has work to do.

While I don't benchmark, I've noticed that some apps are indeed much snappier under SL. There are likely several explanations here, so I'm not even going to try to explain what may be happening. I like to describe it as getting a new, faster machine for all of $30.


Dave

PS

There are glitches with SL too, but you really can't focus too much on them. I expect they will be fixed in time. It is more important in my mind that they have fixed numerous issues with Leopard that had nothing to do with GCD or OpenCL; Snow Leopard is a big win for Apple.
 
It is sad that you have such a negative view of reality.

But have you got the GCD choo-choo stamped on your hand yet?

Uh oh, got to go! There's a train coming in, and I've got a thread-pool to catch... To Infinity!
Like it or not, for some people catching that train will be really important to staying in business.
Some people wouldn't call this GCD (people other than marketing, that is). Some people would call this fixing your longstanding crappy kernel.
And some people are totally ignorant of what is being discussed here. When GCD gets up and running on Linux, are you going to accuse them of having a crappy kernel?
Yep, it is just a marketing invention. OpenCL is the only meat on this carcass.
A little cranky today. OpenCL is pretty impressive, and is currently the only platform-independent way, with a chance at wide-scale acceptance, to harness the power of GPUs for non-graphics work.
You'll hardly be able to tell. Any app that would have seriously benefited would already be doing multi-threading.
I hear this all the time, and each time I hear it, it is still wrong. It all depends upon the application and the programmer's ability to extract parallel operation out of it.
GCD is just the latest Apple gimmick, selling people multi-threading for the 6th or 7th time, just like 10.6 is selling 64-bit for the 4th or 5th time. The GCD technology is so inconsequential that the Windows equivalent, ConcRT, is hardly even talked about.
It is something for developers to discuss, so obviously you would not hear about it. However, it is of far more interest to Apple developers due to the types of software found on Apple's platforms.
If Apple really felt GCD was important, and really wanted it adopted by as many developers as possible, they would have backported the Objective-C 2.1 runtime to 10.5, and possibly even 10.4. Then app developers would have no reason NOT to use it. Currently, if you want to write apps that can be used by the entire Mac community and have hopes for portability to Win or X, you don't touch GCD.
Let's face it, anybody running 10.4 isn't going to be taking on modern software, and thus none of this means anything to them. As to 10.5, why do you think the SL update is so cheap?

Besides, if you backported all this, what would you have? Snow Leopard, obviously. Your suggestion, like the majority of your posts, has no merit.


Dave
 
Like it or not, for some people catching that train will be really important to staying in business.
You mean the gimmick business? I concede that point.

And some people are totally ignorant of what is being discussed here. When GCD gets up and running on Linux, are you going to accuse them of having a crappy kernel?
Neither Linux nor Windows needs to have its thread management totally revamped in order to accommodate a high-thread-count environment.

A little cranky today.
Not really... It just seems that way because I don't work for Apple's PR dept.

I hear this all the time, and each time I hear it, it is still wrong. It all depends upon the application and the programmer's ability to extract parallel operation out of it.
Pardon the pun, but this is a train that never seems to get to the station. Programs that benefit from threads, like HandBrake, already use threading. They won't gain from GCD, except to the degree that they were suffering from OS X's bad kernel scheduler. That scheduler has been showing its warts to some degree since the Quad G5, and greatly since the OctoPro.

It is something for developers to discuss, so obviously you would not hear about it. However, it is of far more interest to Apple developers due to the types of software found on Apple's platforms.
It seems to be of more interest to disciples than developers.

Let's face it, anybody running 10.4 isn't going to be taking on modern software, and thus none of this means anything to them. As to 10.5, why do you think the SL update is so cheap?
10.4 was what you bought on a Mac less than two years ago, say, an 8-core Mac Pro. That isn't 'modern'? SL is cheap because Apple needs people to update to keep the upgrade treadmill alive. If people stop updating, Apple can't keep obsoleting older OSes and older systems, and then the HARDWARE sales machine breaks down. SL is cheap because they have got to find a way to kill the value of previous Macs.

Besides, if you backported all this, what would you have? Snow Leopard, obviously.
Some people would call that a service pack.

You'd also have 100% deployment of GCD, which, if it were really so important, Apple would have done, so that developers could TRULY use it without reservation.

Your suggestion, like the majority of your posts, has no merit.
Maybe we should box those posts up and sell them for $29.
 
You'd also have 100% deployment of GCD, which, if it were really so important, Apple would have done, so that developers could TRULY use it without reservation.

Maybe we should box those posts up and sell them for $29.

Just like how MS said that DirectX 10 was so important... So important that they didn't bother to backport it to Windows XP in order to maximize its penetration...

As a result, most games are still written for DX9, with some having DX10 support. But DX10 has largely been a huge failure because MS tried to use it as a mechanism to force people onto Vista, without demonstrating why DX10 was so superior to DX9...

Now, some would argue the same thing is true with Apple and GCD and Leopard vs. Snow Leopard....
 
Just like how MS said that DirectX 10 was so important... So important that they didn't bother to backport it to Windows XP in order to maximize its penetration...

As a result, most games are still written for DX9, with some having DX10 support. But DX10 has largely been a huge failure because MS tried to use it as a mechanism to force people onto Vista, without demonstrating why DX10 was so superior to DX9...

Now, some would argue the same thing is true with Apple and GCD and Leopard vs. Snow Leopard....

True, it would be nice if Apple backported basic GCD-style thread pooling to 10.5/10.4, even if it had to create local thread pools for every application and offered no speed advantage on those platforms. Porting the GCD-style syntax (again, with local thread pools) to the Windows and Linux versions of GCC would be even better.

I wouldn't doubt GCD gets ported to more architectures and OS versions.
 
Like it or not, for some people catching that train will be really important to staying in business.

I'm sorry. If you needed your application to be multi-threaded to stay in business, it would have already been done.

And some people are totally ignorant of what is being discussed here. When GCD gets up and running on Linux, are you going to accuse them of having a crappy kernel?

Process scheduling is HARD. Linux has wrestled with it for years to find a balance between being user/desktop responsive and server responsive. The scheduler in OS X is pretty poor, though, and hasn't seemed to get much better across OS releases.

I hear this all the time, and each time I hear it, it is still wrong. It all depends upon the application and the programmer's ability to extract parallel operation out of it.

And here's the kicker: most applications have no parallel optimizations available. Server applications benefit, batch-processing apps benefit, but the typical GUI app is constrained not by the machine but by how fast the user can process and respond to the information being displayed. If run-of-the-mill GUI apps suddenly get a boost, it's because Apple is using GCD as a proxy to finally fix their scheduler. Other OSes' schedulers have already managed this (spreading processes across unused cores) for years, though.

It is something for developers to discuss, so obviously you would not hear about it. However, it is of far more interest to Apple developers due to the types of software found on Apple's platforms.

Do tell, what types of software are on Apple's platforms that are not on others? I would argue that other platforms have more use for GCD or something similar than Apple does, simply because both Linux and Windows have a much larger market share in the server arena. Servers are where you could really see something like GCD shine; that's when you get into 16+ core machines.

Years ago I wrote a multi-threaded server process and spent a lot of time optimizing its thread usage for a dual-proc machine (the best at the time). When we upgraded to quad procs, I had to re-optimize. GCD hopefully removes the optimization part, and that is where I think there is a big win.
 
True, it would be nice if Apple backported basic GCD-style thread pooling to 10.5/10.4, even if it had to create local thread pools for every application and offered no speed advantage on those platforms. Porting the GCD-style syntax (again, with local thread pools) to the Windows and Linux versions of GCC would be even better.

I wouldn't doubt GCD gets ported to more architectures and OS versions.

Actually, Apple HAS submitted their concept of blocks for addition to the C/C++/Obj-C standards...

They've also put out a version of GCD to be used by other OSes.

LLVM, the compiler they're using, is also freely available IIRC.

So, everything you need to implement and use GCD, plus the coding facilities involved with it, IS available to be added to other platforms. It's up to those other platforms to decide if they want to use it or not.
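
For anyone who hasn't seen the syntax, blocks are essentially closures bolted onto C. A minimal sketch, assuming a compiler with blocks support (clang's -fblocks flag; link with -lBlocksRuntime outside of OS X):

```c
#include <stdio.h>

int main(void) {
    int captured = 42;

    // A block literal: the ^ introduces it, and it captures
    // 'captured' by value from the enclosing scope.
    void (^greet)(void) = ^{
        printf("the block captured %d\n", captured);
    };

    greet();  // invoked like a function pointer
    return 0;
}
```

It's exactly this syntax that GCD's C API accepts, which is why the runtime support matters for adoption.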
 
You (and most other people) need to realize that there is a relatively small set of computations that can be accelerated by this kind of technology. Of course certain types of video, image, and sound processing will work, but your run-of-the-mill Mac app isn't going to be able to take advantage of GCD or OCL.

This is patently untrue. Many, many common operations could be easily multithreaded if the cost -- in terms of development technique and setup/teardown of new threads -- was not so prohibitive. That's the problem GCD aims to solve.

Consider a typical function that performs some sort of IO, does a set of 5 or more independent operations (such as applying filters or validation), and then records the result of those operations. Ordinarily, we perform these three stages in serial, and once performance becomes an issue, we go back and re-implement one of the stages (usually the IO) in such a way that it populates a synchronized "queue" of results, which can be handled by another worker thread. Rarely do developers go more than three threads deep, because it's difficult to think about three things happening in parallel. GCD purports to change that by allowing developers to state parallelizable blocks in a way that's just as natural as a for loop. Assuming you had enough CPUs, you could launch all 5 independent operations at the same time on separate threads, and coordinate the result when they all completed. Of course, if you only have one CPU -- what is this, Russia? -- the code executes the same as it did before.
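
To make that concrete, here's a minimal sketch of that fan-out/join using GCD's C API; the apply_filter() function standing in for the five independent operations is hypothetical:

```c
#include <dispatch/dispatch.h>
#include <stdio.h>

// Hypothetical stand-in for one of the independent operations.
static void apply_filter(int which) {
    printf("filter %d done\n", which);
}

int main(void) {
    dispatch_queue_t q =
        dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);
    dispatch_group_t group = dispatch_group_create();

    // Launch all five operations at once; GCD decides how many
    // actually run in parallel based on the available cores.
    for (int i = 0; i < 5; i++)
        dispatch_group_async(group, q, ^{ apply_filter(i); });

    // Coordinate the result once they have all completed.
    dispatch_group_wait(group, DISPATCH_TIME_FOREVER);
    dispatch_release(group);
    return 0;
}
```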

Another example is event multicasting (the issuance of requests to handle an event to more than one registered recipient). Typically, multicasting is also performed in serial. However, it's rare that multicast clients are dependent on each other's results, in part because the order of dispatch is often nondeterministic, but mostly because the whole idea of event multicasting is to give the illusion to the user that many things are happening at the same time. Consider a case where clicking a button should result in several UI mutations (such as the button being greyed or the cursor changing state) as well as one or more independent background operations. No developer is going to manage a thread stack to perform such trivial operations, but every developer will need to write a block of code to manage them. If the act of turning this block over to GCD is as simple as writing the block a little differently and brokering a call to GCD, the barrier to multithreading is eliminated.
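
A sketch of what that might look like, with registered handlers fired asynchronously instead of in a serial loop (handler_t and the handlers array are illustrative, not a real API):

```c
#include <dispatch/dispatch.h>
#include <stddef.h>

typedef void (^handler_t)(void);

// Dispatch each registered handler without waiting for the others;
// safe precisely because multicast clients rarely depend on each
// other's results.
static void multicast(handler_t handlers[], size_t count) {
    dispatch_queue_t q =
        dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);
    for (size_t i = 0; i < count; i++)
        dispatch_async(q, handlers[i]);
}
```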

What about logging? Every application logs something somewhere, and most of them do it in serial too. I recently wrote a log adapter using plain old non-GCD threads and it boosted our app's performance by about 20%.
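
For what it's worth, GCD makes the same trick nearly free: hand each log line to a private serial queue so callers never block on the write. A sketch (the queue label is made up):

```c
#include <dispatch/dispatch.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

static dispatch_queue_t log_q;

static void log_line(const char *msg) {
    char *copy = strdup(msg);           // the block outlives the caller's buffer
    dispatch_async(log_q, ^{
        fprintf(stderr, "%s\n", copy);  // writes happen one at a time, in order
        free(copy);
    });
}

int main(void) {
    log_q = dispatch_queue_create("com.example.logger", NULL); // NULL = serial
    log_line("hello from the background logger");
    dispatch_sync(log_q, ^{});          // crude flush before exit (demo only)
    dispatch_release(log_q);
    return 0;
}
```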

As for OCL -- I've read papers claiming benefit from using the GPU to handle such basic tasks as string operations and node traversals. Any task where the number of operations exceeds the size of the data they're operating on can be pushed onto the GPU for a performance bump.

Most of the time, multithreading and GPU computing are ignored in an application's development because developers make the same assumption you do: that the average operation isn't likely to benefit from them. This is done because performing this kind of optimization when it isn't needed is costly and error-prone -- "Do not preoptimize" is the motto of any developer whose tasks are more complex than Hello World. The beauty of OCL and GCD is that they allow developers to make relatively minor changes to the way they write code that vastly improve performance before even the first pass of optimization occurs.
 
This is patently untrue. Many, many common operations could be easily multithreaded if the cost -- in terms of development technique and setup/teardown of new threads -- was not so prohibitive. That's the problem GCD aims to solve.

That particular problem was already solved... by OpenMP, which has been in Xcode with the GCC 4.2 compiler. GCD might be based on OpenMP, and that may be why Apple is being so 'generous' and releasing their modifications as GCD.
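
For comparison, the OpenMP version of "parallelizable blocks as natural as a for loop" is a single pragma. A minimal sketch (compile with gcc -fopenmp -std=gnu99):

```c
#include <stdio.h>

int main(void) {
    int results[8];

    // Each iteration is independent, so OpenMP may run them on
    // separate threads; the loop joins before execution continues.
    #pragma omp parallel for
    for (int i = 0; i < 8; i++)
        results[i] = i * i;

    for (int i = 0; i < 8; i++)
        printf("%d ", results[i]);
    printf("\n");
    return 0;
}
```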

The real problem (which neither GCD nor OpenMP solves) is that these little chunks of so-called parallel code are either embarrassingly parallel, and thus already being threaded, or they are not, and the reason is that they are not that parallel. In fact, most operations DO depend on the data in other elements or on other steps in a series. Those that don't often still need some kind of locking or synchronization before data can be read from or written back to the application's main data structures. The best case is that you can clone the data structure, do the work, and then substitute the new data structure for the old in one lock/store. Your overhead in that case is a read lock, the memory copy, and a write lock. That's the best case. Anything else is going to be lock-intensive, unless you just want to freeze access to that data for all other threads for the duration of the process. That would mean your GUI can't display the data until it comes out of the lock.

Locking, synchronization, overhead, serial code. Amdahl's Law.
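
For reference, Amdahl's Law is the ceiling on all of this. If a fraction p of a program parallelizes perfectly across N cores, the best possible speedup is

    S(N) = \frac{1}{(1 - p) + \frac{p}{N}}

so code that is even 90% parallel tops out at a 10x speedup, no matter how many cores you add.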
 
Those that don't often still need some kind of locking or synchronization before data can be read from or written back to the application's main data structures.

The best case is that you can clone the data structure, do the work, and then substitute the new data structure for the old in one lock/store. Your overhead in that case is a read lock, the memory copy, and a write lock.

If you can simply overwrite the main structure, why was a lock needed in the first place?

Perhaps a better example is where the main structure needs to be updated, not overwritten.

One good example of this is when the application has global counters (for performance data or other reasons).

Instead of locking/update/unlock every time something needs to be counted, one can have thread-local counters that are updated without synchronization. When the thread ends, the lock/update/unlock can be done once to the global counter.
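
A quick sketch of that pattern with plain pthreads (the names are illustrative):

```c
#include <pthread.h>
#include <stdio.h>

static long global_count = 0;
static pthread_mutex_t count_lock = PTHREAD_MUTEX_INITIALIZER;

static void *worker(void *arg) {
    long local_count = 0;              // thread-local: no synchronization

    for (int i = 0; i < 1000000; i++)
        local_count++;                 // the hot path stays lock-free

    // One lock/update/unlock per thread, not per increment.
    pthread_mutex_lock(&count_lock);
    global_count += local_count;
    pthread_mutex_unlock(&count_lock);
    return NULL;
}

int main(void) {
    pthread_t t[4];
    for (int i = 0; i < 4; i++)
        pthread_create(&t[i], NULL, worker, NULL);
    for (int i = 0; i < 4; i++)
        pthread_join(t[i], NULL);
    printf("total = %ld\n", global_count);
    return 0;
}
```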
 
Why doesn't OpenCL support the ATI HD 3870? It's got the hardware, so why isn't Apple giving us the software? I remember Steve Jobs once said 'but because we [Apple] believe in choice....'

Well, consumers believe in choice more so than you, Steve, and some of us paid good money for an OpenCL capable card.
 
These cards were sold to you for rendering video... they just aren't capable of doing the calculations. It was normal for the engine of a graphics card not to support double-precision floating point, or to bastardize IEEE 754 floating-point numbers; they were made to render graphics fast. If you wanted to be able to do math calculations on your card, then you should have made sure to get a CUDA-capable graphics card.

There isn't anything anyone can do... as far as I can see, there are valid reasons for every unsupported graphics chipset, which is most of them.

They are capable... I can tell you that much.
What do you think rendering video involves? MATHS!

Your card was designed for rendering video... but Apple could have made a driver that allowed it to work like a backup CPU. They were just laaaazy and have a really bad relationship with ATI nowadays.
 
Why doesn't OpenCL support the ATI HD 3870? It's got the hardware, so why isn't Apple giving us the software? I remember Steve Jobs once said 'but because we [Apple] believe in choice....'

Well, consumers believe in choice more so than you, Steve, and some of us paid good money for an OpenCL capable card.

I'm sorry, but I call BS here. I'm betting you bought that card LONG before OpenCL was announced! And if you didn't, well, why did you buy that card without knowing if it would be supported???
 
I guess you're making the mistake of thinking they dropped PPC to save time. No, they dropped PPC to screw customers. And that's what they are doing to you, too. This won't stop until customers stop excusing Apple for this kind of behavior. Of course, they dropped 'Computer' from their name because they make toys now.

That makes you a poor customer. :D It used to be that you bought a Mac and got long life out of it. Now, you buy a Mac. And then you buy another one within two years, or you are a worthless Apple person.

Next time, don't give Apple your money unless they promise you a certain number of years of support.
Umm, my (early 2006) Mini runs Snow Leopard just fine... it's over 3 years old now. Apple doesn't support the hardware (out of warranty, or even AppleCare), but the software is current, so I don't see how I wouldn't be supported that way. With the current timetable on the OS, I can see it running for another 2-3 years before the next iteration of OS X requires a dedicated GPU. That's 6 years on my computer - far more than a run-of-the-mill PC.

My eMac bought in 2002 was able to run everything up to Tiger, so easily 5 years of software support. (Tiger was succeeded by Leopard in 2007.) It's not a promise, but with technology moving as fast as it does, 5 years is an eternity.
 
I'm sorry, but I call BS here. I'm betting you bought that card LONG before OpenCL was announced! And if you didn't, well, why did you buy that card without knowing if it would be supported???

OpenCL was announced some years ago; it's only just hit a stable release! It was DEVELOPED using older cards...

Apple's just been MEGA lazy with the drivers!!!
We're talking 3.5 years of computers... they've chosen not to support two VERY POPULAR ATI cards (which basically covers everybody who doesn't have the latest round of MBPs or Mac Pros).

It's just a grudge match against ATI... Apple ALWAYS does this...
OS X OpenGL - didn't support the older ATI cards (OS 9 OpenGL did, though; Apple got sued big time for that one because they'd sold a bunch of iMacs to a HUGE law firm... claiming they were OS X capable as they had G3s)
Quartz Extreme - didn't support the Rage Pro...
Quartz Extreme (when updated) - didn't support the GeForce 2
OpenCL - doesn't support computers more than 2 years old (although it was drafted out and announced before then)

---

It's Apple's tactic. Most Macs have built-in graphics cards and you CAN'T get these new features without buying a whole new computer.

Windows is different... you can generally upgrade your graphics card even if you have a cheap computer. And... Windows... tends to support pretty well every card (out of a MUCH larger pool than Apple's offering)... Apple's restricted OS X to the last 3 years of computers produced and STILL can't support every computer. It's a farce!
 
OpenCL was announced some years ago; it's only just hit a stable release! It was DEVELOPED using older cards...

Apple's just been MEGA lazy with the drivers!!!
We're talking 3.5 years of computers... they've chosen not to support two VERY POPULAR ATI cards (which basically covers everybody who doesn't have the latest round of MBPs or Mac Pros).

It's just a grudge match against ATI... Apple ALWAYS does this...
OS X OpenGL - didn't support the older ATI cards (OS 9 OpenGL did, though; Apple got sued big time for that one because they'd sold a bunch of iMacs to a HUGE law firm... claiming they were OS X capable as they had G3s)
Quartz Extreme - didn't support the Rage Pro...
Quartz Extreme (when updated) - didn't support the GeForce 2
OpenCL - doesn't support computers more than 2 years old (although it was drafted out and announced before then)

---

It's Apple's tactic. Most Macs have built-in graphics cards and you CAN'T get these new features without buying a whole new computer.

Windows is different... you can generally upgrade your graphics card even if you have a cheap computer. And... Windows... tends to support pretty well every card (out of a MUCH larger pool than Apple's offering)... Apple's restricted OS X to the last 3 years of computers produced and STILL can't support every computer. It's a farce!

Thank you.

To be on point, I'm really waiting for proper OpenCL support for my 3870. Had I known Apple wouldn't care, I would have bought the GeForce 8800.
 
If you can simply overwrite the main structure, why was a lock needed in the first place?
If there is no other thread in your code that can initiate changes (or reads) to the structure, then you don't. But if there is, then you need a read+write lock before the copy, to make sure the structure is in a consistent state. Release the read lock once copied, but leave the write lock. You need to take the read lock again before you make the substitution. You can drop some of those locks and instead simply prohibit all access to the structure, but if you are doing that, the copy isn't necessary in the first place.
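
A rough sketch of the shape being described, using a pthreads read-write lock; make_copy(), do_work(), and free_struct() are hypothetical, and this assumes readers only dereference the shared pointer while holding the read lock:

```c
#include <pthread.h>

struct data;                                   // opaque application structure
extern struct data *make_copy(struct data *);  // hypothetical helpers
extern void do_work(struct data *);
extern void free_struct(struct data *);

static pthread_rwlock_t lock = PTHREAD_RWLOCK_INITIALIZER;
static struct data *shared;                    // the app's main structure

void update(void) {
    // Hold the read lock just long enough to take a consistent snapshot.
    pthread_rwlock_rdlock(&lock);
    struct data *copy = make_copy(shared);
    pthread_rwlock_unlock(&lock);

    do_work(copy);                 // the parallelizable part; no locks held

    // One short write lock to substitute the new structure for the old.
    pthread_rwlock_wrlock(&lock);
    struct data *old = shared;
    shared = copy;
    pthread_rwlock_unlock(&lock);

    free_struct(old);              // safe: no reader can still hold it
}
```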

Perhaps a better example is where the main structure needs to be updated, not overwritten.
Yes, but I'm trying to look at a best-case practical example of GCD. The point being, there aren't very many that are 'stupid easy' enough to get excited about.

Instead of locking/update/unlock every time something needs to be counted, one can have thread-local counters that are updated without synchronization. When the thread ends, the lock/update/unlock can be done once to the global counter.
Yep, excellent example of avoiding serialization.

AFter G said:
My eMac bought in 2002 was able to run everything up to Tiger, so easily 5 years of software support. (Tiger was succeeded by Leopard in 2007.) It's not a promise, but with technology moving as fast as it does, 5 years is an eternity.
My primary reference is to the software itself. Apple doesn't believe in providing security support. They expect you to buy bugfixes by continuously buying OS X versions, until the time they decide your hardware can't run newer versions. Panther lost support when 10.5 shipped, and Tiger has lost support now that 10.6 has shipped. Tiger was being sold as new less than 24 months ago. That's pretty embarrassing.

ungraphic said:
To be on point, I'm really waiting for proper OpenCL support for my 3870. Had I known Apple wouldn't care, I would have bought the GeForce 8800.
Exactly. This is the message Apple is broadcasting: Don't buy our products expecting anything better than what you see today. The future isn't for you, unless you're standing at the cash register when it arrives.
 
My primary reference is to the software itself. Apple doesn't believe in providing security support. They expect you to buy bugfixes by continuously buying OS X versions, until the time they decide your hardware can't run newer versions. Panther lost support when 10.5 shipped, and Tiger has lost support now that 10.6 has shipped. Tiger was being sold as new less than 24 months ago. That's pretty embarrassing.
I see nothing embarrassing about a 'current version minus one' support scheme.

To give an example, Ubuntu LTS versions are only supported with security updates for 3 years from release. Non-LTS versions only get 18 months of support.

Tiger came out April 2005, and was succeeded by Leopard in October 2007. It continued to receive security updates until August 2009, when Snow Leopard came out. If all updates ended there, that means Tiger was supported for 4 years and 4 months, far longer than even an LTS release of Ubuntu.

Actually, Apple just released a security update for Tiger this month. It's still supported :)

Note that support is measured not from when you bought the product, but from when the product was released. In fact, Microsoft has ended its own mainstream support for XP, and only extended (read: business) support remains, until 2014. Does the fact that netbooks are still being sold with XP mean that they're out of support? No. The OEM supports them, and only for a year at that. I'm sure you could pay someone to support OS X if you wanted to stay on the same version badly enough once it becomes unsupported.
 
Thank you.

To be on point, I'm really waiting for proper OpenCL support for my 3870. Had I known Apple wouldn't care, I would have bought the GeForce 8800.
Hopefully they will. Just because it's out for NVIDIA first doesn't mean ATI won't support it. The delay might be because ATI was working on its own competitor to CUDA before OpenCL.

OpenCL was announced some years ago; it's only just hit a stable release! It was DEVELOPED using older cards...

Apple's just been MEGA lazy with the drivers!!!
We're talking 3.5 years of computers... they've chosen not to support two VERY POPULAR ATI cards (which basically covers everybody who doesn't have the latest round of MBPs or Mac Pros).

OpenCL - doesn't support computers more than 2 years old (although it was drafted out and announced before then)
I can agree with the lazy-driver part. You're lucky; at least you have an ATI card, which means the possibility of support. I have integrated Intel graphics on my 2006 mini, which probably means no support ... oh wait, OpenCL runs on CPUs too! Yay, I'm supported!

Exactly. This is the message Apple is broadcasting: Don't buy our products expecting anything better than what you see today. The future isn't for you, unless you're standing at the cash register when it arrives.
This is not the message Apple has.

You don't expect this sort of thing from other companies; for example, car companies don't update any of the electronics on your car barring a life-threatening situation in which they are forced to do a recall.

It's the same type of ridiculousness that copyright supporters try to push - the artists create once and we pay forever. With computers, we are expecting the same thing - we pay once, and expect the programmers to work for us forever. Not going to happen.
 
This is not the message Apple has.

You don't expect this sort of thing from other companies; for example, car companies don't update any of the electronics on your car barring a life-threatening situation in which they are forced to do a recall.

It's the same type of ridiculousness that copyright supporters try to push - the artists create once and we pay forever. With computers, we are expecting the same thing - we pay once, and expect the programmers to work for us forever. Not going to happen.

I think it is their message, and it's VERY smart marketing.

Their message is... expect EXACTLY the same hardware that you purchased, we're not going to make new drivers that enhance old hardware.

Now... yes, one would be naive to expect Apple to turn their old hardware into something blazing fast for free. What I'm talking about here is... my computer came with Leopard! Since then Apple's made only a TINY update to the graphics drivers... some of the CPU load could be pushed over to the GPU (they could technically do this for ANY GPU that's sitting there idle) but... they've chosen not to.

Now... I've actually paid for this update! Two of the three BIGGEST selling points aren't even supported on my machine (OpenCL and the 64-bit kernel).

Meh, I'm just a little let down... I have a right to be let down, just as I do with other companies. I'm a consumer!
 
I see nothing embarrassing about a 'current version minus one' support scheme.

To give an example, Ubuntu LTS versions are only supported with security updates for 3 years from release. Non-LTS versions only get 18 months of support.
So you're comparing Apple to a free product so you can have a favorable comparison? Why don't you compare to Microsoft instead?

Tiger came out April 2005, and was succeeded by Leopard in October 2007. It continued to receive security updates until August 2009, when Snow Leopard came out. If all updates ended there, that means Tiger was supported for 4 years and 4 months, far longer than even an LTS release of Ubuntu.
It was sold as new less than 24 months ago, and now it is unsupported. Don't talk about someone who bought it 4 years ago, and don't compare it to a free product.

Actually, Apple just released a security update for Tiger this month. It's still supported :)
That update was in the can before SL shipped, just as Tiger got 10.4.11 two weeks after Leopard shipped. Tiger did not get the Java update, even though JDK 1.4.2 was updated for 10.5. Support has been discontinued for Tiger, in line with Apple's practices.

Note that support is measured not from when you bought the product, but from when the product was released. In fact, Microsoft has ended its own mainstream support for XP, and only extended (read: business) support remains, until 2014. Does the fact that netbooks are still being sold with XP mean that they're out of support? No. The OEM supports them, and only for a year at that. I'm sure you could pay someone to support OS X if you wanted to stay on the same version badly enough once it becomes unsupported.
Support is measured from when a customer buys a supported product from the vendor to the time they stop getting support. PERIOD. Microsoft is still providing security updates for XP to ALL customers until 2014. Windows 2000 is getting security updates for ALL customers until 2010. Microsoft does it right, better than anyone else I can think of.

It's the same type of ridiculousness that copyright supporters try to push - the artists create once and we pay forever. With computers, we are expecting the same thing - we pay once, and expect the programmers to work for us forever. Not going to happen.
How do you explain Microsoft? 2014 isn't forever. It is called standing behind your product. It's about selling computers, software, and systems that the customer can rely on. Apple is about selling toys.
 
So you're comparing Apple to a free product so you can have a favorable comparison? Why don't you compare to Microsoft instead?
Free product, not free support. You're also making the mistake of assuming that free is inferior. I'm comparing them because they're both similar (Unix-based) and updated a lot more often than Windows. Actually, I could argue that Ubuntu is better than either Windows or Mac OS because updates are free, so theoretically you are supported in perpetuity just for the cost of the time spent downloading the ISO and upgrading.

That update was in the can before SL shipped, just as Tiger got 10.4.11 two weeks after Leopard shipped. Tiger did not get the Java update, even though JDK 1.4.2 was updated for 10.5. Support has been discontinued for Tiger, in line with Apple's practices.
Okay, but my 4 years and 4 months still stands, which, as you will see if you read the rest of my post, is quite good.

Support is measured from when a customer buys a supported product from the vendor to the time they stop getting support. PERIOD. Microsoft is still providing security updates for XP to ALL customers until 2014. Windows 2000 is getting security updates for ALL customers until 2010. Microsoft does it right, better than anyone else I can think of.
Don't lie about the 'all customers' thing. When's the last time anyone could just call MS up and ask about their product? If you had a prebuilt box, they'd say, "Go to HP, Dell, etc.; they are the ones who provide support." And OEMs don't do much longer than three years with an extended warranty. Unless you bought retail, which most people don't, the offer of Microsoft support doesn't apply to you. If you bought an OEM disk because it was cheaper, the offer of Microsoft support doesn't apply to you - because you are the one supporting your own box. You're referring to business support, which of course the business paid for. You can get support for anything if you pay enough.

How do you explain Microsoft? 2014 isn't forever. It is called standing behind your product. It's about selling computers, software, and systems that the customer can rely on. Apple is about selling toys.
Alright, let's compare.

Mainstream support (read: consumer, hardware, and multimedia products) is 5 years for the product, plus 2 years for successive service packs. No extended support. So that 2014 you cite wouldn't even apply to me. Granted, it wouldn't apply anyway, since I bought all my PCs from an OEM, which is apparently supposed to do the support and not Microsoft.

Based on this, mainstream support is over for every single version of Windows XP except SP3. Each service pack gives you 2 years and then support ends. That's less than the 3 years of Ubuntu, and less than the 4+ years of Tiger. If I couldn't upgrade to the next service pack (which many can't, for reasons of compatibility), I'm SOL.

It was sold less than 24 months ago as new, and now it is unsupported. Don't talk about someone who bought it 4 years ago, and don't compare it to a free product.
Don't mix up who supports the hardware purchase. For consumers, support is limited to the OEM - 1 year for the basic warranty, 3 years for extended. Anything else, you're either a business with a support contract or you support yourself. And I can compare what I like; don't dismiss my comparison because of your own internal prejudices. And Ubuntu support is 3 years from the date of release - so in your eyes that would make it even worse if someone downloaded the LTS ISO 2 years and 11 months down the line.
 