Well, I'm not an expert/fanboy/troll, but I do know that after using OS X for a while now, and liking it, I will not be buying a Mac again. The reason is that I don't want to spend lots of cash on a computer which is clearly inferior to the Windows equivalent in terms of value and spec. I have friends happily running the latest version of Windows, like it or loathe it, on quite old machines. I have been told by people (photographers who have no axe to grind and who have used both systems professionally) that they have assembled machines with the same spec as a Mac Pro which run Photoshop a lot faster under Windows than under OS X. Many (not all) of them have moved to Windows in the last few years.

My other gripe is the ubiquitous glossy screen. I use my Mac for photography, and this screen is a non-starter. I know Jobs said Apple doesn't know how to build a cheap computer that isn't junk, but that isn't a reason for me to continue to pay over the odds for Apple's particular brand.

I slightly regret having to leave Apple, but there really is not much difference between OS X and Windows, and the disadvantages of buying Apple hardware are just becoming too much for me.

W
 
Get a used pair of 8800 GTs then; it's cheap and it will run OpenCL just fine...
I don't think anyone has screwed you over.
You people take this waaay too personally.

Has it ever occurred to any of you that computers are supposed to work with the software they come with? You can't expect to run the NEWEST software on the OLDEST computer. I've heard it said by a wise man that a computer can take two generations of software; beyond that, it needs a hardware update too.

The computer will always work as it was supposed to work.
That's why many pro studios still run Tiger on their '06 Mac Pros.

willie45: just because Macs don't suit your needs, that doesn't make them "overpriced for the functions they have", and just because "many of your friends" have switched back, that doesn't even remotely reflect the current market-share trend.
Furthermore, you have provided little to no evidence for your allegations that Macs are inferior and overpriced for the functions they come with...
 
Well I'm not an expert/fanboy/troll but I do know that after using OSX for a while now, and liking it, I will not be buying a Mac again. […]


I completely fail to see the relevance of your post to this thread… but I'm bored so I'll go ahead and take it apart anyway.

1. Good riddance. Freedom is awesome, dip your tongue in ink and draw out 1s and 0s if it makes you happy.

2. The Mac Pro is not "clearly inferior" in terms of value or spec by any measure I've seen.

3. Your friends can run the latest version of Windows on old machines because there have been no significant architectural changes to Windows-based PCs this century. That's not a slight, that's a good thing.

4. Photoshop is 32-bit on Mac, 64-bit on PC. Apple's fault, really. They pulled the rug out from under Adobe with Carbon 64, and we suffered for it. Valid point.

5. Glossy screens are completely irrelevant in this forum. Nobody's forcing you to buy an Apple display to use with a Mac Pro. The MacBook Pro is also available in matte on the 15" and 17" models, though admittedly the 15" took too long to bring to market. It'd be nice if they offered it on the 13", but anyone attempting to do serious photographic work on a 13" display as their only monitor is kidding themselves anyway. Honestly, anyone doing color-critical work on a laptop display in general is probably not the brightest laser at the light show, no matter how you slice it.
 
3. Your friends can run the latest version of Windows on old machines because there have been no significant architectural changes to Windows-based PCs this century.

Please let me chime in on that. The opposite is actually true. AMD took computing into a new age by releasing dual-core and 64-bit processors, a strategy for performance increases that continues to this day.

Microsoft released Server 2003 with a 64-bit kernel way before Apple ever thought about such a thing, and in 2005 they were the first to give consumers the choice of a complete 64-bit operating system with Windows XP Professional x64 Edition. This enabled AMD Athlon 64 X2 systems to run to their full ability. Apple at this time was talking about 64-bit with the Power Macs, but castrated the system until they finally made it obsolete without ever developing its potential. Not a very innovative strategy. If we talk about performance and innovation in architecture, Apple is the classic example of a me-too. They have not had the lead at any point in the last six years. Ask a gamer or a scientist what they use for performance, and you will find very few of those who need cutting-edge computing power turn to Apple. Steve lost the plot when he introduced the PPC G5 and screwed it up completely. He is now picking up the developments of the Wintel community and exploiting his unique OS. But don't talk about architectural breakthroughs!
 
Please let me chime in on that. The opposite of that is actually true. AMD took computing into a new age by releasing dual core and 64 bit processors […]



Part of that is valid, I totally forgot about AMD because I don't think they have anything worth taking seriously right now, so I often forget they even exist.

I'm not talking about architectural breakthroughs, though. Dual-core and 64-bit are important, but not what I meant…

I mean the change from PowerPC to Intel on the Mac. Windows users haven't had to deal with anything like that in a long long time, so direct comparisons between "zomg old PCs can still run new Windows" and "WTF Apple why can't I use my G5?" aren't really valid because of it. Frankly, I got many years of happy computing out of the PowerPC+Mac platform, but it's in the past and dragging the corpse along just holds everything back. Supporting them would have made Snow Leopard bloated… just like Tiger and Leopard. It's still pudgy, but at least it's no longer obese.

As far as I'm concerned, I wouldn't have seen a problem with Apple dropping Snow Leopard support for the 32-bit Core Duo and Core Solo processors.
 
In case it has not been said yet: OpenCL is GPU-dependent, not driver-dependent. The old ATI cards are incapable and could never be made capable, and the same goes for old Nvidia cards.

This is not some Fisher-Price toy where you can force the wrong shape through the hole if you push hard enough.

If you wish to make use of OpenCL, you will need a compatible video card that has been designed for GPGPU use. Nvidia was first to market with these, and the latest ATI/AMD cards are now starting to offer support.

I will stress again: an ATI 2600 has no possible way to run OpenCL, regardless of drivers.
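The hardware-versus-driver distinction above can be sketched in a few lines of Python. This is an illustrative toy model, not the real OpenCL API: the device records and the `supports_opencl` / `update_driver` helpers are hypothetical, invented purely to make the point that OpenCL eligibility is a hardware property that no driver update can change.

```python
# Illustrative sketch only -- not the real OpenCL API.
# The point: OpenCL eligibility is a *hardware* property; a driver
# update changes software, not the silicon, so the answer stays the same.

def supports_opencl(gpu):
    """A GPU can run OpenCL only if the hardware itself has the
    required GPGPU features (modelled here as a simple flag)."""
    return gpu["has_gpgpu_hardware"]

def update_driver(gpu, version):
    """A driver update touches only the software side of the record."""
    updated = dict(gpu)
    updated["driver"] = version
    return updated

radeon_2600 = {"name": "ATI HD 2600 XT", "has_gpgpu_hardware": False, "driver": "1.0"}
geforce_8800 = {"name": "GeForce 8800 GT", "has_gpgpu_hardware": True, "driver": "1.0"}

print(supports_opencl(radeon_2600))   # False
print(supports_opencl(geforce_8800))  # True

# No matter how new the driver, the 2600 still can't run OpenCL:
print(supports_opencl(update_driver(radeon_2600, "9.9")))  # still False
```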
 
Technically, to use OpenCL you could just get one new card. OpenCL will use it regardless of which monitor is hooked up to which card; any program making OpenCL calls will simply use the OpenCL-compatible card for computations. (At least, my understanding is that OpenCL isn't just offloading display tasks to a card which then outputs directly to a monitor connected to it. OpenCL can be used to offload encoding tasks and other work that doesn't actually display anything.)
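That idea, that the compute device is chosen independently of display wiring, can be sketched as a toy model. Again, this is hypothetical pseudocode in plain Python, not the real OpenCL API: the device list, field names, and `pick_compute_device` helper are invented for illustration.

```python
# Toy model of how an OpenCL runtime picks a compute device.
# The device list and its fields are hypothetical, not a real API.
# Key idea: which card drives which monitor is irrelevant; kernels
# are dispatched to a card that reports OpenCL support.

def pick_compute_device(devices):
    """Return the first OpenCL-capable device, ignoring display wiring."""
    for dev in devices:
        if dev["opencl"]:
            return dev
    raise RuntimeError("no OpenCL-capable device installed")

devices = [
    {"name": "ATI HD 2600 XT", "opencl": False, "drives_display": True},
    {"name": "GeForce 8800 GT", "opencl": True, "drives_display": False},
]

# An encoding job (nothing to display) still lands on the 8800 GT,
# even though the monitor is plugged into the 2600.
print(pick_compute_device(devices)["name"])  # GeForce 8800 GT
```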


And I agree that Apple is much too fast in abandoning relatively new models.

But it is possible that it is Nvidia and ATI's fault if older cards don't support OpenCL, which is a very new specification. Possibly, they could do a firmware update to provide OpenCL compatibility at some point.

I'm also disappointed that QuickTime X can't use my 2009 Mac Pro's GT 120 for H.264 acceleration. Yeah, I know my beast can handle it with the CPU alone just fine, but it'd still be nice to offload that mundane task while doing other work on the CPUs. :)

+1 ... I believe all you have to do is purchase just one card. If you go for one GeForce 8800 GT, you'll spend around $200 and be all set. Understand that to keep costs down, Apple does not put top-of-the-line graphics cards in their base models.
 
As far as I'm concerned, I wouldn't have seen a problem with Apple dropping Snow Leopard support for the 32-bit Core Duo and Core Solo processors.

speak for yourself!! that makes my computer "actually" obsolete ;)

OP: an MBA/MB, even with OpenCL, will never perform as well as a quad/octo; it's just not that much of an increase. It's not like every calculation can simply be transferred to the GPU; only a select set of tasks can, such as number crunching, video editing, etc.
 
The annoying thing is watching CPUs wrestle with H.264 encoding. There are dedicated chips for that in the 3870 and 4870 which will do it 50 times faster than a Xeon CPU, and yet they go unused. Then Elgato comes along, adds an H.264 chip on a USB stick, and speeds it up by a factor of 5. It shows how much H.264 could profit from using the hardware that we already have.
 
Reading this thread, is one to conclude that if you have a 2008 Mac Pro with an ATI 4870 or ATI 3870 + SL, its OpenCL performance is crap?

And that an 8800 GT is better (faster) because it has Nvidia drivers?

Em... wouldn't this be a firmware update from ATI to fix things, assuming they care enough to support these (now) oldish cards? Can anyone confirm this?
 
Pretty much. I don't see why anyone would expect ATI not to update their drivers to optimize performance under SL.
 
Hey everyone, I was looking at the Apple website, and it seems as though Apple has forgotten about all 2008 Mac Pro owners. Snow Leopard's OpenCL does not work with the ATI HD 2600 XT! That's the card that came standard with every 2008 Mac Pro! Why, Apple? Why?

OpenCL is a bit niche at the moment, so I wouldn't be too concerned right now. However, I think OpenCL might be supported on your card under Windows:

http://developer.amd.com/gpu/ATIStreamSDK/pages/ATIStreamSystemRequirements.aspx
 
It seems obvious to me that Apple does this crap merely so its users will be forced to pay top dollar every couple of years, instead of upgrading the hardware or software side of things to be compatible with the latest technology.

Another explanation: Apple is running a lean software dev operation. They're not trying specifically to force people to upgrade, but they aren't putting massive resources in. They're doing the least they can get away with so they can keep their costs down.

Consider how Leopard was delayed by shifting staff to the iPhone, and how small an update Snow Leopard is. Consider, too, how ZFS never fully materialized (even read support is gone now, apparently) and how resolution independence wasn't fully implemented. How the 64-bit OS X kernel appeared 8 years after Microsoft's 64-bit kernel. How Apple's self-written NVIDIA drivers appear under-developed (they lack performance). How UEFI 2.x firmware was never implemented. How 64-bit machines ended up with 32-bit EFI firmware. How hardware-accelerated decoding in SL is available for only one (one!) graphics card. How the QuickTime X player lacks so many of the features of its predecessor.

When you consider all that, and more, it strikes me that Apple is struggling, or unwilling, to put the effort in.
 
If you spent $4,000 on the computer - and I hope you're including aftermarket upgrades in that price - why would you not shell out $200 more for a graphics card that doesn't bite?

This reminds me of those stores that don't have price tags in the window, the rationale being that if you need to ask the price, you probably can't afford it.

And yet there are always people just on the edge of being able to afford something.
 
The 2600 can do things the 9400 can't, but guess which one predates the technology necessary to use OpenCL? If you guessed the 2600, you're right! This is also the reason the GeForce 7300 isn't compatible: The hardware is physically incapable of using OpenCL.

But is the 2600 incapable? Once again, the ATI link:

http://developer.amd.com/gpu/ATIStreamSDK/pages/ATIStreamSystemRequirements.aspx


I'm not claiming it is or isn't, by the way. I find the whole OpenCL/CUDA/DirectCompute thing very confusing. If I learn something here, great.

[EDIT] Eek, wow! What a posting frenzy! Extra keen today :)
 
But is the 2600 incapable? Once again, the ATI link: http://developer.amd.com/gpu/ATIStreamSDK/pages/ATIStreamSystemRequirements.aspx […]


Um, you DID see that there are footnotes associated with the 2600 XT, right?

1 - Does not support double precision floating point operations
2 - Does not support kernel scatter

Now... If you're Apple and you're supporting OpenCL in the OS, and encouraging people to use it in their applications, do you REALLY want to support a card that is potentially going to give slightly different results in numerical operations because it doesn't support double-precision FP operations? I don't think so... When you're doing high-end math like that, you need to be able to rely on the results you're getting back.
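The double-precision point deserves a concrete illustration. Here's a pure-Python sketch (no OpenCL required): Python's own floats are 64-bit, and the `struct` module's `'f'` format rounds them to 32-bit, so we can show the kind of numeric drift a single-precision-only card would introduce. The specific values are chosen for illustration, not taken from any real benchmark.

```python
# Why "no double precision" means "different numbers": pure-Python demo.
# We simulate a single-precision (32-bit) device by round-tripping
# values through struct's 'f' format; Python floats are 64-bit doubles.
import struct

def as_f32(x):
    """Round a Python float (64-bit) to the nearest 32-bit float."""
    return struct.unpack("f", struct.pack("f", x))[0]

big = 100_000_000.0  # exactly representable in both formats

# In 64-bit arithmetic the +1 survives; in 32-bit it is rounded away,
# because floats near 1e8 are spaced 8 apart in single precision.
double_result = (big + 1.0) - big
single_result = as_f32(as_f32(big) + 1.0) - as_f32(big)

print(double_result)  # 1.0
print(single_result)  # 0.0
```

A card without double-precision support forces every computation through the second path, which is exactly the "slightly different results" problem described above.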

Face it, folks: the 2600 XT was an OLD card when Apple put it in the 2008 models. It's ANCIENT now. I have one in my 2009, but only to drive my two side monitors; I use a 4870 to drive my center monitor and to run OpenCL on.

Upgrade if OpenCL is that important to you. Fact is, it'll be a few months before there are even any apps out that call for OpenCL, so it's not the end of the world if you don't get a compatible card.
 
Can I get an Nvidia GT 120 just for OpenCL and my Cinema Display, and still use my HD 2600 XTs to power my other displays?

Yes. This is why we were all so confused by your outrage. We assumed you knew. Things like this are why you buy a Mac Pro, for the expandability and relative flexibility.

I also think it's safe to assume that updates for the 4870 and 3870 are coming, because we saw driver updates for them and the 8800 GT included in OS updates more than once through Leopard's run.

I think we can all calm down and not rage at  at this point, because the technology itself is in its infancy right now, and nobody but  is even using it at all in extant software.

Another point worth considering - any speed gains from OpenCL are on top of what we're used to. Nothing will slow down, but we actually get extra speed for the price of Snow Leopard and a hardware upgrade (if necessary). Not a bad deal at all for the end-user.

Let's hope Adobe, , and others really start using this stuff. I'd be delighted to halve render times by yoking the GPU.
 
Can I get an Nvidia GT 120 just for OpenCL and my Cinema Display, and still use my HD 2600 XTs to power my other displays?

Yes, but it's an ugly setup. Two different drivers and different video hardware from different companies trying to use the same system and OS resources works, but not well. Just get two GT 120s or two 8800 GTs. The ATI 2600 is worth getting rid of, as is most ATI stuff.
 
Oh dear. Ploki and grue, you do need to grow up. How can you be so utterly imbecilic as to get so bitter just because I happen to say Apple no longer meets my needs? It is reactions like this that make Mac users seem a bit, erm, odd.
 
Yes, but it's an ugly setup. […]

Cool. After I get paid for my current editing job, I will buy two new Nvidia GT 120s and sell my AMD stock.
 
Oh dear. Ploki and grue, you do need to grow up. How can you be so utterly imbecilic as to get so bitter just because I happen to say Apple no longer meets my needs? It is complete fools like you who are the best advert for Windows machines.

Maybe it's past you kiddies' bedtimes.

Neither of their replies seemed particularly bitter, if at all. I can't say the same for your reply...
 