I don't know about you, but I would struggle with a Celeron.

Why?

A Celeron is a Core 2 Duo with part of its cache defective, so it's sold with half the cache at a lower price.

If you're running prosumer apps that need the extra cache and clock speed, feel free to spend three to nine times more for the extra performance. (And if you really need portable power, you can of course get a Windows laptop that blows any Apple laptop away.)

If you're a typical consumer, though, a dual-core Celeron will be plenty fast, and your wallet will be far happier.
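For what it's worth, you can check what your own chip actually has. Here's a minimal C sketch for OS X, assuming the usual sysctl keys (machdep.cpu.brand_string and hw.l2cachesize); a Celeron should report half the L2 of its Core 2 Duo sibling:

```c
/* cacheinfo.c -- print the CPU name and L2 cache size on OS X.
   Build with: gcc -o cacheinfo cacheinfo.c */
#include <stdio.h>
#include <stdint.h>
#include <sys/types.h>
#include <sys/sysctl.h>

int main(void) {
    char brand[128];
    size_t len = sizeof(brand);
    /* Marketing name of the CPU (Intel Macs only). */
    if (sysctlbyname("machdep.cpu.brand_string", brand, &len, NULL, 0) == 0)
        printf("CPU:      %s\n", brand);

    int64_t l2 = 0;   /* reported in bytes */
    len = sizeof(l2);
    if (sysctlbyname("hw.l2cachesize", &l2, &len, NULL, 0) == 0)
        printf("L2 cache: %lld KB\n", (long long)(l2 / 1024));

    return 0;
}
```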
 
It's funny: a lot of your comments sound like ads for MS products, which is kind of strange on a Mac forum.

I love Macs. I created the first Mac-only programming magazine, now published as MacTech, in 1984. But the Visual Studio development system with C# is an absolute joy to work with in the PC/Windows world. I would love to have something as comprehensive in the Mac world.
 
Could it be value?

1.8 GHz is too slow. I can't imagine trying to get any work done on that system. 2 GiB of RAM in this day and age might as well be 2 MiB. And a 250 GiB hard drive? I would fill up that sucker in a heartbeat. :eek:

Now a Core 2 Quad Lenovo with a dedicated NVIDIA card I could live with. :D
 
Why?

A Celeron is a Core 2 Duo with part of its cache defective, so it's sold with half the cache at a lower price.

If you're running prosumer apps that need the extra cache and clock speed, feel free to spend three to nine times more for the extra performance. (And if you really need portable power, you can of course get a Windows laptop that blows any Apple laptop away.)

If you're a typical consumer, though, a dual-core Celeron will be plenty fast, and your wallet will be far happier.

Why? You said it yourself! I'm a high-end prosumer; I have hobbies that eat a lot of CPU cycles.

However, the CPU isn't really the main concern; the RAM and HDD should be, because they can make far more of a difference than the CPU can.

If I just used Safari, iTunes, Mail, etc., then yes, a Celeron would be fine for me. But that isn't the case. I'm purchasing a quad-core i7 iMac that I hope will last me a while. :) (I do more multitasking than anything.)

Also, the laptop may be cheap and "comparable" in benchmark terms, but that isn't the only factor you must look at when purchasing a laptop (or any computer), unless you're a cheapskate. ;)
 

Choice is not a "bad thing", in spite of Apple's stance.

I was trying to make it clear throughout the last few posts that the $339 MSI would be fine for the low-end consumer space....

If you need more power, there are lots of options from all vendors.

If you don't, however, you can save quite a bit of money if you're watching the budget.
 

Choice is never a bad thing, but neither is having limited options.

The $339 MSI *might* be OK for lower-end consumers. Then again, it might not. You have to take a view from both sides.
Pros:
• Cheap
• Fairly powerful
• Small and lightweight
• Etc.
Cons:
• Lowest-powered machine (basically the lowest spec one can buy)
• Structurally weak
• Users might be hard on their machines, and this one won't stand up to it
• Etc.

The i7 iMac is a pretty good deal, I feel: it's only AU$207 to upgrade from the i5 chip (the i5 is cheaper than the E8600), while the i7 chip by itself costs AU$346. That's a cheap upgrade, justified by the form factor and the guaranteed customer service that comes with it. Would the MSI have that?
 
Not iMac specific, not Safari specific


I have, but I don't know how to interpret the results. The Flash lib is stripped of debug symbols, so even the information provided by Shark is fairly limited.

It looks like the bottleneck is in the Flash lib, but who knows; it could be stalled waiting on something inside the ATI drivers or the kernel.
 
1.8 GHz is too slow. I can't imagine trying to get any work done on that system. 2 GiB of RAM in this day and age might as well be 2 MiB. And a 250 GiB hard drive? I would fill up that sucker in a heartbeat. :eek:

Now a Core 2 Quad Lenovo with a dedicated NVIDIA card I could live with. :D

Well, for my first two years of college I used a 1.6 GHz Core Duo with 2 GiB of RAM and an 80 GiB hard drive! It was perfect for typing up notes in class and the occasional StepMania session before said classes. Of course, I wouldn't do anything like encoding video or 3D rendering on it. ;]
 
Then why not use the 5750 or 5770?
The cards are built on 40 nm technology, they have better performance, they consume less power, and they also support DirectX 11, which may not matter on the Mac platform, but if you dual boot it can come in handy. The price difference is negligible to nothing, and it wouldn't have hurt Apple to include these chips rather than last year's graphics technology.
If that's true, then I believe Apple must have had a valid reason not to use it. I do not think Apple would have just left out the better card if it wasn't much more expensive, had more performance, had similar heat output, used less energy, and would appeal to gamers more than the 4850 (at least I think it does appeal to gamers).

I'm actually very curious to know why they wouldn't have used the 5750. I didn't even know the 5750 was available already. :eek:
As for gaming, the 4850 does fine at 1680×1050, but at 1080p it doesn't perform that great; considering the 27" iMac's screen resolution (which is a plus), this card will fail to perform adequately.
It all depends on what you play, though; I'd play Crysis on a 30" 2560×1600 display and cut the quality back a bit compared to my 23" at 1920×1200 if I had the choice. :p
I'd love to try the iMac out, though, and test it on Crysis and some other games to see what it comes back with. Hopefully we will get some good benchmarks when the Core i7 starts to ship (if it hasn't already).
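For scale: 2560×1600 is about 4.1 million pixels versus about 2.3 million at 1920×1200, so the card would be pushing roughly 78% more pixels per frame at the 30" resolution, which is why dialing the quality back makes sense.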
 

ATI 5xxx series cards have very good power management because of their 40 nm technology and aggressive power-down functions; for instance, their top-of-the-line 5870 uses 27 W at idle, which is excellent considering it is the world's fastest single-GPU card. The only reason I can think of that would have prevented Apple from using the 5750 or 5770 is availability issues. iMacs are already using i7 processors, so why not use a GPU that is also current?

To clarify, though, I'm not saying the 4850 is a bad card. If gaming is not your thing, then the 4850 is more than enough for anything else, including HD video playback and even accelerated video transcoding using the GPU; it is just disappointing to see that they could have used something more advanced at the same price.

Even worse is the 4670 card. I have a 4670 that I bought last year, and it is a fabulous card for its price; it performs surprisingly well at 1680×1050, but it is showing its age with all the current games coming out. Again, with the 5750 and 5770 on the market, there is absolutely no reason to stick with this card.

As for Crysis, I don't play that game, period. :p It is an average game with rather poorly optimized code. The expansion runs better, though it's still not enough.

Apple has almost hit a home run with the 27" iMac in terms of all-in-ones, but it is rather disappointing to see that the GPU is somewhat outdated.
 
Happening on Unibody MBP 17"

This already happens on my unibody MacBook Pro 17" running Snow Leopard, so it may well be an Adobe problem?
 
To clarify, though, I'm not saying the 4850 is a bad card. If gaming is not your thing, then the 4850 is more than enough for anything else, including HD video playback and even accelerated video transcoding using the GPU; it is just disappointing to see that they could have used something more advanced at the same price.

Just so you know, the 9400M is the only card Apple supports that is currently able to use this feature.
 
I was having Flash problems on my aluminum unibody MacBook (late 2008) long before I upgraded to Snow Leopard. The fans go crazy when watching Flash, and sometimes I get a message indicating there is a problem with the Flash plug-in. I never had this problem when I first got the machine last year, so somewhere along the line either an OS or a Flash update caused it. I think what people are missing in that video is that Flash in Activity Monitor is going crazy; on mine it is taking up a lot of processor usage. I never used to have this problem. It's hard to see, but I'm sure that's what they were trying to emphasize in the video.

Same here, on the same type of machine.
Anyway, thanks so much to ClickToFlash: it should be the software of the year!
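On the Activity Monitor point above: if you'd rather catch the Flash process from the command line, here's a rough OS X sketch using libproc; the "Flash" substring match is just a guess at how the plug-in process is named on your system:

```c
/* findflash.c -- list processes whose executable path mentions "Flash",
   roughly what you'd pick out of Activity Monitor by eye.
   Build with: gcc -o findflash findflash.c */
#include <stdio.h>
#include <string.h>
#include <libproc.h>

int main(void) {
    static int pids[4096];
    int bytes = proc_listpids(PROC_ALL_PIDS, 0, pids, sizeof(pids));
    int count = bytes / (int)sizeof(pids[0]);

    for (int i = 0; i < count; i++) {
        char path[PROC_PIDPATHINFO_MAXSIZE];
        if (pids[i] <= 0)
            continue;
        if (proc_pidpath(pids[i], path, sizeof(path)) <= 0)
            continue;  /* no permission, or the process already exited */
        if (strstr(path, "Flash") != NULL)
            printf("pid %5d  %s\n", pids[i], path);
    }
    return 0;
}
```

Once you have the pid, you can watch its CPU usage live in top or Activity Monitor.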
 
Just so you know, the 9400M is the only card Apple supports that is currently able to use this feature.

I read about that when I was helping my friend buy his MacBook Pro.

The ATI 4xxx series is definitely capable of video transcoding, as the applications exist on Windows, and it would be possible if Apple decided to support it (and why not, considering their iMacs are coming out with these cards). I think it has more to do with NVIDIA investing more in GPU applications and having more tools available. NVIDIA is definitely ahead when it comes to programming tools for their GPUs.
 

Oh yes, the 4xxx and 5xxx series of ATI cards can support these features, absolutely. Currently all of the Mac GPUs could be supported, just not via OS X.

I agree that NVIDIA is ahead; their CUDA technology is pretty amazing. I have trialled it with my old 8500 GT, and converting videos was really snappy! I wonder why the new iMacs don't use NVIDIA cards? I was very shocked by that, and somewhat disappointed. They must not have had the right thermal specifications.
 
Update, please....

Some of these posts are a bit challenging for those of us who are not programmers. Can someone please fill me in on:

1.) Is it a driver issue where a simple update can fix the problem?

2.) Is the problem on iMacs with ATI or NVIDIA cards, regardless of size (21.5" or 27")?

3.) Should I hold off on purchasing a new 21.5" iMac (I was going to get the ATI model)?

Thanks.
 
Oh yes, the 4xxx and 5xxx series of ATI cards can support these features, absolutely. Currently all of the Mac GPUs could be supported, just not via OS X.

I agree that NVIDIA is ahead; their CUDA technology is pretty amazing. I have trialled it with my old 8500 GT, and converting videos was really snappy! I wonder why the new iMacs don't use NVIDIA cards? I was very shocked by that, and somewhat disappointed. They must not have had the right thermal specifications.

If I'm not mistaken, Apple uses OpenCL in OS X, and Windows uses OpenCL and DirectCompute (part of DirectX). The benefit of these two technologies is that they can be applied to any graphics card from NVIDIA or ATI rather than being specific to a particular GPU maker.

To take advantage of NVIDIA's GPU technology, one needs to use CUDA. While CUDA might have better tools available now, the other two are approaching it very fast, considering both Apple and Microsoft are serious about GPU computation. CUDA is going to become less relevant as OpenCL and DirectCompute catch up in terms of tools and performance; developers will prefer these two APIs because they work on any GPU regardless of the maker (which is fine, since NVIDIA cards also play nicely with OpenCL and DirectCompute).

CUDA was a great idea when nobody took GPU processing seriously; it started a trend and made tools available for developers to harness the power of GPUs for tasks other than gaming. But because of its closed nature, I doubt it will ever be a significant API in the consumer market. It might be preferred in high-performance computing, where one works only on NVIDIA GPUs and ATI is practically nonexistent.
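To make the vendor-neutral point concrete, here's a minimal OpenCL sketch in C (just the standard clGetPlatformIDs/clGetDeviceIDs calls, nothing vendor-specific) that lists whatever devices are in the machine, NVIDIA or ATI alike:

```c
/* cldevices.c -- enumerate every OpenCL device without caring who made it.
   Build on OS X with: gcc cldevices.c -framework OpenCL */
#include <stdio.h>
#ifdef __APPLE__
#include <OpenCL/opencl.h>
#else
#include <CL/cl.h>
#endif

int main(void) {
    cl_platform_id platforms[8];
    cl_uint nplat = 0;
    if (clGetPlatformIDs(8, platforms, &nplat) != CL_SUCCESS || nplat == 0) {
        fprintf(stderr, "no OpenCL platforms found\n");
        return 1;
    }
    for (cl_uint p = 0; p < nplat; p++) {
        cl_device_id devices[8];
        cl_uint ndev = 0;
        if (clGetDeviceIDs(platforms[p], CL_DEVICE_TYPE_ALL,
                           8, devices, &ndev) != CL_SUCCESS)
            continue;
        for (cl_uint d = 0; d < ndev; d++) {
            char name[256] = "", vendor[256] = "";
            clGetDeviceInfo(devices[d], CL_DEVICE_NAME,
                            sizeof(name), name, NULL);
            clGetDeviceInfo(devices[d], CL_DEVICE_VENDOR,
                            sizeof(vendor), vendor, NULL);
            /* The same binary reports GeForce, Radeon, or CPU devices. */
            printf("%s -- %s\n", vendor, name);
        }
    }
    return 0;
}
```

The exact same code compiles against NVIDIA's or AMD's OpenCL on Windows too, which is the whole appeal over CUDA.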
 

Yup, Apple uses OpenCL, as does Windows. Good to see they finally agree on something. ;)

CUDA has become pretty popular in the folding world, allowing users to get extra computing power from their GPU. Using my 8500 GT, I found it to be roughly 2x, 3x, and even 5x faster than the dual-core 3 GHz CPU I use with it, a substantial increase considering the price I paid for it. But as you said, too bad for those with ATI cards.

I wonder if the game companies use OpenGL that much, as most games these days are DirectX based. It would be nice to see them all being OpenGL compatible (would that make it easier to port to OS X?).

The only thing I care about here is Apple supporting ALL the Mac graphics cards that are compatible with OpenCL.
 