First of all, think: WoW is not a four-year-old game anymore, since Wrath of the Lich King was just released. If you go back into pre-BC areas you will get much higher fps. At max settings on my 3.06 iMac with the 8800 GS I get 30 fps everywhere in Wrath areas, but I hit 60-70 in Org and close to 80 in empty pre-BC areas such as Silithus.

Personally I like max settings, but I feel safer with plenty of fps, so instead of max settings I use the defaults. The only things I change are enabling vsync, so I don't get tearing, and maxing out the resolution. At those settings I get a constant 60 fps no matter what.

I am talking 50fps in pre-BC areas, Tirisfal Glades. On an empty server no less.
 
Don't be stubborn. Just turn down the resolution a bit and it will run like butter. The 130 has plenty of power to run WoW.

That said, the drivers might still need some tuning too, as is often the case with new video cards.

Otherwise yeah the 4850 was the way to go. A 4850 is the best vid card option I can recall the iMac having in a long time.
 
Only the low end iMacs have the 9400M G integrated GPU. Just FYI, the 9400M G is an improvement vs. the 128MB ATI Radeon HD 2400 XT in the old low end iMac.

Only just, though. In the benchmarks I've seen, the only thing really propelling the new low-end above the old low-end in gaming is the higher processor clock speed, because the GPU difference itself seems to be only around 10%.
 
Only just, though. In the benchmarks I've seen, the only thing really propelling the new low-end above the old low-end in gaming is the higher processor clock speed, because the GPU difference itself seems to be only around 10%.

From what I saw of the Macworld benchmarks, the 9400M is 15%-20% faster than the 128MB 2400 XT.
 
I am talking 50fps in pre-BC areas, Tirisfal Glades!! On an empty server no less.

First of all, I don't believe you, because I can hit 50 or 60 fps in an empty pre-BC area with the 8800M GTS in my Vista 64 laptop, and it only has a 1.6GHz processor.

I also back this up with the fact that the game runs twice as fast in OpenGL as it does in DirectX: if I run in OpenGL mode in Windows or Linux I get much higher fps, around 70.
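(If anyone wants to test that themselves, the way I know of to force the Windows client into OpenGL is a line in WTF/Config.wtf; the CVar name below is from memory, so double-check it before relying on it:

SET gxApi "opengl"

Remove the line, or set it back to "d3d9", to go back to DirectX.)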

The GT 130 is an upgrade. A driver update may be coming for you, but considering it's most likely an underclocked desktop 9600, I don't think the 8800 GS could really be faster, since it's also an underclocked desktop part.

Go to the video settings, click the Default button, and apply all settings. Once you load back into WoW, change the resolution to max, flip on vsync, and see what you get.
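You can also make those two changes straight in WTF/Config.wtf if you prefer; the CVar names below are my best recollection, so verify them against the in-game video options before trusting them:

SET gxResolution "1920x1200"
SET gxVSync "1"

The first sets the render resolution (use whatever your panel's native size is), and the second caps the frame rate at the display's refresh rate to stop tearing.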
 
Don't be stubborn. Just turn down the resolution a bit and it will run like butter. The 130 has plenty of power to run WoW.

That said, the drivers might still need some tuning too, as is often the case with new video cards.

Otherwise yeah the 4850 was the way to go. A 4850 is the best vid card option I can recall the iMac having in a long time.

Rumor is that it's going to be a desktop version. Only time will tell, but if it is, that's fine enough for me for a looong while.
 
OP is a fool. Neglecting to upgrade graphics forfeits your right to complain about graphics.

There's a new "shadowing" graphics doohickey in WoW now which will drag down all graphics performance everywhere. You can manipulate that setting alone, and I'm sure your performance will spring up to whatever imaginary stellar performance you were expecting:
http://www.wowwiki.com/ExtShadowQuality
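If you want to test that theory, the cheapest experiment is to drop that one setting to its minimum and watch your fps. Going by that wiki page it's a CVar, so something along these lines should do it from the in-game chat console or in WTF/Config.wtf (the exact spelling below is my guess, so check it against the wiki):

/console extShadowQuality 0

or

SET extShadowQuality "0"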
 
Rumor is that it's going to be a desktop version. Only time will tell, but if it is, that's fine enough for me for a looong while.

If it's a desktop version I will be glad I waited. The desktop version is supposed to be faster than a GTX 260, but only if Apple doesn't slow it down.
 
If it's a desktop version I will be glad I waited. The desktop version is supposed to be faster than a GTX 260, but only if Apple doesn't slow it down.

I remember reading one post where someone was speculating that it'll be an overclocked Mobility version, but I find it hard to imagine ;-)

Another question that interests me: will it have the 9400M too? I don't see a reason why not, since it's the same design as any other iMac.
 
I completely understand why no one wants to view something on an LCD at any resolution other than native, for obvious reasons, but 50% of native can't be so bad once you get to 1920 x 1200.

Bearing in mind consoles are generally played on large-screen TVs that you sit metres away from, while monitors are generally 60-100cm from where you're sat, I don't know why it would be so disappointing to run at a lower-than-native resolution but with anti-aliasing.

Pixel doubling would be enough to look good at half the native resolution with everything on full at 24-30fps, no matter what game you're playing.

I've never understood why people stress so much about not getting frame rates of over 30fps.

You spend your whole life watching films in the cinema (24fps) and TV (25fps in the UK, 30fps in the US) without issue, and even with anamorphic widescreen you're stretching a 720 x 576 or 720 x 480 picture to a 16:9 ratio.
 
I've never understood why people stress so much about not getting frame rates of over 30fps.

You spend your whole life watching films in the cinema (24fps) and TV (25fps in the UK, 30fps in the US) without issue, and even with anamorphic widescreen you're stretching a 720 x 576 or 720 x 480 picture to a 16:9 ratio.

People have a "the faster, the better" mentality when it comes to games.
 
I completely understand why no one wants to view something on an LCD at any resolution other than native, for obvious reasons, but 50% of native can't be so bad once you get to 1920 x 1200.

Bearing in mind consoles are generally played on large-screen TVs that you sit metres away from, while monitors are generally 60-100cm from where you're sat, I don't know why it would be so disappointing to run at a lower-than-native resolution but with anti-aliasing.

Pixel doubling would be enough to look good at half the native resolution with everything on full at 24-30fps, no matter what game you're playing.

I've never understood why people stress so much about not getting frame rates of over 30fps.

You spend your whole life watching films in the cinema (24fps) and TV (25fps in the UK, 30fps in the US) without issue, and even with anamorphic widescreen you're stretching a 720 x 576 or 720 x 480 picture to a 16:9 ratio.

I agree with the 30 fps statement, since too many fps is a bad thing, but I disagree about the resolution. To me native resolution is very important; at lower resolutions things tend to pixelate.
 
People have a "the faster, the better" mentality when it comes to games.

I'm not trying to say that it's useless. Sometimes it is useful, I think. When you're playing, say, an FPS game, having the headroom for ~60fps gives you an advantage: when you make a sharp, rapid turn it gives you more precision, since your computer has to render a lot to get from one point to the other.
 
I completely understand why no one wants to view something on an LCD at any resolution other than native, for obvious reasons, but 50% of native can't be so bad once you get to 1920 x 1200.

Bearing in mind consoles are generally played on large-screen TVs that you sit metres away from, while monitors are generally 60-100cm from where you're sat, I don't know why it would be so disappointing to run at a lower-than-native resolution but with anti-aliasing.

Pixel doubling would be enough to look good at half the native resolution with everything on full at 24-30fps, no matter what game you're playing.

I've never understood why people stress so much about not getting frame rates of over 30fps.

You spend your whole life watching films in the cinema (24fps) and TV (25fps in the UK, 30fps in the US) without issue, and even with anamorphic widescreen you're stretching a 720 x 576 or 720 x 480 picture to a 16:9 ratio.

While I agree that people obsess too much over FPS... I have to point out that 24fps on TV is in no way comparable to 24fps in a video game.

Movies and television utilize motion blur. Video games do not.

Anyone who has gamed at 30fps vs 60fps can testify to the huge difference.

Then there is also this camp of folks who claim the human eye can only perceive 24fps max. I don't know where that rumor originated, but it's complete bollocks.

I know I personally can tell the difference up to about 85fps, and some people claim to be able to go higher.


Also, your FPS seem too low... are you sure you've got the proper drivers loaded? What are you getting in 3DMark06?

I read all this junk about the 2600 Pro (which is actually an XT Mobility...) in the last iMacs being such a horrid card, but I'm scoring over 5000 in 3DMark06. I'm quite happy with the little beast, seeing as I think I got an awesome price for this refurb.
 
At this point it's probably a driver issue. The card is relatively new, and OS X drivers are never good. Have you tried using Boot Camp?
 
24 fps in movies is totally different from 30 fps in games.

Movies have inherent motion blur, which gives a smoother transition between frames.

Computer graphics don't have that, so there is a sharp transition between frames.

A 30 fps average also means slowdowns when more stuff comes on screen, which is different from 30 fps locked.

And 60 fps locked in games is noticeably smoother than 30 fps locked. I love me a nice smooth 60 fps game.
 
I used to have an old iMac, one of the white ones from two generations back... and it played games better than most PCs!!! I played WoW, CoD4, FEAR, and many more games on it and it ran everything perfectly; it even loaded faster than my ex-boyfriend's PC, which was built just for gaming!!! I have now ordered the 3.06 with the ATI card... I'm sure it's gonna be even better than I expect it to be!!!
 
So I'm playing World of Warcraft and I loaded it up both on OS X and Windows and updated all the drivers in Windows (it sees a GT130 card), and on high settings in desolate, unpopular areas I'm getting 43 fps!! wtf? I expected 80 or so. This graphics card and/or machine is terrible. Apple should be embarrassed. Wish I could send it back. I might just do that, even with the 15% restocking fee hit from Amazon.

I agree with everyone else that you should have got the 4850 upgrade. I did.

That said, my advice is this: turn all those graphics down.

Personally, I find that WoW really doesn't lose much by turning everything down to the minimum levels, excluding view distance. Until a few weeks ago, I was playing WoW on a quad-core Windows machine with an nVidia GTX 285, which allowed me to play with most sliders maximized, at 1080i res, with great FPS. I have since set that machine aside, getting ready to sell it to my brother to help pay for the 24" iMac w/4850 I ordered.

So, currently, I am playing on my new MacBook Pro hooked up to a 24" LED Cinema Display. I made a custom resolution of 1950 x 1000 (or so, going by memory) and play in a window. All sliders are minimized except for view distance, which is at half, and I added a line to Config.wtf to shut off shadows completely (roughly what's shown below). It plays great. Sure, it doesn't get as many FPS, nor is it as pretty, but during actual play I find I don't even notice the difference. I just plug along and play as I have always played. I am sure the iMac will do even better.
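For reference, the shadow line I added was something like this; I'm going from memory, so double-check the exact CVar name before copying it:

SET shadowlod "0"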
 
First of all, I don't believe you, because I can hit 50 or 60 fps in an empty pre-BC area with the 8800M GTS in my Vista 64 laptop, and it only has a 1.6GHz processor.

Why would I lie about this? I'll adjust the shadow settings and try that. I did go down to 1600x1000 and I get 60fps at High settings, but of course the monitor is meant to run at 1920... And again, I'm talking about desolate pre-BC areas like Tirisfal.

I'm praying this is a driver issue as well; it wouldn't be the first time Nvidia had bad drivers.

And again, for all those who said I should've waited for the 4850: my old computer just crashed and I needed a new one ASAP.
 
It is interesting to me that I get better fps in OS X than in Vista 64 via Boot Camp. Lends credence to the driver idea. (I hope)
 
From Blizzard:
You can turn off shadows completely as well; this will help cut out some extra graphics work.

Turn off shadows:
Go into the World of Warcraft => WTF folder and open the Config.wtf file using TextEdit, then add the following line:
SET shadowlod "0"
In the game go to your Video settings and change the following sections:
Resolutions: have Multisampling set to the lowest setting, and you might want to turn ON Reduce Input Lag.
Now test the game.
Side Note: The human eye won't see a difference between 72 FPS and 100+ FPS. To read a great article about this, check out the following three-part piece: http://www.daniele.ch/school/30vs60/30vs60_1.html

Also: With the game closed, please go to the Mac HD => Applications => World of Warcraft folder and:
1. delete the CACHE folder
2. move the WTF and INTERFACE folders to the desktop
Moving them to the desktop makes it easy to replace them if necessary.

Oh, and here is the thread on the Blizzard forums which gives you information about performance for every Mac, including the 2009 models.

For the lazy:
Early 2009 iMac with 512MB ATI 4850 Graphics Card : 70-90 fps
Early 2009 iMac with 256MB or 512MB nVidia GT120 or GT130 Graphics Cards : 40-55 fps
Early 2009 iMac with nVidia 9400m Integrated Graphics : 20-30 fps
iMac with 256MB or 512MB nVidia 8800 GS Graphics Card : 60-70 fps
 
Gaming at 1920 x 1200 can be tough for the desktop 4850 in current games. You're already looking at a GTX 260 at minimum.
 
From what I saw of the Macworld benchmarks, the 9400M is 15%-20% faster than the 128MB 2400 XT.

On Macworld I see:
2008 (2.4 GHz) - 33.4 FPS
2009 (2.66 GHz) - 37.3 FPS
That's a difference of just over 10% (37.3 vs. 33.4 works out to about 11.7%). Now you're also getting an extra 0.26 GHz on each core and faster RAM, so that probably accounts for about 5 points of the 9400M's advantage.

The Quake 4 result is closer to 20%, but I'm thinking that's because Quake 4 is an old game and, at a mere 1024x768, it's more processor-intensive than GPU-intensive.
 