The issue isn't so much Diablo III performance as it is the performance of the next game you buy, or of other software that makes heavier use of the graphics processor.

Also, having more graphics RAM will speed up some programs. I think that for longevity it would be worth the graphics upgrade.

And let's not forget, gaming under OS X is pretty darn horrible. Diablo 3 will not run at a solid 60fps, period, with the 680MX at 2560x1440. I get slowdown pretty frequently on my iMac playing Diablo 3, even with settings turned down to low. I'd be willing to bet that's minimized in Boot Camp, but I'd need to reinstall Diablo 3 there (may have to do that!). I really hope 10.8.3 brings better driver support for those video cards.
 
Ah, so do you play anything in Boot Camp? Do you "tinker" with overclocking on the GPU?
 
Yes, I played Far Cry 3 in Boot Camp, Devil May Cry, etc. Devil May Cry runs at 60fps (vsync locked) maxed out (literally every option checked) at 2560x1440, which is lovely. Far Cry 3 doesn't, of course. :)

I don't want to overclock on a permanent basis at this stage, since I don't feel the necessity to do so. I have tried it, though, and the 680 does take VERY well to overclocking.
 
I went into Apple today and asked three Apple employees about the 675 vs the 680. All three told me the same thing: the 680 is ideal if you plan on gaming; other than that, the 675 would be plenty.
 
More likely the 680M.

As deconstruct60 pointed out, the CUDA cores match up (1344), and so does the clock.

I can't believe that it will remain like this; I suspect it's the result of short-term supply issues from Nvidia. I think some of the early purchasers may have gotten lucky, but I doubt it will be consistent. If Apple is advertising the 675MX, then that is what people should expect to receive if they order it (960 CUDA cores and a 600MHz clock). It's a little annoying, though, if it skews some of the benchmarks and comparisons being reported, as it affects buying choices.

Has anyone with a recent iMac purchase and the 675MX been able to confirm that the CUDA cores and other specs of the card still match up with a 680M?
 
Hi there, I received my 27" i5 3.2 675MX two days ago, and after reading this thread I can confirm that my 675 matches the 680 spec :). I'm in the UK.
 
My 675MX vs 680MX story.

Two weeks ago I ordered an iMac with the i7/3TB Fusion and 675MX. I didn't think I'd need to choose the 680MX, thinking that the money saved would go to 32GB of third-party RAM. I certainly didn't need it for gaming, since I play on a PS3 and have a PC gaming rig (3930K @ 4.3GHz, 2 x GTX 680s in SLI for nVidia 3D Surround, etc.) that I built in the last year. The 675MX would be more than good enough for 99% of the things I use my iMac for.

Then in recent days I started to second-guess my choice, thinking why not just get the higher-end graphics card, because the additional, theoretical graphics horse power (50% more CUDA cores, higher memory/core speeds etc) compared to the 675MX was worth the extra AUD$175 (I consider it a bargain). This morning I called Apple to change my configuration to the 680MX and, thankfully, my delivery date hasn't changed ;)

Anyway, the 675MX and 680MX are both great and you should be happy with either card, but if the additional cost of the 680MX won't break the bank, I'd go for the 680MX.

Sorry, but I don't think that's quite right; the 675MX in the $1999 iMac is a 680M. Yep, it's a bit confusing compared to what nVidia says on its website, but for $150 more the 680MX contains a few more CUDA cores (1536 instead of 1344 on the iMac's 675MX, certainly not 50% more). It does have double the memory, though.
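
For perspective, here's a quick back-of-envelope comparison; this is just a sketch assuming the commonly cited Kepler core counts (960 for the stock 675MX, 1344 for the 680M-class chip the iMac's "675MX" appears to be, and 1536 for the 680MX):

# Rough CUDA-core comparison, assuming the commonly cited counts.
cards = {
    "675MX (nVidia spec)": 960,
    "680M (iMac '675MX')": 1344,
    "680MX": 1536,
}

top = cards["680MX"]
for name, cores in cards.items():
    extra = (top - cores) / cores * 100
    print(f"680MX vs {name}: {extra:.0f}% more cores")

So the 680MX has roughly 60% more cores than the 675MX as nVidia specs it, but only about 14% more than the 680M-class chip the iMac seems to ship with.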
 
Hi,

I currently work as a 3D illustrator in the interior department of my office,
so I have some side projects in 3D for interior design, plus some photo editing.
I'm planning to switch to an iMac. Is the iMac suitable for 3D modelling and rendering work? I mostly use Autodesk 3ds Max and SketchUp.
If I get an iMac 27, should I upgrade the processor to the i7 and the GPU to the 680MX?

Thanks.
 
3ds Max would have to run via Boot Camp. Look up which cards are tested by Autodesk and how the GPU options perform in Max with the Nitrous viewport. Some gaming cards are terrible in these programs, but it's not always the case. If your scenes won't be too heavy, it may make little difference; typically, if severe driver bugs aren't an issue, you won't see a big difference at lower polygon counts.

The i7 provides Hyper-Threading; I'm not sure how much that helps for rendering. Offline rendering with mental ray (or whatever you use) doesn't rely on the GPU unless you're using a specifically GPU-based renderer like the iray mode they incorporated a couple of release cycles back. Otherwise it's just about viewport performance. The 680MX offers no real-world advantage for photo editing over the other options. The extra video RAM would help if you were using a 3D paint app.
 
CUDA vs. Memory

Correct me if I am wrong regarding the importance of 1GB vs. 2GB of memory: the 680MX NEEDS to be paired with 2GB to maximize its capability, whereas the 675MX only needs 1GB; it's a functional requirement rather than an upgrade. An example is the new iPad: the Retina display calls for the bigger 11,666mAh battery because of the heavier current draw. Putting 2GB on the 675MX would really not improve performance all that much, and putting 1GB on the 680MX would limit its capability. Focus on the CUDA cores and not the memory.
 
No, the RAM is utilised when you run programs at higher resolutions. For a 27", a 2GB card should be the standard minimum for running new games and taxing visual applications at that resolution.

If you had a 675MX with 1GB and a 675MX with 2GB, frame rates in a game would be the same at lower resolutions.
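
As a rough illustration of why resolution pushes VRAM use, here's a back-of-envelope sketch; it assumes 4 bytes per pixel and a handful of full-resolution render targets, which is a deliberate simplification (real games vary a lot, and textures usually dominate):

# Back-of-envelope framebuffer math: render-target memory scales with pixel count.
# Assumes 4 bytes per pixel (RGBA8) and 4 full-resolution buffers
# (front/back, depth, one post-processing target).

def render_target_mb(width, height, buffers=4, bytes_per_pixel=4):
    return width * height * bytes_per_pixel * buffers / (1024 ** 2)

for width, height in [(1920, 1080), (2560, 1440)]:
    print(f"{width}x{height}: ~{render_target_mb(width, height):.0f} MB in render targets")

The render targets themselves are only tens of megabytes; most of a card's VRAM goes to textures and geometry. But at 2560x1440 games also allocate larger intermediate buffers and touch higher mip levels of their textures, which is where the headroom of a 2GB card helps.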
 
So with the 675MX or 680MX, can you play at native resolution on high settings in OS X, or do you still have to use Boot Camp? I mean games like StarCraft 2, Diablo 3, League of Legends, etc.
 
Sorry, but I don't think that's quite right; the 675MX in the $1999 iMac is a 680M. Yep, it's a bit confusing compared to what nVidia says on its website, but for $150 more the 680MX contains a few more CUDA cores (1536 instead of 1344 on the iMac's 675MX, certainly not 50% more). It does have double the memory, though.

The 675MX only has 960 CUDA cores, if you want the specifics. The 50% was in reference to the previous statement; just a stupid typo.

What I typed:

"because the additional, theoretical graphics horse power (50% more CUDA cores, higher memory/core speeds etc) compared to the 675MX was worth the extra AUD$175 (I consider it a bargain)."

This is how I should have phrased it:

"because the additional, theoretical graphics horse power (50%), more CUDA cores, higher memory/core speeds etc, compared to the 675MX was worth the extra AUD$175 (I consider it a bargain)."
 
Like I said, the nVidia website does list the 675MX as having 960 CUDA cores. But in practice, if you look at an iMac with the 675MX inside and run GPU-Z, it reports 1344 CUDA cores, identical to the 680M.

It's a bit misleading, but it is what it is. Whether it's worth the extra money is irrelevant, since that's down to personal consideration. But to say the 680MX has 50% more cores than the 675MX in the iMac is questionable.
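
One way to sanity-check the numbers: Kepler mobile GPUs are built from 192-core SMX units, so the totals being thrown around should map onto whole SMX counts. A quick sketch, assuming the published figures:

# Kepler packs 192 CUDA cores per SMX, so reported totals should be multiples of 192.
CORES_PER_SMX = 192

reported = {
    "675MX per nVidia's spec page": 960,
    "iMac '675MX' per GPU-Z": 1344,
    "680MX": 1536,
}

for name, cores in reported.items():
    print(f"{name}: {cores} cores = {cores // CORES_PER_SMX} SMX units")

The 1344 figure corresponds to a 7-SMX part, which matches the 680M, while nVidia's spec-page 675MX is a 5-SMX part. That's why the iMac chip looks like a rebadged 680M rather than the card on the spec page.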
 
Like I said, the nVidia website indeed lists 675MX as having 960 CUDA cores. But in practical, if you're looking at an iMac with 675MX inside and runs GPU-Z, it has 1366 CUDA cores, identical to one on 680M.

It's a bit misleading, but it is just what it is. Worth the extra money or not is irrelevant since it's down to personal consideration. But to say 680MX has 50% more cores than 675MX on iMac is questionable.

Did you even read my reply? Read it again. I made a typo or two, which I corrected, as described in my reply to you.

As far as GPU-Z goes, I'll go by nVidia's specifications for their chips over anything a third-party program says.

This is what the company that actually makes the chips says:

http://www.geforce.com/hardware/notebook-gpus/geforce-gtx-675mx/specifications
 
I'm afraid you are wrong, kind sir... see this thread... the 675MX in the iMac has been confirmed many times over to be a custom chip for Apple, very close to, if not exactly, the 680M... good day now

https://forums.macrumors.com/threads/1568279/
 
Well, since my original post was from January, I'll put it down to the information available at the time. All the information I had at the time of the post was that the 675MX was a cut-down 680M. Now, after reading around, it seems that the number of cores on the 675MXs in iMacs and on 680Ms may be the same. Good to know.
 