
deconstruct60

macrumors G5
Mar 10, 2009
12,296
3,891
I am curious, though, why Nvidia would launch the 675MX the same day as the new iMacs with a card that has the same name but completely different specs.

The 670MX, 675MX, 680M, and 680MX are likely the same die with subcomponents flipped on/off. The 680M launched in June 2012, so there are probably a decent number of those parts around.

Either a part got binned slightly wrong (e.g. labeled a 675MX but configured as a 680M), or Nvidia gave Apple some "free upgrades" due to a part shortage, or ???

I would be surprised if this was widespread over several months. It is more likely a part fluke. Hit the lotto.

It is the 680MX that got announced late, around iMac time. Those are likely just more selectively binned parts. (Specs on the 675MX were out at the beginning of October: http://www.cpu-world.com/news_2012/2012100502_NVidia_Launch_4_New_Mobile_Graphics_Cards_Update_.html)
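To make the "same die, subcomponents flipped on/off" idea concrete, here is a rough sketch; the mapping is speculative, built from public spec sheets rather than confirmed silicon details:

```python
# All four SKUs are (reportedly) the same GK104 die: 8 SMX blocks of
# 192 CUDA cores each (1536 total on a full die), with blocks disabled
# and clocks adjusted per bin. Figures below are from public spec sheets.
CORES_PER_SMX = 192

skus = {
    # name:         (cuda_cores, core_mhz)
    "670MX":        (960, 600),   # 5/8 SMX; narrower memory bus than 675MX
    "675MX (spec)": (960, 600),   # 5/8 SMX
    "680M":         (1344, 720),  # 7/8 SMX
    "680MX":        (1536, 720),  # full die
}

for name, (cores, mhz) in skus.items():
    print(f"{name}: {cores // CORES_PER_SMX}/8 SMX, {cores} cores @ {mhz} MHz")
```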
 

Tri-stan

macrumors 6502
Oct 27, 2012
268
0
I would be surprised if this was widespread over several months. It is more likely a part fluke.

Everyone I have seen post a benchmark of the 675MX has given this 6100+ score in 3DMark. Is it possible that the iMac's 675MX is overclocked to give a better score?
 

Mac32

Suspended
Nov 20, 2010
1,263
454
I think it's more likely an issue of Nvidia not having released a proper Windows driver for the 680MX. I suspect that given a proper driver, like the 675MX has, the 680MX would perform (even) better. (If we're talking about Windows performance, obviously.)
 

Tri-stan

macrumors 6502
Oct 27, 2012
268
0
I'm an owner of a 27" iMac with the 675MX card. When I test it with 3DMark I get just over 6100.

If you look at Nvidia's page for the 675MX, it says the card has 960 CUDA cores, a 600 MHz GPU clock and an 1800 MHz memory clock. But the 675MX in the iMac does not have those specs: it has 1344 CUDA cores, a 720 MHz GPU clock and a 2500 MHz memory clock. So it is not that far away from the 680MX card after all.

The two big differences are 1 GB vs. 2 GB of memory and 1344 vs. 1536 CUDA cores.
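For a rough sense of what those numbers mean, here is a back-of-envelope peak-throughput comparison (Kepler can issue 2 single-precision FLOPs per core per clock via FMA; the clocks are reported base clocks, so this is a sketch, not a measurement):

```python
# Peak single-precision throughput: cores * clock * 2 FLOPs (FMA) per cycle.
def peak_gflops(cores, clock_mhz):
    return cores * clock_mhz * 2 / 1000.0

cards = {
    "675MX (Nvidia spec)": (960, 600),
    "675MX (iMac, GPU-Z)": (1344, 720),
    "680MX":               (1536, 720),
}

base = peak_gflops(960, 600)
for name, (cores, clock) in cards.items():
    g = peak_gflops(cores, clock)
    print(f"{name}: {g:.0f} GFLOPS ({g / base:.2f}x the spec-sheet 675MX)")
```

On paper the shipped part lands at about 87% of a 680MX's peak, versus about 52% for the spec-sheet 675MX.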

If oyKris could come back to clarify his result, that would be great. How did you find those specs, and where did you run your benchmark, Windows or OS X?
 

oyKris

macrumors newbie
Feb 3, 2013
5
0
I got my benchmark from running in Windows 8 64-bit with the latest Nvidia drivers (310.90).

If I open the Nvidia Control Panel and choose System Information, I get the info about the CUDA cores and the other stuff. I also get the same data from the GPU-Z program.

Hope this answers some of your questions.
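For anyone who wants to double-check those numbers programmatically rather than via the Control Panel or GPU-Z, a minimal sketch with pycuda should work under Windows with CUDA installed (pycuda is my assumption here, not something used in this thread):

```python
import pycuda.driver as cuda

cuda.init()
dev = cuda.Device(0)

# GPUs report streaming multiprocessors (SMX on Kepler), not CUDA cores;
# Kepler (compute capability 3.x) has 192 cores per SMX.
sm_count = dev.get_attribute(cuda.device_attribute.MULTIPROCESSOR_COUNT)
major, minor = dev.compute_capability()
cores_per_sm = 192 if major == 3 else None  # only handling Kepler here

print(f"Device:       {dev.name()}")
print(f"Compute cap.: {major}.{minor}")
print(f"SMX count:    {sm_count}")
if cores_per_sm:
    print(f"CUDA cores:   {sm_count * cores_per_sm}")  # 7 SMX -> 1344 on a 680M
```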
 

Zandros

macrumors regular
Sep 1, 2010
124
82
I got my benchmark from running in Windows 8 64-bit with the latest Nvidia drivers (310.90).

If I open the Nvidia Control Panel and choose System Information, I get the info about the CUDA cores and the other stuff. I also get the same data from the GPU-Z program.

Hope this answers some of your questions.

Would you consider attaching a screengrab of GPU-Z?

I'm not doubting you; I know Apple has played fast and loose with the GPU names in the past. It'd just be nice to have a detailed look.
 

oyKris

macrumors newbie
Feb 3, 2013
5
0
I have been trying to overclock now to see the effect of that. Raising the GPU clock has a large effect: raising it by 200 MHz, I get a little more than 7300 in 3DMark. But raising the memory clock by 200 MHz only gives me 40-50 more points.

The GPU temperature doesn't seem to go up that much when overclocking. The fan stays quiet up to 84C and gets noisier after that. The highest temperature I have had is 86C (I get to this temperature with or without overclocking).
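As a rough sanity check (assuming the 720 MHz base clock GPU-Z reports, and that 3DMark scales roughly with core clock), those numbers are in the right ballpark:

```python
# Linear core-clock scaling would predict 6100 * 920/720 ~= 7800; the
# observed ~7300 (+20% for a +28% clock bump) is close, and the tiny gain
# from the memory overclock suggests 3DMark isn't bandwidth-bound here.
base_clock, oc_clock = 720, 920     # MHz (base clock as reported by GPU-Z)
base_score, oc_score = 6100, 7300   # 3DMark scores from this thread

print(f"clock: +{oc_clock / base_clock - 1:.0%}, "
      f"score: +{oc_score / base_score - 1:.0%}")
```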
 

oyKris

macrumors newbie
Feb 3, 2013
5
0
OK, here is a picture of the GPU-Z app:

[Attached image: iMac675mx_zps48409bf5.gif]
 

Yebubbleman

macrumors 603
May 20, 2010
5,789
2,379
Los Angeles, CA
Hi,
I want to order my new iMac soon, but I'm not 100% sure which one I should order :) I definitely want a 27" and a Fusion Drive. I'm using it mainly for office work and photo editing (Aperture, PS, 5D Mark II RAWs). I'm not a professional photographer, mainly just personal, but I do get some jobs from time to time. And I want to play Diablo 3, for example, but not that often, and until now I have had to play at 1024x640 with low details (you could also say without details ;)), so everything will be a big step up! So what do you think? Should I spend the extra money for a GTX 680MX? My favourite would be a 27" iMac, "only" i5 (no i7), 3TB Fusion Drive, 8GB RAM (I can upgrade it myself) and the GTX 675MX... Should be good enough I think, but I also don't want to have to buy a new machine in about two years... At the moment I'm "working" on the first Unibody 13" MacBook ^^ So basically I can't go far wrong either way.

There are three reasons to get the GTX 680MX over the GTX 675MX: (1) you have Windows games that you want to run today, (2) you want to be able to run newer Mac games reasonably well tomorrow, or (3) you want to further future-proof your machine. If you don't fall into any of these categories strongly enough to justify the expense, then it's not worth going for; the GTX 675MX will do fine. Otherwise, if you have the money to spend and you fall into one or more of those three camps, get it.
 

Tri-stan

macrumors 6502
Oct 27, 2012
268
0
Nice one, oyKris, groundbreaking work here!

Would you consider attaching a screengrab of GPU-Z?

I'm not doubting you; I know Apple has played fast and loose with the GPU names in the past. It'd just be nice to have a detailed look.

What do you think of those specs, Zandros? As soon as I get my iMac I am going to check it out right away in Windows. It is going to be like hitting the lottery with those specs. PCI Express 3.0, and much more power than Nvidia's quoted 675MX. Apple must be getting some special chips here, maybe relabeled to avoid confusing customers over the upgrade options with the 680M name, which I think this card actually is.
 

Zandros

macrumors regular
Sep 1, 2010
124
82
Thanks, oyKris!

What do you think of those specs, Zandros? As soon as I get my iMac I am going to check it out right away in Windows. It is going to be like hitting the lottery with those specs. PCI Express 3.0, and much more power than Nvidia's quoted 675MX. Apple must be getting some special chips here, maybe relabeled to avoid confusing customers over the upgrade options with the 680M name, which I think this card actually is.

I think it's enough to make me think again about which GPU I'll order. It does explain the 3DMark numbers we've seen before: ~10% more performance for a ~15% increase in stream processors seems way more plausible than ~10% more performance for a ~60% increase in stream processors (running at a lower clock, to boot).
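Quick arithmetic on the reported core counts backs that up (counts and base clocks as published; a sketch, not a measurement):

```python
# Which core count best explains the ~10% 3DMark gap between the 680MX
# and the iMac's "675MX"? Counts/clocks are the publicly reported figures.
print(f"680MX vs 1344-core part: +{1536 / 1344 - 1:.0%} cores, "
      f"same 720 MHz clock")
print(f"680MX vs 960-core spec:  +{1536 / 960 - 1:.0%} cores, "
      f"plus a 600 -> 720 MHz clock edge")
```

A ~10% gap tracks the first comparison nicely; the second would predict a far larger difference in shader-bound tests.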

Don't think it's a lottery either.

Still, 10 % can sometimes be the difference between playable and unplayable, so I'm still leaning toward the 680MX for a bit more longevity.

Things like these are why I've been waiting for an AnandTech review before committing, but it seems unlikely we'll be getting one this time.
 

plasmaj

macrumors newbie
Mar 2, 2008
22
0
I agree, it's strange. Has anyone else checked the specs on their 675MX, especially on more recently delivered iMacs? Those specs are so similar to a 680M that it does seem Nvidia may have been suffering a shortage of the newly released 675MX and delivered an alternate, better card. But why would Apple advertise that it is offering an inferior card to the one it is actually delivering?

If it is a fluke, it would explain some inconsistency in the benchmarks that have been posted.

e.g. this:
https://forums.macrumors.com/showthread.php?p=16529401#post16529401

compared to this:
http://www.barefeats.com/imac12g4.html

The LuxMark and high-res Heaven benchmarks are completely different, possibly because the Barefeats review had the 675MX iMac running 10.8.3, but I can't imagine that created such a difference.

I plan on ordering my iMac this weekend but can't decide on the GPU. If the 675MX is in fact as reported above, I would probably stick with it.
 

jmpage2

macrumors 68040
Sep 14, 2007
3,224
549
It almost sounds like what Nvidia is delivering is a binned, down-clocked 680MX with some of its features turned off.

Very interesting stuff.
 

plasmaj

macrumors newbie
Mar 2, 2008
22
0
It almost sounds like what Nvidia is delivering is a binned, down-clocked 680MX with some of its features turned off.

Very interesting stuff.

More likely the 680M.

As deconstruct60 pointed out, the CUDA cores match up (1344), and so does the clock.

I can't believe it will remain like this; it's likely the result of short-term supply issues at Nvidia. I think some of the early purchasers may have gotten lucky, but I doubt it will be consistent. If Apple is advertising the 675MX, then that is what people should expect to receive when they order it (960 CUDA cores and a 600 MHz clock). It's a little annoying, though, if it skews some of the benchmarks and comparisons being reported, since it affects buying choices.
 

maharajah

macrumors member
Jul 22, 2002
41
5
I agree, it's strange. Has anyone else checked the specs on their 675MX, especially on more recently delivered iMacs? Those specs are so similar to a 680M that it does seem Nvidia may have been suffering a shortage of the newly released 675MX and delivered an alternate, better card. But why would Apple advertise that it is offering an inferior card to the one it is actually delivering?

If it is a fluke, it would explain some inconsistency in the benchmarks that have been posted.

e.g. this:
https://forums.macrumors.com/showthread.php?p=16529401#post16529401

compared to this:
http://www.barefeats.com/imac12g4.html

The LuxMark and high-res Heaven benchmarks are completely different, possibly because the Barefeats review had the 675MX iMac running 10.8.3, but I can't imagine that created such a difference.

I plan on ordering my iMac this weekend but can't decide on the GPU. If the 675MX is in fact as reported above, I would probably stick with it.

Just tested my 675MX, getting 749 in LuxMark (3.2 GHz i5). Either it was a pretty early 10.8.3 in the Barefeats test, or there are two different cards being branded as 675MX, as some are speculating.
 

Tri-stan

macrumors 6502
Oct 27, 2012
268
0
Can anyone with a newly received iMac with the 675MX confirm that it also has the higher clocks and CUDA cores of the 680M? People are now saying that the 675MX in the 27-inch iMac is in fact a 680M.
 

Kroner

macrumors regular
Jul 13, 2011
183
9
Norway
Can anyone with a newly received iMac with the 675MX confirm that it also has the higher clocks and CUDA cores of the 680M? People are now saying that the 675MX in the 27-inch iMac is in fact a 680M.

I bought this one last week from a retailer. I am guessing it had been in stock for about 2-3 weeks.

[Attached image: screenshot2102131413.png]
 

Tri-stan

macrumors 6502
Oct 27, 2012
268
0
I bought this one last week from a retailer. I am guessing it had been in stock for about 2-3 weeks.

[Attached image: screenshot2102131413.png]

1344 cores again; that is another 680M. It's looking pretty evident that this is the processor that comes standard with the top 27-inch iMac. I still think that when mine comes, if it is a standard 675MX and not this Apple-upgraded one, I will be very disappointed. Still crossing fingers.
 

dvdlovr24

macrumors 6502
Jun 3, 2008
329
117
I bought this one last week from a retailer. I am guessing it had been in stock for about 2-3 weeks.

[Attached image: screenshot2102131413.png]

Mine is showing the same thing. I ordered it on 1/31 and it was delivered on 2/21. The iMac I received was also assembled in the US.
 