Sorry to bring this topic back up, but does anyone know if it's possible to install NVIDIA's official Verde drivers on the 2012 iMac? Or has Apple blocked them?
 
So you've just picked a driver from another mobile GPU and installed it on your system?
 
Yep, just select the GTX 680M version and it's perfectly compatible, despite being unofficial. Also, they released new drivers today, so I'll be testing how they are.
 
Yeah, the only difference from the 680M to the MX is a few clock tweaks, nothing that would matter to the drivers per se. I've been using the 680M drivers since December and they've been working great.
 
Well, there's actually quite a difference besides the clocks: more SMX blocks, resulting in more CUDA cores, etc.
In case you didn't know, the "675MX" Apple is using IS a 680M - it has 1344 CUDA cores instead of the 960 a normal 675MX should have.
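If anyone wants to verify this on their own machine, here's a minimal sketch using the CUDA runtime API (assuming you have the CUDA toolkit installed, e.g. under Boot Camp). Kepler packs 192 CUDA cores per SMX, so a genuine 675MX should report 5 SMX blocks (960 cores), while a 680M reports 7 (1344):

#include <cstdio>
#include <cuda_runtime.h>

int main() {
    cudaDeviceProp prop;
    // Query device 0 (the only discrete GPU in an iMac)
    if (cudaGetDeviceProperties(&prop, 0) != cudaSuccess) {
        fprintf(stderr, "No CUDA device found\n");
        return 1;
    }
    const int coresPerSMX = 192;  // holds for Kepler (compute capability 3.x) only
    printf("GPU: %s (cc %d.%d)\n", prop.name, prop.major, prop.minor);
    printf("SMX blocks: %d -> ~%d CUDA cores\n",
           prop.multiProcessorCount, prop.multiProcessorCount * coresPerSMX);
    printf("Core clock: %.0f MHz\n", prop.clockRate / 1000.0);  // clockRate is in kHz
    return 0;
}

Build it with nvcc, and the SMX count the driver reports settles the 960-vs-1344 question immediately.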
 
Why do you think Apple didn't just call the top options for the 27-inch iMac the "680M and 680MX" rather than the "675MX and 680MX"? I guess they preferred the clearly different titles, knowing it does not affect the graphics drivers too much. Still, it's stupid selling consumers a lower-spec card when they are really selling you a higher-spec card. If it were the other way around, there would be a lawsuit involved.
 
Yep, we can conclude the 675MX is a 680M. For some reason Apple is humbly downgrading the spec sheet.

And the 680MX option is, well... the 680MX.

All in all, the difference is like the Radeon 6970M in the 2011 iMac vs. the souped-up 6990M at the time: a few more stream processors, a slightly higher clock. Except there was no 6990M iMac option back then.
 
I believe there are two reasons Apple avoided those names:
1. Not to confuse us too much (680M or 680MX? wth?)
2. Simply to force us to buy the higher-spec option for extra bucks :)
 
The iMac's 680 has been neutered down to GTX 570/660 levels.

That's the only way to extract the heat in that package.

The 675 has been cut down even further.

"M" vs "MX" would be too confusing.
 
Why call it an MX then, if it is just a desktop 680? To show that it is not a full-on PCIe card, or what? :mad:
It has about 30% lower clocks and is, obviously, less powerful by roughly that percentage. The silicon is the same, but the MX is a mobile chip in terms of TDP.
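For reference (using NVIDIA's published clocks, so treat this as a back-of-the-envelope check): the desktop GTX 680 has a 1006 MHz base clock versus 720 MHz on the 680MX, i.e. (1006 - 720) / 1006 ≈ 28% lower, right in line with that "about 30%" figure. The desktop card is also a 195 W part, which is exactly the heat budget an all-in-one can't accommodate.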
 
Faster than the GPU in the new PS4?

Even though the graphics may not be the bleeding edge available today, I was reading an article speculating that the PS4's graphics power will lag behind top-tier gaming rigs and even behind the GTX 680MX.
 
The video certainly is very very pretty.

Yes, it does look pretty in the video. I'm not so sure it will blow away everything that is currently on the market.

"Sony states that the PS4’s graphics chip, which is derived from existing Radeon technology and integrated into the Jaguar processor die, can push 1.84 TFLOPS. That number puts the power of the GPU roughly on par with a Radeon HD 7850 video card."

I'm sure they have lots of tweaks and optimizations. But are they really going to blow everything out of the water? More knowledgeable people than me could certainly weigh in.

"A gaming computer with even a moderately powerful graphics component, like the Nvidia GTX 660 Ti, is far more capable than this new console. Specifications suggest the PlayStation 4 isn’t impressive when compared to a PC."

And as someone earlier in the thread pointed out, the GTX 680MX is more powerful than the GTX 660 Ti.

Still, I would expect the PS4 to impress. It seems this iMac may not be a total dog, however.
 
From what I've heard, the PS4/next Xbox will have a weaker GPU than a current high-end PC, so that game should look just as good, if not better, on a high-end PC.

I severely doubt it, but time will tell. I can tell from looking at the Killzone video that there's no way my 680MX-powered iMac could run that game as smoothly. Comparing the two systems is pointless anyway; the PS4 uses GDDR5 and doesn't have anywhere near the overhead of a modern PC.

Just wait and see. I don't think anything out there currently will rival what the PS4 will put out. Just a guess, though...
 
Killzone: Shadow Fall capped at 30fps.

Just read an interesting article from Eurogamer on it. They say the video is a smooth 30fps; apparently the developer capped it at that for some reason. Still, it looks nice. Some people are more finicky about needing 60fps, though.
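For the finicky crowd, the frame-time math is simple enough: at 30fps each frame gets 1000 / 30 ≈ 33.3 ms to render, while at 60fps it only gets 1000 / 60 ≈ 16.7 ms, so a 60fps target roughly halves the time between screen updates (and the perceived input lag along with it).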
 