Anyway, what is it about StarCraft 2 that's supposed to be broken? I was just playing it with Metal for a while and couldn't find anything that didn't work.
Maybe it's HotS that isn't playable, not SC2. The minimap cannot be clicked.
 
Your point was frame rate (60fps). And I pointed out that a Mac from 5 years ago can pull that off... recent ones can also.
A *percentage* of Macs from 2012 can play above 60fps, you mean. Only the high-end variant, as a matter of fact, since most units sold included a GTX 660M ;). Which goes back to my earlier post. And the 660M is not the same card as the 660.
 
More TFLOPS =/= automatically a better GPU.
Probably =/= definitely/for sure.
A 680MX is definitely better than a 750 Ti.
Many Macs from the past few years can play Overwatch at 60fps with decent settings. That includes the 2016+ MacBook Pros, which apparently sell like hotcakes.
Now stop digging that hole.
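[A quick aside on the TFLOPS point: peak FP32 throughput is just 2 ops (a fused multiply-add) per shader per clock, so the headline number says nothing about memory bandwidth, drivers, or thermals. A minimal sketch in Python using published core counts and base clocks; sustained clocks vary, so treat the exact figures as approximate.]

```python
# Peak FP32 throughput = 2 ops (fused multiply-add) x shader count x clock.
# Published core counts and base clocks; sustained clocks vary in practice.
gpus = {
    "GTX 680MX": (1344, 720e6),   # 1344 CUDA cores @ 720 MHz
    "GTX 750 Ti": (640, 1020e6),  # 640 CUDA cores @ 1020 MHz base
}
for name, (shaders, clock_hz) in gpus.items():
    tflops = 2 * shaders * clock_hz / 1e12
    print(f"{name}: ~{tflops:.2f} TFLOPS peak FP32")
# ~1.94 vs ~1.31 TFLOPS, yet the headline number alone still does not
# settle which card is faster in an actual game.
```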
 
That's not what he meant. Apple doesn't offer good GPUs, and the 580 is the best they currently offer. Their second-best GPU probably can't play Overwatch at a consistent 60fps, which I personally feel is pretty low for an FPS.

I'm replying to this post... Not your most recent one... You know exactly why.

A GPU from a 2012 iMac can run Overwatch at 60fps... Their 'second best GPU' will most certainly run the game at 60fps.
A 680MX is definitely better than a 750 Ti.
Many Macs from the past few years can play Overwatch at 60fps with decent settings. That includes the 2016+ MacBook Pros, which apparently sell like hotcakes.
Now stop digging that hole.

He’s simply trolling now.
 
The PS4's GPU isn't as thermally constrained as the Radeon 460 Pro in a MacBook Pro, so it will sustain higher performance over time, even though the two have similar peaks of 1.84 vs. 1.86 TFLOPS.

However, the point you are making is that this is more than adequate to play a PS4-level game, and it absolutely should be, with some tweaks for said thermal constraints.
These thermal constraints can't be that much of a problem, as Overwatch indeed runs at a constant 60fps on my Radeon 460 Pro-equipped rMBP with at least PS4-equivalent settings (where the game uses dynamic resolution scaling to maintain the framerate).
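[For anyone unfamiliar with dynamic resolution scaling: the game lowers its internal render resolution when frames start missing the 60fps budget and raises it again when there's headroom. Below is a minimal sketch of the general idea; the function name, thresholds, and step size are illustrative assumptions, not Blizzard's actual implementation.]

```python
# Sketch of a dynamic-resolution controller: if recent frames blow the
# 60fps budget, render fewer pixels; if there is headroom, scale back up.
# Function name, thresholds, and step size are illustrative assumptions.

TARGET_MS = 1000.0 / 60.0  # 16.7 ms frame budget for 60fps

def adjust_render_scale(scale, avg_frame_ms,
                        min_scale=0.5, max_scale=1.0, step=0.05):
    """Return an updated render scale based on recent frame cost."""
    if avg_frame_ms > TARGET_MS * 1.05:    # over budget: drop resolution
        scale = max(min_scale, scale - step)
    elif avg_frame_ms < TARGET_MS * 0.85:  # clear headroom: raise it
        scale = min(max_scale, scale + step)
    return scale

# e.g. frames averaging 20 ms at scale 1.0 nudge the scale down to 0.95
print(adjust_render_scale(1.0, 20.0))
```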
 
These thermal constraints can't be that much of a problem, as Overwatch indeed runs at a constant 60fps on my Radeon 460 Pro-equipped rMBP with at least PS4-equivalent settings (where the game uses dynamic resolution scaling to maintain the framerate).

I defer to your hands-on experience - sounds like it is at least equivalent.
 
Scorched Earth,

Do you game competitively, as in, with prize money on the line?
I play casually and competitively, but it depends on the game, honestly. I'm just a fan of video games in general, and of playing them "the way it's meant to be played".
 
I play casually and competitively, but it depends on the game, honestly. I'm just a fan of video games in general, and of playing them "the way it's meant to be played".

That's cool. It just sounded like you approached gaming with the priorities of someone whose livelihood depended on getting the highest possible performance out of their games.

Certainly, if I could, I would prioritize gaming higher on the budgeting continuum, but life happens, as they say.

The most exciting development on the Mac, from my point of view, is Apple's support for external Thunderbolt GPUs in macOS.

What are your thoughts on that?
 
That's cool. It just sounded like you approached gaming with the priorities of someone whose livelihood depended on getting the highest possible performance out of their games.

Certainly, if I could, I would prioritize gaming higher on the budgeting continuum, but life happens, as they say.

The most exciting development on the Mac, from my point of view, is Apple's support for external Thunderbolt GPUs in macOS.

What are your thoughts on that?
While I LOVE playing video games, and I make money from them on the side, I couldn't do it as a full-time streamer/pro. The idea of that as a job seems like it would ultimately take away from the enjoyment of playing them.

On eGPUs... to be honest, I am not a fan of them in any environment, whether desktop or laptop, Windows or Mac. I can definitely see where they can come in handy, just not for me. I would personally have a laptop for on-the-go situations (with integrated graphics to cut costs) and a custom gaming build at the house. I'm not even a fan of gaming laptops, because of the power supply/battery, throttling, price, etc.

eGPUs can take a pretty big performance hit (20-30%) and can add latency, so you'd want TB3, with its higher bandwidth, over TB2. Now, if you only have a laptop, an eGPU is obviously your best option. But if you are going to get a desktop (with the intent to play video games), a laptop GPU (M/MX models) doesn't make much sense, nor does an eGPU. (I'm all about functionality over form.)
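[On the bandwidth point, here is a rough back-of-envelope comparison of the links involved, in Python. The figures are nominal line rates and ignore protocol overhead (a TB3 PCIe tunnel actually tops out near 32 Gb/s), so treat this as a sketch rather than a benchmark.]

```python
# Nominal link rates for a GPU sitting behind each connection.
# These ignore protocol overhead, so real throughput is lower.
links_gbps = {
    "Thunderbolt 2": 20.0,                 # older eGPU enclosures
    "Thunderbolt 3": 40.0,                 # double TB2's line rate
    "PCIe 3.0 x16": 8 * 16 * (128 / 130),  # desktop slot, after 128b/130b encoding
}
for name, gbps in links_gbps.items():
    print(f"{name}: ~{gbps:.0f} Gb/s ({gbps / 8:.2f} GB/s)")
# The desktop slot has roughly 3x TB3's raw bandwidth, which is one
# source of the 20-30% eGPU penalty mentioned above.
```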
 
I'm replying to this post... Not your most recent one... You know exactly why.

A GPU from a 2012 iMac can run Overwatch at 60fps... Their 'second best GPU' will most certainly run the game at 60fps.

He’s simply trolling now.
Definitely not trolling, smh. Did you even bother to read how they tested the game? On an EMPTY map, meaning one player with no explosions, bullet ricochets, sparks, extra lighting effects, other character models, etc. Here's an excerpt.


"Testing Methodology
Benchmarking Overwatch accurately is a real problem because it's exclusively an online multiplayer game. Getting more than a dozen co-operative players on a map at the same time to carry out a series of benchmarks over two days isn't realistic. Therefore, we decided to test GPU performance using a custom map with no other players.

This saw me walk around an empty map for 60 seconds taking the exact same path each time using the same character. The Hollywood map was used though it wasn't selected for any particular reason."


You don't even have to reply to this, as there is no point in arguing about this topic. With an additional 11 players added into the mix, as well as constant firefights, etc., those frames are going to drop. It's not rare to drop 20 to 30 frames when moving from an empty area of the map to an area with a few other players. I even found you a video of the HIGH-end 2012 iMac, with the BEST GPU they offered, the GTX 680MX ;). You can see it drop to 43fps, and this isn't even on HIGH settings. So much for a constant 60fps... :rolleyes::rolleyes: Which, again, was the point of my earlier posts. I was never bashing you or your hardware, so no need to take anything personally.
 
Definitely not trolling, smh. Did you even bother to read how they tested the game? On an EMPTY map, meaning one player with no explosions, bullet ricochets, sparks, extra lighting effects, other character models, etc. Here's an excerpt.


"Testing Methodology
Benchmarking Overwatch accurately is a real problem because it's exclusively an online multiplayer game. Getting more than a dozen co-operative players on a map at the same time to carry out a series of benchmarks over two days isn't realistic. Therefore, we decided to test GPU performance using a custom map with no other players.

This saw me walk around an empty map for 60 seconds taking the exact same path each time using the same character. The Hollywood map was used though it wasn't selected for any particular reason."


You don't even have to reply to this, as there is no point in arguing about this topic. With an additional 11 players added into the mix, as well as constant firefights, etc., those frames are going to drop. It's not rare to drop 20 to 30 frames when moving from an empty area of the map to an area with a few other players. I even found you a video of the HIGH-end 2012 iMac, with the BEST GPU they offered, the GTX 680MX ;). You can see it drop to 43fps, and this isn't even on HIGH settings. So much for a constant 60fps... :rolleyes::rolleyes: Which, again, was the point of my earlier posts. I was never bashing you or your hardware, so no need to take anything personally.

I'll reply...

That person has the resolution set to 1440p... And only an i5 CPU.

Try harder.
 
I'll reply...

That person has the resolution set to 1440p... And only an i5 CPU.

Try harder.
Where did you see it set to 1440p? The video is 1080p, not 1440p. The GTX 680MX is about equivalent to the GTX 660. Again, for reference, with no characters on screen, no effects, explosions, etc...
http://gpu.userbenchmark.com/Compare/Nvidia-GTX-680MX-vs-Nvidia-GTX-660/m8956vs2162
[chart: Overwatch 1080p benchmark results]


i7 or i5 doesn't matter. The game is more GPU-bound than CPU-bound. The game will run on i3s.
 
Where did you see it set to 1440p? The video is 1080p, not 1440p. The GTX 680MX is about equivalent to the GTX 660. Again, for reference, with no characters on screen, no effects, explosions, etc...
http://gpu.userbenchmark.com/Compare/Nvidia-GTX-680MX-vs-Nvidia-GTX-660/m8956vs2162


i7 or i5 doesn't matter. The game is more GPU-bound than CPU-bound. The game will run on i3s.

On the actual YouTube video you posted? If you click on it, the details are right there.

The 680MX is not 'about equivalent' to the GTX 660... The 660 has lower shader performance, memory bandwidth, and fill rate than the 680MX.

Use the 750 Ti you seem to be so fond of as a comparison...

http://gpu.userbenchmark.com/Compare/Nvidia-GTX-680MX-vs-Nvidia-GTX-750-Ti/m8956vs2187

The 680MX is faster than a 660... And the 660 is the 'recommended' graphics card for Overwatch.
https://us.battle.net/support/en/article/65159

https://www.reddit.com/r/Overwatch/comments/4jll9u/list_of_gpus_that_can_meet_or_exceed_recommended/

At 1080p, Overwatch can happily maintain 60fps on a 680MX... a graphics card that shipped in a Mac in 2012.

We're almost done with 2017... Modern Macs will happily run the game at 60fps (even laptops).
That's the point I was making from the start, and the one I'm making now.
 
On the actual YouTube video you posted? If you click on it, the details are right there.

The 680MX is not 'about equivalent' to the GTX 660... The 660 has lower shader performance, memory bandwidth, and fill rate than the 680MX.

Use the 750 Ti you seem to be so fond of as a comparison...

http://gpu.userbenchmark.com/Compare/Nvidia-GTX-680MX-vs-Nvidia-GTX-750-Ti/m8956vs2187
Highly doubt that is at 1440p, as these results show otherwise. I would trust a reputable website's thorough testing over a video description with no settings shown. Either it's at 1080p on low-medium settings, or at 1440p low-medium with resolution scaling at 50% (not really 1440p; see the quick arithmetic after the quote below). Either way, here are the results from the same website, which tested multiple GPUs.
[chart: Overwatch 1440p benchmark results]


Edit:
At 1440p the R7 260X and GTX 660 averaged around 40fps, and while this isn't ideal performance, it is playable. For the much more desirable 60fps something like an old HD 7950 or GTX 680 will get the job done nicely, as will the R9 380 and GTX 960.

Keep in mind that Overwatch is a competitive online first person shooter and we found a significant difference between playing at 60fps and say 90fps for example. That being the case, serious gamers will want at least a GTX 970 or R9 390 for competitive performance at 1440p.
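[To make the render-scale caveat above concrete: Overwatch's render scale applies per axis, so "1440p at 50%" renders only a quarter of the native pixels. A quick check in Python, assuming per-axis scaling, which is how the game's setting behaves.]

```python
# What "1440p at 50% render scale" actually renders. Overwatch's render
# scale applies to each axis, so 50% means a quarter of the pixels.
native = (2560, 1440)
scale = 0.50
internal = (int(native[0] * scale), int(native[1] * scale))
print(internal)  # (1280, 720): effectively 720p internally
print((internal[0] * internal[1]) / (native[0] * native[1]))  # 0.25
```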
 
Highly doubt that is at 1440p, as these results show otherwise. I would trust a reputable website's thorough testing over a video description with no settings shown. Either it's at 1080p on low-medium settings, or at 1440p low-medium with resolution scaling at 50% (not really 1440p). Either way, here are the results from the same website, which tested multiple GPUs.

Then don't post random YouTube videos without context as 'evidence'...
I'll repeat my point.

We're almost done with 2017... Modern Macs will happily run the game at 60fps (even laptops).
That's the point I was making from the start, and the one I'm making now.
 
Then don't post random YouTube videos without context as 'evidence'...
I'll repeat my point.

We're almost done with 2017... Modern Macs will happily run the game at 60fps (even laptops).
That's the point I was making from the start, and the one I'm making now.
Well, my point wasn't about modern systems, so I'm unsure why you have been quoting me ever since... I mentioned that the newer ones can play it; the older ones, on the other hand, will struggle to hold 60fps at high/ultra/epic settings... You can't upgrade the iMacs, so in essence an upgrade would consist of a new computer altogether... unlike PCs.
 
Well, my point wasn't about modern systems, so I'm unsure why you have been quoting me ever since... I mentioned that the newer ones can play it; the older ones, on the other hand, will struggle to hold 60fps at high/ultra/epic settings... You can't upgrade the iMacs, so in essence an upgrade would consist of a new computer altogether... unlike PCs.
That's not what he meant. Apple doesn't offer good GPUs, and the 580 is the best they currently offer. Their second-best GPU probably can't play Overwatch at a consistent 60fps, which I personally feel is pretty low for an FPS.
Oh?
 
Probably. And if you were to open the game without changing the default resolution or settings, do you REALLY think the RX 575 can output 60fps at the default resolution/settings? The 580 can barely maintain 60, so I doubt its younger brother, the RX 575, can... I'll wait though...

I don't know about the 575... But here's the 570 running at 1440p ULTRA...
 
2048x1440 =/= 5120x2880, or even 4K (3840x2160) for that matter...
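[For scale, the raw pixel counts behind the resolutions being compared; this is simple arithmetic, and the labels are mine.]

```python
# Raw pixel counts for the resolutions under discussion.
resolutions = {
    "2048x1440 (the video)": 2048 * 1440,
    "3840x2160 (4K UHD)": 3840 * 2160,
    "5120x2880 (5K iMac)": 5120 * 2880,
}
base = resolutions["2048x1440 (the video)"]
for name, px in resolutions.items():
    print(f"{name}: {px / 1e6:.1f} MP, {px / base:.1f}x the pixels")
# 2.9 MP baseline; 4K is ~2.8x the pixels, 5K is 5.0x
```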

Oh... Now we're into 'gaming must be at 5K'... Neat. Remember this?

You don't speak for everyone either. No one said anything about 4K at 120fps. I only mentioned what card should be used as a minimum, as a dropping framerate in online gaming is not an enjoyable experience. My guess is you haven't played Overwatch on a PC, as you don't seem to grasp how critical it is to have a constant 60fps.
 