My point exactly. Blizzard is, in fact, devoting few if any resources to Mac development anymore: SC2 broken for months, no D3 at all, lame excuses re: Overwatch. I just hope they still provide a Mac client for the next WoW expansion.

D3 couldn't go Metal until it went 64-bit, and that only happened recently.
 
Currently, only WoW works on Metal. SC2 has been broken since September, and support appears very poor if you read the forums; apparently they don't feel like fixing the game.
And the reason given for not porting Overwatch is very dubious. They say the Mac technology isn't adequate. Even now, with Metal 2 and the latest Macs? What's so special about Overwatch that makes it impossible on the Mac, considering the game works on consoles and several advanced game engines now run on Metal? This looks like a cop-out. :rolleyes:

Not everyone is running the 2017 iMac with the Radeon Pro 580, which is really the only system worth a damn in terms of 3D performance. A fast-paced FPS like Overwatch isn't going to be a fun experience at absolute minimum detail and 25fps on an Intel iGPU.
 
If a Radeon 580 or better were required to play Overwatch, Blizzard would have trouble selling the game to PC users.
 
If a Radeon 580 or better were required to play Overwatch, Blizzard would have trouble selling the game to PC users.
That's not what he meant. Apple doesn't offer good GPUs, and the 580 is the best they currently offer. Their second-best GPU probably can't play Overwatch at a consistent 60fps, which I personally feel is a pretty low bar for an FPS.
 
The real question is how many Mac users would purchase the game. It was implied that only those with a Radeon 580 iMac would/should. I highly doubt that's the case, because PC sales probably include a very large portion of gamers with slower GPUs. If they didn't, Blizzard wouldn't be making that game on the PC; not that many PC users have a very recent GPU.
And I wouldn't be surprised if Mac users were willing to make more sacrifices than PC users, due to the lack of high-end GPUs.

I'm not saying that porting the game to Mac is worth it. In fact, I'm surprised that developers even spend the time and money. I think those who do mostly base their decision on principle (make a game/engine available on all platforms, regardless of user base), undisclosed agreements with Apple (?), or some personal interest in the Mac (e.g., Cyan, who always liked Apple). Economically speaking, I really don't think porting a game to the Mac can be profitable (except for houses specialised in porting, but that's another topic). I'm not sure which category Blizzard falls into or why they even considered porting their games. What's unclear is why Overwatch is different from their other games; few Macs were capable of running SC2 decently on release.
 
The Overwatch minimum specs on PC include integrated graphics. Not everyone needs 4K max settings 120fps, and that goes for PC gamers too.

--Eric
 
The Overwatch minimum specs on PC include integrated graphics. Not everyone needs 4K max settings 120fps, and that goes for PC gamers too.

--Eric
Anything below the GTX 750 Ti would drop below 60fps. While Overwatch does not require the beefiest specs, an integrated video card cannot do this game justice. If you have to turn the settings to low and play at a much smaller resolution just to maintain frames, it's not worth it.
[Image: Overwatch 1080p benchmark chart]



Here is a video of the game running on the minimum specs (Intel HD 4400).
This cannot be considered "playable" when it comes to an online-only, competitive game.



 
If you have to turn the settings to low and play at a much smaller resolution just to maintain frames, it's not worth it.
Your opinion only.

This cannot be considered "playable" when it comes to an online-only, competitive game.
Also mere opinion. I wouldn't play it using integrated graphics, but again, not everyone cares that much. Given the millions of console gamers playing online stuff at 30fps for however many years, it's clearly not as big of a deal as you're making out. Also, even the recommended specs are just an HD 7950, which is pretty far from an RX 580.

--Eric
 
Anything below the GTX 750 Ti would drop below 60fps. While Overwatch does not require the beefiest specs, an integrated video card cannot do this game justice. If you have to turn the settings to low and play at a much smaller resolution just to maintain frames, it's not worth it.



Here is a video of the game running on the minimum specs (Intel HD 4400).
This cannot be considered "playable" when it comes to an online-only, competitive game.




It runs at 60fps on a stock PS4... Even last year's top 15" MBP has a better video card than that. iMacs dating back years would be fine even for that.
 
Your opinion only.


Also mere opinion. I wouldn't play it using integrated graphics, but again, not everyone cares that much. Given the millions of console gamers playing online stuff at 30fps for however many years, it's clearly not as big of a deal as you're making out. Also, even the recommended specs are just an HD 7950, which is pretty far from an RX 580.

--Eric
There is a reason PCMR exists: no one wants to play at a mere 30fps on PC. No one wants that one teammate who suffers from low framerate and has excuses for why they aren't playing up to par. I personally wouldn't want that person on my team.

I said the RX 580 will play it perfectly fine, but not many Macs besides the later models will play it at a constant 60fps. I mentioned the GTX 750 Ti because that is the lowest card it should be played on. Anything below that will drop frames.
It runs at 60fps on a stock PS4... Even last year's top 15" MBP has a better video card than that. iMacs dating back years would be fine even for that.
Just check out those same cards on YouTube playing Overwatch and see how they fare. There's a reason many companies aren't porting games to Macs. It's not just the small user base, but the lack of hardware to play many AAA games. Console hardware =/= PC hardware, nor does performance scale the same way, because of differences in OS and how resources are used.
 
I said the RX 580 will play it perfectly fine, but not many Macs besides the later models will play it at a constant 60fps. I mentioned the GTX 750 Ti because that is the lowest card it should be played on. Anything below that will drop frames.
Just check out those same cards on YouTube playing Overwatch and see how they fare. There's a reason many companies aren't porting games to Macs. It's not just the small user base, but the lack of hardware to play many AAA games. Console hardware =/= PC hardware, nor does performance scale the same way, because of differences in OS and how resources are used.

You mentioned the GTX 750 Ti... My iMac from 2012 has a better card than that (680MX).

What exactly do you mean by "not many Macs will play it at a constant 60fps"?! Many absolutely will. And as can be seen from the benchmarks you posted yourself, that's on ULTRA settings! It's OKAY to turn settings down to gain framerate.
 
You mentioned the GTX 750 Ti... My iMac from 2012 has a better card than that (680MX).

What exactly do you mean by "not many Macs will play it at a constant 60fps"?! Many absolutely will. And as can be seen from the benchmarks you posted yourself, that's on ULTRA settings! It's OKAY to turn settings down to gain framerate.
The 680M is not much better than the 750 Ti. You realize the base model iMac in 2012 came with a 660M; the more expensive one had the 680MX. The M/MX variants of Nvidia cards are nowhere near as powerful as the non-M/MX versions.

With that out of the way, there isn't much you can do to upgrade besides dropping another two grand on Apple's system. On PC, you could just upgrade the video card and be good to go. And that upgrade would be for a full-performance, non-M-variant GPU.
 
No. Again, that's simply your own opinion; you don't speak for everyone. The "must have everything maxed out 4K 120fps or go home" crowd is actually a minority. I'll point out once more that Blizzard's recommended (not minimum) PC specs start with the HD 7950. If they catered only to the ultra-hardcore crowd, they would not have enough players to survive.

--Eric
 
The 680M is not much better than the 750 Ti. You realize the base model iMac in 2012 came with a 660M; the more expensive one had the 680MX. The M/MX variants of Nvidia cards are nowhere near as powerful as the non-M/MX versions.

With that out of the way, there isn't much you can do to upgrade besides dropping another two grand on Apple's system. On PC, you could just upgrade the video card and be good to go. And that upgrade would be for a full-performance, non-M-variant GPU.

I'm simply stating that my FIVE YEAR OLD iMac has a better graphics card than a 750 Ti, and yes, it's quite a bit better (1.4 TFLOPS for the 750 Ti vs 2.2 TFLOPS for the 680MX, as well as roughly double the fill rate and memory bandwidth!).

I'm not debating the pros and cons of building your own gaming PC, or anything like that.. Enough with the strawman arguments.

I'm simply stating that, from the benchmarks you posted, my FIVE YEAR OLD iMac more than meets the requirements to run Overwatch on ultra at 1080p/60fps.

Got it?
 
Your opinion only.


Also mere opinion. I wouldn't play it using integrated graphics, but again, not everyone cares that much. Given the millions of console gamers playing online stuff at 30fps for however many years, it's clearly not as big of a deal as you're making out. Also, even the recommended specs are just an HD 7950, which is pretty far from an RX 580.

--Eric

No. Again, that's simply your own opinion; you don't speak for everyone. The "must have everything maxed out 4K 120fps or go home" crowd is actually a minority. I'll point out once more that Blizzard's recommended (not minimum) PC specs start with the HD 7950. If they catered only to the ultra-hardcore crowd, they would not have enough players to survive.

--Eric
You don't speak for everyone either. No one said anything about 4K 120fps. I only mentioned what card should be used as a minimum, as dropping framerate in online gaming is not an enjoyable experience. My guess is you haven't played Overwatch on a PC, as you don't seem to grasp how critical it is to have a constant 60fps.
I'm simply stating that my FIVE YEAR OLD iMac has a better graphics card than a 750 Ti, and yes, it's quite a bit better (1.4 TFLOPS for the 750 Ti vs 2.2 TFLOPS for the 680MX, as well as roughly double the fill rate and memory bandwidth!).

I'm not debating the pros and cons of building your own gaming PC, or anything like that.. Enough with the strawman arguments.

I'm simply stating that, from the benchmarks you posted, my FIVE YEAR OLD iMac more than meets the requirements to run Overwatch on ultra at 1080p/60fps.

Got it?
So the base model iMac of 2012 will play this game at a BASIC 60fps, CONSTANTLY? That would be with a 660M, not a 680M, as not everyone bought the upgraded 2012 Mac. The 750 Ti is 86.4GB/sec vs 115.2GB/sec for the 680M. Nowhere near double.
TFLOPS don't equate to a better GPU. Why is it that an RX 580 can produce higher TFLOPS than a GTX 1070, yet the GTX 1070 will run circles around the RX 580?

The most used GPU on Steam right now is the GTX 960.
http://store.steampowered.com/hwsurvey/
You can use this as a reference to see how far behind Apple is compared to PC. This is another reason devs don't care to port games. Not sure why this seems so trivial to you two.
 
You don't speak for everyone either. No one said anything about 4K 120fps. I only mentioned what card should be used as a minimum, as dropping framerate in online gaming is not an enjoyable experience. My guess is you haven't played Overwatch on a PC, as you don't seem to grasp how critical it is to have a constant 60fps.
So the base model iMac of 2012 will play this game at a BASIC 60fps, CONSTANTLY? That would be with a 660M, not a 680M, as not everyone bought the upgraded 2012 Mac. The 750 Ti is 86.4GB/sec vs 115.2GB/sec for the 680M. Nowhere near double.
TFLOPS don't equate to a better GPU. Why is it that an RX 580 can produce higher TFLOPS than a GTX 1070, yet the GTX 1070 will run circles around the RX 580?

The 680mx has 160GB/s memory bandwidth.... Here.. Happy reading.
https://www.geforce.com/hardware/notebook-gpus/geforce-gtx-680mx/specifications

Would you please cut out the damn strawman arguments? It's painful. I didn't say a damn thing about a 660m, you did.
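For what it's worth, both bandwidth figures in this exchange check out once you compare the right chips: peak GDDR5 bandwidth is just the effective memory clock times the bus width in bytes. A quick sketch (the clock and bus-width figures are taken from the vendor spec pages linked above; treat them as assumptions):

```python
def bandwidth_gbps(effective_clock_gtps, bus_width_bits):
    """Peak memory bandwidth in GB/s: transfers/sec times bytes per transfer."""
    return effective_clock_gtps * (bus_width_bits / 8)

# GTX 750 Ti: 5.4 GT/s effective GDDR5 on a 128-bit bus
print(round(bandwidth_gbps(5.4, 128), 1))  # → 86.4
# GTX 680MX: 5.0 GT/s effective GDDR5 on a 256-bit bus
print(round(bandwidth_gbps(5.0, 256), 1))  # → 160.0
```

So the 680MX (160 GB/s) does have close to double the 750 Ti's 86.4 GB/s, while the plain 680M's narrower memory clock lands it at 115.2 GB/s, which is where the confusion between the two chips came from.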
 
The 680mx has 160GB/s memory bandwidth.... Here.. Happy reading.
https://www.geforce.com/hardware/notebook-gpus/geforce-gtx-680mx/specifications

Would you please cut out the damn strawman arguments? It's painful. I didn't say a damn thing about a 660m, you did.
My mistake; I personally don't use mobile cards in a desktop (not sure why anyone would use a mobile GPU in a desktop, honestly). Sorry for the misunderstanding. But again, that doesn't mean much.

Do you understand why dropping frames in online gaming is detrimental though? It seems you have yet to focus on that point in each one of your irrelevant posts. Until you can grasp this, you will not get any of the points I have made thus far.

Watch this in its entirety and see if you can take anything from this video. It might help you understand why FPS matters in online gaming.
 
You don't speak for everyone either.
Right, and nothing about what I'm saying is doing so. Your claims of what is "acceptable" apply only to you, which you seem unaware of. I'm perfectly happy letting people play on any settings they want and settling for 30fps if they want. I wouldn't personally, but I'm capable of realizing that my standards for games don't apply to everyone. The point is that whatever reason Blizzard has for not porting Overwatch, it's not really hardware, since they cater to a much broader audience than high-end only.

Anyway, what is it about Starcraft 2 that's supposed to be broken? I was just playing that using Metal for a little while and couldn't find anything that didn't work.

--Eric
 
My mistake; I personally don't use mobile cards in a desktop (not sure why anyone would use a mobile GPU in a desktop, honestly). Sorry for the misunderstanding. But again, that doesn't mean much.

Do you understand why dropping frames in online gaming is detrimental though? It seems you have yet to focus on that point in each one of your irrelevant posts. Until you can grasp this, you will not get any of the points I have made thus far.

Watch this in its entirety and see if you can take anything from this video. It might help you understand why FPS matters in online gaming.

I'm honestly not sure what point you're trying to make anymore...

One more time...... I'll state mine.

MY FIVE YEAR OLD IMAC WILL RUN OVERWATCH AT 60FPS.
My FIVE YEAR OLD IMAC has a much better video card than a 750ti (overwatch minimum for 1080p/60fps/ultra)
Not a recent model...
FIVE. YEARS. OLD.
Got it?!
 
I'm honestly not sure what point you're trying to make anymore...

One more time...... I'll state mine.

MY FIVE YEAR OLD IMAC WILL RUN OVERWATCH AT 60FPS.
My FIVE YEAR OLD IMAC has a much better video card than a 750ti (overwatch minimum for 1080p/60fps/ultra)
Not a recent model...
FIVE. YEARS. OLD.
Got it?!
And I will state mine.
YOUR 2012 model can play it, sure, but the base model of that SAME computer may not fare so well.
YOU paid more to get the upgrade, but did EVERYONE ELSE?
Many PCs from that time have no issue playing Overwatch above 60fps, and cost a fraction of what you paid.
More TFLOPS =/= automatically a better GPU
Probably =/= definitely/for sure
STABLE 60fps >>>> below 60fps.
Did you get that?! lol
 
And I will state mine.
YOUR 2012 model can play it, sure, but the base model of that SAME computer may not fare so well.
YOU paid more to get the upgrade, but did EVERYONE ELSE?
Many PCs from that time have no issue playing Overwatch above 60fps, and cost a fraction of what you paid.
More TFLOPS =/= automatically a better GPU
Probably =/= definitely/for sure
STABLE 60fps >>>> below 60fps.
Did you get that?! lol

No one was talking about 'base model of same computer' apart from you.
I didn't pay for the upgrade, I bought this imac second hand for very little money.
More TFLOPS/Compute/Memory Bandwidth/Fill Rate = Better GPU. YES! FOR SURE! I'm happy to bet money on this one!
STABLE 60fps is Stable 60fps.

As I said... Please cut the strawman arguments.

My five year old imac can run overwatch at a stable 60fps on ultra/1080p... It's not just recent mac models.
That's the point I made at the start.. And the point I'm making now.
 
TFLOPS don't equate to a better GPU. Why is it that an RX 580 can produce higher TFLOPS than a GTX 1070, yet the GTX 1070 will run circles around the RX 580?

Comparing TFLOPS alone across vendors is generally not meaningful - neither was comparing FLOPS/GFLOPS scores between PPC & x86. There are too many other variables in play when comparing different architectures for it to be an accurate measure of real-workload performance.

The two big GPU vendors have taken quite different philosophical approaches to making faster GPUs - much like the way PPC & x86 differed - so AMD has higher theoretical performance, but Nvidia gets more out of its TFLOPS in games.
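To make the "theoretical performance" point concrete: the peak-TFLOPS figures being traded in this thread all come from the same back-of-envelope formula, shader count × clock × 2 FLOPs per cycle (one fused multiply-add per shader per cycle). A rough sketch, using published shader counts and clocks as assumptions:

```python
def peak_tflops(shaders, clock_ghz):
    """Theoretical peak single-precision TFLOPS: shaders x clock x 2 FLOPs (FMA)."""
    return shaders * clock_ghz * 2 / 1000

print(round(peak_tflops(2304, 1.340), 2))  # → 6.17  (RX 580, boost clock)
print(round(peak_tflops(1920, 1.506), 2))  # → 5.78  (GTX 1070, base clock)
print(round(peak_tflops(1920, 1.683), 2))  # → 6.46  (GTX 1070, boost clock)
```

Note that even which clock you plug in changes the ranking: the RX 580 tops the 1070 at base clock but not at boost. Either way, the paper number says nothing about how efficiently each architecture keeps its shaders fed in real games, which is where Nvidia typically gets more out of each theoretical FLOP.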
It runs at 60fps on a stock PS4... Even last year's top 15" MBP has a better video card than that. iMacs dating back years would be fine even for that.

The PS4 GPU isn't as thermally constrained as the Radeon Pro 460 in a MacBook Pro, so it will sustain higher performance over time, even though they have similar peaks of 1.84 vs. 1.86 TFLOPS.

However, the point you are making is that this is more than adequate to play a PS4-level game, and it absolutely should be, with some tweaks for said thermal constraints.
 
No one was talking about 'base model of same computer' apart from you.
I didn't pay for the upgrade, I bought this imac second hand for very little money.
More TFLOPS/Compute/Memory Bandwidth/Fill Rate = Better GPU. YES! FOR SURE! I'm happy to bet money on this one!
STABLE 60fps is Stable 60fps.

As I said... Please cut the strawman arguments.

My five year old imac can run overwatch at a stable 60fps on ultra/1080p... It's not just recent mac models.
That's the point I made at the start.. And the point I'm making now.
Please quote the post where I pointed YOU out specifically and said YOU can't run Overwatch. The base model of that computer will NOT run it at a consistent 60fps. It will NOT happen. YOU do NOT equal the whole user base of the 2012 Macs. Not everyone had the upgraded model either, which means not everyone with a 2012 Mac can play at 60fps. As a matter of fact, neither model of the 2012 Mac can play it, as Overwatch is not available on Apple products. Your posts are overflowing with straw. You keep arguing something I never even said. LOL
 
Please quote the post where I pointed YOU out specifically and said YOU can't run Overwatch. The base model of that computer will NOT run it at a consistent 60fps. It will NOT happen. YOU do NOT equal the whole user base of the 2012 Macs. Not everyone had the upgraded model either, which means not everyone with a 2012 Mac can play at 60fps. As a matter of fact, neither model of the 2012 Mac can play it, as Overwatch is not available on Apple products. Your posts are overflowing with straw. You keep arguing something I never even said. LOL

You’re a funny guy.
 