Can't find where I stated that? You keep arguing that you can play it, but you can't seem to find where I said otherwise... My main point was about framerate, not inability to run the game.
You're a funny guy.
Maybe it's HotS that isn't playable, not SC2. The minimap cannot be clicked.
Anyway, what is it about Starcraft 2 that's supposed to be broken? I was just playing that using Metal for a little while and couldn't find anything that didn't work.
A *percentage* of Macs from 2012 can play above 60fps, you mean. Only the high-end variant, as a matter of fact, as most units sold included a GTX 660M.
Your point was frame rate (60fps). And I pointed out that a Mac from 5 years ago can pull that off... recent ones can also.
A 680MX is definitely better than a 750ti.
More TFLOPS doesn't automatically mean it's a better GPU.
Probably =/= definitely/for sure
That's not what he meant. Apple doesn't offer good GPUs, and the 580 is the best they offer currently. Their second-best GPU offered probably can't play Overwatch at a consistent 60fps, which I personally feel is a pretty low bar for an FPS.
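For what it's worth, the TFLOPS number being argued about here is easy to compute and just as easy to over-read: peak FP32 throughput is just shader count × clock × 2 FLOPs per cycle (one fused multiply-add), and it says nothing about architecture efficiency, memory bandwidth, or drivers. A rough sketch in Python; the core counts and clocks are approximate public spec-sheet figures, not measurements:

```python
# Peak FP32 throughput = shader cores * clock * 2 FLOPs/cycle (one FMA).
# Core counts and clocks below are approximate spec-sheet figures.
def peak_tflops(cores: int, clock_mhz: float) -> float:
    return cores * clock_mhz * 1e6 * 2 / 1e12

gpus = {
    "GTX 680MX (Kepler)":       (1536, 720),   # ~2.2 TFLOPS on paper
    "GTX 750 Ti (Maxwell)":     (640, 1085),   # ~1.4 TFLOPS on paper
    "Radeon Pro 460 (Polaris)": (1024, 907),   # Apple quotes 1.86 TFLOPS
    "PS4 GPU (GCN)":            (1152, 800),   # commonly quoted 1.84 TFLOPS
}

for name, (cores, mhz) in gpus.items():
    print(f"{name}: {peak_tflops(cores, mhz):.2f} TFLOPS peak FP32")
```

The older Kepler chip "wins" on paper, yet per-FLOP efficiency, memory bandwidth, and thermals decide which card actually holds 60fps, which is the whole disagreement here.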
Many Macs from the past few years can play Overwatch at 60 fps with decent settings. That includes the 2016+ MacBook Pros, which apparently sell like hotcakes.
Now stop digging that hole.
These thermal constraints can't be that much of a problem, as Overwatch indeed runs at a constant 60 fps on my Radeon Pro 460-equipped rMBP with at least PS4-equivalent settings (where the game uses dynamic resolution scaling to maintain the framerate).
The PS4 GPU isn't as thermally constrained as the Radeon Pro 460 in a MacBook Pro, so it will sustain higher performance over time, even though they have similar peaks of 1.84 vs. 1.86 TFLOPS.
However, the point you are making is that this is more than adequate to play a PS4-level game, and it absolutely should be, with some tweaks for said thermal constraints.
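Dynamic resolution scaling, as mentioned above, is conceptually just a feedback loop: watch the GPU frame time and shrink or grow the internal render resolution to protect the 16.7 ms budget. Here is a minimal sketch of the general technique; it's a generic proportional controller with made-up constants, not Blizzard's actual implementation:

```python
TARGET_MS = 1000.0 / 60.0          # 16.7 ms frame budget for 60 fps
MIN_SCALE, MAX_SCALE = 0.5, 1.0    # clamp the per-axis render scale
GAIN = 0.1                         # small gain so the image doesn't visibly pump

def update_render_scale(scale: float, gpu_frame_ms: float) -> float:
    """Nudge the render scale toward the frame-time budget."""
    error = (TARGET_MS - gpu_frame_ms) / TARGET_MS  # positive = headroom
    return max(MIN_SCALE, min(MAX_SCALE, scale + GAIN * error))

# A firefight pushes frame time past budget; the scale backs off:
scale = 1.0
for ms in (22.0, 21.0, 19.0, 17.5, 16.5):
    scale = update_render_scale(scale, ms)
    print(f"gpu frame {ms:4.1f} ms -> render scale {scale:.2f}")
```

The output resolution never changes; only the internal render target shrinks, which is why a "60 fps" reading on its own doesn't tell you how many pixels were actually being drawn.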
I play casual and competitively, but it depends on the game, honestly. I'm just a fan of video games in general, and playing them "the way it's meant to be played".
Scorched Earth,
Do you game competitively, as in, with prize money on the line?
While I LOVE to play video games, and I make money from it on the side, I couldn't do it as a full-time streamer/pro. The idea of that as a job seems like it would ultimately take away from the enjoyment of playing them.
That's cool. It just sounded like you approached gaming with the priorities of someone whose livelihood depended on getting the highest possible performance out of them.
Certainly, if I could, I would prioritize gaming higher on the budgeting continuum, but life happens, as they say.
The most exciting development on the Mac from my point of view is Apple's support for external Thunderbolt GPUs in macOS.
What are your thoughts on that?
Yeah, no problems clicking the minimap in SC2.
I'm replying to this post.. Not your most recent one... You know exactly why.
A GPU from a 2012 iMac can run Overwatch at 60fps.... Their 'second best GPU' will most certainly run the game at 60fps.
He’s simply trolling now.
Definitely not trolling, smh. Did you even bother to read how they tested the game? In an EMPTY map, which means 1 player in an empty map, with no explosions, bullet ricochets, sparks, extra lighting effects, character models, etc. Here's an excerpt.
"Testing Methodology
Benchmarking Overwatch accurately is a real problem because it's exclusively an online multiplayer game. Getting more than a dozen co-operative players on a map at the same time to carry out a series of benchmarks over two days isn't realistic. Therefore, we decided to test GPU performance using a custom map with no other players.
This saw me walk around an empty map for 60 seconds taking the exact same path each time using the same character. The Hollywood map was used though it wasn't selected for any particular reason."
You don't even have to reply to this, as there is no point in arguing about this topic. With an additional 11 people added into the mix, as well as constant firefights, etc., those frames are going to drop. It's not rare to drop 20 to 30 frames when moving from an empty area of the map to an area with a few other players. I even found you a video of the HIGH-end 2012 Mac, with the BEST GPU they offered, the GTX 680MX. You can see it drop to 43fps, and this isn't even on HIGH settings. So much for constant 60fps...
Which, again, was the point of my earlier posts. I was never bashing you or your hardware, so no need to take anything personally.
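This methodology complaint has a standard remedy in GPU reviews: log every frame time and report the 1% lows next to the average, because the average hides exactly the firefight dips being described. A quick sketch of how both figures fall out of a frame-time log; the sample numbers are invented for illustration:

```python
# Average fps vs. 1% lows from a per-frame time log (milliseconds).
# Invented sample: a smooth run with a brief firefight spike to ~43 fps.
frame_times_ms = [16.7] * 500 + [23.0] * 15 + [16.7] * 485

def fps_stats(times_ms):
    avg_fps = 1000.0 * len(times_ms) / sum(times_ms)
    slowest = sorted(times_ms, reverse=True)           # worst frames first
    worst_1pct = slowest[: max(1, len(slowest) // 100)]
    low_fps = 1000.0 / (sum(worst_1pct) / len(worst_1pct))
    return avg_fps, low_fps

avg, low = fps_stats(frame_times_ms)
print(f"average: {avg:.1f} fps, 1% low: {low:.1f} fps")
# -> average: 59.5 fps, 1% low: 43.5 fps
```

An empty-map walkthrough produces a clean average and almost no bad frames to land in that worst-1% bucket, which is the objection being made here.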
I'll reply...
That person has the resolution set at 1440p.... And only the i5 CPU.
Try harder.
Where did you see it set to 1440p? The video is 1080p, not 1440p. The GTX 680MX is about equivalent to the GTX 660. Again, for reference, with no characters on screen, effects, explosions, etc...
http://gpu.userbenchmark.com/Compare/Nvidia-GTX-680MX-vs-Nvidia-GTX-660/m8956vs2162
i7 or i5, it doesn't matter. The game is more GPU-bound than CPU-bound. The game will run on i3s.
On the actual youtube video you posted? If you click on it the details are right there.
The 680MX is not 'about equivalent' to the GTX 660... The 660 has lower shader performance, memory bandwidth, and fill rate than the 680MX.
Use the 750ti you seem to be so fond of as a comparison...
http://gpu.userbenchmark.com/Compare/Nvidia-GTX-680MX-vs-Nvidia-GTX-750-Ti/m8956vs2187
Highly doubt that is at 1440p, as these results show otherwise. I would trust a reputable website's thorough testing over a video's description with no settings being shown. Either it's at 1080p on low-medium settings, or 1440p low-medium with resolution scaling at 50% (not really 1440p). Either way, here are the results from the same website that tested multiple GPUs.
Then don't post random youtube videos without context as 'evidence'....
I'll repeat my point.
We're almost done with 2017.... Modern Macs will happily run the game at 60fps (even laptops).
That's the point I was making from the start, and the one I'm making now.
Well my point wasn't against modern systems, so unsure why you have been quoting me ever since... I mentioned that the newer ones can play it; the older ones, on the other hand, will struggle to hold 60fps at high/ultra/epic settings... You can't upgrade the iMacs, so in essence an upgrade would consist of a new computer altogether... unlike PCs.
Oh?
That's not what he meant. Apple doesn't offer good GPUs, and the 580 is the best they offer currently. Their second-best GPU offered probably can't play Overwatch at a consistent 60fps, which I personally feel is a pretty low bar for an FPS.
Probably. And if you were to open the game without changing the default resolution or settings, do you REALLY think the RX 575 can output 60fps at the default resolution/settings? The 580 can barely maintain 60, so I doubt its younger brother the RX 575 can... I'll wait though...
I don't know about the 575.. But here's the 570 running at 1440p ULTRA...
2048x1440 =/= 5120x2880, or even 4K (3840x2160) for that matter...
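The resolution gap being argued about is easier to see as pixel counts, since fill-rate cost scales with pixels drawn, not with the label on the mode. A quick comparison using the figures from this thread; the assumption that Overwatch's render-scale percentage applies per axis (so 50% means a quarter of the pixels) is mine:

```python
# Pixels per frame at the resolutions argued about above.
modes = {
    "2048x1440":      (2048, 1440),
    "3840x2160 (4K)": (3840, 2160),
    "5120x2880 (5K)": (5120, 2880),
}

base_w, base_h = modes["2048x1440"]
for name, (w, h) in modes.items():
    print(f"{name}: {w * h / 1e6:4.1f} MP ({w * h / (base_w * base_h):.1f}x)")

# 50% render scale, assuming the percentage applies per axis:
w, h = int(base_w * 0.5), int(base_h * 0.5)
print(f"2048x1440 at 50% scale -> {w}x{h} internal "
      f"({w * h / 1e6:.2f} MP, a quarter of the pixels)")
```

Pushing roughly five times the pixels at the 5K default is why "runs at 1440p ULTRA" and "runs at default settings" are different claims.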
You don't speak for everyone either. No one said anything about 4K 120fps. I only mentioned what card should be used as a minimum, as dropping framerate in online gaming is not an enjoyable experience. My guess is you haven't played Overwatch on a PC, as you don't seem to grasp how critical it is to have a constant 60fps.