Not true anymore. A lot of modern games are making use of more than 4 GB now, even at 1080p and 1440p. For example, with Red Dead Redemption 2, you can turn up texture quality to Ultra without much hit to performance if you have an 8 GB card, which makes the game look amazing, but you need to turn down texture quality on 4 GB cards. And anything less than Ultra texture quality in RDR2 looks significantly worse.

Just look at this comparison between a 4 GB GTX 1060 and an 8 GB GTX 1060 that was done a couple years ago. The 4 GB card skips and lags in every game. And the GTX 1060 is very similar in performance to the 5500M, so this isn’t some ultra high res 4K gaming comparison that won’t apply. These tests are at 1080p.

This is a test of 4GB vs 8GB of system RAM. Of course the game is going to lag with only 4GB of RAM. The video card being used is a GTX 1060 6GB, which only came in 3GB and 6GB variants.
 
This is a test of 4GB vs 8GB of system RAM. Of course the game is going to lag with only 4GB of RAM. The video card being used is a GTX 1060 6GB, which only came in 3GB and 6GB variants.

Oh shoot, you're right. Well, even ignoring the video, my comment about Red Dead Redemption 2 still stands. Can't enable Ultra quality textures in that game unless you have an 8 GB card, even at 1080p.
 
Oh shoot, you're right. Well, even ignoring the video, my comment about Red Dead Redemption 2 still stands. Can't enable Ultra quality textures in that game unless you have an 8 GB card, even at 1080p.

Isn’t RDR 2 running poorly even on super high end gaming rigs? Seems like it may be asking too much to expect good performance even with the 8GB 5500
 
Isn’t RDR 2 running poorly even on super high end gaming rigs? Seems like it may be asking too much to expect good performance even with the 8GB 5500

Even the PS4 Pro can’t run RDR2 natively in 4K @ 30 FPS so I’d say it’s out of the scope for these machines.

There doesn’t seem to be consensus on these GPUs yet. I’ll probably just stick with the base configuration and save the money for a PS5.
 
Isn’t RDR 2 running poorly even on super high end gaming rigs? Seems like it may be asking too much to expect good performance even with the 8GB 5500

It’s running poorly with everything set to Ultra, but by setting some of the settings to Medium, the game can be made to run well without much loss in visual quality even on midrange cards. I saw a YouTube video of it running fine on an RX 580 at 1080p, and the 5500M should be around RX 580 performance. Look up Hardware Unboxed’s optimization videos if you want to see how to get RDR2 running well without much loss in visual quality. Note, Ultra quality textures look vastly better than High quality and don’t impact performance that much IF you have 8 GB VRAM.
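For a rough sense of why Ultra textures eat VRAM so fast, here's a back-of-envelope estimate of a single texture's footprint. The sizes and compression rates below are illustrative assumptions, not RDR2's actual asset data:

```python
def texture_mib(side_px, bytes_per_texel):
    """Approximate GPU memory for a square texture with a full mip chain.

    The mip chain (half-resolution copies down to 1x1) adds about 1/3
    on top of the base level, hence the 4/3 factor.
    """
    base = side_px * side_px * bytes_per_texel
    return base * 4 / 3 / 2**20  # MiB

# Uncompressed 4096x4096 RGBA8 (4 bytes/texel):
print(round(texture_mib(4096, 4), 1))  # ~85.3 MiB each

# Block-compressed (e.g. BC7, 1 byte/texel):
print(round(texture_mib(4096, 1), 1))  # ~21.3 MiB each
```

Even with block compression, a scene streaming a couple hundred 4K-class textures can plausibly blow past 4 GB on its own, which lines up with Ultra textures needing the 8 GB card.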
 
If we are living in a simulated universe, does the simulation run faster for someone with a 16” mac, or a more-recent windows machine?
 
Looks like the 5500m (4gb) scores around the same as the 580 Pro and GTX 1060 in Unigine Heaven.

Red Dead Redemption 2 will hit 30 FPS on the 580 and 1060 in 4K. Reduce that resolution to the MacBook's native one and you'll probably hit 35 FPS.

This is assuming the very small number of benchmarks translate to gaming performance, but it's looking like this may be the first actual gaming MacBook ever released.
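As a sanity check on that 35 FPS guess, here's a naive sketch that assumes FPS scales at most inversely with rendered pixel count (real scaling is worse, so treat it as an upper bound):

```python
# Naive upper bound: assume fps scales inversely with pixel count.
# (Real scaling is sublinear, so actual gains land below this.)
fps_4k = 30
px_4k = 3840 * 2160
px_mbp16 = 3072 * 1920          # 16" MacBook Pro native panel

upper_bound = fps_4k * px_4k / px_mbp16
print(round(upper_bound, 1))    # 42.2
```

Linear scaling caps the gain around 42 FPS, and since games never scale that cleanly, landing in the mid-30s at native resolution is a reasonable guess.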

 
So my previous post is validated.

So get the 5300M unless your workflow needs the extra video memory (in which case get the 5500M 8GB), or you're convinced that the extra video memory is going to help you that much in gaming.

Just because the 5500M is only 7% faster in an ancient benchmark doesn’t mean there won’t be larger differences in some modern games. Go with a 5300M if you want, but if I were buying one with some gaming in mind, I would want all the performance and future-proofing I could get. The 8 GB 5500M upgrade is not that expensive. Why cheap out and risk regretting the decision later? At the very least I would wait for more benchmarks in actual modern games, and if the difference is still small, only then consider the 5300M.
 
The cost of upgrading the GPU is relatively small compared to the other upgrades, and it obviously comes down to a personal decision, but gaming is a consideration in my laptop purchase and I wouldn't even consider anything other than the 8GB 5500M.

The extra VRAM absolutely matters in games, even at lower resolutions all the way down to 1080p. This GPU should be more than functional in games for a few years going forward. 8GB is pretty much the mid-level minimum in PC gaming right now.

Boot Camp will be updated, probably this week, and the lack of drivers now shouldn't be a significant concern. The holiday return window runs until early January if for some reason Apple decides to suddenly abandon Boot Camp support in their laptops.

Yeah for gaming I'm just upgrading the GPU all the way
 
This is the gaming benchmark on the 16-inch MacBook Pro with the 5300M 4GB; it seems to perform quite well.

The performance is almost the same as the Dell XPS 15 with a GTX 1650. I assume if you get the Pro 5500M with 8GB, the performance might be the same as a GTX 1660 or even better.
 
Oh shoot, you're right. Well, even ignoring the video, my comment about Red Dead Redemption 2 still stands. Can't enable Ultra quality textures in that game unless you have an 8 GB card, even at 1080p.

What I am wondering about is whether a mid-range GPU such as Navi 14 even has enough memory bandwidth and texture filtering performance to support this Ultra mode, VRAM notwithstanding. All the gaming GPUs I know of with an 8GB option also have massive memory buses, with 2x or more the VRAM bandwidth of the Pro 5500M...
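The bandwidth gap is easy to quantify. Assuming the commonly quoted specs (a 128-bit GDDR6 bus at 12 Gbps effective for the Pro 5500M, vs. a 256-bit bus at 14 Gbps on a desktop 8GB card like the RX 5700; these figures are from public spec sheets, not anything I've measured):

```python
def gddr_bandwidth_gbs(bus_width_bits, data_rate_gbps):
    """Peak memory bandwidth in GB/s: bus width (bits -> bytes)
    times the effective per-pin data rate."""
    return bus_width_bits / 8 * data_rate_gbps

# Pro 5500M (assumed 128-bit GDDR6 @ 12 Gbps effective)
print(gddr_bandwidth_gbs(128, 12))  # 192.0 GB/s

# A typical 8GB desktop card, e.g. RX 5700 (256-bit @ 14 Gbps)
print(gddr_bandwidth_gbs(256, 14))  # 448.0 GB/s
```

So even with 8GB of VRAM, the Pro 5500M would have well under half the bandwidth of the desktop 8GB cards that Ultra texture settings are usually tested on.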
 
So my previous post is validated.

So get the 5300M unless your workflow needs the extra video memory (in which case get the 5500M 8GB), or you're convinced that the extra video memory is going to help you that much in gaming.

I'd like to see a comparison with actual games and published framerates, not old synthetic benchmarks, personally.
 
For gaming, which is best: the Vega 48 8GB in the iMac 5K, or the Radeon Pro 5500M 8GB in the 16" MacBook Pro?
 
For gaming, which is best: the Vega 48 8GB in the iMac 5K, or the Radeon Pro 5500M 8GB in the 16" MacBook Pro?

"Best" being one of those words that lawyers like because it can be defined myriad ways.

None of these chips are really ideal for gaming, and if that's your primary concern there are other PC laptops and desktops that would be far better values.

The Vega 48 is a desktop chip and, though older, should outperform the 5500M 8GB. It can draw significantly more power, and sits between the desktop GTX 1060 and RX 590 in terms of performance.

The 5500M 8GB has the advantage of being portable, which, if you need portability, makes it the best option among Mac laptops. It seems to be around the mobile GTX 1650 Max-Q in terms of performance.

We are still waiting to see actual game framerate scores for the 5500M 8GB chip, partly because we don't yet have Boot Camp drivers to test performance in a proper gaming environment.

Personally, I wouldn't get less than 8GB because a lot of the gaming I do is replaying older titles with modded 4k texture files and those eat up VRAM like nobody's business.
 
It appears the 5500M scores much lower than even a GTX 1060 Max-Q in LuxMark.

11329 2019 MacBook 5500M
12094 2018 Dell G7 1060 Max-Q @ 60 watts

Hmm, maybe driver optimizations are still to be had down the pike.
 
It appears the 5500M scores much lower than even a GTX 1060 Max-Q in LuxMark.

11329 2019 MacBook 5500M
12094 2018 Dell G7 1060 Max-Q @ 60 watts

Hmm, maybe driver optimizations are still to be had down the pike.


I was talking about the GTX 1650 Max-Q which is what AMD is using as the target in their press releases and their probably-juiced chart which shows the 5500M "mobile" part beating the GTX 1650 "mobile" part handily.


Scroll down to see the AMD press chart. It's apples and oranges a bit as this seems to be for the 5500M part that has fewer compute units than the one used in the MBP16, but as far as AMD is concerned, their laptop version of the 5500M beats the laptop version of the GTX 1650 by 30%.

Synthetic benchmark scores aren't as good a measure for determining gaming performance as actual framerate tests across a variety of games which will show the true strengths of each chipset. Some might perform better in big RTS titles, while another might perform better in FPS twitch shooters. Some games and genres will see a significant benefit from 8GB VRAM, while others might be a total wash between 8GB and 4GB VRAM.

We're still missing that data to make even a preliminary conclusion. We need to start seeing comparisons across multiple games, not benchmark tools which can be skewed by driver cheats.
 
I was talking about the GTX 1650 Max-Q which is what AMD is using as the target in their press releases and their probably-juiced chart which shows the 5500M "mobile" part beating the GTX 1650 "mobile" part handily.

RX 5500M and Pro 5500M are two rather different GPUs.
 
I was talking about the GTX 1650 Max-Q which is what AMD is using as the target in their press releases and their probably-juiced chart which shows the 5500M "mobile" part beating the GTX 1650 "mobile" part handily.


Scroll down to see the AMD press chart. It's apples and oranges a bit as this seems to be for the 5500M part that has fewer compute units than the one used in the MBP16, but as far as AMD is concerned, their laptop version of the 5500M beats the laptop version of the GTX 1650 by 30%.

Synthetic benchmark scores aren't as good a measure for determining gaming performance as actual framerate tests across a variety of games which will show the true strengths of each chipset. Some might perform better in big RTS titles, while another might perform better in FPS twitch shooters. Some games and genres will see a significant benefit from 8GB VRAM, while others might be a total wash between 8GB and 4GB VRAM.

We're still missing that data to make even a preliminary conclusion. We need to start seeing comparisons across multiple games, not benchmark tools which can be skewed by driver cheats.
Exactly, but LuxMark uses OpenCL. I know Apple uses Metal now, but OpenCL is used a lot in video editing, so this benchmark reflects what the MacBook Pro is actually used for, which is not gaming. I was hoping for better results.

The 1650 doesn't seem like a good card to begin with either. The higher model number makes you believe otherwise ;)
 
This is the gaming benchmark on the 16-inch MacBook Pro with the 5300M 4GB; it seems to perform quite well.

The performance is almost the same as the Dell XPS 15 with a GTX 1650. I assume if you get the Pro 5500M with 8GB, the performance might be the same as a GTX 1660 or even better.

Cannot wait to see this! Hope we get more benchmarks this week.

Waiting for full gaming benchmarks before pulling the trigger
 
RX 5500M and Pro 5500M are two rather different GPUs.

Right, but no one's comparing them. I wasn't.

In the absence of proper gaming benchmarks, in FPS with actual games, we're left looking at press release charts like the above, where the "mobile" version of the 5500M is compared to the "mobile" version of the 1650. I'm not comparing desktop chips with mobile, only guessing relative performance compared to other GPUs that have proper validated benchmark scores in real games.

The Pro 5500M has two additional compute units, but who knows what that will mean for gaming once drivers mature.

According to this spec chart, https://www.techpowerup.com/gpu-specs/radeon-pro-5500m.c3463 the Radeon Pro 5500M should be in the ballpark of the mobile GTX 1650, maybe a hair faster.

Which is pretty good as that chip can drive a lot of games at medium pretty damn well.

Assuming Boot Camp drivers come out soon and we can play in Windows, I'm really looking forward to testing out low-medium settings for Red Dead 2.
 
It appears the 5500M scores much lower than even a GTX 1060 Max-Q in LuxMark.

11329 2019 MacBook 5500M
12094 2018 Dell G7 1060 Max-Q @ 60 watts

Hmm, maybe driver optimizations are still to be had down the pike.

I wouldn't read too much into it. First of all, where are these scores even from? Which LuxMark version? Which of the LuxMark benchmark scenes? If this is the default one (v3.1, LuxBall), then why are these scores so low? My Vega Pro 20 scores 11080, and it is certainly way slower than a 1060 (even Max-Q). Why is that 1060 claimed to be 60 watts? The normal 1060 Max-Q is supposed to be an 80W part.

Some more points to consider:

Anyway, all this tells me that LuxMark is a very random piece of software that just does its thing, and does it in a very idiosyncratic fashion. I have no idea what it measures exactly, but it's fairly clear that one should not take it as a representative benchmark.
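For what it's worth, the gap between the two quoted scores is only about 6% in the first place, which hardly supports "much lower":

```python
mbp_5500m = 11329   # quoted 2019 MacBook Pro 5500M score
g7_1060mq = 12094   # quoted 2018 Dell G7 1060 Max-Q score

gap_pct = (g7_1060mq - mbp_5500m) / g7_1060mq * 100
print(f"{gap_pct:.1f}%")  # 6.3%
```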
 