What's the PL2 on the CPU and the TDP of the GPU?

I have an i7-10700; the CPU has a TDP of 65 watts, but the PL2 is actually 228 watts. The CPU is pulling 53 watts at 40% CPU usage, and the max since reboot is 92 watts. I believe the max CPU/GPU power draw on the M1 is 20 watts.

GPU makers aren't making new low-end and mid-range GPUs anymore; they're just selling older cards, because those are made on older processes where they can still get fab capacity. That's why the 1050, 1060, 1080, 1650, and 1660 are still relevant.

I looked at an XMG with a 3070: 54 watts for the CPU and 115 watts for the GPU. I'm guessing that isn't an all-day-battery-life system.
 
Who cares about the 1050? That is a super old GPU.

For the same money as the M1 MBP (16GB with 1TB SSD), you can buy the AMD 14” Razer Blade, which has higher multi-core performance and higher GPU performance with the RTX 3060, in a similarly thin design.
Exactly my point! Why nitpick an iGPU?

To win an online argument?
 
PC HAS to win at all costs!
Thankfully I grew out of that.

I instead focus on something else, like wondering if I should have stuck with a late 2017 64GB iPhone 8 Plus and changed my postpaid plan to SIM-only at $1-12/month until countries stop needing vaccine passports and vaccine IDs. I've been largely at home for over 18 months, and I don't think poor countries will be back to normal until 2025.

If I did that then the base model iPhones would have 256GB storage and be using iOS 19.

Or better yet, maybe I should have kept a 2011 MacBook Pro 13" and then switched over to the 2021 MacBook Pro 16" with Apple Silicon that will be released 4-8 weeks from today?

How about going from my 2012 iMac 27" to a 2022 Apple Silicon iMac with a 27" or larger screen?

More eye-rolling discussions that have little material worth.
 
For Shadow of the Tomb Raider you can find the benchmark scores here and on the Internet.

MacBook Air M1
Shadow of the Tomb Raider 1080p Medium M1 Rosetta 2: 23 fps
Shadow of the Tomb Raider 1080p Ultra M1 Rosetta 2: 23 fps

I can confirm this as approximately correct. On my MacBook Air M1 16 GB:
1280x800 TAA: 34 fps
1920x800 TAA: 22 fps
1920x800 SMAA 2x: 22 fps
2560x1600 SMAA 2x: 13 fps

For the 1050, see the benchmark scores on the Internet, which give 97 fps at 1280x720 on the lowest preset;
at 1920x1080 with medium, high SMAA, and high TAA settings: 38, 31 and 25 fps respectively (or 31, 28 and 26 with another configuration);
at 2560x1440 with TAA: 17 fps.

It seems the 1050 is still a step above the M1 GPU.

For a 1650, the figures on my setup are:
1280x800 TAA: 79 fps
1920x800 TAA: 50 fps
1920x800 SMAA 2x: 50 fps
 
I found a rather funny video comparing a Dell gaming laptop with a GTX 1660 Ti and the M1. On the 1660 Ti it takes "2000 years" to start up Fortnite. :)

 
For Shadow of the Tomb Raider you can find the benchmark scores here and on the Internet.

...

It seems the 1050 is still a step above the M1 GPU.

Given that this is a game running via Rosetta 2 and that, even with the same GPU, the game runs better on Windows than it does on macOS, the M1 seems much more impressive than the GTX 1050.

Having the exact same FPS count across low/medium/high indicates that there is some sort of inefficiency due to the lack of native support. Not to mention that this is not even the chip that directly rivals the likes of the Dell XPS; the upcoming prosumer 14" and 16" chips will be.
 
That could well be true; I simply referred to the fps scores. I have read enough discussion of the M1's GPU to know that a proper comparison is difficult and requires advanced knowledge of the internal workings of the CPU and GPU. I don't know the 1050; there is only a slight difference in the fps scores between the complete systems.

I wouldn't know whether Rosetta makes a lot of difference in performance or whether the GPU is more the bottleneck.

For my purposes the M1 works fine: I play at 1280x720 at around 30 fps, which suffices for me. I get the impression that a lot of Windows games that work (through Rosetta or Parallels) on the M1 can play at 30 fps or better, which makes the M1 a good enough gaming machine for me.
 
That could well be true; I simply referred to the fps scores. I have read enough discussion of the M1's GPU to know that a proper comparison is difficult and requires advanced knowledge of the internal workings of the CPU and GPU. I don't know the 1050; there is only a slight difference in the fps scores between the complete systems.

I wouldn't know whether Rosetta makes a lot of difference in performance or whether the GPU is more the bottleneck.

For my purposes the M1 works fine: I play at 1280x720 at around 30 fps, which suffices for me. I get the impression that a lot of Windows games that work (through Rosetta or Parallels) on the M1 can play at 30 fps or better, which makes the M1 a good enough gaming machine for me.

There was a YouTube video a while back on Java Minecraft compiled for ARM and the performance it achieved. Alternatively, one could measure Minecraft Bedrock performance on an M1 iPad, which importantly has less thermal headroom than an M1 Mac, and compare it to an x86 Windows PC running the same edition of Minecraft.
 
On a TFLOPS level, I believe the M1 is about the same as a 1050 Ti: 2.5-ish TFLOPS.

It will be interesting to see if the rumours of the 32-core M1X are true, because based on that, the next 16" MBP would have theoretical performance levels approaching my desktop 1080 Ti at just 1/6th of the power draw. Crazy stuff. Enough for 1440p Ultra or 4K medium gaming.
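Rough numbers behind that guess (these are approximate public FP32 figures, and linear scaling from 8 to 32 GPU cores is just an assumption on my part):

```swift
// Back-of-envelope FP32 throughput comparison.
// TFLOPS figures are approximate public numbers; linear scaling from
// 8 to 32 GPU cores is only an assumption.
let m1Tflops = 2.6                 // M1 8-core GPU, roughly 2.6 TFLOPS FP32
let gtx1050TiTflops = 2.1          // GTX 1050 Ti, roughly 2.1 TFLOPS FP32
let gtx1080TiTflops = 11.3         // GTX 1080 Ti, roughly 11.3 TFLOPS FP32

let tflopsPerCore = m1Tflops / 8.0
let rumoured32CoreTflops = tflopsPerCore * 32.0   // ~10.4 TFLOPS if it scaled linearly

print("M1 vs 1050 Ti: \(m1Tflops) vs \(gtx1050TiTflops) TFLOPS")
print("Rumoured 32-core part vs 1080 Ti: \(rumoured32CoreTflops) vs \(gtx1080TiTflops) TFLOPS")
```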

The issue I see is that unless Apple spends effort on getting headline games running natively (which it seems the new GPUs have the capability to do), all people are going to see is the benchmarks, which show Apple losing badly, yet again, to standard US$1K-2K laptops. People won't realise that the games on the Mac are running via virtualisation, an emulator, or some other such layer. And really, it doesn't matter that they are; bad performance is bad performance, even if it is highly technically impressive that the performance in such a situation isn't abysmal. I believe poor graphics performance is a drawback for many potential customers, even if they aren't buying the machine specifically for gaming. I have a Mac and a Windows gaming desktop (best of both worlds), but many other people can't afford or otherwise justify both.

I would suggest that if Apple is interested in going into the gaming market for Macs on any level (they are certainly front and centre in the mobile gaming market, so I don't think it's a huge leap), then we might see some new, graphically intensive games being showcased at the MBP update event.

I feel it would be a great idea for Apple to consider funding ports of some leading games to run natively on Apple Silicon. Showcasing something like Cyberpunk 2077 or Doom Eternal running, even on moderate-to-low settings, at native resolution and 60 Hz on the top-spec 16" MBP would certainly rock the "Macs aren't good for gaming" narrative.

If Apple can find a way to show that Macs can be good for gaming, that might help sell more units; not to 'hardcore' gamers of course, but to people who want an everything computer. If that happens, then maybe the balance tips, game studios start seeing profit in making games that can be compiled natively for Apple Silicon, and then Mac gaming finally gets a life of its own.
 
Who cares about the 1050? That is a super old GPU.

For the same money as the M1 MBP (16GB with 1TB SSD), you can buy the AMD 14” Razer Blade, which has higher multi-core performance and higher GPU performance with the RTX 3060, in a similarly thin design.
The 1050 and 1060 are actually the most popular PC video cards right now. This means game developers target this level of performance for their games.

I want a gaming-capable Mac but I mainly use my Mac for work. If I bring a Razer Blade into a work environment, my clients aren't going to take me seriously. Straight up.
 
I play StarCraft 2. On a 1050 Ti/1060-level card, I can max out all the settings on Windows.

On an M1 Mac, I have to set all the graphics settings to the lowest.

However, StarCraft 2 runs through Rosetta and the game is very old, so Blizzard is never going to optimize it for the Mac.
 
I have a strong suspicion that Geekbench compute strongly favors GPUs with dedicated memory. They likely only measure the time it takes to run the kernel, but not the time it takes to transfer the data to and from the GPU memory.
If it were measured like that, then the M1 would win hands down, because with unified memory it takes zero time to move the data. That is the big advantage; they call it "zero copy" because the data never moves.
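To illustrate the suspicion: here is a sketch of how a Metal benchmark could end up timing only the on-GPU work. This is an assumption about how such benchmarks measure, not Geekbench's actual code, and every name in it is made up for illustration.

```swift
import Metal
import Foundation

// Hypothetical sketch: time only the GPU execution window, the way a compute
// benchmark might. The host-side makeBuffer(bytes:) copy never shows up in
// the gpuStartTime/gpuEndTime measurement.
guard let device = MTLCreateSystemDefaultDevice(),
      let queue = device.makeCommandQueue() else { fatalError("No Metal device") }

let length = 64 * 1024 * 1024

// Host-side preparation: on a dGPU this is where data would cross PCIe;
// on the M1 the buffer lives in the same unified memory the CPU just wrote.
var hostData = [UInt8](repeating: 0xAB, count: length)
let buffer = device.makeBuffer(bytes: &hostData, length: length,
                               options: .storageModeShared)!

let commandBuffer = queue.makeCommandBuffer()!
let blit = commandBuffer.makeBlitCommandEncoder()!
blit.fill(buffer: buffer, range: 0..<length, value: 0)   // stand-in for the benchmark kernel
blit.endEncoding()

commandBuffer.commit()
commandBuffer.waitUntilCompleted()

// Only the GPU execution window is measured; the data transfer is invisible here.
let gpuSeconds = commandBuffer.gpuEndTime - commandBuffer.gpuStartTime
print("GPU-only time: \(gpuSeconds * 1000) ms")
```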
 
If it were measured like that, then the M1 would win hands down, because with unified memory it takes zero time to move the data. That is the big advantage; they call it "zero copy" because the data never moves.
Not necessarily. AFAIK standard iGPUs dedicate a portion of RAM as VRAM, and so function somewhat similarly to dGPUs in having separate VRAM and RAM, therefore still requiring a copy.

Geekbench could have this memory copy built into it, which might mean that the M1, even with its unified RAM, is doing a completely superfluous copy of data within its own RAM. That could be extra slow, as the RAM hardware itself is then being tasked with reading and writing at the same time, which a dGPU does not need to do.

This would therefore be a flaw in GB.
 
If it were measured like that, then the M1 would win hands down, because with unified memory it takes zero time to move the data. That is the big advantage; they call it "zero copy" because the data never moves.

A true zero-copy setup is possible on the M1, but it requires explicit opt-in via dedicated APIs. The default API still copies. But copying is much cheaper with UMA, since it isn't limited by PCIe bandwidth or latency.
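A minimal sketch of what that opt-in vs. default looks like in Metal (my own illustration, not anything from Geekbench; the allocation details are just one way to satisfy the page-alignment requirement of the no-copy path):

```swift
import Metal
import Foundation

guard let device = MTLCreateSystemDefaultDevice() else { fatalError("No Metal device") }

let count = 1 << 20
let length = count * MemoryLayout<Float>.stride

// Default path: makeBuffer(bytes:) COPIES the source data into a fresh buffer,
// even though CPU and GPU share the same physical memory on the M1.
var input = [Float](repeating: 1.0, count: count)
let copied = device.makeBuffer(bytes: &input, length: length,
                               options: .storageModeShared)

// Zero-copy path: makeBuffer(bytesNoCopy:) wraps an existing page-aligned
// allocation, so the GPU reads the very same pages the CPU wrote.
let pageSize = Int(getpagesize())
let alignedLength = (length + pageSize - 1) / pageSize * pageSize
var raw: UnsafeMutableRawPointer?
posix_memalign(&raw, pageSize, alignedLength)

let zeroCopy = device.makeBuffer(bytesNoCopy: raw!,
                                 length: alignedLength,
                                 options: .storageModeShared,
                                 deallocator: { ptr, _ in free(ptr) })

print("copied buffer: \(copied!.length) bytes, zero-copy buffer: \(zeroCopy!.length) bytes")
```

The point being that the cheap path exists, but the code has to ask for it.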
 
If it were measured like that, then the M1 would win hands down, because with unified memory it takes zero time to move the data. That is the big advantage; they call it "zero copy" because the data never moves.
I kind of think that benchmarks like GB5 Metal typically do not use big datasets when they run. The dataset will most likely fit into the M1's or the dGPU's cache, so both tests are probably measuring the core GPU's speed. In this instance, RAM vs VRAM bandwidth probably doesn't play much of a role.

I would think benchmarks that measure FPS would be more accurate real-world tests for comparing the entire M1 GPU setup against dGPU systems.
 
I kind of think that benchmarks like GB5 Metal typically do not use big datasets when they run. The dataset will most likely fit into the M1's or the dGPU's cache, so both tests are probably measuring the core GPU's speed. In this instance, RAM vs VRAM bandwidth probably doesn't play much of a role.

I would think benchmarks that measure FPS would be more accurate real-world tests for comparing the entire M1 GPU setup against dGPU systems.
Yeah, GB Metal Compute isn't a rendering test at all, and you can have great compute performance and meh rendering performance (and vice versa). Of course, for Apple users' purposes rendering performance has never really been the highlight; it has been the compute performance.
 
Yeah, GB Metal Compute isn't a rendering test at all, and you can have great compute performance and meh rendering performance (and vice versa). Of course, for Apple users' purposes rendering performance has never really been the highlight; it has been the compute performance.
I would think Apple’s UMA + TBDR would prove beneficial to rendering workflows and wouldn’t need as much grunt as dGPUs with fast VRAM, since there is no PCIe bottleneck. I think the M1’s potential hasn’t been fully realized at the moment, and it’s already looking good.

It’ll be interesting to see what’s coming next from Apple.
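One concrete thing TBDR already buys you in Metal, as far as I understand it: transient attachments can be declared memoryless, so they live entirely in on-chip tile memory and never touch RAM. A sketch of the descriptor setup only (the render loop is omitted, and memoryless storage is only valid on Apple/TBDR GPUs):

```swift
import Metal

// Sketch: on Apple's TBDR GPUs, a transient attachment (e.g. a depth buffer that
// only matters while the render pass is in flight) can be memoryless, living
// entirely in on-chip tile memory instead of RAM/VRAM.
guard let device = MTLCreateSystemDefaultDevice() else { fatalError("No Metal device") }

let depthDesc = MTLTextureDescriptor.texture2DDescriptor(pixelFormat: .depth32Float,
                                                         width: 1920,
                                                         height: 1080,
                                                         mipmapped: false)
depthDesc.usage = .renderTarget
depthDesc.storageMode = .memoryless      // only supported on TBDR (Apple) GPUs

let depthTexture = device.makeTexture(descriptor: depthDesc)!

let pass = MTLRenderPassDescriptor()
pass.depthAttachment.texture = depthTexture
pass.depthAttachment.loadAction = .clear
pass.depthAttachment.clearDepth = 1.0
pass.depthAttachment.storeAction = .dontCare   // contents are discarded after the pass
```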
 
I would think Apple’s UMA + TBDR would prove beneficial to rendering workflows and wouldn’t need as much grunt as dGPUs with fast VRAM, since there is no PCIe bottleneck. I think the M1’s potential hasn’t been fully realized at the moment, and it’s already looking good.

It’ll be interesting to see what’s coming next from Apple.
I agree; I do not think the Apple M1's potential has been fully realized.
 