It would help to have benchmarks against current-gen integrated graphics like Intel Xe and AMD Vega 6 or 7. I know offhand that a GTX 1650 is much faster than the integrated Xe or Vega parts.
Those will come when it's released. This user could only test what was on hand.
 
The 1050 Ti and RX 560? Talk about setting a low bar to hurdle over.

The CPU, on the other hand, performs nearly as well as the 2017 i7-7820X in multi-core, and its single-core performance is nearly on par with current AMD chips. For a CPU in the budget Macs, that's pretty freaking great.
 
I see it just the opposite. The M1 IS the great effort Apple has been making over the last few years.
I think we're on the same side then. The M1 is undoubtedly where they've been focusing their efforts, but the laptops they've released in the meantime have suffered for it.
 
Would love to be able to mine some crypto using this. Current Macs suck for crypto mining.
 
The 1050 Ti and RX 560? Talk about setting a low bar to hurdle over.

The CPU, on the other hand, performs nearly as well as the 2017 i7-7820X in multi-core, and its single-core performance is nearly on par with current AMD chips. For a CPU in the budget Macs, that's pretty freaking great.

I don't think even the best AMD APU outside of a console has anything comparable to the 1050 Ti and RX 560, and even then it's held back by slow, shared external RAM.
 
Well, they are benchmarking against last-generation AMD/ATI cards.

Those cards were released in 2017.

Nowadays both AMD and Nvidia play muuuch higher.

Amazing chipsets for a mobile phone, no doubt.

Xbox Series S = 4 teraflops at 300 euros (and it includes other hardware apart from the graphics card)

Xbox Series X = 12 teraflops

Custom PC = 20 teraflops (and that isn't the highest you can get with a single video card)
True, but I have a laptop with a 1050 Ti that runs games well at 1080p. Newer titles run at medium quality and get over 30 FPS, and ones 2+ years old can easily run high at close to 60 FPS. Add the better CPU versus my 8th-gen i7, and the M1 computers should be respectable 1080p gaming machines. I doubt the fanless Air will sustain performance well, but the Pro and Mini should.
 
I bet an M1 will catch up to the PS5 and Xbox Series X in 3-4 years, and Apple will create a stripped-down version of a Mac Mini just for gaming and start entering the gaming market. They could potentially offer a program where people upgrade the chip annually and change the gaming industry forever.
 
PS5 = 10.28 teraflops

In short, the M1 is far, far away.

But amazing for a phone and/or tablet.
Is it that far away, though? I'd expect the midrange chips next year to use next-generation cores; let's say that adds 20% by itself, so that's 3.12 TFLOPS. Then double the cores (which I would totally expect to be the differentiator), and now we're at 6.24 TFLOPS, on-chip. That's without doing any outlandish stuff. Still not PS5-beating, but I'd argue it's fair to say it's in the ballpark. And this is what I would expect to be midrange; then they will have a high-end solution for the Mac Pro and iMac Pro on top of that.
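A quick sketch of that scaling arithmetic; the 20% uplift and core doubling are this poster's assumptions, not announced specs, and the TFLOPS figures are approximate:

```python
# Back-of-the-envelope projection of the post above.
# All uplift and core-count figures are hypothetical assumptions.
m1_tflops = 2.6       # approximate M1 GPU FP32 throughput
gen_uplift = 1.20     # assumed ~20% gain from next-generation cores
core_scaling = 2.0    # assumed doubling of GPU core count

projected = m1_tflops * gen_uplift * core_scaling
print(f"Projected midrange GPU: {projected:.2f} TFLOPS")  # ~6.24 TFLOPS

ps5_tflops = 10.28
print(f"Fraction of PS5: {projected / ps5_tflops:.0%}")   # ~61%
```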

In other words, even for someone to ask that question about the lowest-end CPU is a huge win. The irony is, if the performance were at the level of the i3 it's replacing, no one would complain about the 16 GB RAM cap...
 
I'm not sure why the M1 beating a 1050 Ti in a benchmark is that impressive. It's a benchmark, not a real-world app. And the Nvidia MX450 is probably on par with or better than the M1 for real-world stuff like games.
 
For reference, this is about the GPU performance of Intel Iris Xe, but with a stronger CPU paired to it. It's not mind-blowing performance unless you factor in the power draw and the other features of the package.
 
And the M1X and M1Z could potentially be at the level of a PS4 Pro at 4.2 teraflops, or they won't be able to justify the price tag of the MacBook Pro 16". That's basically a portable PS4 Pro with a screen.
 
For someone who doesn’t know much about graphics cards, how far is the M1 behind the PS5? How far are we until we could have next gen graphics in an Apple TV?
Although teraflops (1 teraflop = one trillion floating point operations per second) are not the best way to rate GPU performance, they can provide a decent baseline of raw compute power. With that said, the M1 chip still only has about 25% of the compute power of a PlayStation 5.

Apple M1 = ~2.60 teraflops
PS5 = 10.28 teraflops
PS4 Pro = 4.2 teraflops
Xbox Series X = 12 teraflops
RTX 3080 = 29 teraflops

Intel integrated graphics, up until the new Xe line, provided only about 0.4 teraflops of compute power, so this is a monumental leap. Even Intel's discrete GPU provides less compute (2.4 teraflops), and it draws about 25 watts. The entire TDP of the M1 is 10 watts, and that includes the CPU/GPU/memory. This is where the M1 shines: performance per watt is simply unmatched.

Note that a LOT of other things matter here: memory bandwidth and shader accelerators, in particular, can vastly improve the performance of a card even at similar raw compute power, so these specs are not to be taken as the sole measure of performance.
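For anyone wondering where these teraflop figures come from, they fall out of a simple peak-throughput formula. A minimal sketch, assuming the commonly reported (not officially confirmed) ALU counts and clock speeds:

```python
# Peak FP32 throughput: ALUs x 2 ops/cycle (fused multiply-add) x clock (GHz).
# ALU counts and clocks below are commonly reported estimates, not official specs.
def fp32_tflops(alus: int, clock_ghz: float) -> float:
    """Peak FP32 teraflops, counting a fused multiply-add as two ops."""
    return alus * 2 * clock_ghz / 1000

print(f"M1  (8 cores x 128 ALUs): {fp32_tflops(1024, 1.278):.2f} TFLOPS")  # ~2.62
print(f"PS5 (36 CUs  x  64 ALUs): {fp32_tflops(2304, 2.23):.2f} TFLOPS")   # ~10.28
```

This is exactly why raw teraflops are only a baseline: the formula counts what the ALUs could do at peak, while memory bandwidth and the rest of the pipeline decide how much of that peak a real workload ever sees.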

As for Apple TV, it's not a big leap to say that we could see PS4 performance within two years and PS5 performance within four. I actually think this is what drove them to release Apple Arcade before any huge third-party titles were available... Knowing that they will soon have the power to compete with consoles could make them a major player in the video game industry.

Can you imagine an Apple TV competing with PS6? It'd be interesting.
 
I'm not sure why the M1 beating a 1050 Ti in a benchmark is that impressive. It's a benchmark, not a real-world app. And the Nvidia MX450 is probably on par with or better than the M1 for real-world stuff like games.
Why?

You can draw and shade x-thousand triangles per second or not. Whether it’s a game or an artificial benchmark doesn’t make a difference.
 
As for Apple TV, it's not a big leap to say that we could see PS4 performance within two years and PS5 performance within four. I actually think this is what drove them to release Apple Arcade before any huge third-party titles were available... Knowing that they will soon have the power to compete with consoles could make them a major player in the video game industry.

Can you imagine an Apple TV competing with PS6? It'd be interesting.
Definitely. I think they will start competing with the PS5 and Xbox Series X mid-cycle.
 
The M1 is beating the snot out of the Intel CPUs and now some GPUs. Don't get me wrong: I recognize that these numbers may not be sustained and may only win in the short term. I don't care, because before now the Intel equivalents (and I mean equivalent in terms of low-power chips) were simply pitiful and didn't even pretend to compete.

This is just the beginning. The M1 is the low end chip. Remember that!

When they integrate 40 lanes of PCIe Gen 3/4 and four channels of LPDDR4/5 to support off-chip memory, call me.
Until then, this is a glorified iPad processor.
When they start integrating all those PHYs for off-chip peripheral and memory support, two things will happen:
we will see power consumption increase by a significant amount *AND* we'll get to truly see how the chip stacks up.

Let me know when they have a Mac Pro with an ARM chip.
 
For reference, this is about the GPU performance of Intel Iris Xe, but with a stronger CPU paired to it. It's not mind-blowing performance unless you factor in the power draw and the other features of the package.
If Apple can design an integrated GPU that at 10 watts can outperform a dedicated GPU drawing 35-55 watts, that is HUGE. With a GTX 1650 Ti you can play all modern games at 1080p and high settings at 45+ FPS, and for only the most demanding games drop to medium settings to get a stable 30 FPS. That is console-level performance (even if not next-gen console level). If Apple gives its M-chips 50 watts with proper cooling, as in the current MacBook Pro/iMac lineup, we are talking about RTX 2060 Super performance in an integrated GPU. That's a proper gaming machine with next-gen graphics.
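A rough sketch of that performance-per-watt argument; the wattage and TFLOPS figures are approximate public estimates, and real GPUs hit clock and voltage limits well before linear scaling would suggest:

```python
# Crude performance-per-watt comparison using approximate public figures.
gpus = {
    "Apple M1 (whole SoC)": (2.6, 10),   # (TFLOPS, watts)
    "GTX 1650 Ti (mobile)": (3.0, 55),
}
for name, (tflops, watts) in gpus.items():
    print(f"{name}: {tflops / watts:.2f} TFLOPS/W")

# Naive linear extrapolation behind the 50 W claim above; treat this as a
# generous upper bound, since perf never scales linearly with power.
print(f"M1-style GPU at 50 W: ~{2.6 / 10 * 50:.0f} TFLOPS (optimistic)")
```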
 
This is really great news and it shows that Apple is still the leader in tech innovations. It's really sad how some people here feel the need to downplay the M1. They act like it hurts their damn feelings to see any positive news about Apple. SMH.
 
I do feel the Mini version should be superclocked somehow. It's not relying on battery power and has a better cooling setup than the Pro & Air. Then again, when has Apple ever propped up the Mini? /sour_grapes

Running a processor faster isn't just a matter of cranking up the clock.
The processor was designed and manufactured with speed goals in mind.
Exceeding those normally isn't possible due to all kinds of issues related to transistor speed, wiring on the die, etc.
That's a simplified explanation.
 
Get ready for it: Yeah, but... But what about... But I bet they can't...

And the M1 is doing this in a 10 W SoC. Think different.
But like I said in a previous post, there is very little off-chip. Once you start adding off-chip interfaces, power goes up significantly. Intel and AMD could dramatically reduce power if they didn't have PCIe and LPDDR interfaces to drive.
 
Good and bad replies to this story.

Bottom line.....

FIRST-gen Mac "M" chip, and everything says it screams and provides loads of battery life. It only gets better from here on out, and the 16" MBP may very well have a second-generation M chip. Who knows?

With Big Sur's launch bridging some of the gap between mobile devices and full-fledged Macs, I'm excited to see what's on the horizon, something like a MacBook Pro Touch.
 
Huh?
The GPU in the Macs that the M1 Macs are replacing **WAS** the Intel CPU's integrated graphics.
It's apples to apples:
an Intel CPU with an integrated GPU was slower than an M1 Mac with an integrated GPU.

What ultra-light notebook has a high-end discrete graphics chip?
Not an equivalent comparison, since the M1 has absolutely no support for off-chip memory.
Intel CPUs use off-chip memory.
 
Wow, jaw is dropping again. First we saw excellent CPU benchmarks and now excellent GPU performance for a laptop. However, don't mistake the "desktop GPUs" this article compares to the M1 for GOOD modern desktop GPUs; these are extremely budget desktop GPUs. Yes, it's still good news, but this isn't like saying "the M1 GPU performs like a 2080 Super," or even a 580X for that matter. But that's impressive performance for a low-power-consumption laptop for sure.

Also, while this is great for productivity apps that require a decent GPU, and for the handful of excellent games available on the Mac, it doesn't have the impact it could, because we've lost Boot Camp, and that's where the vast majority of games would have had to be played. Not a judgment, just a reality. I would love a world in which macOS got all the Windows game releases, but that's not the world we live in. But fingers crossed that advances in Windows on ARM will eventually mean Boot Camp on ARM Macs again at some future date.

In terms of what you can expect from (for example) a 560-level GPU: plan on playing most modern games at lower resolutions and with a lot of eye candy reduced or turned off in order to achieve smooth frame rates. Older games may run incredibly well, however (assuming they are updated for ARM or don't need more CPU performance than they can get out of Rosetta 2). The good news is that Retina displays do a fantastic job of upscaling, so you can sacrifice a lot of resolution and still get a really nice-looking game on these modern UHD displays.
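To put the "sacrifice resolution" advice in numbers, here is a quick sketch; the panel and render resolutions are just illustrative examples, not recommendations for any particular game:

```python
# Shading cost scales roughly with pixel count, so rendering below native
# resolution and letting the Retina panel upscale saves a lot of GPU work.
native = (2560, 1600)   # e.g., a 13" MacBook Pro panel
render = (1440, 900)    # an example reduced render resolution

native_px = native[0] * native[1]
render_px = render[0] * render[1]
print(f"Pixels at native resolution: {native_px:,}")   # 4,096,000
print(f"Pixels at render resolution: {render_px:,}")   # 1,296,000
print(f"Roughly {native_px / render_px:.1f}x fewer pixels to shade")  # ~3.2x
```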
 