It's nice to see the M1 chip benchmarking higher than the current-gen Intel integrated graphics. Anyone expecting performance better than a four-year-old dedicated GPU is high on something.

I would still hope Apple is working on CPU chips that can leverage GPUs from AMD or Nvidia.
 
Although teraflops (1 teraflop = one trillion floating point operations per second) are not the best way to rate GPU performance, they can provide a decent baseline of raw compute power. With that said, the M1 chip still only has about 25% of the compute power of a PlayStation 5.

Apple M1 = ~2.60 teraflops
PS5 = 10.28 teraflops
PS4 Pro = 4.2 teraflops
Xbox Series X = 12 teraflops
RTX 3080 = 29 teraflops

Intel integrated graphics, up until the new Xe line, provided only about 0.4 teraflops of compute power, so this is a monumental leap. Even Intel's discrete GPU delivers less compute (2.4 teraflops), and it draws about 25 watts. The entire TDP of the M1 is 10 watts, and that includes the CPU, GPU, and memory. This is where the M1 shines: performance per watt is simply unmatched.
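To put rough numbers on that, here's a quick back-of-the-envelope sketch. The TFLOPS figures come from the list above, but the wattages are my own assumptions (10 W for the whole M1 package, 320 W board power for the 3080, a 15 W share of SoC power for the Intel iGPU), not official specs:

```python
# Back-of-the-envelope FP32 compute-per-watt comparison.
# TFLOPS figures are from the list above; the wattages are rough
# assumptions about package/board power, not official measurements.
chips = {
    "Apple M1":   (2.6, 10),    # 10 W covers CPU + GPU + memory
    "RTX 3080":   (29.0, 320),  # discrete card board power (assumed)
    "Intel iGPU": (0.4, 15),    # pre-Xe integrated graphics (assumed share of SoC power)
}

for name, (tflops, watts) in chips.items():
    gflops_per_watt = 1000 * tflops / watts
    print(f"{name:10}: {tflops:5.1f} TFLOPS / {watts:3d} W = {gflops_per_watt:5.1f} GFLOPS/W")
```

Even against a top-end discrete card, the M1 comes out roughly 3x ahead on compute per watt under these assumptions.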

Note that a LOT of other things matter here: Memory bandwidth and shader accelerators, of note, can vastly improve the performance of a card even with similar raw compute power, so these specs are not to be taken as the only assessor of performance.

As for Apple TV, it's not a big leap to say that we could see PS4 performance within two years and PS5 performance within four. I actually think this is what drove them to release Apple Arcade before any huge third-party titles were available... Knowing that they will soon have the power to compete with consoles could make them a major player in the video game industry.
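For what it's worth, the implied improvement rate is aggressive but not crazy. A minimal sketch of the arithmetic, treating raw TFLOPS as the whole story (it isn't, per the caveat above) and using the PS4 Pro figure from the list, since the base PS4 (~1.8 teraflops) is already below the M1:

```python
# Compound annual GPU-compute growth needed to hit console-level
# TFLOPS, starting from the M1's ~2.6 TFLOPS. Pure raw-compute math;
# ignores memory bandwidth, shaders, etc.
m1_tflops = 2.6
targets = [("PS4 Pro", 4.2, 2), ("PS5", 10.28, 4)]

for name, tflops, years in targets:
    annual_growth = (tflops / m1_tflops) ** (1 / years) - 1
    print(f"{name}: needs {annual_growth:.0%} per year for {years} years")
```

That works out to roughly 27% per year for the PS4 Pro target and about 41% per year for the PS5 target.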

Can you imagine an Apple TV competing with PS6? It'd be interesting.

Without AAA titles, Apple cannot compete with the next-gen consoles. And to get these AAA titles, it would first need matching horsepower so that developers would consider developing for ARM. I'm ready to be surprised again by the performance of the M2 or M1X or whatever it is called. Next year will be very interesting for the whole industry.
 
I do graphics work on my MBP16 with dedicated graphics (Radeon Pro 5500M), and the M1 performs better than it in most tests, while lagging slightly in a few.
 
Machine learning training is one of Nvidia's biggest and fastest growing revenue segments. It's GPU intensive (unless you're Google or Baidu, and can design your own TPUs or NPUs).

I can't wait to see the MLPerf numbers for these new M1 Mac mini systems, and see how close they are to being cost competitive compared to dedicated RTX 2080 Super ML training boxes from custom integration vendors.
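The comparison I'd want to make is dollars of hardware per unit of training throughput. A sketch of that calculation; every number in it is a placeholder I made up, pending real MLPerf submissions:

```python
# Hardware cost per unit of ML-training throughput. ALL figures below are
# hypothetical placeholders; substitute real MLPerf results when published.
systems = {
    "M1 Mac mini":        (699.0, 100.0),   # (price USD, images/sec) -- made up
    "RTX 2080 Super box": (1800.0, 400.0),  # (price USD, images/sec) -- made up
}

for name, (price, imgs_per_sec) in systems.items():
    print(f"{name}: ${price / imgs_per_sec:.2f} per image/sec of throughput")
```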
Yep, modern computing is increasingly built on GPUs. And not just games. Shocker!
 
I do graphics work on my MBP16 with dedicated graphics (Radeon Pro 5500M), and the M1 performs better than it in most tests, while lagging slightly in a few.
I have a 16 with a 5500m, too, and the M1 is kind of exciting and depressing at the same time.
 
MacBook Pro Touch.

I too would like to see Mac laptops gain touch. It's not a feature I see myself personally using at all, but it would be a huge help in education, as many young students no longer understand how to use anything non-touch-based. This causes real problems come testing time (you would be shocked how many kids have never used a mouse). A touch screen would simply be beneficial at this point to Apple's potential user base, whether or not you or I personally want to use one.

My daughter is also thinking of going to something like a Surface in the future because she's sick of having to have a MacBook and an iPad and shuffle crap back and forth between the two. She does work (for which she needs a full computer, not an iPad), does pen-based art, and doesn't understand why Apple thinks you should lug around two devices instead of one.
 
It's nice to see the M1 chip benchmarking higher than the current-gen Intel integrated graphics. Anyone expecting performance better than a four-year-old dedicated GPU is high on something.

I would still hope Apple is working on CPU chips that can leverage GPUs from AMD or Nvidia.
High on something?


The GTX 1650 Ti runs all modern games at high settings and 1080p at 40+ fps. If a MacBook Air can achieve this kind of performance, it would become (for the first time) a casual gaming machine for many people. That is crazy performance. Intel integrated graphics can't compete even with a 10-year-old dedicated desktop GPU, let alone a 4-year-old one.
 
Tell that to the game developers. No one wants to bring AAA titles to the Apple TV because it's an extremely niche market.
It is a niche market. But if the market is everyone with an M1 Mac or an M1 Apple TV, then the user base is HUGE.
 
I searched for the GeForce GTX 1050 Ti on YouTube and found a video about games running on it in 2020. If the M1 is even faster, then I think it's quite impressive for a chip that sits in a MacBook Air.

 
Also remember that these GPUs they are comparing to are over 3 years old... so I'm not really seeing why we should be so excited. It's a click-bait title with **** analysis. Sure, they are decent for mobile stuff, but they in no way compare to full-blown dGPUs from AMD or Nvidia.
True. But some of us are comparing this to our now-EOL Macs that work perfectly well, so seeing that even under Rosetta this budget M1 outperforms my 2012 MBP 15 i7 in all aspects by 100% or more is impressive.

I paid a lot (in my mind) for my refurbished MBP at the time (2014) and have upgraded it over the years with memory and SSDs, but I couldn't afford the cost of a newer one that was worth owning ($2500+), especially considering the fixed SSD and memory costs.

This brings hope to those of us who want a new machine worth owning at a lower price.
 
The GTX 1050 series was a budget card in 2016. I don’t remember it being recommended for gaming even at the time, unless budget was a serious constraint.

In all fairness, Apple are nearly always at the root of every spat. They simply don't know how to meet other companies halfway, and while that's worked out for them, it doesn't make it right to blame other companies for Apple's historically poor graphics offerings.

I bought a GTX 1050 Ti two months ago, and it's not a gaming card. It's a card for running a lot of high-resolution monitors. I bought it because of thermals (75 watts max) and because it supports 4×4K monitors. Quad-monitor support is something that the Apple M1 doesn't have. A lot of video cards aren't very good at specifying exactly what is supported on high-resolution monitors, and most of them are aimed more at gaming than at a trading workstation. I can only run at 30 Hz (I get intermittent black screens at 60 Hz), but 30 Hz is fine for trading and the other things that I do.

Intel UHD 630 supports 3×4K, and I think Apple's second-generation silicon will need to provide that support. I doubt they would have any problems doing so, given that Intel has been doing it for several years.

The cost of Apple's solution, though, is tiny. My graphics card is $155, while the cost of Apple's is probably $1 or less.

Correction: my version can support 3×4K + 1×QHD. It can support the total resolution of 4×4K, though. There may be variants that support 4×4K.
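The 30 Hz vs. 60 Hz behaviour lines up with simple bandwidth math. A rough sketch, assuming 24-bit colour and ignoring blanking intervals (so real link requirements are somewhat higher):

```python
# Uncompressed video bandwidth for 4K (3840x2160) monitors.
# Assumes 24 bits per pixel and ignores blanking overhead.
def video_gbps(width, height, bpp, refresh_hz):
    return width * height * bpp * refresh_hz / 1e9

for hz in (30, 60):
    one = video_gbps(3840, 2160, 24, hz)
    print(f"4K @ {hz} Hz: {one:5.2f} Gbit/s per monitor, {4 * one:6.2f} Gbit/s for four")
```

How the load maps onto physical connectors depends on the card, but halving the refresh rate halves the link bandwidth (roughly 6 vs. 12 Gbit/s per monitor here, against the ~25.9 Gbit/s payload of a DisplayPort 1.4 link), which is plausibly the difference between stable output and intermittent black screens on a marginal link.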
 
You keep saying the "base" model...but it's still the fastest chip Apple makes, because nothing else exists, so it is also the flagship model for now.
It's still positioned as the low-end machine. They slotted it into the low-end models of the MBP and mini, and the Air was already the low-end machine, performance-wise. It's an introductory chip that represents the base-model Macs now. The chips in the machines that replace the higher-end models (the iMac (Pro), higher-end MBPs, and higher-end mini, let alone whatever they put in the Mac Pro replacement) will be substantially faster; that's not an open question. It's not unreasonable to call this the base-model, low-end AS chip, because we know where it sits in the product lineup and we know faster chips are coming, soon.
 
No, they weren’t impressive at all. But you could plug in an eGPU.

This means the Mac mini may not be suitable for certain use cases anymore. It's curious how differently the Mac mini has been positioned in the lineup through its iterations. Maybe we'll see a more capable configuration once the M2/M1X/whatever-they-call-it higher-end chipset arrives.

Also, regarding the article, it's very impressive what they did with their first computer CPU, no matter how you look at it. Is it enough for all the purposes people use Macs for? Not by a long shot. Will Apple be able to deliver Mac Pro-caliber performance with their more powerful chips? We'll see, but this first chip is promising.

As an iMac without a display, the Mac mini was interesting but not so usable for me. I went from a PowerMac G4 to a PowerBook G4 to a MacBook Pro with GeForce 650M. That final machine doesn't seem that much slower than my 2018 Omen by HP laptop computer, even though there were 3 generations of quad core i7 processors between the two.

For many people who think of Apple as an accessory maker for their outfit, the first generation of new processors will be just fine.
 
My desktop is an RX 580 and a 6700K, which is usually enough power.

I was considering building a 9900K system with a Vega 56.

The way things are looking, I might just save that money and see what Apple comes up with next. A 14" MBP or Retina iMac will probably be the target for me.
If you are using a computer for gaming, I would not hold out for an M1 or M2 based Mac. Upgrade your PC with a Ryzen 5000 series CPU and either an Nvidia 3000 or AMD 6800 series GPU.

I do not think Apple will have GPU hardware to touch those PC parts for a long, long time. Of course, the fact that NONE of the AAA PC games are native on any Mac is the bigger issue. Sure, some of them have come to Intel Macs, years after launch, wrapped in some kind of WINE wrapper that only kills performance.
 
This reminds me a lot of the time between the previous tower Mac Pro and the trash-can Mac Pro.

Apple may or may not bolt a GPU onto the Mac Pro, or build an Apple GPU "card" that will cost a fortune and be extremely proprietary. Interesting times.

I'm sure we're all dreaming of Apple offering both: a Mac Pro-level M1 chip that can do everything, while still having the option of installing PCI cards and such.
 
Instead of benchmarking the M1 on mobile-optimized games, how about benchmarking it on games that people actually play? Like Doom.
 
Very exciting stuff. The higher-end M1 chips for the iMac, 16" MBP, and more should be off the hook based on what we have seen so far. By the way, I've had my username since 2013. #winning
 
I too would like to see Mac laptops gain touch. It's not a feature I see myself personally using at all, but it would be a huge help in education, as many young students no longer understand how to use anything non-touch-based. This causes real problems come testing time (you would be shocked how many kids have never used a mouse). A touch screen would simply be beneficial at this point to Apple's potential user base, whether or not you or I personally want to use one.

My daughter is also thinking of going to something like a Surface in the future because she's sick of having to have a MacBook and an iPad and shuffle crap back and forth between the two. She does work (for which she needs a full computer, not an iPad), does pen-based art, and doesn't understand why Apple thinks you should lug around two devices instead of one.
Exactly.

I play a fair share of guitar and love being able to use my iPad Pro for scrolling sheet music. Doing so on my MacBook Pro is a little cumbersome with a mouse or trackpad. Logic also doesn't work on the iPad, and files I create in GarageBand, for some reason, won't translate between my machines.

So an MBP Touch would be a godsend.
 
But like I said in a previous post: there is very little off-chip. Once you start adding off-chip interfaces, power goes up significantly. Intel and AMD could dramatically reduce power if they didn't have PCIe and LPDDR.
So what you're saying is, Intel and AMD should have made laptop processors that don't need PCIe and LPDDR. Instead, they're beaten by a company that is new to the game.
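To put rough numbers on the off-chip point: interface energy is usually quoted in picojoules per bit, and on-package memory costs several times less per bit than driving signals across motherboard traces. A minimal sketch; the pJ/bit and bandwidth values are ballpark assumptions from published estimates, not measurements of any of these chips:

```python
# Rough DRAM-interface power at a given memory bandwidth.
# Energy-per-bit values are ballpark assumptions, not vendor specs.
interfaces = {
    "on-package LPDDR (M1-style)": 4.0,   # pJ/bit (assumed)
    "off-chip DDR over the board": 20.0,  # pJ/bit (assumed)
}

bandwidth_gb_per_s = 60  # sustained memory traffic in GB/s (assumed)
bits_per_s = bandwidth_gb_per_s * 8e9

for name, pj_per_bit in interfaces.items():
    watts = bits_per_s * pj_per_bit * 1e-12
    print(f"{name}: ~{watts:.1f} W at {bandwidth_gb_per_s} GB/s")
```

Under these assumptions that's roughly 2 W vs. 10 W of interface power alone: noise on a desktop, but huge inside a 10 W package budget.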
 
When Apple can beat an RTX 2080 Ti or the new 3080, I'll be impressed; until then, they are still playing catch-up... but I appreciate the efforts all the same.
How do you propose putting a $1,000 graphics card into a $999 laptop and keeping the price at $999?
How do you propose putting a 300 W graphics card into a 30 W laptop and having it work at all?
How do you propose fitting an 11.5" × 5.5" × 2" card into a MacBook Air form factor without interdimensional pockets?
How do you propose cooling a 300 W card without fans or liquid in a MacBook Air form factor?

Please, enlighten us with your brilliant engineering. No one is saying it is the fastest GFX ever built. They said it was a fine first effort and an ENTRY LEVEL chip. They haven't released a flagship model yet. Pretending it is the top end because it is the only one is willful ignorance.

If you want to run the latest Elder Scrolls game at 8K at 900 FPS, you don't buy a laptop to do it. Wrong tool for the job.
 