Are you actually telling me that x86 magically switches transistors in such a way that it consumes 200W+ for 10.4TF, but ARM64 doesn't? 🤡

Do you even understand the meaning of floating-point operations?

It's like saying 5km/h is different on x86 compared to ARM64 🤣
I think you don't know how this works, or how ARM64 works and consumes power. Yes, the M1's performance is on par with, say, an i7 under sustained load, but the x86 i7 consumes twice the power of the M1. Facts don't lie.
 
Not everyone uses their GPU for content creation tasks. Many just want to game, and can't on macOS, at least not like they can on Windows.
I really wish they’d made a SKU without the fancy GPUs, priced lower, because I mostly want one of these new machines for the vastly improved screen and keyboard.

As it stands, I’ll be buying way more raw horsepower than I need.
 
Agreed, but the subject of this thread is about comparing the M1-Max to a PS5. So it makes sense that this thread is all about games and not about content creation.
Fair enough, although I would argue the point of this thread is about raw GPU performance, which is the title of the article this thread is attached to. The measurement of relative teraflops applies both to gaming and content-creation tasks.
 
I like "dissipates." It (along with the other laptop components that dissipate heat) speaks to the overall thermal considerations and the engineering trade-offs required to "get rid of" unwanted heat that can cause component and, ultimately, laptop failure.

It's nice that you like it, but it would be as correct as the original article.

Wasting less power is not the same as using less, for comparison purposes. To use the two interchangeably ignores the fact that energy is used in service of its primary function.
 
You actually believe that Apple has more graphics performance than a PS5?
Just for your own knowledge, the PS5 outperforms a 14TF Radeon VII GPU…

Keep in mind that Apple's 10.4TF is the maximum theoretical performance, which you won't come close to at 60W of power usage.
Same as the PS5... its 14TF is the max theoretical figure too...
I think you are a Windows user here to contradict everyone else =)). I care about facts, what the system does for me, and how it compares with my other x86 systems in the real world, rather than what some stranger on the web tells me without even having the device in front of him. I think you missed a whole year of the M1... you've been asleep.
I'm out of this debate. I'm not here to pollute this topic; have a nice day arguing with all the others :))))
 
Okay, so shove one in a Mac mini, add USB ports on the front, bundle it with Apple Arcade and boom: instant games console.

Yes, but that is the high end M1 Max chip. So that would be a $1,000 mini at the very least. And with supply constraints for the chip, which I'm sure exist, we won't see it in the Mini this year and maybe never. Obviously there will be a spec bump to the mini coming next year. But it might be an M2.

The real question is, can this mine bitcoin? I shudder to think what happens if Apple makes a Mini that is good at mining bitcoin at low power. For those who don't know the process, mining is done by mathematical computations being run by the GPU, which means cost of electricity and cooling for the computers is the key constraint. If Apple's GPUs work well for mining (they might not be powerful enough) and they use less electricity, then the bitcoin miners will basically buy them all, similar to the way they snap up gaming GPUs now.
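For those curious what those "mathematical computations" look like: proof-of-work mining is essentially a brute-force search for a nonce whose hash clears a difficulty target. Here is a toy sketch in Python (real Bitcoin mining double-hashes a block header and today runs on dedicated ASICs rather than GPUs; the difficulty prefix here is deliberately tiny so the loop finishes quickly):

```python
import hashlib

def mine(block_data: bytes, difficulty_prefix: str = "000") -> int:
    """Brute-force search for a nonce whose SHA-256 digest starts with the prefix."""
    nonce = 0
    while True:
        digest = hashlib.sha256(block_data + str(nonce).encode()).hexdigest()
        if digest.startswith(difficulty_prefix):
            return nonce
        nonce += 1

nonce = mine(b"example block")
print(f"found nonce {nonce}")
```

Every extra character in the prefix multiplies the expected number of hash attempts by 16, which is why electricity and cooling costs dominate at real network difficulties.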
 
Same as the PS5... its 14TF is the max theoretical figure too...
I think you are here to contradict everyone else... I care about facts and what the system does for me, rather than what some stranger on the web tells me without even having the device in front of him.
I'm out of this debate. I'm not here to pollute this topic; have a nice day arguing with all the others :))))

PS5 only has 10.3TF and consumes 215 watts to reach its max performance. And it outperforms a 14TF desktop GPU because of its unique design.

MacBook has 10.4TF at 60W…
My point is that 60W is far from enough to pull 10.4TF.
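For what it's worth, the headline TF numbers being argued over on both sides are just the product of shader ALU count, clock speed, and two FLOPs per fused multiply-add. A quick sanity check of the figures, using commonly cited specs (the ALU counts and clocks below are assumptions from public spec sheets, not measurements):

```python
# Theoretical FP32 peak: each ALU retires one fused multiply-add (2 FLOPs) per cycle.
def peak_tflops(alus: int, clock_ghz: float) -> float:
    return alus * 2 * clock_ghz / 1000  # GHz -> TFLOPS

m1_max = peak_tflops(4096, 1.27)   # 32 GPU cores x 128 ALUs, ~1.27 GHz (assumed)
ps5    = peak_tflops(2304, 2.23)   # 36 CUs x 64 ALUs, 2.23 GHz boost (assumed)

print(f"M1 Max ~ {m1_max:.1f} TFLOPS, PS5 ~ {ps5:.1f} TFLOPS")
```

Both numbers are theoretical peaks; how close a real workload gets depends on sustained clocks, memory bandwidth, and thermals, which is exactly what this argument is about.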
 
PS5 only has 10.3TF and consumes 215 watts to reach its max performance. And it outperforms a 14TF desktop GPU because of its unique design.

MacBook has 10.4TF at 60W…
My point is that 60W is far from enough to pull 10.4TF.
PS5 consumes around 185W because it is still on the x86 architecture... it can't go lower; it has hit its thermal constraints. Apple has had that unique system-on-a-chip design and architecture since 2020. By that logic, the M1 should also fall far short of its 5.2TF at just 10-15W... but in reality, it doesn't... come on... =)))
Have a great day in denial...
 
Everyone is talking like games are the only thing these bombastic chips can't do.

Well, there are many other things they can't do either, like CUDA acceleration (for many Python-based ML and DL algorithms), or any kind of workflow relying on OpenGL in general.
Sure, there are workarounds, but Apple made it clear they do not intend to push platforms other than their own, by putting hardware acceleration for ProRes in their chips and showing off apps like Compressor, Logic and Final Cut.

Nothing wrong with that, classic Apple, it just means it's a platform designed for SOME pros, not ALL pros.

I do image analysis and frequently hit issues due to Apple's lack of support for tools commonly used elsewhere, and that's why my main machines remain PCs. Ironically, this forces me to be in the Microsoft ecosystem a lot, and the MacBooks are still very enjoyable for classic desktop work.

Now, that being said, if Apple was really serious about gaming, they would create a separate line dedicated to it. I could see them buying Razer like they did with Beats, keeping the brand while using Apple chips and a custom OS that supports all the nice libraries, like SteamOS does.
Sadly, they will never do it, either to avoid spreading their line too wide (to remain faithful to the famous Steve Jobs 4 quadrants), or simply because they just don't get it (Apple Arcade is becoming a sad excuse for the mess they brought to this world with their app store policies).
 
You actually believe that Apple has more graphics performance than a PS5?
Just for your own knowledge, the PS5 outperforms a 14TF Radeon VII GPU…

Keep in mind that Apple's 10.4TF is the maximum theoretical performance, which you won't come close to at 60W of power usage.
It is possible due to the unified memory architecture: traditional hardware designs waste cycles transporting data between CPU memory and GPU memory. Keep in mind the RAM in your machine is slower than GPU memory. In Apple's case, the high-speed memory is the only memory, shared between the two.
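To put a rough number on that transport cost: a discrete GPU has to pull its working set across PCIe before it can start, while a unified pool skips the copy entirely. A back-of-the-envelope sketch (the 32 GB/s PCIe figure is nominal, and the 8 GB working set is a made-up example):

```python
# Rough cost of staging data into a discrete GPU's memory over PCIe 4.0 x16.
# Nominal one-way bandwidth; real transfers see protocol overhead on top.
PCIE4_X16_GBPS = 32       # ~32 GB/s, nominal
payload_gb = 8            # hypothetical working set

copy_seconds = payload_gb / PCIE4_X16_GBPS
print(f"Staging {payload_gb} GB over PCIe: ~{copy_seconds * 1000:.0f} ms per transfer")
# With unified memory, CPU and GPU address the same pool, so this copy disappears.
```

A quarter of a second per round trip adds up fast when data bounces between processors every frame or every batch.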
 
I think you don't know how this works, or how ARM64 works and consumes power. Yes, the M1's performance is on par with, say, an i7 under sustained load, but the x86 i7 consumes twice the power of the M1. Facts don't lie.

You seriously have to school yourself…
Now you're going from GPU floating-point performance to x86 vs ARM64 CPU performance.

CPU performance is not the same as GPU TF performance.

If a GPU can output 10.4TF, that is its maximum floating-point performance, regardless of x86 vs ARM64 (which describes the CPU).

x86 vs ARM64 doesn't mean anything when it comes to GPU performance. Again, that's a measure of CPU architecture 🤡
 
Everyone is talking like games are the only thing these bombastic chips can't do.

Well, there are many other things they can't do either, like CUDA acceleration (for many Python-based ML and DL algorithms), or any kind of workflow relying on OpenGL in general.
Sure, there are workarounds, but Apple made it clear they do not intend to push platforms other than their own, by putting hardware acceleration for ProRes in their chips and showing off apps like Compressor, Logic and Final Cut.

Nothing wrong with that, classic Apple, it just means it's a platform designed for SOME pros, not ALL pros.

I do image analysis and frequently hit issues due to Apple's lack of support for tools commonly used elsewhere, and that's why my main machines remain PCs. Ironically, this forces me to be in the Microsoft ecosystem a lot, and the MacBooks are still very enjoyable for classic desktop work.

Now, that being said, if Apple was really serious about gaming, they would create a separate line dedicated to it. I could see them buying Razer like they did with Beats, keeping the brand while using Apple chips and a custom OS that supports all the nice libraries, like SteamOS does.
Sadly, they will never do it, either to avoid spreading their line too wide (to remain faithful to the famous Steve Jobs 4 quadrants), or simply because they just don't get it (Apple Arcade is becoming a sad excuse for the mess they brought to this world with their app store policies).
I think they are talking about games for two reasons: 1) they like to have fun in their free time, and 2) a very complex game can show off the power a GPU has in a way that other pro apps can't (of course, there are pro apps that use your GPU to the maximum).
 
I really wish they’d made a SKU without the fancy GPUs, priced lower, because I mostly want one of these new machines for the vastly improved screen and keyboard.

As it stands, I’ll be buying way more raw horsepower than I need.
Well, they do sell a 14-core GPU part.
 
You seriously have to school yourself…
Now you're going from GPU floating-point performance to x86 vs ARM64 CPU performance.

CPU performance is not the same as GPU TF performance.

If a GPU can output 10.4TF, that is its maximum floating-point performance, regardless of x86 vs ARM64 (which describes the CPU).

x86 vs ARM64 doesn't mean anything when it comes to GPU performance. Again, that's a measure of CPU architecture 🤡
Who said the CPU is equal to the GPU TF??? Jesus, can you even read what others are saying?
So not only are you new to this, but you are trying to start flaming..
Please read the rules: https://forums.macrumors.com/threads/macrumors-forum-rules.1672419/
 
I know that you know... it seems not everyone has been here since the M1. Or Windows users are fragile now, given what has been happening in the Apple silicon segment since last year.
I really don't understand why Windows users see this as a problem. All they have to do is wait for Qualcomm, or any of the other ARM CPU makers, to start producing competing chips. x86 has run its course; it had a good 50+ year run. Time for a change, I believe.
 
Not disputing how impressive these chips are but ALRIGHT ALREADY.

How many GD articles do we need about this?

Next article: M1 Max chips allow new MBP to actually make popcorn, will pop out of HDMI slot. All without the use of the new fans!

BTW: said I wouldn't but I did order one of these. Couldn't resist. I know, I'm a sucker.
Agreed. I miss the articles about iPhone colors and Tim Cook's latest opinions on social justice issues. 😜
 
That's why big companies like Blizzard came out with a native WoW in just a few weeks, and plenty of others followed. Again, you have to be sure games run well on the M1 CPU+GPU; that's why a lot of the heavier GPU titles aren't on the Mac platform yet. But now the hardware is here.
Again, I understand the budget constraints of small and medium publishers, but not the big ones.
And this architecture will translate to the consoles in 8-9 years, when they also go ARM.
Again, big publishers/developers can port a big AAA game (like WoW, not like CS:GO) in a matter of weeks or months.
Porting costs a lot less, so the cost is not the same as building a game from scratch.
Again, the Mac platform is less than 15% of the market, of course, but from now on you have to consider that a new game you start today, to be ready in 3-4 years, should be ARM-compatible too, since both Intel and AMD are working hard on ARM SoCs, and mobile gaming surpassed desktop years ago. x86 is too expensive and too heat-constrained in a world where devices have gotten smaller and smaller over the last decades.
And you know what? I hope you are right, because like I said, what Apple has pulled off is a technological marvel. I would love to play AAA titles on an M-series mini, with high framerates. But in order for that to happen, Apple is going to have to open its wallet like they did with Apple TV+. Publishers are not going to be willing to take that chance when big-budget AAA games are already hitting $200-350 million.
 