I do not know what was in the minds of Jobs and Wozniak, but I have a feeling gaming was not in their minds when they built the company, and that premise is still alive today. ...
One of the first projects Jobs and Woz worked on together was Atari's Breakout. A Breakout clone was also one of the first programs Woz wrote for the Apple II.

Gaming is big business, especially Esports ...
Isn't total mobile game app revenue greater than PC esports revenue?
 
Look, the M1, M1 Pro, and M1 Max are really impressive, but the fact of the matter is that if you want to play AAA titles, it's not going to be on the Mac. And I'm not trying to throw shade; it's reality.

Now, as far as your price example goes, here's one. I purchased an HP Omen 30L directly from Amazon for $1,999.99 with the following specs: i9-10850K, 32GB of RAM, 1TB NVMe, Wi-Fi 6, RTX 3080. I can play the latest games in 4K pushing 60 FPS.

The problem with the Mac is the same one Linux has with games: who is willing to spend time writing games to run on a Mac? Game publishers are budget-constrained, and in order to recoup their enormous investments they have to target the biggest audience, and that's the PC crowd.

It’s no different with the consoles. Publishers and developers may target a specific console because of the user base. So regarding the Mac, it’s not about specs but rather user base, and who wants to invest in making games run on the M1-series chips. It did not happen when Apple was using Intel chips, so it’s highly doubtful things will change with the M chips.

That said, I’m very impressed with what Apple has pulled off.
Did you get it on sale? The 3080 Omen 30L desktop I find on Amazon is $2,600.
 
I do not know what was in the minds of Jobs and Wozniak, but I have a feeling gaming was not in their minds when they built the company, and that premise is still alive today. I believe Apple's core belief for its computers is that of learning and education, not gaming, because gaming on a machine that has the power to do so much more just seems dirty.
And I think that's what has helped Apple become as successful as they are today: being very clear about who their intended audience is and catering specifically to them. That clarity lets them make compromises in areas they know won't matter to their target demographic, so they can better focus on the areas that do.

Consider what you would have needed to do to make a viable gaming laptop or PC: big, bulky, power-hungry graphics cards that require a lot of space for ventilation and cooling. That goes against what Apple laptops are famous for - being thin and light, having great battery life, and being capable of sustained performance while not plugged into a power source. The competition had a hard time offering this even back when Apple was still using Intel chips in their Macs (because they chose to emphasise raw paper specs over the user experience), and I expect this gulf to widen further once the M1 MBPs ship and make their way into the hands of more users.

Not to mention that the people passionate about gaming are also the sort to spend hours online shopping for the cheapest PC parts so as to eke out maximum spec for their money, which is at odds with Apple selling you the entire prebuilt, integrated package at a premium. The money just isn't there.

I don't think Apple has some vendetta against gaming specifically. They just think that it's not something their user base is going to be interested in, they know it's not why people buy Macs, and so they don't bother going out of their way to support it. Not when they already have a very lucrative gaming market in the form of the iOS App Store (even if the majority of it is dominated by freemium games).

And Apple seems to have made the right call here.
 
I don't think Apple has some vendetta against gaming specifically. They just think that it's not something their user base is going to be interested in, they know it's not why people buy Macs, and so they don't bother going out of their way to support it. Not when they already have a very lucrative gaming market in the form of the iOS App Store (even if the majority of it is dominated by freemium games).
I'm of the opinion that Apple was limited by the choice of components they could use to build the Macs they wanted to build, i.e. thin and light notebooks. With Apple Silicon, they are no longer constrained. It's always good to sell more Macs to gamers if possible, but seeing that this market is niche, Apple chose to serve the productivity market instead.
 
I'm surprised that nobody has been talking about this: the issue of extending the life of existing Intel Macs for gaming or graphics programming purposes. We know macOS still has many years of support life left, so an Intel Mac is still viable IMO.

I've just come across this post by Sonnet about their new eGPU enclosures. I think it shows there's still plenty of life in Intel Macs. The GPU compute performance alone of an RX 6900 XT, compared to the M1 Pro and M1 Max shown, says something else...

[Attachment 1873286: Sonnet chart comparing GPU compute of an RX 6900 XT eGPU against the M1 Pro and M1 Max]

Looking at this graph, if it were me, I would probably choose an Intel MacBook Pro and have it fitted with the Sonnet Breakaway Box and a Radeon RX 6900 instead of an M1 Mac. Either that or have one laptop of each platform.

I can only imagine the total cost, given the RX 6900 is between $2,700 and $3,700 here in Australia. And if you’re going to tether your graphics card to your laptop with an umbilical cord, you may as well buy a desktop and stop pretending that you have a mobile option.
 
I'm surprised that nobody has been talking about this: the issue of extending the life of existing Intel Macs for gaming or graphics programming purposes. We know macOS still has many years of support life left, so an Intel Mac is still viable IMO.

I've just come across this post by Sonnet about their new eGPU enclosures. I think it shows there's still plenty of life in Intel Macs. The GPU compute performance alone of an RX 6900 XT, compared to the M1 Pro and M1 Max shown, says something else...

[Attachment 1873286: Sonnet chart comparing GPU compute of an RX 6900 XT eGPU against the M1 Pro and M1 Max]

Looking at this graph, if it were me, I would probably choose an Intel MacBook Pro and have it fitted with the Sonnet Breakaway Box and a Radeon RX 6900 instead of an M1 Mac. Either that or have one laptop of each platform.
What's the source of that graph? What do the numbers at the bottom represent?
 
== The Death of the Graphics Card ==

It looks to me like two things are going on.

First, Apple is being smart about how it approaches this. Remember when Jobs came back and said that Microsoft was not just Apple's competitor, "it was the environment"?

He knew that Apple needed to carve out its niche in a world of Windows. Apple couldn't compete head on with Windows. It had to fall back, pare down its offerings and focus like a laser.

The critics cried, "But you'll shrink into oblivion if you don't compete head on!"

Jobs knew better. It was, in fact, the only way for Apple to survive and then eventually thrive: provide the best product to a niche of the market (graphics/creative) so that they would choose Apple first, a niche where Apple could command its high profit margins. "Besides," he said, "no one complains about BMW having just 2% market share."

Apple simply couldn't throw the resources (even now, I would argue) at gaming and hope to get a foothold. The environment for gaming is Windows on x86 with discrete cards.

Having observed this industry for forty years, I see a second force at work here, and that is the inexorable march of technology.

It's clear that the days of the discrete graphics card and chip are numbered. We used to spec math coprocessors, but the general CPU became good enough and they became unnecessary. Same thing with discrete sound cards. Discrete chips eventually get relegated to specialized niches.

The same thing is happening here. 90% of gamers will do just fine with the capabilities of the new M1 chips: no need for a discrete graphics processor. Over time, Apple will even figure out a way to provide hardware-accelerated ray tracing.

The disappearance of the graphics card market is likely completely foreseen by Nvidia, which is why it wants to diversify and wants to buy ARM.

At the same time, the only hope Apple would have of getting into gaming is when a significant technology leap occurs that lets it change the rules of the game, which is exactly what's happening right now with the M1 chips and their more-than-capable built-in GPUs.

The two forces, the march of technology and the patience to wait for a major technology leap, have come together. The stage is still being set, and it won't be ready for another five years, when most macOS users are on Apple Silicon.

Apple might start making early moves in the next few years but only in five years does it make sense for Apple to seriously go after the high-end gaming market currently owned by Windows. That's when most of the Mac market will be able to run games without any additional hardware because they will be on Apple Silicon.
 
No way. There's a lot more to it, and the M1 Max simply isn't designed for it. Not saying it'd be bad, but there is simply no way Apple made an SoC that can hang with the consoles yet.
Hey, they made an SoC that can hang with the Switch, and IT is a console!
 
I’ve always found it puzzling that Apple has no interest in that market, given how important gaming is to their iOS money machine.

But that’s just it, right? On iOS they get a portion of the price of every game sold, so they go out of their way to make iOS an excellent gaming platform.
It’s more so that iOS devices have sold in the BILLIONS (dwarfing macOS sales), meaning that any decent developer can realistically see a potential million-plus in sales if they have a suitably engaging game (Flappy Bird, for instance). Apple didn’t have to pay the Flappy Bird developer to make it; that developer produced it because they saw value in developing for the platform. Developers DON’T see value in developing for macOS. Perhaps if macOS were selling in the billions, we’d see the same thing happening there.
 
This does sort of clarify that the pricing of the next Mac Pro may not be all that low.
Yeah, Apple’s been fairly clear with their pricing since the most recent iMac Pros and Mac Pros. The people who NEED those devices for work are the ones who won’t balk at the price no matter how much it is, because they realize that having the FASTEST Mac made gives them an edge over their competition. The iMac Pro is going to start around the high end of the MacBook Pros and go up. The Mac Pro will overlap the high end of the iMac Pro and go up.
 
Making users happy doesn't always mean more $$$. You have to invest a significant amount of money into helping developers and provide dedicated teams to get AAA games launched.
The thing that drives publishers/developers more than ANYTHING else is: how many systems are out there? How many potential customers can I reach? If it’s hundreds of millions, of which some small percentage would want to buy my product, they’ll figure out some way to release something. Even if it fails, they might have another shot at it, because the potential upside is enormous. Apple could do everything short of helping the publishers hire developers and then placing five Apple-trained developers on every dev team, BUT if the potential market isn’t there, no AAA game is going to be launched.
 
I do not know what was in the minds of Jobs and Wozniak, but I have a feeling gaming was not in their minds when they built the company, and that premise is still alive today.
As someone else posted, Wozniak famously wanted the capabilities of the Apple II to be such that you could write Breakout in BASIC.
 
Such illiterate logic.
Apple’s marketing and their fanbase never fail to impress me, especially on MacRumors…

10.4 TF doesn’t mean it can actually use 10.4 TF if it’s power-limited to around 60W and probably even bandwidth-starved (it lacks superior L1 and L2 caches, doesn’t use a unified L3 cache, lacks a geometry engine, and doesn’t support techniques such as VRS and storage APIs…).

The AMD Radeon VII is actually 14.9 TF and the AMD Vega 64 is 13.4 TF, and both are slower than the PS5 and Series X, significantly slower in fact!

Don’t fall for the fake marketing, people. By no means are these MacBooks slow or anything, but if you actually believe they’re faster than a PS5, you have to seek help…

For consoles to consistently hit their maximum TFLOPS performance, they actually use this kind of heatsink.

[Attachment 1871105: photo of a console heatsink]


Let's assume the M1 Max gets more than 60W, and they put in all the heatsinks they want. Does this mean it can be more powerful than a PS5 in theory?

Does this mean the M1 Max is actually more powerful but limited by its power intake? So it's actually more powerful than they are advertising, just limited by power consumption? What happens if you put it in a Mac Pro and put a brick of a heatsink on top of it?
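As a rough illustration of where headline figures like 10.4 TF and 10.3 TF come from, here is a back-of-the-envelope sketch: peak FP32 TFLOPS is just ALU count × 2 FLOPs per fused multiply-add × clock speed. The ALU counts and clocks below are assumptions based on commonly reported figures, not official Apple or Sony specs, and the number says nothing about how much of that peak a given power or thermal budget lets you sustain.

```python
# Sketch: peak FP32 TFLOPS = ALUs * 2 FLOPs per FMA * clock (GHz) / 1000.
# The ALU counts and clocks are assumptions from commonly reported figures,
# not official specs; sustained throughput also depends on power, thermals
# and memory bandwidth, which is the whole argument in this thread.

def theoretical_tflops(alus: int, clock_ghz: float) -> float:
    """Peak FP32 TFLOPS assuming one fused multiply-add (2 FLOPs) per ALU per cycle."""
    return alus * 2 * clock_ghz / 1000.0

gpus = {
    "M1 Max 32-core GPU (assumed 4096 ALUs @ ~1.3 GHz)": (4096, 1.30),
    "PS5 GPU (assumed 2304 ALUs @ ~2.23 GHz)": (2304, 2.23),
}

for name, (alus, clock) in gpus.items():
    print(f"{name}: ~{theoretical_tflops(alus, clock):.1f} TFLOPS peak")
```

Two very different designs land on almost the same headline number, which is exactly why the posts above keep arguing about power draw and bandwidth rather than the TFLOPS figure itself.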
 
I'm surprised that nobody has been talking about this: the issue of extending the life of existing Intel Macs for gaming or graphics programming purposes. We know macOS still has many years of support life left, so an Intel Mac is still viable IMO.

Looking at this graph, if it were me, I would probably choose an Intel MacBook Pro and have it fitted with the Sonnet Breakaway Box and a Radeon RX 6900 instead of an M1 Mac. Either that or have one laptop of each platform.
If it were me, I would choose a PC tower fitted with a Radeon RX 6900. A laptop continuously tethered to an external box is no longer a laptop anyway and, if a Radeon RX 6900 were first and foremost essential to what I wanted to do with a computer, I could get better performance from a PC tower than from a Mac laptop with an external breakaway box. Then I could still have a Mac laptop for the Mac things I want to do, with all the benefits the SoC provides that an external breakaway box never will.
 
If it were me, I would choose a PC tower fitted with a Radeon RX 6900. A laptop continuously tethered to an external box is no longer a laptop anyway and, if a Radeon RX 6900 were first and foremost essential to what I wanted to do with a computer, I could get better performance from a PC tower than from a Mac laptop with an external breakaway box. Then I could still have a Mac laptop for the Mac things I want to do, with all the benefits the SoC provides that an external breakaway box never will.
This.

I still want an M1 Max Mac 'cause FOMO, though.
 
What's the source of that graph? What do the numbers at the bottom represent?
It comes from Sonnet Tech, and represents the computational power of the chips (according to them).

If you use their Breakaway eGPU box, it's probably the best upgrade you can get for your old Thunderbolt Intel Mac today, allowing you to future-proof your setup for a few years at least. At least with an RX 6900 XT, it will work with all the latest macOS releases and work great in Windows/Boot Camp.

> https://www.sonnettech.com/product/amd-radeon-rx6900xt-bundles/overview.html
 
They’re in demand for crypto mining, not gaming.


This 8x RTX 3090 machine makes $128,000 per year:

Well, true, but I know a lot of gamers who had to swallow those prices to play games... I ended up buying a PC with a 3080 installed before those were sold out (and probably harvested and resold by miners :) )
 
Let me guess - you don't think the design flaw is that it looks like a MacBook Pro from 2009, or that it's got feet, or a stupid HDMI port. You think the design flaw is the notch, don't you.
It’s a MacBook PRO, not a MacBook Air.
Why care if it looks like 2009? Or are you the same sucker who bought the 2013 Mac Pro?
 
You actually believe that Apple has more graphics performance than a PS5?
Just for your own knowledge, the PS5 is outperforming a 14 TF Radeon VII GPU…

Keep in mind that Apple's 10.4 TF is the max theoretical performance, which you won’t come close to at 60W of power usage.
You have to compare the whole system.
A Radeon VII GPU on an HDD and PCIe 3.0 lanes will be held back by a lot.

The bottlenecks must be equal for the comparison to be fair.
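A minimal sketch of that "compare the whole system" point: a pipeline is only as fast as its slowest link, so a monster GPU behind a slow disk or bus never gets to show its paper numbers. The throughput figures below are illustrative assumptions, not benchmarks.

```python
# Minimal sketch: effective throughput of a serial pipeline is capped by its
# slowest stage. Figures are illustrative assumptions in GB/s, not benchmarks.

def bottleneck(stages: dict) -> tuple:
    """Return (name, GB/s) of the slowest stage."""
    name = min(stages, key=stages.get)
    return name, stages[name]

old_box = {"SATA HDD": 0.15, "PCIe 3.0 x16": 16.0, "GDDR VRAM": 480.0}
console = {"NVMe SSD": 5.5, "GDDR6 shared memory": 448.0}

for label, system in (("Big GPU in an old HDD/PCIe 3.0 box", old_box),
                      ("PS5-style console", console)):
    name, gbps = bottleneck(system)
    print(f"{label}: limited by {name} at ~{gbps} GB/s")
```

Upgrading any one stage only helps if it was the bottleneck, which is why comparing a GPU in isolation across two very different systems tells you so little.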
 
In consoles such as the PS5 or Xbox, there is no separate system memory. They only use GDDR6 video memory at 448GB/s, and those systems pull around 220W to fully switch all transistors for heavy-duty tasks.

For example, a 100W mobile RTX 3080 has the same TFLOPS as the 160W RTX 3080, but there is a 30-40% difference in performance due to power limitations. I refuse to believe a 60W MacBook GPU can actually do 10.4 TF. It would probably consume 200W+ to reach its theoretical 10.4 TF.
There is no need to refuse to believe anything; the facts are, well, the truth.

GDDR6 is extremely power hungry and has high latency, but it's fast and cheaper. LPDDR5 is energy efficient and low latency, but slower and more expensive. Now Apple has made their own solution to still get 400GB/s of bandwidth.

The RTX 30-series chips sit on a separate board, wasting energy and resources transporting information between the CPU and GPU.

The M1 Max has shared memory, with the CPU and GPU cores quite literally soldered together, limiting wasted transfers and clock cycles spent just waiting for information to travel back and forth.

The M1 Max likely uses between 70-90W at max power for combined CPU and GPU usage.
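To put a rough number on those "wasted transfers," here is a sketch of how long it takes just to move a buffer between CPU and GPU memory over PCIe, versus a unified-memory design where no copy happens at all. The buffer size and link speeds are assumptions picked for illustration, not measurements of any real game.

```python
# Sketch: time to copy a buffer across PCIe vs. not copying at all with
# unified memory. Link speeds and buffer size are illustrative assumptions.

PCIE3_X16_GBPS = 16.0   # ~16 GB/s each way, theoretical peak
PCIE4_X16_GBPS = 32.0   # ~32 GB/s each way, theoretical peak
BUFFER_GB = 0.5         # e.g. a big batch of textures/geometry for one frame

def copy_time_ms(size_gb: float, link_gbps: float) -> float:
    """Milliseconds to move size_gb across a link, ignoring latency and overhead."""
    return size_gb / link_gbps * 1000.0

print(f"PCIe 3.0 x16: ~{copy_time_ms(BUFFER_GB, PCIE3_X16_GBPS):.1f} ms per copy")
print(f"PCIe 4.0 x16: ~{copy_time_ms(BUFFER_GB, PCIE4_X16_GBPS):.1f} ms per copy")
print("Unified memory: ~0 ms (CPU and GPU address the same RAM, no copy needed)")
```

At 60 fps a frame budget is about 16.7 ms, so even one copy like that per frame would blow it, which is the efficiency argument being made above.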
 
My point is that the article is 99% wrong, and the MacBook's actual GPU performance is nowhere near the consoles' performance.

Maximum theoretical TFLOPS doesn’t mean you can hit those numbers if the GPU in the Mac is power- and thermal-starved.

For the consoles to hit their max 10.3 TF performance, they actually use massive heatsinks and consume 200W+. Let that sink in for a moment, and now imagine Apple’s 60W 10.4 TF false marketing…


View attachment 1871101
Can you guess what kind of GPU they use? Yep, AMD: the worst GPUs on the planet, and they eat power like Chrome eats RAM. A normal laptop 3050 Ti likely outperforms them.

And Apple actually compared it to a 3050 Ti and a 3080 laptop. An actual laptop you can benchmark and play games on uses ~160W in total.
 
I'm a bit late to this post but I have a few questions:

1) What's the big hoopla that the GPU performance is so much better than the regular M1 or Intel MacBook Pros? I can see some MacBook Pro users wanting to do professional video editing, but wouldn't the vast majority of pro video editing Mac users want an iMac with a larger screen? Or are these professionals dumping thousands into multiple external displays attached to their MacBook Pros because the iMacs have some kind of display and/or GPU performance limitation?
You are quite late indeed. Apple has slowly been replacing their computers, from the lowest end to the high end, with their own new CPUs.
The first M1 was essentially a modified iPad chip; these new ones are upgraded, with more CPU and GPU cores, added hardware codecs, and more RAM, from 16GB up to 64GB if needed.
2)I'm curious why Apple didn't release an M1 Pro/Max that had more CPU cores. There have been 16 mobile CPUs for quite some time now that are obviously aimed at pro users for about $1500 in the Wintel world. Me? My need for crunching power is CPU, not GPU. I'm really not running video intensive apps nor am I pushing to displays above 1920x1080.
Well, they did. They went from 4 performance and 4 efficiency cores to 6/8 performance and 2 efficiency cores, and from 8 GPU cores to 14/16/24/32 GPU cores. These are roughly equal in power to an i9 with an RTX 3060. If you aren’t interested in that performance, you can always use the MacBook Air or 13-inch MacBook Pro, with 18h battery life and the power of an i7-11700K.
3)Is the M1 Pro simply the M1 but with 2 more CPU cores (and various more GPU cores)? Not that Apple would admit it, but I didn't see any clear definition of the M1 vs. M1 Pro in the Youtube video beyond cores and transistors.
A lot of hardware encoders/decoders, LPDDR5 RAM (much faster), and different cores, since they draw more power.
Overall I'm curious why Apple is stressing the GPU performance so much. I'd love to hear some real-world, detailed examples of why/how/when all the extra 16/24/32 GPU cores are going to help.
GPUs are vastly more efficient than the CPU at specific kinds of tasks. The core count isn’t normally advertised; an RTX 3080 has about 8,700 cores, for example. They just allow the processing unit to do more things in parallel and divide up tasks instead of everything landing on one core. This is why CPUs for a long time barely had more than 4 cores: that kind of parallelism has never been used as heavily there as it is on GPUs.

Small example
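Since the idea is easier to see in code than in prose, here is a minimal sketch (not anything Apple ships): the same tiny per-pixel function applied across a whole frame, with the work split into independent slices the way a many-core GPU would, instead of one core walking every pixel. The resolution and core count are just assumptions for the example.

```python
# Sketch: split a frame of pixels into independent slices, one per GPU core.
# On real hardware all slices run simultaneously; numbers are illustrative.

def shade(pixel: int) -> int:
    """Stand-in for per-pixel work (lighting, texturing, etc.)."""
    return (pixel * 31) & 0xFF

FRAME_PIXELS = 3456 * 2234   # 16-inch MacBook Pro native resolution
GPU_CORES = 32               # M1 Max GPU core count

slice_size = FRAME_PIXELS // GPU_CORES
slices = [range(i * slice_size, (i + 1) * slice_size) for i in range(GPU_CORES)]

# Shade just the first slice here (plain Python has no GPU to hand the rest to);
# the point is that every slice is independent of the others.
_ = [shade(p) for p in slices[0]]
print(f"{FRAME_PIXELS:,} pixels -> {GPU_CORES} independent slices of {slice_size:,} pixels each")
```

A CPU with 8 or 10 cores has to chew through slices three to four times bigger per core, and a single core would do the whole frame alone, which is why throwing lots of small GPU cores at this kind of work wins.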
 
Can you guess what kind of GPU they use? Yep, AMD: the worst GPUs on the planet, and they eat power like Chrome eats RAM. A normal laptop 3050 Ti likely outperforms them.

And Apple actually compared it to a 3050 Ti and a 3080 laptop. An actual laptop you can benchmark and play games on uses ~160W in total.
RDNA2 is about as good as Turing for RT and keeps up with or exceeds Ampere for rasterization performance. I wouldn’t call them the worst GPUs.
 
RDNA2 is about as good as Turing for RT and keeps up with or exceeds Ampere for rasterization performance. I wouldn’t call them the worst GPUs.
Considering the only alternative is Nvidia, then yes, they are the worst.
AMD quite literally has no card available that can compete with Nvidia.

Their 6900 XT costs double the 3080, and it would be better to go with two 3080s or a Ti.
And the 6800 XT is a power-consuming beast equivalent to the 3070.

But sure, Intel will be the worst when they have their GPU available.
 