Faster x86 chips are coming: Arrow Lake, Zen 5, and Granite Rapids Xeons.
And faster is coming from Apple very soon as well: the M3. And my guess is that the switch to 3nm alone gets the chip 40% faster than the M2.

I think one thing people forget is that these Apple chips are QUIET. No PC tower sitting under our desk with a fan running constantly. Speed is paramount, sure. But having something quiet that uses less energy is also a benefit.

The reality is, the M-series chips are competing head-to-head with AMD and Intel. That’s great! The big difference maker though (with the chips being roughly equal in performance) is that with Apple you get macOS paired with the iOS ecosystem, and with Windows you get… a bunch of problems.

It’s like the difference between iOS/Android. You couldn’t PAY me to switch to Android. And you couldn’t PAY me to switch to Windows.

The performance of the M chips allows me to be confident in sticking by that statement.
 
The i9 wouldn’t have been in a Mac Pro, so it wouldn’t have been an appropriate answer to the question.
There is nothing wrong with using an i9 for the Mac Pro. Sure, Apple used to use Xeons, but the new M2 is closer to an i9 than to a Xeon: no ECC RAM, no server/workstation-level I/O (which is what Xeons give you).
 
And faster is coming from Apple very soon as well: the M3. And my guess is that the switch to 3nm alone gets the chip 40% faster than the M2.

I think one thing people forget is that these Apple chips are QUIET. No PC tower sitting under our desk with a fan running constantly.

Speed is paramount, sure. But having something quiet that uses less energy is also a benefit.
Yeah, so far Apple has not been very good at delivering new M chips quickly. Intel delivered more upgrades to its chips after the M1 was released than Apple did. If the trend continues, the gap between the performance of Intel chips and Apple chips will only grow larger.
 
There is nothing wrong with using an i9 for the Mac Pro. Sure, Apple used to use Xeons, but the new M2 is closer to an i9 than to a Xeon: no ECC RAM, no server/workstation-level I/O (which is what Xeons give you).

Right. But here’s what the question was:

“what processor would be today's equivalent within intel/amd series, as of today? 12th or 13th gen, but which xeon specifically?”

So I gave the two high-end Xeon W options.
 
What I was pointing out is Apple's priority for Mac gaming.

If it took them three years to ship the Game Porting Toolkit using the DXVK author's code, then Mac gaming isn't that high on their priority list.

Apple makes more on games than PlayStation, Xbox, Nintendo, Activision, and Windows COMBINED.

And yet their effort to get into Mac gaming is lazy...

This Apple engineer said no DXVK was used: “We built a dxil to metallib converter and directx11 and directx12 to Metal runtime translator. Non graphics APIs are translated by Wine. But we don’t use any tech from moltenVK or DXVK or spirv-cross etc. The shader converter can be shipped by games and can be used in the game developer asset pipelines.”
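For context, here is a minimal sketch of the pipeline that quote describes, in Python pseudocode with placeholder names I made up (these are not Apple's actual tools or APIs): shaders are converted offline from DXIL to metallib, D3D11/D3D12 calls are translated to Metal at runtime, and non-graphics APIs go through Wine.

```python
# Minimal sketch of the translation layers the Apple engineer describes.
# Every function and mapping below is an illustrative placeholder,
# NOT Apple's actual API.

def convert_dxil_to_metallib(dxil: bytes) -> bytes:
    """Offline step: DXIL shader bytecode -> Metal library (metallib).
    Per the quote, games can ship this output or run the converter
    in their asset pipelines."""
    return b"MTLB" + dxil  # stand-in transform, not a real compiler


def translate_d3d_call(call: str) -> str:
    """Runtime step: map a D3D11/D3D12 API call to a Metal equivalent;
    anything that isn't graphics is forwarded to Wine."""
    d3d_to_metal = {
        "ExecuteCommandLists": "commit MTLCommandBuffer",
        "Present": "present CAMetalDrawable",
    }
    return d3d_to_metal.get(call, "forwarded to Wine")


if __name__ == "__main__":
    lib = convert_dxil_to_metallib(b"DXIL...")
    print(len(lib), translate_d3d_call("Present"))
```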
 
No, it’s just that it’s incredibly boring to be in the replies to an article about an Apple SoC when you disagree with Apple’s SoC design decisions.
That doesn't make Apple right. Clearly, you did not see how the 2023 Mac Pro is totally messed up. In fact, Apple Silicon is still well known across various communities for poor CPU/GPU performance compared to same-era Intel/AMD/Nvidia chips. They only care about performance per watt, so performance restrictions are inevitable.
 
Why is there no NVIDIA 3080 or 3090 card in the list?
“Turning to a different comparison, the new Apple M2 Ultra's 220,000 Geekbench 6 Compute scores (Metal) sit between the GeForce RTX 4070 Ti (208,340 OpenCL) and RTX 4080 (245,706 OpenCL). For a direct Geekbench 6 OpenCL comparison, the Apple M2 Ultra OpenCL scores of about 155,000 are much closer to PC GPUs like the Nvidia RTX A5000 and AMD Radeon RX 6800 XT.”

Hope that helps give you some more insight. No, the GPU is not as powerful as the most powerful GPU on the market, but it comes close, which is a huge jump from the M1 Ultra, which completely had to leave the chat after 200 W. The M2 Ultra GPU is basically on par with what the average PC enthusiast has in their machine, and it can run comparably demanding tasks. That is impressive for an APU and puts the M2 Ultra about 10 years ahead of AMD in GPU performance on an APU.
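To make the gaps in those quoted scores concrete, here is the quick arithmetic in Python, using only the numbers from the quote above:

```python
# Percentage gaps between the Geekbench 6 Compute scores quoted above.
scores = {
    "M2 Ultra (Metal)": 220_000,
    "RTX 4070 Ti (OpenCL)": 208_340,
    "RTX 4080 (OpenCL)": 245_706,
    "M2 Ultra (OpenCL)": 155_000,
}
baseline = scores["M2 Ultra (Metal)"]
for name, score in scores.items():
    gap = score / baseline - 1
    print(f"{name}: {score:,} ({gap:+.1%} vs M2 Ultra Metal)")
# The RTX 4070 Ti lands about 5% below and the RTX 4080 about 12% above
# the M2 Ultra's Metal score, matching the "sits between" claim.
```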
 
Why would I, when the PC has way more potential than Apple Silicon, which Apple has restricted so heavily?
I don't find many restrictions in my work. The only limitations I have are with specific research software. Others might not be able to use Macs for their work, but many other people don't have such limitations. I'm not sure what restrictions you see Apple imposing. Is it because Apple isn't using an x86 processor? That they don't allow user-upgradable RAM? What restrictions do you see?

You can't do that? Well, that's Apple Silicon's problem, not the PC's.

I'm not sure what you mean by this. Are you concerned with Apple's engineering choices? You can contact them and offer suggestions for improvements.
You only care about performance per watt, which is totally meaningless and pointless, especially for desktop users who need high performance.
I never said that. I said it's a factor to consider. This is in part because Apple engineers make it a part of their processor design. The fact that Apple Silicon provides "high performances" while doing so efficiently is impressive. I have a Ryzen 5900X computer as my main workstation. My M1 MacBook Pro on battery does many things faster than my other system, which will use many more watts under load.
Besides, Intel's 13th gen is built on Intel 7, which is more like TSMC 7nm than 5nm, something you didn't even consider.
I'm not quite sure what you mean by this. Can you explain? You're also making a lot of assumptions about me that are incorrect.
It is Apple who decided to make performance-per-watt-focused chips with a lot of restrictions, so I don't see your point. I don't care about performance per watt when the performance is too low or poor for real-life work.
Yes, Apple engineers did. If you don't value it, that's fine. Other people do value it. Again, you’re welcome to reach out to their engineering team and explain how they are wrong and what they could do better.
That's how the Apple GPU failed: they compared the M1 Ultra to the RTX 3090, and yet it never came close. A typical Apple lie.
What does the GPU have to do with the CPU discussion? You mentioned Geekbench CPU results, which is what I was replying to. GPU is a different matter and bringing it up is a red herring in this lovely discussion.
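As an aside on the performance-per-watt point argued in this exchange, here is a sketch of what the metric actually measures, with completely made-up figures (no wattages or scores are given in the thread):

```python
# Illustrative performance-per-watt math. The scores and wattages here
# are invented placeholders, NOT measurements from this thread.
systems = {
    "laptop on battery": {"score": 12_000, "watts": 30},
    "desktop under load": {"score": 14_000, "watts": 150},
}
for name, s in systems.items():
    print(f"{name}: {s['score'] / s['watts']:.0f} points per watt")
# The desktop wins on raw score; the laptop wins per watt. Which
# number matters depends on whether you value throughput or efficiency.
```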
 
People often don't, but if a company relies on driving these machines to the max, then a 20% increase in performance may well pay for itself very quickly.

/edited for clarity
The net gains are much lower than 20%. Those 20% gains only kick in when the Studio reaches its peak performance. Most of the time, the performance difference from a single-generation upgrade is negligible, even for a business, which typically runs a 3-4 year cycle before the hardware is written off tax-wise.

PS: You mentioned you are a casual user. I think a casual user with a Mac Studio Ultra is pretty rare in itself :) What do you typically do with this machine?
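For what it's worth, the break-even math behind that 20% argument looks something like this; every input below is an assumption picked for illustration, since no prices or rates appear in the thread:

```python
# Hypothetical break-even calculation for a 20% performance gain.
# All inputs are invented for illustration only.
machine_cost = 7_000             # assumed upgrade cost, USD
hourly_value = 100               # assumed value of one working hour, USD
speedup = 1.20                   # the 20% performance increase discussed
hours_per_week = 40              # machine driven to the max all week

# 20% faster means the same work takes 1/1.2 of the time.
hours_saved = hours_per_week * (1 - 1 / speedup)   # about 6.7 h/week
weeks_to_break_even = machine_cost / (hours_saved * hourly_value)
print(f"Break-even after about {weeks_to_break_even:.1f} weeks")  # ~10.5
```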
 
No, it isn't. Your inefficient Intel chip throttles without water cooling. So you're the one without a choice. The M2 Ultra doesn't need water cooling and doesn't throttle either.
Because they are way more powerful and faster. Why do you ignore that?
 
If you feel that way, why are you here?
There is no point in even replying to this person now. He obviously doesn't know what he is talking about after getting caught posting bogus benchmarks, and then trying to compare the off-the-shelf performance a regular customer would get to a personally modified, souped-up machine. He's just going to keep replying with tall tales and stretches to try to save face, rather than engaging in logical discussion.
 