Right, because that Nvidia design is proven and has bested the best from AMD and Intel. No one in the consumer market cares about what processor is inside; they care about two things these days: the brand (a.k.a. the Apple logo) and the price, though I guess many go into debt just to buy the brand name.

Software drives everything. There's no way I would give up a PC (x86-based) computer for any Apple computer, any day. There's nothing on a Mac I'd die for, but there is software on a PC I'd die without.

I agree. That's why I'll be running an Intel Hackintosh for a long time to come. It's just too compatible. ARM is a far better processor for portable equipment, but that's not my main requirement right now. Maybe in 10 years' time ARM will be the more compatible choice. I wouldn't be surprised.
 
This is terrifying for those of us with loads of TB3 devices. How is this going to pan out? Intel could always tell Apple to piss off in regards to Thunderbolt.
https://en.wikipedia.org/wiki/USB4
The USB4 specification is based on the Thunderbolt 3 protocol specification.
The USB4 specification states that a design goal is to "Retain compatibility with existing ecosystem of USB and Thunderbolt™ products." But compatibility with Thunderbolt 3 is only optional for USB4 hosts and USB4 peripheral devices.
 
I wonder how they will fragment their lineup. Usually the biggest differentiator was processor speed. Will they ship different versions of the A14 for different Macs?


They could. Could be completely new. Multiple CPUs, different core counts, clock speeds, etc. Could be more cache, or possibly some difference in GPU model/performance. Really hard to say.

I suspect they don't want to say much yet, as big, impressive specs could really crater sales now. They don't want to Osborne themselves.
 
Yeah, eGPU kind of lost steam and is basically dead already.

I gotta say, my eGPU works great and I love it. I'm using a 2018 MacBook Pro 15", Razer Core X, Vega 56. Games run really well booted into Windows 10. I don't see myself switching to an ARM MacBook Pro anytime soon; I'd have to throw together a Windows gaming box.
 
Five hundred bucks for the developer Mac. Not bad.


So what is an "Account holder of an eligible program"? Do I need a paid developer subscription, or something else?
 
They weren't using a Mac Pro. They were using the developer Mac minis that will ship this week with the A12Z chip. I think seeing the Pro Display made you think it was a Mac Pro, but all of Apple's current hardware can run that monitor. It's confirmation the developer minis will as well. Remember Craig F. said all the demos for Big Sur were done on that A12Z version of the OS. Considering the first real A-series Macs will come with an A14-derived chip, this makes me believe the new Macs that come out next year will be screamers.
You don't know what they were using. If they had used the developer Mac mini, they would have shown it sitting next to the 6K monitors. Not a single time did they show the computer itself. It's very likely they were demoing Big Sur on ARM using a much more powerful machine. Most likely the Macs in the demo were using the yet-unreleased A14X.
 
I agree it's a desktop-class processor, I'm with you on that, but what I'd really like to see is Apple beating Intel on every count, no matter how you run the benchmark. The likes of Linus on YouTube should be shouting "oh my god" because, for a comparable processor, the Apple A-series chips blow Intel's out of the water.
Apple admitted all those years ago that PowerPC wasn't cutting it when they switched to Intel.

Do you really think they'd be switching to their own chips if they weren't at least as good as Intel's?

It would be suicide.
 
I would have. We’ll likely never know.

From what I understand, everything since the Pentium has been an x86 decoder around some sort of RISC-like core, so if you decompose the x86 instructions into ARM instructions, you've sort of built a modern x86 processor.

The question is, even if Apple never discloses it, I'm sure Intel will be very interested in figuring out whether they did this, assuming they believe such a setup would infringe on their patents. Is a hardware x86 decoder already a patent infringement?

Anyway, a big part of the fun of doing a desktop Apple ARM processor would surely have been in this department, so it feels like something Apple would definitely have done if their lawyers thought they could get away with it.
 
That's the way to make a presentation look good without opening yourself up to reliable and accurate criticism.

Yes.

But it is also a way not to kill all your hardware sales today by promising gear that won't ship for quite some time.

Obviously folks will bench the heck out of the first new hardware; Apple can't hide from it or sweep it under the rug. They know that, and we all know that. If there are no substantial improvements to the cost/performance/heat/battery-life matrix...this transition will be a flop. There is nothing to bench yet (that we can buy or even see), so we have to wait.

Honestly...if they gave us benchmarks today, many would not believe them anyway, without production hardware that anybody could test and verify. Pointless.
 
You don't know what they were using. If they had used the developer Mac mini, they would have shown it sitting next to the 6K monitors. Not a single time did they show the computer itself. It's very likely they were demoing Big Sur on ARM using a much more powerful machine. Most likely the Macs in the demo were using the yet-unreleased A14X.
They said in the keynote that they were using the dev kit for all the demos.
 
Doesn’t that just mean it complies to the 8.4a instruction set?

I’m trying to find a recent article I read about how Apple licenses the instruction set, and makes its chip compliant to that set, but the chip itself is not merely a modified reference design.

I’ll see if I can dig it up in my history.

In the meantime, can anyone with a CPU background (like cmaier) chime in on this?
Ah sorry, I'm not sure to be honest. I was caught up in someone else asking whether it's "AArch64". But the wiki page does specifically say the microarch is ARMv8.4-A. How much of that is just complying with a spec and how much is an ARM design is for someone wiser than me to say.
 
First of all, if AMD/Nvidia need to compile their driver for ARM, they will just do it. You don't have to rewrite the driver at all; it is the SAME code.
In any case, if Apple wants to lock down the platform, they can do it anytime, ARM or not.

Why is Apple moving to ARM? Did you watch the keynote? Johny explained it: with ARM they have the opportunity to offer much better performance in the same power envelope as competing platforms based on x86.

I disagree. Low-level drivers are much more performance-sensitive (especially in graphics) and much more dependent on system architecture and hardware. It's not the same code at all. Some of it will be in assembly, but more importantly, the chips the driver is talking to are completely different. If Apple doesn't provide Nvidia with the relevant information, then Nvidia can't do much about it in practice.

I hate to break it to you, but corporations lie and mislead for their own gain. Apple, of course, is selling this transition as a great boon to the customer, but of course they would. In fact, it has serious downsides for many people, with a severe loss of compatibility. What is beneficial for Apple is that it now fully controls its platform, makes it more compatible with the iPhone platform, and adds yet another point of differentiation, and incompatibility, with the competition. This has been Apple's style for some time.

Now, as Apple transitions, it gives itself the perfect excuse to reject various third-party add-ons for "compatibility reasons" and increase its profitability. Again, this has been Apple's style for a while now.
 
From what I understand, everything since the Pentium has been an x86 decoder around some sort of RISC-like core, so if you decompose the x86 instructions into ARM instructions, you've sort of built a modern x86 processor.

That’s not quite right. By the way, Exponential tried to use PowerPC as the “core” around which an x86 chip was built. Anyway, there are always little CISC-isms that creep into everything, so there is never a real RISC core at the heart of any x86-64 machine. But it’s certainly not like the real old days where there is a massive amount of spaghetti logic tied to hundreds of control signals and massive pipeline exceptions for different x86 instruction formats.
 
The 1327 is only single-core performance, not multi-core. For comparison, the 2020 A12Z iPad Pro gets ~4600 multi-core, while the 2020 13" MacBook Pro (4 cores) gets ~4500 multi-core. So very close, and in a much more challenging environment (no fans, in a much smaller, lighter enclosure).

There's no doubt they are going to design chips that take advantage of each range of device's needs, so a 5nm laptop or desktop A13 (or whatever they call it) running with a lot fewer constraints is going to smoke those numbers whether they increase cores or not.

I was more pointing out that the 8-core A12Z only had 4 high power cores. 4600 multi core results align closely with that. We need more high power cores to take on the 8-core i9 MBP, or the i9 iMac. Which means more die space in an SoC that devotes quite a bit of space to ASIC components, the secure enclave, and the like. So hopefully they move to a chiplet design like AMD as they work on the higher end versions. To make things easier on themselves, but who knows.

PowerPC demonstrated you can’t just increase the power envelope or clock speed to scale your performance up. I agree the A12Z is capable. I also don’t doubt it’s possible to be competitive, but I don’t think Apple has demonstrated it well enough to take it on faith that they will not have teething pains on the high end.

What I’m concerned about are mostly yield/stability issues going to larger dies, more cores, higher clocks, etc. And I think it’s fair to not necessarily take them at their word until they deliver the goods, as they move into this space.
 
I was more pointing out that the 8-core A12Z only had 4 high power cores. 4600 multi core results align closely with that. We need more high power cores to take on the 8-core i9 MBP, or the i9 iMac. Which means more die space in an SoC that devotes quite a bit of space to ASIC components, the secure enclave, and the like. So hopefully they move to a chiplet design like AMD as they work on the higher end versions. To make things easier on themselves, but who knows.

PowerPC demonstrated you can’t just increase the power envelope or clock speed to scale your performance up. I agree the A12Z is capable. I also don’t doubt it’s possible to be competitive, but I don’t think Apple has demonstrated it well enough to take it on faith that they will not have teething pains on the high end.

What I’m concerned about are mostly yield/stability issues going to larger dies, more cores, higher clocks, etc. And I think it’s fair to not necessarily take them at their word until they deliver the goods, as they move into this space.
They don’t need a chiplet design. Arm cores are tiny compared to x86 cores.
 
It's the end of gaming on Macs (bye-bye Boot Camp). And if Parallels can't run Windows, it's also bye-bye Mac in business environments.
1. Stop being dramatic

2. Parallels can run Windows; you can even see the Windows icon in the screenshot of the Linux VM and Docker running (look at the Dock)

3. Porting is nowhere near as big of a deal as it used to be
 