The X Elite has 45 TOPS (likely an INT4-goosed metric) of 'AI' processing, so yes, it has an NPU. AV1, HEVC, etc. media. AV1 encoding ... which Apple doesn't have.

But @NetMage claims:
Really? Just one example of how wrong you are:
All three chips in the M3 family also have an advanced media engine, providing hardware acceleration to the most popular video codecs, including H.264, HEVC, ProRes, and ProRes RAW. And for the first time, the media engine supports AV1 decoding, enabling power-efficient playback of streaming services to further extend battery life.

Unfortunately NetMage is reading-challenged. D60 is correct in his claim that AV1 *encoding* is not supported. I think the M4 is likely to support it... we'll see.

It comes into play because macOS is compiled to take advantage of the custom instructions and modes Apple has added to Apple Silicon and will not boot or run on a standard ARM core or understand Qualcomm's custom instructions. This isn't the x64 world.
There's some truth to this, but it's not clear how much. I've seen some handwaving about how conformance to ARM licensing requirements means you're wrong, but (for example) they obviously have custom instructions hidden behind their matrix libraries. Are those required to boot, or run? Not clear.
 
It is a gap.
But a gap that’s quickly growing thinner and thinner. And it was done pretty fast.

I don't know if it's actually growing thinner, but hey, competition is good.

(With Meteor Lake, oddly enough, the gap seems to be increasing. I'm guessing this is a result of this being Intel's compute tile generation. Guess we'll see what Arrow Lake brings.)

It comes into play because macOS is compiled to take advantage of the custom instructions

Is it? What binaries wouldn't run on an ARMv8 chip?

and modes Apple has added to Apple Silicon and will not boot or run on a standard ARM core

Yes, but that's largely a function of Apple having a different device tree (much like PowerPC OpenFirmware Macs did) than Qualcomm. (And ARM itself doesn't really have a standard for this at all.)

or understand Qualcomm's custom instructions.

But it doesn't need to. Those instructions, both with Apple and with Qualcomm, typically exist for further performance optimizations.

If you're right that core binaries are compiled against an instruction set that has them, then indeed booting probably won't get far. (Even then, I imagine you could do a hypervisor that emulates them.)
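For what it's worth, here's a minimal macOS-side sketch of how code normally deals with optional CPU features: probe at runtime, fall back to baseline ARMv8 otherwise. The hw.optional.arm.* sysctls are Apple's documented ones for standard ARM extensions; the private instructions behind their matrix libraries presumably have no public probe like this, which is why they stay hidden inside system frameworks. Just an illustration, not a claim about how macOS itself boots.

```c
// Minimal sketch (macOS-specific): probe optional ARM features at runtime
// instead of baking them into the binary. The hw.optional.arm.* sysctls
// cover standard ARM extensions only; Apple's private extensions have no
// public probe and are reached through system libraries instead.
#include <stdio.h>
#include <sys/sysctl.h>

static int has_feature(const char *name) {
    int value = 0;
    size_t size = sizeof(value);
    // sysctlbyname returns 0 on success and sets value to 1 if present.
    if (sysctlbyname(name, &value, &size, NULL, 0) != 0)
        return 0; // Unknown key: treat the feature as absent.
    return value;
}

int main(void) {
    // Baseline ARMv8 code runs regardless; extensions are opt-in fast paths.
    printf("FEAT_DotProd: %d\n", has_feature("hw.optional.arm.FEAT_DotProd"));
    printf("FEAT_BF16:    %d\n", has_feature("hw.optional.arm.FEAT_BF16"));
    return 0;
}
```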

This isn't the x64 world.

It isn't, but even the x64 world has both BIOS and EFI. Software can adapt.
 
You're basically repeating what I said, but with a bit more certainty than is warranted. The M4 may be out shortly after the Oryon, *if* Apple goes with a 1-year cadence.

Well, releases so far suggest that they're on roughly the same ~15-month cadence as the AxX: March '12, November '12, October '14, September '15, June '17, October '18, for an average of 14 months; and, for the M series so far, November '20, June '22, October '23, for an average of 17 months.
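A throwaway sanity check on the M-series gaps, using only the dates above (nothing more than month arithmetic):

```c
// Back-of-envelope check of the M-series gaps mentioned above
// (Nov 2020 -> Jun 2022 -> Oct 2023), expressed in months.
#include <stdio.h>

int main(void) {
    // {year, month} of each M-series debut, per the post above.
    int releases[][2] = { {2020, 11}, {2022, 6}, {2023, 10} };
    int n = sizeof(releases) / sizeof(releases[0]);
    int total = 0;
    for (int i = 1; i < n; i++) {
        int gap = (releases[i][0] - releases[i-1][0]) * 12
                + (releases[i][1] - releases[i-1][1]);
        printf("gap %d: %d months\n", i, gap);
        total += gap;
    }
    // Prints 19 and 16 months, averaging 17.5 -- roughly the ~17 cited.
    printf("average: %.1f months\n", (double)total / (n - 1));
    return 0;
}
```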


The Oryon will be out before the M4 though unless QC screws up *really* hard.

I think so, yeah.


Also, as I said, your claim (and mine) is valid from an engineering standpoint, but if QC decides to sell this chip at a price point low enough that Windows laptop vendors can sell Oryon laptops at a price competitive with a MacBook Air, then their comparison has some justification. It *still* won't be able to compete on battery life for heavy-duty users, but it might well compete just fine for light work (email, most browsing, typical office apps, etc.).

I think the comparison makes perfect sense. Apple markets the Air as "this is a great laptop, oh, and it runs macOS". They heavily compare against what they call "PC laptops". So why shouldn't QC do the same?

 
I think the comparison makes perfect sense. Apple markets the Air as "this is a great laptop, oh, and it runs macOS". They heavily compare against what they call "PC laptops". So why shouldn't QC do the same?
They should - that's not what I was saying.

I was saying that this is an issue of price points. From an *engineering* perspective, the proper comparison is the M3 Max, not the M3, and the Max beats the pants off the Oryon (at least with QC's own numbers they've released so far).

But.

If QC prices the chip so that a quality Oryon laptop (good construction, light, big enough battery, good screen, etc.) can be sold for the same price as a MacBook Air, then comparing their chip with the base M3 processor is entirely fair, from an end-user's standpoint. My $1200 or $1500 or whatever can buy an MBA, or for the exact same amount I can buy an Oryon laptop of equivalent quality? That's a strong argument. Of course, the big question is, will they do this? We don't know how big the chip is, or even what process it's on (it's "4nm" so my guess is TSMC N4P). That determines what it costs to make them.

Either way their window is pretty small. Intel E cores are growing like mushrooms, the M4 may well be 6P+6E, AMD is maintaining their pace... I want them to get a decent foothold in the market, because good competition is good for us all, but I think they've got even odds, maybe less, to make a dent.
 
Oh, also:
Well, releases so far suggest that they're on roughly the same ~15-month cadence as the AxX: March '12, November '12, October '14, September '15, June '17, October '18, for an average of 14 months; and, for the M series so far, November '20, June '22, October '23, for an average of 17 months.
That's pretty silly. Three data points for the Mx means just two intervals, and those were well known to be affected by Covid, supply-chain issues, and TSMC process delays. You can't really draw conclusions from that.

Their change in release timing for M3 models vs. M2 models is much more interesting, though it's still like reading tea leaves. We'll know by the end of 2024; before that, it's just speculation.
 
They should - that's not what I was saying.

I was saying that this is an issue of price points. From an *engineering* perspective, the proper comparison is the M3 Max, not the M3, and the Max beats the pants off the Oryon (at least with QC's own numbers they've released so far).

Right.

I think in terms of marketing, they do want this to compete with the M3, maybe secondarily M3 Pro, not the M3 Max. Or in Intel terms, with their U and P CPUs, not their H ones.

Which is why they picked a power draw / performance tradeoff where it's noticeably faster than the M3, as an "it's like the M3 but better" argument.

But.

If QC prices the chip so that a quality Oryon laptop (good construction, light, big enough battery, good screen, etc.) can be sold for the same price as a MacBook Air, then comparing their chip with the base M3 processor is entirely fair, from an end-user's standpoint.

Yep.

My $1200 or $1500 or whatever can buy an MBA, or for the exact same amount I can buy an Oryon laptop of equivalent quality? That's a strong argument. Of course, the big question is, will they do this? We don't know how big the chip is, or even what process it's on (it's "4nm" so my guess is TSMC N4P). That determines what it costs to make them.

I'd go further and expect they'll price it slightly below the Air. They do want to compete, after all.

Either way their window is pretty small. Intel E cores are growing like mushrooms, the M4 may well be 6P+6E,

Hmm, maybe. I'm not sure 6 p-cores on consumer CPUs buys Apple much. I think 4P+6E is more likely, for now. Maybe even 4P+8E for the M5. It costs almost nothing in terms of money, space, or energy, and it frees up some room on the p-cores. Or they do Intel's "what if we had two tiers of e-cores?" thing, even.

AMD is maintaining their pace... I want them to get a decent foothold in the market, because good competition is good for us all, but I think they've got even odds, maybe less, to make a dent.

Yeah.
 
Oh, also:

That's pretty silly. Three data points for the Mx means just two intervals, and those were well known to be affected by Covid, supply-chain issues, and TSMC process delays. You can't really draw conclusions from that.

Yeah, but it's all we have to draw on. I see no reason to think they're going with a 12-month cadence. They do for the iPhone because that's what the market expects there (and even there, it's now with an asterisk: the non-Pro iPhones are now 12 months behind). I would argue the M series replaces the AxX series (see: iPad Pro and iPad Air having moved to it), and that, on average, wasn't 12 months.

We'll know by the end of 2024; before that, it's just speculation.

I'm not sure we'll know for several years. Heck, it's always unclear with Apple whether a pattern exists at all.
 
The advances Apple have made with the M series so far have been incredible. Having just received the new M3 Pro 12/18-core model, I'm blown away by how blazingly fast it is compared to my previous base-model 2019 Intel 16-inch. A real-world side-by-side comparison on a KeyShot 11 jewellery rendering shows it to be about 17 times faster, owing, perhaps, to the new hardware-accelerated ray tracing.

There will surely be Windows users who are anticipating the same kind of revelation in chip design that we Mac users have been experiencing since 2020. (Although I've been running Windows 11 using Parallels, and it's great.)
 
Only good things can come out of this (hopefully).
  • If Qualcomm is worse, they'll try to catch up over time, widening support for the ARM64 architecture along the way, thus pushing software vendors.
  • If they actually get better, Apple will be pushed to innovate even more and we'll get better devices (sooner).
  • If they are about the same, then there's more choice hardware-wise. Some of you use, or are sometimes forced to use, non-macOS machines, and this will make them better.
  • They might even end their endeavour if they're unsuccessful, but then we're just back at square one.
  • (Unlikely) Or they might get so much better over the long run that Apple decides to switch, and in that case that's OK too.
Might even affect gaming.

All this while supporting an ARM company. Competition is a good thing, as many have already said.
Apple have pushed some of the main industry-standard software developers (like Adobe) to produce ARM software. Would I be correct in thinking that such software could be ported to ARM Windows with relative ease once there is hardware to run it?
 
Apple have pushed some of the main industry-standard software developers (like Adobe) to produce ARM software. Would I be correct in thinking that such software could be ported to ARM Windows with relative ease once there is hardware to run it?

Depends on the layer.

Low-level architecture-specific algorithmic code? Sure, having ported that to ARM64 helps them also do that on Windows. However, this is unlikely to be the bulk of Adobe's code. More and more of it will run on the GPU instead. Some might also run (implicitly) on the Neural Engine, by targeting Metal.

UI code? The bulk of the work there is OS-specific. Porting from macOS to Windows doesn't make sense. And porting from Windows x64 to Windows ARM64 probably isn't much work at all, since such code these days is unlikely to be very architecture-specific. It's mostly just a recompile.

So, TL;DR: yeah, having macOS ARM64 apps helps with Windows ARM64 some, but probably not that much.
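To make the layer distinction concrete, here's a toy sketch (hypothetical code, not anything from Adobe; the scale_buffer name is made up). The NEON path is keyed to the CPU architecture rather than the OS, so an ARM64 port done for macOS carries straight over to Windows on ARM, while the scalar fallback is "just a recompile" anywhere:

```c
// Toy illustration: the architecture-specific layer vs. the portable layer.
// The NEON path is tied to the CPU architecture, not the OS, so an ARM64
// port done for macOS also serves Windows ARM64; the scalar path compiles
// unchanged on any target.
#include <stddef.h>

#if defined(__aarch64__) || defined(_M_ARM64)
#include <arm_neon.h>

// ARM64-specific: 4 floats per iteration using NEON intrinsics.
void scale_buffer(float *dst, const float *src, size_t n, float k) {
    size_t i = 0;
    float32x4_t vk = vdupq_n_f32(k);
    for (; i + 4 <= n; i += 4)
        vst1q_f32(dst + i, vmulq_f32(vld1q_f32(src + i), vk));
    for (; i < n; i++)
        dst[i] = src[i] * k;
}

#else

// Portable fallback: works on any architecture without changes.
void scale_buffer(float *dst, const float *src, size_t n, float k) {
    for (size_t i = 0; i < n; i++)
        dst[i] = src[i] * k;
}

#endif
```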
 
Depends on the layer.

Low-level architecture-specific algorithmic code? Sure, having ported that to ARM64 helps them also do that on Windows. However, this is unlikely to be the bulk of Adobe's code. More and more of it will run on the GPU instead. Some might also run (implicitly) on the Neural Engine, by targeting Metal.

UI code? The bulk of the work there is OS-specific. Porting from macOS to Windows doesn't make sense. And porting from Windows x64 to Windows ARM64 probably isn't much work at all, since such code these days is unlikely to be very architecture-specific. It's mostly just a recompile.

So, TL;DR: yeah, having macOS ARM64 apps helps with Windows ARM64 some, but probably not that much.
Thank you. I did ask from a complete 'noob' standpoint, and appreciate your breakdown.
 
Intel plays similar games, comparing the MT performance of its 16-core Core Ultra 7 165H to that of the 8-core M3, instead of the 12-core M3 Pro or 16-core M3 Max:


There should be a balance between manufacturing cost, power efficiency, and performance.
Comparing 16 cores vs. 4 cores is legit if they are similar in the other specs.

Intel's 16-core Core Ultra 7 165H vs. the M3 is fair if they have similar cost and power efficiency. I mean, if you could use either one in an iPhone without sacrificing anything (price, performance, battery, etc.), then it doesn't matter if one has 1,000 cores and the other 2.

On the other hand, a 16-core / 80W / $200 CPU shouldn't be compared to a 16-core / 250W / $500 CPU even if both have similar benchmarks. They're apples and oranges; the only difference there would be the OS you're willing to use.
 
Intel's 16-core Core Ultra 7 165H vs. the M3 is fair if they have similar cost and power efficiency.

But they don't have similar power efficiency.

The 165H has a base power of 28W and a max turbo of 115W. That puts it somewhere between the M3 Pro and M3 Max.

Something closer to the M3 would be the 165U, at 15W/57W power. Even that is a bit above what the M3 draws.

(Cost is another question, sure. But a desktop CPU will always be cheaper, due to… lower power efficiency.)

 
Desktop/console gaming may be niche for Apple, but for Tencent, Sony, Nintendo, and MS it is not niche at all. And consoles are vertical to some degree. Still, an Xbox or PS5 is not designed along quite the same lines as an Apple device, where Apple controls the whole process.
You conflated desktop and console gaming, which I made a distinction between. Sony and Nintendo are exclusively involved in console gaming. Microsoft is involved in both, sure, but more people game on the Xbox than game on gaming PCs, I’d reckon, and Tencent is involved in whatever will make them money (Tencent’s involvement runs the gamut from freemium mobile titles to freemium PC titles, I don’t think they actually have any non-freemium properties, though). And I pointed out that the console market is largely vertical.

Push comes to shove, though, gaming is a niche hobby (and I say that as someone who used to be a quite serious gamer). It’s a niche that doesn’t make a lot of sense for Apple to seriously pursue for various reasons. More people play smartphone games, alas, than play even the most popular of consoles/handhelds.
 
You conflated desktop and console gaming, which I made a distinction between. Sony and Nintendo are exclusively involved in console gaming. Microsoft is involved in both, sure, but more people game on the Xbox than game on gaming PCs, I’d reckon, and Tencent is involved in whatever will make them money (Tencent’s involvement runs the gamut from freemium mobile titles to freemium PC titles, I don’t think they actually have any non-freemium properties, though). And I pointed out that the console market is largely vertical.

Push comes to shove, though, gaming is a niche hobby (and I say that as someone who used to be a quite serious gamer). It’s a niche that doesn’t make a lot of sense for Apple to seriously pursue for various reasons. More people play smartphone games, alas, than play even the most popular of consoles/handhelds.
I saw your comments about consoles, so I thought it was part of the conversation.

Going back to Apple's vertical integration: yes, it's very nice, but it doesn't cover every case, and I gave an example of that, gaming. Maybe it also extends to CAD/CAM and 3D apps, since Nvidia and AMD are far ahead of Apple in GPU performance. That was my point.

On gaming, I don't think it's a niche at all, considering the numbers we see from Sony, MS, Tencent, and even Apple. And while I agree that most people play on smartphones, most of them are casual gamers. Console/PC gamers are a very different group of gamers, who prefer a high-quality gaming experience.
 
Why did Qualcomm feel the need to make this announcement? I thought we already hashed this out almost 2 months ago.

The X Elite is a good first try, but it’s nowhere near as good as people pretend it is.
 
Posts like this could benefit from attached evidence... Data from somewhere, or a link to data somewhere that supports a given definition of "trounce". I'm not saying you're wrong, but it would be nice to not have to take it on faith.

CPU is 21,000 vs 15,000.

GPU, however, is around 195% faster.

Memory bandwidth of 400GB/s trounces the 136GB/s of the X Elite.

The X Elite has 12 cores, all of which can operate as big or little. Their low multicore score exposes a problem. They can’t ramp up all 12 cores to maximum performance. If they could their multicore score should be substantially higher. It could be a thermal issue but I think it’s the memory bandwidth. They can’t feed all cores fast enough to let them run wide open.
 
Here we go again. MINE IS BIGGER THAN YOURS! It all seems so childish. Sounds like they’re marketing to 15-year-old boys. And Apple is no better with their claims of superiority. I don’t buy Apple products because of benchmarks; I buy Apple products because of macOS, iOS, and iPadOS, and I’m certain I’m in the vast majority.
Well, you're not in the vast majority for Windows.
 
The X Elite has 12 cores, all of which can operate as big or little. Their low multicore score exposes a problem. They can’t ramp up all 12 cores to maximum performance. If they could their multicore score should be substantially higher. It could be a thermal issue but I think it’s the memory bandwidth. They can’t feed all cores fast enough to let them run wide open.

Probably a mix of both. The M1 Max can only saturate about 243 GiB/s even with all cores. And I suspect this chip is smaller, because it's meant to compete with the M3, so it hits thermal issues faster as well.
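Back-of-envelope, using only the figures quoted in this thread (136 GB/s for the 12-core X Elite, 400 GB/s for the 16-core M3 Max), the per-core share looks like this. Purely illustrative, since sustained bandwidth depends heavily on the workload:

```c
// Rough per-core bandwidth shares, using only figures quoted in this thread.
// Purely illustrative; real sustained bandwidth depends on the workload.
#include <stdio.h>

int main(void) {
    double x_elite_gbs = 136.0;  // X Elite, per the post above
    double m3_max_gbs  = 400.0;  // M3 Max, per the post above
    int x_elite_cores  = 12;
    int m3_max_cores   = 16;

    printf("X Elite: %.1f GB/s per core\n", x_elite_gbs / x_elite_cores); // ~11.3
    printf("M3 Max:  %.1f GB/s per core\n", m3_max_gbs / m3_max_cores);   // 25.0
    return 0;
}
```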
 
Hmm, maybe. I'm not sure 6 p-cores on consumer CPUs buys Apple much. I think 4P+6E is more likely, for now. Maybe even 4P+8E for the M5. It costs almost nothing in terms of money, space, or energy, and it frees up some room on the p-cores. Or they do Intel's "what if we had two tiers of e-cores?" thing, even.
The Intel-style special E-cores won't happen as long as Apple's not doing chiplets, at least. That's the reason Intel did it that way: so they can entirely power off the CPU chiplet.

More of any core type is modestly helpful, but this is mostly marketing: If Intel and AMD are all selling 12-20 core laptops in the premium segment, that's a bit of pressure on Apple. Not a lot, maybe, but some. And they seem to have gone with 6-way clusters anyway on the M3 Max, so perhaps that indicates future direction?

On the flip side, this would push up Pro and Max core counts (you can't sell a 6P+6E M4 Pro if the base M4 also has that). Not clear if Apple would be willing to do that - the Pro might have some room to grow, but the Max might be pushing it a bit.
 
Good luck with that 80W. I love my fanless M1 MacBook Air. I recently bought a Windows i9 gaming laptop just to play Call of Duty: Modern Warfare 3, and the fans on that thing run nonstop like a jet engine. So loud and annoying.
 
The Intel-style special E-cores won't happen as long as Apple's not doing chiplets, at least. That's the reason Intel did it that way: so they can entirely power off the CPU chiplet.

Qualcomm has three tiers and doesn't use chiplets.


More of any core type is modestly helpful, but this is mostly marketing: If Intel and AMD are all selling 12-20 core laptops in the premium segment, that's a bit of pressure on Apple. Not a lot, maybe, but some. And they seem to have gone with 6-way clusters anyway on the M3 Max, so perhaps that indicates future direction?

Well, the Max has moved higher-end this time.

 
Good luck with that 80W. I love my fanless M1 MacBook Air. I recently bought a Windows i9 gaming laptop just to play Call of Duty: Modern Warfare 3, and the fans on that thing run nonstop like a jet engine. So loud and annoying.
This is what killed mobile PC gaming for me. The best option for portable gaming is probably a Steam Deck or a Switch, really. I hated carrying that 17" behemoth that screams at you with its fans just to play something simple.

My MBP can play WoW, Baldur's Gate 3, and Vampire Survivor... for the few times I'll game on it.

When I look at ARM on Windows, I still see many talking about how applications will be the uphill battle... and that's without even getting into gaming on an ARM Windows machine. It doesn't seem feasible for the moment at all, so I don't really care about Geekbench scores and the like.
 
Qualcomm's disclaimer cracks me up.

They should have just said, "Windows is still gonna suck, but it won't be quite as slow." 😂
 