Whut??
Apple benchmarked their current processor against a 3-year-old processor and got better results?? OMG, that's great. I bet it beats the current lineup of quantum computers too.
Seriously, why not compare them against the Xeon W-3345, W-3365, or W-3375? Now, that might have been more interesting. Because anyone buying a workstation will not buy new Intel chips, right? Wrong...

As for GPUs, they should compare these against recent GPUs that matter at this price point... this is getting bizarre.
I am still laughing at their joke from the previous event, when they said the M1 Max was beating the 3080.

I’m guessing you didn’t watch the video yesterday? It was also compared against the 12900K Alder Lake and the RTX 3090.
 
Lower their prices? Many gamers pay thousands of dollars for GPU upgrades, and gaming laptops sell pretty well. In fact, Apple is the only laptop brand that does not have a gaming line of machines. There is a lot of money in the gaming industry. If Apple equipped a 14-inch MBP with an M1 Pro/Max variant designed specifically for gaming and threw some millions at the big developers to release their most popular titles for Apple Silicon, then the industry, including smaller developers, would follow. The Mac doesn't need to be the top gaming platform for AAA games, but it should be a decent platform, so that people don't choose a Windows machine only because the Mac is crappy for any gaming activity. Many, many potential buyers are pushed away from the Mac just because you can't play the games your friends play on their PCs.

You overestimate the number of people buying high-end gaming devices. The most popular GPU on Steam is a 2060ti.

By far the biggest market for games right now is mobile, and that is projected to grow year over year. Apple already has the largest share of mobile gaming.

People keep wondering why Apple isn’t pushing gaming on the Mac… the answer is that’s not where the market is going.
 
Ultraaaaaaa Combooooooooo!!!! ??
KI (Killer Instinct: that game and that shout-out on those subwoofers was sick!!)

PS: it was ‘30-hit Ultraaaaaaa Combooooooooo!!!!’

Back on topic …

Intel Core i5-12600K on Windows/Linux
https://browser.geekbench.com/search?utf8=✓&q=Core+i5-12600K

Intel Core i9-12900K on Windows/Linux
https://browser.geekbench.com/search?utf8=✓&q=Core+i9-12900K

And the new champion, the M1 Ultra:
https://browser.geekbench.com/search?utf8=✓&q=Apple+M1+Ultra


Apple wasn’t lying. But it all comes down to the performance of the OS, then the hardware, and the apps used. Would’ve loved this presentation to include a live cook-off like back in the day.
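For a rough sense of how those searches stack up, here's a tiny Python sketch. The scores below are ballpark Geekbench 5 multi-core numbers circulating around launch (they're my assumptions, not official figures; the links above give live results):

```python
# Approximate Geekbench 5 multi-core scores (illustrative, not official):
scores = {
    "Apple M1 Ultra": 23400,
    "Intel Core i9-12900K": 17200,
    "Intel Core i5-12600K": 11700,
}

# Express everything relative to the i5-12600K as a baseline.
baseline = scores["Intel Core i5-12600K"]
for chip, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{chip}: {score} ({score / baseline:.2f}x the i5-12600K)")
```

Point being: on these rough numbers the Ultra leads in multi-core, but single-core and per-app results tell a different story, which is why the raw browser links are worth a look.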
 
You overestimate the number of people buying high-end gaming devices. The most popular GPU on Steam is a 2060ti.

By far the biggest market for games right now is mobile, and that is projected to grow year over year. Apple already has the largest share of mobile gaming.

People keep wondering why Apple isn’t pushing gaming on the Mac… the answer is that’s not where the market is going.
The 2060ti is pretty powerful. But the gaming industry, in terms of users and even revenue, is mostly not on the PC at all.
 
Not when that comes at the cost of something else. If you pick a job based on things that actually matter, such as personal interest in the work, you might not always get to use the best laptop.
Just because something doesn’t “actually matter” to you doesn’t mean it doesn’t matter to the rest of us.
 
Here's a question for those far more knowledgeable about computer/software engineering than myself:

Is there a world in which the Jade 4C die (for the sake of the hypothetical, let's say it's 4x M1 Max) could let a single computer boot multiple macOS instances? I.e. in a post-production environment, when there's a single user they get the full power of the 4x M1 Max SoCs, but you're also able to boot multiple instances of macOS on the one machine, so an editor and a colourist could work in tandem, with each person allocated two of the M1 Max chips.

This would be fairly amazing if possible, as you could have 1-4 computers (or more) inside the same box. So I'm just wondering if there are any fatal flaws in the idea that would make it untenable?
 
Here's a question for those far more knowledgeable about computer/software engineering than myself:

Is there a world in which the Jade 4C die (for the sake of the hypothetical, let's say it's 4x M1 Max) could let a single computer boot multiple macOS instances? I.e. in a post-production environment, when there's a single user they get the full power of the 4x M1 Max SoCs, but you're also able to boot multiple instances of macOS on the one machine, so an editor and a colourist could work in tandem, with each person allocated two of the M1 Max chips.

This would be fairly amazing if possible, as you could have 1-4 computers (or more) inside the same box. So I'm just wondering if there are any fatal flaws in the idea that would make it untenable?

You’d have to do it via virtualization, and there are already solutions for that. But you can’t just partition the cores up the way you’re proposing: only one OS gets to run at the lowest level, and it is responsible for controlling resources (allocating RAM, access to the I/O bus, etc.).
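A toy sketch of that constraint (all names and numbers here are hypothetical, nothing Apple ships): a single privileged layer owns the physical cores and RAM, and each guest OS only ever gets the slice that layer grants it, never direct control of the hardware.

```python
class HostArbiter:
    """Toy model: one privileged layer owns all physical resources."""

    def __init__(self, cores: int, ram_gb: int):
        self.free_cores = cores
        self.free_ram = ram_gb
        self.guests = {}

    def boot_guest(self, name: str, cores: int, ram_gb: int) -> bool:
        # A guest boots only if the arbiter can carve out its share;
        # guests never touch the hardware directly.
        if cores > self.free_cores or ram_gb > self.free_ram:
            return False
        self.free_cores -= cores
        self.free_ram -= ram_gb
        self.guests[name] = (cores, ram_gb)
        return True

# Hypothetical "4x M1 Max" box: 40 CPU cores, 256 GB unified memory.
host = HostArbiter(cores=40, ram_gb=256)
print(host.boot_guest("editor", cores=20, ram_gb=128))     # True
print(host.boot_guest("colourist", cores=20, ram_gb=128))  # True
print(host.boot_guest("third-seat", cores=8, ram_gb=32))   # False: nothing left
```

That arbiter role is what the one lowest-level OS (or hypervisor) plays; the editor/colourist split the original post imagines would each be a guest running on top of it.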
 
That’s what I said, but everyone keeps insisting, because of that old leak, that surely it will just be two Ultras tiled together.
It’s not just because of that, it’s because starting a new chip architecture deployment AT the top end is extremely unlikely for simple economic reasons. Apple’s chip design team is talented, but there’s a reason every chipmaker for over a decade now has gone from “small to large” with new architectures, Apple included.
 
Just when we thought the new M1 Max wasn’t fast enough for our everyday needs: it’s cheaper to buy a 14” MacBook Pro with the M1 Max and use it in docking mode with a monitor, and that way you have a laptop as well. The speed of the M1 Max MacBook and the Max Studio is identical, but one is portable and one is hard-wired.
Currently waiting for a “never arriving” 16” M1 Max 10/32, 64GB, 2TB. The equivalent Studio would cost ≈€500 less, or conversely, for an extra €500 I’ll be getting a truly remarkable display and the added benefit of portability. The M1 Max Studio isn’t appealing.
We don’t actually know that the speed is identical until we get more testing results. In particular, the thermal solution on the Max Studio appears to be beefier than that on the 14” MBP, so you may see less thermal throttling (though there’s very little with the MBP as it is). In any event, these will sell very well for compute farms, business environments where it isn’t desirable for computers to get up and take a walk, etc.
A headless Mac + display combo is also a better fit if you plan to upgrade in the short term, given the constraints implicit in the SoC’s unified architecture. A swappable-internals option would be a great alternative concept, but it’s not going to happen…
I’m so split on what to think about this new machine; I would rather have my MacBook Pro in dock mode with a monitor.
Unless you need (or can afford) the Ultra, or you plan to update/upgrade in the short term, I feel the MBP is the better package.
Although my current needs are well served by the M1 Max, I’m split too, as the Ultra’s performance is so very tempting… so I’m already in self-convincing BS talks about future-proofing, “modular upgrade combo” virtues, etc.😌
 
I’m guessing you didn’t watch the video yesterday? It was also compared against the 12900K Alder Lake and the RTX 3090.
Apple’s propaganda is powerful. But the facts show that the 48-core GPU has weaker performance than a laptop RTX 3060.

[Attached image: M1UltraOpenCLvsMobileRTX3060.png]
 
You’d have to do it via virtualization, and there are already solutions for that. But you can’t just partition the cores up the way you’re proposing: only one OS gets to run at the lowest level, and it is responsible for controlling resources (allocating RAM, access to the I/O bus, etc.).
Why can’t the same be applied to core management, i.e. no discrete core architecture: no separate CPU/GPU cores, just x Neural Engine-style cores that assign the remaining computing power according to the workload of the moment, regardless of its type? Shouldn’t that be the real benefit of a unified architecture?
 
Yes, people choose Windows over macOS for the games, but also for the lower price. Let me split this gaming target group. Group 1: if gaming is all one cares about, then going for the more affordable system is the reasonable choice. Even if Apple were on par with Windows in terms of developer support, etc., I would go with Windows. Apple doesn't want those cost-sensitive users. Group 2: then there are Mac users who love the Mac for what it is and also happen to love playing games. Where is the incentive for Apple to invest in game developers bringing AAA to the Mac when this second group has already bought a Mac?
Assuming that everyone who wants a gaming-capable laptop is price-sensitive and would therefore always choose a Windows PC is a bit foolish. I guess you haven't seen Alienware laptops or the Razer Blade series, which cost as much as a 14-inch MBP.
 
Yeah, because the M1 Max just isn’t enough power
It's not. Neither is the Ultra. I think when the quad-chip Mac Pro drops in the winter, it'll approach matching my 28-core, quad-W6800X-GPU machine... but even then, the 2019 is expandable; whatever the M1 Mac Pro is, it absolutely MUST be expandable to compete with my Mac Pro. Otherwise, it'll just replace my Mac Studio in the music studio (whose delivery date has slipped to April 17th) :/
 
It’s not just because of that, it’s because starting a new chip architecture deployment AT the top end is extremely unlikely for simple economic reasons. Apple’s chip design team is talented, but there’s a reason every chipmaker for over a decade now has gone from “small to large” with new architectures, Apple included.

Yes, but it is curious, at least, that Gurman's rumor correctly predicted the M1 Pro (Jade C-Chop), M1 Max (Jade C-Die), and M1 Ultra (Jade 2C-Die), but incorrectly predicted there would be one more (Jade 4C-Die).

Either Gurman extrapolated something that wasn't there, or they changed plans at some point, or Gurman's Jade chips refer to the M2 (or to both M1 and M2), and the M2 Max has two different interconnects to make an M2 Quad possible.
 
Let’s be real: if you’re in that type of environment, they are probably using Windows. And if it’s a huge organization (e.g. government), they probably have a Microsoft contract. I’ve been in government for over 20 years working with various companies such as Raytheon, Harris, etc., and have seen zero instances of macOS being used.
I'll be real: I run a production company, and we are 100% Mac-based. I could name several others set up the exact same way. In fact, here's a very detailed breakdown another production company did of their Mac-only studio as well.

 
So it's NOT faster than the AMD Ryzen Threadripper 3990X.
It would then be the second-best consumer processor in the world.
The entire computer is less expensive than the Threadripper, though.

But I don't care. Wow... what are people going to do with such a monster of performance...?
I remember a few people in the cinema industry leaving Macs because FCPX lacked features when it was released. Are they going to come back?! Will game developers finally consider the Mac? (The best Mac Studio is 2x as powerful as the PS5.)
It was never about the hardware not being powerful enough. FCPX, when it first released, was replacing Final Cut Pro 7. You have to understand that up to that point, FCP7 was the industry standard here in LA at all of the major studios, and even at small ones like my own. When X came out, it was a drastic rewrite built around an entirely new philosophy of how an edit station could be far more intuitive. However, editors only know what they know, and asking them to learn a brand new way of thinking overnight was a HUGE ask.

Meanwhile, Adobe took that opportunity to strike and released a new version of Premiere Pro that was basically what everyone expected Final Cut Pro 8 to be.

FCPX, however, has made an extremely strong comeback in Hollywood. We use it 90% of the time on commercials, trailers, music videos, and even in film. Will Smith's film Focus was cut in FCPX, as was Whiskey Tango Foxtrot, among others.

Don't get me wrong, Avid is still the primary editor used for major films, but FCPX is definitely coming back and is beyond full-featured.

This system is fantastic and definitely enough for filmmakers, music artists, and the like... however...

I ordered one for my music studio, but it will NOT be replacing my 28-core, quad-W6800X-GPU-driven Mac Pro. That thing clocks in at the rendering power of three RTX 3090s, and I'm talking about real-world rendering. The Mac Studio is simply nowhere near that kind of power. At most it's equivalent to one RTX 3090, and reports are saying that's not in reference to GPU rendering. It's not there yet.

That said, I do believe the M1 Ultra-based Mac Pro at the end of the year will be awesome... BUT, and this is a huge but, it MUST be expandable to replace the Intel Mac Pro. If it doesn't offer expansion, and somehow the ability to upgrade the GPUs, then it will not replace the Intel Mac Pro for people like me and studios like mine.

Interesting read from another studio that relies on Mac Pros as well...

 
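For ballpark context on those GPU comparisons, here's a quick sanity check using commonly cited peak FP32 figures. The numbers are approximate assumptions of mine, and peak TFLOPS notoriously doesn't track real-world rendering:

```python
# Commonly cited peak FP32 throughput in TFLOPS (approximate figures):
tflops = {
    "M1 Ultra (64-core GPU)": 21.0,  # Apple's stated number
    "PS5": 10.3,
    "RTX 3090": 35.6,
    "Radeon Pro W6800X": 17.9,
}

# The "2x the PS5" claim, on paper:
print(tflops["M1 Ultra (64-core GPU)"] / tflops["PS5"])

# Quad W6800X vs a single RTX 3090, on paper:
print(4 * tflops["Radeon Pro W6800X"] / tflops["RTX 3090"])
```

On raw peak numbers the quad-W6800X rig lands around 2x a single 3090; results in actual renderers can diverge a lot from peak TFLOPS, which is exactly why real-world benchmarks matter more than spec-sheet math.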
It’s not just because of that, it’s because starting a new chip architecture deployment AT the top end is extremely unlikely for simple economic reasons. Apple’s chip design team is talented, but there’s a reason every chipmaker for over a decade now has gone from “small to large” with new architectures, Apple included.

So? What does that prove?

We already have the A15, and we will likely have M2 MacBook Airs in a redesigned case, M2 minis, etc., long before the M2 dual-Ultra Mac Pro comes out.
 
Apple’s propaganda is powerful. But the facts show that the 48-core GPU has weaker performance than a laptop RTX 3060.
That is an OpenGL benchmark, and OpenGL is not optimised at all on the M1. If you want an honest comparison, the Apple side should be on Metal. Right now this is comparing apples to oranges!!
 
That is an OpenGL benchmark, and OpenGL is not optimised at all on the M1. If you want an honest comparison, the Apple side should be on Metal. Right now this is comparing apples to oranges!!
No, this is not OpenGL, only OpenCL. When a Metal test is available, we will compare Metal.
 
That’s what I said, but everyone keeps insisting, because of that old leak, that surely it will just be two Ultras tiled together.
I’ve been pondering this proposition, and the argument against it (Hector Martin found only 2 addressable CPUs), and wondered whether an alternative is possible that fits both Apple’s rhetoric and Hector’s discovery.

As Apple says, this is the last of the M1 family (the Ultra), which is two M1 Max dies attached in a way that makes them appear as a single CPU to the OS. So Hector’s discovery might not directly be speaking to this arrangement; it could instead be describing an arrangement where you could address 1 of 2 M1 Ultras.

I’m betting it’s not this, as there are so many other issues at play, but what has been revealed and discovered as fact does allow for it.
 