I don't think that's correct. We've known about the interconnect and the Jade 2C-Die since early December.


As I stated, yes, the rumour said that an interconnect existed. What I'm saying is that the hypothesized method of interconnecting was incorrect; sourced quote from that article:

Frederic Orange shows the M1 Max Duo as a multi-die MCM, with the M1 Ultra using chiplet design. Since the interconnect bus only exists on one side of the silicon, the M1 Max Quadra would likely require an I/O die joining two interconnected pairs of SoCs.

No separate I/O chip or die was used by Apple. Srouji, our silicon sugar daddy, lol, clearly stated the disadvantages of the previously hypothesized connections.
 
Yes, but the method of connection was not understood, nor was it ever rumoured. In fact, the kind of interconnect that was rumoured was totally destroyed by Srouji, because they had a better solution. That's the innovation of Apple.
Not sure what you mean by that. It appears to be a very simple point-to-point crossbar, which is exactly what it looks like on the die.
 
As I stated, yes, the rumour said that an interconnect existed. What I'm saying is that the hypothesized method of interconnecting was incorrect; sourced quote from that article:



No separate I/O chip or die was used by Apple. Srouji, our silicon sugar daddy, lol, clearly stated the disadvantages of the previously hypothesized connections.

That I/O chip was thought to be for connecting two Ultras, not for connecting two Maxes.
 
But you can actually look at the die photos, and at the work the Linux folks have done reverse-engineering how interrupts work, and you can tell that there is no way for an M1 Max to connect to more than one other M1 Max.
I fail to see how that invalidates a leak that has given us the entire chip line for the first two years of Apple silicon to date, though.

I'd bet that there is going to be a Jade 4C die, but it may be architected in a way that isn't understood today. I make no predictions other than the Mac Pro having, at the top end, an SoC package that is something other than just the M1 Ultra. It may be two separate socketed SoCs. They may have another technology to interconnect each Ultra. It may be something I cannot imagine.
 
I fail to see how that invalidates a leak that has given us the entire chip line for the first two years of Apple silicon to date, though.

I'd bet that there is going to be a Jade 4C die, but it may be architected in a way that isn't understood today. I make no predictions other than the Mac Pro having, at the top end, an SoC package that is something other than just the M1 Ultra. It may be two separate socketed SoCs. They may have another technology to interconnect each Ultra. It may be something I cannot imagine.

I’ll believe the very clear statement made yesterday that Ultra is the final member of the M1 family rather than some rumor that could easily have referred to plans that are no longer part of the product map.
 
I’ll believe the very clear statement made yesterday that Ultra is the final member of the M1 family rather than some rumor that could easily have referred to plans that are no longer part of the product map.
I don't think it is clear at all. They sent mixed signals.
 
Well, as usual, Apple does not update Intel CPUs to the latest generation for a very long time, and then compares those already-obsolete Intel chips to the latest-gen Apple product. And then, what a surprise, the Apple chip is faster. Not to mention that in a proper workstation nowadays you can have two sockets with a 64-core (128-thread) CPU per socket and up to 4TB of RAM.

Max 128GB of RAM in a workstation in the year 2022.... same amount as my 2010 Mac Pro.... Yawn.
And yet you're effectively comparing obsolete 1333MHz DDR3 ECC to unified memory in a chip from 2022...
 
Let's hope that they will *not* make the Mac Pro smaller (and hence louder, with less thermal dissipation capacity and less expandability). To be honest, I never understood why on Earth anyone would care to make a workstation or desktop PC tinier, an iMac thinner, etc. Those are not portable computers, and smaller size/thickness brings only disadvantages.
As someone who has dragged a Mac Pro from one side of this country to the other, and used them in hotel rooms and in the back of vans, I disagree.
 
I’ll believe the very clear statement made yesterday that Ultra is the final member of the M1 family rather than some rumor that could easily have referred to plans that are no longer part of the product map.

Correction: John Ternus actually did say, "we're adding one last chip to the M1 family."
 
This is one of those chicken-and-egg type statements.

If there are no high-end games available, you won't see the demand for them last long enough to get there, which is the exact same situation that went on with Mac OS X for a while.

So at the end of the day, Apple's iOS and iPadOS have several high-end games, enjoyed by a lot of users, so there is a market of consumers who like to game on Apple devices. Apple saw fit to address that market by allowing iOS apps to run on M1-based, or any Apple silicon-based, CPUs. That means there is a market; it's just a matter of finding the target audience to play those games on macOS, and that research needs to be done by the studios or programming houses. Simply saying there's no money in it is false, because if we look at games on iOS and iPadOS, users tend to pay more for those games than on Android. There is a market, there is money in it; it just needs to happen.
Macs have had the hardware to play games for years. And, nothing.

The Steam survey figures show how bleak it really is. Big games' day-1 sales dwarf the entire macOS platform ownership on Steam. macOS would be a drop in the ocean.

You can try to argue that there would be more gamers if there were more games. But nobody is paying MBP prices to play games when you can get suitable performance much cheaper.
 
Macs have had the hardware to play games for years. And, nothing.

The Steam survey figures show how bleak it really is. Big games' day-1 sales dwarf the entire macOS platform ownership on Steam. macOS would be a drop in the ocean.

You can try to argue that there would be more gamers if there were more games. But nobody is paying MBP prices to play games when you can get suitable performance much cheaper.
Again I agree on the hardware front.

The sales figures I'm not aware of so I'll take your word for it.
- sales figures only show units, profits, and maybe the platform, but they don't answer the major question: why haven't ALL top-tier games been created for macOS?

I suspect 'Mac user' demand isn't the issue, although I'm sure that demand has died down to extremely low or non-existent numbers over the last 10 years, and the reason simply is: with nothing available, users moved on to consoles (PS/Xbox/Switch) or smartphones and have basically given up hope.

Regarding MBP prices for games:

Canadian store fronts and pricing: Zephyrus M

Amazon CA: $2052

16GB RAM
512GB Storage (which is slower than the current 14/16" MBP storage on all options).

- sidebar pricing on that page shows

Newegg CA: $2399
>> 1TB of storage.
NOTE: there seems to be confusion about which model you're getting
  • 16.0" 2560 x 1600 16:10, 165 Hz, 500 nits, anti-glare
  • 14.00" x 9.60" x 0.80"-0.90" 4.20 lbs.
^ so which model is at this price, the 14" or the 16"?


Side bar on this page shows
2593.44 (Deal Targets)
2749.99 (Memory Express)
2752.99 (OneDealOutlet Canada).

Apple CA: $2499.
14" MBP
8-core CPU, 16-core GPU on SoC
16GB unified Memory,
512GB


So pricing is VERY comparable (outside of the USA), and again depends on the model, screen size, and the performance you're getting with what you choose.

The M1 Pro and M1 Max have proven their performance, and it is very decent.

I suspect the REAL issue is that most developers of top-tier games (before the recent major acquisitions leading to consolidation: Sony buying Bungie, Microsoft buying Activision) possibly don't have the skills to code for macOS, including for Metal, to get the performance needed.
Maybe parent company restrictions - based on acquisitions?
 
Who said anything about needing them? I was talking about picking a profession more suitable to being able to use the best tools.
This is getting silly. Picking a profession based on wanting to use a particular operating system and assuming Apple's products are somehow, by definition, always the "best tools". It's just a computer. Seriously. A means to an end. Nothing more.
 
When you spend all day using a tool, you may as well make it a tool you like.
The tools I use are the codes I run or write, not the OS. The way things are going with macOS, Linux is far superior for my purposes, from workstation to supercomputer. Still annoyed at the whole "OpenGL deprecated" thing. We'll pretty much never buy Macs for work.

Makes a nice home computer, though. Typing on one now.
 
Worse than useless, it's misleading.

Most benchmarks are merely useless because they're generally measuring "turbo speed".

A fair benchmark would "warm up" for 2 minutes, then run for 10 minutes and see how many loops (of slightly modified operations, to prevent the RAM-cache cheating seen in FPS game benchmarks) it can perform.


I don't use Premiere every day, but when I use it, I use it all day long. Any time I take a break from rendering, the first couple of minutes are always faster than the next couple of hours, because the machine heats up. Yet almost every benchmark I've ever seen completes within 5-90 seconds, rendering the whole simulation rather pointless.

Better yet, give me a benchmark that warms up for 30 minutes and then renders from minutes 31-60. That'd be more accurate and useful. The difference between 5 and 10 seconds being "twice as fast" matters very little compared to something that takes 2 hours vs. 4 hours, or worse, 20 hours vs. 40 hours.
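The warm-up-then-count scheme described above could be sketched roughly as follows; this is a minimal illustration, and `sustained_benchmark` with its default 2-minute/10-minute windows is a hypothetical helper, not an existing tool:

```python
import time

def sustained_benchmark(work, warmup_s=120.0, measure_s=600.0):
    """Hypothetical sketch: run `work` during a warm-up window (results
    discarded), then count how many loops complete in a measured window."""
    seed = 0
    deadline = time.monotonic() + warmup_s
    while time.monotonic() < deadline:   # warm-up: let the machine heat up
        work(seed)
        seed += 1                        # vary the input to defeat cache cheating
    loops = 0
    deadline = time.monotonic() + measure_s
    while time.monotonic() < deadline:   # measured window
        work(seed)
        seed += 1
        loops += 1
    return loops
```

Comparing machines on the returned loop count after a long warm-up captures sustained (thermally throttled) performance rather than the short turbo burst most benchmarks measure; the 30-minute variant is just larger values for the two windows.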

GeekBench is just a glorified pissing contest.

But given that a large number of people aren't putting their CPUs under full load for hours, a synthetic benchmark that emulates a race to idle isn't entirely useless. Premiere Pro, DaVinci, and FCPX will likely use the hardware encoders, so they tell us nothing about GPU/CPU/system performance. There's a place for all these benchmarks, to provide context around every likely scenario.

I personally like the SPEC benchmarks because they give a much greater indication of potential performance… and that's why I'll wait for the AnandTech reviews.
 