Have you seen this?
Predictions (which I made on this site) that Apple will release their own GPUs.
Also, user expandable RAM.

It is from back in March 2022. At the 3:30 mark he says he is 100% sure that the Mac Pro would be announced at WWDC (2022). That's where I stopped watching (I didn't really want to start watching in the first place).

I've punted on watching this for a while (it has been cheerleader-advertised on MacRumors before), but regrettably took a look.

He then goes on to talk at length about TSMC CoWoS. But the Ultra doesn't use CoWoS packaging tech; it uses InFO-LSI. A four-die chip would have to, but the Ultra isn't one. A number of folks thought it used CoWoS, but later info showed Apple went another (more affordable) path. UltraFusion is not a very large interposer; it is a small interposer (which is exactly what's in the diagrams at the beginning of the document, so it's more than odd that he goes off on the CoWoS tangent if he actually understood what he was looking at).

The RAM isn't on the UltraFusion interposer in the Ultra. The Max die + UltraFusion + substrate assembly is stacked on another non-chip "interposer"/substrate for the RAM connections.

The keep-cool commentary at the 11-minute mark is only talking about CPU TDP. The vast majority of the chip is GPU; the CPU is misdirection from the real size and scope of the heat issues to be managed. He cleans it up later at the end of the video, but why dig holes in the first place?


The part that talks about a secondary memory controller to 1TB of RAM also states that the secondary RAM source has slower access times than the current Mac Pro's RAM. While better than paging to disk (even an SSD/Optane one), that is highly likely to be quirky at best for general application allocation. Apple doesn't have deep NUMA support, and that memory is definitely going to be "non-uniform". If any real-time, latency-constrained GPU data drifts out into it, that would be an even bigger problematic black hole. If there is a huge hiccup in access time, then using that secondary RAM as a RAM-based SSD might be OK; for largish, mostly static data sets (e.g., static/immutable audio sample libraries, temporary scratch files, etc.) it makes sense. But for apps with multithreaded access and locking synchronization with wide deltas on lock acquisition times... it's dubious that Apple has that working well, or will even make it work well longer term.
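To illustrate the "non-uniform" problem in miniature, here is a toy two-tier allocator sketch. Everything in it (the class, the tier names, the capacities, the spill policy) is invented for illustration; it is not how macOS allocates memory. The point is simply why latency-sensitive data spilling into a slow secondary pool is the awkward case:

```python
# Hypothetical sketch: a two-tier allocator that prefers fast on-package
# RAM and spills everything else to a slower secondary pool.
class TieredAllocator:
    def __init__(self, fast_gb, slow_gb):
        self.free = {"fast": fast_gb, "slow": slow_gb}

    def alloc(self, size_gb, latency_sensitive=False):
        """Return which tier a request lands in, or None if refused."""
        if self.free["fast"] >= size_gb:
            self.free["fast"] -= size_gb
            return "fast"
        # Spilling latency-sensitive (e.g. real-time GPU) data to the
        # slow tier is exactly the problematic case described above,
        # so this sketch refuses rather than spills.
        if latency_sensitive:
            return None
        if self.free["slow"] >= size_gb:
            self.free["slow"] -= size_gb
            return "slow"
        return None

mem = TieredAllocator(fast_gb=128, slow_gb=1024)
print(mem.alloc(64))                           # lands in the fast tier
print(mem.alloc(256))                          # spills to the slow tier
print(mem.alloc(256, latency_sensitive=True))  # refused rather than spilled
```

A real kernel would have to do something smarter than refuse (migrate pages, pin GPU buffers to the fast tier, etc.), which is exactly the NUMA-style machinery Apple has shown no sign of building.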

An SSD with DIMM slots would give the folks who want to tinker with something inside the box something to fiddle with. However, for folks with a >1TB-sized HPC computation that they could throw at a modern Intel/AMD DDR5 ECC pool of completely homogeneous RAM and >64 threads... why bother?






The notion of making the Max die even larger as a "feature" is also a head-scratcher... adding a second UltraFusion connect is going to displace some memory-controller edge space. It's doubtful a bigger die is going to trade off well for the memory edge space consumed. Perhaps the Thunderbolt and some other parts have been shipped off as well (through a "sub2" layer). And adding two more E cores isn't the core-count "colossus" the hype train here is trying to make it out to be. Better than the M1 generation, but that isn't going to be a Threadripper 5000 "killer", or make Ampere Computing's 80/120-core solutions run away and hide in fear.



The 'Lifuka' codename fits the Ultra (2 Max dies). The GPU cores on the second die are "remote" from the GPU cores on the first die, but they both belong to the same Ultra package; the same way Lifuka (the die) is one of the islands of the country (the package) of Tonga. An iMac (in the iMac Pro case) with an Ultra would have made sense; instead, Apple went Studio-with-Ultra as a replacement for the iMac. That old leak was about an iMac, which would have had only embedded GPUs at best. The iMac 24" and Studio that Apple has delivered have no separate GPU-core-only dies on them at all. Those are the iMac replacements. You've seen them ('Lifuka' has shipped). So it's a super head-scratcher why that old "the iMac is gonna have" rumor is still pushing the removable-GPU-card meme up the hill. We're past the super-secret-code-word-decoder stage on the iMac transition products... Apple has crossed the iMac off the transition "to do" list, and the replacements are on the Apple web pages now.

So it's likely not really a "discrete" (as in removable) GPU. Apple has done nothing to move toward supporting GPU cores that live outside of an M-series package.
 
The problem with mixing current technology with the new Apple Silicon is that the speed they are getting comes from the combination of onboard RAM synced with the CPU and storage. It is almost one and the same. If you separate it, speed will probably drop dramatically.

The example is the current Apple Silicon combination with SSD storage. Since the RAM, CPU, and storage (SSD) are all-inclusive and basically one architecture, or really a one-chip (so to speak) concept, you get the insane speeds; but if you bring in current modular RAM or SSDs as an add-on (like the top-of-the-market Samsung blades), speeds and power will still drop.

I notice a major difference when using an external Thunderbolt SSD to house and directly work from the files of my video projects, due to internal storage limitations (I cannot afford Apple's 4-8TB internal SSD pricing). Though the external SSD is fast (getting 2,800+ MB/s in benchmark scores), it bottlenecks in contrast with the speed of the "all-in-one" Apple Silicon architecture (getting 5,400+ MB/s in benchmark scores). Sometimes my M1 Max works so fast compared to the external RAID SSD (or a single external unit) that it glitches or slows down trying to keep up. So, mixing current PCIe SSDs, NVMe add-on blades, etc. with Apple Silicon... unless Apple comes up with another miracle technology to integrate, I highly doubt we will see modularity or the ability to add on and get the same results.

Again... we can only wait and see what happens.
The SSD is NOT on SoC with Apple Silicon. It is a separate chip, either soldered onto the motherboard, or, as per the Mac Studio, is plugged into a socket (and protected by proprietary Apple software tomfoolery so that you can't go and self-upgrade at a fraction of the cost Apple taxes for SSDs).
 
That's a little dubious. If the M1 can't drive the same set of internals as the upcoming Extreme chip, how would it be a complete test mule for the internal chassis, or the chassis overall, if it isn't really matching the operating thermals?




The M1 Mini's max RAM capacity is so limited that Apple is still (2 years after announcing the transition) selling the Intel model, which has better capability there. It doesn't match the Max's RAM capacity either, but that didn't stop them for that desktop.

The M1 iMac 24" max RAM capacity is half that of the iMac 21.5" (16GB vs 32GB). Apple backslid for this desktop class in the transition.

Similarly, the iMac Pro and iMac 27" are retired. The iMac Pro maxed out at 256GB; the last-gen iMac 27" did 128GB. The Max-powered Studio can't touch either one of those. The Ultra manages to eke out a tie with the last regular iMac, but it's mostly a backslide across more of the SKUs. More backsliding.

So trying to match the 1.5TB capacity probably isn't a huge goal for Apple. Apple was quite happy to pile on top of the already high Intel ">1TB" tax that was slapped on the W-32xxM processors. There was a thousands-of-dollars-cheaper option in Intel's lineup that Apple could have picked, and did not. That "extra" money is likely a major contributing driving force. At the Ultra announcement dog-and-pony show, Apple made a comment that the Studio's 128GB was a large amount. Without ECC, going deeper into the triple digits skates out onto thinner and thinner ice.

The Max and Ultra SoCs are largely GPUs with some CPU and other bits wrapped around them. The characteristics of the RAM are highly likely to be skewed toward keeping the GPU cores fed with data. (i.e., note how the CPU core complexes throughout the M1 generation are capped on max bandwidth; in the Max/Ultra they are still at the same levels as the Pro.)




The Intel memory controllers that Apple uses don't work that way in the Intel Mac Pros. The "plug-in memory" iGPUs are nowhere near as performant as Apple's iGPUs. High-end GPUs don't have "plug-in" RAM, for several good reasons.


Xeon SP has some features where you can plug in heterogeneous RAM (Optane DIMMs and "regular" DRAM DIMMs). Intel spent years on Linux kernel updates to support that; it is not trivial. There is very little, to no, sign that Apple has been doing that kind of work. Unified Memory where the memory is not highly uniform causes issues; issues there is a pretty good chance Apple isn't going to engage with for a niche-of-a-niche-of-a-niche computer.

Plug-in memory for a dedicated task like a RAM SSD would be much easier to tack on, as the applications and most of the kernel would largely just see it as a device.

There is a decent chance Apple would take a 256GB or 512GB max capacity as a "win" in a half-sized Mac Pro and just move on. If that covers the 80+ percentile of the current Mac Pro population, it would still get them a viable user base.




LPDDRx shrinks capacity maximums versus DDRx; it doesn't make them higher. The M1 Max/Pro/Ultra are already on LPDDR5. There is no "new" 4th generation here with the M2.

At the lower half of the current Mac Pro's performance coverage, probably yes (it would whip a 12-16 core with a single W6800 MPX module). At the top end, probably not. Against 16-28 Intel cores and two 6800 Duos on embarrassingly parallel code... the Intel Mac Pro will likely win. The "problem" there is Apple charges around $27-32K for that. Likely Apple will point to a better performance/$ ratio (still as expensive, but not quite as high completely maxed out). It would not be surprising to see Apple benchmark against Pro Vega II Duos so they can post a win.

If Apple+AMD do some driver product updates for W7000-class MPX modules, it's even less likely Apple will dominate even if you throw tons of money at it. Apple has a GPU driver problem for macOS on M-series: it doesn't scale out to multiple instances. Two years in, Apple has presented zero evidence that they are working on the issue. None. The biggest low-level driver announcement at WWDC 2022 was that the new modern drivers also work on the iPad Pro. That's where Apple's focus has been.
You're throwing around a lot of negativity and worst case scenarios there, my friend.
 
I never liked the idea of calling something "Max" when they knew that something faster would come out in the same family. If they had to have a chip faster than Max, it should be M1 Steve.
 
The SSD is NOT on SoC with Apple Silicon. It is a separate chip, either soldered onto the motherboard, or, as per the Mac Studio, is plugged into a socket (and protected by proprietary Apple software tomfoolery so that you can't go and self-upgrade at a fraction of the cost Apple taxes for SSDs).
Yes, the “SSD” is separate from the SOC, but that you can’t do your own upgrade is not entirely up to tomfoolery.

Apple isn’t putting SSD drives on the motherboard. It puts raw NAND chips there. All of the control circuits are in the motherboard. These are not standard chips and not full SSDs. You are not likely to just get one from Ali Baba. In addition, the data is encrypted and stored distributed across multiple chips. If you were somehow able to solder a new NAND chip of the proper sort there, it wouldn’t work with the other chips as they would all have to be initialized together with that particular SOC. If you were to move the NAND chips to another computer, they would not work with the new SOC as they are cryptographically signed to only work with the one that initialized them.

Normally when someone has physical access to a computer, they can eventually gain full control over its drives, often by remounting them on another motherboard that has been updated to give access. These changes by Apple mostly block that kind of exploit due to how they are encrypted and tied to a particular SOC. The downside of this enhanced security is less flexibility.
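A toy sketch of why that binding works. To be clear, this is not Apple's actual Secure Enclave scheme; the device UIDs, the KDF, and the SHA-256-counter stream cipher here are all invented stand-ins. The point it demonstrates is just that when the volume key is derived from a device-unique secret, the same NAND contents decrypt to garbage on any other device:

```python
import hashlib

def derive_key(device_uid: bytes) -> bytes:
    # Stand-in for a hardware KDF rooted in a device-unique secret
    # (the real thing lives in silicon and never leaves it).
    return hashlib.sha256(b"volume-key" + device_uid).digest()

def xor_stream(key: bytes, data: bytes) -> bytes:
    # Minimal SHA-256-counter keystream; encrypt and decrypt are the
    # same XOR operation. Illustrative only, not production crypto.
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(b ^ k for b, k in zip(data, out))

plaintext = b"user data on the NAND chips"
ciphertext = xor_stream(derive_key(b"device-A-uid"), plaintext)

# Same device: the derived key matches, so the data decrypts fine.
assert xor_stream(derive_key(b"device-A-uid"), ciphertext) == plaintext
# NAND moved to another SoC: same ciphertext, different UID, garbage out.
assert xor_stream(derive_key(b"device-B-uid"), ciphertext) != plaintext
```

That second assertion is the whole "remount the drive elsewhere" exploit being blocked: the attacker gets the bits, but never the device-bound key.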
 
Well, I need the fastest computer Apple can provide, so if Mac Pro will be significantly faster than Studio Ultra, I will buy it instead of Studio.

I do not need PCI slots though, so if performance is comparable, I will go with Studio Ultra.
You know, it's interesting... I have a very hard time imagining an AS Mac Pro 8,1 that will get me to move on from my 7,1. I just started a separate thread about this because I'm curious about other people's thoughts on this very thing.

But if you're just doing music, I think the Studio Ultra is just fine. My system is incredible for music as well, but primarily it's for GPU-intensive 3D work. The Ultra, when I tested it, was really great for music; my music projects definitely go up into the 100s of tracks, most of them stacked with plug-ins and effects, and it ate them all right up. You can't go wrong either way, but the real question is whether you want expandability. If you don't care about that option, get the Studio. If expandability and future-proofing are important to you, get the Pro 7,1.
 
Pretty sure we already went down this road in 2013.
I've wondered whether Apple had Arm plans way back then, and introduced the 2013 MP as a way of 'conditioning' the market to expect unupgradable systems as time went on. I wonder whether its rejection threw a bit of a spanner in the works!
 
Normally when someone has physical access to a computer, they can eventually gain full control over its drives, often by remounting them on another motherboard that has been updated to give access. These changes by Apple mostly block that kind of exploit due to how they are encrypted and tied to a particular SOC. The downside of this enhanced security is less flexibility.
"That is some great security innovation" - someone who has never heard of Bitlocker.
 
"That is some great security innovation" - someone who has never heard of Bitlocker.

::Looks up "BitLocker". It's some Windows thing.::
Please explain?
 
It makes sense... why jeopardize the excellent sales of the Studio?
And... it takes quite a lot of time to develop any silicon, even if it just looks like a simple repeat to us.
Just have the Mac Mini come with only the M2 and M2 Pro options, and reserve the Max, Ultra, and Extreme chips for the Mac Studio and Pro. They still need to come up with something to replace that remaining Intel Mini. Otherwise, I may as well end up dropping a lot more dough on a Mac Studio that might be more than I need.
 
The SSD is NOT on SoC with Apple Silicon. It is a separate chip, either soldered onto the motherboard, or, as per the Mac Studio, is plugged into a socket (and protected by proprietary Apple software tomfoolery so that you can't go and self-upgrade at a fraction of the cost Apple taxes for SSDs).

What Apple calls an "SSD Module" is NOT an SSD. It is solely NAND chips plus a custom, minimal communication-channel chip (not even a standard PCIe NVMe protocol comm channel). There is no SSD controller there at all; all storage and no "brains". So it is not an SSD, it is just "half" of an SSD. Wear leveling, logical block address allocation decisions, and the wide host of "smart" things an SSD has to do to pragmatically work correctly are totally absent from that module. A more accurate term Apple could have used is "SSD daughtercard", but I guess that somewhere wouldn't meet political correctness criteria. What Apple is really saying is that it is a "module of an SSD".
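For a sense of the missing "brains", here is a minimal flash-translation-layer sketch. It is purely illustrative (no real controller is remotely this simple, and the class and policy are invented), but it shows the logical-to-physical remapping and wear leveling that a proper SSD controller does and a bare NAND module cannot:

```python
# Hypothetical toy FTL: remap logical blocks to physical NAND blocks,
# steering each rewrite to the least-worn free block.
class TinyFTL:
    def __init__(self, physical_blocks: int):
        self.wear = [0] * physical_blocks  # erase count per physical block
        self.map = {}                      # logical block -> physical block
        self.free = set(range(physical_blocks))
        self.store = {}                    # physical block -> data

    def write(self, logical: int, data: bytes) -> int:
        if logical in self.map:
            # Retire the old physical block; erasing it costs a wear cycle.
            old = self.map.pop(logical)
            self.wear[old] += 1
            self.free.add(old)
            self.store.pop(old, None)
        # Wear leveling: route the write to the least-worn free block.
        target = min(self.free, key=lambda b: self.wear[b])
        self.free.discard(target)
        self.map[logical] = target
        self.store[target] = data
        return target

    def read(self, logical: int) -> bytes:
        return self.store[self.map[logical]]

ftl = TinyFTL(physical_blocks=8)
first = ftl.write(0, b"v1")
second = ftl.write(0, b"v2")  # same logical block, different physical block
```

In Apple's split, all of this bookkeeping lives on the SoC side, which is exactly why the NAND module alone is "all storage and no brains".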
 
I've wondered whether Apple had Arm plans way back then, and introduced the 2013 MP as a way of 'conditioning' the market to expect unupgradable systems as time went on. I wonder whether its rejection threw a bit of a spanner in the works!

Probably not.

Intel was on track with their tick/tock model in the 2010-2013 time frame. It was Apple that completely missed the boat in 2012 (every major workstation vendor went to Xeon E5 v1, and Apple served up leftovers from 2011). That wasn't an "ARM plan". The Thunderbolt 2 on the MP 2013 was a bit of the tail wagging the dog and led to some transition problems.

The other huge missed boat was the GPU thermal wars from 2012 to the present. The upcoming next-gen AMD and Nvidia top-end single GPUs consume more power than an entire entry-level Mac Pro from the 2010 era (CPU + GPU + drives + everything). 500W for a single GPU was not something on their future roadmap. In 2010-2012 Apple had not taken over the GPU reins from Imagination Tech; they were still on the "perhaps we'll do this ourselves" path. [Apple has probably grown increasingly intolerant of the power-consumption trends of GPUs over time as they got more skilled at doing it themselves internally.]


OpenCL got partially hijacked, and Apple forked onto Metal (technically more Imagination/Apple GPU related than ARM related, but that part is coupled to the SoC).

As for the prolonged stagnation of the 2013 model (out into 2015-16, yes): if Apple had seen the Arm transition earlier, then the "T2" likely would have shown up earlier also. Apple probably started the T2 somewhere around 2014-2015 or so. The 2013 design was likely largely locked years before that (2011-2012 time frame).


If GPU thermals had grown only modestly, Apple had made a more focused OpenCL effort (to put more workload onto the compute GPU), Apple and Nvidia had not gotten into a multisided feud, and AMD/Intel hadn't had fab hiccups at 14nm, then the MP 2013 would have had a less bumpy ride (and seen one update before it ran out of gas with the non-hardcore modular part of the Mac Pro submarket).

Part of the MP 2013 was a transition to the iMac Pro 2017.

The Thunderbolt progression would have come whether Apple's ARM work had progressed faster, slower, or at the speed it did. Apple's desire to have a literal desktop (with minimal footprint) probably would have come regardless of that speed as well.

And then there was (is?) the Apple Industrial Design factor, where they don't want to simply design "mere containers" for other people's designs. If they have to rigidly follow l×w×h dimension restraints from an external committee, that puts external demands/constraints on the creative process. That whole "can't innovate, my *ss" moment was a substantive driver too, long before Apple got up on stage at WWDC 2013. There was too much coverage about how Apple was doomed now that Steve was gone. That played a much bigger role than anything the ARM SoC was doing.
There is a tie-in in that they did have more parameter control with their own SoC: when all the super-cool projects at Apple are mobile, that is going to seep back into the desktop designs if you let the designers off the chain to run wild, with high organizational value placed on folks who can package the most tech into the smallest package.


The Apple SoC probably contributed more to the inaction during 2015-2016, when Apple didn't adjust and just did a whole lot of nothing to correct for their missed expectations; but not to the original design itself. They stuck with the design longer because they did have other "more interesting" things to do.
 
I wonder if they will provide an upgrade path for the Intel Mac Pro. Galling to have such a beautiful paperweight.
Before, I would have thought that Apple would continue the Intel Mac Pro, as it still has some advantages in the market for users... but it will probably depend on sales and Tim's spreadsheet. Apple could put it out to pasture in 2024 (5 years after release) with no guilt, as their general support policy is approximately 5-7 years (depending).

Now… not so sure. We will know soon, when the next M-class Mac Pro is released. Apple may want to totally free themselves from any trace of Intel to focus on their own SoC. Eventually that is the plan, of course, but maybe sooner rather than later…
 
I have an 11-year-old who spends most of his Mac time on Steam, desperate to upgrade from dad's old 15" MBP to a new M1 Mini. My question: has there been any scuttlebutt about the timing of an M2 Mini? We are likely looking at the holiday shopping time frame. Thanks.

P.S. - Sorry, question #2 (as I'm not up on gaming): is an M1 capable of running Steam games sent to an external monitor? What headaches am I likely missing? And sorry, a Windows-based PC is not an option in this household. 😜
I would bet on seeing a Mac Mini spec bump by December 2022, to the same M2 chip as the Macbook Air / 13” Pro at least, if not ideally offering an option for M2 Pro or better. Seems like the base Mac mini is analogous to the 13” MBP where they can easily continue with the same case design at the entry level price point for that bread and butter best seller.
 
Apple still offers an Intel Mac Mini, so I think they will have a more powerful Mac Mini on the way, or at least I hope so. There is no other need to keep that Intel around other than for their 2yr transition period.
 
Yes, the “SSD” is separate from the SOC, but that you can’t do your own upgrade is not entirely up to tomfoolery.

Apple isn’t putting SSD drives on the motherboard. It puts raw NAND chips there. All of the control circuits are in the motherboard. These are not standard chips and not full SSDs. You are not likely to just get one from Ali Baba. In addition, the data is encrypted and stored distributed across multiple chips. If you were somehow able to solder a new NAND chip of the proper sort there, it wouldn’t work with the other chips as they would all have to be initialized together with that particular SOC. If you were to move the NAND chips to another computer, they would not work with the new SOC as they are cryptographically signed to only work with the one that initialized them.

Normally when someone has physical access to a computer, they can eventually gain full control over its drives, often by remounting them on another motherboard that has been updated to give access. These changes by Apple mostly block that kind of exploit due to how they are encrypted and tied to a particular SOC. The downside of this enhanced security is less flexibility.
What an absolute crock. An encrypted drive is an encrypted drive. My external encrypted drive is not ever going to have someone "gain full control over" it, even though it is a single, physical, standard SSD. The data is encrypted, and the encryption is either crackable or it isn't. Until it's decrypted, the data is just a random jumble of meaningless 1s and 0s.
 
What Apple calls an "SSD Module" is NOT an SSD. It is solely NAND chips plus a custom, minimal communication-channel chip (not even a standard PCIe NVMe protocol comm channel). There is no SSD controller there at all; all storage and no "brains". So it is not an SSD, it is just "half" of an SSD. Wear leveling, logical block address allocation decisions, and the wide host of "smart" things an SSD has to do to pragmatically work correctly are totally absent from that module. A more accurate term Apple could have used is "SSD daughtercard", but I guess that somewhere wouldn't meet political correctness criteria. What Apple is really saying is that it is a "module of an SSD".
Right, and the SSD modules that sit in the SSD slots in the Mac Studio?
 
Well, I'm sure you know better than Apple or actual security experts, so...
Mate, I know enough about encryption to know that it is the encryption algorithm and the quality of the application of it that matters, not some silly extra imaginary hoops you can create.

Do you understand even the slightest thing about what HTTPS is, or PGP? You do realise that this is encrypted data simply sent over the internet, passing through multiple points at which it can easily and trivially be intercepted and copied? And yet it is still completely and utterly useless to the interceptor, unless they have the ability to crack the encryption. No silly hardware charades required.
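The point in miniature, as a toy sketch (this is not how HTTPS or PGP actually work; both layer public-key exchange over symmetric ciphers, and the pad here is just the simplest possible stand-in): with a one-time pad, an interceptor who copies the ciphertext in transit learns nothing without the key.

```python
import os

def otp(key: bytes, data: bytes) -> bytes:
    # One-time pad: XOR with a random key of at least the same length.
    # Applying it twice with the same key recovers the plaintext.
    assert len(key) >= len(data)
    return bytes(d ^ k for d, k in zip(data, key))

message = b"secret payment details"
key = os.urandom(len(message))   # shared only by sender and receiver

ciphertext = otp(key, message)   # this is all an interceptor ever sees
recovered = otp(key, ciphertext) # receiver with the key gets it back
assert recovered == message
```

Without the key, every possible plaintext of that length is equally consistent with the intercepted bytes, which is the strongest version of "a random jumble of meaningless 1s and 0s".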

The rubbish you have described is pure control freakery, non-repairability, and profiteering. Nothing more, nothing less.
 