Yes, guess why? Because that legacy code would have caused problems during the transition to Apple Silicon that Apple had on its internal roadmaps for years. Case in point: Rosetta. That the current one works as well as it does seems amazing until you consider that Apple set it up to make its job as easy as possible. Rosetta today doesn’t have to translate ALL of the cruft that’s been created in x86 over the years.


No, it wouldn't. There are a bunch of alternatives Apple could have pursued.

* Create a separate Rosetta layer, Rosetta-x32.
* Contribute to the Wine project and integrate it into the system.
* Contribute to Box86 to allow it to run 32-bit x86 code and integrate it into the system.
* Add hardware acceleration for 32-bit emulation.

All these approaches have their own strengths and weaknesses, but the point is that the 32-bit code could be isolated into a separate project. This IS the approach Microsoft has used. Their 32-bit and 64-bit translation layers are related, but separate projects. They can be enabled and disabled separately, and removing one doesn't necessarily affect the other.
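To make the "32-bit support can live in its own box" idea concrete, here is a minimal, hypothetical Python sketch (not anything Apple or Microsoft actually ships) that classifies a thin Mach-O binary as 32-bit or 64-bit from its magic number alone; that is the kind of cheap check a separate translation layer could use to decide whether it even needs to get involved. The magic numbers are the real Mach-O constants, but universal (fat) binaries would need their per-architecture headers parsed as well, which this sketch skips.

```python
# Hypothetical sketch: classify a thin Mach-O binary as 32-bit or 64-bit
# by its magic number. Fat (universal) binaries are only detected, not parsed.
import struct
import sys

MH_MAGIC    = 0xFEEDFACE  # 32-bit Mach-O, same byte order as the reader
MH_CIGAM    = 0xCEFAEDFE  # 32-bit Mach-O, byte-swapped
MH_MAGIC_64 = 0xFEEDFACF  # 64-bit Mach-O, same byte order as the reader
MH_CIGAM_64 = 0xCFFAEDFE  # 64-bit Mach-O, byte-swapped
FAT_MAGICS  = {0xCAFEBABE, 0xBEBAFECA, 0xCAFEBABF, 0xBFBAFECA}  # universal binaries

def classify(path):
    with open(path, "rb") as f:
        (magic,) = struct.unpack(">I", f.read(4))  # read the first 4 bytes as one word
    if magic in (MH_MAGIC, MH_CIGAM):
        return "32-bit Mach-O"
    if magic in (MH_MAGIC_64, MH_CIGAM_64):
        return "64-bit Mach-O"
    if magic in FAT_MAGICS:
        return "universal binary (parse the fat_arch entries to see which slices exist)"
    return "not a Mach-O file"

if __name__ == "__main__":
    print(classify(sys.argv[1]))
```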

What Apple really wants is not only to streamline their system, but also to force people to develop 64-bit code. That has worked, but it has the downside of making macOS less versatile. If it weren't for emulation / cloud computing, many people wouldn't even consider Apple Silicon, no matter how powerful it is – so much so that Tim Cook advertised virtualization when he announced Apple Silicon.
 
I don’t doubt that Apple’s going to go through changes. Someone used to the “Apple II” Apple wouldn’t get how Apple makes more money on services than it ever made on the Apple II, which isn’t even sold anymore. Actually, that’s a good parallel. Apple, if run like most of the rest of the industry, would have JUST removed the final bits of Apple IIGS legacy code from the latest OS in 2021. You could still use a legacy IIGS installer, but it would route you to the appropriate 32-bit installer. :) And the iPod would be entering the terabytes of storage by now.

Steve Jobs died in 2011, but in 2018 we still had Apple making decisions focused on the future (to the detriment of current users) by ending 32-bit support. If this is part of Apple’s culture, it’s the thing that will keep them future facing and relevant primarily TO that future. I would not be surprised if there’s a roadmap to the iPhone’s demise just as there was a roadmap to the iPod’s demise when it was at the top of its game. When that happens, iPhone people are going to be pissed and wondering if Apple will be relevant in another 5 years…
Yeah...no.

Not sure what point you're trying to make, in all honesty. If you think Apple will find a way because of its "future facing" culture, then definitely no. Apple doesn't know the future any better than you or I do. And just like us, it tries to control as much as it can, but to say Apple planned the demise of one of its products five years out is an exaggeration. It only seems that way because hindsight is 20/20. It's market forces that Apple is reacting to. They can't plan as much as you think they are capable of. Did they plan the reaction to the butterfly keyboard too? Did they plan the delay in the transition to Apple Silicon? Did they plan their failure to ship an Apple Car despite billions in investment over almost a decade? Give me a break, man.

And the example of the iPod is just a really bad take. Apple allowed the iPod to die because every single function of an iPod was subsumed by the iPhone. It has nothing to do with being "future facing".

And the iPhone will only die when another device can fully subsume its functions, not because Apple has a "roadmap to the iPhone's demise".

Steve Jobs wasn't even sure the iPhone would have the kind of success it had when it was first introduced. How was he going to plan the demise of the iPod?
 
Interesting comparison of the $599 Mac mini, which beats the $5,999 Mac Pro on some benchmarks:

 
I've only been reading 'Apple is doomed' (to irrelevance or marginalisation if not extinction) stories/opinions for over three decades. Admittedly, the period 1993-1997 got more than a bit ropey, and given the state of affairs, no one would have predicted the near-miraculous transformation that followed. But losing Steve didn't halt growth; on the contrary, Cook accelerated it. Jony Ive was simultaneously held responsible for a string of terrible design decisions and blamed for leaving Apple without inspiration. And yet the growth continues. You might not be wrong about future products and decision-making, but Apple now has a rather more robust corporate structure & business model than when it really, really was in trouble.
 
I've only been reading 'Apple is doomed' (to irrelevance or marginalisation if not extinction) stories/opinions for over three decades. Admittedly, the period 1993-1997 got more than a bit ropey, and given the state of affairs, no one would have predicted the near-miraculous transformation that followed. But losing Steve didn't halt growth; on the contrary, Cook accelerated it. Jony Ive was simultaneously held responsible for a string of terrible design decisions and blamed for leaving Apple without inspiration. And yet the growth continues. You might not be wrong about future products and decision-making, but Apple now has a rather more robust corporate structure & business model than when it really, really was in trouble.
Maybe you can enlighten us uninitiated about what happened to Sony then and why what happened to Sony won't happen to Apple.

The doomsaying has been relatively quiet ever since iPhone sales took off. Just look at how many people want in on AAPL.

I'm not saying Apple will disappear. I see it becoming the next Sony (which, in all honesty, isn't all that bad) if it doesn't find another blockbuster product before the iPhone cash runs out. The gap between Apple Silicon and its Android counterparts will eventually be closed. There is only so much you can stuff into a mobile phone before it ceases to be a selling point. And there are only so many new functions you can add to a mobile OS before people stop caring. You really don't have to look far. Just look at how many people around you are replacing their desktops and how quickly they're replacing them, and you will get a picture of how it will go for the iPhone when its time comes.

Apple probably will have deserved it too.
 
Until my current machine, I have bought the minimum I needed and then bought extra RAM/storage as my needs grew.
I have done this since my first TRS-80 clone back in the 1980s.
Is your current machine a Mac Pro 2019? Do you own one? If not, how do your purchasing patterns matter to Apple’s Mac Pro design?
I have 2 Mac Minis that I run as servers at home, one a 2009, the other a 2011.
So two machines that are so old that Apple does not even support them any more. Got it. Again, how is this relevant to the discussion of the new Mac Pro?
I have about 50TB of storage attached to them, and I have upgraded the RAM in them and replaced the old HD with a much faster bigger SSD.
Great. Still not seeing the connection to the new Mac Pro.
I am looking at replacing these with 2nd hand 2014 models, not new.
So Apple will get no revenue from you, and they should care about your needs why? If you are buying 8-9 year old machines, you can buy 8-9 year old machines with specs that meet your needs, and since they are completely unsupported, they do not really matter to Apple.
 
Would be nice if the new Mac Pro could feature 'M2 cards': expandable modules outside the PCI standard that would support 2-4 CPU/GPU/memory combos. Imagine the potential.
 
Agreed with most points. But it was really Cook, not Jobs. https://bit.ly/3DnUFwa

The "St. Steve" disinfromation campaign is rather weak. In 2012.

"...
As the Times article pointed out, that's just what President Obama asked Steve Jobs when they met last February. Jobs answered bluntly: “Those jobs are not coming back.”
..."

All Cook did in 2016 (4 years later) was not reverse what Jobs approved and set up. The Mac Pro only got moved back to the USA after Jobs died. Jobs was not an early adopter of "slave labor monitoring of contractors" either.
 
Maybe you can enlighten us uninitiated about what happened to Sony then and why what happened to Sony won't happen to Apple.

The doomsaying has been relatively quiet ever since iPhone sales took off. Just look at how many people want in on AAPL.

I'm not saying Apple will disappear. I see it becoming the next Sony (which, in all honesty, isn't all that bad) if it doesn't find another blockbuster product before the iPhone cash runs out. The gap between Apple Silicon and its Android counterparts will eventually be closed. There is only so much you can stuff into a mobile phone before it ceases to be a selling point. And there are only so many new functions you can add to a mobile OS before people stop caring. You really don't have to look far. Just look at how many people around you are replacing their desktops and how quickly they're replacing them, and you will get a picture of how it will go for the iPhone when its time comes.

Apple probably will have deserved it too.

I guess my question about this type of doomsaying is: where do you think it'll lead? Even if we entertain the idea that they are on the path to doom, there's no possible future where desktop Mac Pros are the key to them averting that doom. That doesn't mean they won't ship some great models, or that we won't all be surprised by what they deliver, but there's no way the Mac Pro becomes the next major driver of growth or one of the top priorities of the company.
 
The "St. Steve" disinfromation campaign is rather weak. In 2012.

"...
As the Times article pointed out, that's just what President Obama asked Steve Jobs when they met last February. Jobs answered bluntly: “Those jobs are not coming back.”
..."

All Cook did in 2016 (4 years later) was not reverse what Jobs approved and set up. The Mac Pro only got moved back to the USA after Jobs died. Jobs was not an early adopter of "slave labor monitoring of contractors" either.

The entire article has one quote from Jobs and somehow that warrants your drawing the conclusion that Jobs is the mastermind behind the 2016 secret agreement signed between Apple and the CCP.

How is signing a secret agreement the equivalent of "not reverse"?

Do you even know the extent of Apple's presence and reach in China post-Jobs?

Try to go behind the paywall and read some quality content, my man.

 
Intel is free to dump legacy cruft; they even tried it once with Itanium. However, their overriding goal will ALWAYS be backwards compatibility, so, while Intel MIGHT have the capability to ship a cool and fast 64-bit-only Intel processor that ACTUALLY rivals Apple Silicon, they know they don’t have to, because that’s not what their market wants.

First, Itanium really wasn't an x86 replacement. Itanium was aimed at being an HP PA-RISC, DEC Alpha, Sun SPARC, MIPS (SGI and some others), IBM Power & Z-series, Cray, etc. 'killer': the large server, HPC/supercomputer market... not Joe Blow's PC that runs email, Word, and a web browser. Or some random DOS program. A couple of those targets were 64-bit by the time Itanium got started, and all had 64-bit roadmaps. It had extremely little to do with trying to 'cure' the x86 32-bit 'problem' for PCs. Intel wanted to stop the server-class RISC processors from getting deeper traction.

It started by killing off HP PA-RISC, by getting HP to dump their own work and 'join' the Itanium team, mainly by accepting the work HP had already done as the baseline for the instruction set. (i.e., it was NOT designed internally at Intel from scratch to be an x86 replacement.) One down right out of the gate. That quick 'take down' was a double-edged sword because it brought mixed focus. Alpha got squashed in the subsequent scramble (Compaq, the new owner of Alpha, dumped it for Itanium). MIPS also got sidetracked (SGI, a major MIPS user, dumped MIPS for Itanium) and stumbled (firmly setting up SGI's HQ to eventually become Google's HQ). PowerPC was entangled with Power and didn't go widespread in PCs (Windows got looped into chasing Itanium). SPARC did OK but primarily stayed single-vendor. What you got overall was mostly a contraction of vendors building on RISC foundations at the higher levels; no broad, influence-building coalitions formed around something outside of Intel's control.

Yes, Itanium had some x86-32 sidecar and/or emulation baggage attached to it along the way. That actually did more damage to Itanium than it helped. Adding a modicum of internal, dynamic scheduling to what they had would have helped tons more than that. Marketing themselves as the 800 lb gorilla, it made sense to use the x86 compatibility mode as the "boogeyman is going to get you" factor. Not sure why folks like Compaq and SGI 'bought' that; from a technical perspective it was junk.

In terms of killing off, or at least muzzling, the RISC server families before they gained deeper traction, Itanium did a decent job of carving a path of destruction. There were internal factions at Intel who wanted to move forward on a 64-bit x86 extension, but those were deferred. Pragmatically that was good, because it allowed AMD to throw a design out there. Intel and AMD cross-licensed, and AMD got to continue. If Intel had been first, AMD would likely be toast right now and things would likely be in a lot worse shape. (Intel likely would have arrogantly stumbled as they did, but there would have been fewer competitors around to put pressure on them to clean up their act.)



More than anything else, they want to be able to “run all of yesteryear’s code” not “last all day on battery” or “run cool”. They’ll always spin a good yarn, they’ll always miss their goals and it won’t matter.

Microsoft dumping 32-bit kernels altogether by 2024 is going to change the "all of yesterday's code" metric.
(MS is going to stop selling Windows 10 to folks by the end of this month. Windows 11 only has 64-bit kernels. Yesterday's code set at 2007, by 2027, is twenty years of stuff. That is a much longer window than Apple's. ;-) )


Running all of last century's code at maximum speed really isn't all that necessary for the overwhelming majority of users.

Intel's 800 lb gorilla standing was not solely about "all of yesterday's code"... it was the overwhelming number of PCs. Wintel was the "king of the monsters", and they even managed to pull macOS off PowerPC and onto the x86-64 juggernaut. It was a variation of the "nobody got fired for buying IBM" mindset. The herd was all heading to x86: line up with the herd or get run over. That's one reason why they were sprinkling x86 pee on Itanium, to have that sales-pitch hammer to swing.

The 'attack of the killer PCs' became the attack of the killer handhelds. The inertia juggernaut now is an even smaller computer. Intel the inevitable juggernaut is gone. Intel's biggest problem is that they drank way too much of their own Kool-Aid.



Absolutely no question, no doubt and businesses have GOT to LOVE that fact. And, the x86 ecosystem, as a result, will still be less efficient than the stuff Apple’s producing. Garbage in, garbage out.

Only dubious businesses buy into drinking too much of that Kool-Aid. If you are running the exact same accounting system from 33 years ago, you are likely missing out on maximum efficiency. If your business services software stack is 20+ years old, same thing. The interconnections between businesses and the market dynamics have substantively changed in the last 20 years.

Old school Internet Explorer is gone.

Folks who keep dogmatically running the "same old, same old" (Bed Bath and Beyond, Sears, Southwest's crew assignment system, etc.) tend not to do well over the long run. Some folks use "if it ain't broke, don't fix it" as dogma to ignore change. They don't look for 'broke' from the perspective of adapting to contemporary business challenges. Setting old criteria for evaluating old systems is a circular-logic setup of "still meets the specs" that looks at issues in a bubble.


Sometimes there are corner-case pieces of software that can't move because what they are attached to can't move (adapt to change). Wrapping those in a protective blanket (a virtual machine running an OS from the Jurassic era, or embedding the system behind a protective firewall) is an option. But for broad-based systems to go 20 years with no substantive software upgrades... what are those software vendors doing? Compilers from 20+ years ago compared to the current ones? It isn't really a contest. What you have is code that is probably not written as well as it could be to run on modern hardware.
 
How many mid-level to high-end GPUs have socketed RAM? AMD's, Nvidia's, Intel's?

Same baseline technical reasons. What you are ignoring in your examples is the soldered VRAM on the GPU cards being cast aside. The hyper-modularity folks don't have hissy fits over that. The technical needs are:

1. A relatively wide memory bus. The bus width on a high-end GPU is likely going to be wider than on most mainstream CPU-only packages.

The bus width on Intel and AMD iGPUs is the same as mainstream. But are any of them mid-to-high-end GPUs? Nope. If you cannot get data to the arithmetic units fast enough, they starve. Hence the wider-than-'normal' buses. There isn't space for the bulkiness of DIMM buses. (And that new connector for laptops is an even bigger space waster.)


2. Caches are not going to get you 98% hit rates all the time. Caches are effective relative to the percentage of the larger capacity they cover. A 10MB cache on a 1,000,000MB storage drive isn't going to be as comprehensively effective as a 1,000MB cache over the same amount. The lower the percentage of coverage, the more likely the hit rate will be lower. (Even more so when you throw multiple concurrent workloads at different ranges of the primary source being cached: you get more mapping conflicts and data being evicted prematurely.)

Joe Blow putting in random RAM capacities means you can't have a predictable hit rate over a range of configurations.
(You don't tend to see high-performance GPUs with extremely large variance in configurable RAM. It is bad for optimized tuning.)

3. Power. The farther away the memory is, the more power it takes to get to it.


All of those lead to why you have dense-package GDDR VRAM packed around the perimeter of a GPU package on a discrete card. And also why, in the even-higher-performance zone, you see stuff like HBM memory, where RAM dies are stacked even taller on even wider buses.


The other development over the last 5-6 years is better packaging technology, which stacks things higher and closer at lower power consumption. Pointing back at the technology limits of the 1980s isn't going to give much insight into what can be done now.
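To put rough numbers on the bus-width point: here is a back-of-the-envelope sketch comparing an assumed dual-channel DDR4-3200 DIMM setup with the roughly 400 GB/s Apple advertises for the M1 Max's wide LPDDR5 bus. The figures are illustrative assumptions, not exact specs.

```python
# Peak theoretical bandwidth = bus width (bits) x transfer rate (MT/s) / 8 bits per byte.
# Illustrative numbers only; real-world sustained bandwidth is lower.

def peak_bandwidth_gb_s(bus_bits: int, transfers_per_sec: float) -> float:
    return bus_bits * transfers_per_sec / 8 / 1e9

# Typical socketed dual-channel DDR4-3200: 2 x 64-bit channels.
print(peak_bandwidth_gb_s(128, 3.2e9))   # ~51 GB/s

# Assumed M1 Max-style unified memory: a 512-bit LPDDR5 bus at 6400 MT/s.
print(peak_bandwidth_gb_s(512, 6.4e9))   # ~410 GB/s, in line with Apple's ~400 GB/s figure
```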






Apple isn't mostly doing a CPU. Intel was Apple's largest GPU vendor (not AMD or Nvidia). That's a primary 'dump' target here. Lots of folks have spun this change as Apple dumping Intel meaning that Apple was obsessed with just dumping Intel CPU cores. It was both.

The Mx Max die is a GPU with some CPU/NPU elements sprinkled around it. Apple is primarily building a mid-to-large-size GPU, so it really shouldn't be surprising that it looks like the mid-to-large-size GPUs that other folks also build.


This usually gets 'twisted' into the claim that if Apple is doing a GPU then they 'have to' come out with some 4090 killer (whatever the largest, most expensive GPU currently out there is... they have to 'kill' that or it is a complete bust). That is nonsense. They don't have to do everything for everybody to be competitive in a targeted area. AMD often got into trouble in the first half of this century trying to 'monkey see, monkey do' everything that Intel was doing.
AMD narrowed their focus and they got better (reusable chiplets for both mainstream desktops and servers; for a while they did the lower half of the GPU range and then did the upper half). Apple did the same when Jobs came back and tossed half the model variations out the window.

Doing everything for everybody isn't necessary. Even in Mac Pro space.







It is far, far, far from "some odd reason". Nvidia did an 'embrace, extend, extinguish' attempt on OpenCL. They put CUDA in front of Metal once Apple moved on to Metal as the alternative. When there was a huge problem with Nvidia iGPUs, they walked away and left Apple holding the financial bag. Finally, toward the end, Nvidia had some code that would 'halt and catch fire' with every new macOS kernel update, throwing drivers out in an approach completely uncoordinated with Apple's plans. And Nvidia relatively publicly blamed all the quirkiness on Apple ("we have drivers but we have zero idea why Apple won't sign them", hoping to get the fanbase to put more heat on Apple to get a reprieve). That just put the icing on the cake.

It isn't 'odd'. Nvidia and Apple got into a 'war' over who was more powerful, and Nvidia basically got dropped as a bad partner to Apple. They were bad. Apple contributed to the decline also. They are just as dogmatic about Metal-first as Nvidia is about CUDA-first. But these are Apple's systems and Apple's operating system. And Apple's Metal-first strategy was deeply intertwined with the iPhone business. Apple had zero incentive to let Nvidia do an "embrace, extend, extinguish" move on Metal. There is no 'new' Mac Pro business Nvidia was going to drive that was going to offset the iPhone business.

Technically, Nvidia is just a 'subcontractor'. If you are a subcontractor who is always causing drama for the prime contractor... eventually you get dropped. After a while Apple just stopped signing their drivers. That's it.
Some folks in the general PC-parts world start inside out: first pick an Nvidia card, then wrap the rest of the PC parts around that. In that alternative 'world', Nvidia is the prime contractor calling the shots. Apple doesn't work that way at all.


Both sides got much bigger over time on business areas that had almost zero overlap. Apple didn't 'need' Nvidia financially or technically and vice-versa. It was very easy for the 'war' to escalate until both sides 'blew up' their side of the bridge between them.


Nvidia is a pain to work with. Almost no potential 'business partners' wanted them to buy Arm. There have been similar dust-ups with other vendors. Nobody trusts Nvidia except their end users, who buy into the moat that Nvidia digs around their products.





There was a trend toward the end of the Intel Mac era where Apple's boot firmware got closer and closer to UEFI. Early on there were special "Mac boot ROM" requirements for video cards because Apple was mainly interested in EFI (not backward looking BIOS). That UEFI support crept in mainly because the Intel CPU packages needed it.

Given a free hand as to what the boot firmware was going to be, Apple tossed UEFI out the window. Macs officially boot macOS and that's it. You can do some hackery with Linux, but it has no official technical support coverage.

From the first WWDC 2020 session, when Apple Silicon was announced, Apple said GPU driver coverage was Apple GPUs only. That didn't change at all over the next two WWDC sessions ('21 and '22). Maybe it was 'low priority' and would slide to '22, but nothing. The major driver-coverage expansion announced at WWDC 22 was that DriverKit drivers should work on iPadOS on M1/M2 iPads. Crickets chirping on any 3rd-party GPU drivers.

There are 50+ cards that work in a Thunderbolt PCIe expansion box. Just not anything that deals with early boot setup and isn't covered by a generic driver class like USB, NVMe, or SATA. There is no DriverKit coverage for display GPUs in the framework's object hierarchy (unlike the preceding IOKit framework). Thunderbolt works with Apple Silicon. That isn't the issue.
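As an aside, if you want to see which GPUs macOS actually exposes on a given machine (built-in, or anything hanging off Thunderbolt), a quick sketch along these lines should work. It just shells out to the stock system_profiler tool; the exact JSON key names can vary by macOS version, so the fallbacks are deliberate and this is illustrative rather than a supported API.

```python
# Rough sketch: list the GPUs macOS reports via system_profiler.
# Key names in the JSON output can differ between macOS versions,
# hence the defensive .get() lookups.
import json
import subprocess

def list_gpus():
    out = subprocess.run(
        ["system_profiler", "SPDisplaysDataType", "-json"],
        capture_output=True, text=True, check=True,
    )
    data = json.loads(out.stdout)
    for gpu in data.get("SPDisplaysDataType", []):
        name = gpu.get("sppci_model") or gpu.get("_name", "unknown GPU")
        print(name)

if __name__ == "__main__":
    list_gpus()
```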

The big push for the Blackmagic eGPUs was for the laptops, which had limited GPUs. The new Mn Pro and Max are nowhere near as limited. The Mini... again, nowhere near as limited now (especially if you loop in a Studio Ultra).
The iMac Pro was less limited. And the MP 2019 was not particularly limited at all in an "augment through eGPU" sense. Those last two were not what eGPUs were primarily targeting.




I suspect that is part of Apple's point in excluding 3rd-party GPUs. They want developers extremely focused on optimizing for Apple GPUs. If there are no other options, there are no distractions. Apple is trying to get rid of the notion that 'iGPU == slow'. That doesn't have to be the case if you don't cripple the memory bus to kowtow to DIMM options.






That is far enough into the delusional zone that it appears to come from an alternative universe. Apple does billions in gaming on Apple GPUs (the high-end PC game market isn't the whole gaming market). No AI? Chuckle. Your app exposure is limited. Apple is far more focused on AR than VR and has laid lots of foundation.

Apple's approach is closer to console gaming than perhaps more hard-core Windows PC gamers are comfortable with. It is somewhat of a hybrid approach. Apple is looking to match highly optimized graphics code to a finite set of good hardware: a small enough set that you can optimize very well for all of them, but not so large that you have constantly mutating drivers trying to plug every quirky hole that corner-case-chasing apps dive into.

When Apple does VR, it will have deep links to mobile (on-battery) VR. AI on battery. Gaming on battery. 75+% of Macs sold are mobile-capable. That is where the inertia is. The Mac Pro is a gigantic chasm away from what is driving the vast bulk of Mac sales (it is likely down in the 1% (or less) range). Highly focused, well-optimized code, when run on a plugged-in and larger SoC, will often run faster. (Brute-force code will probably require more expensive brute force to run faster.) They are not likely to completely miss out on all of the mid-to-high-end range of performance coverage, especially if they have higher-quality optimized code on their GPUs than the competition.




Again, no. Apple is more about creating a system than a subcomponent. If you are looking to Apple just to make incomplete subcomponents, you are kind of shopping for a pork sandwich at a conservative kosher deli.


If the unified memory weren't buying them largely differentiated performance, you'd have a point. It is a trade-off that Apple is making. If you don't like the trade-off, then you don't have to buy from them. Apple isn't trying to make everything for everybody.


Apple does need to add the ability to provide more general PCIe bandwidth to the Ultra (and up) SoCs that will never land in a laptop. Some acceleration that adds more performance in 'scale out' (in the box) workloads is likely necessary, because there is only so much 'horsepower' you can put into a single package at a fixed fab process technology level. These days a supercomputer is a bunch of little computers clustered on a very expensive, custom network.
Gotta say, this is an awesome post with extensive, useful information. Thank you.
 
They should just put the SoC on a "compute card", so people can at least upgrade that way. Want more RAM, more GPU cores, or an upgrade to M3? Just buy a new compute card and swap it in. That would be the most elegant way of combining upgradability with the unified architecture of Apple Silicon.
In addition to that, they could still do something with normal memory slots. They could treat the unified memory as a cache, or a faster portion of RAM, but then add standard RAM that may be a bit slower to cover the need for more RAM.
 
Since the first M1 variant, it was obvious that despite all its awesomeness (no sarcasm, it is awesome for almost all uses), the Mac Pro as we knew it was over.

That could only change if Apple decided on some M-series Mac Pro frankenchip to ensure the usual traits. That never seemed quite possible (at least to me).

What they should have done is use POWER10 instead…
 
No, it wouldn't. There are a bunch of alternatives Apple could have pursued.
Could, yes. It’s a wonderful word. Almost up there with might. :) But let’s look at the world we live in. What they did just so happens to perfectly align with paving the way on macOS for a processor that had no native ability to run 32-bit applications. There were those, like myself, who predicted at the time that they wouldn’t do that unless they were preparing to use their 64-bit-only iOS processors as the base of upcoming Macs. What happened was… they used their 64-bit iOS processors as the base of upcoming Macs.

I mean, sure, I can make up alternatives. Apple could have engineered perfect 1-1 execution of all current x86 applications such that all instructions ran three times faster than the fastest Intel chip while sipping 2 watts of power. Pretty much anything could be done. What WAS done, though, prepared all macOS developers for a future without native 32-bit instructions. Quite nicely at that.

What Apple really wants is not only to streamline their system, but also to force people to develop 64-bit code. That has worked, but it has the downside of making macOS less versatile.
That’s only a downside for folks that don’t need macOS. And… folks that don’t need macOS… aren’t going to be buying macOS systems… so, there’s no point in trying to make folks that require versatility happy.

If it weren't for emulation / cloud computing, many people wouldn't even consider Apple Silicon, no matter how powerful it is – so much so that Tim Cook advertised virtualization when he announced Apple Silicon.
Every year, there are literally billions that don’t consider Apple Silicon. That’s a fact that Apple’s been aware of for a LONG time, which is why they just focus on a profitable sliver of a few million folks a year that find enough value in their products to pay for them (and that have the money to buy them… that’s really important!)
 
In addition to that, they could still do something with normal memory slots. They could treat the unified memory as a cache, or a faster portion of RAM, but then add standard RAM that may be a bit slower to cover the need for more RAM.
Not really. What they could do is allow PCIe-connected RAM disks. Those would still be a lot slower, but they would have the advantage of not needing specialized hardware or specialized software to support them; they would just use the existing VM system.
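For what it's worth, here is a toy sketch of the "fast unified memory in front, slower expansion RAM behind it" idea being discussed above. All the names and the LRU policy are made up for illustration, and in reality this tiering would live inside the OS virtual memory system, not in application code.

```python
# Toy two-tier memory: a small "fast" tier (unified memory) backed by a
# large "slow" tier (expansion RAM / PCIe RAM disk), with LRU eviction.
from collections import OrderedDict

class TieredMemory:
    def __init__(self, fast_capacity):
        self.fast = OrderedDict()   # fast tier, LRU-ordered
        self.slow = {}              # slow tier, unbounded here
        self.fast_capacity = fast_capacity

    def read(self, page):
        if page in self.fast:
            self.fast.move_to_end(page)       # refresh LRU position
            return self.fast[page]
        value = self.slow.pop(page)           # "page in" from the slow tier
        self._install(page, value)
        return value

    def write(self, page, value):
        if page in self.slow:
            del self.slow[page]
        self._install(page, value)

    def _install(self, page, value):
        self.fast[page] = value
        self.fast.move_to_end(page)
        if len(self.fast) > self.fast_capacity:
            victim, old = self.fast.popitem(last=False)   # evict least recently used
            self.slow[victim] = old                       # "page out" to slow tier

if __name__ == "__main__":
    mem = TieredMemory(fast_capacity=2)
    for p in ("a", "b", "c"):
        mem.write(p, p.upper())    # "a" ends up evicted to the slow tier
    print(mem.read("a"))           # paged back in from the slow tier -> "A"
```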
 
Not sure what point you're trying to make, in all honesty. If you think Apple will find a way because of its "future facing" culture, then definitely no. Apple doesn't know the future any better than you or I do. And just like us, it tries to control as much as it can, but to say Apple planned the demise of one of its products five years out is an exaggeration.
I mean, it’s not a stretch to think that, while they were working on the iPhone, they had an inkling that the iPod market would be dealt a killing blow. I mean, yes, it’s possible that, a year after the iPhone was released, some lower-level staffer burst into Steve’s office, sweating profusely, stating “iPod sales appear to be falling and we don’t know why!” But I doubt it. Apple looked at the mountain of cash coming in from the iPod… a business that ANY company would have made a deal with forces unknown to have… and killed it in one fell swoop. Nokia, Sony, most of tech will prop up whatever they were good at years ago to try to keep that golden goose producing. Apple’s killed a few golden gooses in their day. No, I don’t know the future, but if someone asked “will Apple keep on killing golden gooses?” I’d say, “probably”. And, in 10 years when Apple’s still relevant, no one will even know these words existed. :) Considering how many companies have gone the way of the dodo, there’s something different about Apple. “Future facing culture” was just my attempt to put a name to why they can systematically compete against their own products over years and continue to win.

Did they plan the reaction to the butterfly keyboard too? Did they plan the delay in the transition to Apple Silicon? Did they plan their failure to ship an Apple Car despite billions in investment over almost a decade?
Pretty much. I mean, if anyone’s worked in a mid-to-large-sized multinational organization like Apple, there are people always planning, performing risk assessments, weighing profit and loss statements (some of us here have been on teams where we still can’t talk about what we worked on). Just because something is planned doesn’t mean it goes that way. But anyone that thinks Apple’s just ‘winging it’ hasn’t worked in a mid-to-large-sized corporation. :)
 
I've only been reading 'Apple is doomed' (to irrelevance or marginalisation if not extinction) stories/opinions for over three decades. Admittedly, the period 1993-1997 got more than a bit ropey, and given the state of affairs, no one would have predicted the near-miraculous transformation that followed. But losing Steve didn't halt growth; on the contrary, Cook accelerated it. Jony Ive was simultaneously held responsible for a string of terrible design decisions and blamed for leaving Apple without inspiration. And yet the growth continues. You might not be wrong about future products and decision-making, but Apple now has a rather more robust corporate structure & business model than when it really, really was in trouble.
Because of the way Apple works (focusing on attracting newer customers, sometimes to the detriment of their long-time customers), there’s always going to be a large number of people who, from the day Apple switches direction (on printers, wireless access points, web design software, iPods, Mac Pros, what have you), will be calling for Apple’s doom. It will always be just around the corner, and folks will say that “the start of the end was when they stopped <making the thing I like in the way I liked it for a price I wanted to pay>”. As it’s an emotional response, there’s no amount of data that can inform them otherwise. So, the naysaying will continue!
 
Nintendo 64...! ;^p
Just watched a video recently about the tech in the Nintendo 64… impressive stuff, that. And those with the skill and know-how are continuing to do amazing things with it today, understanding the best way to get the most out of it (well, and having the benefit of not having to stick to a “gotta release this” development schedule! :))
 
Because of the way Apple works (focusing on attracting newer customers, sometimes to the detriment of their long-time customers), there’s always going to be a large number of people who, from the day Apple switches direction (on printers, wireless access points, web design software, iPods, Mac Pros, what have you), will be calling for Apple’s doom.
Some of it may even be true/right. :)
It will always be just around the corner, and folks will say that “the start of the end was when they stopped <making the thing I like in the way I liked it for a price I wanted to pay>”. As it’s an emotional response, there’s no amount of data that can inform them otherwise. So, the naysaying will continue!
I remember a very rich investor friend of my dad’s saying that the secret to his success was buying too late and selling too early. :) I have talked to more than a few Apple employees about why they killed printers, AirPorts, etc., and the arguments were always the same: they could not maintain their margin and still sell a product at the quality they needed to maintain their brand.

In some of those cases it might have been a good choice to cut their margin to keep the ecosystem healthy, but clearly not all of them, and on the whole, they have been right way more than they were wrong.
 
Could, yes. It’s a wonderful word. Almost up there with might. :) But let’s look at the world we live in. What they did just so happens to perfectly align with paving the way on macOS for a processor that had no native ability to run 32-bit applications. There were those, like myself, who predicted at the time that they wouldn’t do that unless they were preparing to use their 64-bit-only iOS processors as the base of upcoming Macs. What happened was… they used their 64-bit iOS processors as the base of upcoming Macs.
Apple has done this many times, and each time they have done it, everyone said it was the end. They got rid of the floppy, they got rid of the optical disk, they got rid of the 68K, then the PowerPC, and finally x86. They dumped ADB in favor of USB, and a whole lot more things. They do this each time to push the market forward.

Burning the boats on the shore.
Every year, there are literally billions that don’t consider Apple Silicon. That’s a fact that Apple’s been aware of for a LONG time, which is why they just focus on a profitable sliver of a few million folks a year that find enough value in their products to pay for them (and that have the money to buy them… that’s really important!)
Instead of a long transition where they kept releasing many new Intel-based systems and allowed people and software developers to hang on to the past, they made it clear that Apple Silicon was the only future. Everything they have done has pushed things in that direction.
 
Interesting comparison of the $599 Mac mini, which beats the $5,999 Mac Pro on some benchmarks:

Sorry to get exercised about this, but the comparison they're referencing ( https://www.digitaltrends.com/computing/apple-m2-mac-mini-beats-mac-pro/ ) is disingenuous clickbait. Look at what the author (Monica White) is saying:

[screenshot from the article]

First, she's trying to create drama based on the single-core speeds. This is ridiculous. Even when the Mac Pro was released in 2019, its single-core speeds were slower than those of a 2015 i7 iMac (SC GB score = 1048). That's not what the Mac Pro was ever about. She's trying to take something that's always been known and dishonestly portray it as some new thing she's reporting:
[screenshot from the article]


Second, for multi-core, she disingenuously uses the slowest Mac Pro sold, without noting that it's the base (8-core) processor. Given how expensive the Mac Pro is, it's a waste to buy it if all you need is an 8-core processor, so I can't imagine many would do that. A more honest comparison would have been to give the range which, according to GB's own scores, is 8,038–20,031:

[Geekbench multi-core score screenshots]


There's no question the Mac Pro is now a horrible value. But that's so clear that Ms. White could have demonstrated it without the game-playing.
 
If you are going to pay a hefty price for a Mac, Apple should let you customize it the way you want. It’s as simple as that.

Why else do we pay premium prices for Apple products?

You pay premium prices for Apple products just because they have the Apple logo on them. Nothing else. Just like Versace or Giorgio Armani make people pay premium prices for garments with their brand on them, Apple does exactly the same with its products.

As for upgradeability, expect as little of it as possible. Apple wants as many people as possible to upgrade as often as possible, so a barely upgradeable or non-upgradeable product is obviously part of the plan. Profits are all they care about. Not the quality of the product, not the end user, just profits.
 