Good point; I should have been more specific. I've thus updated my post to say:

"I.e., UMA gives Apple a technological excuse to extend, to their desktops (whose Intel versions all had upgradeable RAM), a restriction they've previously only had in place only for laptops, and thus profit accordingly."
I have posted in another thread, but to summarise here, Unified Memory Architecture (UMA) does not dictate whether you use LPDDR or DDR or HBM or GDDR6 or whatever memory technology.

Apple can choose to use SODIMMs for their notebooks and DIMM modules for the Mac Studio, but they would take a performance hit (and a battery life hit on notebooks), and it would massively increase their manufacturing cost: they'd have to lay 1024 data traces or more for the Mac Studio just for the memory alone, and they would need 16 DDR5 DIMM sockets (maybe more) just to get to 1024 bits of data. It would be a nightmare just to lay out the data traces without crosstalk.
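To put rough numbers on that socket count (a back-of-envelope sketch; the 64 data bits per DDR5 DIMM is the standard pair of 32-bit subchannels):

```swift
// How many DDR5 DIMM sockets would a 1024-bit memory bus need?
let busWidthBits = 1024        // Mac Studio (M1 Ultra)-class bus width
let bitsPerDIMM  = 64          // one DDR5 DIMM: 2 x 32-bit subchannels
let socketsNeeded = busWidthBits / bitsPerDIMM
print("\(socketsNeeded) DIMM sockets")   // 16 -- each adding ~64 data traces to route
```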

So no, the reason they are using soldered RAM is more from the battery life and performance angle, and likely cost as well.

I believe Apple's engineers know what they are doing. Contrary to popular belief, I don't think Apple's marketing calls the shots when it comes to engineering Macs.
 
I believe Apple's engineers know what they are doing. Contrary to popular belief, I don't think Apple's marketing calls the shots when it comes to engineering Macs.
Whether Apple's engineers know what they're doing has no bearing on whether Apple's marketing department calls the shots. The reality is most likely a mix of both. It does seem to lean heavily toward marketing these days, though. How quickly do people forget about the butterfly keyboard and the Touch Bar, just to name a few engineering fails in recent memory?

The fact that Apple missed their own transition deadline by a full two months should give you pause. If Apple's engineers really are the ones calling the shots, then it logically follows that they've run into technical problems. Perhaps it's because Apple Silicon just isn't as competitive against Intel chips when power efficiency is less of a concern, e.g., in a Mac Pro-type machine, as a lot of people here would like to believe.

If Apple's engineers know what they're doing and Apple Silicon is so superior to whatever the state-of-the-art Intel/AMD chips are, why the delay? They're already going to keep the original design, so we can rule out a new exterior being the bottleneck. If your premise holds true that Apple's engineers are the ones calling the shots, the only logical conclusion is that Apple Silicon does not scale just fine. You can't have it both ways, quarkysg.
 
I guess the Mac Pro will be released with the M3. The M2 has already gotten so many negative responses that it won't make sense to base the Mac Pro on it.
 
So no, the reason they are using soldered RAM is more from the battery life and performance angle, and likely cost as well.

I believe Apple's engineers know what they are doing. Contrary to popular belief, I don't think Apple's marketing calls the shots when it comes to engineering Macs.
You misunderstand. I wasn't saying Apple chose soldered RAM for economic rather than performance reasons. Rather, I was saying that they're not exactly unhappy that this choice, in addition to providing better performance (at least for mobile devices), also results in increased profit by forcing consumers to preconfigure RAM at the time of purchase:
[I'm not saying UMA doesn't offer performance benefits. Rather I'm just saying Apple isn't exactly unhappy that this also forces its desktop customers to preconfigure their purchases with Apple RAM.]

I have posted in another thread, but to summarise here, Unified Memory Architecture (UMA) does not dictate whether you use LPDDR or DDR or HBM or GDDR6 or whatever memory technology.

Apple can choose to use SODIMMs for their notebooks and DIMM modules for the Mac Studio, but they would take a performance hit (and a battery life hit on notebooks), and it would massively increase their manufacturing cost: they'd have to lay 1024 data traces or more for the Mac Studio just for the memory alone, and they would need 16 DDR5 DIMM sockets (maybe more) just to get to 1024 bits of data. It would be a nightmare just to lay out the data traces without crosstalk.
You're of course quite right: UMA doesn't preclude slotted RAM. I just realized that myself independently yesterday :).

But let's consider this from a big-picture, results-oriented perspective for desktop applications [so this isn't just about the Mac Pro -- it's also about the Mini, Studio, and iMac]:

What would be the actual performance (not efficiency) penalty in going from soldered LPDDR to slotted DDR (at the same frequency)? E.g., maybe it's only, say, 1%, which would be hardly noticeable, and outweighed by the two key benefits of slotted DDR in desktops (see the last paragraph). Is Apple's architecture more sensitive to such a change than x86? The i9-13900K is higher-performance than any current AS chip (GB5 SC = 2240), and is able to reach that with slotted DDR5 (granted, that's not to say it wouldn't be even faster with LPDDR5).*

To save design costs, Apple takes a modular approach to its chips. E.g., I assume the desktop M1 Ultra incorporates the same memory controllers as the mobile M1 Pro; it just has four times as many of them, allowing four times the RAM capacity and bandwidth.
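As a quick sanity check on that 4x claim (a sketch assuming the published LPDDR5-6400 speed and 256-bit vs. 1024-bit bus widths for the M1 Pro and M1 Ultra):

```swift
// Peak bandwidth = transfer rate x bus width in bytes
func bandwidthGBps(mtPerSec: Double, busBits: Double) -> Double {
    mtPerSec * 1e6 * (busBits / 8) / 1e9
}
let m1Pro   = bandwidthGBps(mtPerSec: 6400, busBits: 256)   // ~204.8 GB/s
let m1Ultra = bandwidthGBps(mtPerSec: 6400, busBits: 1024)  // ~819.2 GB/s
print(m1Ultra / m1Pro)   // 4.0 -- four times the controllers, four times the bandwidth
```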

But AMD's new 7940HS, for instance, accepts DDR5, LPDDR5, and LPDDR5x RAM. I don't know if this means they have a "universal" memory controller that accepts both DDR and LPDDR, or if they have two different variants. But either way, couldn't Apple do the same thing, such that they could offer soldered LPDDR on their mobile devices and slotted DDR on their desktops?

If so, that would provide their desktops the RAM modularity they currently lack. And that might also give them a larger max memory capacity.

Specifically, you could get an Ultra Mac Pro with 800 GB/s bandwidth using 16 slotted DDR5 RAM sticks:

one DDR5-6400 stick: (6400 x 10^6 transfers/s) x 64 bits x (1 byte/8 bits) = 51.2 GB/s

So you'd need 16 conventional slotted sticks. Might not be too bad in an Ultra Mac Pro, since that would give a starting RAM of 128 GB (at 8 GB/stick), and would go up to 16 sticks x 32 GB/stick = 512 GB. If 64 GB sticks become available, that would give 1 TB RAM. This seems like a simpler solution than tiered RAM. The question is whether you need tiered RAM to avoid a meaningful performance hit from not having some RAM super-close to the CPU/GPU.
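Carrying that arithmetic through (a sketch; stick count rounded up, capacities as above):

```swift
// DDR5-6400: how many 64-bit DIMMs to reach ~800 GB/s, and what capacity results?
let perStickGBps = 6400e6 * 64 / 8 / 1e9              // 51.2 GB/s per stick
let sticks = Int((800.0 / perStickGBps).rounded(.up)) // 15.6 -> 16 sticks
for gbPerStick in [8, 32, 64] {
    print("\(sticks) x \(gbPerStick) GB = \(sticks * gbPerStick) GB total")
}
// 16 x 8 GB = 128 GB, 16 x 32 GB = 512 GB, 16 x 64 GB = 1024 GB (1 TB)
```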

*OTOH, with LPDDR5x becoming available now, and DDR6 not expected until 2025+, for the next couple of years you'll need to go with LPDDR to get the fastest performance, for generational reasons (LPDDR5x > DDR5).


 
The fact that Apple missed their own transition deadline by a full two months should give you pause. If Apple's engineers really are the ones calling the shots, then it logically follows that they've run into technical problems. Perhaps it's because Apple Silicon just isn't as competitive against Intel chips when power efficiency is less of a concern, e.g., in a Mac Pro-type machine, as a lot of people here would like to believe.

If Apple's engineers know what they're doing and Apple Silicon is so superior to whatever the state-of-the-art Intel/AMD chips are, why the delay? They're already going to keep the original design, so we can rule out a new exterior being the bottleneck. If your premise holds true that Apple's engineers are the ones calling the shots, the only logical conclusion is that Apple Silicon does not scale just fine. You can't have it both ways, quarkysg.
Did someone say no to that CPU card slot/socket because the cost of having hundreds of motherboard variants for each CPU/RAM config was too much?

For a pro system, have a base motherboard and put the CPU/RAM on its own card/socket.
 
VMware dropped experimental IOMMU passthrough in its desktop apps a while ago, and Parallels has never supported it. I don't believe macOS allows for that abstraction at all right now, so while I agree that it would be awesome, I don't really expect it.

What VMware and Parallels offer is largely moot in the macOS-on-Apple-Silicon context, since both are required to use Apple's hypervisor. They only offer value-adds on top of what Apple's Hypervisor (and Virtualization) frameworks provide. So Apple is the real gatekeeper of whether the feature gets delivered or not.
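For context, here's roughly what sitting on Apple's Virtualization framework looks like; a minimal sketch (the kernel/initrd paths are placeholders), and note that, as of this writing, the API exposes no PCIe/IOMMU pass-through device type to configure:

```swift
import Virtualization

// Minimal Linux guest config -- whatever a macOS VM app offers is layered on this.
let config = VZVirtualMachineConfiguration()
config.cpuCount = 4
config.memorySize = 8 * 1024 * 1024 * 1024   // 8 GiB

let boot = VZLinuxBootLoader(kernelURL: URL(fileURLWithPath: "/path/to/vmlinuz"))
boot.initialRamdiskURL = URL(fileURLWithPath: "/path/to/initrd")
config.bootLoader = boot

do {
    try config.validate()   // Apple's framework decides what a guest may have
    let vm = VZVirtualMachine(configuration: config)
    vm.start { result in print(result) }   // must run on the VM's queue in real code
} catch {
    print("invalid configuration: \(error)")
}
```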

Does Apple's hypervisor framework offer pass-through IOMMU mapping now? No. Is Apple deprecating (kicking out) kernel extensions in favor of an IOMMU-based scheme in DriverKit? Yes. IOMMU mapping and control are foundational elements of how Apple does security in the Apple Silicon kernel. So is pass-through a far-fetched feature? No, at least no more far-fetched than Apple doing all the 3rd-party GPU work entirely by themselves.

Apple's hypervisor has also been missing nested virtual machines. That is another piece that should eventually get filled in.

If Apple is not going to do 3rd-party GPU drivers, then IOMMU mapping is a very straightforward way of dropping that 'issue' onto another OS that might want to deal with it. If macOS doesn't want to deal with those cards, then map them completely out of the OS (at a level above the hypervisor) and hand them off someplace else.

A Mac Pro with minimally an Ultra has a 'floor' RAM capacity of 64GB. A floor of 64GB to run multiple VMs isn't all that horrible. A fairly high percentage of folks who want to run substantive VM workloads are going to buy an Ultra (plenty of CPU cores to spread around, RAM to spread around, more GPU cores to run a virtual GPU on, etc.). One of the major drivers of why folks want to run Linux/Windows on raw metal is that they want 'raw metal' access to the GPU card (heavier 3D and graphics workloads that do not fit as well on virtual GPUs). If Apple solves that problem, then the number of advocates for "gotta have raw-iron Windows or bust" will greatly diminish.
It is one of the few corner cases where virtualization does not work as well. A straightforward solution to that problem would be to make virtualization "faster", and IOMMU pass-through is faster virtualization in a general sense.


P.S. If Apple kills off kernel extensions in 3-4 more iterations, they are likely going to need IOMMU pass-through mappings anyway, because folks with macOS VM images that keep kexts around to work with older cards that don't get any DriverKit updates are going to need them too. Otherwise that's yet another "angry mob" with pitchforks and torches circling Apple, mad because they can't keep things working the "old way".
 
I have posted in another thread, but to summarise here, Unified Memory Architecture (UMA) does not dictate whether you use LPDDR or DDR or HBM or GDDR6 or whatever memory technology.

That really isn't true, technically. Apple might call it "Unified Memory Architecture", but pragmatically it is really a "Uniform Unified High-Performance Memory Architecture". What Apple has implemented is not very tolerant of highly non-uniform access. Nor is it very amenable to heterogeneous RAM subsystems.




Apple can choose to use SODIMMs for their notebooks and DIMM modules for the Mac Studio,

That would not work. The M1 Max and M1 Ultra are largely just a 'regular' GPU structure with some CPU and SoC elements sprinkled around it. How many high-performance GPUs can you name with DIMM slots on them? There are no widespread examples of those, for very sound technical reasons.

Apple has put together a "poor man's HBM". It has better economic affordability than HBM, but it has a lot of the same properties. Regular DDR DIMMs are a very dubious foundation to try to build a more affordable HBM-like solution on.


You can point to Intel iGPUs and AMD iGPUs that are provisioned off of regular DDR and SODIMMs, but are you pointing to a relatively high-performance GPU? Not really. The current ones are faster than those from 2-5 years ago, but are they competing in the desktop mid-range space? No.


but they would take a performance hit (and a battery life hit on notebooks), and it would massively increase their manufacturing cost: they'd have to lay 1024 data traces or more for the Mac Studio just for the memory alone, and they would need 16 DDR5 DIMM sockets (maybe more) just to get to 1024 bits of data. It would be a nightmare just to lay out the data traces without crosstalk.

So no, the reason they are using soldered RAM is more from the battery life and performance angle, and likely cost as well.

Battery is a bit secondary to Apple's focus on the GPU subsystem being the dominant element of the SoC. One of Apple's primary goals is to kill off discrete GPUs in laptops. Folding the dGPU and VRAM into the SoC is a 'value add' that supports pricing their SoC at higher cost, and it also brings high performance to smaller "single chip" solution packages. The M1 Max running at 80W really isn't saving "maximum" battery; usage for hours at that rate is going to kill the battery.


I believe Apple's engineers know what they are doing. Contrary to popular belief, I don't think Apple's marketing calls the shots when it comes to engineering Macs.

I don't think it is completely one-sided either way. Marketing probably does have a hand in thinning out the enclosures. The iPad-on-a-stick 24" iMac is probably not a purely engineering-driven design; someone in marketing didn't rein in the thinness politburo on that one.

Apple's engineering probably cannot use 'infinite' die area to construct a solution. An SoC has die-size caps, so everything and the kitchen sink can't be thrown into a package.

Apple's engineering also probably can't completely throw all their work out the window and do a sidebar die with completely different tech and philosophy. There are also some design-reuse parameters on their designs too (i.e., a non-limitless budget -- not a tiny budget, but not an "ask for as much money as you want" one either). The SoC package has to make money also.
 
That really isn't true, technically. Apple might call it "Unified Memory Architecture", but pragmatically it is really a "Uniform Unified High-Performance Memory Architecture". What Apple has implemented is not very tolerant of highly non-uniform access. Nor is it very amenable to heterogeneous RAM subsystems.
To me it’s just terminology. The fact is that how Apple implemented the AS memory from a programming model POV does not dictate how it is implemented physically. The main reason it is what it is now stems from overall performance and power consumption. There’s no technical reason why it cannot be done.

That would not work. The M1 Max and M1 Ultra are largely just a 'regular' GPU structure with some CPU and SoC elements sprinkled around it. How many high-performance GPUs can you name with DIMM slots on them? There are no widespread examples of those, for very sound technical reasons.

Apple has put together a "poor man's HBM". It has better economic affordability than HBM, but it has a lot of the same properties. Regular DDR DIMMs are a very dubious foundation to try to build a more affordable HBM-like solution on.


You can point to Intel iGPUs and AMD iGPUs that are provisioned off of regular DDR and SODIMMs, but are you pointing to a relatively high-performance GPU? Not really. The current ones are faster than those from 2-5 years ago, but are they competing in the desktop mid-range space? No.
Why would it not work? It'll just have terrible performance. That's why I said it'll take a performance hit. Apple probably weighed the pros and cons and decided performance and lower power consumption outweigh RAM modularity. And I think the industry agrees and is also moving in that direction.

Battery is a bit secondary to Apple's focus on the GPU subsystem being the dominant element of the SoC. One of Apple's primary goals is to kill off discrete GPUs in laptops. Folding the dGPU and VRAM into the SoC is a 'value add' that supports pricing their SoC at higher cost, and it also brings high performance to smaller "single chip" solution packages. The M1 Max running at 80W really isn't saving "maximum" battery; usage for hours at that rate is going to kill the battery.
I disagree here. I don't think Apple wants to kill off discrete GPUs. They are looking for a low-power, high-performance GPU and no one is offering what they need, so Apple built their own. What alternative GPU offers what the M1 Max offers?

Edit: Using SODIMMs necessitates higher voltages and thus requires more power/energy, so my point still stands.

I don't think it is completely one-sided either way. Marketing probably does have a hand in thinning out the enclosures. The iPad-on-a-stick 24" iMac is probably not a purely engineering-driven design; someone in marketing didn't rein in the thinness politburo on that one.

Apple's engineering probably cannot use 'infinite' die area to construct a solution. An SoC has die-size caps, so everything and the kitchen sink can't be thrown into a package.

Apple's engineering also probably can't completely throw all their work out the window and do a sidebar die with completely different tech and philosophy. There are also some design-reuse parameters on their designs too (i.e., a non-limitless budget -- not a tiny budget, but not an "ask for as much money as you want" one either). The SoC package has to make money also.
Of course no one department has the final say. But many posters in this forum keep repeating the mantra that Apple is managed purely by marketing. That is definitely false, IMHO. I find it hard to believe that the VP of marketing would override the VP of hardware engineering.

Whether Apple’s engineering uses a certain set of technologies depends on many factors, such as patents, costs, efficiency, etc. Don’t you think engineering would have deliberated in depth and decide on a final design before production starts? Just because Apple didn’t use one tech or another does not mean engineering is not doing their job. Otherwise Apple wouldn’t be where they are today.

IMHO Apple always looks at the whole package and not solely at one metric, so AS is just one part of the puzzle. Unfortunately, Apple always gets compared to various component makers, and the conclusion is then that Apple is doomed.

The fact that, after more than a year, no notebook comes close to Apple's MacBooks speaks volumes. That, IMHO, is more important than any single metric such as top Geekbench scores.
 
I worked in IT and we had all machines configured the same. We never upgraded a single machine to have more RAM than the others, because that makes deployment a headache when it comes to figuring out if XYZ software is compatible with all of the workstations.

When the time comes to upgrade, we don't upgrade all the machines to increase RAM; we just buy all new machines, because that's when the new budget comes in.
I’ve worked in places that do this and I always thought it was weird. All those perfectly useable computers going out the door to… where…??

At the time, I hadn’t realized so much was going into landfills, or land-MOUNTAINS, in other countries whose governments have contracted to use their poor communities as waste zones.

Now I know. It’s deeply disturbing and offensive.

Not sure it’s getting any better today, though there is a tiny bit of infrastructure for materials recovery; there just doesn’t seem to be much actual recovery or reuse.

What a waste of materials and energy. Humanity is insane.
 
What VMware and Parallels offer is largely moot in the macOS-on-Apple-Silicon context, since both are required to use Apple's hypervisor. They only offer value-adds on top of what Apple's Hypervisor (and Virtualization) frameworks provide. So Apple is the real gatekeeper of whether the feature gets delivered or not.
VMware dropped IOMMU passthrough on *all* of their desktop versions; they really, really, really want you to use ESXi or vSphere for that kind of use case.
 
100% agree. The sooner he retires, the better (in my book)

I'm sure many who are just worried about juicing the stock price in the near term will disagree.
So be it.
The better? Who will replace him that has any vision? I’m not remotely optimistic here. The corporate world is full of this crap. His replacement could be even worse.
 
I’ve worked in places that do this and I always thought it was weird. All those perfectly useable computers going out the door to… where…??

At the time, I hadn’t realized so much was going into landfills, or land-MOUNTAINS, in other countries whose governments have contracted to use their poor communities as waste zones.

Now I know. It’s deeply disturbing and offensive.

Not sure it’s getting any better today, though there is a tiny bit of infrastructure for materials recovery; there just doesn’t seem to be much actual recovery or reuse.

What a waste of materials and energy. Humanity is insane.

We kept a few in storage. We also set up a test space for figuring out deployment plans. The rest went to other IT departments within the company (it was a Fortune 500 company). Most of the junk accessories we e-wasted/recycled.

Apple will be happy to take all of your computers for free to e-waste/recycle, including PCs.
 
  • Like
Reactions: Unregistered 4U
We kept a few in storage. We also set up a test space for figuring out deployment plans. The rest went to other IT departments within the company (it was a Fortune 500 company). Most of the junk accessories we e-wasted/recycled.

Apple will be happy to take all of your computers for free to e-waste/recycle, including PCs.
In my experience, you see mid-cycle upgrades on machines in really small environments (small businesses with 20 people or fewer) and in really huge environments (large companies that do standard 3-5 year cycles, where someone needs an update to their machine mid-cycle for a pressing reason and has managerial approval).
 
That is quite easily the most comical comment on this thread. The sarcasm is pretty funny too. I've been a video and design pro for 37 years using Macs, and I always upgrade stuff and, when allowed, get inside the damn machine. THAT is what pros do. So skills, experience, abilities, etc. don't make a pro… but not opening a computer up does. OK then.

Sure you did. Maybe you didn't intend to be so absolutist here, but this is what you wrote, and thus what I replied to:

You subsequently significantly softened your position to this:

...but haven't acknowledged you've softened it.


I'm afraid we're not aligned. Many pros don't, but many pros do. It's of interest to many in both groups. I disagree with the characterization that upgradeable RAM is more of an enthusiast thing than a pro thing.
I consider myself a professional at what I do. And I prefer to spend my time making money doing that rather than spend it playing with the internals of my tools.

I believe the philosophy Apple subscribes to is that there are different kinds of pros, but the pros Apple is interested in are the ones whose work makes more money than tinkering saves. In Apple's book, if a pro is any good at what they do, then their time doing what they do is worth a lot more than whatever they'd save tinkering with their machines. For those pros, when they need a better tool it is more cost-effective to trade in the old tool, buy a new one, and spend their time making more money than to spend it trying to save money by tinkering with the existing tool.

I’m not saying that’s the definition of a pro. I’m simply saying there are different kinds of pros, but those described above are generally the only pros Apple is really interested in catering to. And there are enough of those to keep Apple in business that they have no need or desire to cater to the pros who consider their time better spent saving money by tinkering with their tools instead of just getting on with using their tools to make money.

We can each decide for ourselves whether or not we think that is reasonable, but from Apple’s perspective they have a business to run, and that’s what they believe is the most successful route. Right or wrong, it makes sense. It’s logical. It’s internally consistent. It’s disappointing for some, but that’s a different matter. For those who like to tinker, the PC world caters tremendously well to that. For those who don’t, Apple is a great choice that caters to them. That’s pretty much been Apple’s philosophy in some form or another since the beginning.

All that said, that doesn’t mean they won’t cater to the tinkering crowd if they can, and the 2019 Mac Pro is testament to that. So was the Mac II series in the ’80s. And others. But if catering to the tinkerers requires compromising any of their deeper goals, then they’ll sacrifice catering to the tinkerers first.

That’s why, when genuine benefits come from bundling the RAM into the SoC (vastly improved performance, etc.), Apple won’t hesitate to sacrifice tinkering ability for those benefits.

How that will play out in the ASi Mac Pro is yet to be seen. But if they can’t reasonably figure out how to make RAM on an SoC upgradeable, or make upgradeable RAM work some other way without compromising the performance that comes with the SoC approach they’ve now adopted, then the Mac Pro’s RAM won’t be upgradeable. And that will be fine by the pros in Apple’s target market. Those pros will look ahead, buy a machine with enough RAM for the future, CapEx it, and then replace the whole machine when it’s not enough anymore. And they’ll be happy doing that and not tinkering. And they’ll be happy that they have a machine with all the benefits that Macs come with, even at the expense of “tinkerability”. I for one am, with my current M1 Max MBP, and will be when I get an ASi Mac Pro for a new need I will soon have, if they ever release the darn thing.
 
In recent years (since all parts are soldered in), I buy the fully loaded top model, as it will still be viable for many years and OS versions. My 2019 Intel 16" MacBook Pro (64GB RAM and 8TB SSD) is still on macOS 12.6.2 for compatibility with vintage equipment and software.

I acquired a 2020 MacBook Air M1 (16GB RAM and 1TB SSD) to learn about the new M-series processors. It is now my beta test machine. I have a 2021 14" MacBook Pro (M1 Max, 64GB RAM, 32-core GPU, and 4TB SSD) as my travel machine. My 2013 Mac Pro retired in favor of a Mac Studio Ultra with 128GB RAM, 64-core GPU, and 8TB SSD.

Hope the Studio can last 7 or 8 years. But at age 77, it may outlast me.

One advantage of the soldered units is that there is no finger-pointing when one craps out. There are no non-Apple vendor parts misinstalled by the user. The **** hits Apple's fan and they have to fix it. No way to duck their responsibility.
 
VMware dropped IOMMU passthrough on *all* of their desktop versions; they really, really, really want you to use ESXi or vSphere for that kind of use case.

VMware wanting a larger portion of their user base to use a feature has very little to do with whether or not IOMMU pass-through is some significant technical and/or expensive barrier for Apple to implement. Apple's primary 'day job' is not to cater to VMware's customer base. They have a different customer base.

Apple has no second, 'high-end' hypervisor. They only have one, and it is installed on every single macOS-on-Apple-Silicon instance sold. If Apple charges $1 per system for the feature and sells 25M systems per year, that is $25M/yr to get this done. How Apple charges more money to their customer base has an easy answer: just tack $X.YZ onto every system sold. They don't have to 'herd' their customers into another product by taking a feature away; the 'herding' aspect is just buying the Mac product at all.
 
Apple will do it and their consumers will throw money at them for some ridiculous reason. You have to wonder about their IQ when they pay money for such gimped products. Makes zero sense.
I wonder about the IQ of people who can’t imagine the possibility that others have different needs, circumstances, and values.
 
The cause of my purchase of a Mac Studio Ultra with two VESA-mount Studio Displays was the increasingly erratic behavior of my 2013 Mac Pro and the demise of one of its two 27" Thunderbolt screens. It is a giant leap in performance for me.
 
That's an extremely ridiculous statement. The sarcasm is pretty funny too. Just because you don't need to upgrade doesn't mean others don't. Apple typically overcharged for RAM, so many pro users would buy base RAM configs and buy RAM somewhere else, or later when it was cheaper. Needs for RAM change over time: you pay for the pro system you can afford at the time and save to add RAM later. Not everyone is rich like you. Graphics cards change almost yearly. Adding internal expansion cards used to be a thing as well -- networking, ports, audio, further internal drive connectors. It was all a thing at one time. I have G4 Quicksilvers still running as file servers; I repurposed them by adding more RAM, storage, and SATA cards. If you think it is cool to be locked into SoC configurations that are designed to make you buy $10,000-20,000 machines every 4 years, you have way too much money to burn. It is embarrassing to have to decide at purchase what exactly all your needs will be for the next 7-8 years. If you can't afford to buy it now, you are too poor? This company is just greed-driven now: glue it all together and force you to buy a new one every other year. All that e-waste. I don't fall for that anymore. In 35 years of using Macs I have never seen it this bad. Jobs left a money-hungry bozo in charge. There is no reason why machines can't last 10 serviceable years and then be repurposed.

There is so much wrong with your argument here I don't even know where to start.

Credibility goes out the door as soon as you mention e-waste. I'm so tired of this argument. There is absolutely no reason any Mac (or PC, for that matter) should wind up in landfill. Apple will take an Apple product of any age, and many other companies' products, and recycle it for you, for free. For recent enough gear you can trade it in for money, and they'll repurpose or recycle the old one. E-waste is a straw-clutching argument.

Next, you're trying to make comparisons between today's Mac Pro and 20-year-old Power Mac G4s??! OK, you're still using them as file servers? Serving what kinds of files to where? The newest macOS a G4 can run is Leopard; the security holes in that OS today, alone, are a major risk. The maintenance overhead on anything that old (hardware and software), to keep it (a) running and (b) compatible with anything else your network needs, must be problematic. What do you do when an Ethernet port or logic board fails? I.e., what does that cost in hardware repairs? And what does it cost in downtime?

All power to you if that's what works for you, but one of two things follows: whatever time you're spending on maintaining those older machines is either...
(a) saving you more money than you make doing whatever you do for a living with these machines, in which case a Mac Pro isn't going to make you any more money than a Mac mini would and it's not financially sensible for you to justify being in the market for a Mac Pro anyway. Or...
(b) costing you money, because if you spent that maintenance time doing the work that makes your living with these tools, you'd be better off earning that money and spending it on better hardware so you didn't have to maintain the old hardware (and software) at all.

A brand new $800 Mac mini would easily replace multiple Power Mac G4s (you can't put more than 2GB in a PMG4, and it won't run anything better than Leopard). Let's say you make $100 an hour. You can't tell me you're spending less than 8 hours per year maintaining those machines, or that a logic board or Ethernet port or anything else hasn't blown in all the time you've had them.

What you're doing is perfectly OK if that's what you enjoy, but comparing that description to the work of anyone even remotely in the market for a Mac Pro is the part that's ludicrous.

A Mac Pro is supposed to be a high-end business tool for people who generally make a lot of money using it: artists, filmmakers, sound engineers, developers, maybe lawyers and accountants, and any number of other professions where time is money and time spent waiting for a slow computer means money lost. When a company or self-employed individual could make $300 per hour from someone using a tool, and the tool slows them down so that they can only deliver $200 an hour's worth (and therefore only charge that much), then it's very affordable to buy a better machine. It's got nothing to do with being rich or poor. If the expensive machine doesn't save you enough time to pay for itself, what do you need it for? It's not worth it, and you should buy a cheaper computer instead.

And if your needs only warrant a cheaper computer then replacing it with another cheaper computer when its usefulness runs out is also affordable. It's simple economics.

I've got nothing against the idea that if a computer has upgradeable RAM then great, we can make use of that. But get your head out of your a... Tim Cook isn't soldering RAM to the inside of the SoC because he's a greedy bean counter. He's doing it because it makes the product better. Why should Apple cater to tinkerers at the cost of performance?

That $10K-$20K you mention is NOT $10K-$20K. Painting that picture is a deception. Upgrading a modern Mac is easy: Macs hold their value better than any other PCs, so you sell the old one and buy a new one. That exercise doesn't cost $10K; for a $10K computer it might cost $2-3K. Whatever it is, let's call it $X.
In comparison, perhaps upgrading RAM in an existing computer costs $Y. And the kind of RAM that goes in a $10K computer is not cheap.
The cost of non-upgradeable RAM is not $X. It's the difference: $X - $Y = $Z. And $Z is usually not very much, especially when for $Z you get a new processor, new storage, and all the other upgrades that also improve the performance of the machine, which will (a) make you more money because you get your work done faster (if you're a professional who relies on the tool to get your job done -- and if you're not, then you don't need a $10K computer in the first place), and (b) make the machine worth more when it comes time to sell it. Not to mention a new warranty that means the cost to repair anything that goes wrong is zero, and a newer computer is less likely to break down, meaning the cost of lost usage is also lower. The cost of downtime is not zero.
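To make that concrete with purely illustrative numbers (every figure below is hypothetical):

```swift
// $X: net cost of replacing the whole machine (new price minus resale of the old one)
let x = 10_000.0 - 7_500.0   // hypothetical resale value
// $Y: what a comparable slotted-RAM upgrade might have cost
let y = 1_500.0              // hypothetical
// $Z: the actual premium attributable to non-upgradeable RAM
let z = x - y
print("Z = $\(z)")           // $1,000 -- and the replacement also brings a new CPU, SSD, and warranty
```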

It's all well and good to want to upgrade the RAM in your Mac, but you people continually pushing this idea that Apple is greedy and insane because they don't let you do it any more (and trying to use arguments like the above to push that) ... that's what's ludicrous here.
 