
ZombiePhysicist

Suspended
May 22, 2014
2,884
2,794
No, they're not hobbyists, but just how big a pool do you think there is of people who do that for their work AND are apparently their own IT professional at the same time?

I.e., they don't matter to me, so it's OK they don't matter to Apple. It's very important that since I don't care about it, I want to make sure those who do care about it don't get what they want or need.

Another classic from oldies station trashcan.
 
  • Like
Reactions: DrEGPU

ZombiePhysicist

Suspended
May 22, 2014
2,884
2,794
It isn't meant to offend but to point out that complaints about how expensive Mac Pros have become are normal for those who use them to make bank. It is a business expense, not something you casually buy like a pair of AirPods.

I.e., since I didn't mean to offend you, you cannot be offended. It's not allowed... And that also shouldn't be offensive to you... And if it is, I can be offended at you taking offense, but not vice versa. 🙄
 

IconDRT

macrumors member
Aug 18, 2022
84
170
Seattle, WA
Apple should have just come out and said, "Hey, we are going in a new direction. AS won't scale like we'd hoped, and the solution we did come up with is cost-prohibitive to design/produce. We're sorry. As a result, there will be no more Mac Pro. But check out this Mac Studio Pro/Mac Content Pro with slots, baby!" Just be honest, Apple. Stop giving us the tree-climbing fish of workstations, dammit! ;)
 

ZombiePhysicist

Suspended
May 22, 2014
2,884
2,794
Apple should have just come out and said, "Hey, we are going in a new direction. AS won't scale like we'd hoped, and the solution we did come up with is cost-prohibitive to design/produce. We're sorry. As a result, there will be no more Mac Pro. But check out this Mac Studio Pro/Mac Content Pro with slots, baby!" Just be honest, Apple. Stop giving us the tree-climbing fish of workstations, dammit! ;)

They should have skipped the M2 for the Mac Pro completely and done it right, even if we had to wait for the M3. At best, they have confused their messaging. More likely the message is 'we no longer give a ...darn... about you pros... go pound sand'. Instead, they are basically ghosting the pros.
 
  • Like
Reactions: IconDRT

IconDRT

macrumors member
Aug 18, 2022
84
170
Seattle, WA
They should have skipped the M2 for the Mac Pro completely and done it right, even if we had to wait for the M3. At best, they have confused their messaging. More likely the message is 'we no longer give a ...darn... about you pros... go pound sand'. Instead, they are basically ghosting the pros.
That last sentence is the crux of the matter. Like you said elsewhere, maybe they come back with an M3/384+ GB ECC RAM/ray tracing/etc. in 2024-2025, but I don't have high hopes. And by then, how much smaller will the segment be as folks finally give up on Apple and buy PCs?
 
  • Like
Reactions: ZombiePhysicist

impulse462

macrumors 68020
Jun 3, 2009
2,097
2,878
I've worked at universities and at ESPN. I've never seen the end user of a machine that was purchased for them be allowed to upgrade it themselves.

You're saying that's common in the data science field? They both run the machines and are personally allowed to do hardware modifications?

Hopefully nothing that matters, given the massive security risk that posture allows….
Depends on your department.

I can speak for the computational imaging, AI research, medical imaging, computer architecture, robotics, and computational biology labs at my institution, where I know people (mostly grad students, sometimes postdocs) who personally maintain lab workstations and upgrade them modularly, especially with GPUs now.

I even know people in a tissue engineering/materials science lab (not a traditionally computational field) which has an HP workstation that gets upgraded with GPUs. Hilariously enough, they had a 6,1 and then ditched it because their COMSOL simulations became super slow after the machine languished for 8 years without an update. Their 5,1s are still in use with upgraded storage.
 

NT1440

macrumors Pentium
May 18, 2008
15,092
22,158
I.e., they don't matter to me, so it's OK they don't matter to Apple. It's very important that since I don't care about it, I want to make sure those who do care about it don't get what they want or need.

Another classic from oldies station trashcan.
No, I'm genuinely asking: what kind of business in the data sciences is running like a startup, where the person whose expertise is advanced computation…is also the guy sticking RAM and storage into their work machine?

I've honestly never worked at or seen anywhere that allows employees outside IT to open up devices.
 

ZombiePhysicist

Suspended
May 22, 2014
2,884
2,794
iFixit lied that you couldn't do DIY SSD upgrades; you can, it's just SUPER expensive.

It's also weird that people focus on Apple's wacko proprietary not-really-an-SSD storage solution and make statements as if that is the only way to get storage on a slotted Mac. Who cares. I won't buy their crud storage; I got the 256GB minimum storage and never use it.

Much of the 3rd-party storage you can get on a PCIe card will DECIMATE their lame-as-hell storage slots. Then again, on the M2, if you do that, you've just saturated all your slots. But still, it is upgradable that way, and probably better upgraded that way if you need true capacity and speed.
 

NT1440

macrumors Pentium
May 18, 2008
15,092
22,158
Depends on your department.

I can speak for the computational imaging, AI research, medical imaging, computer architecture, robotics, and computational biology labs at my institution, where I know people (mostly grad students, sometimes postdocs) who personally maintain lab workstations and upgrade them modularly, especially with GPUs now.

I even know people in a tissue engineering/materials science lab (not a traditionally computational field) which has an HP workstation that gets upgraded with GPUs. Hilariously enough, they had a 6,1 and then ditched it because their COMSOL simulations became super slow after the machine languished for 8 years without an update. Their 5,1s are still in use with upgraded storage.
Wow, that's really interesting. I kind of get it in a university/college kind of setup, but I can't imagine it works that way in an established company that has made it out of startup operations.

From an IT security/compliance perspective, that sounds like an automatic disqualification from many security certifications.
 
  • Like
Reactions: Longplays

ZombiePhysicist

Suspended
May 22, 2014
2,884
2,794
Wow, that's really interesting. I kind of get it in a university/college kind of setup, but I can't imagine it works that way in an established company that has made it out of startup operations.

Expand your imagination. This perverse presentation by Apple that the only 'real' workplaces are like Pixar, where white-gloved IT serfs tend to the 'gods', is not the reality of most of the big buyers.

Even at government sites, you'll find a dude with a red stapler keeping things going with duct tape and deftly applying minimal funds to upgrade what is needed most (and amazingly, when funding comes through for something that really needs it, it gets applied in a minimal way -- i.e., if a new graphics card or two will get you there, they won't buy a new system).
 

impulse462

macrumors 68020
Jun 3, 2009
2,097
2,878
Wow, that's really interesting. I kind of get it in a university/college kind of setup, but I can't imagine it works that way in an established company that has made it out of startup operations.

From an IT security/compliance perspective, that sounds like an automatic disqualification from many security certifications.
I can say that at least the majority of the impetus for these modular systems comes from grad students who want/need computational power and have specific needs in mind. We also have access to department-wide clusters (Linux, ofc) which require submitting jobs, which, as a student, annoys me when I'm doing basic experiments, tbh.

Also, it could be a function of my institution and department (in the college of engineering). We tend to like running our own things for our own purposes. I wonder what the math/physics/chemistry solution to this is.

You make a fair argument that for security companies at least, modular upgrading probably isn't allowed, though maybe not for lack of desire but more for liability purposes, as you mentioned. As for startups? I think we could meet 50/50 on that. I know a few startups where people do modular upgrades; however, I can't imagine they are the majority, especially those dealing with time-sensitive/critical workloads that affect money (rather than publishing timelines, heh).
 
  • Like
Reactions: NT1440

NT1440

macrumors Pentium
May 18, 2008
15,092
22,158
Expand your imagination. This perverse presentation by Apple that the only 'real' workplaces are like Pixar, where white-gloved IT serfs tend to the 'gods', is not the reality of most of the big buyers.

Even at government sites, you'll find a dude with a red stapler keeping things going with duct tape and deftly applying minimal funds to upgrade what is needed most (and amazingly, when funding comes through for something that really needs it, it gets applied in a minimal way -- i.e., if a new graphics card or two will get you there, they won't buy a new system).
Okay, you're just ranting at this point.

Imagination has nothing to do with the technical certifications you need to comply with as a business to do things like process credit cards. If you're dealing with health information (a major computational field), you have to meet a whole different set of criteria.

Circling back to the point here: what kind of actually established business has its data engineers (the guys actually using the hardware) also servicing the hardware? That practice alone would disqualify the business from a whole range of certifications needed to even receive/handle large data sets.

Where are these businesses that have their own regular workers acting as their IT departments?
 

impulse462

macrumors 68020
Jun 3, 2009
2,097
2,878
It isn't meant to offend but to point out that complaints about how expensive Mac Pros have become are normal for those who use them to make bank. It is a business expense, not something you casually buy like a pair of AirPods.

The Intel and AMD chips that could serve scientists/researchers/etc. are now absent.

They are not Apple's target market any longer. If they were, then Mac chips would be designed for their use case. Scientists/researchers/etc. benefited from Apple using x86, which shared the R&D cost across all PC users.

With Mac chips, ~90% of the R&D is driven by iPhone chip use cases, and the remaining ~10% targets the ~80% typical Mac use case.

Apple has very good competitive-analysis and performance/use-case dev teams. So they keep making designs that seem to hit most sweet spots.

When iPhone and Android ate into Canon/Nikon/Sony's digital camera market, those camera makers looked to YouTubers and other vloggers as their new growth market.

It did not make up for all their lost market share, but it helped soften the blow.

Same here: those editors likely buy Macs far more often than any budget-limited scientist/researcher.

I can easily see editors buying more than 75,000 Macs annually for YouTube alone.

The scientists/researchers/etc. are better served with Threadripper + Nvidia dGPU workstations. They can easily do piecemeal upgrades when budget permits.
I think you're underestimating the budget some research labs have, although I could be blinded by the fact that my institution and individual labs are swimming in money - getting a workstation is a drop in the bucket compared to overall lab expenses.

Agreed about the Threadripper + dGPU workstations, but only because those hardware components perform better, and it was Apple who decided to stop supporting them.

I mean, I agree with you that Apple has refused to support these cases and just caters to the A/V crowd. My real issue is that these aren't consumer-grade products; I don't think ROI on their R&D cost should be as big a concern here, especially when they are minting money with the other parts of their business. Taking functionality away is just stupid for a workstation-class product. But maybe that's how the company has changed under the BUSINESS GENIUS that is Tim Cook.
 
  • Like
Reactions: ZombiePhysicist

ZombiePhysicist

Suspended
May 22, 2014
2,884
2,794
Okay, you're just ranting at this point.

Imagination has nothing to do with the technical certifications you need to comply with as a business to do things like process credit cards. If you're dealing with health information (a major computational field), you have to meet a whole different set of criteria.

Circling back to the point here: what kind of actually established business has its data engineers (the guys actually using the hardware) also servicing the hardware? That practice alone would disqualify the business from a whole range of certifications needed to even receive/handle large data sets.

Where are these businesses that have their own regular workers acting as their IT departments?

Dude, honestly, you manage to be so wrong, yet so arrogant. You are in some 'check the box' world that, thankfully, is not the majority of the world. Startups do not work that way. Neither do companies where people need to work fast and stay on the bleeding edge. Jeez, even high-end ad agencies do not work that way. There are people who need to get things done, and they get them done, and not by committee.

Also, at this point, you are dismissed, because you honestly are incapable of seeing how insulting you are.
 

NT1440

macrumors Pentium
May 18, 2008
15,092
22,158
Dude, honestly, you manage to be so wrong, yet so arrogant. You are in some 'check the box' world that, thankfully, is not the majority of the world. Startups do not work that way. Neither do companies where people need to work fast and stay on the bleeding edge. Jeez, even high-end ad agencies do not work that way. There are people who need to get things done, and they get them done, and not by committee.

Also, at this point, you are dismissed, because you honestly are incapable of seeing how insulting you are.
I stated several times that this only works in a startup environment, and depending on what they're doing, they could be exposing themselves to some pretty serious security/legal risks.

I'm not sure why you're angry about how established businesses and government offices run. Compliance is a thing in the world of IT; that shouldn't make you angry 🤷‍♂️
 

ZombiePhysicist

Suspended
May 22, 2014
2,884
2,794
I stated several times that this only works in a startup environment, and depending on what they're doing, they could be exposing themselves to some pretty serious security/legal risks.

I'm not sure why you're angry about how established businesses and government offices run. Compliance is a thing in the world of IT; that shouldn't make you angry 🤷‍♂️
First, telling me I'm 'ranting' is a surefire way to make friends and influence people. Again, you just do not see how you insult people.

Also, the ad industry is not 'only a startup environment.'

Furthermore, you're telling me stuff I have experience with does not exist. You apparently know better than what I've experienced at government installations, ad agencies, even legal organizations.

It's funny: when you're the tip of the spear somewhere, it just works differently. Why does that sound so familiar...
 

NT1440

macrumors Pentium
May 18, 2008
15,092
22,158
Honestly, I have no idea how a conversation about upgradability use cases in enterprise turned into all that, but whatever.

Anywho, I think in the long term we're on a cost curve that's going to make the SoC approach with on-package RAM the *leader* in performance. We're not there yet, and in the meantime those whose current workflows *cannot* run without massive pools of RAM are in a bad spot.

5-ish years down the road, however, when RAM costs are low enough to allow for 2+ TB on-die, the performance of slotted RAM will look quaint in comparison.
 

JouniS

macrumors 6502a
Nov 22, 2020
638
399
5-ish years down the road, however, when RAM costs are low enough to allow for 2+ TB on-die, the performance of slotted RAM will look quaint in comparison.
I don't see that happening due to space constraints. There are about as many transistors in 16 GB of RAM as in the M2 Ultra, though the actual die area should be 2x to 3x smaller. A small processor dealing with (physically) large amounts of memory has been a fundamental feature of the dominant computer architecture since the 1940s.
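
Quick back-of-the-envelope on that claim, assuming one transistor per DRAM bit cell and Apple's published 134-billion-transistor figure for the M2 Ultra:

```python
# 16 GB of DRAM at one transistor per bit cell vs. the M2 Ultra SoC
dram_transistors = 16 * 8 * 2**30   # bits in 16 GB ≈ 1.37e11
m2_ultra_transistors = 134e9        # Apple's published figure
print(dram_transistors / m2_ultra_transistors)  # ≈ 1.03 -> roughly equal
```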

Also, Apple seems to be prioritizing cache performance over RAM performance. Once the size of the working set exceeds ~100 MB, memory latency is higher on my 2023 MBP than on my 2020 iMac. It would be easy to construct a semi-plausible workload where the i9-10910 is faster than the M2 Max.
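
The usual way to measure that is a dependent-load pointer chase. Here's a minimal sketch (Python + NumPy; the sizes and step count are arbitrary choices of mine, and the interpreter adds a constant per-step cost, so only the relative rise across working-set sizes is meaningful):

```python
# Pointer-chase latency probe: each load depends on the previous one,
# so hardware prefetchers can't hide the miss latency.
import time
import numpy as np

def ns_per_load(n_elems: int, steps: int = 1_000_000) -> float:
    # Build a single cycle visiting all n_elems slots in random order.
    order = np.random.permutation(n_elems)
    perm = np.empty(n_elems, dtype=np.int64)
    perm[order[:-1]] = order[1:]
    perm[order[-1]] = order[0]
    idx = 0
    t0 = time.perf_counter()
    for _ in range(steps):
        idx = perm[idx]          # dependent load
    return (time.perf_counter() - t0) / steps * 1e9

# 8 MB (cache-friendly), then 128 MB and 512 MB (past the ~100 MB point above):
for n in (1 << 20, 1 << 24, 1 << 26):
    print(f"{n * 8 / 2**20:5.0f} MB working set: {ns_per_load(n):6.1f} ns/load")
```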
 

DrEGPU

macrumors regular
Apr 17, 2020
192
82
Apple's very behind when it comes to ML. Any companies doing ML ain't gonna buy Mac Pros for that; they're gonna use Nvidia workstations, since the CUDA cores process faster.

That doesn't mean the pro Macs can't do ML. My MacBook Pro does. But at the enterprise level, it's not enough.

That's exactly why the M2 Extreme chip got cancelled. Why burn the R&D on a chip they're hardly gonna sell?

The downside is the Mac Pro is now left feeling unfinished, but tbh nowadays the Mac Studio is more so the new Mac Pro, and the "Mac Pro" is only there for those who need PCIe (though one can argue whether PCIe is really worth the extra $3,000 over the Studio).
It's funny you mention that, because Apple buys (or has bought in the past) Nvidia machines for its AI/ML/DL work. Somewhat ironic…
 

NT1440

macrumors Pentium
May 18, 2008
15,092
22,158
I don't see that happening due to space constraints. There are about as many transistors in 16 GB of RAM as in the M2 Ultra, though the actual die area should be 2x to 3x smaller. A small processor dealing with (physically) large amounts of memory has been a fundamental feature of the dominant computer architecture since the 1940s.

Also, Apple seems to be prioritizing cache performance over RAM performance. Once the size of the working set exceeds ~100 MB, memory latency is higher on my 2023 MBP than on my 2020 iMac. It would be easy to construct a semi-plausible workload where the i9-10910 is faster than the M2 Max.
I think I'm a little more optimistic about the trajectory of memory density; that could just be my take on it, though. I just think the things that are bleeding edge today will be much easier in 5 years, given we're just really starting to move into the 3D RAM die era.

 

NT1440

macrumors Pentium
May 18, 2008
15,092
22,158
It’s funny you mention that, because Apple buys (or has bought in the past) nvidia machines for their AI/ML/DL work. Somewhat ironic…
I'm curious about this. Can you provide a source? Is this for in-house work, or are they ordering a ton of the Hopper cards (I think that's what they're called) for data center work?
 

DrEGPU

macrumors regular
Apr 17, 2020
192
82
I’m curious about this. Can you provide a source? Is this for in house work or are they ordering a ton of the Hopper (I think that’s what it’s called) cards for data center work?
This was back in the DGX-1 days. I had a private meeting with an Nvidia rep and his boss about buying some DGXs and casually asked about the Apple vs. Nvidia nonsense. He passed it off as just business and mentioned Apple buying tons of Nvidia hardware. If anyone has the manpower and talent to make ROCm work for them, it's Apple, so maybe they don't use Nvidia anymore? I don't work for Nvidia, AMD, or Apple, btw, lol
 

ZombiePhysicist

Suspended
May 22, 2014
2,884
2,794
🙄
Honestly, I have no idea how a conversation about upgradability use cases in enterprise turned into all that, but whatever.

Anywho, I think in the long term we're on a cost curve that's going to make the SoC approach with on-package RAM the *leader* in performance. We're not there yet, and in the meantime those whose current workflows *cannot* run without massive pools of RAM are in a bad spot.

5-ish years down the road, however, when RAM costs are low enough to allow for 2+ TB on-die, the performance of slotted RAM will look quaint in comparison.

There's an old saying about "in the long run". Yeah, in the meantime, who cares what other people need.
 

impulse462

macrumors 68020
Jun 3, 2009
2,097
2,878
I’m curious about this. Can you provide a source? Is this for in house work or are they ordering a ton of the Hopper (I think that’s what it’s called) cards for data center work?
If you're not in the field, it definitely seems weird and surprising, but as an AI researcher myself (still a grad student, technically) using Linux + Nvidia workstations, it's truly the only way to get published and deployable results. They also used to use Lambda Labs workstations.

If you're familiar with the developer tools Apple releases, they have something called CoreML. You could implement deep learning models using it, but they (Apple) know no one will, since PyTorch/TensorFlow is the standard and those are almost exclusively used on Linux + Nvidia workstations. CoreML allows loading ONNX models; ONNX is an open-source format that lets you convert PyTorch/TensorFlow models, which can then be loaded into CoreML. That's how, I'd imagine, the majority of DL models end up running on Apple devices, from what I've read; of course, I don't have any internal stats to back this up, obviously.
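
For the curious, the round trip described above looks roughly like this. A minimal sketch, assuming torch and coremltools are installed; SmallNet and the file names are made up for illustration, and note the ONNX-to-CoreML step historically went through the separate onnx-coreml converter (since deprecated), while current coremltools converts traced PyTorch directly:

```python
# Sketch of the PyTorch -> ONNX / PyTorch -> Core ML paths described above.
# SmallNet and all file names are illustrative, not any standard example.
import torch
import torch.nn as nn
import coremltools as ct

class SmallNet(nn.Module):  # stand-in for a real trained model
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 4))

    def forward(self, x):
        return self.net(x)

model = SmallNet().eval()
dummy = torch.randn(1, 16)

# Export to the ONNX interchange format (usable by many runtimes).
torch.onnx.export(model, dummy, "smallnet.onnx",
                  input_names=["input"], output_names=["logits"])

# The old route converted that ONNX file to .mlmodel with the (now-deprecated)
# onnx-coreml package. Today coremltools converts traced PyTorch directly:
traced = torch.jit.trace(model, dummy)
mlmodel = ct.convert(traced,
                     inputs=[ct.TensorType(shape=dummy.shape)],
                     convert_to="mlprogram")
mlmodel.save("SmallNet.mlpackage")  # loadable from Xcode / the Core ML APIs
```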
 