This would be true if you bought a $1-2K machine, not a $25K machine. I don't imagine it's financially sound for an institution like a school to replace five Mac Pros costing $25K each every 2-3 years. Those are treated more like industrial machines than like an iPhone.
I don't think schools are the target market for a maxed-out Mac Pro. Universities? Sure. But life is different there, and work happens at the per-research-project level. The average research project runs about 3 years (sometimes more, sometimes less). That's where the money comes from, and people are hired at that level as well (at least junior staff and PhD students). I just bought a bunch of larger Dell Precision workstations and Razer laptops (because they offer up to RTX5000 GPUs) for a new project in my research group, specifically for this project. At some point (after 3 years) these machines will end up somewhere, probably in some basement or a lab where students can work on them, and we'll buy new machines with the funding for the next project. It happens all the time.

Also, from an industry point of view: musicians/photographers/videographers are not buying Mac Pros because they have two jobs per month and need to get things done faster. They want to get things done faster so they can take more jobs. And that pays for itself. So let's say you make $1k per job and this allows you to take 5 additional jobs per month... do the math. $25k and even $50k isn't expensive for a workstation that is making you money. It is expensive, though, for someone who isn't making money with it, at which point I'd question whether that person actually needs such a computer.
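To make the "do the math" bit concrete, here's a minimal back-of-the-envelope sketch using the hypothetical figures above ($1k per job, 5 extra jobs per month, a $25k machine); the numbers are purely illustrative, not real billing data:

```python
# Rough payback sketch for a workstation purchase, using the made-up
# figures from the post above (hypothetical, not real billing data).

machine_cost = 25_000        # one-off workstation cost in dollars
revenue_per_job = 1_000      # what a single job pays
extra_jobs_per_month = 5     # extra jobs the faster machine lets you take

extra_monthly_revenue = revenue_per_job * extra_jobs_per_month
payback_months = machine_cost / extra_monthly_revenue

print(f"Extra revenue per month: ${extra_monthly_revenue:,}")
print(f"Machine pays for itself in {payback_months:.0f} months")
# -> Extra revenue per month: $5,000
# -> Machine pays for itself in 5 months
```

After the payback point, every additional job is profit attributable (at least in part) to the faster machine, which is the whole argument above.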
Given they no longer have to use Intel CPUs why on Earth are they not using AMD Threadripper / Epyc!?
Depends on what you do. AMD is still missing instructions and libraries. Take the Intel Math Kernel Library and try to get it to run well on AMD... good luck. Plenty more examples exist, but in the end it depends on what you do with it and therefore whether AMD is a suitable choice.
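If you want to check which BLAS backend your own numerical stack is actually using (and get a rough feel for its speed on a given CPU), a minimal NumPy sketch like the one below works; whether MKL, OpenBLAS, or something else shows up depends entirely on how the environment was built, and the same library can behave very differently on AMD vs. Intel:

```python
# Minimal check of which BLAS/LAPACK backend NumPy is linked against,
# plus a quick matrix-multiply timing. The backend you see (MKL,
# OpenBLAS, ...) depends on how your Python environment was built.
import time
import numpy as np

np.show_config()  # prints the BLAS/LAPACK libraries NumPy was built with

n = 4096
a = np.random.rand(n, n)
b = np.random.rand(n, n)

start = time.perf_counter()
c = a @ b
elapsed = time.perf_counter() - start

# A dense matmul is roughly 2*n^3 floating-point operations.
gflops = 2 * n**3 / elapsed / 1e9
print(f"{n}x{n} matmul: {elapsed:.2f} s, ~{gflops:.0f} GFLOP/s")
```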

Also, there's a massive problem with availability from AMD. They just cannot supply large numbers in reasonable timeframes, which matters for initial orders but also for replacements in case of failure. This of course doesn't affect the average home user, but when I have to order 10,000+ CPUs it can become a massive problem, given that many companies/institutions/datacenters around the world are doing the same. So for mission-critical systems, I'd stay away from AMD as far as possible.

Also, for Apple, x86 is a dead end. As soon as they have their own Apple Silicon SoCs, the x86 systems will be history. It makes little sense for them to add full AMD support to macOS. And no, despite the AMD Hackintoshes, not everything works on AMD machines.
 
What are you buying a $25K machine for if not to do $200K+ worth of work? Mac Pros make terrible day-to-day machines from both a cost and a feature standpoint. Almost every university I work with auctions off workstations after a few years. They have to keep that grant money flowing (use it or lose it), and they buy the latest and greatest to meet their evolving needs. K-12 is working on iMacs, MacBook Airs and iPads. They may have one Mac Pro in a media lab or something like that, and it is probably under-utilized.

Honestly I don't know. I can imagine a place like Pixar has no issue replacing all their computers every 2 years, but I imagine many people keep those machines around as an asset. I keep hearing that a lot of businesses and corporations run old hardware for a long time, so much so that some governments reportedly still use floppy disks.

That $25k machine will rapidly depreciate in value, though.

This makes it even more of a reason that you'd rather not sell it after 2-3 years and instead keep it for the longer run.

On paper, my iMac is supposed to be faster than the M1: it has a Radeon Pro 580, which has more raw computing power than the M1, again on paper. However, when I do photo editing on my iPad Pro (A12Z, not M1), the iPad Pro does the same task more than 3 times as fast as my iMac.

On paper, the iPad Pro with the A12Z is much less powerful than my iMac; in reality, it does photo editing faster (same app, same effects, same workflow). There's something inherently fast about the A-series processors that the Intel ones simply don't match, at least in photo editing tasks. So if the A12Z (2 years old) is already faster than my iMac (4 years old), I can't imagine how fast the M1 is at the same tasks. It'll be faster than any Intel laptop, even with an i9.

As Chucker explained, I too learned that not all processors are the same. Different hardware will work better for different types of use. For example, if you do number crunching, like predicting the weather, you will need a different CPU/GPU than if you want to play games with 4K graphics, and each will perform badly/slower if used for a purpose other than what it was intended for.
 
I don't think schools are the target market for a maxed-out Mac Pro. Universities? Sure. But life is different there, and work happens at the per-research-project level. The average research project runs about 3 years (sometimes more, sometimes less). That's where the money comes from, and people are hired at that level as well (at least junior staff and PhD students). I just bought a bunch of larger Dell Precision workstations and Razer laptops (because they offer up to RTX5000 GPUs) for a new project in my research group, specifically for this project. At some point (after 3 years) these machines will end up somewhere, probably in some basement or a lab where students can work on them, and we'll buy new machines with the funding for the next project. It happens all the time.

Also, from an industry point of view: musicians/photographers/videographers are not buying Mac Pros because they have two jobs per month and need to get things done faster. They want to get things done faster so they can take more jobs. And that pays for itself. So let's say you make $1k per job and this allows you to take 5 additional jobs per month... do the math. $25k and even $50k isn't expensive for a workstation that is making you money. It is expensive, though, for someone who isn't making money with it, at which point I'd question whether that person actually needs such a computer.

I see what you mean. I just find the "buy it then lose it" mantra of university research such a wasteful approach.
 
I see what you mean. I just find the "buy it then lose it" mantra of university research such a wasteful approach.
Really, 2-3 years is the max I will hold onto a computer for pro work. Normally it is 1 year.

As mentioned by others, we make more money by having faster and better machines, and the cost of a machine is paid for by 1 job out of many. We take on more clients, etc., by having better systems and processes that keep us super productive.
 
This makes it even more of a reason that you'd rather not sell it after 2-3 years and instead keep it for the longer run.

I don't know about that. If you configure the Mac Pro to $25k, you do so because you need a high level of performance ASAP, so odds are you'll want another high-end device soon after.
 
I don't know about that. If you configure the Mac Pro to $25k, you do so because you need a high level of performance ASAP, so odds are you'll want another high-end device soon after.
Some people don't get it.

However, I have worked in multiple professional offices where they don't get it either. They think they are saving money by not investing in the hardware, because it works. However, due to the slowness [which wouldn't cost much to fix], they let staff waste hours of time due to delays, lack of the right software, etc.

My partner works at such an office. I have seen her waste countless hours due to tech issues, and she is billed out at $350 an hour! A few hours here and there and that's a computer... totally insane.
 
Really, 2-3 years is the max I will hold onto a computer for pro work. Normally it is 1 year.

As mentioned by others, we make more money by having faster and better machines, and the cost of a machine is paid for by 1 job out of many. We take on more clients, etc., by having better systems and processes that keep us super productive.
I think 1 year is completely insane and wasteful. It's how we destroy our planet by overconsuming. I manage the equipment of a team of 25-30 people doing pro video work and none of the machines ever gets replaced in 1-2 years. 3 years only if there's a good reason. Desktops, we upgrade (RAM, SSD, GPU, possibly CPU). Laptops, we pass on to newbies so they get another 2-3 years.
 
I think 1 year is completely insane and wasteful. It's how we destroy our planet by overconsuming. I manage the equipment of a team of 25-30 people doing pro video work and none of the machines ever gets replaced in 1-2 years. 3 years only if there's a good reason. Desktops, we upgrade (RAM, SSD, GPU, possibly CPU). Laptops, we pass on to newbies so they get another 2-3 years.
Eh? We don’t exactly throw them in the bin, or leave them on the shelf, so I am unsure how it is wasteful.
 
I think this is a good thing; Intel Macs will get more years of support. And what I don't understand is why there's a new version of macOS every 2 years or so. Why doesn't Apple choose a version of macOS and stick to it for at least 6 years or so, and use that manpower to extend the life of the Intel version of the OS and, at the same time, of the new Apple Silicon one? Just my thoughts...
 
I see what you mean. I just find the "buy it then lose it" mantra of university research such a wasteful approach.
I understand what you're saying. It is an organizational problem of funding, which can be odd. For example, when I quickly need 4 weeks of processing power, I might not be allowed to use a cloud service with the funding I've received. Instead I'm required to buy an additional system for that 4 week task. In other cases, there's no problem with 4 weeks on AWS. 🤷‍♂️

In the past few years, additional problems have shown up. While a 1- or 2-year-old PC used to be completely outdated in terms of CPU, RAM, GPU and storage, that's not the case anymore. What's usually outdated is the GPU, while the rest will work for a couple more years. And that is the next problem: we usually buy these machines from Dell or Lenovo with a service contract that guarantees same-day repair in case something breaks down. This is needed for mission-critical hardware. While I could physically replace the outdated GPU, that doesn't work with Dell/Lenovo and the service contracts. So I really have no other choice but to buy a new machine when Nvidia brings some new feature to the next GPU generation. And no, sitting it out and not using these features is not an option. This is research, and further funding depends on who brings something to a conference that no one else has done before.

That might also be a contract issue, but with the amount we're buying, we're getting massive discounts. Easily over 60% off with Dell, but it depends on the hardware. Not too long ago I bought a system from Dell which they normally sell for just over $150k and ended up paying about $70k. So if being able to swap a GPU after 2 years means paying $150k instead of $70k, then no one is going to do that.

It's sort of like iOS development. You could use an iPhone X for development, but you have to use the latest model to be able to utilize all the features Apple is offering to consumers. So having new hardware is usually a must-have, but in the case of the Mac Pro and similar machines, it's more expensive than an iPhone.
 
I think this is a good thing; Intel Macs will get more years of support. And what I don't understand is why there's a new version of macOS every 2 years or so. Why doesn't Apple choose a version of macOS and stick to it for at least 6 years or so, and use that manpower to extend the life of the Intel version of the OS and, at the same time, of the new Apple Silicon one? Just my thoughts...
Make no mistake, you're not going to get years of support for the Intel Mac Pro. As soon as they have a suitable replacement, they're going to drop Intel support like a hot potato. 1 year max.

Well, technically speaking, they did that with macOS 10. Instead they went from minor version to minor version, and that's probably what they will do with macOS 11 as well. They still have to provide updates, and of course they're using them to provide new features in their ever-moving ecosystem. As far as Intel support goes, why support a dead system? They already know where to go and what to do, and Intel is not part of that future. From a business point of view, I'd do exactly the same.
 
It depends on the kinds of tasks.

The M1 is strong at:

  • single-threaded tasks, which are extremely common especially in UIs.
  • machine learning tasks, which may play a role in photo editing.

Your iMac may be stronger at:

  • heavily multi-threaded tasks, which, however, aren't that common
  • GPU-accelerated tasks (which are also heavily parallelized, but compared to multi-threaded CPU tasks tend to have lower precision)

It's much easier for Apple to scale those up (assuming the thermal room is there, which on a Mac Pro it certainly would be) than for other vendors to increase single-threaded speeds, so it stands to reason a high-end M1 or M2 would improve on them.
I would assume that there are a lot of GPU-accelerated tasks in photo editing, since the apps I use support Metal GPU acceleration. And even then, the A12Z beats the Radeon Pro 580. It's most probably about the design of the SoC itself, rather than the raw computing power.
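For anyone who wants to see the single-threaded vs. multi-threaded distinction from the quoted post on their own machine, here's a toy Python timing sketch; it only illustrates scaling behaviour across cores, and says nothing about M1 vs. Intel or about GPU acceleration specifically:

```python
# Toy illustration of single-threaded vs. multi-process scaling: the same
# total amount of CPU-bound work run serially, then spread over 4 worker
# processes. Results only show scaling on your own machine.
import time
from multiprocessing import Pool

def burn(n):
    """CPU-bound busy work."""
    total = 0
    for i in range(n):
        total += i * i
    return total

if __name__ == "__main__":
    chunks = [2_500_000] * 8   # eight equal chunks of work

    start = time.perf_counter()
    for chunk in chunks:       # one worker does everything
        burn(chunk)
    single = time.perf_counter() - start

    start = time.perf_counter()
    with Pool(processes=4) as pool:   # work spread over 4 processes
        pool.map(burn, chunks)
    multi = time.perf_counter() - start

    print(f"serial:      {single:.2f} s")
    print(f"4 processes: {multi:.2f} s")
```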
 
As Chucker explained, I too learned that not all processors are the same. Different hardware will work better for different types of use. For example, if you do number crunching, like predicting the weather, you will need a different CPU/GPU than if you want to play games with 4K graphics, and each will perform badly/slower if used for a purpose other than what it was intended for.
This is precisely it. And photo/video editing seems to be one of these SoC's strong suits.
 
This is the same conclusion I came to. It was self-defeating to keep using Apple hardware when there were so many better pro options available in PCs. I really do love using a Mac for work, however it was restricting my output.
Suffice to say my PC workstation has totally delivered. It's a pure tool to earn $$$.
Care to share the specs you threw together?
 
Really, 2-3 years is the max I will hold onto a computer for pro work. Normally it is 1 year.

As mentioned by others, we make more money by having faster and better machines, and the cost of a machine is paid for by 1 job out of many. We take on more clients, etc., by having better systems and processes that keep us super productive.

I meant, I kind of understand 3 years, but how much does technology advance in 1 year that it becomes necessary to replace your whole setup for the faster hardware, and is that improvement enough to justify the price of the repurchase?

I don't know about that. If you configure the Mac Pro to $25k, you do so because you need a high level of performance ASAP, so odds are you'll want another high-end device soon after.

I kind of had a different idea: you buy the top of the line to be future-proof. You cheap out if you have plans to replace it soon.

I understand what you're saying. It is an organizational problem of funding, which can be odd. For example, when I quickly need 4 weeks of processing power, I might not be allowed to use a cloud service with the funding I've received. Instead I'm required to buy an additional system for that 4 week task. In other cases, there's no problem with 4 weeks on AWS. 🤷‍♂️

In the past few years, additional problems have shown up. While a 1- or 2-year-old PC used to be completely outdated in terms of CPU, RAM, GPU and storage, that's not the case anymore. What's usually outdated is the GPU, while the rest will work for a couple more years. And that is the next problem: we usually buy these machines from Dell or Lenovo with a service contract that guarantees same-day repair in case something breaks down. This is needed for mission-critical hardware. While I could physically replace the outdated GPU, that doesn't work with Dell/Lenovo and the service contracts. So I really have no other choice but to buy a new machine when Nvidia brings some new feature to the next GPU generation. And no, sitting it out and not using these features is not an option. This is research, and further funding depends on who brings something to a conference that no one else has done before.

That might also be a contract issue, but with the amount we're buying, we're getting massive discounts. Easily over 60% off with Dell, but it depends on the hardware. Not too long ago I bought a system from Dell which they normally sell for just over $150k and ended up paying about $70k. So if being able to swap a GPU after 2 years means paying $150k instead of $70k, then no one is going to do that.

It's sort of like iOS development. You could use an iPhone X for development, but you have to use the latest model to be able to utilize all the features Apple is offering to consumers. So having new hardware is usually a must-have, but in the case of the Mac Pro and similar machines, it's more expensive than an iPhone.

May I ask who is funding this research? I've never heard of a source that is willing to replace your whole hardware setup yearly; most of the time corporations are stingy about buying new hardware. Even schools are opting for Chromebooks because they're cheaper than a cheap Windows machine.
 
Eh? We don’t exactly throw them in the bin, or leave them on the shelf, so I am unsure how it is wasteful.
Your "I normally hold a computer 1 year" was kinda misleading then. Not holding = kinda equal to getting rid of. Even if you move them down the ranks, either your company is growing fast and there's always new people to catch the computers coming down, or they're getting dumped at a high rate at the bottom anyway. 5 years is very normal for even video equipment. We amortise all A/V equipment on 5 years, including video editing computers. Laptops, 3, because you can't upgrade them.
 
Surely Apple has hardware designers & strategists competent enough to wait until at *least* Sapphire Rapids before upgrading the Intel Mac Pro?

Why in the world would they release a new one without DDR5, HBM2e caching, & PCIe5?
 
For starters, because a Sapphire Rapids-based Xeon W won’t ship before 2023, if even that.
 
I meant, I kind of understand 3 years, but how much does technology advance in 1 year that it becomes necessary to replace your whole setup for the faster hardware, and is that improvement enough to justify the price of the repurchase?
Technology advances with hardware generations. So if Nvidia or anyone else brings new features to the game, then it might be worth a replacement. Probably not for everyone, but for some.
I kind of had a different idea: you buy the top of the line to be future-proof. You cheap out if you have plans to replace it soon.
Maybe. Or you buy the top of the line to have the most processing power available to you. And when something new comes along that is faster, you replace it because you need the fastest. I'd buy the Nvidia GPU that will come out in 2030 right now, because I could utilize it today. The problem is, I can't, because it's not out yet. So the fastest hardware out today is a compromise. I've seen cases where a few hours make all the difference. When research groups compete, it's a race to the goal line. Those who are first get everything: they present a paper at a conference, get the citation count, new funding, etc. Those that come in 3 hours late due to lack of processing power get nothing.
May I ask who is funding this research? I've never heard of a source that is willing to replace your whole hardware setup yearly; most of the time corporations are stingy about buying new hardware. Even schools are opting for Chromebooks because they're cheaper than a cheap Windows machine.
Depends; it could be industry funded, or come from governments or other organisations that manage research money from various sources. The money for my current projects comes from a much larger research fund of about $4B per year, which is spread across 50 research institutes and many universities and organisations around the world. Other funding might come out of local (district-level) funds. Those managing the funds usually make packages available that you can apply for, grouped by task (say, real-time rendering of water for graphics, radar systems for monitoring climate change, AI for self-driving cars, etc.). How much money is in such a package depends on how many people are involved and at what level (university only, multiple universities, industry partners, same country, all around the world, etc.). When applying, you have to provide a full project plan: what you're going to do, what the expected results are, and so on. Then you might get the money. At that point it's unclear whether your research group already has the required resources or not; that's why they usually provide the money. Any money that isn't used goes back. And let's be honest, who would not use "free money" when available?
 
This rumor flies in direct contravention of Tim saying Apple would be completely transitioned to AS by 2022.

Of course the statement leaves a lot open to interpretation, but it is definitely an interesting development, given that no other Intel-based hardware has been released so far.

Next year is going to be really really interesting for Apple, product-wise.

How is this in contravention? This rumor is that Apple will offer an updated Mac Pro with a newer Intel Xeon chip in 2022. It doesn't say that Apple won't also release the first Mac Pro featuring Apple Silicon in 2022. They can do both a spec bump and a new machine at the same time.
 
They said it was a two year transition. Two years should mean WWDC of 2022. This contradicts that.
The 2-year clock MAY have started in June at WWDC, or it may have started in November when the first M1s were released. Only Apple knows for sure. In either case, how does this rumor contradict that? Does this say that Apple isn't also working on a new Apple Silicon Mac Pro while they also do a spec bump on the older system?
 
Technology advances with hardware generations. So if Nvidia or anyone else brings new features to the game, then it might be worth a replacement. Probably not for everyone, but for some.

Maybe. Or you buy the top of the line to have the most processing power available to you. And when something new comes along that is faster, you replace it because you need the fastest. I'd buy the Nvidia GPU that will come out in 2030 right now, because I could utilize it today. The problem is, I can't, because it's not out yet. So the fastest hardware out today is a compromise. I've seen cases where a few hours make all the difference. When research groups compete, it's a race to the goal line. Those who are first get everything: they present a paper at a conference, get the citation count, new funding, etc. Those that come in 3 hours late due to lack of processing power get nothing.

Depends; it could be industry funded, or come from governments or other organisations that manage research money from various sources. The money for my current projects comes from a much larger research fund of about $4B per year, which is spread across 50 research institutes and many universities and organisations around the world. Other funding might come out of local (district-level) funds. Those managing the funds usually make packages available that you can apply for, grouped by task (say, real-time rendering of water for graphics, radar systems for monitoring climate change, AI for self-driving cars, etc.). How much money is in such a package depends on how many people are involved and at what level (university only, multiple universities, industry partners, same country, all around the world, etc.). When applying, you have to provide a full project plan: what you're going to do, what the expected results are, and so on. Then you might get the money. At that point it's unclear whether your research group already has the required resources or not; that's why they usually provide the money. Any money that isn't used goes back. And let's be honest, who would not use "free money" when available?

Sorry for going off topic, but I'm genuinely curious.
Do corporations and governments pay universities to do their research for them? I always imagined the research was done in-house? I didn't imagine Apple would pay Brown University to come up with something new?!

Who is doing the research at universities? The students? Who would trust that?

Do universities use the funding to do research for prestige, or do they actually try to profit from it to make money back for the university?
 
Who is doing the research at universities? The students? Who would trust that?
I know I would; they're on the cutting edge, with professors who are also on the cutting edge of what can be done. The students participating aren't your typical education, or even fine arts, people.

Do universities use the funding to do research for prestige, or do they actually try to profit from it to make money back for the university?
Both, and they receive both.

Do corporations and governments pay universities to do their research for them? I always imagined the research was done in-house? I didn't imagine Apple would pay Brown University to come up with something new?!
For commercial purposes, I doubt it, but there are a lot of organizations that give grants for certain types of research!
 