As a Mac fan, it's ALWAYS been my dream to own a real, brand-new Mac Pro in the generation it was released in...

However, I could never justify the cost, nor did I have the privilege of a business IT account to order one...

Can Apple make this dream a reality for me, and I'm sure a number of other Mac fans, by finally controlling costs with its own silicon? We will see...
 
I was wondering if anyone would bring up Transmeta. :)
I interviewed there once. I walked out after the first interview, which was with a guy who had previously worked at MicroUnity. He had done bipolar design there, and got into an argument with me where he insisted it was better to do level shifting at the input of gates instead of the outputs. I tried explaining to him that in real circuits you have to drive interconnect impedance, and so the emitter followers for level shifting could be made into nice little amplifiers, so it was more efficient to do it there. And I pointed out that might be one reason that Exponential and RPI actually taped out working CML CPUs, and MicroUnity hadn't gotten anything much working by then. He was being an ass about it and yelling at me, so I figured that must be what the culture was like and I should play along.

I told the next person on the schedule that I had heard enough and it wasn't going to work out. He marched me in to meet Linus, I guess the theory being that I would be overcome by his star power or evil charisma or something.

I ended up working at Sun instead, on UltraSparc V.
 
The scores are what matter. It's 5.9% faster, which is not a big difference, and it costs $4K instead of $3K for the Xeon. That makes the AMD chip not worth it. Usually AMD's offerings are budget-friendly because they're also considered the second-tier brand, so that's odd; their mid-range chips are a better value than Intel's, IIRC.
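Just to put those numbers together (a quick sketch using only the figures quoted above; the absolute benchmark scores are placeholders, only the ratios matter):

```python
# Rough performance-per-dollar comparison using the figures quoted above:
# the AMD part scores ~5.9% higher and costs $4K vs. $3K for the Xeon.
# The absolute scores below are placeholders; only the ratios matter.
xeon_score, xeon_price = 100.0, 3000.0
amd_score, amd_price = 105.9, 4000.0

xeon_perf_per_dollar = xeon_score / xeon_price   # ~0.0333 points/$
amd_perf_per_dollar = amd_score / amd_price      # ~0.0265 points/$

print(f"AMD delivers {amd_perf_per_dollar / xeon_perf_per_dollar:.0%} "
      f"of the Xeon's performance per dollar")   # roughly 79%
```

At those prices the AMD part works out to roughly 21% worse performance per dollar, which is the point being made.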

And more cores are a bad thing here, not a good thing. First off, a single fast core can do the job of two half-speed cores at least as well, except for minor context-switching overhead. More importantly, these benchmarks run one independent thread per logical core, but in many real applications there is communication across threads that comes at a cost. To put it another way, if your application really is that parallelizable, you're better off sending it to a cluster of small machines that will beat both of these in performance/$ easily.
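To make the cores-versus-clock point concrete, here is a minimal Amdahl's-law-style sketch; the 10% serial/communication fraction is an illustrative assumption, not a measured figure:

```python
# Amdahl's-law-style sketch: speedup from N cores when some fraction of the
# work is serial or spent communicating between threads. The 10% serial
# fraction below is an illustrative assumption, not a measurement.
def speedup(cores: int, serial_fraction: float) -> float:
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

serial = 0.10
for cores in (1, 2, 4, 16, 64):
    print(f"{cores:>3} cores -> {speedup(cores, serial):.2f}x")
# 2 cores -> 1.82x, 4 -> 3.08x, 16 -> 6.40x, 64 -> 8.77x
```

With any serial work in the mix, two half-speed cores fall short of one full-speed core (they would need a full 2x scaling to match, but only get ~1.82x), and adding more cores gives rapidly diminishing returns.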

You realize single-core performance now belongs to AMD, right? The AMD EPYCs coming out, followed by Threadripper, never mind the Ryzen 9s, are all faster and more scalable than anything else on the market. And on the X570 platform, Infinity Fabric has unified memory between the CPU and the RX 6000 GPGPUs. Sorry, but Apple should have moved to AMD two years ago and entered the market with ARM two years from now, if they still saw it as worthwhile.

ARM with Nvidia is now in a precarious legal battle. Sit back and enjoy the next 24 months: Zen 3/RDNA 2.0/CDNA 1.0 on an advanced 7nm node in Nov 2020, Zen 4/RDNA 3.0/CDNA 2.0 on an advanced 5nm node in Nov 2021. Zen 4 or Zen 5 will introduce designs with FPGAs/DSPs and more from the Xilinx merger that give AMD a huge leg up.

Apple didn't forecast AMD merging with Xilinx. As an Apple alum, I'll wait to buy into ARM in three years.
 
Ah, they can finally bring back the trash can Mac Pro, since they will control both the CPU and GPU...
The sad reality is that the design itself is not bad, but Apple's draconian control BS ended up affecting their own customers, since the GPU and SSD used proprietary connectors and Apple never offered any replacements.
They could use the same design, but with Apple silicon, updated GPUs, upgradable parts, and a reasonable price (yeah, I know, won't happen), call it simply the Mac, and you've got a stew going. 😁
 
The current Mac Pro, aside from the price, is perfect. Why mess with it? It delivers exactly what the pros want.



Let's put it this way. When Apple was primarily a computer company and released the Mac Pro in 2006, it had a market cap of $54 billion. Fast forward 14 years and they are now hovering around $2 trillion, depending on the month, and it has nothing to do with their pro workstations. The Mac Pro is such peanuts for the company that they treat it as more of a halo product: it's good for brand image, like the headlines the 2019 Mac Pro made. Think about it... they spent 25 minutes demoing a computer workstation to an audience back in 2019 where 99% of the people had no use for it. It created headlines on Forbes like "check out Apple's new $50,000 computer!"

Apple wants to do another demo for the audience sometime in the future to create headlines using the Mac Pro, and just swapping in new chips might not be what they want, even though that's what the pros may want.
 
Yeah! Like a 3-year lifespan is a joke.
As I showed with the PowerPC to Intel transition, 5 (five) years is more likely.
Why different interfaces? Options.
If the options were comparable, that would make sense, but I don't see it.
SATA SSDs are cheap and fast enough for a heap of uses
SATA hard drives are huge
Uh, exactly what part of "USB 3.1 Type-C is faster than SATA III" did you not understand? Unless something has changed since Dec 2018, there isn't a SATA IV. So you want the option of using something slower?! How does that make a lick of sense?
PCIe x4 can do various PCIe cards like network adapters, wifi cards fiber channel, etc.
Why would you, in this day and age, need a network adapter? And what in the world is a "wifi cards fiber channel," or did you forget a comma?

Wi-Fi is built into the Mac (I'm using it right now on this old 2013 Mac), and as I pointed out a long time ago, if you aren't close enough to the digital switch-box you aren't going to be able to get fiber to your house and are stuck with old-style cable internet. (This was my problem in Las Cruces, NM, and back then you paid, even for cable, $100/month on average once you got past the teaser period.)
A PCIe x16 slot - do I even need to explain upgradable video?
What part of "Well there is the eGPU option" did you not understand? Yes, PCIe 4.0 has 16 GT/s but what insanely sized at what gonzoly high resolution of a monitor are you using that needs that?! And if you have the money go for something that is clearly going to be insanely expensive why not get a Mac Pro to begin with?
Not all Mac users are video editors and some want higher spec options (than a Mac mini without a display) not geared entirely around Hollywood
USB-C 3.1 isn't "geared entirely around Hollywood," and I have no idea where that is even coming from.
 
No, three years is the outside for replacement in most studios where I have consulted. Sometimes systems are replaced every other year, but never less often than once every three years. After three years there is enough performance improvement to justify new systems. The cost of the professional’s time exceeds the cost of the hardware, and being able to increase their performance easily makes it worth it.
Never mind; as I have shown via the PowerPC to Intel transition, Apple supports old, no-longer-made hardware in OS/software updates for FIVE years. As I said before, five years (i.e. half a decade) is an eternity in the computer world, especially in the professional market.

On the second part, that may not be true for much longer. The original version of Moore's Law (2x every year) is dead, and the revised version (2x every 2 years) is showing signs it may be dying as well. Somewhere, perhaps in the next decade, maybe two, we will have gotten as far as is possible and will have to be looking at long-term use.
 
You realize single-core performance now belongs to AMD, right? The AMD EPYCs coming out, followed by Threadripper, never mind the Ryzen 9s, are all faster and more scalable than anything else on the market. And on the X570 platform, Infinity Fabric has unified memory between the CPU and the RX 6000 GPGPUs. Sorry, but Apple should have moved to AMD two years ago and entered the market with ARM two years from now, if they still saw it as worthwhile.

ARM with Nvidia is now in a precarious legal battle. Sit back and enjoy the next 24 months: Zen 3/RDNA 2.0/CDNA 1.0 on an advanced 7nm node in Nov 2020, Zen 4/RDNA 3.0/CDNA 2.0 on an advanced 5nm node in Nov 2021. Zen 4 or Zen 5 will introduce designs with FPGAs/DSPs and more from the Xilinx merger that give AMD a huge leg up.

Apple didn't forecast AMD merging with Xilinx. As an Apple alum, I'll wait to buy into ARM in three years.
The thread was about server/workstation chips in Mac Pros today. Future gains in AMD chips, I can believe. You know more than I do about what AMD has up its sleeves, but do you know how that compares to what Intel is doing in-house with FPGAs?
 
Never mind; as I have shown via the PowerPC to Intel transition, Apple supports old, no-longer-made hardware in OS/software updates for FIVE years. As I said before, five years (i.e. half a decade) is an eternity in the computer world, especially in the professional market.

On the second part, that may not be true for much longer. The original version of Moore's Law (2x every year) is dead, and the revised version (2x every 2 years) is showing signs it may be dying as well. Somewhere, perhaps in the next decade, maybe two, we will have gotten as far as is possible and will have to be looking at long-term use.
Since Moore’s law only requires that the number of transistors on a chip doubles every X years, I don’t see it dying anytime soon. (By the way, X has been 2 since 1975, so it’s not like it ”died.”)
 
Apple didn't forecast AMD merging with Xilinx. As an Apple alum, I'll wait to buy into ARM in three years.
The majority of Apple's business is mobile. From what I'm reading, this is part of AMD's play for data-center dominance, so it wouldn't have been a good long-term fit. Plus, just like Intel, AMD has a business need for their low-power mobile processors to perform more poorly than the ones they make for desktops. What Apple is doing isn't impossible for Intel or AMD to accomplish; it's just financially better for them that chips like the i3 lack features the i9 has.

The only way Apple can ensure a focus on the mobile business that matters most to them is with their own chips. So, while they couldn't have foreseen Xilinx, their "Pro" solution choice wouldn't have been altered even if they had known.
 
Uh, exactly what part of "USB 3.1 Type-C is faster than SATA III" did you not understand? Unless something has changed since Dec 2018, there isn't a SATA IV. So you want the option of using something slower?! How does that make a lick of sense?
I'm not saying get rid of USB type C.

USB type C is not an internal interface. I don't want drives hanging off the machine. I want them inside.

Yes, I do want the option of using something slower IN ADDITION to the external USB type Cs. Like I said elsewhere, wake me up when you can get affordable 20-40+ TB SSDs; until that point spinning disk has its place as secondary storage.

Also, the chipset has SATA lanes built in (whether you wire them up or not) and a limited number of M.2 lanes.

It doesn't matter so much that M.2 is faster if you can only add two of them; you have a whole heap of 600 MB/s SATA lanes sitting unused in the chipset that you could fill with additional SSDs or hard drives as you see fit.

600 MB/s is plenty fast for a lot of uses, and MANY uses see no performance difference between M.2 and SATA. But hey, 1-2 TB SATA SSDs are cheap and can act as tier-2 storage between M.2 and hard disk.
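For reference, here are the back-of-the-envelope numbers behind the SATA-vs-USB-vs-NVMe argument; these are the published link rates and encoding overheads, and real drives deliver somewhat less:

```python
# Back-of-the-envelope interface bandwidth comparison. Figures are the
# published link rates and encoding overheads; actual drive throughput
# will be somewhat lower.
interfaces = {
    # name                  (raw rate in Gb/s, encoding efficiency)
    "SATA III":             (6.0,   8 / 10),     # 8b/10b    -> ~600 MB/s
    "USB 3.1 Gen 2":        (10.0,  128 / 132),  # 128b/132b -> ~1.2 GB/s
    "PCIe 3.0 x4 (NVMe)":   (32.0,  128 / 130),  # 4 x 8 GT/s   -> ~3.9 GB/s
    "PCIe 4.0 x4 (NVMe)":   (64.0,  128 / 130),  # 4 x 16 GT/s  -> ~7.9 GB/s
    "PCIe 4.0 x16 (GPU)":   (256.0, 128 / 130),  # 16 x 16 GT/s -> ~31.5 GB/s
}

for name, (gbps, efficiency) in interfaces.items():
    mb_per_s = gbps * efficiency * 1000 / 8   # Gb/s -> MB/s after encoding
    print(f"{name:<20} ~{mb_per_s:,.0f} MB/s")
```

SATA III's ~600 MB/s is indeed the slowest link in the list, which is the other poster's point; the argument here is that it is still plenty for bulk, tier-2 storage.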

In short:
Your entire post smacks of "I don't need it, and I have no ability to see beyond that, so therefore no one could ever require it".

Newsflash: different users have different needs. Just because you don't do advanced network device/VM simulation work on your Mac (for example) does not mean nobody does. If you don't need or want those things - congratulations.

This isn't about YOU.

As it is, I have a desktop PC (in my sig) with those sorts of things in it (running Linux) because Apple does not offer a similar product. I'm not alone. It was around half the price of a base-model Mac Pro and offers superior performance. Sure, it can't be expanded as far, but I don't need to go that far.


edit:
and yes, I typed that post on my phone, and there should have been commas between wifi, network, fibre channel, etc. There are plenty of other PCIe x4 cards that someone with a hobby may want to use: additional M.2 SSD controllers, additional USB/Thunderbolt controllers, etc. The point is that you have OPTIONS that do not exist on a Mac mini, without throwing stupid money at things you don't need on a Mac Pro.

Anyone arguing this stuff with any credibility should know enough to figure that out, even if it had a typo.
 
I'll be pretty interested in this. I'm still using a 5.1 Mac Pro in my music studio. I need PCIe, so I'm hoping this new machine will accommodate that.
 
Since Moore’s law only requires that the number of transistors on a chip doubles every X years, I don’t see it dying anytime soon. (By the way, X has been 2 since 1975, so it’s not like it ”died.”)
From Moore's Law (investopedia):
Understanding Moore's Law
In 1965, Gordon E. Moore—co-founder of Intel (NASDAQ: INTC)—postulated that the number of transistors that can be packed into a given unit of space will double about every two years

Moore's Law's Impending End
Experts agree that computers should reach the physical limits of Moore's Law at some point in the 2020s. The high temperatures of transistors eventually would make it impossible to create smaller circuits. This is because cooling down the transistors takes more energy than the amount of energy that already passes through the transistors. In a 2007 interview, Moore himself admitted that "...the fact that materials are made of atoms is the fundamental limitation and it's not that far away...We're pushing up against some fairly fundamental limits so one of these days we're going to have to stop making things smaller."
==
The definition of Moore's Law provided above shows it is not just a doubling of transistors, but that the doubling happens within a given unit of space, i.e., the density of transistors doubles every two years.

The following definition agrees with that:
"More precisely, the law is an empirical observation that the density of semiconductor integrated circuits one can most economically manufacture doubles about every 2 years."

Wikipedia's article has cited references for the following:
"Moore posited a log-linear relationship between device complexity (higher circuit density at reduced cost) and time."
"Moore wrote only about the density of components, "a component being a transistor, resistor, diode or capacitor" at minimum cost."

And here is a line straight from Moore's own paper: "Such a density of components can be achieved by present optical techniques and does not require the more exotic techniques, such as electron beam operations, which are being studied to make even smaller structures."

Moore's Law is about the density of components doubling, originally every year, with the current version saying every two years.
 
I'm not saying get rid of USB type C.
Non sequitur. I honestly have no idea where this is coming from.
USB type C is not an internal interface. I don't want drives hanging off the machine. I want them inside.
Basically a form factor or a very highly portable situation, got it.
Your entire post smacks of "I don't need it, and I have no ability to see beyond that, so therefore no one could ever require it".
Try "I don't understand why someone would effectively cripple their throughput unless they had to". Yes you can use an USB 2 mini hub on a USB 3.1 chain but as in my case (don't trust the battery life of Magic keyboard and mouse) I had it laying around from 2.0 days and it make no sense to tie up two USB 3.1 ports if I didn't need to (the cheap PC keyboard has no USB through put - water resistant though).
Newsflash: different users have different needs. Just because you don't do advanced network device/VM simulation work on your Mac (for example) does not mean nobody does. If you don't need or want those things - congratulations.
You know what is funny? I have seen this exact same argument regarding Microsoft dropping 16-bit code support or Apple dropping 32-bit support. :)
This isn't about YOU.
Never said it was.
 
We have the Mac mini, the Mac Pro, and now this... could they call it just "the Mac"?
 
Mac Pro Cube...

[Attached image: mac pro shorty.jpg]
 
We have the Mac mini, the Mac Pro, and now this... could they call it just "the Mac"?
We need the different names to clarify what type of Mac it is. I still remember the "what were they thinking?" Performa line, with separate names for the different configurations?! Just who on earth thought that was a good idea?!
 
Try "I don't understand why someone would effectively cripple their throughput unless they had to". Yes you can use an USB 2 mini hub on a USB 3.1 chain but as in my case (don't trust the battery life of Magic keyboard and mouse) I had it laying around from 2.0 days and it make no sense to tie up two USB 3.1 ports if I didn't need to (the cheap PC keyboard has no USB through put - water resistant though).

Huh?

How is hooking up SATA ports to the channels that are already built into the chipset (yes, even on your Mac that has no SATA ports), IN ADDITION to the damn USB-C ports, crippling your system?

All that not hooking them up does is make the stuff already in your chipset unusable.

And you think we'd be crippling a machine by making use of the stuff included in it?
 
From Moore's Law (investopedia):
Understanding Moore's Law
In 1965, Gordon E. Moore—co-founder of Intel (NASDAQ: INTC)—postulated that the number of transistors that can be packed into a given unit of space will double about every two years

Moore's Law's Impending End
Experts agree that computers should reach the physical limits of Moore's Law at some point in the 2020s. The high temperatures of transistors eventually would make it impossible to create smaller circuits. This is because cooling down the transistors takes more energy than the amount of energy that already passes through the transistors. In a 2007 interview, Moore himself admitted that "...the fact that materials are made of atoms is the fundamental limitation and it's not that far away...We're pushing up against some fairly fundamental limits so one of these days we're going to have to stop making things smaller."
==
The definition of Moore's Law provided above shows it is not just a doubling of transistors, but that the doubling happens within a given unit of space, i.e., the density of transistors doubles every two years.

The following definition agrees with that:
"More precisely, the law is an empirical observation that the density of semiconductor integrated circuits one can most economically manufacture doubles about every 2 years."

Wikipedia's article has cited references for the following:
"Moore posited a log-linear relationship between device complexity (higher circuit density at reduced cost) and time."
"Moore wrote only about the density of components, "a component being a transistor, resistor, diode or capacitor" at minimum cost."

And here is a line straight from Moore's own paper: "Such a density of components can be achieved by present optical techniques and does not require the more exotic techniques, such as electron beam operations, which are being studied to make even smaller structures."

Moore's Law is about the density of components doubling, originally every year, with the current version saying every two years.
From Wikipedia:

Moore's law is the observation that the number of transistors in a dense integrated circuit (IC) doubles about every two years. Moore's law is an observation and projection of a historical trend. Rather than a law of physics, it is an empirical relationship linked to gains from experience in production.

The observation is named after Gordon Moore, the co-founder of Fairchild Semiconductor and CEO and co-founder of Intel, who in 1965 posited a doubling every year in the number of components per integrated circuit,[a] and projected this rate of growth would continue for at least another decade. In 1975, looking forward to the next decade, he revised the forecast to doubling every two years, a compound annual growth rate (CAGR) of 40%. While Moore did not use empirical evidence in forecasting that the historical trend would continue, his prediction held since 1975 and has since become known as a "law."
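The 40% figure is just the arithmetic of a two-year doubling period; a quick sanity check:

```python
# Doubling every 2 years implies an annual growth factor of 2**(1/2),
# i.e. a compound annual growth rate of about 41% (rounded to 40% above).
annual_growth = 2 ** (1 / 2) - 1
print(f"Doubling every 2 years ~= {annual_growth:.1%} per year")   # ~41.4%

# The original 1965 formulation (doubling every year) is simply 100%/year;
# at the revised 1975 rate, ten years gives 2**(10/2) = 32x the count.
print(f"After 10 years at 2x every 2 years: {2 ** (10 / 2):.0f}x")
```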

WTF is “investopedia” anyway?
 