Putting 128 ARM cores in a Mac Pro will let it run some kinds of software very fast, but not everything can be split into 128 parts that work in parallel.

I suspect that all of the Apple software, like Final Cut, will be rewritten to use 128+ threads, but third-party software won't be rewritten for the very small Mac Pro user base.

What Apple needs, if they are going to continue selling high-end computers, is an ARM core that is faster. Simply adding cores will not work in all cases.

On the other hand, Apple might just abandon the professional market and settle for consumers only. Why would they bother with a low-volume product? Would they abandon an entire segment? Yes: they abandoned Aperture and gave the entire pro photography market away to Adobe. They might do the same with the pro film editing market. Remember, Apple was "all in" and promoting Aperture until the last second, then switched. Apple will talk up the future of Final Cut until one day they don't.

Don't they already have that? A faster than anybody core?

What they need is more of them; core count is the only thing Intel has left to hold on to.

And that is surely coming along with the extra versatility (GPU, TB, etc...) the M1 currently lacks.

Software that is not parallelizable won't be any slower on a chip with lots of cores. On the contrary, it's possible such a chip will have 8 or 16 high-speed cores for single-threaded processes, and then 64 or 128 for applications that benefit from having many threads.

"Professional" workloads are precisely the ones you can just throw more cores at. Graphics, audio, and compilation are all highly parallelizable, and if you're supporting 8 cores, you're already supporting 128 cores. (It's glibly said that in computer science, there are only three numbers that matter—0, 1, and Infinity.) It's simple, user-facing, interface-related tasks that are hard to parallelize and which benefit most from having individually faster cores.

All this going on about 128 cores...

You all realize the rumors don't say a single thing about a 128-core CPU (APU); the 64-core and 128-core references were to Apple GPUs (for a future Apple Silicon-based Mac Pro)...!?!

And for anyone who "called it", I have been going on about Apple making another Cube for a couple of decades now, so...

28 Performance cores (just to thumb their nose at the 28 core Xeon in the 2019 Mac Pro)
4 Efficiency cores
32 GPU cores
16 Neural Engine cores
32GB RAM (128GB RAM maximum)
1TB NAND SSD (4TB maximum)
Four USB4/TB4 ports
Two USB-A ports
HDMI 2.1 port
Two Gigabit Ethernet ports (10Gb Ethernet option)
420W Platinum-rated PSU
US$2,499 (base model 32GB RAM / 1TB SSD / Gigabit Ethernet)
US$4,999 (max model 128GB RAM / 4TB SSD / 10Gb Ethernet)

Apple Low-Profile Mechanical Keyboard & Apple 3D Mouse
US$249 for the pair

Apple 40" 5K2K Ultrawide TB4 Monitor
US$1,749

And to close, a blast from the past:



G4 Cubes running the displays on the set for the NX-01 Enterprise...!
 
It's funny when you go back in time.
Everyone is excited about these new machines yes?
We don't know the future, but right now it's all amazing excitement... Yes :)

And yet, let's jump back to 2006 and look at the new machine then:

OMG how amazing that was, and how amazing the future will be. ;)

And yet, here we are, just 15 years later, and what do we think of all that now?

It should be a lesson to everyone that the future is very hard to predict. In another 15 years, Intel may be king, Samsung may be king, Apple may be a disaster.
Right now, that seems impossible to imagine, but 10 or 20 years from now?
 
This time it will work because of Apple Silicon. The Cube and the Trashcan were hampered by Intel and Motorola and heat.

And the drive to make the boxes completely closed. The trash can was innovative, but not in a way that really addressed any current issue. That it overheats, and that people trying to upgrade them run into problems, is not at all unexpected. It was a clever, ruthless stab at the heart of professional computing, a market that tends to require more flexibility than the trash can could ever supply. It should have been bigger, and it should have been engineered with a future in mind. Being a closed box wasn't a good idea, and the fact that they pretty much dropped it after it was birthed proves they realized they screwed up. Still, if anyone has one and wants to get rid of it cheap, let me know. I'll put it next to my older Mac Pro; they can keep each other company...

(They could have added a 'nuclear reactor style' cooling tower to the top of it. THAT would have been epic!)
 
It's funny when you go back in time.
Everyone is excited about these new machines yes?
We don't know the future, but right now it's all amazing excitement... Yes :)

And yet, let's jump back to 2006 and look at the new machine then:

OMG how amazing that was, and how amazing the future will be. ;)

And yet, here we are, just 15 years later, and what do we think of all that now?

It should be a lesson to everyone that the future is very hard to predict. In another 15 years, Intel may be king, Samsung may be king, Apple may be a disaster.
Right now, that seems impossible to imagine, but 10 or 20 years from now?

I remember the talk from several years (decades?) ago, that Microsoft was going to buy Apple. Apple was going to buy Sun. Apple was going to buy IBM, or the other side, IBM buying Apple. Lenovo buying Apple (I loved that one. Sure, yeah) I remember a client having conversations with IBM to enter into a contract to manufacture a cooling plate chassis and the IBM folks telling them that Apple was all but dead, and was going to declare bankruptcy and be broken up and sold for scrap. So much for that prediction, but it was interesting if even half of that was true. There's talking smack about competitors, and then just spreading crap thick and wide. *shrug*
 
I remember the talk from several years (decades?) ago, that Microsoft was going to buy Apple. Apple was going to buy Sun. Apple was going to buy IBM, or the other side, IBM buying Apple. Lenovo buying Apple (I loved that one. Sure, yeah) I remember a client having conversations with IBM to enter into a contract to manufacture a cooling plate chassis and the IBM folks telling them that Apple was all but dead, and was going to declare bankruptcy and be broken up and sold for scrap. So much for that prediction, but it was interesting if even half of that was true. There's talking smack about competitors, and then just spreading crap thick and wide. *shrug*
I have a feeling, and I may be wrong, that Apple are going to become more and more of a juggernaut over the next decade or two, and more and more controlling over what users can do and what they want users to pay for.

They are, one might say, pushing the limits even now, but still people are just about ok with it.
Arrogance and a feeling of invulnerability are easy traps for giant companies to fall into as the years go by.

Microsoft, I'd say, rose up, got too large and controlling, got knocked back, regrouped, and has come out better.
I'm feeling that Apple is perhaps on the same path, and will at some point get knocked down, either due to their controlling practices simply getting too much, or customers becoming interested in someone else. Perhaps someone we don't yet know of.
 
... or customers becoming interested in someone else. Perhaps someone we don't yet know of.

I got an Nvidia AGX Xavier dev kit. It cannot (yet?) boot from NVMe, but if it could, it would make for an excellent computer.
I like that you can set the power state (full, 30W, 15W, ...). It runs Ubuntu, and I even managed to run my GAE project on it without difficulty (using IntelliJ).
Compiling a custom AI (Qt/C++ based) was little more than pulling from git.

Or, the makers of the BeagleBoard announced the BeagleV, an SoC based on RISC-V.

Interesting times
 
Why would they keep intel over mac silicon?

edit: I was honestly asking. Thank you to those who answered without the snark. To those angry folks, maybe calm down a bit. lol.
There are two reasons. One: for people who need a _powerful_ Intel computer. An 8+4-core M1 would be able to run a Windows emulator well enough for low-end PC needs, so this only applies to people needing something powerful.

Two, Apple is building more powerful versions of the M1 chip one after the other. As long as an Intel chip is more powerful, Apple will sell it.
 
The Cube was an abysmal failure from day one. Hopefully they've learned their lesson
No. The Cube was released just at the start of an unforeseen financial crisis. It was slightly expensive, but everyone buying it was quite happy. Then suddenly money got tight. Any other time it would have been a reasonable success.
 
Apple could make a Mac mini Pro. Like the Pro, 2 handles on the top and 4 small feet. This 'raised' mini could house beefed-up specs with bumps on the underside.
 
This time it will work because of Apple Silicon. The Cube and the Trashcan were hampered by Intel and Motorola and heat.
... and the tiny cylinder case design that Jony Ive forced it all into. That was a big part of the problem too.
 
The Cube actually had a pretty cool industrial design. It was one of the first Mac designs to be fanless and cooled by convection. Unfortunately, it had other problems and didn't sell well. A revised version should be a whole different story.



I can imagine someone wanting to buy one last Intel powerhouse if they have a significant software purchase they want to keep running for as long as possible.
Other problems? The thermals WERE the problems. Just like with all of their compact computers. Beautifully designed for light, casual, or average usage, but terrible for hot & heavy power-user workloads. They tend to die after a year or two from thermal issues, if put to serious CPU & GPU workloads.
 
No. The Cube was released just at the start of an unforeseen financial crisis. It was slightly expensive, but everyone buying it was quite happy. Then suddenly money got tight. Any other time it would have been a reasonable success.
Also, it cracked over time. That didn't help sales either.
 
What lesson were they supposed to learn?
Thermals. They didn’t learn. Every MacBook Pro since, plus the 2013 Mac Pro, tell us this. Maybe they’ve finally seen the error of their ways and that’s why Jony Ive “got bored”... And the current Mac Pro is well made for thermals, but costs too damned much for non-corporate studios and plutocrats. I’d like it if they learned their lesson about thermals AND pricing.
 
Thermals. They didn’t learn. Every MacBook Pro since, plus the 2013 Mac Pro, tell us this. Maybe they’ve finally seen the error of their ways and that’s why Jony Ive “got bored”... And the current Mac Pro is well made for thermals, but costs too damned much for non-corporate studios and plutocrats. I’d like it if they learned their lesson about thermals AND pricing.
The M1 Air seems to have solved the thermals issue. I know it's not pro-y enough for many, but it's not like they are going to include real GPUs for much longer.
 
A Mac Pro that had both an Intel chip and Apple Silicon chip for full compatibility would be pretty interesting.

And expensive.
I’d love this, especially with option to run an NVidia GPU. Then I could buy ONE new computer, not two.

It’ll never happen. There’d be way too much engineering involved for what Apple would consider a temporary product, since their plan is to transition, not hybridize. They want to exit from Intel reliance, not perpetuate it.
 
Putting 128 ARM cores in a Mac Pro will let it run some kinds of software very fast, but not everything can be split into 128 parts that work in parallel.

I suspect that all of the Apple software, like Final Cut, will be rewritten to use 128+ threads, but third-party software won't be rewritten for the very small Mac Pro user base.

What Apple needs, if they are going to continue selling high-end computers, is an ARM core that is faster. Simply adding cores will not work in all cases.

On the other hand, Apple might just abandon the professional market and settle for consumers only. Why would they bother with a low-volume product? Would they abandon an entire segment? Yes: they abandoned Aperture and gave the entire pro photography market away to Adobe. They might do the same with the pro film editing market. Remember, Apple was "all in" and promoting Aperture until the last second, then switched. Apple will talk up the future of Final Cut until one day they don't.
Yup. Linear processing tends to not make use of multiple CPUs/cores. Logic is a great example. A single stream of audio cannot be split into threads.

Rendering pipelines can make good use of many cores, but Mac OS isn’t exactly a priority for 3D developers. Every cross-platform 3D modeling & rendering package I’ve used has been much less stable on Mac OS. The Lightwave 9.6.1 physics engine crashed endlessly on me on Mac OS, as did trials of Modo.

From what I read in forums, NewTek only had ONE programmer working on the Mac port of Lightwave. This was years ago, but it wouldn’t surprise me if it were still true.

Though, with all the crap Apple have added to Mac OS to accommodate their cloud services & iOS devices, we need more cores just for the OS itself... We miss you, Snow Leopard.
 
Sounds great, but it's gonna make it even more difficult now to sell my 2019 Mac Pro. Been trying since October. No one seems to want them, so I'm stuck with it. I need the cash, not a machine I can't use anymore (the pandemic destroyed my business).
Sorry to hear you’re in this situation. It’s hard to find a buyer for such an expensive machine. I assume you’ve offered it on forums where smaller FX studio workers hang out?
 
"Professional" workloads are precisely the ones you can just throw more cores at. Graphics, audio, and compilation are all highly parallelizable, and if you're supporting 8 cores, you're already supporting 128 cores. (It's glibly said that in computer science, there are only three numbers that matter—0, 1, and Infinity.) It's simple, user-facing, interface-related tasks that are hard to parallelize and which benefit most from having individually faster cores.
Audio isn’t very parallelizable. From what I last read (and this may be out of date), all the plugins on one track are forced onto the same thread. You can run many more tracks with more cores/CPUs, but there’s still an upper limit to how many tracks are manageable, or even sensible, in a project.
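That constraint can be sketched in a few lines of toy Python. The "plugins" here (gain, clip) and all the numbers are invented for illustration, not any real DAW's API; the point is only the shape of the problem: each track's plugin chain must run serially, because each plugin consumes the previous one's output, while independent tracks can be handed to separate workers:

```python
# Serial within a track, parallel across tracks.
from concurrent.futures import ThreadPoolExecutor

def gain(samples, factor=2.0):
    # Toy "plugin": scale every sample.
    return [s * factor for s in samples]

def clip(samples, limit=1.0):
    # Toy "plugin": hard-limit every sample to [-limit, +limit].
    return [max(-limit, min(limit, s)) for s in samples]

def process_track(samples, plugins):
    # Serial by necessity: plugin 2 needs plugin 1's output.
    for plugin in plugins:
        samples = plugin(samples)
    return samples

def process_project(tracks, plugins, num_workers=8):
    # Parallel across tracks: one independent chain per worker.
    with ThreadPoolExecutor(max_workers=num_workers) as pool:
        return list(pool.map(lambda t: process_track(t, plugins), tracks))

tracks = [[0.2, 0.9, -1.5], [0.1, -0.4, 0.6]]
mixed = process_project(tracks, [gain, clip])
assert mixed[0] == [0.4, 1.0, -1.0]  # gain doubles, then clip limits
```

So adding cores helps in proportion to the track count, but a single heavily-loaded track is still bounded by one chain on one thread, which is why per-core speed matters for audio.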
 
Oh? Are you doing GPU-accelerated 3D rendering farm work on M1 machines?
Was that a serious question? M1 Macs don't have discrete GPUs. So... no. I do all my 3D work on a machine that respects power. I use my Mac to read and write.
 
Rosetta 2 exists for a reason: it eases the transition period. Moreover, Apple will provide legacy support on Intel machines for years, as will software developers, but there comes a time when you must make the change. I did business-to-business sales for all the major studios, post-production houses, and photography studios in LA, and time and time again I saw them fall behind because they refused to adopt new technology. They would end up spending 2-3x as much when they finally upgraded than had they made smaller incremental updates as they became available. Also, Adobe and other major developers have already ported their software to work with, and take advantage of, Apple Silicon.
You’re not really this naive, right? As someone who lived through the transitions to Intel, and then to 64-bit-only, there’s a bunch of software I have to hope my older machines keep running indefinitely...

Rosetta 2 will probably be around for about as long as Rosetta 1 was (i.e., one major version), and there are developers who never ported to the new architecture in prior transitions, for various reasons; the same will happen this time too.
 