Again, Mike, you keep saying "down from", but you're comparing them to more powerful models that are still available to buy!
But here's the rub: how long will Apple keep writing versions of macOS that support Intel? Will they drop it in a year? Two? I've never spec'd lower than an upper-mid-tier build, and as a result I've usually ended up with builds that exceed $2k. I'm not willing to spend that much money on a box that won't be upgradable after a couple of years. My builds typically last 5-7 years before obsolescence. If I bought the Mac mini that I feel meets my current needs, it would cost a little over $2k with an eGPU. Now I have no idea how long it would be supported.
 
They do seem to be supported, as noted on the Blackmagic eGPU page under compatibility, so there's hope 🤞

[screenshot of the Blackmagic eGPU compatibility page]

I just went to your link, and the "M1" reference must have been removed.
Don't get me wrong: I have an eGPU and hope they would support it!
I don't try to be controversial, but the written evidence out there and the comment from Apple make it sound very unlikely. Don't you think so?
 
I'm well aware of that, and you have completely missed my point. It's called marketing: educate yourself.
Yes, obviously. If you haven't gotten over the fact that Apple has been labeling things "Pro" that aren't for 11+ years, that's on you.
 
But here's the rub: how long will Apple keep writing versions of macOS that support Intel? Will they drop it in a year? Two? I've never spec'd lower than an upper-mid-tier build, and as a result I've usually ended up with builds that exceed $2k. I'm not willing to spend that much money on a box that won't be upgradable after a couple of years. My builds typically last 5-7 years before obsolescence. If I bought the Mac mini that I feel meets my current needs, it would cost a little over $2k with an eGPU. Now I have no idea how long it would be supported.

Apple should offer a definitive support lifecycle so we can know for certain.

I am convinced they will be supporting Intel for 5-10 years at least, if only to keep from losing professionals like crazy...

For instance, they showed off how fast their ML processing is, but that was compared to Intel integrated graphics, when anyone seriously doing ML is using AMD or Nvidia for it. Whether, or for how long, AMD or Nvidia will support ARM Macs is unknown. This is just one example...
 
Intel UHD Graphics 630 in Mac Mini 2018: 3.150 GPixel/s, 25.20 GTexel/s, FP32 403.2 GFLOPS

M1 Mac Mini 2020: 41 GPixel/s, 82 GTexel/s, FP32 2.6 TFLOPS

A12Z: FP32 1.1 TFLOPS
Radeon Pro 560X: FP32 2.056 TFLOPS
Radeon Pro 5300M: FP32 3.2 TFLOPS
Radeon Pro 5300: FP32 4.2 TFLOPS
Radeon Pro 580X: FP32 5.530 TFLOPS
Yep. The UHD 630 was so weak that even an M1 with 6x faster graphics can't match an average graphics card.
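For anyone who wants to sanity-check that 6x figure, here's the quick arithmetic on the FP32 numbers quoted above (published peak ratings, not benchmarks):

```python
# FP32 throughput ratios from the figures quoted above (GFLOPS).
uhd630 = 403.2        # Intel UHD 630, Mac mini 2018
m1 = 2600.0           # Apple M1, Mac mini 2020
radeon_5300 = 4200.0  # Radeon Pro 5300, a midrange discrete card

print(f"M1 vs UHD 630:         {m1 / uhd630:.1f}x")       # ~6.4x
print(f"Radeon Pro 5300 vs M1: {radeon_5300 / m1:.1f}x")  # ~1.6x
```

Roughly 6.4x the UHD 630, yet a midrange discrete Radeon still has about 1.6x the M1.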
 
The new system-on-a-chip removes the need for separate parts, so when the memory fails, we get a whole new laptop. It kind of made me angry that they removed two of the Thunderbolt ports from the mini. Given that the new GPUs from both team red and team green are getting crazy good, maybe they should have left eGPU support in the OS.
 
Re-establishing a relationship with Nvidia isn't going to happen...until conditions change and such a relationship would be an asset to Apple.

Nvidia wants to steer users toward using, and then needing, their CUDA cores. Apple wants control. Apple doesn't need Nvidia, although enthusiasts and a small slice of gamers wish they would reconcile.
I take it you didn't know Nvidia agreed to buy ARM a few months ago.
 
I would assume it might be more related to drivers not being ready, though I may be wrong.

Anyway, Apple being Apple (aka "our ecosystem or die"), I wouldn't be surprised if they envision a cluster of minis in a rendering farm instead of an eGPU.
It makes sense in a certain way for video work (a base mini is about the same price as an eGPU enclosure + AMD GPU), but it's still a bit sad for gamers.

Anyone know if FCPX / Compressor will support clusters with the new M1 computers?
I own a render farm (I do 3D art).

A Mac mini isn't cost-effective as a render node. For the cost of one Mac mini, I have three maxed-out Z210 workstations: 8c/8t (ARM) vs 12c/24t (Intel).

That is before running rendering engines in emulation - hard pass.
 
To be perfectly honest the 13” “Pro” is an insult to Pro machines, and I say that with a lot of respect for what Apple has impressively done with their own silicon. “Pro” used to be synonymous with “dedicated GPU” in terms of Macs but now they just throw the term about wherever they like. The 13” MacBook Pro is not a MacBook Pro in my eyes, and I’ve felt like this since they released the first 13” “Pro”.

People need to stop with the "Pro" terminology. Pro in product names just means enhanced or better. Is the MacBook Pro 13" better than the MacBook 13"? Yes. Okay, that is why it's Pro.

Like PS4 Pro, Visual Studio Professional, Vimeo Pro, and many more products.

And Pro does not mean the same thing to everybody. I cannot get 128GB of RAM on a Microsoft Surface Pro, yet I need that much RAM for some of my professional work. That does not mean Microsoft should drop the "Pro" from the product name.
 
I don't really understand the logic. The Mac mini is a very capable alternative to the iMac if you don't need (or want) the all-in-one form factor. Now they've seemingly hamstrung it; and for what purpose? I was seriously planning to move to a Mac mini for my first Apple computer. Now that I can't have more than 16GB of RAM, no Windows platform, and no eGPU, I think I'll just stick with a Windows desktop to handle my photo and video needs. Disappointing.
Let's just wait and see. Since they are still selling the Intel Mac mini, I believe this M1 just replaces the entry-level Mac mini. Next year an M2 will handle the higher-performance Mac mini, with support for 32GB or more of RAM.
 
I think all this discussion is premature.
1/ These are 1st-gen models; they will lack features and won't be at full parity.
2/ The target should not be people with 2018/2019/2020 Intel machines. Upgrade cycles (excluding high-end pros) are usually at least 3 years, and these are entry models.
3/ I am disappointed that eGPUs are not supported, and I wish we knew whether this is permanent (if the basic arch means eGPU latency would just not work for anything but offloading compute) or if it's a SW thing.
4/ We are jumping to conclusions on how the Pro machines will look (RAM support, etc). I do think this is indicative of how the arch will look (integrated memory, GPU), but not of perf or raw capability. It's very possible they will release a 32GB RAM mini option when the next set of Mx CPUs is released as a step up. Or maybe they won't; we will see.
5/ I am holding off on a purchase till gen 2. The only reason I would buy now is if I were at an upgrade point or needed an Apple Silicon platform to build Mac apps.
6/ I agree, the 2-TB3-port MBP (Intel and M1) should not be a Pro model. It's marketing.

-Shaown
 
And it shouldn't... eGPU was a big failure. The enclosure costs about $250 and the GPU itself costs another $250, totaling $500, and in the end it does not perform as well as a built-in GPU. If you are willing to pay $500 more just to get better graphics, maybe you should invest in a complete Windows system or buy a Mac with a dedicated GPU.

It's even worse that eGPU cable lengths are short, so it's not exactly a mobile setup. Maybe Apple has plans to build their own GPUs and offer them as eGPUs in the future.
 
Isn't it that GPUs have an option ROM to get them initialized, which is compiled for x86 on most GPUs and therefore won't run on ARM?
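For what it's worth, the x86 part is checkable: the PCI Firmware Specification defines a "code type" byte in each ROM image's PCIR data structure (0x00 = x86 PC-AT BIOS, 0x03 = EFI). A minimal sketch that walks a ROM dump and reports what each image targets; it assumes you've already dumped the ROM to a file, and the file name in the usage note is just an example:

```python
import struct

# Code types defined by the PCI Firmware Specification.
CODE_TYPES = {0x00: "x86 PC-AT BIOS", 0x01: "Open Firmware", 0x03: "EFI image"}

def rom_code_types(path):
    """Walk the image chain in a PCI expansion ROM dump and report each
    image's code type (x86 BIOS, Open Firmware, EFI, ...)."""
    data = open(path, "rb").read()
    offset, found = 0, []
    while offset + 0x1A <= len(data) and data[offset:offset + 2] == b"\x55\xAA":
        # Offset 0x18 of the ROM header points to the PCI Data Structure.
        pcir = offset + struct.unpack_from("<H", data, offset + 0x18)[0]
        if pcir + 0x16 > len(data) or data[pcir:pcir + 4] != b"PCIR":
            break
        found.append(CODE_TYPES.get(data[pcir + 0x14], "unknown"))
        if data[pcir + 0x15] & 0x80:  # indicator byte, bit 7 = last image
            break
        # Image length lives at PCIR offset 0x10, in 512-byte units.
        offset += struct.unpack_from("<H", data, pcir + 0x10)[0] * 512
    return found

# Hypothetical usage: rom_code_types("gpu_rom.bin") -> ['x86 PC-AT BIOS', 'EFI image']
```

Typical modern cards carry an x86 BIOS image plus an EFI image in that chain; none of them ship ARM code, which is presumably part of the initialization question raised here.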
 
Isn't it that GPUs have an option ROM to get them initialized, which is compiled for x86 on most GPUs and therefore won't run on ARM?
It's because of the way those AMD/Nvidia GPUs work: they're sitting there waiting for the system to push a load of data over to render. In an Apple Silicon system, the GPU is right next to the CPU accessing the same RAM; there's no "load of data" to reroute from an internal discrete card to an external discrete card.
 
Limitation I guess, but do many people actually use these? Honest question... nobody I know of does.
While I agree that not many people use this functionality, I feel obligated to at least let you know that I do.

Not for macOS use, though; it is entirely for gaming and Boot Camp. My 2018 i7 Mac mini equipped with a Razer Core X eGPU does quite well as a gaming machine. I paid a bit more than for an equivalent gaming rig (especially comparing custom-built vs store-bought), but I also have the luxury of switching back to macOS whenever I want, along with a few other nice things I wouldn't have otherwise, such as four Thunderbolt 3 ports and a nice small desk footprint (the eGPU is out of sight).

Like I said, this setup works fantastically. Windows 10 is now very compatible.
 
It's because of the way those AMD/Nvidia GPUs work: they're sitting there waiting for the system to push a load of data over to render. In an Apple Silicon system, the GPU is right next to the CPU accessing the same RAM; there's no "load of data" to reroute from an internal discrete card to an external discrete card.

Not true; any data would be transferred directly through the PCIe bus on the M1. There is no hardware limitation that I can see. The M1 supports PCIe 4, so there is no hardware reason (on the M1 itself; it could be missing support in the Thunderbolt controller) why it can't run an eGPU.

My Intel MBP has an integrated GPU near the die that shares RAM and works identically to the M1 in terms of data flow; it also works with a discrete GPU/eGPU.
 
Not true; any data would be transferred directly through the PCIe bus on the M1. There is no hardware limitation that I can see. The M1 supports PCIe 4, so there is no hardware reason (on the M1 itself; it could be missing support in the Thunderbolt controller) why it can't run an eGPU.

My Intel MBP has an integrated GPU near the die that shares RAM and works identically to the M1 in terms of data flow; it also works with a discrete GPU/eGPU.
That's not how Unified Memory and Apple's TBDR system work. Intel may work that way, but the Apple CPU does not 'move' data to the GPU; they access the same memory. So, unless an eGPU solution can directly access the same memory the CPU accesses, there's no eGPU solution possible.
 
That's not how Unified Memory and Apple's TBDR system work. Intel may work that way, but the Apple CPU does not 'move' data to the GPU; they access the same memory. So, unless an eGPU solution can directly access the same memory the CPU accesses, there's no eGPU solution possible.

The PCI spec does allow PCI devices to map system memory; however, it is not going to be nearly as fast as the on-SoC GPU's access to that memory. I am also wondering whether enough PCIe lanes were available (Thunderbolt needs four dedicated PCIe lanes).
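Rough peak numbers make that gap concrete: Thunderbolt 3 tunnels about four lanes of PCIe 3.0 (~32 Gb/s of data), while the M1's unified memory (LPDDR4X-4266 on a 128-bit bus) is rated around 68 GB/s. A quick comparison, using published figures rather than measurements:

```python
# Peak-bandwidth comparison (published figures, not benchmarks).
tb3_pcie_gbps = 32                 # PCIe data tunnel inside Thunderbolt 3, in Gb/s
egpu_link_gbs = tb3_pcie_gbps / 8  # -> ~4 GB/s host link to an eGPU

m1_mem_gbs = 68.25                 # M1 unified memory: LPDDR4X-4266 on a 128-bit bus

print(f"eGPU host link:    ~{egpu_link_gbs:.0f} GB/s")
print(f"M1 unified memory: ~{m1_mem_gbs:.1f} GB/s ({m1_mem_gbs / egpu_link_gbs:.0f}x)")
```

That's roughly a 17x difference, which is why external GPUs keep their own VRAM rather than mapping system memory for anything performance-critical.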
 
Oh, c'mon AAPL,
you need to step up your GPU situation.
I have two separate gaming "race cars" with Win 10, overclocked Intel, and Nvidia RTX GPUs.
 
Let's just wait and see. Since they are still selling the Intel Mac mini, I believe this M1 just replaces the entry-level Mac mini. Next year an M2 will handle the higher-performance Mac mini, with support for 32GB or more of RAM.
That's valid. I'm still contemplating a new Intel Mac mini. I just wish I had some basic idea of when Apple will cease support. I'm also wondering if they will revert to their old habits and solder in the RAM, etc. I rechecked OWC, and a single 32GB module is just over $150. The full 32GB kit when spec'ing at Apple is $600. Gonna give it a week or two and try to decide then.
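Taking those two prices at face value (a rough markup estimate only; the configurations aren't identical and prices drift):

```python
# Rough RAM-upgrade markup from the prices quoted above (USD, late 2020).
owc_32gb_module = 155  # one third-party 32GB SO-DIMM ("just over $150", assumed ~$155)
apple_32gb_bto = 600   # Apple's build-to-order price for the 32GB configuration

print(f"Apple BTO vs one module: {apple_32gb_bto / owc_32gb_module:.1f}x")  # ~3.9x
```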
 
Yeah, sure, the Mac was already quite literally “infinitely” unappealing to gamers. That’s why every MacRumors story dealing with Mac GPUs is filled with questions from gamers asking about GPU performance, along with the inevitable Pavlovian spasms about how no one cares about the questions so many readers are discussing.

(Doubtless all these alleged “Mac gamers” don’t really exist, and all their fake questions are being made up by a vast multi-state criminal conspiracy determined to change the outcome of our MacRumors forum discussions. ;))
A small enough number that Epic didn't mind stopping development of Fortnite for Mac just to prove a point, a game that ships on practically every other platform, if that adds perspective. And that's a casual game; "gamer" here means someone who takes it up as a hobby.
 
I own a render farm (I do 3D art).

A Mac mini isn't cost-effective as a render node. For the cost of one Mac mini, I have three maxed-out Z210 workstations: 8c/8t (ARM) vs 12c/24t (Intel).

That is before running rendering engines in emulation - hard pass.

Yes, I can easily imagine that.
But don't forget we're talking Apple here. It's another world, where the laws of physics do not apply and other systems do not exist.

A friend of mine bought a computer for 3D graphics in 2018. He ended up buying an entry-level Lenovo workstation. While I don't recall all the specs (was it an i5 or an i7?), I do recall it came with 16GB of ECC RAM and a Quadro GPU. It also came with a 3-year warranty with on-site service.

That machine was roughly the same price as the i7 mini with 16GB of RAM, a bumped-up SSD, and only a 1-year bring-your-computer-to-the-Apple-Store warranty.
 
A small enough number that Epic didn't mind stopping development of Fortnite for Mac just to prove a point, a game that ships on practically every other platform, if that adds perspective. And that's a casual game; "gamer" here means someone who takes it up as a hobby.
Yet still large enough for developers like Feral Interactive to make a profit by continuing to develop and publish native Mac versions of games that we nonexistent Mac gamers could simply play in Boot Camp.

Companies don’t stay in business for two decades selling products that are “infinitely unappealing.”
 