As Steve Jobs would say, "IT'S A FEATURE!"


Maybe I just know Macs are better and happen to like some cool games?
Do you tell your other friends what to do also? LOL
You don't really need an eGPU to play those games unless you care about the high graphics settings.
As for telling people what to do, Apple is in that position, and they've spoken.
 
  • Disagree
Reactions: RogerWade
I see eGPU mentioned here about gaming. I don't think gamers are the main target for that (they already use PCs, considering many games aren't even available on Mac).

The target is mostly 3D artists, as they've been massively migrating towards PCs for many years. I don't think there's any high-level 3D artist/animator who still uses a Mac, as almost all 3D software is heavily optimized for GPU rendering (a Mac weakness), and specifically for NVIDIA. The push to eGPUs is an attempt by Apple to regain some of that market.
This is exactly correct. Creative software used to "just work" on the Mac; now it just doesn't, unless you're all in on Apple's creative software.
 
I would assume it might be more related to drivers not being ready, though I may be wrong.

Anyway, Apple being Apple (a.k.a. "our ecosystem or die"), I wouldn't be surprised if they envision a cluster of Minis in a rendering farm instead of an eGPU.
It makes sense in a certain way for video work (a base Mini is about the same price as an eGPU enclosure + AMD GPU), but it's still a bit sad for gamers.

Does anyone know if FCPX / Compressor will support clusters with the new M1 computers?
 
Anyone who expected these to still work isn't paying attention. These are ARM-based and (I'm guessing) closer in architecture to the iPad/iPhone than to Intel-based Macs.

That doesn't mean they can't get them to work EVENTUALLY, but there's no chance they're working now. Did you also notice that none of the Macs released even support discrete graphics?

Yeah, that's not a very good defense either, since MacBook Pro users are used to BETTER graphics with discrete graphics, not the INTEL experience.

This is a bad start for a launch, in my opinion. The Pro laptop should not have been updated at all. It signals a BAD LAUNCH!

For the Mac Mini and MacBook Air, it all sounds good, but you cannot pull this crap with pro users, and they should have known better.

And why no tech specs, no GHz specs on the processors, nothing? I call BS on all of this unless they put up or shut up.

I'm sorry, but MYSTERY will not sell. THE REALITY DISTORTION FIELD IS GONE, TIMMY.
 
It would be surprising if they did support it.
Indeed... all that custom unified SoC with CPU, GPU, memory, a bunch of coprocessors, etc. must have something to do with it.
Pulling hypotheses out of my arse: maybe the architecture is so vastly different that it either doesn't make sense, or it would be quite a detour to get the data out of that whole unified cluster, send it over to (external) GPU memory, and then fetch the results back.
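To make that hypothesis a bit more concrete, here's a rough Swift/Metal sketch (just an illustration of the idea, not how Apple actually wires anything): on unified memory a buffer created with .storageModeShared is visible to both CPU and GPU with no copy at all, whereas feeding an external GPU would mean staging into private memory with an explicit blit before the work and another one for the results.

```swift
import Metal

// Illustration only: on unified memory the CPU and GPU can share one
// allocation, while a discrete/external GPU needs explicit copies.
guard let device = MTLCreateSystemDefaultDevice(),
      let queue = device.makeCommandQueue() else { fatalError("No Metal device") }

let samples = [Float](repeating: 1.0, count: 1_000_000)

// .storageModeShared: a single buffer visible to both CPU and GPU, no transfer.
let shared = device.makeBuffer(bytes: samples,
                               length: samples.count * MemoryLayout<Float>.stride,
                               options: .storageModeShared)!

// With an external GPU you would instead stage into .storageModePrivate memory
// and pay for a blit over Thunderbolt before the work, and another one coming back.
let gpuOnly = device.makeBuffer(length: shared.length, options: .storageModePrivate)!
let commands = queue.makeCommandBuffer()!
let blit = commands.makeBlitCommandEncoder()!
blit.copy(from: shared, sourceOffset: 0, to: gpuOnly, destinationOffset: 0, size: shared.length)
blit.endEncoding()
commands.commit()
commands.waitUntilCompleted()
```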
 
  • Like
Reactions: Si Vis Pacem
Yeah, that's not a very good defense either, since MacBook Pro users are used to BETTER graphics with discrete graphics, not the INTEL experience.

This is a bad start for a launch, in my opinion. The Pro laptop should not have been updated at all. It signals a BAD LAUNCH!

For the Mac Mini and MacBook Air, it all sounds good, but you cannot pull this crap with pro users, and they should have known better.

And why no tech specs, no GHz specs on the processors, nothing? I call BS on all of this unless they put up or shut up.

I'm sorry, but MYSTERY will not sell. THE REALITY DISTORTION FIELD IS GONE, TIMMY.
What do you mean? Don't the 3x CPU, 5x (or 6x) GPU and 11x (or 15x) tensor-core speedups over the previous-generation equivalents mean anything? Or that the equivalent best-selling ultrabooks are both more expensive and considerably less performant? Provided it's true, how does a system that's several times faster for the same price (or less) signal a bad launch, or that the machines shouldn't have been updated? I honestly don't get it.

Why no tech specs: because there's this myth that more RAM and more GHz = a better overall system. Look at Android... they can't throw enough GB of RAM at it to save it, while an iPhone with a measly 3GB of RAM always runs extremely snappily.

But OK, fair is fair, how about this: let's wait for some benchmarks and jump to conclusions then?
For what it's worth, a 2018 iPad Pro could already edit 6K+ RAW footage without frame drops (it's in a Max Tech channel episode where the guy is trying to run a special type of codec and no machine could... except an iPad Pro). The same 2018 iPad Pro already had better single-core, multi-core and Metal scores than TODAY'S 2020 high-end Intel chips, and that's without the active cooling that the M1 Mac Mini and 13" MacBook Pro now have.
 
So I have a 2019 MacBook Pro i9 with a Razer Core X, an NVIDIA 3080 and a Samsung Odyssey G7, and it kicks butt at gaming. But the big news today is the launch of the Xbox Series X, and for $500 it's a dream to game on. iOS games are simple games, and Apple has really lost the gaming market. They have even lost on VR; the Oculus Quest 2 is a great game system.
 
Like it matters. Apple's integrated graphics are like 2x faster than Intel's and 20x faster than any eGPU made 5 years ago for 98% of PCs. They're even faster than iPhone graphics!

Edit: Why so many dislikes? I mean, those graphics are like a million times faster than a TI-86. Who would ever need extra? It even has over 640K RAM! Also, I heard that if you really wanted them, you could get AirTag-based eGPUs.

Edit 2: So. Many. Dislikes. Keep them coming, but I have to warn you: if Apple takes note of this, we may never see AirTag eGPUs... or even AirTags, for that matter.

LONG LIVE THE AIRTAG-BASED EGPU!
Yes, it does matter. You are assuming that an eGPU is used only to drive displays. NVIDIA CUDA cores are useful for everything from data analysis to audio DSP processing. It is a real pity that Apple has essentially removed support for external CUDA cores. You'd have to use the Mac as a thin client and put the CUDA cores in a server. For those of you too young to remember using WordPerfect or Lotus 1-2-3 for DOS, early PCs were also used as mainframe terminals. Everything old is new again. [sigh]
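To illustrate the thin-client idea: the host name, the file names and the cuda_analysis binary below are all made up, but the workflow would look roughly like this Swift sketch: push the data to a CUDA-equipped box, run the job over ssh, and pull the results back.

```swift
import Foundation

// Hypothetical "Mac as thin client" workflow. "gpu-server.local", the paths and
// the cuda_analysis binary are placeholders, not a real setup.
func runRemote(_ tool: String, _ arguments: [String]) throws {
    let process = Process()
    process.executableURL = URL(fileURLWithPath: tool)
    process.arguments = arguments
    try process.run()
    process.waitUntilExit()
}

do {
    // Push the dataset to the CUDA-equipped server.
    try runRemote("/usr/bin/scp", ["dataset.bin", "user@gpu-server.local:/tmp/dataset.bin"])
    // Run the GPU job remotely.
    try runRemote("/usr/bin/ssh", ["user@gpu-server.local",
                                   "./cuda_analysis /tmp/dataset.bin /tmp/results.bin"])
    // Pull the results back to the Mac.
    try runRemote("/usr/bin/scp", ["user@gpu-server.local:/tmp/results.bin", "results.bin"])
} catch {
    print("Remote job failed: \(error)")
}
```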
 
The killer app would be ATV+, which they ended up allowing to run on all the competing hardware along with AirPlay, and third-party games, which didn't really happen. I mean, the TV area is all about ecosystem lock-in, not real innovation. Apple also kept changing the UI around as if they don't really know what they want.

I meant “killer app” in a broad sense, not literally an app. Sorry if that wasn’t clear. There’s not much missing that Apple can add hardware-wise. Like I said, HDMI 2.1 is about it.
 
  • Like
Reactions: KeithBN
Yes, it does matter. You are assuming that an eGPU is used only to drive displays. NVIDIA CUDA cores are useful for everything from data analysis to audio DSP processing. It is a real pity that Apple has essentially removed support for external CUDA cores. You'd have to use the Mac as a thin client and put the CUDA cores in a server. For those of you too young to remember using WordPerfect or Lotus 1-2-3 for DOS, early PCs were also used as mainframe terminals. Everything old is new again. [sigh]
Lol. I know it matters. That's why I made the funny post. Apparently, most people don't seem to find it funny though 🤷‍♂️
 
An integrated graphics card is still an integrated graphics card, no matter the marketing. Crappy graphics will continue to be the bottleneck of Apple products. I expect this to only get worse as they migrate to their own silicon and direct comparisons to "the leading PC laptop" become harder, as they move away from AMD and NVIDIA graphics.

There is a reason they hid everything behind "leading selling PC", which could be a netbook for all we know: if they compared their GPU to even a mobile version of an NVIDIA 3000-series GPU, the Apple silicon would have been left in the dust.
 
They still sell the Intel versions. But if you are a gamer, why consider a Mac?
I am a scientist. Intel Macs were like a Swiss Army knife: I could use Mac, Windows and Linux software. Some of that software is graphically intensive and runs better on Windows. Period. But since ~2007 Apple has been putting absolute garbage GPUs in their computers. You have to buy a top-of-the-line MBP to get yesterday's mid-range graphics card. That wasn't always the case.

Sadly, Apple's usefulness for 1) people who need cross-platform support and 2) graphics-intensive tasks is rapidly coming to an end, if it hasn't already arrived at the station of no return. Especially at their price points. If they are going to ship integrated graphics solutions, give them an integrated-graphics price. Don't make out it's some graphics powerhouse if you're only comparing it to last-generation Intel Iris integrated garbage. Michelangelo painting the Sistine Chapel ceiling would look fast compared to Iris graphics.
 
Keep in mind that these new Macs can also most likely be bricked or disabled remotely, very much like an iPad or an iPhone being disabled or bricked by Apple or law enforcement.

Intel Macs can never be bricked; you can always reinstall the OS.

You can't do this with these ARM Macs. They have security chips, hardware through which they can be permanently disabled remotely by Apple or law enforcement.

They can do this with the iPhone and iPad. Exact same processor and hardware.

And ARM Macs still have no touch screen.

What a rip-off of a machine, not to mention all the digital rights management built into the hardware, and BIG SIR.

BIG SIR. No thank you. Catalina.
 
Like it matters. Apple's integrated graphics are like 2x faster than Intel's and 20x faster than any eGPU made 5 years ago for 98% of PCs. They're even faster than iPhone graphics!

Edit: Why so many dislikes? I mean, those graphics are like a million times faster than a TI-86. Who would ever need extra? It even has over 640K RAM! Also, I heard that if you really wanted them, you could get AirTag-based eGPUs.

Edit 2: So. Many. Dislikes. Keep them coming, but I have to warn you: if Apple takes note of this, we may never see AirTag eGPUs... or even AirTags, for that matter.

LONG LIVE THE AIRTAG-BASED EGPU!

Apple's Unified Memory is HBM. The only reason they have decent performance is the 1024-bit interface that interconnects the Neural Engine, GPU cores and CPU cores, all using HBM memory.

It's the same design the PS5/Xbox Series X use, with custom Zen 2 SoCs and unified GDDR-based RAM instead of HBM. If they had gone with HBM on the PS5 and Xbox, it would have added >$100 to the price tag. What it should show is that the R&D that went into making those two SoCs for Sony and Microsoft will all be incorporated into the Zen 4 CPUs/APUs and RDNA 3.0 GPUs. They'll incorporate their own AI FPGAs as well, à la Neural Engine.

There's a catch with Apple's use of HBM memory: it has a peak limit of 128GB, with 8 16GB stacks. The problem is that 32GB stacks have been used on the AMD-designed GPUs for the Mac Pro; the die becomes considerably larger and the power draw goes up considerably.

I don't foresee the M-based Mac Pro processor relying on HBM to replace the current 1.5TB DDR4 capacity. The Pro lines will have DDR5/LPDDR5 RAM and a lot more power consumption.

Bragging about an integrated GPU besting a discrete GPU from 5 years ago (try seven; let's take the Radeon R9 270X):

FP32: 2.688 TFLOPS

That's with 2GB of GDDR5 VRAM on a 256-bit bus.

Integrated GPUs are light years behind discrete GPUs.
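For reference, that 2.688 TFLOPS figure is just the usual peak-throughput arithmetic (stream processors × 2 FMA ops per clock × clock rate); a quick sanity check in Swift, using the 270X's published 1280 stream processors and 1.05 GHz boost clock:

```swift
// Peak FP32 throughput = stream processors × 2 ops per clock (FMA) × clock (GHz)
let streamProcessors = 1280.0   // Radeon R9 270X
let boostClockGHz = 1.05
let peakGFLOPS = streamProcessors * 2 * boostClockGHz   // 2688 GFLOPS
print(peakGFLOPS / 1000)                                // ≈ 2.688 TFLOPS
```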
 
  • Like
Reactions: torncanvas
The funny thing is that Apple could fix this if they just decided to call the entry-level 13" MacBook Pro (two Thunderbolt ports) the MacBook and reserved the Pro branding for the higher-end (four Thunderbolt ports) systems.

The Intel Macs these replaced had 1.4 GHz 15W CPUs and were limited to 16GB of DDR3(!) RAM. They were already Pro in name only. I'd say few people were buying these to use with a high-end eGPU.

That said, I'm sure the eGPU limitation is temporary while they figure out drivers. Luckily, AMD is pretty open about their drivers, and I'm sure they're already working with Apple on a solution.

Also, anyone complaining about 2.6 TFLOPS from on-package graphics in a machine targeting a 15W TDP needs to reevaluate the possibilities this brings... Comfortably editing 4K video untethered, without your lap on fire, while netting battery life you can measure in hours is an amazing feat.
 
OK, then:

if there are no virtualization extensions,
if there's no eGPU support,

why are they still using "Pro" in the MacBook name?
 
I still wonder if my USB-C to DisplayPort adapter will work, or my CalDigit Thunderbolt 3 dock.
Yes, I asked CalDigit support the same question. They said their TS3 Plus dock works with all M1 Macs and can support ONE external monitor. If you want more than one external monitor from an M1 Mac, you need to get the M1 Mac Mini or wait for the M2 :)
 
The Blackmagic eGPU on the Apple Store page shows support for the M1 Mac Mini: https://www.apple.com/us_epp_55499/...23c64fe07140be99a602ff5d75b6c8426632a18d30161
That page has changed and no longer shows compatibility with M1 Macs. Neither does the Store app.

[Attachment: Blackmagic eGPU - Apple.png]
 
Intel Macs can never be bricked; you can always reinstall the OS.

You can't do this with these ARM Macs. They have security chips, hardware through which they can be permanently disabled remotely by Apple or law enforcement.

They can do this with the iPhone and iPad. Exact same processor and hardware.

Who using an iPhone today is worried it's going to be bricked? What are you doing on your phone that makes it a concern?

The concern should be the data mining and background stuff that goes on with a phone being extended to your home computing, but I'm pretty sure they were already doing that through the OS.
 
Fixed it. Sorry, but it looks like I will be using Windows for professional work and maybe Macs for personal use.
Everything is relative, of course, but the vast majority of "professionals" using a Mac won't be doing it on the lowest-tier machines - certainly not those who care about an eGPU.

Those people almost certainly have a working machine now, so "wait a year or two" is a very sensible approach - just as when Apple updates Intel-based Macs with minimal improvements over the previous models, waiting to see what comes out next is generally a pretty safe option.

A professional switching their entire working environment because a consumer/low-end product they likely wouldn't use anyway was released and doesn't support the features they need is just bizarre.

If they released a new Intel Mini with otherwise the same specs (i.e. 2 TB/USB-C ports, max 16GB RAM), that wouldn't suddenly make me give up my 2018 Mini with 4 ports and 64GB and use Windows, would it?

The number of people who feel the need to proclaim loudly that they will switch / have "switched to Windows" is baffling. Either do it or don't; nobody here cares if you do.
 