
Ahheck01 · macrumors 6502 · Original poster · Aug 7, 2006
I currently have my 2019 16" MBP maxed out, and I'm using an eGPU with an RX 580. On paper, the M1 Pro has the same GPU power as the RX 580 and would be a huge bump in CPU speed.

I do a little bit of everything:
  • Full Stack + Mobile Development
  • Media Production (Final Cut Pro, Lightroom, Photoshop, After Effects)
  • General Productivity (MS Office, Teams, Zoom)
  • Local Generative AI (stable diffusion, llama.cpp, kohya_ss)
I know the ecosystem has evolved rapidly, but what are the edge cases that may bite me in the butt if I make the switch to a 16" M1 Pro?
 
By definition, "edge" cases are those that only come up sporadically and under very specific conditions. That alone makes it difficult to identify them, as one would have to experience them firsthand to know they're an issue in the first place. Also, why are you looking at the M1 Pro and not the M2 Pro?
 
[...] what are the edge cases that may bite me in the butt if I make the switch to a 16" M1 Pro?
Not much, really. Mostly you will lose eGPU support, but the M1 Pro is more than enough for your use cases. I'd say go for it. I upgraded from a 13" 2015 base-model MBP and the difference is like night and day.
 
Local Generative AI (stable diffusion, llama.cpp, kohya_ss)
If you're seriously considering this, I'd go for an M1 Max at minimum, preferably with 64GB of memory. llama.cpp works fine with Metal acceleration now, but you need a ton of memory for the 13B, 33B, and 65B model sizes.

Automatic1111 sets up fine with MPS acceleration too, but it's unbearably slow even on an M2 Ultra.
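
For reference, here's roughly what running one of those models fully on the Apple GPU looks like with the llama-cpp-python bindings (a minimal sketch; the quantization level and model path are assumptions on my part):

```python
# Minimal llama-cpp-python sketch, assuming `pip install llama-cpp-python`
# was built with Metal support and a GGUF model file is already downloaded.
from llama_cpp import Llama

llm = Llama(
    model_path="models/llama-13b.Q4_K_M.gguf",  # hypothetical local path
    n_gpu_layers=-1,  # offload all layers to the GPU (unified memory)
    n_ctx=4096,       # context window; bigger contexts need more memory
)

out = llm("Summarize unified memory in one sentence.", max_tokens=64)
print(out["choices"][0]["text"])
```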
 
If you're seriously considering this, I'd go for an M1 Max at minimum, preferably with 64GB of memory. llama.cpp works fine with Metal acceleration now, but you need a ton of memory for the 13B, 33B, and 65B model sizes.

Automatic1111 sets up fine with MPS acceleration too, but it's unbearably slow even on an M2 Ultra.
I should mention that I'm currently doing most of the generative AI stuff on a PC build with an RTX 3070. It would be nice to do it on the Mac, and the unified memory would allow 100% GPU inference, whereas my 8GB 3070 is pretty crippled. But it's not a deal breaker; it's not like my Intel Mac can do any of it.
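
To put rough numbers on that, here's the back-of-the-envelope math I'm using (the bits-per-weight figure is approximate and ignores KV cache and runtime overhead):

```python
# Rough estimate of whether a quantized model fits entirely on the GPU.
# Approximation only: real usage adds KV cache, activations, and OS overhead.
def model_size_gb(params_billion: float, bits_per_weight: float) -> float:
    return params_billion * 1e9 * bits_per_weight / 8 / 1e9

for params in (7, 13, 33, 65):
    size = model_size_gb(params, 4.5)  # ~Q4_K_M quantization
    print(f"{params:>2}B at ~4.5 bpw: ~{size:4.1f} GB | "
          f"fits in 8 GB VRAM: {size < 8} | fits in 64 GB unified: {size < 64}")
```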
 
By definition, "edge" cases are those that only come up sporadically and under very specific conditions. That alone makes it difficult to identify them, as one would have to experience them firsthand to know they're an issue in the first place. Also, why are you looking at the M1 Pro and not the M2 Pro?
We've got twins due in November, so I'm looking for the performance-per-dollar sweet spot.
 
I should mention that I'm currently doing most of the generative AI stuff on a PC build with an RTX 3070. It would be nice to do it on the Mac, and the unified memory would allow 100% GPU inference, whereas my 8GB 3070 is pretty crippled. But it's not a deal breaker; it's not like my Intel Mac can do any of it.
You will regret not getting the extra GPU cores in the Max IMO.
 
I currently have my 2019 16" MBP maxed out, and I'm using an eGPU with an RX 580. On paper, the M1 Pro has the same GPU power as the RX 580 and would be a huge bump in CPU speed.

I do a little bit of everything:
  • Full Stack + Mobile Development
  • Media Production (Final Cut Pro, Lightroom, Photoshop, After Effects)
  • General Productivity (MS Office, Teams, Zoom)
  • Local Generative AI (stable diffusion, llama.cpp, kohya_ss)
I know the ecosystem has evolved rapidly, but what are the edge cases that may bite me in the butt if I make the switch to a 16" M1 Pro?
You can do all of those things with very good performance, but performance with Stable Diffusion will be very poor. For example:

An M2 Ultra with the Core ML port can do about 7 iterations/second, whereas an RTX 4090 can do about 28 iterations/second at 512x512.

My PC with an RTX 2060 6GB can do around 7-7.5 iterations/second.
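
If anyone wants to reproduce that kind of number on their own machine, a minimal diffusers-on-MPS sketch looks like this (model ID, step count, and timing method are just my assumptions; it measures wall-clock throughput, not pure iteration speed):

```python
# Minimal Stable Diffusion run on Apple Silicon via PyTorch's MPS backend.
# Assumes `pip install torch diffusers transformers accelerate` on macOS.
import time
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # example model ID
    torch_dtype=torch.float16,
).to("mps")

steps = 20
start = time.time()
image = pipe("an astronaut riding a horse", num_inference_steps=steps).images[0]
elapsed = time.time() - start

print(f"~{steps / elapsed:.2f} iterations/second at 512x512 (includes overhead)")
image.save("astronaut.png")
```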
 
We've got twins due in November, so I'm looking for the performance-per-dollar sweet spot.

If you intend to do ML inference, the M2 family could be worth it. The expanded AMX and NPU (with bfloat16 support!) provide a nice boost.
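
On the PyTorch side you can probe what a given machine actually exposes; a small sketch (whether bfloat16 works on MPS depends on the macOS and PyTorch versions, so treat the result as informational):

```python
# Probe Apple-GPU (MPS) availability and bfloat16 support in PyTorch.
import torch

print("MPS available:", torch.backends.mps.is_available())

try:
    x = torch.ones(4, dtype=torch.bfloat16, device="mps")
    print("bfloat16 on MPS works, result dtype:", (x + x).dtype)
except Exception as err:
    print("bfloat16 on MPS not supported here:", err)
```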
 
I don't know what exact kind of full stack development you do, but as a DevOps/SRE guy, the only real limitation for me right now is a complete lack of nested virtualization on Apple Silicon, which may be a deal breaker for you (it is for me in specific scenarios). There are rumors that this is actually a software limitation that's going to be dealt with in future versions of macOS, but that remains to be seen.

Everything else is amazing - the speed, battery life, Neural Engine, totally worth the money. Just max out a Pro machine with what you need and you're golden.
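
If you want to see the nested-virtualization limitation for yourself, a quick probe from inside a guest VM is to ask whether Hypervisor.framework support is exposed at all; this is a sketch based on the kern.hv_support sysctl, so verify it on your own setup:

```python
# Check whether this machine (or VM guest) reports Hypervisor.framework support.
# Inside an Apple Silicon guest VM this is expected to be 0, which is why you
# currently can't run a VM inside a VM (no nested virtualization).
import subprocess

def sysctl_int(key: str) -> int:
    out = subprocess.run(["sysctl", "-n", key], capture_output=True, text=True)
    return int(out.stdout.strip() or 0)

print("Hypervisor support:", bool(sysctl_int("kern.hv_support")))
```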
 
My 2 cents: as a TypeScript + Go developer, there's nothing other computers can do that ARM Macs can't.
 
M-series Macs can't do any of the following:

- act as a coffee warmer
- run Intel space heaters - oops, I mean CPUs
- natively run 68k or PPC code
- run external GPUs
- fold over into a tablet-like form factor with a touchscreen
- develop an independently functioning conscience, thereby becoming sentient beings
 
You can't run (Intel) Windows of course, and I still find the occasional library in both the web space and the AI space that requires Intel binaries, but overall most things work now.

While I used to run Parallels with Windows, I'm finding it better overall to have a desktop Windows PC that I VNC into -- this lets me run GPU-intensive AI models without taxing my laptop, as well as do website testing and run the occasional Windows-only app.
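
When one of those Intel-only libraries sneaks in, it usually means the whole Python environment is x86_64 running under Rosetta; a quick check I'd use (the sysctl.proc_translated key is my assumption here, so double-check it on your machine):

```python
# Detect whether this Python process is native arm64 or translated by Rosetta 2.
import platform
import subprocess

arch = platform.machine()  # 'arm64' when native, 'x86_64' under Rosetta
translated = subprocess.run(
    ["sysctl", "-in", "sysctl.proc_translated"],  # '1' if Rosetta-translated
    capture_output=True, text=True,
).stdout.strip()

print(f"architecture: {arch}, Rosetta-translated: {translated == '1'}")
```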
 
Couple of things: check whether all of your current applications are available as native (universal) binaries, and whether you need to run VMs. If your applications are available and you don't run VMs, then you will be very happy with a new MacBook Pro. You can run some VMs on current versions of macOS, but it is much more limited compared to what you can do on an Intel Mac. Check out Parallels Desktop for Mac to see whether the Apple Silicon version supports the guest OSes you need; you might be fine if you don't do anything too exotic in a VM. I switched from a 2019 16" MacBook Pro to the 2021 16" M1 Max MacBook Pro last year (before M2s were available) and haven't looked back. Good luck!
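
To audit that before switching, you can scan what's installed today; a rough sketch using the stock lipo tool (it assumes the usual .app bundle layout, so some apps will be skipped):

```python
# List apps in /Applications whose main binary has no arm64 slice,
# i.e. apps that would run under Rosetta 2 on an Apple Silicon Mac.
import plistlib
import subprocess
from pathlib import Path

for app in sorted(Path("/Applications").glob("*.app")):
    try:
        info = plistlib.loads((app / "Contents/Info.plist").read_bytes())
        binary = app / "Contents/MacOS" / info["CFBundleExecutable"]
        archs = subprocess.run(["lipo", "-archs", str(binary)],
                               capture_output=True, text=True).stdout.split()
        if archs and "arm64" not in archs:
            print(f"{app.name}: {' '.join(archs)} (Intel-only, needs Rosetta 2)")
    except (OSError, KeyError, plistlib.InvalidFileException):
        pass  # skip bundles with unusual layouts
```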
 
With an ARM Mac you can't browse Steam and know which of the games sporting the Apple logo will run natively on your machine, which will run acceptably well under Rosetta 2, and which won't run at all because they are too weird or 32-bit only.
 
I know the ecosystem has evolved rapidly, but what are the edge cases that may bite me in the butt if I make the switch to a 16" M1 Pro?

I haven't really found any with my M1 MacBook Pro, other than trying to run games, which actually works much better than expected... I can run some stuff inside ARM Win11 in Parallels and it is getting better, but overall I'm very happy.
 
I can't do that with an Intel MacBook either, lol, due to either compatibility or performance.

I wouldn't get too hung up on whether apps are native or not. Rosetta 2 performance is great.
At least on my Intel i9 16" MacBook I can boot into Windows via Boot Camp and play 100% of my Steam game library whenever I want to.
 
I can't do that with an Intel MacBook either, lol, due to either compatibility or performance.

I wouldn't get too hung up on whether apps are native or not. Rosetta 2 performance is great.
Rosetta 2 works wonderfully. However, it is not going to be around forever, and is not a long-term solution. If the application vendor does not plan on updating to Universal binary format, you might want to start looking for a replacement. I would hope that by now, most applications would be updated for Apple Silicon, and if they haven't been, they may never be :-(
 
That's quite a big limitation. I hope they fix it soon, if not in software for existing machines then at least in hardware for future ones.
As I understand it, it's already supported in hardware on the M2 (not the M1, unfortunately), but the software support isn't there in macOS yet, so you can't do it.
 
Rosetta 2 works wonderfully. However, it is not going to be around forever, and is not a long-term solution. If the application vendor does not plan on updating to Universal binary format, you might want to start looking for a replacement. I would hope that by now, most applications would be updated for Apple Silicon, and if they haven't been, they may never be :-(

That's certainly a possibility; however, there is some hardware support baked into the M1 that Apple may decide to keep supporting moving forward.

By that time I would expect to have moved on to either replacement apps or native updates, but we're a long way from that point just yet.
 