and the bastardization of the "Pro" name continues.

This is sad.

On a positive note, the Air looks like a fantastic purchase for this generation of notebooks.

It has never been anything but three letters used by a marketing department to advertise products.
 
The FAN in the MacBook Pro is a big deal.

The fan allows the MBP to run with no need to throttle the M1 processor; it can run at full speed. I think the main use of the larger battery in the MBP is to allow the computer to run at high speed for a longer period.

The "Air" might be used for reading email and web surfing, and it might run on only the slow-speed cores most of the time and get the claimed battery life. But if you push the high-speed cores, you might cause the fan to come on and drain the battery in two hours.

The fan actually allows the faster battery drain.

If you want to avoid thermal throttling, buy the MBP. Yes, the fan effectively costs $400, but if you need the fan, spend the money on it.
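The fan-vs-throttling trade-off described above can be sketched with a toy simulation. All numbers here are made up for illustration and are not Apple's actual thermal behavior: the chip gains heat in proportion to its clock speed, sheds heat at a rate set by its cooling system, and throttles the clock above a temperature limit.

```python
# Toy thermal model of sustained load (hypothetical numbers, for illustration).
# Each step: the chip adds heat proportional to its clock, the cooler removes
# heat proportional to the temperature above ambient, and the clock throttles
# down above a thermal limit (recovering slowly once back under it).

def sustained_clock(cooling_rate, steps=1000):
    temp, clock = 30.0, 3.2           # ambient start (deg C), peak clock (GHz)
    for _ in range(steps):
        temp += clock * 0.5           # heat generated by the running cores
        temp -= cooling_rate * (temp - 30.0)   # heat removed by the cooler
        if temp > 95.0:               # over the limit: throttle
            clock = max(1.0, clock - 0.1)
        elif clock < 3.2:             # under the limit: recover
            clock = min(3.2, clock + 0.05)
    return round(clock, 2)

passive = sustained_clock(cooling_rate=0.01)  # Air-style fanless cooling
fanned = sustained_clock(cooling_rate=0.05)   # Pro-style active cooling
```

In this sketch the weakly cooled chip oscillates well below its peak clock under sustained load, while the fan-cooled one never reaches the thermal limit and holds full speed, which is the argument being made for the MBP.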
 
Need to pay $200 to get the 8th core not disabled. :confused:
plus all the other differences, you know, better screen, Touch Bar, speakers, mics...... you can easily determine everything on the tech specs page
 
Looks like both the MBA and MBP have the same display, SoC (except the GPU), RAM, USB/TB ports, and storage. The MBP has better battery life and the Touch Bar, but IMO that's not worth an extra $300. Disappointed in the MBP upgrade, especially the 16GB RAM max, but the MBA seems like good value for money.
You ignored the thermal package difference
 
If the Developer ARM Mac mini was anything close to what they announced, RAM works the same as on Intel machines. A 16GB max won't cut it for my PRO work; my Mac Pro pages out even with 96GB. No matter how you try to defend Apple, less RAM is not acceptable for working with large projects and files.

I think today's announcements were rushed to get the Apple Silicon Macs out. By the time most of the software is up and running as expected, these will have been made obsolete by newer versions.
I think the M1 Mini will probably be a good choice for developers, but maybe as an additional machine for MacOS/iOS development. 16GB is fine for most local development if you are not running local VMs or running heavyweight app servers or DBs on the local machine. Bear in mind that a lot of development pushes builds to cloud infrastructure these days; I haven't run a local VM for several years.

What are you running that uses up your 96GB RAM?
 
I find it very strange how two people can disagree with someone else's experience. What, are you split personalities inhabiting the same brain or something? Weird.

He’s giving his true experience of using the Touch Bar and saying it’s a positive one. If you have a different experience then say so, but disagreeing is illogical. I mean, I hate toffee. Are you going to hit the disagree button because you love toffee and think I’m wrong to hate it? Agree that I hate toffee, then go and post how much you love the stuff! 😐
I think recent reactions to the US elections show that a very large number of people take great exception to anyone holding a different view to their own, and will take every opportunity to explain to all and sundry just how wrong other people are. Weird...and very, very, sad....
 
I think the M1 Mini will probably be a good choice for developers, but maybe as an additional machine for MacOS/iOS development. 16GB is fine for most local development if you are not running local VMs or running heavyweight app servers or DBs on the local machine. Bear in mind that a lot of development pushes builds to cloud infrastructure these days; I haven't run a local VM for several years.

What are you running that uses up your 96GB RAM?
With the limitation of only up to two external attached displays (the first one needs to be HDMI, from what I am reading), that limitation alone makes it unpopular with developers. Chopping out support for eGPUs and losing 2 TB ports also seems to be an issue.
 
I guess we don't know what the answer is: is it a technical limitation or a sales/product-lineup decision?

I expect one of the Apple podcasts will get the answer.
If we assume that the same M1 die and package size is used in all these machines, then the limitation will be a combination of the package real estate available for the DRAM (it's on-package but not part of the SoC die) and the density of the DRAM chips. Presumably, they could only fit a maximum of 16GB on the package.

Future versions of Apple Silicon will probably use larger packages (with more CPU & GPU cores), and have more room for DRAM, or use the next generation RAM with higher densities.

It's certainly possible: the Fujitsu A64FX SoC used in the Fugaku supercomputer has 32GB of HBM2 RAM on the package, similar to the M1 (https://www.fujitsu.com/global/about/resources/news/press-releases/2018/0822-02.html), but the A64FX is a much larger chip.
 
Wait, is that real? That's absurd and a major difference, but it seems impossible; it's built into the chip. That would be like finding out the M1 has 32GB of RAM as standard but only accesses 8GB.
You also get twice the SSD storage, so not so bad. The extra GPU core still costs $50 though.
 
The Neural Engine is still a little ambiguous to me. How does this help the CPU and GPU? What improvements will I see in my day to day?

Anything using machine learning. Examples are: categorizing photos in iPhoto, image enhancement in photo editors, raytracing denoise acceleration (including realtime raytracing). This is a hot area for new features since it wasn’t practical a few years ago.
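Concretely, most of what a neural engine accelerates boils down to enormous numbers of multiply-accumulate operations. A toy pure-Python dense layer (illustrative sizes only; real models run millions of these operations per frame) shows the shape of the work being offloaded:

```python
# One fully connected layer: output[j] = sum_i input[i] * weight[i][j].
# Every term is a multiply-accumulate, the operation a neural engine
# executes massively in parallel in dedicated hardware.

def dense_layer(inputs, weights):
    n_in, n_out = len(weights), len(weights[0])
    return [sum(inputs[i] * weights[i][j] for i in range(n_in))
            for j in range(n_out)]

# Tiny example: 3 inputs -> 2 outputs is already 6 multiply-accumulates.
x = [1.0, 2.0, 3.0]
w = [[0.1, 0.2], [0.3, 0.4], [0.5, 0.6]]
y = dense_layer(x, w)   # approximately [2.2, 2.8]
```

Frameworks like Core ML can route exactly this kind of math to the Neural Engine instead of the CPU or GPU, which is why photo categorization, image enhancement, and denoising get faster without any visible change to the app.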
 
With the limitation of only up to two external attached displays (the first one needs to be HDMI, from what I am reading), that limitation alone makes it unpopular with developers. Chopping out support for eGPUs and losing 2 TB ports also seems to be an issue.
I work with developers and do DevOps, and I have almost never seen people use more than 2 external displays and I've worked with a lot of different teams. The vast majority are running 1 or 2 externals with the laptop screen as an extra for e-mail, Slack etc.

Similarly, most folks are using some kind of dock, so 2 TB ports is probably OK. I have four on my MBP16 and only use one (for the dock with pass-through power), and sometimes stick in an SD card adapter for photos/video. If you don't have a dock or some kind of port replicator, then I agree that 2 ports is not enough (you need power, 2 x display, external drive/device).

Sounds like eGPUs are not yet supported, but again, I have never seen a developer use an eGPU - it's more for video editing & gamers.

The Mac Mini is clearly not a powerhouse, but it will do the job for a lot of devs. I'm seriously considering one for playing around with, but I would need to get a better idea of tool support for MacOS-on-ARM before buying one. It could end up as an expensive toy with limited use other than Xcode.
 
I work with developers and do DevOps, and I have almost never seen people use more than 2 external displays and I've worked with a lot of different teams. The vast majority are running 1 or 2 externals with the laptop screen as an extra for e-mail, Slack etc.

Similarly, most folks are using some kind of dock, so 2 TB ports is probably OK. I have four on my MBP16 and only use one (for the dock with pass-through power), and sometimes stick in an SD card adapter for photos/video.

Sounds like eGPUs are not yet supported, but again, I have never seen a developer use an eGPU - it's more for content creators & gamers.

The Mac Mini is clearly not a powerhouse, but it will do the job for a lot of devs. I'm seriously considering one for playing around with, but I would need to get a better idea of tool support for MacOS-on-ARM before buying one. It could end up as an expensive toy with limited use other than Xcode.

All three of our software developers use more than 2 (one uses 3, one uses 4), and the creators/developers for graphics applications, animation, and video playback/streaming all use eGPUs for rendering and multi-platform testing. It's actually fun watching them swap their externals around some days.

My husband was looking at maybe getting one for his desk at home, shrugged, and said nope. He even uses 3 screens all the time when not working: one for his web browser, one split between emails and whatever he uses the other side for, and the third for real-time day trading... or running cartoons to keep our youngest happy.

Admittedly, everyone's needs are different, and I have to go by our workplaces and what we deal with on a daily basis.
 