For the Mac Pro announcement, I call it M1 Ultimate which is 4x Max chips. I CALL IT!

If you want more than 8, I call it M1 Unlimited or M1 Ungodly.
 
It’s true that the new chips are great for rendering YouTube videos, mixing tracks, and Photoshop. YouTubers and DJs say that’s very professional and important work. Who am I to say otherwise?

On the other hand, you need an Intel processor to run a stable, full Excel, AutoCAD, SolidWorks, Siemens NX, Bloomberg Terminal, and an overwhelming number of other professional programs that only work on x86.

You know, those things that people who create and run finance, companies, the government, planes, rockets, ships, cars, buildings, iPhones, M1 processors, and other similarly unimportant things use. So as far as I’m concerned, professionals use Intel.
I have a 16GB M1 Mini; it's OK and sits in my office as a backup box. I'm one of those who use CAD and other "creative" applications. My choice is an HP Z2 desktop workstation with a 6-core Xeon under the hood, 32GB of RAM, and an Nvidia T600 card for 2D work. As far as power consumption goes, the M1 uses 15 watts streaming Apple Music. The HP uses around 20-25 watts, as the Xeon is running at 2% load and 50-60% frequency most of the time.
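For a rough sense of what that wattage gap amounts to, here's a quick back-of-envelope calc; the 8 hours/day duty cycle and the 22 W midpoint are my assumptions, not measurements:

```swift
// Back-of-envelope: yearly energy difference between ~15 W (M1 Mini) and
// ~22 W (Xeon box at light load), assuming a hypothetical 8 hours/day of
// this kind of light usage. Numbers are illustrative, not measured.
let deltaWatts = 22.0 - 15.0               // ~7 W difference at light load
let hoursPerYear = 8.0 * 365.0             // assumed duty cycle
let deltaKWh = deltaWatts * hoursPerYear / 1000.0
print("≈ \(deltaKWh) kWh/year")            // ≈ 20 kWh/year, i.e. a few dollars of electricity
```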
 
You're right that some niche software continues to be Windows- and x86-only. The market trend is certainly shifting away from that, though. Even if it weren't, your conclusion is silly. Tons of "professionals" don't rely on the software you listed. A lot of software development these days happens on macOS and Linux. A lot of graphic design has always happened on macOS. Etc.

But even if we take your examples:

  • you say "full Excel". That's technically true, but the Mac version isn't really that limited, and I would wager a lot of people who live in Excel full-time use it just fine. (It's also a historical footnote at this point, but fun fact: Excel originated on the Mac.)
  • Autodesk does have multiple products for the Mac.
  • I'm not sure why you bring up Solidworks as another CAD vendor.
  • …or Siemens NX as a third CAD app. I guess you like CAD? You realize some careers™ with professionals™ don't involve CAD, right?
  • Bloomberg Terminal is often just used remotely these days.
As a CAD user myself, the Mac versions of AutoCAD, Ares Commander, and BricsCAD are all pretty gimped. I don't have any experience with the rest, however.
 
Interesting. My business partner and I, who run our solar business, use Macs exclusively. He handles all the finances, which includes using Excel, among other things, while I handle all the engineering and design work. As far as I’m concerned, you could not be more wrong.

Then either your company is small or your partner is not a pro Excel user, because Excel for Mac lacks developer tools: ActiveX controls and so on.
 
You're right that some niche software continues to be Windows- and x86-only. The market trend is certainly shifting away from that, though. Even if it weren't, your conclusion is silly. Tons of "professionals" don't rely on the software you listed. A lot of software development these days happens on macOS and Linux. A lot of graphic design has always happened on macOS. Etc.

But even if we take your examples:

  • you say "full Excel". That's technically true, but the Mac version isn't really that limited, and I would wager a lot of people who live in Excel full-time use it just fine. (It's also a historical footnote at this point, but fun fact: Excel originated on the Mac.)
  • Autodesk does have multiple products for the Mac.
  • I'm not sure why you bring up Solidworks as another CAD vendor.
  • …or Siemens NX as a third CAD app. I guess you like CAD? You realize some careers™ with professionals™ don't involve CAD, right?
  • Bloomberg Terminal is often just used remotely these days.

What’s special about SolidWorks and Siemens NX is CAM. I bring them all up because they’re the foundation of every secondary-sector company. The fact that you don’t know the difference demonstrates little more than your ignorance of the matter. Virtually every physical object you can see was created using them.

Bloomberg Terminal is not used remotely; that’s Bloomberg Anywhere, and it’s a different service.
 
What’s special about SolidWorks and Siemens NX is CAM. I bring them all up because every industrial company uses them.

And yet every single one of the products you've named is advertised as CAD.

"Every industrial company" also uses accounting software, almost all of which is no longer Windows-specific. "Every industrial company" also uses groupware. Or word processing. Or CRUD apps. Or tons and tons of other apps.

Glad you enjoy being a CAM Expert™, though.
 
And yet every single one of the products you've named is advertised as CAD.

"Every industrial company" also uses accounting software, almost all of which is no longer Windows-specific. "Every industrial company" also uses groupware. Or word processing. Or CRUD apps. Or tons and tons of other apps.

Glad you enjoy being a CAM Expert™, though.

But it is, because full Excel is only available on Windows x86. Even full Outlook is only available on Windows x86. You can run a virtual machine based on a dev build of Windows for ARM, but that’s not stable.

I’m a “deal with big companies” expert, though, not a CAM expert.

Sorry the professional world doesn’t adjust to your views. Yes, CAD and CAM are incredibly important, everything is done that way. What can I say?

Well, if you wish, only roads, bridges, tunnels, houses, buildings, warehouses, appliances, furniture, bicycles, motorbikes, cars, buses, trains, ships, planes, rockets, satellites, drones, computers, smartphones, the robots that build those and so on and so forth.

Look, my point is, until now you could use professional editing tools such as Photoshop, FCP, and Logic Pro, you could code for Apple, and you could run virtually every other pro application for different sectors, all with only one computer. Not anymore. Apple’s silicon is powerful, but it’s very, very limited.
 
And yet every single one of the products you've named is advertised as CAD.

"Every industrial company" also uses accounting software, almost all of which is no longer Windows-specific. "Every industrial company" also uses groupware. Or word processing. Or CRUD apps. Or tons and tons of other apps.

Glad you enjoy being a CAM Expert™, though.
You remember that even Apple’s suppliers use Windows in their environments, right?
 
It’s true that the new chips are great for rendering YouTube videos, mixing tracks, and Photoshop. YouTubers and DJs say that’s very professional and important work. Who am I to say otherwise?

On the other hand, you need an Intel processor to run a stable, full Excel, AutoCAD, SolidWorks, Siemens NX, Bloomberg Terminal, and an overwhelming number of other professional programs that only work on x86.

You know, those things that people who create and run finance, companies, the government, planes, rockets, ships, cars, buildings, iPhones, M1 processors, and other similarly unimportant things use. So as far as I’m concerned, professionals use Intel.
Those Windows and Linux PCs that Apple hardware engineers have to use to design and simulate their iPhones, Macs, chips, etc. Intel should make an ad, “Designed on Intel,” that mimics Apple’s “Designed in California” and shows how Apple needs to use PCs to do their electrical engineering design and signal integrity analysis when making their own products.
 
But it is, because full Excel is only available on Windows x86. Even full Outlook is only available on Windows x86. You can run a virtual machine based on a dev build of Windows for ARM, but that’s not stable.

But I suspect few people need "full Outlook". There are some things I miss in the Mac version, but nowhere near enough to make me go "boy, a Mac is completely unworkable". And of the few people who do need it, I suspect most wouldn't need to run it locally; RDP is enough.

With Excel… maybe?

But again, I'm not denying that "applications that are far more practical on a Windows/x64 setup" is a thing. I just think that's a very small slice of work these days.

Sorry the professional world doesn’t adjust to your views. Yes, CAD and CAM are incredibly important, everything is done that way. What can I say?

Nobody is denying the importance of CAD.


Look, my point is, until now, you could have professional editing tools such as Photoshop, FCP, and Logic Pro, plus virtually every other pro application for different sectors, with only one computer.

Sure, but who needs that? How many people are both an audio editing expert who works in Logic or Cubase all day and also someone who does CAD?

And how many actually ever wanted the hassle of dual-booting during their work day? How do managers feel about that? At that point, why not just use Windows full-time? (Yes, virtualization exists. But you can't have it both ways. You're either arguing that virtualization/emulation is good enough, in which case an ARM Mac will probably do just as well, or that performance matters for those applications, in which case you really wanted Windows to run natively anyway.)


Not anymore. Apple’s silicon is powerful, but it’s very, very limited.

I don't think it's anywhere near as limited as you think.

But yes, there are absolutely fields where the Mac has become less practical, due to the arch switch. No doubt.
 
Those Windows and Linux PCs that Apple hardware engineers have to use to design and simulate their iPhones, Macs, chips, etc. Intel should make an ad, “Designed on Intel,” that mimics Apple’s “Designed in California” and shows how Apple needs to use PCs to do their electrical engineering design and signal integrity analysis when making their own products.

…why?

As a juvenile gotcha?
 
But I suspect few people need "full Outlook". There are some things I miss in the Mac version, but nowhere near enough to make me go "boy, a Mac is completely unworkable". And of the few people who do need it, I suspect most wouldn't need to run it locally; RDP is enough.

With Excel… maybe?

But again, I'm not denying that "applications that are far more practical on a Windows/x64 setup" is a thing. I just think that's a very small slice of work these days.

Nobody is denying the importance of CAD.

Sure, but who needs that? How many people are both an audio editing expert who works in Logic or Cubase all day and also someone who does CAD?

I don't think it's anywhere near as limited as you think.

But yes, there are absolutely fields where the Mac has become less practical, due to the arch switch. No doubt.

In actuality, it's pretty common for someone whose everyday role is more technical to also have to put together a presentation and use Photoshop or FCP for it.
 
As a CAD user myself, the Mac versions of AutoCAD, Ares Commander, and BricsCAD are all pretty gimped. I don't have any experience with the rest, however.

Macs are mostly sold to graphics, video, and software development professionals. I don’t know a lot about the CAD market, but I suspect the cost and performance of high-end Mac Pros and iMac Pros never made for great CAD machines, so why would CAD devs do the extra work to keep their software up to date on them?

Now you can get a 5K monitor with an M1 Max (either a MacBook Pro or a Mac Studio) that blows away the iMac Pro at two-thirds the price, or a Mac Studio Ultra that destroys a Mac Pro at less than half the price. It will be interesting to see whether Apple’s vastly cheaper performance Macs convince CAD devs to refresh their Mac products to catch up with their other platforms.
 
Now you can get a 5K monitor with an M1 Max (either a MacBook Pro or a Mac Studio) that blows away the iMac Pro at two-thirds the price, or a Mac Studio Ultra that destroys a Mac Pro at less than half the price.
In some engineering disciplines, it doesn't meet the mark.
 
Now you can get a 5K monitor with an M1 Max (either a MacBook Pro or a Mac Studio) that blows away the iMac Pro at two-thirds the price, or a Mac Studio Ultra that destroys a Mac Pro at less than half the price. It will be interesting to see whether Apple’s vastly cheaper performance Macs convince CAD devs to refresh their Mac products to catch up with their other platforms.

"Vastly cheaper" than previous Macs, maybe; vastly cheaper than the competition, ehhh.
 
But it is, because full Excel is only available on Windows x86. Even full Outlook is only available on Windows x86. You can run a virtual machine based on a dev build of Windows for ARM, but that’s not stable.

I’m a “deal with big companies” expert, though, not a CAM expert.

Sorry the professional world doesn’t adjust to your views. Yes, CAD and CAM are incredibly important, everything is done that way. What can I say?

Well, if you wish, only roads, bridges, tunnels, houses, buildings, warehouses, appliances, furniture, bicycles, motorbikes, cars, buses, trains, ships, planes, rockets, satellites, drones, computers, smartphones, the robots that build those and so on and so forth.

Look, my point is, until now you could use professional editing tools such as Photoshop, FCP, and Logic Pro, you could code for Apple, and you could run virtually every other pro application for different sectors, all with only one computer. Not anymore. Apple’s silicon is powerful, but it’s very, very limited.

If you want web sites, illustrations, photos, videos, or software applications, a large number are built or processed on Macs. I’ve never worked anywhere where the web devs weren’t using Macs. Even our Android developers use Macs.

Two days ago I asked the Android lead why he didn’t just use a Windows laptop for Android Studio instead of a MacBook Pro, and he laughed at me and said it was terrible on Windows and that Windows laptops mostly sucked.

The difference between me and you is that I realize “professional” computer use runs the gamut of industries and applications, not just my own narrow specialty. Professional personal computer users are anyone who creates enough value using their computer to justify spending thousands of dollars to get the one that is the best, most productive tool for their tasks.
 
In some engineering disciplines, it doesn't meet the mark.

Of course not. There will never be a computer or a platform that is the best solution for every person in every profession. I would never spend extra to buy Macs for low-wage customer support techs when a $400 Windows PC works nearly as well. And I would never try to skimp on a software developer’s Mac setup when their cost is over $100/hour and any 1% productivity increase is worth investing thousands of dollars.
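To put rough numbers on that claim (the hourly rate is from above; the 2,000-hour work year is just an assumed figure):

```swift
// Rough sanity check of the "1% productivity is worth thousands" claim.
let hourlyCost = 100.0                      // stated: over $100/hour, fully loaded
let hoursPerYear = 2_000.0                  // assumed full-time work year
let annualCost = hourlyCost * hoursPerYear  // $200,000 per developer per year
let onePercentGain = 0.01 * annualCost      // $2,000/year recovered per developer
print("1% ≈ $\(Int(onePercentGain)) per developer per year")
```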

The GPUs in the M1 Max and Ultra are monsters when it comes to graphics and video production. Not so much when it comes to rendering, apparently. So depending on a graphics professional’s workflow needs, they could be either amazingly cheap or just too damn expensive.
 
If you want web sites, illustrations, photos, videos, or software applications, a large number are built or processed on Macs. I’ve never worked anywhere where the web devs weren’t using Macs. Even our Android developers use Macs.

Two days ago I asked the Android lead why he didn’t just use a Windows laptop for Android Studio instead of a MacBook Pro, and he laughed at me and said it was terrible on Windows and that Windows laptops mostly sucked.

The difference between me and you is that I realize “professional” computer use runs the gamut of industries and applications, not just my own narrow specialty. Professional personal computer users are anyone who creates enough value using their computer to justify spending thousands of dollars to get the one that is the best, most productive tool for their tasks.

But, until now, you could do all of it with a Mac. That’s the reason you could see Macs outside design studios.

That’s the whole point. A Mac’s utility for the overall professional market went downhill with this transition, so Apple could increase their margins.

I get it; that’s their job. I complain because it affects me negatively.
 
Of course not. There will never be a computer or a platform that is the best solution for every person in every profession. I would never spend extra to buy Macs for low-wage customer support techs when a $400 Windows PC works nearly as well. And I would never try to skimp on a software developer’s Mac setup when their cost is over $100/hour and any 1% productivity increase is worth investing thousands of dollars.

The GPUs in the M1 Max and Ultra are monsters when it comes to graphics and video production. Not so much when it comes to rendering, apparently. So depending on a graphics professional’s workflow needs, they could be either amazingly cheap or just too damn expensive.
Indeed, and I don't think the next Mac Pro will necessarily be sufficient for some of those needs, especially if you're doing memory- and processor-intensive computational work (the word is not coming to mind).
 
And yet every single one of the products you've named is advertised as CAD.

"Every industrial company" also uses accounting software, almost all of which is no longer Windows-specific. "Every industrial company" also uses groupware. Or word processing. Or CRUD apps. Or tons and tons of other apps.

Glad you enjoy being a CAM Expert™, though.

The weird part is I’ve never seen any company invest much in PCs for accounting personnel, except for the CFO and their key lieutenants.

I once tried to explain to a purchasing director at a Fortune 1000 company the need to get newer, faster Macs for our devs, who carried a $150K total annual cost. Every hour of work saved pushed off the day we had to add another $150K annual cost. It went over his head, and instead he went off on a rant about how he got by with his cheap four-year-old Windows PC that took “20 minutes to boot”.
 
Are there any benchmarks on neural engine performance?

A lot of the benefits of the M1 architecture are the heterogeneous computing modules (neural engines, media engines, etc.), but all the benchmarking I see is plain-vanilla CPU/GPU testing.
eclecticlight.co has been trying to look into this. It's not easy to find a use case that can be timed, and things appear to be very much in flux, but one case that can be tested is Visual Look Up (have your Mac recognize a piece of artwork, a type of flower, or whatever). He verified that this, on a "single" M1, takes about half the time it takes on a high-end Intel Mac (I think he used an iMac Pro).

This gives us one datapoint, but leaves unspecified whether the task would be another 2x faster on an Ultra.
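If you want a crude datapoint of your own, the closest public API is Vision's built-in classifier. A minimal Swift timing sketch (the image path is a placeholder, and nothing in the API tells you whether the work actually landed on the Neural Engine rather than the CPU/GPU):

```swift
import Foundation
import Vision

// Time Vision's built-in image classifier on a local file. This is not Visual
// Look Up itself, just the nearest public API; comparing wall-clock time across
// machines is only a rough proxy for Neural Engine throughput.
func timeClassification(of url: URL) throws -> TimeInterval {
    let request = VNClassifyImageRequest()
    let handler = VNImageRequestHandler(url: url, options: [:])
    let start = CFAbsoluteTimeGetCurrent()
    try handler.perform([request])
    let elapsed = CFAbsoluteTimeGetCurrent() - start
    if let top = request.results?.first {
        print("Top label: \(top.identifier) (confidence \(top.confidence))")
    }
    return elapsed
}

let imageURL = URL(fileURLWithPath: "/path/to/test.jpg")   // placeholder path
do {
    let seconds = try timeClassification(of: imageURL)
    print("Classification took \(seconds) s")
} catch {
    print("Classification failed: \(error)")
}
```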

To be fair, the NPU is somewhat like where GPGPU was about 15 years ago (when CUDA had just come out). There's an expectation that great things are possible, but also that everything is in flux at both the HW and SW levels.
Apple uses this stuff right now for photo classification, image lookup, Live Text, and some parts of Siri (I think both the voice analysis and the voice synthesis, but mostly not the actual "answer/task generation"). What's unclear, for example, is whether they even use the NPU yet for language tasks (like translation); so much of that stuff is old code that runs on pre-NPU devices, and there's always a tension between keeping that running (and backwards compatible) vs. throwing it away, starting from scratch, and just saying "Language Processing 2.0 only runs on A14/M1 and later".

A similar question could arise regarding encoding. If I have two encoder engines available in an M1 Ultra, can I encode to H.265 at higher quality? Can I even do the simpler task of performing two such hardware encodes at once?

In one sense you can say "of course you should be able to, anything else is dumb"; in another sense, building up ANY serious API/driver infrastructure takes time, it really does, and often the way this plays out is that by the time the nicely functioning versions of all these APIs ship, it's three years after the first hardware shipped.
I mean, even something like the Live Text and Visual Look Up UIs are (let's be honest) pretty awful; they do the job but are so damn clumsy! It takes a year to get the basic tech into people's hands, then at least another year to see how people use it and figure out a better UI packaging.
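For what it's worth, this is roughly all the control the public API gives you today: you can ask VideoToolbox to require a hardware HEVC encoder, but nothing tells you how many encode engines exist or lets you target a particular one. A macOS-only sketch (the 4K frame size is arbitrary):

```swift
import CoreMedia
import VideoToolbox

// Ask VideoToolbox for an HEVC compression session and require that it be
// hardware-accelerated (macOS-only key). Note what the API does NOT expose:
// how many encode engines exist, or any way to pick between them.
let spec = [kVTVideoEncoderSpecification_RequireHardwareAcceleratedVideoEncoder: true] as CFDictionary

var session: VTCompressionSession?
let status = VTCompressionSessionCreate(
    allocator: kCFAllocatorDefault,
    width: 3840,                       // arbitrary 4K frame size for the example
    height: 2160,
    codecType: kCMVideoCodecType_HEVC,
    encoderSpecification: spec,
    imageBufferAttributes: nil,
    compressedDataAllocator: nil,
    outputCallback: nil,               // real code would feed frames via VTCompressionSessionEncodeFrame
    refcon: nil,
    compressionSessionOut: &session
)

print(status == noErr ? "Hardware HEVC encoder session created"
                      : "No hardware HEVC encoder available (status \(status))")
```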
 
The weird part is I’ve never seen any company invest much in PCs for accounting personnel, except for the CFO and their key lieutenants.

There's kind of no point. Even a Chromebook is plenty good.

There's little heavy computation involved, and for the special cases where there is, you preferably run that part on a server anyway, as it needs to be multi-user. So on the client side, it's really just a CRUD app. Forms, lists, charts, fairly basic user interaction, almost no computation at all.

I once tried to explain to a purchasing director at a Fortune 1000 company the need to get newer, faster Macs for our devs, who carried a $150K total annual cost. Every hour of work saved pushed off the day we had to add another $150K annual cost. It went over his head, and instead he went off on a rant about how he got by with his cheap four-year-old Windows PC that took “20 minutes to boot”.

Sure, that's the good ol' "workers should be in an open-plan office, and they'll be more productive that way! Except of course for me, the boss, who oddly finds all sorts of reasons why that's terribly unproductive and somehow doesn't have the self-reflection to acknowledge that maybe it's terrible for them as well" rules-for-thee-not-for-me thing.

Although really, in tech, I haven't encountered it much. Very easy to convince a manager "you could give the entire team raises, or you could simply give them newer computers… which one is cheaper?"
 
A similar question could arise regarding encoding. If I have two encoder engines available in an M1 Ultra, can I encode to H.265 at higher quality? Can I even do the simpler task of performing two such hardware encodes at once?

In one sense you can say "of course you should be able to, anything else is dumb"; in another sense, building up ANY serious API/driver infrastructure takes time, it really does, and often the way this plays out is that by the time the nicely functioning versions of all these APIs ship, it's three years after the first hardware shipped.

I don't know about the decoder engine in particular, but I know for a fact that the M1 Ultra appears as one Metal device, so it's fully abstracted away on the GPU front.
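You can check that yourself in a few lines; on an Ultra this lists a single GPU (sketch assumes macOS, where MTLCopyAllDevices is available):

```swift
import Metal

// List every Metal device the system exposes (macOS only).
// On an M1 Ultra this prints one device: the two dies are presented
// to applications as a single GPU rather than two.
for device in MTLCopyAllDevices() {
    print(device.name, "- unified memory:", device.hasUnifiedMemory)
}
```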

 
I don't know about the decoder engine in particular, but I know for a fact that the M1 Ultra appears as one Metal device, so it's fully abstracted away on the GPU front.
Of course! That was basically essential before the device could ship, since it was going to be a selling feature.
And obviously this would be a goal for the NPU (and media systems) going forward. The question is whether it has been achieved today.
 