Feel free to provide your own definition, per my post.

That’s not possible at Intel’s rated specs. macOS or Windows isn’t relevant; it’s an Intel issue.
Is that not what Intel's Thermal Velocity Boost does?
Intel® Thermal Velocity Boost Frequency
Intel® Thermal Velocity Boost (Intel® TVB) is a feature that opportunistically and automatically increases clock frequency above single-core and multi-core Intel® Turbo Boost Technology frequencies based on how much the processor is operating below its maximum temperature and whether turbo power budget is available. The frequency gain and duration is dependent on the workload, capabilities of the processor and the processor cooling solution
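In other words, the quoted behaviour boils down to a simple gate: the extra frequency only applies while the die is running cool enough and the turbo power budget hasn't been spent. Here's a rough sketch of that decision in C, with made-up names and threshold values purely for illustration (this is not Intel's actual algorithm or any real firmware interface):

/* Toy model of the opportunistic boost decision described above.
 * All names and values are illustrative, not Intel's real implementation. */
#include <stdbool.h>
#include <stdio.h>

#define TURBO_MHZ          5000  /* hypothetical single-core Turbo Boost ceiling   */
#define TVB_BONUS_MHZ       300  /* hypothetical extra headroom TVB can add        */
#define TVB_TEMP_CUTOFF_C    70  /* hypothetical "well below max temperature" line */

static int target_frequency_mhz(int die_temp_c, bool turbo_power_budget_left)
{
    /* TVB only applies while the package is cool enough AND the turbo
     * power budget hasn't been exhausted; otherwise plain turbo applies. */
    if (die_temp_c < TVB_TEMP_CUTOFF_C && turbo_power_budget_left)
        return TURBO_MHZ + TVB_BONUS_MHZ;
    return TURBO_MHZ;
}

int main(void)
{
    printf("cool, budget free : %d MHz\n", target_frequency_mhz(55, true));
    printf("hot, budget free  : %d MHz\n", target_frequency_mhz(85, true));
    printf("cool, budget spent: %d MHz\n", target_frequency_mhz(55, false));
    return 0;
}

Which is also why the sustained numbers you actually see depend so heavily on cooling: once the package warms past that cutoff, the bonus disappears, exactly as the quote says ("dependent on the workload, capabilities of the processor and the processor cooling solution").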
 
There is a hole, but it’s not massive. 80% of Apple’s sales are notebooks, and another 10-15% of sales are iMacs. That leaves 5-10% of sales to be split between the mini, iMac Pro and Mac Pro.

According to Apple, a lot of creatives moved to iMac when it became fast enough that they didn’t need to buy a Mac Pro anymore. (Surely that trend was accelerated by the amazing 5K Retina screen on the 27” iMac.)

It’s impossible to give everyone what they want; some are always going to want something different. There are those that want a larger mini with a PCIe slot or two. Those who want nVidia GPUs. Those who won’t buy until they get upgradeable SSDs, USB-A and/or MagSafe on a MBP. Those who want AMD CPUs, etc.

What’s Apple to do, try to satisfy every niche?

But, how do we know having an xMac wouldn't dramatically change those numbers, at least in the smaller segment of the pie? We don't.

And, pretty much everything on your list except for the MagSafe could be put on ONE model of Mac that would keep nearly that whole crowd happy, right?

I bought a 2018 mini, which I'm pretty happy with. But, had an xMac been available, I certainly would have considered that instead. I'm guessing a lot of iMac people (especially when the mini was missing) might have gone that route too.

Apple creates a lot of these self-fulfilling situations. For example, when my iPhone SE breaks or gets too old, and I buy a phone w/o a 3.5mm jack, can they conclude I've finally stopped wanting one?

While I'm probably fine with my mini, is having a thermally capable, even slightly expandable, non-$6k machine that crazy a request? You folks make it sound like we're asking for something outlandish... even though it's common within the computer industry as a whole.

While I'll be the first to jump all over someone saying everything Apple makes is junk, I'll equally jump all over the people trying to tell everyone the obvious isn't obvious.

The other interesting point, if your stats are correct, is that what.... like 97% of Mac use isn't pro-use anymore? I hope that isn't the case, though I suppose it is possible. Or, have they just been ***getting by*** with what Apple has been offering?

How else are those same people going to complain that Apple isn't focused enough and has too many product offerings?

I'm not sure that is always the same crowd. But why can't we have a Mac for advanced hobbyists, prosumers, etc., when we can have a dozen different iPhone models or laptops with little actual differentiation?

I think the argument would be to go back more towards Steve's 'grid', maybe just one level bigger. What we have now is more like sub-dividing 2 of Steve's 4 boxes into a dozen, then tearing one of the boxes off the chart and tossing it a mile up the road.

I was a Mac consultant back in the Performa, Centris, Power Computing, etc. days, and I don't want to go back there. At the same time, I'd like a machine I could safely run hard that doesn't start at $6k too (or sound like a jet taking off).

I'm sure some people would love a $2k, $1k, $whateverK Mac Pro Lite, xMac, whatever. What they're missing is that the market has moved on. Offices aren't filled with towers any more; they're filled with laptops. For the edge cases where you need high-end GPU power, an eGPU will almost invariably do. The Mac Pro is a high-end niche because it can be.

Now, Apple does choose not to serve every niche. And for those who want macOS but don't quite want any of the Mac hardware Apple is offering, that sucks. But it's not a lot of people.

But, as above, what about people who 'crunch' something outside of a few minutes here and there? Are there none of us left? Everyone just uses computers for email and browsing these days?

I just don't believe there isn't a market between email and Hollywood video editing / 3D animation anymore. Now, maybe Apple has just decided they don't care about that market anymore. But, I don't believe it doesn't exist.

As far as Apple chasing market segments, I think they've got the "small, throttled, non-expandable, thermally-challenged" market covered.

Exactly! And, I guess we're to believe 99% of people don't want anything more than that?
 
Not only is the 97% figure correct for Mac users not using any kind of pro apps (give or take a %), it's also true with Windows as well. Intel primarily sells midrange laptop CPUs, and the bulk of their money comes from server CPUs. High-end desktop CPU sales are a rounding error to Intel. 65% of computers sold don't even have discrete graphics from AMD or Nvidia.
 
I'm not a developer at all (OK, I've written a few tiny python apps and modified a bunch of stuff while working in IT), but my understanding is that it is quite dependent on the app. Maybe a calculator or word-processor might be able to just recompile, but that isn't going to work with an app like SolidWorks or something like Autodesk 3ds Max.
Yes, the more complex the app, the more likely a developer will rely on other developers, i.e. third-party frameworks, libraries, etc. However, that doesn't absolutely equate to a lower probability of porting, maybe just more patience necessary.
I'm not sure how big of a market we are, but I think a lot of us are worrying about losing access to those kind of higher end apps and utilities that are highly unlikely to port, but which we can currently run under Windows (on our Macs) when needed.
And the demand, i.e. market, is the typical driving force. If it's profitable for a company, most won't be swayed by (added) development costs. The same logic goes for generally maintaining and improving apps, whether they're headed for a major transition or not. In other words, if customers and business models provide, the willingness of companies to overcome obstacles should be of little concern. Otherwise, discontinuation is probably the next stop no matter what people claim is ahead.
It seems to me that Cloud hosting of applications ultimately diminishes the issue of platform dependencies.
Indeed, developers can provide a lot of information and functionality within a Web browser, which is fairly consistent cross-platform, significantly reducing development costs. Therefore, Web apps are climbing strongly in popularity, including development resources from the basics (HTML, JS, CSS, etc) to extended/enhanced frameworks, libraries, etc (jQuery, React, Bootstrap, Blazor, etc).

Admittedly, Web apps aren't viable for everything; however, there are plenty of feasible use cases.
 
But, as above, what about people who 'crunch' something outside of a few minutes here and there? Are there none of us left? Everyone just uses computers for email and browsing these days?

I just don't believe there isn't a market between email and Hollywood video editing / 3D animation anymore. Now, maybe Apple has just decided they don't care about that market anymore. But, I don't believe it doesn't exist.
Not really. For example, take a videographer with an iMac. The preview of the film changes probably updates in a matter of seconds, but the final rendering requires minutes, maybe even a couple of hours. During that time you're able to move on to another task, whether on your Mac or not. Ultimately, that's acceptable (and expected). Is it possible to perform UHD AV renders in seconds? Absolutely, but the financial cost is probably 10x. Is that a willing trade-off? Let's assume you say no, because it's probably true the results don't equal or exceed the costs. Well, that's what it's going to take. For other entities, such as movie production companies, that extra system expense is reasonable because the returns outweigh it. Basically, there is demand, but because the reality of bringing about such a scenario is financially nonsensical or even unattainable, the market is vanishingly small.
 
Well, that aged like milk.

The Threadripper Pro series of CPUs exists, and the Lenovo P620 is coming in September (12-64 cores, 128 PCIe 4.0 lanes, 2TB max memory). It will start at $4,600 for 12 cores. I'll bet that the 16-core version will be $6,000.


People haven't upgraded their systems because Intel has been on 4 cores/8 threads for a decade. Hard to justify replacing a computer when you are only seeing a 3-5% performance increase per generation.

That isn't the case anymore. The upcoming low end will be 8 core/16 thread systems.
 
Yeah, the 64-core Threadripper Pro flagship is like two generations ahead of the 28-core Intel Xeon found in the Mac Pro.
 
Not that this detracts much from the impressive specs, but the maximum supported memory in the P620 is 1TB not 2TB. I mention this just in the interest of accuracy. (ref)

The chip can address 2TB of memory according to AMD.

Kinda like how the low-end 7,1 can "only" (I can't believe I just wrote that) address 768GB; the chip itself can address 1TB.
 
Not only is the 97% figure correct for Mac users not using any kind of pro apps (give or take a %), it's also true with Windows as well. Intel primarily sells midrange laptop CPUs, and the bulk of their money comes from server CPUs. High-end desktop CPU sales are a rounding error to Intel. 65% of computers sold don't even have discrete graphics from AMD or Nvidia.

Oh, for sure, if we're talking about the market in total. I mean, how many Wintel boxes are point-of-sale terminals or just filling rows of cubicles?

I guess my point is that there was a time when Macs had a relatively high percentage use by creative pros, in multiple fields (not just video editing). So, in comparison to the Wintel market, a higher percentage were pro-use in Macs, which kind of leveled the playing field more in terms of which platform apps might target.

For example, when I started in 3D work, you pretty much used a Mac if you did anything high-end. There were some PC apps, but mostly in architectural visualization and stuff like that, and the output wasn't that great. Hollywood was almost all Mac. If you were making a new 3D app, you likely targeted the Mac first.

Similarly, for professional CAD that wasn't AutoCAD or super high-end. Some of the first 3D solids modeling CAD that didn't cost a small fortune was on the Mac. Form*Z, Ashlar's Vellum Solids and Cobalt, etc. (PC had stuff like SolidWorks and Catia, but that wasn't touchable unless you were pretty big.)

I could tell a bit of a story there, actually... as I was working with an ID firm and needing real 3D solids CAD. I called Ashlar, as they had the closest to what I was looking for, but it wasn't quite there. The guy I was talking to said... hey, do you want to talk to the dev of something we're working on? I was like... SURE! So, I end up on the phone with Tim Olson (currently ViaCAD, SharkCAD) and he's telling me about something they are just moving from alpha to beta on and wondering if I'd want to start testing it. I'm like... Yeah!

Tim had come out of Lockheed (SR-71 kind of stuff, aerospace CAD) and was putting a new kind of interface on their CAD software on a new engine (Spatial SAT). The kind of drawing assist tools and stuff you more commonly see in better CAD stuff largely came from his work. I and people from Scaled Composites (Burt Rutan, Voyager fame), and some yacht racing designers and stuff were in those early betas and forums. Fun stuff.

Same in 3D rendering. There were people in the forums like Alex Lindsay and even a couple times, John Knoll (ILM Rebel Unit)... people working on the Matrix, etc.

The Mac was THE place to be for this stuff. The PC AutoCAD/3DS Max people were jealous of the stuff we were doing. My how the tables have turned.

Yes, the more complex the app, the more likely a developer will rely on other developers, i.e. third-party frameworks, libraries, etc. However, that doesn't absolutely equate to a lower probability of porting, maybe just more patience necessary.

Yeah, the patience part is what has me concerned. :) But, yes, I suppose if there is a big enough market, the devs will come. However, some of these bigger companies also have a level of bias, just like a lot of IT departments. There is more than just market and money involved, unfortunately.

Admittedly, Web apps aren't viable for everything; however, there are plenty of feasible use cases.

Yeah, though aside from the brute need for compatibility, I don't like most of them very much. I guess it is better than nothing, but many of the web apps in a wrapper are a bit flaky and often have UI issues (I use several, like Slack, Todoist, etc.)

Not really. For example, take a videographer with an iMac. The preview of the film changes probably updates in a matter of seconds, but the final rendering requires minutes, maybe even a couple of hours. During that time you're able to move on to another task, whether on your Mac or not.

Oh, absolutely, it is doable these days, often even on lower end machines. Then you spend $ to get it done more quickly (to make more $).

The problem is more thermals and damage to equipment. I get that if a rendering takes 24 hours, I can spend $ to make that into 10 hours, etc. But, if the 24 hours damages my system, then I almost have to try and go for that higher-cost system, even if it doesn't balance out with the income side of things.

In the PC market, I can build a machine closer to the cost of a base Mac that can handle the thermals, AND spend more money for higher quality, or higher performance systems. The range is available. Apple has nothing under the cost of an iMac Pro.

I make do with my mini, and have my fingers crossed that by moving the GPU out, maybe my luck will be better (than it had been with MBPs). We'll see. But even then, it would be *nice* to have less noise when I push it. If it were a bit bigger, or an xMac, that probably wouldn't be an issue.

Huh? There are at most 5 but realistically 3 new models of iPhone, and three models of Mac laptops.

Yes, they have narrowed down the lines recently. It was more not long ago (especially if you include the years-old models they were selling as more entry-level models).
 
Not that I don't have plenty of things to slam Apple about lately, but I think the position of the Mac has changed mostly due to the overall evolution of computers. Personal computers and client-server relationships have all shifted, some resembling their origins much more closely than ever, some becoming obsolete, and so on.
Yeah, the patience part is what has me concerned. :) But, yes, I suppose if there is a big enough market, the devs will come. However, some of these bigger companies also have a level of bias, just like a lot of IT departments. There is more than just market and money involved, unfortunately.
Valid point, though from what I've seen the attitude holds on the whole. So, if a developer originally released their software on macOS and has done so for years without any sign of a version for any other OS, you can be confident they'll continue to support macOS, and probably only macOS, whether Apple moves from Motorola/IBM to Intel, OpenGL to Metal, etc. In other words, a biased dev that created an app for macOS is probably diehard to macOS. And because you're using said software, you're probably dedicated to macOS (and their software). Basically, even if there's a bias, logically, it should be in line with yours.
Yeah, though aside from the brute need for compatibility, I don't like most of them very much. I guess it is better than nothing, but many of the web-apps in a wrapper are always a bit flaky and often have UI issues (I use several, like Slack, Todoist, etc.)
Web languages do seem to have more unreliable (i.e. unexpected) behavior compared to platform-specific code. Although I've observed plenty of UI glitches on many platforms, including frustrations with my own coding.
I may not have been clear on this. My comment of "unattainable" was primarily in reference to (sustainable) financials. And your point about thermals is playing off what I meant to say, which is that if you frequently have highly demanding tasks 'crunching' for hours or days, it's logical to have a dedicated system for those specific tasks. I vaguely recall seeing a YouTube video that featured a system with dual power supplies, four high-end video cards, 64 or 128GB of RAM, and (I think) a Xeon CPU that was used as a video rendering server. It was a beast of a computer tower, but I don't doubt it performed its assigned task well.

In contrast, could you render a 4K, 3D-animated film with Dolby sound on a Mac mini, the same computer you use for email, Web browsing, YouTube, etc? Let's assume yes, but of course it's not going to be ideal. Other tasks will probably lag, the fan/blower will be loud, spinning at peak speed to keep the system just within thermal limits, the casing will be hot, and so on. And yes, it's because the Mac mini, iMac, and other personal computers, or even workstations, aren't designed with that situation in mind. Just as if you used a server configuration like those utilized by Amazon's or Microsoft's cloud services to surf the Web or check your email: would it work? Yes, though you'd be wasting a lot of energy, and the money to provide that energy.

In conclusion, it's not always possible to have one size fits all. The (sensible) reality might be that an MBA and MP combination is the proper choice if an iMac (Pro) isn't powerful enough. I hope that cleared it up. :)
 
Probably a bonehead move by me to get involved in the AMD/Intel argument, however, it doesn't appear anything has changed over the decades.

Foremost, people pick a brand, typically based on a few instances, and stick by it no matter what -- which isn't entirely bad, but not great. I say this as a former huge fan of AMD/ATI.

Second, which I'm not certain about and invite anyone to prove otherwise, AMD CPUs (still) have better numbers on their spec sheets but can't seem to strongly back up the perception.

Because I think Cinebench, Geekbench, etc. only gauge the very minimum for comparison, I did a quick search for some of the more "real world" tests:

Lightroom Classic CC 2019 CPU Roundup: Intel vs AMD

Premiere Pro CC 2019 CPU Roundup: Intel vs AMD vs Mac

I even tried to keep the comparisons more on the level by looking at models with similar clocks and core counts, and Intel still typically won.

Ultimately, I'm not intending to suggest AMD is bad and Intel is good; rather, just like in the early 2000s, gloating about AMD by strictly showing off specs doesn't seem like a smart idea. Again, feel free to post proof otherwise because I'm genuinely wondering if I'm overlooking an aspect.

DISCLAIMER: I know RAM, motherboard, software, automation method, OS, etc are factors as well though these kind of tests still provide an okay judgment.
 
Did you just equate your porting of 7zip to ARM with porting something like Maya to ARM with the implication being that it would likely be equivalent for them?

My example was related to the question of whether you need to include all the AVX performance optimizations for a viable port or can just trivially go with the C reference code.
In this sense Maya is just a significantly larger project in terms of source code lines and dependencies but not inherently more difficult to compile.

Maybe another example. For some projects I had to port dependent 3rd-party frameworks to ARM first, for instance the Qt framework. The Qt framework sometimes has orders of magnitude more complexity than the application using it, and then Qt itself depends on other 3rd-party frameworks like SDL. So I started walking up the dependency tree: SDL, others..., Qt, others..., final application.

So yes, more complex projects take more effort, but the inherent "difficulty" was the same, particularly with regard to the question of re-programming all the AVX and SSE optimizations I stumbled over. The real heavy lifting when going x86 -> ARM was done by the compiler anyway.
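To make the AVX question concrete, here's a toy sketch of the pattern I'm describing (my own illustrative example, not code from 7zip, Maya or any real project): the hand-written AVX path sits behind an x86-only macro, and the plain C reference loop is the only part that actually has to compile for ARM.

/* Illustrative only: a typical pattern in SIMD-optimized code, where the
 * hand-written AVX path is skipped on arm64 and the plain C reference
 * path builds unchanged. */
#include <stddef.h>
#include <stdio.h>
#ifdef __AVX__
#include <immintrin.h>
#endif

void add_arrays(float *dst, const float *a, const float *b, size_t n)
{
    size_t i = 0;
#ifdef __AVX__
    /* x86-only fast path: process 8 floats per iteration with AVX. */
    for (; i + 8 <= n; i += 8) {
        __m256 va = _mm256_loadu_ps(a + i);
        __m256 vb = _mm256_loadu_ps(b + i);
        _mm256_storeu_ps(dst + i, _mm256_add_ps(va, vb));
    }
#endif
    /* Portable C reference path: the only part that has to build on arm64. */
    for (; i < n; i++)
        dst[i] = a[i] + b[i];
}

int main(void)
{
    float a[5] = {1, 2, 3, 4, 5}, b[5] = {10, 20, 30, 40, 50}, out[5];
    add_arrays(out, a, b, 5);
    for (int i = 0; i < 5; i++)
        printf("%.0f ", out[i]);
    printf("\n");
    return 0;
}

On arm64 the #ifdef branch simply drops out at compile time and the reference loop still builds unchanged; in my experience the compiler's auto-vectoriser then recovers a good chunk of the performance on its own.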
 
Not that I don't have plenty of things to slam Apple about lately, but I think the position of the Mac has changed mostly due to the overall evolution of computers. Personal computers and client-server relationships have all shifted, some resembling their origins much more closely than ever, some becoming obsolete, and so on.

A lot has changed, but if I had to guess, there are now even more people doing higher-end, complex creative work. So, it shouldn't be a smaller market than it was back then. Apple is just chasing the average consumer more than the pro (maybe almost the opposite of the old days).

That isn't necessarily a bad thing, especially from a revenue perspective. But, we've been mostly OK with the Mac on Intel, as we could run the stuff from companies that couldn't be convinced to care about the Mac. Now we're going to have to convince them or change platforms.

Valid point, though from what I've seen the attitude holds on the whole. So, if a developer originally released their software on macOS and has done so for years without any sign of a version for any other OS, you can be confident they'll continue to support macOS, and probably only macOS, whether Apple moves from Motorola/IBM to Intel, OpenGL to Metal, etc. In other words, a biased dev that created an app for macOS is probably diehard to macOS. And because you're using said software, you're probably dedicated to macOS (and their software). Basically, even if there's a bias, logically, it should be in line with yours.

Yes, that will probably be the case with at least the bigger companies that make apps. I am worried about some of the smaller one-dev to, say, team-of-five apps that are maybe having a hard time keeping up with all the changes.

The real problem is an app like Revit, where they don't make a Mac version and probably have no plans to.

I may not have been clear on this. My comment of "unattainable" was primarily in reference to (sustainable) financials. And your point about thermals is playing off what I meant to say, which is that if you frequently have highly demanding tasks 'crunching' for hours or days, it's logical to have a dedicated system for those specific tasks. I vaguely recall seeing a YouTube video that featured a system with dual power supplies, four high-end video cards, 64 or 128GB of RAM, and (I think) a Xeon CPU that was used as a video rendering server. It was a beast of a computer tower, but I don't doubt it performed its assigned task well.

In contrast, could you render a 4K, 3D-animated film with Dolby sound on a Mac mini, the same computer you use for email, Web browsing, YouTube, etc? Let's assume yes, but of course it's not going to be ideal. Other tasks will probably lag, the fan/blower will be loud, spinning at peak speed to keep the system just within thermal limits, the casing will be hot, and so on. And yes, it's because the Mac mini, iMac, and other personal computers, or even workstations, aren't designed with that situation in mind. Just as if you used a server configuration like those utilized by Amazon's or Microsoft's cloud services to surf the Web or check your email: would it work? Yes, though you'd be wasting a lot of energy, and the money to provide that energy.

In conclusion, it's not always possible to have one size fits all. The (sensible) reality might be that an MBA and MP combination is the proper choice if an iMac (Pro) isn't powerful enough. I hope that cleared it up. :)

Exactly, except that this used to not be the case (hence the fuss). You could buy a Power Mac that could handle it starting at a much lower price (within prosumer or smaller-pro reach). Or, the often-mentioned xMac could do something like this too. Or, there are lots of options in the PC market. You don't have to buy the $10k Dell workstation if your budget PC isn't cutting it.

Mac users either have to pony up for the Mac Pro (which wasn't even an option for quite some time), or *make do* with a system that isn't really designed for the task. There isn't a middle ground.

My Mac mini was my compromise middle-ground. We'll see how it goes. Aside from some noise when I need to crunch, so far, so good.

I even tried to keep the comparisons more on the level by looking at models with similar clocks and core counts, and Intel still typically won.

Ultimately, I'm not intending to suggest AMD is bad and Intel is good; rather, just like in the early 2000s, gloating about AMD by strictly showing off specs doesn't seem like a smart idea. Again, feel free to post proof otherwise because I'm genuinely wondering if I'm overlooking an aspect.

Thanks, I kind of wondered about that. It also probably depends on what you're doing to some extent, kind of like GPUs (though to a lesser extent). I had a bad experience with an AMD in the past, but would be open to one again. But, I'd want to see some real world or pricing benefit for my particular use (assuming I were to build a PC box to switch or supplement my Mac).

In this sense Maya is just a significantly larger project in terms of source code lines and dependencies but not inherently more difficult to compile.
...
So yes, more complex projects take more effort, but the inherent "difficulty" was the same, particularly with regard to the question of re-programming all the AVX and SSE optimizations I stumbled over. The real heavy lifting when going x86 -> ARM was done by the compiler anyway.

I understand, I think. For my questions, I think I was thinking of complexity more as dependencies rather than hard-to-code. In other words, a high-end 3D app often depends on a lot of external libraries that are out of their control. So, even if they want to port, they'd potentially have to wait on a bunch of other companies/development teams, which might never get onboard.
 
So, you want a high-end iMac config, such as:
  • 3.6GHz 8-core 9th-generation Intel Core i9 processor, Turbo Boost up to 5.0GHz
  • 16GB 2666MHz DDR4 memory
  • Radeon Pro Vega 48 with 8GB of HBM2 memory
  • 512GB SSD storage
  • Magic Mouse 2
  • Magic Keyboard
...in a Mac Pro form factor?

For me, a new Mac mini will be a big boost, though I'm hoping the Apple Silicon Mac mini has higher-end Intel iMac performance. Essentially, adding more value without needing to stretch in a direction I'm not happy with or interested in.

The real problem is an app like Revit, where they don't make a Mac version and probably have no plans to.

Y-e-a-h... I think that's just a sad/frustrating situation when you need to search for a viable alternative and begrudgingly apply the resources to integrate it. If you don't find a feasible alternative, those resources will instead go to adding a system for that app (and, hopefully, trying to utilize it for anything else it would be valid for).

Or attempt to persuade the dev to create what you need... Or code your own. 😊
 

Yes, something like that more or less, but in a case designed to cool it properly. It doesn't have to be Mac Pro form-factor necessarily. Others might want a giant tower, I suppose.

Heck, I'd be incredibly happy if I could double the size of my current mini (6-core i7) and make the rest a proper cooling system. Easy access to the RAM would be nice as well.

I think a lot of the 'xMac' people want card slots, though. I'd probably have gone that direction if it existed, but I'm pretty happy with the eGPU route, too. (If you stick a stock GPU in a box, then you have to figure out how to cool it and silence those crappy fans. At least for me.)

Y-e-a-h... I think that's just a sad/frustrating situation when you need to search for a viable alternative and begrudgingly apply the resources to integrate it. If you don't find a feasible alternative, those resources will instead go to adding a system for that app (and, hopefully, trying to utilize it for anything else it would be valid for).

Or attempt to persuade the dev to create what you need... Or code your own. 😊

It is an industry problem. Revit is what most of the firms use, so that's where the jobs are. I think if you had your own firm, and didn't work too heavily on multi-firm projects, Vectorworks is competitive (though I don't know how they compare on a feature-to-feature basis), and is Mac based.

But, for me it will probably mean adding a PC at some point, I suppose.
 
<snip>
The other interesting point, if your stats are correct, is that what.... like 97% of Mac use isn't pro-use anymore? I hope that isn't the case, though I suppose it is possible. Or, have they just been ***getting by*** with what Apple has been offering?
No, it’s not that 97% of Mac use isn’t pro use; it’s that 97% of users don’t need to buy iMac Pro or Mac Pro to get the performance and features they want/need. They buy MBP or iMac. (The 27” 5K iMac was a game changer, and iMac Pro was a response to those who outgrew the regular iMac.)

Apple disclosed some limited pro use data in the April 2017 Mac Pro round table, where Phil Schiller mentions about 15% are pros—but they use MBP, then iMac, then Mac Pro, in that order (there was no iMac Pro at that time).
 

Sorry, yeah, I didn't mean that to be some kind of exact figure. I was also talking more about the older/original use of Pro, not the new one. Pro used to refer to a heavier kind of use, one that often required hardware built specifically for it. A lawyer could certainly be using a Chromebook for a multi-million $ case, and that would be professional use (new definition).

MacBook Pros and regular iMacs just aren't made for the kind of use I'm talking about (well, neither is my mini, but I'm making do as best as I can). They can do the work, just not as quickly, quietly, or trouble-free.

MacBook Pros, the higher-end iMacs and my mini are more pro-oriented than some of the other models, to be sure. They can certainly get some heavy or high-end work done, but these days, so can all the non-pro models. It's more a matter of speed and reliability.
 
I'm still in the return window for my $3k (including AppleCare/tax) 2019 16-inch MBP. What to do, what to do... I absolutely love this computer and have no other Mac to fall back on....
Kick back, relax and enjoy the ride. I'm doing that with my six-week-old 10th-gen MBP. Remember, OUR computers will be serviceable for a VERY LONG time. Can do Windows. Can do Linux. Let's see what those ARM units can do when they're released.
 

For sure. I think the only issue will be if Apple's new systems start to really eclipse the Intel stuff rather rapidly. What if in, say two years, Apple's systems are 3x faster, with some special purpose processing being like 20x faster? That doesn't mean our current Intel-based Macs won't be good, but the incentive to upgrade will be pretty huge.
 