Your comment, to me, just proves that yes, the problem IS the base price relative to what you get.

I was merely trying to change the conversation from affordability to value. Too many kneejerk "it's expensive cuz it's professional and you can easily pay it off after x years" B.S. responses.
 
Apologies for being away for ~2 months. Had been on this thread and wanted to add an observation:

I was merely trying to change the conversation from affordability to value. Too many kneejerk "it's expensive cuz it's professional and you can easily pay it off after x years" B.S. responses.

The general observation of "only +$XX more per month" is a fallacy because it is invariably a slippery slope: while it's true that a few bucks here or there don't affect the big picture, they still add up, since each incremental increase in expense means lower profit at the bottom of the balance sheet.

The calculus is whether the more expensive hardware really will result in a net gain to the financial bottom line. Usually the argument is: improved productivity = faster deliveries = more business & revenue = the investment pays for itself.
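That payback calculus can be sketched in a few lines; every dollar figure below is hypothetical, purely to make the break-even reasoning concrete:

```python
# Hypothetical break-even sketch: does pricier hardware pay for itself?
# Every dollar figure here is invented for illustration.

def payback_months(extra_cost, monthly_gain):
    """Months until a hardware premium is recovered by extra revenue."""
    if monthly_gain <= 0:
        return float("inf")  # never pays for itself
    return extra_cost / monthly_gain

# A $6,000 machine vs a $2,500 alternative, assuming the faster box
# nets an extra $250/month in billable throughput.
months = payback_months(6000 - 2500, 250)
print(f"Premium recovered in {months:.0f} months")  # 14 months
```

If the productivity gain is zero, the premium never pays off, which is exactly the "only +$XX/month" trap: the monthly cost is real even when the monthly gain isn't.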

For those balking at this $6K 'starting at' MSRP with underwhelming specs, it compares quite poorly to the past: the 'starting at' MSRPs of the 2013 'Trash Can' and the 2012 cheese grater were IIRC $3K and $2.5K respectively, and their base hardware, for its time, wasn't as technologically gutted as this 2019 MP appears to be. Granted, there is some defense in pointing out that a lot of the base price pays upfront for the capability to take the machine way upscale, but given that the motherboard is PCIe 3.x, that argument is technologically problematic.

What it comes down to is (a) pessimism on Apple's part about how many^H^H few mMPs they really expect to sell over the next five years, which results in a small denominator for amortizing the fixed development costs, and of course (b) Apple's incessant financial focus on tacking a high profit margin onto everything.
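Point (a) can be made concrete with a toy amortization model (all of these numbers are invented, not Apple's actual costs): the smaller the expected sales volume, the more fixed R&D each unit has to carry before margin is even applied.

```python
# Toy model: unit price = (parts + amortized R&D) * (1 + margin).
# Every figure is hypothetical.

def unit_price(parts_cost, dev_cost, expected_units, margin=0.35):
    """Price needed per unit to recover development cost at a set margin."""
    amortized_rd = dev_cost / expected_units
    return (parts_cost + amortized_rd) * (1 + margin)

# $200M of development spread over pessimistic vs optimistic volumes.
for units in (100_000, 1_000_000):
    print(f"{units:>9,} units -> ${unit_price(2500, 200e6, units):,.0f}")
```

With a pessimistic 100K-unit forecast, each machine carries $2,000 of R&D before margin; at 1M units it's only $200, so the volume assumption alone moves the price floor by thousands of dollars.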


-hh
 
The other thought is: are the higher prices related to Apple wanting to produce the Pro in the United States to avoid possible tariff action? (I know, I know, this veers somewhat closer to political discussion and I wanted to avoid that but I figured it needed to be said).
 
Wow. This conversation still has traction?

Few points to make this far on from the announcement.

Price tag. I see so many people comparing this to hand-built AMD systems. That's a double fallacy. First, Apple is a boutique builder, so you need to compare this to other boutique builders, with the standard 35-55% markup for the customisation and labour.
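As a quick sanity check on what that markup does to a price (the parts cost here is a made-up figure, not a real bill of materials):

```python
# What a 35-55% boutique-builder markup does to a hypothetical parts cost.

def boutique_price(parts_cost, markup):
    """Retail price after a boutique builder's markup."""
    return parts_cost * (1 + markup)

for markup in (0.35, 0.55):
    print(f"{markup:.0%} markup on $3,900 in parts -> "
          f"${boutique_price(3900, markup):,.0f}")
```

On those assumed numbers, the same parts land anywhere from the mid-$5K to low-$6K range, which is the honest comparison band for a boutique system, not the self-build total.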

Second, yes, AMD is less expensive. And yes, the current AMD line is running over its Intel competitors. I’m a huge AMD fan. But it’s a false comparison. Apple isn’t competing against self-built AMD systems; they’re competing against Falcon and AVA Direct.

Nvidia vs AMD. There are a small number of places where CUDA is needed. Everywhere else, AMD is simply the better choice, on paper and in the benchmarks.

The power supply. I see some say, "OMG, it’s less than the one my PC had," a PC built six years ago.

The case? Gorgeous. An update to the original: sci-fi, and practical. Nice. Love it.

The monitor? The price is on par with equivalent monitors/TVs. Compared to other high-quality monitors, it’s within $500-$1,000 of the competition.

About the only legitimate complaint I can come up with is a $1000 stand. But... that’s Apple. Time to start saving. Maybe I can get this one when the 2029 version knocks the prices down a bit. ;)
 
I never hear anyone say this outside of this forum.
It really depends on what you’re doing. CUDA is very specialised AND very closed. Sure it’s superior at what it does and I don’t deny that. For games and graphics/video work. Nvidia has some benefits.
But like the AMD processors in general vs the Intel counterparts... AMD has long been the better choice on raw crunching.
Assuming the same “pro” target of all previous desktop pro line macs... science, engineering, and development... AMD is the better choice.
If you need CUDA there’s nothing stopping you from popping a card into one of the available slots and disabling the ATIs.
Do I think there should be a choice to use Nvidia if you so desire? Sure. Unfortunately that’s not the way it works.
But again there’s nothing stopping you from adding your own.
Ultimately, re not seeing it before, there’s no real hard decision. Like the current cpu war, AMD and Nvidia trade close benchmarks on honest Apple to Apple orange to orange product comparisons. When it’s that close it comes down to special incentives. If you need Nvidia, buy one and pop it in. If you don’t actually need it there’s an equally capable card in the Pro. One that’s a step above the current top market product from AMD.
 
If you need CUDA there’s nothing stopping you from popping a card into one of the available slots and disabling the ATIs.

Nothing stopping us other than Apple, you mean? ;-). From what I've read, Apple is keeping nVidia support from the Mac even though nVidia wants it.

Assuming the same “pro” target of all previous desktop pro line macs... science, engineering, and development... AMD is the better choice.

I don't know what info is out there to back up this statement. Better drivers in their pro line, perhaps? I've heard that was a bragging point of AMD for a while, but I don't work at a huge company where someone might only work in a part of a pipeline (like CAD modeling full time).

CUDA was everywhere at this year's SIGGRAPH. Again.

It really depends on what you’re doing. For games and graphics/video work. Nvidia has some benefits.

Yeah, the benefit being they have the best performance! :-b

But like the AMD processors in general vs the Intel counterparts... AMD has long been the better choice on raw crunching.

I don't see that claim validated anywhere and it certainly hasn't been my experience, but I haven't been looking for that info. Perhaps I'm wrong, but it's been my understanding that the best performing consumer GPU one could buy has been nVidia for a while now.

If you don’t actually need it there’s an equally capable card in the Pro. One that’s a step above the current top market product from AMD.

The base 2019 Mac Pro will come with a 580 in it. I hope it outperforms the Radeon Pro 580 in my 2017 iMac for the price Apple is asking. :eek:

Like the current cpu war, AMD and Nvidia trade close benchmarks on honest Apple to Apple orange to orange product comparisons.

Nah, AMD's latest cards (not available on the Mac) trade close benchmarks with nVidia at mid-range graphics price points, but not much else. Does a $500 AMD card perform as well as a $500 nVidia card? Sure. Apples to apples.

But if I drop $1200 on a 2080ti, AMD has no product for comparison. It's apples to nothing.

CUDA is very specialised AND very closed. Sure it’s superior at what it does and I don’t deny that.

You forgot one thing: it's everywhere. So it's closed, specialized, superior and ubiquitous.
 
Nothing stopping us other than Apple, you mean? ;-). From what I've read, Apple is keeping nVidia support from the Mac even though nVidia wants it.
MacOS isn’t as closed off as so many people make it out to be. Apple keeps some things out of the App Store, but there’s nothing stopping Nvidia from releasing packages on their own website or including software on USB with their cards. From software to drivers to kernel extensions. Both my AV (Sophos) and my network firewall (Little Snitch) use kernel extensions, custom drivers, and direct system access. Neither came from the App Store. So I don’t buy that they cannot make it work. People hook Titan cards up to iMacs; there are many discussions on this at Reddit and on Steam. I myself have an internal Vega 64 installed externally via a dock adapter over Thunderbolt 3. Again, no help from Apple.
I repeat I’m all for choice. And I repeat there is nothing prohibiting Nvidia from supporting the Mac Pro if they so choose.
Apple made the choices they did. That’s generally the end of it. There’s plenty I dislike about the new system. But when you buy an Apple, you get what they offer, or you add it yourself. I’ve greatly expanded my iMac since I bought it, almost entirely via internal components connected externally: video, audio, and storage, to name the most obvious.
It’s not the easiest route but it IS doable and workable today. I see nothing to prohibit anything moving forward.
 
I've also got a Vega 64 in an eGPU, thanks to native support in OSX.

nVidia swears the problem is Apple, which frustrates me. You know they would sell a lot to the CG market. I'd rather have eGPUs (for compute) outside of the case for good cooling.

I think we want the same thing. But I'm frustrated that I can't throw a 2080ti in my eGPU in Mojave.

I originally got my iMac because of the promise of nVidia in eGPUs, which was stopped cold by Apple after 10.13. Now I'm left waiting for certain engines to port to Metal, if at all.
 
there’s nothing stopping Nvidia from releasing packages on their own website or including software on USB with their cards. From software to drivers to kernel extensions.

Negative Ghostrider, the pattern is full. I don't quite know what you mean by "nothing stopping Nvidia from releasing packages on their own website", but there are only a few cards compatible with Mojave, and they are old school:

https://www.mactrast.com/2018/09/ap...ac-users-of-mojave-boot-camp-incompatibility/

Compatibility stopped at High Sierra for the GTX series (not to mention the RTX...). The cards are useless with Mojave and I doubt it has anything to do with Nvidia. I still receive driver updates for High Sierra, so if they're still making drivers for that, don't you think they would for Mojave if they had a reason? Apple shut them down. Regardless of "this card being better than that card", people should be able to have a choice. I need CUDA, and with my 1080Ti, I'm stuck on High Sierra because of it.
 
Btw
Perhaps I'm wrong, but it's been my understanding that the best performing consumer GPU one could buy has been nVidia for a while now.
Possibly. I wasn’t actually looking at consumer-level cards. And I’m not sure the Titan or Vega cards should be lumped into that category, even on the high end. I was looking at the Pro Vega vs the Tesla, where the numbers by product are actually very close.

I understand now what you are both looking at from the links. I’ll make two comments.
First, both linked articles cover built-in support. So any company saying it doesn’t work because it’s not part of the package is crying wolf.
That said: Catalina, not Mojave, enforces driver signing as an absolute. You can (or could, prior to July) disable the requirement in Mojave. There are now three levels of software with Catalina. First, Apple-authorised, which is basically a way to put a software package including drivers (like Logitech does) in the App Store.
Then there are signed packages, where a certificate is issued by Apple for non-App-Store developers.
Finally, blocked apps from developers who are not certified, which “can not be installed”. At the moment those apps can still be installed as of Dev4 via a convoluted series of Terminal instructions, but I believe it will be locked down for the final release.
So theoretically Apple could have locked out Nvidia that way. I’m not sure I believe it but it’s possible.
I went through this when I upgraded to Catalina Dev1 with my Brother scanner. It’s how I found out about all the switch-flipping you can do to make unsigned software work. It reminds me of Vista and driver signing.

I don’t follow Nvidia much anymore beyond the benchmarks that show up in my email from time to time. I wasn’t aware of the legal situation. If Apple cut off their certificates then yes, it looks like an issue. Setting aside that any signed developer could write their own drivers, Nvidia still has one option, something that has bothered the open source community (personally I couldn’t care less myself) for a long time: Nvidia /COULD/ just open source their software. Then anyone with the free Xcode kit could compile their own drivers from source and even distribute them via, e.g., GitHub or SourceForge. I’m no fan of restrictive licensing, including many self-righteous copyleft licenses. But ultimately, if Apple does honestly lock out Nvidia completely, they still have that choice. And there’s no way Apple could cut that off in the real world.
 
The other thought is: are the higher prices related to Apple wanting to produce the Pro in the United States to avoid possible tariff action? (I know, I know, this veers somewhat closer to political discussion and I wanted to avoid that but I figured it needed to be said).

I think it's possible that they baked a little of that in ...

... but I personally doubt it. Apple got burned with the tcMP, so they're not likely to repeat that mistake.
They'll of course ask for tariff waivers and claim that they want to make it in the USA, but that's smoke.

Overall, I suspect that Apple's market analysis concluded that the target customer isn't that price sensitive (price inelasticity), so they're charging this much simply because they know that they can.
 

Looks like I’m back to waiting for Pro Vega 2 in an eGPU in my Mac, or looking at buying a pc to get access to RTX. Dunno how long I can wait before going nuts.
 