I put a lot of effort into comparing apples to apples. Every component listed is at least workstation grade, if not server grade, and they will all work together perfectly. I did make one mistake and listed the W-3225 CPU when it should have been the W-3223, so $2000 in components, not $2500. Please tell me: what am I missing? What am I getting for the extra $4000? A different case? Less expandability? The inability to use over half the graphics cards on the market (without support)? Or just a fruit logo and Tim Cook's blessing to use his company's (admittedly good) software?
Charging $4k for the case is courage, in the Apple style of "it's in our DNA".
 
I guess the wait is because Macs, and especially this Mac, are not the products that make the company its profits. So they take all the time they need to find the cheapest long-term deals on parts, and to find the windows when assembly lines would otherwise be idle.
Maybe they even have some internal guideline that every product has to make 100% profit, with the design cost spread out over the years they will sell the model.

I dunno. "Pro" and "workstation" class hardware seems to carry massive profits in general; there's a reason prices start to spike exponentially as you near the top of current performance. I cannot imagine that was the issue. The pro market is strategically important to hold on to, because so much of what gives a platform its value is created by professional content creators.

I'm hoping this trickles back to the MacBook Pros too. Right now, they're glorified MacBook Airs—and I say this as someone who is really happy with everything that is not the keyboard on my 2016 13" which replaced a 2012 MacBook Air.
 
Some developers ARE invited to Apple’s special events. But, by the nature of them being SPECIAL events, they invite SPECIAL developers.


And no, logically it doesn't mean that, and if you're not aware of what kind of logical arguments you just made, well, that's on you. :) Logically, it means that this release is so important for developers that certain important developers already have their hands on pre-release systems. And since you have all the developers coming together anyway, why not let them all know about it?

I’ve not seen any developers (not developer representatives or those showing off new features) reporting being at special events.

So, what’s a special developer?

Did they invite developers to the event where they admitted they screwed up?

There was no need for the personal attack. Your points, however salient they are to your opinion, do not need to be accompanied by such jabs.
 
Yes, but if and only if those Apple parts are really standard. If you have to buy expansion and upgrade options only from Apple, it could be a financial nightmare. Technology is becoming cheaper, not more expensive; Apple seems to want to swim against that current. I see no excuse for a $6000 computer with a 256GB SSD, a basically Intel architecture and an AMD video card. The premium is just too high.
The question here really is what Apple defines as "standard".
An anecdote: I bought a 2018 Mac mini and LG's 4K2K display. The display is a "standard" TB3 display, and it works with every PC and Mac on the planet that has TB3, but not with the 2018 mini. Hours of support calls, and all Apple can say is that "it doesn't work with the 2018 mini; it might start working at some point." Which is not nice, since I'm not upgrading to Catalina because I run a lot of 32-bit apps. Apple might update the 2018 mini's display drivers in Mojave, or not. I'll just have to wait and pray.

Another thing about the price: the most expensive part of these gigantic hardware companies is support, and Apple has already taken a loss on my 2018 mini because of the hours of support. I guess this is why Apple wants as many macOS users as possible to use AIOs. My problem was categorized in a niche called "external display problem," which pretty much tells the story. There was no knowledge of it in support, and they had to gather a snapshot of the whole system and look into it.

Maybe this is the reason they jumped out of Wi-Fi products: there are too many problems with thousands of different third-party devices, and it is simply too expensive to figure the problems out.
It just might also be the reason for the cost of the new MP, because as soon as users start to put in "standard" PCI cards, the problems will start to pile up.
 
That video is just plain wrong.
Most big budget movies are shot with Arri Alexas.
https://www.arri.com/resource/blob/...ri-formatsandresolutionsoverview-2-9-data.pdf
Can you or that YouTuber define what a 4K camera is?
Does it need a 35Mpx sensor (before debayer), or less?

Did you even watch the video? If you did, go back and rewatch it. He specifically and repeatedly states movies are generally shot on Arri Alexa cameras. He also states that the cameras in use are 2k cameras, or at the very best 3k cameras and it's likely to be that way for a long time, so my point still stands.
 
Did you even watch the video? If you did, go back and rewatch it. He specifically and repeatedly states movies are generally shot on Arri Alexa cameras. He also states that the cameras in use are 2k cameras, or at the very best 3k cameras and it's likely to be that way for a long time, so my point still stands.
Do you understand the difference in resolution before and after debayering? How many 2K cameras did you spot on that Arri list?
How many movies with a budget over $150M in 2018-19 were not finished in 4K?
 
Do you understand the difference in resolution before and after debayering? How many 2K cameras did you spot on that Arri list?
How many movies with a budget over $150M in 2018-19 were not finished in 4K?
The video explains that the intermediate processing is mostly done at 2K resolution, and that any 4K release is the result of upscaling. It does go on to say that 4K upscaling in post-production is much better than upscaling in a TV or Blu-ray player.
 
Don't be ridiculous. HP, Dell and probably countless other PC vendors offer dual-socket workstations that are twice as powerful (two Xeon processors, up to 4 GPUs, up to 10 HD/SSD drives, up to 3TB RAM).
Saying things like “twice as powerful” about a computer that’s not even on the market yet calls your objectivity and credibility into question.
 
What I am curious about, in light of the story that Apple is expanding in Seattle, presumably to grow their AI effort: will they use this machine? Perhaps they could contribute to AMD's ROCm library. Or will their own engineers have to use other hardware?
 
After seeing the processor pricing that's floating around, I still think the best "value" will be the 16 core machine - the jump to the higher core count "M" processors is significant.
 
Saying things like “twice as powerful” about a computer that’s not even on the market yet calls your objectivity and credibility into question.

You can cherry pick features to fit some predetermined narrative you have about the Mac Pro all you want, but it doesn’t change the fact that this machine is a monster, there’s nothing like it on the market, and well-configured models will deliver blazing, best in class speeds for music and film production pros that have never been seen before.

I'm confused now.

And yes, this machine is a monster, but in a completely different way. The base model isn't even powerful enough for high-end production (on the storage/GPU side), yet at the same time it's complete overkill for 2D design and the like.
 
I'm confused now.

And yes, this machine is a monster, but in a completely different way. The base model isn't even powerful enough for high-end production (on the storage/GPU side), yet at the same time it's complete overkill for 2D design and the like.

The idea is that you will upgrade the specs as appropriate for your needs.

For example, I imagine a musician would opt for more cores and ram, but would likely be fine sticking with the base GPU, based on the discussions I am seeing and from my understanding of how the software operates.

A 3d modeller might elect to max out the graphics, but keep the base 8-core CPU.

A video editor would upgrade the CPU, GPU and ram in lockstep.

None of them would buy the base Mac Pro, but they would each upgrade different parts. The base configuration is not really meant to sell; what Apple has done is provide minimum specs for workloads that each stress a different part of the computer.
 
I'm confused now.

And yes, this machine is a monster, but in a completely different way. The base model isn't even powerful enough for high-end production (on the storage/GPU side), yet at the same time it's complete overkill for 2D design and the like.
There's one base model no matter what the customer's use case may be. Some users will upgrade (or even max out) one or more of the CPU/GPU/RAM/SSD options. Some users won't upgrade any of them, at least at the time of purchase. Maxing out all four will be very pricey, but few users need 28 cores, 56 teraflops of GPU, 1.5 terabytes of RAM and a 4TB internal SSD. However, whether now or five to ten years from now, the expansion capabilities are valuable to many pros; this machine definitely puts the 2013 cylinder to shame.

btw I think you’d get quite a bit of pushback that the base model is “complete overkill” for any pro usage, unless you mean the slots/power supply/cooling capacity. To me, 8 cores, a 5 Teraflop GPU, 32GB RAM and 256GB SSD seem to be fine minimum base specs.
 
I am amazed at the LOW price of the offer. The closest I can get with a 24-core build-out from AVAD, the only configurator I know of for such server platforms, starts at $18,250. And that's missing the custom video and custom bus features!
People looking at the Mac Pro through the jaded lens of the Mac trashcan are deluded. I couldn't build the new MP for anywhere NEAR $6000. I think the price is a STEAL!
I can't begin to afford it, or even justify it! But Apple is offering an entry-level workstation at half the price!
Anyone who is looking at business, not game, systems will instantly recognise the value here!
 
I am amazed at the LOW price of the offer. The closest I can get with a 24-core build-out from AVAD, the only configurator I know of for such server platforms, starts at $18,250. And that's missing the custom video and custom bus features!
People looking at the Mac Pro through the jaded lens of the Mac trashcan are deluded. I couldn't build the new MP for anywhere NEAR $6000. I think the price is a STEAL!
I can't begin to afford it, or even justify it! But Apple is offering an entry-level workstation at half the price!
Anyone who is looking at business, not game, systems will instantly recognise the value here!

What are you talking about? Numerous people in this thread have shown that other OEMs like HP offer similar machines at similar prices to the MP. It isn't half the price of the competition.
 
What are you talking about? Numerous people in this thread have shown that other OEMs like HP offer similar machines at similar prices to the MP. It isn't half the price of the competition.
Everything posted so far is similar, but nothing matches. The very closest starts at $8000. Ignoring the proprietary options, the closest I can get is $18,250 with two CPU sockets, or $1120 with fewer memory sockets, respectively.
 
I'm confused now.

And yes, this machine is a monster, but in a completely different way. The base model isn't even powerful enough for high-end production (on the storage/GPU side), yet at the same time it's complete overkill for 2D design and the like.
iMac Pro is for 2D design.
Don't buy the base model Mac Pro. It's just marketing so they can advertise a lower starting price. Most serious creators will order custom configs right out of the gate.
 
I wonder why Apple so stubbornly refuses to build such a machine; it could be the no. 1 selling Mac.
My theory is still that Apple wants to avoid support for "not pro" users who run into problems with third-party monitors, storage, RAM and PCIe cards. That support might eat the bigger part of a $2.5k MP's profits.
The video explains that the intermediate processing is mostly done at 2K resolution, and that any 4K release is the result of upscaling. It does go on to say that 4K upscaling in post-production is much better than upscaling in a TV or Blu-ray player.
This might not be the optimal place to talk about this, but there is so much more to picture quality than these number-geek arguments about 2K vs. 4K.
Let's start with the camera sensor: what kind of sensor is needed for "real 4K"?
What resolution?
Should we define it as, e.g., "the ability to resolve 1.5k vertical line pairs"?
Can you tell which sensors can do that?
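To make that concrete, here is a tiny back-of-the-envelope sketch in Python. The 70-80% of luma resolution retained after debayering is a commonly quoted rule of thumb I'm assuming here, not a hard spec for any particular camera, so treat the output as ballpark only:

```python
# Rough sketch: how wide must a Bayer sensor be to deliver "real 4K"
# (a 4096 px wide frame) after debayering? The retention factors are
# assumed rule-of-thumb values, not measured figures for any camera.

TARGET_WIDTH_PX = 4096  # DCI 4K frame width

for retention in (0.7, 0.8):  # assumed luma resolution kept after debayer
    needed_px = TARGET_WIDTH_PX / retention
    print(f"at {retention:.0%} retention: ~{needed_px / 1000:.1f}K sensor width")

# Prints roughly 5.9K at 70% and 5.1K at 80% -- which is exactly why a
# "4K" Bayer camera does not automatically resolve 4K worth of detail.
```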

Some real world info about the issue:
https://library.creativecow.net/kaufman_debra/The-Girl-with-the-Dragon-Tattoo/

Can you explain why the most talented moviemakers use Alexas, even if they are not considered to be "real 4K"?
 
Jony Ive doesn't care about design anymore.
He cared enough to hire his mate Marc Newson, who almost certainly designed this. To my eye it has his signature all over it, and the timing between his hire and this release seems about right (based on development timelines). Beauty being in the eye of the beholder, taste will vary. For me...I find it to be a nice departure from the very established design language of the past ten years, as beautiful as it has been.
 
He cared enough to hire his mate Marc Newson, who almost certainly designed this. To my eye it has his signature all over it, and the timing between his hire and this release seems about right (based on development timelines). Beauty being in the eye of the beholder, taste will vary. For me...I find it to be a nice departure from the very established design language of the past ten years, as beautiful as it has been.

I don't know about you, but when I buy a tower I sling it under the desk and never look at it again until I want to upgrade something.
 
My theory is still that Apple wants to avoid support for "not pro" users who run into problems with third-party monitors, storage, RAM and PCIe cards. That support might eat the bigger part of a $2.5k MP's profits.
But does Apple truly provide "pro" level support like other vendors? I really don't know, but if the answer is "bring it to an Apple store", the answer is no.

This might not be the optimal place to talk about this, but there is so much more to picture quality than these number-geek arguments about 2K vs. 4K.
Let's start with the camera sensor: what kind of sensor is needed for "real 4K"?
What resolution?
Should we define it as, e.g., "the ability to resolve 1.5k vertical line pairs"?
Can you tell which sensors can do that?

Some real world info about the issue:
https://library.creativecow.net/kaufman_debra/The-Girl-with-the-Dragon-Tattoo/

Can you explain why the most talented moviemakers use Alexas, even if they are not considered to be "real 4K"?
I was going by what the video provided. Your being able to point to a movie that was processed at 4K doesn't prove that 4K has been overwhelmingly embraced by Hollywood. Even your asking about the cameras seems to support the argument that the production chain hasn't changed over to 4K yet. It will, or it will eventually move to an even higher resolution.
 
I was going by what the video provided. Your being able to point to a movie that was processed at 4K doesn't prove that 4K has been overwhelmingly embraced by Hollywood. Even your asking about the cameras seems to support the argument that the production chain hasn't changed over to 4K yet. It will, or it will eventually move to an even higher resolution.
Is this list unreliable?
https://referencehometheater.com/ultrahd-blu-ray-title-info/
Because it lists about 200 UHD titles that had a 4K DI.

I posted that link to the article about Light Iron because it tells a more detailed story than "2K vs. 4K," even if the article is 8 years old, because those aren't the only options.
It's a really sad thing that something posted at 3.9K gets listed as 2K. Working with frames that are 4096 pixels wide tells you nothing about how much angular or spatial resolution the frames really have.
There was about a decade when so few movie theaters had 4K projectors that using 4K in post to get something like 10% more real resolution was simply not worth it.

But what do people really see when watching a UHD disc?
The pixel dimension of the frame is 3840 px wide.
Then almost everyone watches it on a display that overscans about 5-10% of that away.
Do you know how much real resolution is wasted when a consumer-grade television's real-time conversion scales the remaining ~3456 px back up to the screen's 3840-wide picture?
Why does this happen? Because almost nobody knows what the 1:1 setting buried deep in the TV's display settings means. And if they find it and try it, they see all kinds of garbage when watching a broadcast signal, so they don't use it even if they can find it.

Let's say a movie is shot on a 5K sensor, which is debayered to a 4K picture and has 1500 cycles of resolution (with decent MTF) after the LPF.
Then you do the post in 4K and scale that to UHD, which drops the real resolution to 1200 cycles; overscanning in the TV then drops it to about 1000 cycles.
Another movie is 3K after debayer and has 1100 cycles. Scaling to UHD drops it to 900 cycles, and overscanning to 800 cycles.
The difference is not huge.
Then, if the 5K camera material is a bit soft, for any of a hundred reasons, including that the director, DP or editor likes it soft, and the 3K material is really sharp, the end picture the UHD watcher sees can be pretty much the same.
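Here is a minimal Python sketch of that chain. The retention factors are my rough assumptions, fitted to the cycle counts above rather than measured from any real pipeline:

```python
# Back-of-the-envelope model of the delivery chain described above.
# Retention factors are illustrative guesses chosen to roughly match
# the cycle counts in this post, not measurements.

UHD_SCALE_RETENTION = 0.80  # 4K DI resampled to UHD keeps ~80% of real resolution
OVERSCAN_RETENTION = 0.85   # TV overscan (5-10%) plus rescale keeps ~85%

def delivered_cycles(source_cycles: float) -> float:
    """Approximate resolution (in cycles) the viewer sees on a UHD TV."""
    return source_cycles * UHD_SCALE_RETENTION * OVERSCAN_RETENTION

for label, cycles in [("5K sensor, 4K DI", 1500), ("3K after debayer", 1100)]:
    print(f"{label}: {cycles} cycles -> ~{delivered_cycles(cycles):.0f} on screen")

# Prints ~1020 vs. ~748 cycles -- the gap the viewer actually sees is
# much smaller than the "5K vs. 3K" camera numbers would suggest.
```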

People making these movies are not stupid. They do not make decisions about numbers just for the numbers' sake. If they gain very little from a bigger number, they put their money where it counts. If those extra ~100 cycles of resolution on the UHD watcher's TV cost $50M, they can use that money somewhere else. Maybe shoot 20 more days. Or double the set-design budget. Or VFX. Or just marketing.

In short, the picture pair in the "Spatial resolution" section of that Wikipedia page tells you about everything.
 
You know very well that's just your apparently biased opinion. And you also know that the opposite is much more likely.

You would think that the segment not being served by Apple (i.e., those who want a headless Mac) would be the one with an incentive to inflate its numbers and make itself seem more numerous or more influential than it really is.

I have shown my numbers. Desktop Macs are a small fraction of Macs, the Mac Pro is a fraction of a fraction, and the niche caught in the middle isn’t much bigger either.

I don’t work for Apple, and I have nothing to gain or lose regardless of whether Apple releases a headless Mac or not. I am simply trying to look at matters as objectively as I can, and so I do mean it when I feel that in the greater scheme of things, the number of customers who do legitimately need such a product is probably too small (and unprofitable) for Apple to address at the moment. Especially when it is not performance which is their chief concern, but price.
 