
Zellio

macrumors 65816
Original poster
Sure, I'd say it's important to have in servers, but considering how much market share Apple has in the server field, I'd say that a product like this is going to go to two types of people: A, business professionals working in media production of some sort, and B, rich spoiled hipsters. Neither exactly needs ECC RAM. Sure, it would be nice to have, but it's also like saying you need an $80,000 car.

Apple needs an HEDT option. Whether it's Intel's 9th gen or a future gen, or Threadripper, an HEDT option would be perfectly fine for most who'd use this machine, and it wouldn't be $6,000 at the entry point. And of course, that's also why it will never happen: today's Apple, which sells $1,000 monitor stands, basically gets a guaranteed $6,000 from anyone wanting upgradability.
 
Why stop at ECC RAM? The overrated/overpriced argument has been going on since the Xeons: they were essentially i7s, except four times the price and with dual-socket support, whereas the i7 supported only a single socket. Other than that, they were the same. Same socket, same core count, same cache, same frequency.

Then you've got the consumer vs. pro graphics card arguments: "they're essentially the same" when you look at the raw numbers of the FirePros vs. the consumer R9s that followed them, or the Titans vs. the GTX cards that followed. Why pay for the pro one if you'll get the same performance from a consumer card the following year?

But when you put all three elements together to save on things that aren't absolutely essential, or that don't have a performance difference worth the price, you've essentially built a gaming rig: a machine designed with no overhead in mind that will ultimately be a lot less stable when running round the clock.

You can't really have one without the other if you follow me; it's all part of the same ecosystem. And let's not forget that the consumer grade machines definitely have a lot of grunt, so it's not exactly like you have to buy the iMac/Mac Pro in order to get any work done. :)
 
Let's flip it around and say data corruption countermeasures should be a fundamental human right!

It's just historical and artificial market segmentation by Intel that has influenced opinions on this subject. ECC RAM is not some kind of space-age military-grade technology. It's totally boring and in a sane world would be in everything at this point, just as a matter of course.
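Real ECC DIMMs implement SECDED (single-error-correct, double-error-detect) in hardware, across 64-bit words with 8 check bits. Purely as an illustration of how boring the underlying math is, here's a toy Python sketch of mine of the same Hamming-plus-overall-parity idea, protecting a single byte; it's not anything a memory controller literally runs:

```python
# Toy SECDED: 8 data bits -> 13-bit codeword (4 Hamming parity bits at
# positions 1, 2, 4, 8 plus an overall parity bit at position 0).

def encode(data: int) -> list[int]:
    """Encode 8 data bits into a 13-bit SECDED codeword (list of 0/1)."""
    code = [0] * 13
    data_positions = [p for p in range(1, 13) if p & (p - 1) != 0]
    for i, p in enumerate(data_positions):      # data bits fill the
        code[p] = (data >> i) & 1               # non-power-of-two slots
    for p in (1, 2, 4, 8):                      # parity bit p covers every
        for q in range(1, 13):                  # position whose index has
            if q != p and q & p:                # bit p set
                code[p] ^= code[q]
    for q in range(1, 13):                      # overall parity bit enables
        code[0] ^= code[q]                      # double-error detection
    return code

def decode(code: list[int]) -> tuple[int, str]:
    """Return (data, status); silently corrects any single flipped bit."""
    syndrome = 0
    for q in range(1, 13):                      # XOR of set-bit positions
        if code[q]:                             # points at a single error
            syndrome ^= q
    overall = 0
    for q in range(13):
        overall ^= code[q]
    if syndrome and overall:                    # one flipped bit: repair it
        code[syndrome] ^= 1
        status = f"corrected bit {syndrome}"
    elif syndrome:                              # syndrome set, parity even:
        status = "uncorrectable double error"   # two bits flipped
    elif overall:                               # only the parity bit flipped
        status = "corrected parity bit"
    else:
        status = "clean"
    data_positions = [p for p in range(1, 13) if p & (p - 1) != 0]
    data = 0
    for i, p in enumerate(data_positions):
        data |= code[p] << i
    return data, status

word = encode(0b10110010)
word[6] ^= 1                                    # simulate a cosmic-ray bit flip
print(decode(word))                             # -> (178, 'corrected bit 6')
```

Flip any one of the 13 bits and decode() quietly repairs it; flip two and it at least knows the word is bad instead of handing back garbage. That's the whole trick ECC memory is selling.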
 

I agree; the problem here is with Intel more than anyone else. ECC requirements and the feature set Intel ties ECC to are just not the same thing. There are plenty of use cases where a flipped bit or two in a huge project costing millions is just not that important. Who cares if a pixel in one frame of a 4K blockbuster movie is a slightly different shade of cyan?

But what if a bit flips in your massively-overpowered-for-Quicken i5 machine? Or your undergrad research project, with a budget smaller than the optional wheels on the new Mac Pro, produces a difficult-to-spot error that suddenly makes salami look like a cure for leprosy? At the very least you'd look like a complete idiot, and you still wouldn't have needed more than a Dorito's worth of compute power.
 
The larger your memory configuration, the larger the probability of error. The more data you move through memory, the larger the probability of error.

These machines are professional workstations designed for very heavy-duty use. Yes, if a bit flips in an image in a video file, it's no big deal. But you cannot predetermine where and how things fail.

If a machine has a problem in the middle of a shoot, you lose a LOT of money. If a rendering chain goes down for some reason and has to be debugged and restarted, you lose a lot of money.
 
I have almost 3TB of non-ECC RAM in my current server cluster. Zero soft errors in almost three years of uptime.

I had two 16GB sticks go bad in the first six months, though. ECC wouldn't help with a hard lockup failure.

Bit flipping is far more rare than the studies (funded by RAM makers) indicate. If their numbers were consistently true, your phone would crash once a month.
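For what it's worth, both sides of this exchange are easy to put rough numbers on. Below is a back-of-envelope sketch assuming the roughly 25,000-70,000 FIT per Mbit range from one widely cited 2009 field study of Google's fleet (an assumed rate, used here purely for illustration; observed rates vary by orders of magnitude between systems, and errors cluster heavily on a small number of bad DIMMs, which is exactly why phones don't crash monthly):

```python
# Back-of-envelope expected memory error rates at different capacities,
# assuming an assumed field-study rate of 25,000-70,000 FIT per Mbit
# (one FIT = one failure per 10^9 device-hours).

HOURS_PER_YEAR = 24 * 365

def errors_per_year(gigabytes: float, fit_per_mbit: float) -> float:
    megabits = gigabytes * 1024 * 8              # GB -> Mbit
    return megabits * fit_per_mbit * HOURS_PER_YEAR / 1e9

for label, gb in [("phone, 6 GB", 6),
                  ("desktop, 64 GB", 64),
                  ("server cluster, 3 TB", 3072)]:
    low = errors_per_year(gb, 25_000)
    high = errors_per_year(gb, 70_000)
    print(f"{label:>22}: {low:>12,.0f} - {high:>12,.0f} expected errors/year")
```

The expected count scales linearly with capacity, which is the earlier poster's point about large configurations; taken literally for a phone, the study rates are absurd, which is this poster's point about the studies.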
 
I could write paragraphs on why you are wrong, link you to studies, and drag up all the old posts on here refuting this same argument over and over. Also, you get ECC as much for the Registered DIMMs it comes on as for the error correction itself, and registered memory is the far more important thing when running large amounts of RAM. Forget all that, though; let's put it in financial perspective:

Xeon W-3223 - 8-cores at 3.5/4GHz - $748
i9-9900K - 8-cores at 3.8/4.4GHz - $599

Cheap 16GB DDR4-2666 DIMMs run about $4.25 per GB; ECC runs about $6.

So with 64GB we are talking $870 vs $1130.
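The arithmetic behind those totals, in a form anyone can re-run with their own prices (the figures are the ones quoted above):

```python
# Reproducing the cost comparison above using the poster's quoted prices.
GB = 64
configs = {
    "Xeon W-3223 + 64GB ECC":   (748, 6.00),   # CPU price, $ per GB of RAM
    "i9-9900K + 64GB non-ECC":  (599, 4.25),
}
for name, (cpu_price, ram_per_gb) in configs.items():
    print(f"{name}: ${cpu_price + GB * ram_per_gb:,.0f}")
# Xeon W-3223 + 64GB ECC: $1,132
# i9-9900K + 64GB non-ECC: $871
```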

Xeons and ECC are not why the Mac Pro is expensive and never were.

Yes, consumer processors are great these days, but they cap out below the Xeons, so there's no point putting them in workstation lines like this from the top vendors.
 