Nothing to do with "bit errors" (Mass storage devices/filesystems/drive controllers have been using checksums/parity etc. since long before journaling filesystems) and everything to do with mass storage being non-volatile and - unlike RAM - expected to keep its data structures intact in the face of software crashes, disconnections, power-failures etc. that might happen in the middle of a complex update.
Since few home computers are attached to a UPS, the ability to recover from a power failure is a necessity.
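
To make that concrete, here's a toy sketch of the journaling (write-ahead logging) idea in Python - not any real filesystem's code, and the file names and helpers are made up - showing why the journal has to hit stable storage before the live data is touched:

```python
# Toy sketch of write-ahead journaling: log the intended change (including the
# new data) and fsync it BEFORE touching the live file, so a crash or power cut
# mid-update leaves either the old state or a journal that can be replayed,
# never a half-written mess. JOURNAL and the helpers are hypothetical.
import os

JOURNAL = "update.journal"

def journaled_write(data_path: str, new_contents: bytes) -> None:
    # 1. Record the pending update and force it to stable storage first.
    with open(JOURNAL, "wb") as j:
        j.write(data_path.encode() + b"\n" + new_contents)
        j.flush()
        os.fsync(j.fileno())
    # 2. Only now modify the real data.
    with open(data_path, "wb") as f:
        f.write(new_contents)
        f.flush()
        os.fsync(f.fileno())
    # 3. Mark the transaction complete.
    os.remove(JOURNAL)

def recover() -> None:
    # At startup, a leftover journal means the last update may not have
    # finished, so replay it instead of trusting whatever is on disk.
    if os.path.exists(JOURNAL):
        target, _, contents = open(JOURNAL, "rb").read().partition(b"\n")
        with open(target.decode(), "wb") as f:
            f.write(contents)
            f.flush()
            os.fsync(f.fileno())
        os.remove(JOURNAL)
```

Lose power between steps 1 and 3 and the journal still describes the whole update, so recovery can finish it rather than leaving the on-disk structures half-modified - which is exactly why this matters for disks in a way it doesn't for RAM.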

I strongly suspect that reduction in DRAM cell size will make memory bit flips even more common as there isn't much that can be done about muon background. As such, some form of ECC will be required.
 
...because nobody is actually coming up with any solid evidence to show that this is a problem that exists. If you want to claim that all consumer devices need ECC to solve this "risk" then it's your burden of proof to show that the risk exists.
The funny thing is, this should be fairly easy to prove. If ECC provides the "early warning" that Basic75 says it does, it would have to show up as a log entry either in the Windows Event Log (or its Mac or Linux equivalent) or in the system's EFI firmware or BIOS. It should be easy for a "home" workstation-class system user to provide actual numbers showing how often ECC-detectable corruption events occur.
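
For what it's worth, on Linux this check is only a few lines: the kernel's EDAC subsystem exposes per-memory-controller counters of corrected and uncorrected errors under sysfs. A minimal sketch, assuming the EDAC driver for your memory controller is loaded (the ce_count/ue_count files are the standard sysfs counters):

```python
# Minimal sketch (Linux only): read the kernel's EDAC counters to see how many
# corrected (CE) and uncorrected (UE) memory errors ECC has actually reported.
# On systems without ECC, or without the EDAC driver loaded, the directory
# simply won't exist.
from pathlib import Path

EDAC_ROOT = Path("/sys/devices/system/edac/mc")

def read_count(path: Path) -> int:
    try:
        return int(path.read_text().strip())
    except (FileNotFoundError, ValueError):
        return 0

def main() -> None:
    controllers = sorted(EDAC_ROOT.glob("mc[0-9]*"))
    if not controllers:
        print("No EDAC memory controllers found (no ECC, or driver not loaded).")
        return
    for mc in controllers:
        ce = read_count(mc / "ce_count")  # corrected errors (the "early warning")
        ue = read_count(mc / "ue_count")  # uncorrected errors
        print(f"{mc.name}: corrected={ce} uncorrected={ue}")

if __name__ == "__main__":
    main()
```

Run that on an ECC machine and you'd expect mostly zeros; a steadily climbing ce_count would be exactly the kind of early warning being claimed.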
 
Again, even for them it isn't worth it, just like for Apple. What, just because Apple is the only one developing and using their own chips?

Actually, the HP, Lenovo, and Dell you mentioned ALSO make servers along with workstations. Do you see the point now? Workstations and servers benefit from each other's development after all.
YES, it is just because they make their own SoC.

HP, Lenovo, etc. don't make their own CPUs; instead they buy them in from Intel or AMD, where the R&D for developing that CPU is spread across a massively greater number of CPUs. That's why it's possible for them to keep updating their workstations, but it isn't worth it for Apple. The cost of updating to the next Xeon is negligible compared to Apple developing an SoC that's used for a niche product bought by a small subset of Apple's customers.

So much of that Xeon development cost is also covered on the server side, a market Apple don't sell into either.

Apple, on the other hand, have to spread the R&D for a new SoC across a much smaller number of chips, so updating the Studio or Pro with a new Extreme SoC is far more expensive and just not cost-effective to do every year.

You actually HAD to ask that!
 
We are talking about Apple, not OpenAI. Apple is using their own closed ecosystem with their own hardware and software. That's a huge difference. Yeah, they could use other services, but eventually they would develop and use their own. Besides, tell that to Apple Intelligence.
Apple also predominantly use Linux servers on non-Apple hardware.
They may sell an ecosystem to their customer base, but that doesn't mean everything they use internally is Apple-badged.

They also use Azure and AWS despite having Apple Cloud Services and they aren’t going to completely eliminate the use of other cloud services.

Apple have quite often dropped products; they used to make their own:

wireless boxes
servers
SAN
Time Capsule
laser printers
inkjet printers.

They even stopped doing monitors for a while.

Apple clearly do not feel the need to do everything internally, so why do they need to do the AI training on their own hardware, and specifically why would they need a Mac Pro for that?

Plenty of other people are developing AI without developing and releasing a workstation, whether by simply buying up Nvidia hardware or, like Google, developing their own. Google's hardware is nothing like a Mac Pro because, as I stated several posts ago, you would not start with a Mac Pro as your base for AI hardware, so having the in-house skills for a workstation-like machine does not mean having the skills for AI hardware.

Why is it that Apple are unable to do any of this and have to develop a workstation to be able to do AI?

Your opening post was specifically that Apple need the Mac Pro to do this, but so far nothing you've posted shows why Apple need the Mac Pro for it.

Yes, they have apparently been putting M2 Ultra Studios into racks for some AI work; well, if they can use the Ultra SoC then they can simply use the Studio, can they not? At which point they don't need the Pro, do they?
 
The M2 Ultra is too slow; Apple uses it only because they have no choice but to use their own chips rather than Nvidia's. They are not even close to TPU servers. And I already told you, workstations are related to servers because of the parts they share. Why do you keep justifying poor hardware? If you think the Mac Studio or Ultra chips are enough, then you are short-sighted and will only kill Apple's future.

Clearly, Apple is too limited with hardware resources.
 