So: a Skylake CPU with maybe an hour more battery life, an improved iGPU (perhaps the top model will keep the same AMD dGPU as now?), and let's not forget Thunderbolt 3, which should matter a lot more in everyday use than Thunderbolt 2 ever did.
 
So it seems people here are waiting for the performance gains of Skylake; I myself am just in it for the battery :)


I could not wait and had to pull the trigger on a new rMBP. I've been very happy with my 15" Mid 2012 rMBP (2.3 GHz i7, 256 GB, 8 GB RAM); it's still a great machine, and I'll use it as a backup and leave it at home as a desktop replacement for my purposes.

The new one to arrive today:

2.5GHz Processor
512 GB Storage

  • 2.5GHz quad-core Intel Core i7
  • Turbo Boost up to 3.7GHz
  • 16GB 1600MHz memory
  • 512GB PCIe-based flash storage
  • Intel Iris Pro Graphics
  • AMD Radeon R9 M370X with 2GB GDDR5 memory
  • Built-in battery (9 hours)
  • Force Touch trackpad
 
So here’s a big reason why I hope Apple begins using the mobile Xeons. If you’re someone like me who edits video almost every day and works with a lot of video footage (or any large amount of data, for that matter), you see the video app slow down over time due to memory leaks - memory that isn’t correctly released back to the system. The computer begins to chug and things become inefficient. You may have to close the app (which frees up all the memory it was using, leaked or not) and then reopen it to get some of the lost performance back. I do all my video editing on a laptop now, and I frequently need to close the app in order to regain some performance.

Now consider one of the big benefits of using ECC memory in Xeon configs - Error Correcting Code memory can be used to detect both memory leaks and corruption, which significantly reduces the amount of memory that leaks over time. That’s huge… and really important in terms of performance for anyone working with a lot of data. When you realize that more and more work that was previously done on desktop computers is now moving to laptops, it just makes sense to start using ECC memory in those machines to keep apps and data-intensive tasks running efficiently.
 
Good to know! Some ppl here say that Xeon ain't happening in the rMBP, why not?
 

ECC does nothing for memory leaks; a memory leak is a coding error. There is one tool that exploits ECC to detect memory leaks, but that is something that has to be compiled into the app at a performance cost. Memory corruption is likely not going to affect you. According to Google, 92% of their DRAM modules never experience an error in a given year, and only about 0.22% of DIMMs suffer an uncorrectable error per year (Google's study, "DRAM Errors in the Wild: A Large-Scale Field Study"). Unless you care about data loss, ECC does nothing for you. In the case of a multi-bit error interrupt from the memory controller, your Mac is going to "blue screen" anyway.
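To make the distinction concrete, here's a deliberately trivial C sketch (my own toy example, not code from any real editing app) of what a leak actually is:

#include <stdlib.h>
#include <string.h>

/* Toy example: imagine this runs once for every frame you scrub past. */
void process_frame(const char *frame, size_t len)
{
    char *scratch = malloc(len);       /* temporary working copy of the frame */
    if (scratch == NULL)
        return;

    memcpy(scratch, frame, len);       /* ...do some work on the copy...      */

    /* BUG: free(scratch) is missing, so every call leaks len bytes.
       None of those bytes are corrupted - ECC has nothing to detect or fix;
       the process just holds more and more memory the longer it runs. */
}

Run that a few million times during an editing session and the app's footprint balloons even though every stored value is still correct, which is exactly why quitting and relaunching "fixes" it.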
 
According to Google, 92% of their DRAM modules never experience an error in a given year. Unless you care about data loss, ECC does nothing for you.

Flipping a random bit in system memory can potentially lead to an almost endless list of symptoms beyond data loss. And as system memory grows in size, the need for ECC should be greater, no?
 
Flipping a random bit in system memory can potentially lead to an almost endless list of symptoms beyond data loss. And as system memory grows in size, the need for ECC should be greater, no?

If the chance of a one-bit error is 0.22% per year for a server running 24/7, that probably translates into your laptop having a chance of it happening maybe once in its lifetime, and that's before considering the chance of that bit being somewhere it actually hurts...
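Rough back-of-the-envelope with my own assumptions (the same per-DIMM rate as Google's servers, two DIMMs' worth of RAM, powered on about 8 hours a day, a five-year service life):

0.22% per year × 2 DIMMs × (8/24 duty cycle) × 5 years ≈ 0.7%

So well under a 1-in-100 chance over the entire life of the machine.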
 
If the chance of a one-bit error is 0.22% per year for a server running 24/7, that probably translates into your laptop having a chance of it happening maybe once in its lifetime, and that's before considering the chance of that bit being somewhere it actually hurts...

You moved the goalposts from "it can only lead to data loss" to "it's very rare". The 0.22% is also questionable, as there are other studies that claim it's more common. In any case, more system memory would increase the risk, no?

BTW, the study you cite mentions 0.22% per DIMM and 1.3% per machine annually. From the conclusion:

We found the incidence of memory errors and the range of error rates across different DIMMs to be much higher than previously reported. About a third of machines and over 8% of DIMMs in our fleet saw at least one correctable error per year. Our per-DIMM rates of correctable errors translate to an average of 25,000–75,000 FIT (failures in time per billion hours of operation) per Mbit and a median FIT range of 778 – 25,000 per Mbit (median for DIMMs with errors), while previous studies report 200-5,000 FIT per Mbit. The number of correctable errors per DIMM is highly variable, with some DIMMs experiencing a huge number of errors, compared to others. The annual incidence of uncorrectable errors was 1.3% per machine and 0.22% per DIMM. The conclusion we draw is that error correcting codes are crucial for reducing the large number of memory errors to a manageable number of uncorrectable errors.
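For scale, my own arithmetic on those FIT figures (assuming an 8 GB DIMM, i.e. 65,536 Mbit, and taking the low end of the median range): FIT means failures per billion hours of operation, so

778 FIT/Mbit × 65,536 Mbit ≈ 5.1 × 10^7 errors per 10^9 hours ≈ 0.05 errors per hour

i.e. roughly one correctable error per day on a DIMM that sees errors at all - which, per the study, is a bit over 8% of them in any given year.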
 
You moved the goalposts from "it can only lead to data loss" to "it's very rare". The 0.22% is also questionable, as there are other studies that claim it's more common. In any case, more system memory would increase the risk, no?

How did I move the goalposts? It is only data corruption, and it is very rare.

My numbers are from http://static.googleusercontent.com/media/research.google.com/en//pubs/archive/35162.pdf - show me yours.

Sure, more memory increases the risk if it is the same type; if it is newer technology it could have a lower or higher risk. But if it is a once-in-a-lifetime event for a laptop, so what?
 
How did I move the goalposts? It is only data corruption, and it is very rare.

You said that unless you worry about data loss, ECC is useless. I then replied that a flipped bit can potentially lead to an endless list of symptoms beyond data loss, and that more memory should, afaik, increase the likelihood of an error. You ignored this and just stated that it's very rare. But never mind, I'm tired of this.
 
Is it good for web design?
 
ECC does nothing for memory leaks; a memory leak is a coding error. There is one tool that exploits ECC to detect memory leaks, but that is something that has to be compiled into the app at a performance cost. Memory corruption is likely not going to affect you. According to Google, 92% of their DRAM modules never experience an error in a given year, and only about 0.22% of DIMMs suffer an uncorrectable error per year (Google's study, "DRAM Errors in the Wild: A Large-Scale Field Study"). Unless you care about data loss, ECC does nothing for you. In the case of a multi-bit error interrupt from the memory controller, your Mac is going to "blue screen" anyway.

Three things Ries…

Firstly, memory corruption isn’t the day-to-day culprit that’s slowing down pro apps, which often do need to be closed and then reopened in order to regain performance… memory leaks are, so talking about the data corruption side of things is a moot point in this regard.

Secondly, coding errors are what cause memory leaks. Humans, as hard as they try, aren’t perfect at coding, and if you used a pro video editing app on a daily basis, you’d recognize for yourself that leaks are occurring: the app becomes progressively slower as you use it, uses far more system resources than it should, or creates orphaned memory that goes off the rails when there's conflicting code. There are days when I close the app several times just to regain lost performance.

And thirdly, ECC memory has the potential to do a lot to address memory leaks - it’s far more accurate and can help developers ensure more wasted memory is returned to the user.

Professionals won't mind paying an extra $50, $100, $200 or more if it saves them a lot of time and money in the end. I would gladly pay extra for a Xeon / ECC config as a BTO option.
 
Good to know! Some ppl here say that Xeon ain't happening in the rMBP, why not?


The MBP is mainly for casual users. Probably only 5% of the user base are dedicated pro users who might actually benefit from Xeon.

Xeon is also a lot more expensive (as is ECC RAM), and it would require a custom motherboard, which adds manufacturing cost.

So neither offering Xeon across the whole MBP line nor making a special Xeon version really makes sense, imo.
 
Three things Ries…

Professionals won't mind paying an extra $50, $100, $200 or more if it saves them a lot of time and money in the end. I would gladly pay extra for a Xeon / ECC config as a BTO option.

Problem is, this is Apple, not Dell; those options are likely to add considerably more than that to the price tag.

This is a company that charges $200 for a 0.3GHz bump in processor speed and $500 for an upgrade from a 512GB to a 1TB SSD.
 
Problem is, this is Apple, not Dell; those options are likely to add considerably more than that to the price tag.

This is a company that charges $200 for a 0.3GHz bump in processor speed and $500 for an upgrade from a 512GB to a 1TB SSD.

Yeah, but Lenovo's new P50 & P70 ( http://www.anandtech.com/show/9503/...ile-workstations-with-first-mobile-xeon-chips ) are offering those Xeons and huge amounts of ECC memory at pretty respectable prices, so Apple can't price themselves too far off in left field.

These chips are yet to be released, so no one knows for certain what they'll be priced at. Though if history is any indication, technology that was once at the very pinnacle of performance, like in racing for example, often trickles down to consumers over time.
 
Yeah, but Lenovo's new P50 & P70 ( http://www.anandtech.com/show/9503/...ile-workstations-with-first-mobile-xeon-chips ) are offering those Xeons and huge amounts of ECC memory at pretty respectable prices, so Apple can't price themselves too far off in left field.

These chips are yet to be released, so no one knows for certain what they'll be priced at. Though if history is any indication, technology that was once at the very pinnacle of performance, like in racing for example, often trickles down to consumers over time.
It's not impossible but it would be very unlike Apple to chase any other competitor on price.

If they believe the market volume is there they'll do it, but it won't be cheap.
 
It's not impossible but it would be very unlike Apple to chase any other competitor on price.

If they believe the market volume is there they'll do it, but it won't be cheap.

Even if Apple charged $500-$1000 more than Lenovo for those same specs, they wouldn't be that far off from what they're currently charging for a top-spec BTO rMBP... so I wouldn't consider holding steady on price as "chasing anyone". However, consumers should rightly expect more performance with each release if the price remains constant over time. All I'm saying is, I don't think these mobile Xeons are going to be as astronomically priced as everyone thinks they will be.
 
And thirdly, ECC memory has the potential to do a lot to address memory leaks - it’s far more accurate and can help developers ensure more wasted memory is returned to the user.

A memory leak happens when allocated memory is never freed; ECC deals with memory corruption, i.e. when a value at a specific memory address changes unexpectedly.

Professionals won't mind paying an extra $50, $100, $200 or more if it saves them a lot of time and money in the end. I would gladly pay extra for a Xeon / ECC config as a BTO option.

IMO, it's Intel that should add ECC as an option for all their CPUs. With Skylake it's now possible to have 32GB in a laptop, so ECC would be a nice option.
 
A memory leak happens when allocated memory is never freed; ECC deals with memory corruption, i.e. when a value at a specific memory address changes unexpectedly.



IMO, it's Intel that should add ECC as an option for all their CPUs. With Skylake it's now possible to have 32GB in a laptop, so ECC would be a nice option.

Google "Exploiting ECC-Memory for Detecting Memory Leaks and Memory Corruption" or "Dynamically Validating Static Memory Leak Warnings". The potential is most certainly there... and El Capitan aims to make things more efficient.
 
Google "Exploiting ECC-Memory for Detecting Memory Leaks and Memory Corruption" or "Dynamically Validating Static Memory Leak Warnings". The potential is most certainly there... and El Capitan aims to make things more efficient.

A potential may be there for a novel use of ECC memory proposed in a paper, but that doesn't mean that's how it's actually used in the real world. And even though the overhead seems to be lower than software tracing of memory leaks, the abstract still mentions a figure of up to 14% overhead.
 