Does anyone know anything about US patents? Why would this come out now?
  • Is there a certain amount of time before patents become public knowledge?
  • Has it been public since 2016 and someone just happened to discover it now?
  • Was it applied for in 2016 but only granted now?
  • Is it one of those controlled leaks to let people know there will be keyboard changes coming?

EDIT: Some googling around suggests that it just takes a while for patents to be published.
I feel like we will see this in the 2018 MacBooks as a "third-generation" butterfly mechanism.
 

Supply chain leaks are really the best source, such as when Ming-Chi Kuo or sometimes Mark Gurman gets a scoop. Kuo seems to get his leaks from the supply chain, whereas Gurman seems to get his information from Apple as controlled leaks.

The patents, however, are relatively meaningless. A patent just means they are playing around with that possibility, out of probably dozens, and one or none is likely to be selected any time soon.

I'd expect a marginal improvement to the keyboard -- or perhaps the rumored keyless keyboard some time next year or the year after.
 

If/when the keyless keyboard happens I think people will be nostalgic for the good old 2016 butterfly keyboard days.
 
These are my keyboard ratings, in my opinion:

2009 MacBook Pro = 2010 iMac > 2017 iMac > 2017 MacBook Pro > 2017 MacBook > 2016 MacBook

It's sad that the keyboard feel has, for the most part, gradually gotten worse over the years. I'm currently typing on a 2008 MacBook, and its keyboard is noticeably better than my 2017 MacBook's. However, the 2017 MacBook keyboard is at least OK; the 2015/2016 keyboard is significantly worse.

BTW, same goes for the trackpads:

2009 MacBook Pro > 2017 MacBook Pro > 2017 MacBook = 2016 MacBook
 
Based on the magnitude of leaks and the obvious hinting by certain people in the Twittersphere, I believe we will see an Intel announcement of the new H-series processors sooner rather than later.

EDIT: GDC 2018
 
I'd actually throw in the PowerBook G4 / aluminum MacBook Pro keyboard, the silver one. Initially I hated it, coming from a chiclet-key MacBook. But boy, was I wrong. I honestly think this was the best keyboard Apple ever had!
 
This announcement is already a little bit older, but I think it's still relevant: Intel Optane can be used as RAM, and it is supposed to launch in 2018. This could potentially sidestep the 16 GB LPDDR3 RAM limit. And since 3D XPoint memory is non-volatile, it should reduce standby battery usage to essentially zero. Also, at current Optane pricing you can get 118 GB for €200, which would allow for a far greater amount of RAM without making the machines more expensive.

It remains to be seen if Optane is fast enough to completely replace DRAM (it probably isn't), but it could improve performance when used in tandem with traditional RAM.

http://www.tomshardware.com/news/intel-optane-dimms-timing-2018,35928.html
 

It's still several orders of magnitude worse than RAM in latency and speed when reading/writing random locations. It's worthless as a replacement for RAM. The DIMMs are for databases and the like, to store index/cache data in low-latency storage that has higher capacity than RAM, speeding up certain operations.
 

I certainly wouldn't call it "worthless". If you do the math, the latency is just above 1 frame on a 240 Hz display. Given that Macs use 60 Hz displays, this means there's essentially no measurable latency on the end user side. There are obviously scenarios which would benefit from even lower latency, but as a way to increase the size of the RAM, maybe with some clever software on the back end, Optane sure is fast enough.
 
Your math is right, but your comprehension of computer architecture and machine-code execution is obviously wrong. Your metric of display refresh rate means essentially nothing. A program's working set can easily exceed the CPU cache many times over and thus must be pulled from RAM; do that enough with slow RAM and you will notice.
 

1/240 Hz ≈ 4 ms
DDR4 has a latency of ~14 ns, or 0.000014 ms

It's not even close.
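For concreteness, the comparison can be checked with a quick unit conversion (a rough sketch; the 14 ns figure is a typical DDR4 access latency used for illustration, not a measurement of any specific module):

```python
# Compare one 240 Hz frame time against a typical DDR4 access latency.
frame_time_s = 1 / 240      # one frame at 240 Hz, ~4.17 ms
ddr4_latency_s = 14e-9      # ~14 ns, a typical DDR4 figure (assumed)

print(f"frame time:   {frame_time_s * 1e3:.2f} ms")
print(f"DDR4 latency: {ddr4_latency_s * 1e9:.0f} ns")
print(f"ratio: ~{frame_time_s / ddr4_latency_s:,.0f}x")  # roughly 300,000x
```

So a single 240 Hz frame is on the order of 300,000 DDR4 accesses, which is why frame time is a poor yardstick for memory latency.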
 
I still would love to buy a quad-core 13" MBP in 2018 with 16 GB of LPDDR3 RAM plus an extra 16 GB of slower RAM for swap, and not use disk-based swap at all.
That way I could go well past 16 GB (up to 100%) without tasks that took one second before swapping suddenly taking ten seconds because of it (as happens currently with disk-based swap).
And if the extra RAM could be used not through swap but through smarter memory management that Apple adds to macOS, putting the busy things in the faster RAM and the rest in the slower RAM, that would be best of all.

But more than anything else (Apple, are you reading?), I would love to see a quad-core 13" as soon as possible, even with the current 16 GB limit. The current situation, where competing quad-core 13" laptops sell for half the price of a dual-core MBP, is crazy and hurts Apple's sales, and it may become a tsunami in April-May (no time to wait until WWDC...).
 

Running anything from "XPoint" RAM, as in executing code or using the data for calculations, would be like running it on a 286 CPU. The CPU would just be stalled into oblivion, spending most of its time waiting.

It would basically be an expensive, glorified scratch disk.
 
If Apple introduces new MBPs with a better keyboard this year, do you think they will replace current 2016/2017 keyboards with the new one?
If they're waiting for the 2018 models rather than just building the 2017s with whatever fix they've come up with, I'd hazard a guess that the fix requires a change that isn't backwards compatible?
 

That's not the point, though. DRAM is faster, yes, but I'm not talking about replacing LPDDR3 with Optane, I'm talking about supplementing it. 16 GB RAM + 32 GB 3D XPoint, with good memory management, would effectively be almost as good as 48 GB of DRAM, since timing-sensitive data can be stored in the actual DRAM, while the 3D XPoint memory stores things like Photoshop assets that are less time-critical but still need tons of memory.
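To illustrate what such memory management could look like, here is a toy two-tier sketch (entirely hypothetical; the class name and the promotion/demotion policy are made up for this example and bear no relation to how macOS actually manages memory): hot pages stay in a small fast tier, and the least recently used pages are demoted to a larger slow tier.

```python
from collections import OrderedDict

class TieredMemory:
    """Toy two-tier memory manager: hot pages live in a small fast tier
    (DRAM), cold pages get demoted to a larger slow tier (e.g. 3D XPoint).
    Purely illustrative -- names and policy are invented for this sketch."""

    def __init__(self, fast_capacity):
        self.fast_capacity = fast_capacity
        self.fast = OrderedDict()   # page id -> data, kept in LRU order
        self.slow = {}              # overflow tier

    def access(self, page, data=None):
        if page in self.fast:                 # hot hit: refresh LRU position
            self.fast.move_to_end(page)
        else:
            if page in self.slow:             # promote from the slow tier
                data = self.slow.pop(page)
            self.fast[page] = data
            if len(self.fast) > self.fast_capacity:
                cold, cold_data = self.fast.popitem(last=False)
                self.slow[cold] = cold_data   # demote least recently used
        return self.fast[page]

mem = TieredMemory(fast_capacity=2)
mem.access("a", "A"); mem.access("b", "B"); mem.access("c", "C")
# "a" was least recently used, so it now sits in the slow tier
print(sorted(mem.fast), sorted(mem.slow))  # ['b', 'c'] ['a']
```

The point of the sketch: applications never see which tier a page lives in, so no per-app support would be needed -- the cost of a slow-tier access is paid only on the first touch, when the page is promoted.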
 

What you are talking about is RAM tiering, which is done in large enterprise multi-CPU superclusters. For a consumer machine, it doesn't make much sense. To get the best out of it, client software would need to support such architectures explicitly, and consumer-grade software can't even get basic memory management right. It's also very expensive and not space-efficient. Current SSDs are more than fast enough as a backing store.

At some point, when persistent RAM is fast enough, we'll do away with the current tiering (RAM vs. disk) and just have a single storage device. That's how operating systems have conceptually worked for a while now anyway. But it will still take some time :)
 

It wouldn't be anywhere near as good as 48 GB of DRAM; you would still move the data from XPoint to real RAM to process it, since not doing so would be slow. You'd get the same effect just by storing it on the SSD, as the process isn't time-critical enough to make any noticeable difference for the end user.
 

I don't think that's the case. After all, Intel is pushing for 3D XPoint DIMMs this year, and I'm pretty sure they know a bit about their market. I also think you could implement this on an OS level in a way that would keep performance acceptable without having to hardcode support into any application.

Don't get me wrong, I do not expect Apple to implement such a solution anytime soon, but it would be one possible solution for adding more RAM without increasing the power consumption, especially since the standby power consumption of 3D XPoint is essentially zero.

I think you underestimate the speed of 3D XPoint storage. While the current M.2 based Optane storage devices feature a latency of approximately 4 μs compared to the effective 0.3 μs latency of DRAM, the NVMe protocol severely limits its capabilities here. I'm pretty confident there's a lot more performance you could get out of this technology when put into a DIMM form factor.
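Taking those two figures at face value (both are rough numbers from this discussion, not benchmarks), the gap works out to roughly an order of magnitude:

```python
optane_latency_s = 4e-6   # ~4 us, current M.2 Optane (assumed figure)
dram_latency_s = 0.3e-6   # ~0.3 us effective DRAM latency (assumed figure)

ratio = optane_latency_s / dram_latency_s
print(f"Optane is ~{ratio:.1f}x slower than DRAM here")  # ~13.3x
```

A 13x gap is painful for random access, but it is a very different regime from the roughly thousandfold gap between DRAM and NAND flash, which is the argument for Optane as a middle tier rather than a DRAM replacement.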
 
I don't think that's the case. After all, Intel is pushing for 3D XPoint DIMMs this year, and I'm pretty sure they know a bit about their market.

For enterprise market, yes.

I also think you could implement this on an OS level in a way that would keep performance acceptable without having to hardcode support into any application.

You could use it as dedicated swap, but I'm afraid it wouldn't have much utility. Current SSDs are already very fast, and if you are swapping a lot, the real problem is somewhere else (most likely badly written software). But who knows, maybe a faster cache between the SSD and the RAM does make sense.
 
If they're waiting for the 2018 models rather than just building the 2017s with whatever fix they've come up with, I'd hazard a guess that the fix requires a change that isn't backwards compatible?

I agree with you, but I think this may upset quite a lot of people who purchased 2016/2017 models and had keyboard issues.

I've heard they have a policy of giving you a new machine after you've had four or more repairs, so maybe it's conceivable that they would swap an older machine for a 2018?
 
It's quite possible they might quietly start swapping out 2016 or 2017 units for 2018 replacements if they have a conclusive fix for the issue. I don't think it would be announced; maybe hinted at if they mention a new/revised keyboard design. Obviously, confirming the issue and admitting culpability isn't in their interests, though I have to say that in the few instances I have had a problem, Apple has done right by me.
 