
studio347
macrumors newbie, Original poster
Nov 16, 2010
When do you think the MacBook Pro will be equipped with 32 GB of RAM?
Just curious... are we not expecting it anytime soon?
 
Once 16 GB 204-pin DDR3 SO-DIMM RAM modules arrive on the market, Apple will probably offer 32 GB of RAM as an option some months later.
 
I expect 32GB of RAM to become available when it can be implemented within the same physical and power envelope as 16GB.
 
Why do they need to wait for modules? The RAM is soldered onto the board.

Barney

Those SO-DIMM modules use the same chips Apple does, so Apple still needs to wait for higher-capacity chips, unless it simply makes the on-board RAM footprint a bit larger and doubles the number of existing chips used.
 
It would cannibalize Mac desktops; 32 GB enters workstation territory. High-performance computing doesn't mean small input data processed fast (i.e. little RAM, a fast CPU, lots of I/O). It means big input data processed fast (i.e. tons of RAM, a fast CPU, few I/O operations).
 
Yes, 32 GB will come IMO, and 8 will be the new 4. That doesn't mean 99% of people will suddenly need 32 GB, though some will swear by it. In fact, 99% of people really only need between 4 and 8 GB.
 
Yes, 32 GB will come IMO, and 8 will be the new 4. That doesn't mean 99% of people will suddenly need 32 GB, though some will swear by it. In fact, 99% of people really only need between 4 and 8 GB.

Right now. But as the web gets more and more interactive and more and more graphics-heavy...
 
I would rather have 8 or 16 GB of DDR4 RAM than 32 GB of DDR3 RAM.

Hopefully next year, and most likely not this year.
 
With DDR4. It will not happen with DDR3. There is a manufacturer of high-density DDR3 modules that could enable this, but they are insanely expensive and do not work with Intel CPUs.
 
I would like to think it's a ways off, but given how quickly people are embracing 16 GB, I think the market could support 32 GB. It's not a question of need so much as of making a profit.
 
Would like to see this myself, but I fear that if/when it does come to MBPs, it will be prohibitively expensive.

Might see it with a Broadwell MBP
 
More RAM does not improve performance; it merely allows you to open more apps/render more.

I honestly can't see why you'd need more than 16.
 
To be honest, I think Apple's next step will be to increase the clock speed of the RAM modules from the current 1600 MHz to 1866 MHz rather than adding "more RAM".
 
Do you need 32 GB now? OS X does have excellent memory management.

Have run up against memory as a limit on my current 16 GB machine once or twice. It's not about OS X. Why do folks always assume that all people do on their MBP is open lots of Word and Firefox windows? I work with large statistical models with several thousand variables; in almost all statistical packages the entire model has to be loaded into memory at once. This means RAM becomes a hard limit on what you can do. 16 GB has only been a problem once or twice, and I've avoided it by reducing the parameters. But I have an ever-growing list of stuff to add to the models, so I am going to run into 16 GB as a limit in the next 12-18 months. So do I need 32 GB now? No. Soon? Yes.
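
(To put rough numbers on that, a back-of-the-envelope sketch in Python; the matrix dimensions below are hypothetical, just to show how quickly a dense model can approach 16 GB.)

```python
# Hypothetical back-of-the-envelope sketch: RAM needed just to hold a
# dense float64 data matrix (sizes are made up for illustration).
n_rows, n_cols = 400_000, 5_000   # observations x variables
bytes_per_value = 8               # float64

gib = n_rows * n_cols * bytes_per_value / 2**30
print(f"{gib:.1f} GiB")           # ~14.9 GiB before the solver copies anything
```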
 
Have run up against memory as a limit on my current 16 GB machine once or twice. It's not about OS X. Why do folks always assume that all people do on their MBP is open lots of Word and Firefox windows? I work with large statistical models with several thousand variables; in almost all statistical packages the entire model has to be loaded into memory at once. This means RAM becomes a hard limit on what you can do. 16 GB has only been a problem once or twice, and I've avoided it by reducing the parameters. But I have an ever-growing list of stuff to add to the models, so I am going to run into 16 GB as a limit in the next 12-18 months. So do I need 32 GB now? No. Soon? Yes.

I used to have this kind of problem, but with increasing network availability, I've fallen back on the old client-server model: I built a cheap Windoze box with 32 GB of RAM that runs Linux in VirtualBox for all my number crunching/compiling, and I access it from anywhere over VPN from my MBP.

Advantages: it's cheaper/faster/quieter/cooler overall, I can leave the server running in the background all the time, no heat issues with the MBP, etc...

Disadvantages: I need net access from the MBP.

NOTE: I had been waiting to upgrade to a high-end rMBP, but now, with all the number crunching done elsewhere, I am waiting for an rMBA instead: I don't need the computing power, just a good screen with good battery life.
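
(For the curious, a minimal sketch of dispatching a job to such a box, assuming Python with the paramiko SSH library; the host name, user name, and command are all made up.)

```python
import paramiko

# Minimal sketch, assuming SSH access over the VPN; "crunchbox.home",
# the user name, and the build command are hypothetical.
client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect("crunchbox.home", username="me")

# Fire off a long-running job and detach, so the MBP can disconnect.
client.exec_command("nohup make -j8 > build.log 2>&1 &")
client.close()
```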
 
I work with large statistical models with several thousand variables; in almost all statistical packages the entire model has to be loaded into memory at once.

But this is not a problem with the rMBP; it's a problem of bad programming using naive algorithms. Cache-aware code, with proper use of memory mapping, should be able to dramatically reduce the required RAM. If you are using R, there are some packages that can help. So far, I haven't encountered any problems running generalized mixed models on data sets with over 100,000 observations across 5 variables, but it depends of course on what kind of data and algorithms you are using.
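
(As a sketch of the memory-mapping idea in Python with numpy, rather than any specific R package; the file name and dimensions are hypothetical.)

```python
import numpy as np

# Sketch of out-of-core processing via memory mapping: the OS pages data
# in on demand, so only the chunk being touched occupies RAM.
n_rows, n_cols = 400_000, 5_000
data = np.memmap("design.dat", dtype=np.float64, mode="w+",
                 shape=(n_rows, n_cols))

# Stream over the matrix in ~40 MB chunks instead of loading ~16 GB.
chunk = 1_000
col_sums = np.zeros(n_cols)
for start in range(0, n_rows, chunk):
    col_sums += data[start:start + chunk].sum(axis=0)
```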
 
I used to have this kind of problem, but with increasing network availability, I've fallen back on the old client-server model: I built a cheap Windoze box with 32 GB of RAM that runs Linux in VirtualBox for all my number crunching/compiling, and I access it from anywhere over VPN from my MBP.

Advantages: it's cheaper/faster/quieter/cooler overall, I can leave the server running in the background all the time, no heat issues with the MBP, etc...

Disadvantages: I need net access from the MBP.

NOTE: I had been waiting to upgrade to a high-end rMBP, but now, with all the number crunching done elsewhere, I am waiting for an rMBA instead: I don't need the computing power, just a good screen with good battery life.

Yeah, I thought about that, and it appeals in some ways, but it isn't really feasible since I move around a lot and don't always have a connection, and there would also be no one at home to reboot the box if it crashed.

But this is not a problem with the rMBP; it's a problem of bad programming using naive algorithms. Cache-aware code, with proper use of memory mapping, should be able to dramatically reduce the required RAM. If you are using R, there are some packages that can help. So far, I haven't encountered any problems running generalized mixed models on data sets with over 100,000 observations across 5 variables, but it depends of course on what kind of data and algorithms you are using.

Maybe it is bad programming, but I'm not the one who wrote those programs, nor do I want/need to spend an extra 6 months learning to implement proper memory management in them. It would be much preferable to be able to put 32 GB into a laptop :D

But like I said: not a problem here and now, but it will be a problem when things scale up a notch.
 
More RAM does not improve performance; it merely allows you to open more apps/render more.

I honestly can't see why you'd need more than 16.

In my case, I easily used up 32 GB of RAM. I normally have 3 VMs running at the same time with 8 GB assigned to each, or 4 VMs at the same time with 3 of them having 8 GB and one having 4 GB.

That said, I do all this on my 27" iMac. It'd be nuts to do it on my rMBP because it'd heat up real fast.
 
But this is not a problem with the rMBP; it's a problem of bad programming using naive algorithms. Cache-aware code, with proper use of memory mapping, should be able to dramatically reduce the required RAM. If you are using R, there are some packages that can help. So far, I haven't encountered any problems running generalized mixed models on data sets with over 100,000 observations across 5 variables, but it depends of course on what kind of data and algorithms you are using.

If you do scientific work, you need a comfortable platform for running simulations and prototypes without spending time tweaking low-level stuff. You want to see results fast. Even better if you can dispatch your algorithm for computation sooner, while you implement the next step of your experiment. Of course, if you're dealing with a production environment, you'll develop or learn state-of-the-art libraries and frameworks, but most of the time you need to find evidence that corroborates your hypothesis -- FAST.
 
Do you need 32 GB now? OS X does have excellent memory management.

Yes, I need it. It would be nice to have both a Mac Pro and a MacBook Pro, but a MacBook Pro with 32 GB of RAM would be cheaper.
I want to open 2 or 3 RAM-hungry applications together, but right now I have to use one at a time...
 
Yes, I need it. It would be nice to have both a Mac Pro and a MacBook Pro, but a MacBook Pro with 32 GB of RAM would be cheaper.
I want to open 2 or 3 RAM-hungry applications together, but right now I have to use one at a time...

Can you tell us what software you use that is so "RAM-hungry"? I suspect you are misinterpreting your RAM needs. I can run Logic Pro, Photoshop, Parallels with Windows 7, iTunes, HandBrake, Safari, Mail, and just about anything else all at the same time in 8 GB on my 11" Air, and I have no problems.
 
If you do scientific work, you need a comfortable platform for running simulations and prototypes without spending time tweaking low-level stuff. You want to see results fast. Even better if you can dispatch your algorithm for computation sooner, while you implement the next step of your experiment. Of course, if you're dealing with a production environment, you'll develop or learn state-of-the-art libraries and frameworks, but most of the time you need to find evidence that corroborates your hypothesis -- FAST.

Agreed. Don't want to be messing about with low-level code; I'm not a computer scientist. I code for statistical modelling; that's my focus, and I don't really want to get distracted by stuff like memory-management tweaking.


There is also the point that I could actually run two analyses side by side at the same time if I had 32 GB. My current setup on 16 GB only uses half my processor power; there is no advantage to giving one analysis more CPU cores (for complex statistical reasons). Given that my current largest models take 24 hours to run on 2 of 4 cores and 16 GB of RAM, a 32 GB machine would let me run 2 analyses in those same 24 hours without any extra cores.
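
(A sketch of that side-by-side setup in Python; the model names and the fitting function are hypothetical stand-ins for the real analyses.)

```python
from concurrent.futures import ProcessPoolExecutor
import time

def fit_model(spec):
    # Hypothetical stand-in for a long-running statistical fit that
    # needs its own large chunk of RAM but only a couple of cores.
    time.sleep(1)                 # pretend this is the 24-hour run
    return f"{spec}: done"

if __name__ == "__main__":
    # Two independent analyses side by side: same wall-clock time as one
    # run, provided both working sets fit in RAM at the same time.
    with ProcessPoolExecutor(max_workers=2) as pool:
        for result in pool.map(fit_model, ["model_a", "model_b"]):
            print(result)
```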
 
Can you tell us what software you use that is so "RAM-hungry"? I suspect you are misinterpreting your RAM needs. I can run Logic Pro, Photoshop, Parallels with Windows 7, iTunes, HandBrake, Safari, Mail, and just about anything else all at the same time in 8 GB on my 11" Air, and I have no problems.
I agree that 99% of people do not utilize even 4 GB, but there are people in this thread who do.
If you run 3 VMs while editing tens of thousands of RAW files, then I can imagine pushing 32 GB of RAM.
I nevertheless would not do this on a laptop. But the time will come when it will be possible.
 