Happy Snowman

I'm in the market for a new MBP. I've narrowed it down to the new 2016 models, 15-inch, 512 GB.

Basic information about me: I'm a graduate student at a university, also serving as a research assistant. My job requires me to perform relatively complex statistical analyses on large datasets (think 25,000 cases with 64 variables each). I may need to run these analyses in Windows 8 using VMware. Other than that, I'll be writing lesson plans and playing the very occasional game. I also use Adobe Creative Suite to design posters for presentations and do some side graphic design work for extra cash.

My dad (a programmer) is insisting that I spring for the 2.9 GHz processor, but I'm not really sure it's worth the extra $200. So I'm turning to you guys. Thoughts?

Thanks in advance!
 
Nah, same number of cores, same amount of L3 cache.

I don't think the 2.9 GHz processor is worth the price differential in this case.
 
I'd say you could work with either; you likely wouldn't notice a significant difference.
Fortunately, if you buy from Apple, you have 14 days to try it out and see if it is the right fit for you.
Personally, I usually opt for the faster processor, because I keep my Macs for several years and I find the faster configurations tend to hold their resale value better.
 
25k cases with 64 variables is really not big at all (I would say quite small). You, as a human, would notice zero difference between the two processors. You could run analytics and see that the 2.9 finishes several hundred nanoseconds faster.
Even with recursive functions running permutations and/or combinations, a human will only barely notice a difference in response times.

When you start running permutations/combinations on multi-billion-entry tables several times per day, then you begin wasting work time waiting on computation. That is when you want to start looking for a faster processor.

Your money would be better spent on 16 GB of RAM.
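
If you want to sanity-check that yourself, here's a minimal sketch (Python/numpy as a stand-in for whatever SPSS or SAS do internally; only the 25,000 x 64 shape comes from the thread, everything else is made up) that times an ordinary least squares fit on data that size:

    # Time an OLS fit on a synthetic dataset the size the OP describes.
    # Only the 25,000 x 64 shape is from the thread; the rest is illustrative.
    import time
    import numpy as np

    rng = np.random.default_rng(0)
    n_cases, n_vars = 25_000, 64
    X = rng.standard_normal((n_cases, n_vars))
    y = X @ rng.standard_normal(n_vars) + rng.standard_normal(n_cases)

    t0 = time.perf_counter()
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)  # ordinary least squares
    print(f"OLS on {n_cases}x{n_vars}: {(time.perf_counter() - t0) * 1000:.1f} ms")

On any recent machine this should finish in milliseconds, which is the point: at this scale the 2.7 vs. 2.9 question is invisible.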
 
Can you put a representative workload on a USB stick and take it to a store? At our local Apple Store, they've said it was OK to load something on their systems and see how the performance compared. At the time, it was a large-ish (250K-cell) spreadsheet running under StarOffice. You may not be able to make an exact comparison, but you could get some data points to help you decide. I agree with a previous post: spend the money on other resources, such as RAM or disk.
 
Your money would be better spent on 16 GB of RAM.

There are no RAM options on the 15-inch. Your options are CPU, GPU and storage.

I'd suggest spending the extra cash on a power brick or two. You'd be surprised how convenient it is not having to pack one up and carry it around.
 
If you will be going the VM route, you won't see much gain from the 2.9 versus the 2.7. You are in the VM software's CPU realm, which is a generic "processor": VMs present a common profile, as it were, for image building. It's why VMs tend to be very stable; they don't have as many driver complications and such.

Most of the flexibility I have seen is really in your NIC selection for that config. You can adjust the graphics card a bit, but it's more feature-based. A Windows VM I'll give more power and features; my CLI Linux VMs get the lowest memory possible, since they're just text, and features like 3D acceleration aren't needed, so they're disabled.

Since you mention statistical applications, what is being used? Is a switch to R feasible? It removes your platform dependency, for starters.
 
First of all, thanks for the responses here. I really appreciate everyone's advice.

25k cases with 64 variables is really not big at all (I would say quite small). You, as a human, would notice zero difference between the two processors. You could run analytics and see that the 2.9 finishes several hundred nanoseconds faster.
Even with recursive functions running permutations and/or combinations, a human will only barely notice a difference in response times.

When you start running permutations/combinations on multi-billion-entry tables several times per day, then you begin wasting work time waiting on computation. That is when you want to start looking for a faster processor.

Your money would be better spent on 16 GB of RAM.

I might be doing some bootstrapping, longitudinal data analysis (multilevel modeling) in SAS, and multiple imputation procedures. That'll be the occasional thing, though. Most days it'll be relatively simple regressions. I do my super complex structural equation modeling on a remote server.

All 15" MBPs come with 16 GB of RAM, so no worries.

Can you put a representative workload on a USB stick and take it to a store? At our local Apple Store, they've said it was OK to load something on their systems and see how the performance compared. At the time, it was a large-ish (250K-cell) spreadsheet running under StarOffice. You may not be able to make an exact comparison, but you could get some data points to help you decide. I agree with a previous post: spend the money on other resources, such as RAM or disk.

Hmm... not a bad idea. It can't be a truly representative case – that could violate privacy/intellectual property laws – but I can generate a random number set.
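
Something like this would do it (a pandas/numpy sketch; the column names and file name are invented, and only the 25,000 x 64 shape comes from the thread):

    # Write a privacy-safe synthetic dataset to carry to the store on a USB stick.
    # Column names and the file name are made up for illustration.
    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(42)
    df = pd.DataFrame(rng.standard_normal((25_000, 64)),
                      columns=[f"var{i:02d}" for i in range(64)])
    df.to_csv("synthetic_cases.csv", index=False)  # open it on the demo machine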

If you will be going the VM route, you won't see much gain from the 2.9 versus the 2.7. You are in the VM software's CPU realm, which is a generic "processor": VMs present a common profile, as it were, for image building. It's why VMs tend to be very stable; they don't have as many driver complications and such.

Most of the flexibility I have seen is really in your NIC selection for that config. You can adjust the graphics card a bit, but it's more feature-based. A Windows VM I'll give more power and features; my CLI Linux VMs get the lowest memory possible, since they're just text, and features like 3D acceleration aren't needed, so they're disabled.

Since you mention statistical applications, what is being used? Is a switch to R feasible? It removes your platform dependency, for starters.

I'm not very familiar with R (I know I'm going to have to learn it eventually), but I'm primarily using SPSS and SAS (in a virtual machine). I use Mplus on a virtual machine at the moment, but I might load it on the new Mac.
 
They're both Core i7s with the same amount of L3 cache and roughly the same boost clock. I think you can figure on a straight clock-speed performance difference for pure-CPU jobs that fit in cache, so that's a max of about 10%. For real jobs that access memory and do I/O, it's probably 5% or less. Only you can decide whether that's worth $200 to you. I know that for me the answer would be no, and I do a fair amount of software development (compiling, etc.) in a VM.
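
To put rough numbers on that reasoning, here's an Amdahl's-law back-of-the-envelope (the CPU-bound fractions are illustrative guesses; only the 2.7 and 2.9 GHz figures come from the thread):

    # Best-case speedup from the clock bump, scaled by how CPU-bound the job is.
    clock_ratio = 2.9 / 2.7  # ~1.074, so ~7% best case for cache-resident work
    for cpu_fraction in (1.0, 0.5, 0.25):  # guessed share of wall time on the CPU
        speedup = 1 / ((1 - cpu_fraction) + cpu_fraction / clock_ratio)
        print(f"{cpu_fraction:.0%} CPU-bound -> {speedup - 1:.1%} faster overall")

That prints roughly 7.4%, 3.6% and 1.8%, which lines up with the "max of 10%, probably 5% or less" estimate above.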
 
For your stated purpose, I agree with the consensus here: don't spend that extra $200 on the 2.9 GHz processor. Personally, I'd put it towards the $100 graphics upgrade to the optional 4 GB Radeon Pro 460, with a couple of USB-C dongles thrown in for legacy connectivity; or towards the $400 storage bump to 1 TB; or, if the latter is too much, towards a good-sized, quality external HDD for Time Machine backups.
 
Your dad should know better. You won't notice the difference. It's like having a car that does 0-60 mph in 5.0 seconds versus one that does it in 4.9 seconds: both crazy fast!

Depends on what his father programs. If it's in the science/technical realms and/or so-called "big data," even 0.2 GHz can help. It can chip away seconds (well, milliseconds, to be accurate, lol) here and there on very long analyses.

By long I mean press Enter and walk away for a good 15 minutes, easy. Because life is too short to watch your stat program think deep and hard for that time frame. Big data gets... well... big, lol.

For the OP's datasets, in time they might see the difference if they ran the analyses in the native OS; how VMs work was covered already. It's nice stuff... it's just that the only thing leveraging the power of the hardware is the host OS that sits on it. Guest OSes don't even see the hardware.

2011 to 2015 wasn't an omfg massive jump in processor speed in the grand scheme of things, but I did see some time shaved off older analyses I reran for a comparison. Here enter personal preferences and expectations. In my case, even a one-minute savings that turns a 15-minute run into a 14-minute run is pretty good.

It breaks down to the scale of the work: 0.4 GHz in the 2011-to-2015 case for me, 0.2 GHz in the case the OP is looking at. That only shaves a millisecond here and there. In a quick Excel-based analysis of small data, the run is over too quickly to notice. But for some R stuff I do in data science classes, the runs can take several minutes, which gives ample time for those milliseconds to stack up into a somewhat noticeable difference, especially with R or the stat packages the OP is using, which can be doing some complex things with the data depending on the packages called up and the data being looked at.
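
Rough numbers on that (the 2011-era clocks below are guesses; only the 2.7/2.9 pair is from this thread, and this assumes runtime scales directly with clock speed, which is the best case):

    # How much a clock bump could shave off a 15-minute, CPU-bound run.
    # The 2.2 -> 2.6 GHz pairing for 2011 -> 2015 is a guess for illustration.
    run_seconds = 15 * 60
    for label, old, new in [("2016 choice (2.7 -> 2.9 GHz)", 2.7, 2.9),
                            ("2011 -> 2015 (guessed 2.2 -> 2.6 GHz)", 2.2, 2.6)]:
        saved = run_seconds * (1 - old / new)
        print(f"{label}: ~{saved:.0f} s saved per 15-minute run")

So roughly a minute or two saved on a long run, in the best case, which matches the experience described above.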
 
... Guest OSes don't even see the hardware...

I think I know what you mean, but just to prevent confusion... any big-name virtual machine manager (VirtualBox, VMware, etc.) will run user code natively on the hardware CPU. I/O devices get virtualized, so guest OSes don't normally see actual hardware, or if they do, they are constrained in some way.

I didn't want someone unfamiliar with today's VMs to think that user programs running in a VM were interpreted or something like that. They run pretty much at full speed until they try to do I/O of some sort.
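
One way to see this for yourself: run the same pure-CPU snippet natively in macOS and again inside the Windows VM and compare times. A minimal sketch (Python/numpy assumed installed on both sides; the sizes are arbitrary):

    # A pure-CPU microbenchmark with no I/O: run it in the host and in the guest.
    # If the VM executes user code natively, the two times should be close.
    import time
    import numpy as np

    a = np.random.default_rng(1).standard_normal((2000, 2000))
    t0 = time.perf_counter()
    for _ in range(5):
        a = a @ a  # CPU-bound matrix multiplies
        a /= np.abs(a).max()  # rescale so values don't overflow
    print(f"elapsed: {time.perf_counter() - t0:.2f} s")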
 
I think I know what you mean, but just to prevent confusion... any big-name virtual machine manager (VirtualBox, VMware, etc.) will run user code natively on the hardware CPU. I/O devices get virtualized, so guest OSes don't normally see actual hardware, or if they do, they are constrained in some way.

I didn't want someone unfamiliar with today's VMs to think that user programs running in a VM were interpreted or something like that. They run pretty much at full speed until they try to do I/O of some sort.


Very true. If the OP or others get access to a nice ESX-based system, VMware does nice things there (it's the one I work with and administer; I won't speak with any authority on the others out there).

Now we can go down the container-technology route, where this gets even more fun to firmly define. I think of it as existentialist computing at times, since sometimes it's best described as "they just exist," lol. What's the app running in? A container. No, I meant what OS? Free your mind of that... it just runs, and that is all there is to it, lol. Or, like me one day going overboard geek: a Parallels VM running Linux with Docker installed, running a Docker container holding a complete Linux install from Docker's repos. An OS running in a recursive setup, as it were... the odd stuff that amuses me at times, lol.
 
Either one should work fine. I got the 2.9 in my new 15-inch MBP; it's what they had in stock when I walked into the store.
 