Best MacBook Pro for a computer science student

The best MacBook for CS (read thread first)



siebe004

macrumors newbie
Original poster
Sep 6, 2018
1
0
Belgium
Soon I'm going to study computer science at uni, so I need a MacBook Pro, but I want to know which of these models would be the best for me. They are both the same price (€1999) and both 13".

1. MacBook Pro 2017 without Touch Bar
  • 2.5GHz dual-core 7th-generation Intel Core i7 processor, Turbo Boost up to 4.0GHz
  • Intel Iris Plus Graphics 640
  • 16GB 2133MHz LPDDR3 memory
  • 128GB SSD storage
  • Two Thunderbolt 3 ports
2. MacBook Pro 2018 with Touch Bar
  • 2.3GHz quad‑core 8th‑generation Intel Core i5 processor, Turbo Boost up to 3.8GHz
  • Retina display with True Tone
  • Touch Bar and Touch ID
  • Intel Iris Plus Graphics 655
  • 8GB 2133MHz LPDDR3 memory
  • 256GB SSD storage
  • Four Thunderbolt 3 ports
 

Audit13

macrumors 603
Apr 19, 2017
5,617
1,434
Toronto, Ontario, Canada
Does the computer science program require you to run specific software on macOS, Windows, Linux, or another operating system?

I recommend matching the hardware to the required programs.
 
  • Like
Reactions: duervo

psingh01

macrumors 65816
Apr 19, 2004
1,479
484
I second that. Get what you can comfortably afford. Laptops in general have been powerful enough to do most things for many many years now. Long gone are the days when you sacrificed power for portability in a laptop.

What you do at school isn't going to require any crazy amount of power anyway either. You can get the cheapest MacBook and it'll be fine imo. A MBP of any year will be more than enough.
 

icymountain

macrumors 6502
Dec 12, 2006
332
166
Impossible to say with so little information. In absolute terms, I would say that 128GB of storage is more likely to be limiting than 8GB of RAM.

What kind of CS curriculum are you going to start?
For some, you may need more storage (e.g., if you use a lot of virtual machines); for others, you will want more RAM.

Will you have access to other computers?
If you have a desktop at home, your storage needs on the laptop are lower.

What software will you use? Will you study theoretical CS, or will you do a lot of programming? If the latter, with what environment? (Some are far heavier than others.)
 

jerryk

macrumors 603
Nov 3, 2011
6,057
3,065
SF Bay Area
Either one will work. I would lean toward the 2018 just because I think it is more reliable (keyboard). The only issue I have with both of these is that they are 13" systems; I find 13" tight for coding.
 

sosumi99

macrumors 6502
Oct 27, 2003
362
320
As has been said, both are way overkill for what you need. I personally lean toward the 2017 for your use because many comp sci programs will have you learn something like vi/vim, where the escape key is crucial. The Touch Bar MBPs render the escape key as a virtual key on the Touch Bar, and that is just a horrible, horrible user experience. Thought I'd point this out because it's not the kind of thing most people think about.
 
  • Like
Reactions: kanon14

haralds

macrumors 65816
Jan 3, 2014
1,225
374
Silicon Valley, CA
Go for the larger screen size and look for a larger monitor in the dorm. Make sure you have enough SSD space to run Boot Camp in case a class demands a Windows-based IDE. Make sure you have a backup drive for a Carbon Copy Cloner clone, not just for backup but also for emergency booting. Trouble always hits right before the due date.
 

buran-energia

macrumors 6502
Oct 9, 2017
283
103
2018, no brainer.

128GB is too little. I'd stay away from the non-Touch Bar models because of the inferior cooling solution (one fan vs. two) and the weaker GPU and CPU (the CPU especially, compared to the 2018).

You could also wait a bit and see what Apple shows in a week. By the way, if the student discount applies to custom configurations, you can get the 2018 with 16GB for the same price.
 
  • Like
Reactions: afir93

Ries

macrumors 68020
Apr 21, 2007
2,212
2,628
Screen real estate / RAM / storage and a dGPU (it doesn't have to be a top model; it's purely for learning GPU-accelerated stuff); the rest is inconsequential as long as it's somewhat decent. Anything requiring horsepower will probably be run on a desktop in the lab anyway.
 

deadworlds

macrumors 65816
Jun 15, 2007
1,026
752
Citrus Heights,CA
As a current computer science student, I would say to get something that will grow with you. In my coursework I had to regularly run VMs because some of the courses had us working with Linux. There were also some courses that had us use Windows software, like Visual Studio. I personally would go with a 15" model because you're going to spend a lot of time sitting and staring at the screen, and the bigger size will put less strain on your eyes. My courses also included some projects where I had to create games using either Unreal Engine or Unity, so having a dedicated GPU was very helpful.
I guess what I'm saying is you don't really know how intense your coursework will be, so if you have the money you should get something that can grow and handle any surprises as you progress through your courses.
 

jerryk

macrumors 603
Nov 3, 2011
6,057
3,065
SF Bay Area
Screen real estate / RAM / storage and a dGPU (it doesn't have to be a top model; it's purely for learning GPU-accelerated stuff); the rest is inconsequential as long as it's somewhat decent. Anything requiring horsepower will probably be run on a desktop in the lab anyway.
If you mean machine learning with a GPU, you need an Nvidia GPU, not AMD.
 

Ma2k5

macrumors 68020
Dec 21, 2012
2,476
2,428
London
Maybe see if Apple releases something in the next few weeks, if you can wait. A potential new MacBook Air might suit your needs better.
 
  • Like
Reactions: uecker87

Ries

macrumors 68020
Apr 21, 2007
2,212
2,628
If you mean machine learning with a GPU, you need an Nvidia GPU, not AMD.
There are several options for AMD on a Mac, like PlaidML with Keras, or booting into Linux and using ROCm. (Or an eGPU with Nvidia.)
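For anyone curious, pointing Keras at PlaidML is just a backend switch. A minimal sketch, assuming the `plaidml-keras` package is installed and `plaidml-setup` has already been run to pick the AMD GPU:

```python
import os

# Select the PlaidML backend *before* Keras is imported.
# (Assumes `pip install plaidml-keras` and a prior `plaidml-setup` run;
# "plaidml.keras.backend" is the backend name that package registers.)
os.environ["KERAS_BACKEND"] = "plaidml.keras.backend"

# From here on, `import keras` would build and train models on the
# AMD GPU via PlaidML instead of TensorFlow/CUDA.
print(os.environ["KERAS_BACKEND"])
```

The environment variable has to be set before the first `import keras`, since Keras reads it once at import time.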
 

jerryk

macrumors 603
Nov 3, 2011
6,057
3,065
SF Bay Area
There are several options for AMD on a Mac, like PlaidML with Keras or booting into linux and using ROCm. (Or an eGPU with Nvidia)
Unless it changed recently, ROCm is not gaining much traction. The contributions and updates to the GitHub repo have been minimal. Its current status on GitHub shows "There hasn't been any commit activity on RadeonOpenCompute/ROCm in the last week." 0 pull requests.

While we may not like it, ML is a CUDA world at the low end. And the large companies (Google, Microsoft, and others) are building custom low-bit hardware dedicated to ML instead of using graphics processors. Google is even pushing its TPUs down to edge computing devices soon.
 
Last edited:

Ries

macrumors 68020
Apr 21, 2007
2,212
2,628
Unless it changed recently, ROCm is not gaining much traction. The contributions and updates to the GitHub repo have been minimal. Its current status on GitHub shows "There hasn't been any commit activity on RadeonOpenCompute/ROCm in the last week." 0 pull requests.

While we may not like it, ML is a CUDA world at the low end. And the large companies (Google, Microsoft, and others) are building custom low-bit hardware dedicated to ML instead of using graphics processors. Google is even pushing its TPUs down to edge computing devices soon.
Sure, but he is a CS student; the point is to learn the how and why (the principles), not a specific implementation. Once you know the stuff, you can apply it to any implementation you want.
 

jerryk

macrumors 603
Nov 3, 2011
6,057
3,065
SF Bay Area
Sure, but he is a CS student; the point is to learn the how and why (the principles), not a specific implementation. Once you know the stuff, you can apply it to any implementation you want.
Just pointing out that AMD is not useful here.

As a CS student you may or may not need GPU acceleration for ML. But if a student's program builds more than basic models, especially deep NNs, training on a CPU alone can take a very long time. And it really sucks when it turns out you made a mistake and just wasted hours or days. Better to find out you made an error 10 to 20 times faster with a GPU.
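To make that "10 to 20 times faster" concrete, here is a back-of-the-envelope comparison (illustrative numbers only, not a benchmark; the 30-hour CPU figure is an assumption):

```python
# Hypothetical training run: what a wasted run costs at CPU speed
# versus a 10x or 20x GPU speedup (illustrative numbers only).
cpu_hours = 30.0  # assumed CPU-only training time

for speedup in (10, 20):
    gpu_hours = cpu_hours / speedup
    print(f"{speedup:>2}x GPU: {gpu_hours:.1f} h instead of {cpu_hours:.0f} h")
# → 10x GPU: 3.0 h instead of 30 h
# → 20x GPU: 1.5 h instead of 30 h
```

The point is the feedback loop: with a GPU you discover a bug after a coffee break instead of after an overnight run.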
 

Ma2k5

macrumors 68020
Dec 21, 2012
2,476
2,428
London
Just pointing out that AMD is not useful here.

As a CS student you may or may not need GPU acceleration for ML. But if a student's program builds more than basic models, especially deep NNs, training on a CPU alone can take a very long time. And it really sucks when it turns out you made a mistake and just wasted hours or days. Better to find out you made an error 10 to 20 times faster with a GPU.
If it helps, I'd presume any course which benefits from a dGPU will have resources allocated to let students do the work well without having to buy a dGPU-equipped machine (be it lab computers or a remote desktop service to log in to that does the processing server-side). Most students have very crappy laptops, to be fair; I'm talking sub £300-£500. At my university we were able to remotely log in to our university machines (which are far more powerful than a normal laptop).
 
  • Like
Reactions: macjunk(ie)

jerryk

macrumors 603
Nov 3, 2011
6,057
3,065
SF Bay Area
If it helps, I'd presume any course which benefits from a dGPU will have resources allocated to let students do the work well without having to buy a dGPU-equipped machine (be it lab computers or a remote desktop service to log in to that does the processing server-side). Most students have very crappy laptops, to be fair; I'm talking sub £300-£500. At my university we were able to remotely log in to our university machines (which are far more powerful than a normal laptop).
Sounds like a good program. Here in the States you see some ML programs aligning themselves with large companies such as Google for the same reason. But you are typically only given a finite amount of time on these servers.

Did your program have such restrictions, or did they have enough hardware not to worry about it?
 

Ma2k5

macrumors 68020
Dec 21, 2012
2,476
2,428
London
Sounds like a good program. Here in the States you see some ML programs aligning themselves with large companies such as Google for the same reason. But you are typically only given a finite amount of time on these servers.

Did your program have such restrictions, or did they have enough hardware not to worry about it?
We had no restrictions that I know of (we weren't made aware of any).
 