Mobile Tech Review just put up a video of the 2.8 GHz version:

https://www.youtube.com/watch?v=HRnVrlFfK0g

Some benchmark info in there plus a little Tomb Raider action starting at 19:20.
Link:
https://youtu.be/HRnVrlFfK0g?t=1160

Darn, you beat me to it. lol
I've been waiting for MTR to do the review for this laptop because they are unbiased and fair when reviewing and comparing. Plus, they actually game on it and show what it's like in real-life gaming and tasks, in addition to giving it the rundown of benchmarks.
I'd say this should help answer questions, or put to rest the idea that this GPU sucks (it clearly doesn't). It isn't the highest-performing gaming GPU, but it can certainly still get that job done, and it barrels through regular video/photo work. *noice! :)
I'm glad that Tomb Raider looks awesome at high settings; the benchmarks showed a big improvement in many areas and some slight improvement in others. Overall this video made me even happier I got this guy. I'm going to go ahead and say good job, Apple and AMD (an AMD GPU that doesn't destroy my laptop... Yes! j/k). ;) Also, it's about time Apple upgraded the GPU to something new; they should have done an upgraded GPU last year... but whatever. I'm glad we got one now.
+1 Apple and AMD

Sorry for the long weird post, I apparently need sleep. Ha ha.


Edit: will anyone ever be happy with anything Apple does? lol
Maybe if she had played Arkham Knight (the one that isn't out) that would have shown the real performance of the 370. ;)
It's probably Apple's fault that didn't get released earlier. ;) j/k



Kal.
 
Three-year-old Cape Verde, two-year-old Tomb Raider; seems appropriate, don't you think? ;)

I am not even sure that this GPU is Cape Verde. This was first claimed by Anandtech, who did an article on the 8800M series. However, there are also slides from AMD stating that the 8800M is based on a newer architecture. And the AMD SDK states that the 8800M series supports OpenCL 2.0, while Cape Verde chips are limited to OpenCL 1.2. If that is all true, then this GPU is not 3 years old, but merely 2 years old :p
 
Darn, you beat me to it. lol
I've been waiting for MTR to do the review for this laptop because they are unbiased and fair when reviewing and comparing. ...
Kal.

I really like Lisa as a reviewer.
Nice to see a grown-ass person handling these reviews.
 
I really like Lisa as a reviewer.
Nice to see a grown-ass person handling these reviews.

Completely agree, 100%. Lisa does a great job with every Apple review I've seen to date. Very in-depth and descriptive of what people are looking for.

Now after all of these benchmarks and reviews from all of the various sources out there, would you consider this a worthy upgrade or no?
 
Now after all of these benchmarks and reviews from all of the various sources out there, would you consider this a worthy upgrade or no?

For me this is a very nice upgrade.
I'm coming from a late 2010 Macbook Air (the fans in that thing, christ...) and a 2011 "gaming" laptop (with an AMD 6970M) and this updated Pro neatly replaces both of them.
-when it arrives ...in two weeks :(
 
For me this is a very nice upgrade.
I'm coming from a late 2010 Macbook Air (the fans in that thing, christ...) and a 2011 "gaming" laptop (with an AMD 6970M) and this updated Pro neatly replaces both of them.
-when it arrives ...in two weeks :(

Good stuff! My question was mainly in regard to the machine listed in my signature and whether it's a worthy upgrade or not. I have the Early 2013 with the GT 650M in it and the 2.7 GHz Ivy Bridge with 16GB RAM.
 
Good stuff! My question was mainly in regard to the machine listed in my signature and whether it's a worthy upgrade or not. I have the Early 2013 with the GT 650M in it and the 2.7 GHz Ivy Bridge with 16GB RAM.

Just the SSD speed will increase 4-5x, and it will feel like you've moved to a supercomputer. The GPU jump will be steeper for you than for 750M users as well. I would say it would be a solid upgrade.
 
Just the SSD speed will increase 4-5x, and it will feel like you've moved to a supercomputer. The GPU jump will be steeper for you than for 750M users as well. I would say it would be a solid upgrade.

He probably won't even feel the difference here unless he is constantly moving large files from SSD to SSD.
 
Just the SSD speed will increase 4-5x, and it will feel like you've moved to a supercomputer. The GPU jump will be steeper for you than for 750M users as well. I would say it would be a solid upgrade.

I feel the same as well. I wonder how the GT 650M compares to the M370X in terms of performance and FPS and whatnot. The only real game I play is WoW, and currently I play at 2880x1800 on low settings and get like 40 FPS; when doing intense stuff like 25-man raids I drop below 30, sometimes into the teens, so I'm curious how much of a performance delta there is between mine and the new GPU. Clearly there will be a gain here, but I just wonder how much. Thanks though, mike, for all of your hard work on this topic in here over the past week :)
 
Good stuff! My question was mainly in regard to the machine listed in my signature and whether it's a worthy upgrade or not. I have the Early 2013 with the GT 650M in it and the 2.7 GHz Ivy Bridge with 16GB RAM.

I'm not really comfortable giving direct advice in these matters.
It comes down to personal needs and wants.
If you need the GPU and SSD bump; go for it.

I'd say the M370X should give you a decent 50-100% performance boost, depending on the task. Then again, I am hardly an expert on such things.

In my case, I don't play a lot of recently released games; mostly "old" stuff.
So I'm happy with a decent GPU that is at about the same level as my 4-year-old gaming laptop.

Sorry for this, a bit of a digression:
I hated that damned gaming laptop. It was loud as all hell, big, heavy, and a pain to type on. I mean, even when idle with the fans at their lowest level you could hear the cursed noise across the room. And once those fans hit Mach 3... you could burn down the house with that thing.

So, if I had your machine and could use it for browsing and office work without the fans making my ears bleed, and could play the odd bout of Skyrim/Shogun 2/Pillars of Eternity/Deus Ex: Human Revolution at decent settings, I would not upgrade.
 
He probably won't even feel the difference here unless he is constantly moving large files from SSD to SSD.

I could feel a significant difference when launching apps, and I'm already using an x2 PCIe SSD. Safari also felt significantly snappier. For me the biggest perceived speedup was in Xcode, but most probably don't use it for that. Screen loads in D3 and WoW were quicker too. Again, it depends on what you value.
 
I could feel a significant difference when launching apps, and I'm already using an x2 PCIe SSD. Safari also felt significantly snappier. For me the biggest perceived speedup was in Xcode, but most probably don't use it for that. Screen loads in D3 and WoW were quicker too. Again, it depends on what you value.

I mainly value the SSD and the dGPU. The CPUs are so damn good now I mean it's just nuts. I would also consider getting the 2.8 processor instead of the 2.5. Seems like it might be worth it.
 
I am not even sure that this GPU is Cape Verde. This was first claimed by Anandtech, who did an article on the 8800M series. However, there are also slides from AMD stating that the 8800M is based on a newer architecture. And the AMD SDK states that the 8800M series supports OpenCL 2.0, while Cape Verde chips are limited to OpenCL 1.2. If that is all true, then this GPU is not 3 years old, but merely 2 years old :p

You may be right, but usually the chip's ID is the same as Cape Verde's, and that is usually hard-coded into the silicon, not reported by the driver per se. The only other possibility is that GPU-Z may be wrong (as it was in the 970 fiasco).

Good stuff! My question was mainly in regard to the machine listed in my signature and whether it's a worthy upgrade or not. I have the Early 2013 with the GT 650M in it and the 2.7 GHz Ivy Bridge with 16GB RAM.

Keep your machine; you have the top-of-the-line Ivy Bridge, which isn't much slower than Haswell. Unless you're going to be moving gigabytes of stuff between a Thunderbolt 2 SSD array and your SSD, you're not going to notice a huge difference in day-to-day activity. The CPU is the same as last year's model, and we've mostly seen only synthetic benchmarking on the GPU so far. As above, unless GPU-Z is wrong, you're paying top dollar for a 3-year-old GPU. (In other words, it's old enough that Apple could've put it in your current laptop.)

Next year is going to be Skylake or bust... There's also potential for a 14nm dGPU in there, which would be the first die shrink in GPU tech in over 3 years (a lifetime in the computer world).
 
You may be right, but usually the chip's ID is the same as Cape Verde's, and that is usually hard-coded into the silicon, not reported by the driver per se. The only other possibility is that GPU-Z may be wrong (as it was in the 970 fiasco).



Keep your machine; you have the top-of-the-line Ivy Bridge, which isn't much slower than Haswell. Unless you're going to be moving gigabytes of stuff between a Thunderbolt 2 SSD array and your SSD, you're not going to notice a huge difference in day-to-day activity. The CPU is the same as last year's model, and we've mostly seen only synthetic benchmarking on the GPU so far. As above, unless GPU-Z is wrong, you're paying top dollar for a 3-year-old GPU. (In other words, it's old enough that Apple could've put it in your current laptop.)

Next year is going to be Skylake or bust... There's also potential for a 14nm dGPU in there, which would be the first die shrink in GPU tech in over 3 years (a lifetime in the computer world).

Interesting stuff. All fair points. I guess it comes down to my specific needs and day to day tasks as to whether it's worth it to sell or hold on. I appreciate your feedback!
 
Interesting stuff. All fair points. I guess it comes down to my specific needs and day to day tasks as to whether it's worth it to sell or hold on. I appreciate your feedback!

No problem, I see you have almost the same machine as mine. At this point, I'll have to get the $3000+ machine, as I really need a bigger SSD; my 512GB is almost always maxed out between 4K video files and Boot Camp...
 
You may be right, but usually the chip's ID is the same as Cape Verde's, and that is usually hard-coded into the silicon, not reported by the driver per se. The only other possibility is that GPU-Z may be wrong (as it was in the 970 fiasco).

Well, that's the thing. The device ID of the HD 7770 (the 640-shader-core Cape Verde) is 0x683D. The device ID of the 8870M and M370X is 0x6281. As far as I know, the GPU-Z database is created by people; the information about chips etc. is not pulled from the GPU itself. And I would guess that after Anandtech declared it Cape Verde, the GPU-Z creators might very well have taken that information over. At any rate, I was unable to find any official (that is, with AMD as the definitive source) information on the matter. The only official-looking things I saw were that presentation stating that the 8870M has second-gen GCN (what Anandtech calls GCN 1.1) and that the 8870M supports OpenCL 2.0 (which Cape Verde certainly does not).
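To make the ID point concrete, here's a toy lookup sketch in Python. This is just an illustration of why a hard-coded device ID is a more useful fingerprint than a driver-reported marketing name, not how GPU-Z actually works; the IDs are the two quoted above, and the labels are mine:

```python
# PCI device IDs are baked into the silicon, so two chips with
# different IDs are different designs even if marketing names overlap.
# IDs from the post above; labels are illustrative only.
KNOWN_IDS = {
    0x683D: "Cape Verde (Radeon HD 7770)",
    0x6281: "8870M / M370X (architecture disputed)",
}

def identify(device_id: int) -> str:
    """Look up a PCI device ID; unknown IDs fall through."""
    return KNOWN_IDS.get(device_id, f"unknown (0x{device_id:04X})")

print(identify(0x6281))  # → 8870M / M370X (architecture disputed)
print(identify(0x683D))  # → Cape Verde (Radeon HD 7770)
```

The point being: if the M370X really reported 0x683D, the Cape Verde claim would be settled; since it reports 0x6281, the question stays open.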

So yeah, in the end it's very confusing. It's unfortunate that AMD won't provide any clear information on their website. All this rebranding, and tons of different chips that turn out to be the same, makes it very difficult to understand what is going on. And then you have reputable sources like Anandtech and others which provide contradictory information, and people repeat this stuff all the time, often mixing bits and pieces.
 
So yeah, in the end it's very confusing. It's unfortunate that AMD won't provide any clear information on their website. All this rebranding, and tons of different chips that turn out to be the same, makes it very difficult to understand what is going on.

Wholeheartedly agree with you on this.
I'd love to get an official statement from AMD on the nature of the M370X.

And then there is the little gnome at the back of my head telling me that AMD is keeping mum because the details are unsavory...shut up, Gnome, you suck the joy out of life.
 
I'd love to see Premiere Pro / Media Encoder export benchmarks of the M370X using OpenCL vs the 750M in both OpenCL and CUDA, as glitchy as it is.

I'm usually forced to do software-only exports on my rMBP because of the showstopping CUDA issue and the OpenCL export glitches, which make a 2-minute export take around 20 minutes.


Not fun for a $2800 machine that's barely two years old.
 
I also just submitted a support / product info request on this GPU to AMD, asking what architecture it's based on and how many shader cores it has. Wonder if they'll provide it.
 
Agreed, if FCPX truly renders close to 2x faster than the 750M, I may just upgrade to the 2015 anyway, even without the major dGPU upgrade.
With a caveat, I'll agree. As an owner of a late-2013 rMBP I was impressed by the results of the new Mac, but then I read the specs of the compared Macs and found that the late-2013 is crippled (to a degree) in that comparison by the last spec: "512GB PCIe-based Flash Storage (x2 Link Width)". It is the only Mac in that comparison with an x2 link-width PCIe SSD, while the other three have x4 link-width PCIe SSDs.

I'm holding onto my 1TB SSD dGPU Mac for just a little longer (read: Skylake) - and I'll feel good about it. IMHO robART isn't fully comparing comparable Apples to Apples in that comparison.
 
I also just submitted a support / product info request on this GPU to AMD, asking what architecture it's based on and how many shader cores it has. Wonder if they'll provide it.

Good luck; AMD wasn't forthcoming at all even to Anandtech, which is a major tech site. You'll probably get some canned response from some useless PR lackey.
 
He probably wont even feel the difference here unless he is moving large files from an ssd to ssd constantly.

The inverse benefit of ludicrous speed is battery life. As transfer speeds get faster, the system spends less of its time in the full-power state and more time sleeping, saving power. That's the entire theory behind "race to sleep" for CPUs and cell radios: expending a bit more energy for a short period of time is worth it instead of drawing the work out over time.
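A toy sketch of the "race to sleep" arithmetic. The power figures (20 W active, 1 W idle) and the 10-second window are made-up numbers for illustration, not measured values for any of these machines:

```python
# "Race to sleep": over a fixed window, a faster transfer means less
# time at full power and more time idling, so less total energy used.
ACTIVE_W = 20.0   # assumed full-power draw during the transfer (watts)
IDLE_W = 1.0      # assumed idle/sleep draw (watts)
WINDOW_S = 10.0   # wall-clock window we account over (seconds)

def energy_joules(transfer_s: float) -> float:
    """Energy over the window: active while transferring, idle after."""
    return ACTIVE_W * transfer_s + IDLE_W * (WINDOW_S - transfer_s)

slow = energy_joules(8.0)  # slower SSD: 8 s at full power -> 162 J
fast = energy_joules(2.0)  # 4x faster SSD: 2 s at full power -> 48 J
```

Both machines do the same work, but the faster one races back to idle and comes out well ahead on energy over the same window, which is the whole argument.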
 