So is there any reason that Apple did not go with NVIDIA's Maxwell this year?

As for why they went with AMD: I imagine the reasons were far too numerous to list here, and that only the Apple employees involved in that decision know what they are and how the pros and cons worked out.

Despite the NVIDIA love going on here (I don't game, so I couldn't care less about them), the latest AMD Fury cards are benching faster in just about everything. Their new architecture with embedded RAM (HBM) seems to be a bit of a beast with massive bandwidth capabilities. Once these get to the mobile stage we could have some wonderful graphics cards in the rMBPs and iMacs.
 
Fury is still too hot to make it into mobile; I'd bet we're still a couple of years out for that. I do agree that '16 and '17 are going to be killer years for AMD, and it's about time.
 
I'm gonna go ahead and provide some of my experiences and advice here:

Just use what you like. If you're into Windows, that's cool. If you prefer the cost-effectiveness of a Windows notebook in terms of what you get, that's cool too. Macs aren't better than PCs by default, and PCs aren't better than Macs by default. No need to defend purchases, build quality, or other aspects of what makes up a computer.

That Asus may last you a year, or it may last you 3 or even 4 - it all comes down to what you prefer, and what you prefer to do with your money. It's your computer after all - no need to bash or defend what someone else is considering or is happy with.

I, for one, believe that with a few exceptions the longevity of a computer is influenced mostly by how it is taken care of - not by whether it's Apple, Asus, Acer, or what have you. I am aware of the idea and reality of build quality - but I had an Asus notebook for 3 years that I paid $650 for, a Lenovo IdeaPad (also $650) that has been in use within my family for 3 and a half years now, and a Hannspree (that's right - who here has ever heard of Hannspree?) netbook that I paid $250 for, which I used for a while and another family member is using now; that one was also purchased 3 years ago.

There is no default answer to this. Macs don't last X amount of years and Windows notebooks don't last Y amount of years - and, as always ... hardware specifications are not the sole reason to purchase a notebook. There are many other factors; just do your homework and find what you prefer.
I have an Asus notebook with an already outdated Core i5-2410M processor, 8 GB of RAM, and a GT 540M GPU. I used it from the very first day until the day I left home to come here for a higher degree.

When I upgraded the RAM from the default 2 GB to 8 GB, I encountered an intolerable BSOD issue. Asus eventually replaced the motherboard, and since then there have been no significant issues at all. Very solid, and it can still do a decent job, including heavy gaming (you may know Supreme Commander, which is a GPU hog). By contrast, many of my schoolmates' computers are suffering from overheating issues or hard disk errors.

I agree with your point. If you use it carefully, it can last a long time before it dies. If not, even an ultra-expensive rMB could become unusable in a few days.
 
I'm typing this on a cheapo Acer Windows 8.1 laptop. Within a year of purchasing it the DVD/CD-R drive failed, some of the number keys above the keyboard stopped working (three through six do nothing, although the function keys do), and I have a long Ethernet cable snaking across my kitchen floor because the wireless network adapter stopped working. All within a year.
Sorry to hear about your experience.
I still have an eMachines (a sub-brand of Acer) from 2011, and I treat it very well.
It's still alive and effective for most day-to-day tasks.

So it really does matter a lot how you care for your electronics.
 
Up until about 6 weeks ago I was using a 2009-era Latitude E6500; my new laptop is a 2011 Latitude E6420. PCs will last just as long as Macs.
 
There is no default answer to this. Macs don't last X amount of years and Windows notebooks don't last Y amount of years - and, as always ... hardware specifications are not the sole reason to purchase a notebook. There are many other factors; just do your homework and find what you prefer.

That's the thing: while I mostly agree about Apple's build quality, many people here who tout that quality typically buy new laptops every three years anyway. Me, I'd rather spend < $1,000 every 3 years than > $2,000 every 3 years.

As for Apple quality, my faith in them has been shaken by the issues with the dGPU. I still think Macs are great computers and it's hard for me to imagine getting anything else, yet if I'm spending over $2,000 on a computer I'd like the peace of mind that the dGPU won't blow out in 3 years or the anti-reflective coating won't start flaking off.
 
That peace of mind would be nice, but you just won't get it from anyone; all electronics can fail, and all mass-manufactured items will have issues in some units. That's life, I'm afraid.

I just don't understand why people hold Apple to far higher standards than they do everyone else...

You can't just point to price when all the other OEMs make computers that are very comparable in cost and specs, if seemingly not in build quality and attention to detail.
 
Despite the NVIDIA love going on here (I don't game, so I couldn't care less about them), the latest AMD Fury cards are benching faster in just about everything. Their new architecture with embedded RAM (HBM) seems to be a bit of a beast with massive bandwidth capabilities. Once these get to the mobile stage we could have some wonderful graphics cards in the rMBPs and iMacs.
Fury has almost no overclocking headroom and is still ever so slightly slower than the equally big 980 Ti chip, which has quite serious overclocking headroom. Fury is much better than the 290X/390X, but it still needs more power than Maxwell despite the more power-efficient and faster memory. It is also water cooled.
They can close in on Nvidia on desktops, where water cooling works and total power consumption is not as important, but on notebooks power efficiency is everything. And Fury cannot beat Maxwell; the M370X is still much older tech than Fury.
It is not about loving Nvidia or AMD, just the objective results, in which Maxwell quite simply is the best around right now.
 
Well, if we go back to 2013 you may note that in June, at WWDC 13, they announced the Mac Pro with dual AMD graphics. This was the start of Apple turning away from NVIDIA.

In October that year they released the 15" MacBook Pro with 750M graphics, but this was essentially the same chip as the 650M - both Kepler, same pin-out on the chip. All Apple had to do was swap the chips over. Very little change.

Fast forward to the Retina iMac: AMD again. Now we have the 2015 rMBP, also AMD.

I think what we are seeing here isn't really that Apple is choosing AMD because it's better, but because they are cultivating a closer relationship with AMD when it comes to graphics. That close relationship has resulted in the D300, D500 and D700 for Apple's Mac Pro, which are still complete Apple exclusives to this day (although they are based on the HD 7970), and Apple has been first to get the M370X too.

NVIDIA seems to be much less flexible; they won't be dictated to. They will not, for example, give Apple GTX 980s or 780 Tis (back then) rebranded as Quadro D780s. That is what AMD did for Apple.

So again, I don't think it has anything to do with performance. It's about a relationship the two companies are cultivating, and it's pushing NVIDIA out of Apple's systems. I think that's detrimental, as NVIDIA currently has the fastest parts in every thermal envelope.
Have you actually used AMD and Nvidia GPUs for anything like Final Cut Pro?

From my personal experience with real-world tasks, in content creation AMD GPUs are faster than Nvidia ones. MUCH faster. An R9 280X in a Mac Pro 5,1 (3.46 GHz, 24 GB of RAM) was faster than a GTX 970. 40% faster.

I know people will show me benchmarks that show how Nvidia GPUs are better than AMD at face detection. OK. But Sony Vegas Pro is a benchmark which simulates real-world tasks. Again, it simulates them. And even that benchmark shows that in transcoding, for example, AMD GPUs are faster than Nvidia ones. The only thing that makes Nvidia better in people's eyes is the gaming brand appeal. But even that does not show the whole truth. In 4K, for example, the R9 390X is only 20% slower than the GTX 980 Ti, which is praised everywhere for its performance. And as we have already seen in gaming, AMD GPUs are getting better every year, whereas Nvidia ones are getting worse. Compare benchmarks of the R9 290X vs the 780 Ti from 2013 and from 2015. They are really different. Now the R9 290X is faster.

Also, DirectX 12, Metal and Vulkan will shake things up. Adoption of low-overhead APIs was what AMD needed.

Fury has almost no overclocking headroom and is still ever so slightly slower than the equally big 980 Ti chip, which has quite serious overclocking headroom. Fury is much better than the 290X/390X, but it still needs more power than Maxwell despite the more power-efficient and faster memory. It is also water cooled.
They can close in on Nvidia on desktops, where water cooling works and total power consumption is not as important, but on notebooks power efficiency is everything. And Fury cannot beat Maxwell; the M370X is still much older tech than Fury.
It is not about loving Nvidia or AMD, just the objective results, in which Maxwell quite simply is the best around right now.
It turns out you may not be right. http://forums.anandtech.com/showpost.php?p=37519644&postcount=201
Check it out yourself. Before the OC the GPU was at a 12K overall score and a 15.5K graphics score.
A 100 MHz overclock, and the GPU starts to FLY! The score is higher than anything on the market right now.

Edit: about the power efficiency...
If you have a reference Nvidia GPU, then yes, it will use less power than an AMD GPU. However, a reference GTX 970 with the BIOS clock cap will be 3 times slower than an R9 280X. If you have a non-reference GTX 970, there is no clock cap in the BIOS, and it will be only 40% slower. The problem is that it will draw 270 W of power in OpenCL work.
http://www.tomshardware.com/reviews/nvidia-geforce-gtx-980-970-maxwell,3941-12.html Here is the proof.
 
I thought the same thing, but if you actually look at Maxwell OpenCL benchmarks all of NVIDIA's cards stomp AMD.

Really. Look at the 970 or 980 or Titan or Titan X or 980 Ti. Way above all the AMD cards in OpenCL benchmarks. Same situation with the mobile versions, the 950M and 960M.

The older NVIDIA Kepler cards were terrible at OpenCL - absolutely rubbish, 70-90% slower than AMD. But the Maxwell cards are 10-50% faster than AMD in the same market segments.

Even I was surprised to find this out when I investigated benchmarks recently; check this out: https://i.imgur.com/wiK5ZSG.png
They are not faster in REAL-WORLD TASKS! Use a Maxwell GPU for Final Cut Pro and you will throw it away and go back to an AMD solution. I have tested a GTX 970 and an R9 280X in Final Cut Pro, and the AMD GPU was 40% faster!

The Sony Vegas Pro benchmark shows only a glimpse of the difference between AMD and Nvidia GPUs in transcoding, but it's on a pretty big scale.
 
Have you actually used AMD and Nvidia GPUs for anything like Final Cut Pro?

I have not. I've only looked at benchmarks of general OpenCL performance. If you have any links to benchmarks showing differences in video suites I'd be interested in looking at them to educate myself.

They are not faster in REAL-WORLD TASKS! Use a Maxwell GPU for Final Cut Pro and you will throw it away and go back to an AMD solution. I have tested a GTX 970 and an R9 280X in Final Cut Pro, and the AMD GPU was 40% faster!

The Sony Vegas Pro benchmark shows only a glimpse of the difference between AMD and Nvidia GPUs in transcoding, but it's on a pretty big scale.

Again, if you have any benchmarks I'd be happy to look at them.
 
I can only give you what I have tested on my own. None of the reviews I have seen test real-world benchmarks, apart from the real-world workload simulation of Sony Vegas Pro. And that is the benchmark to go by.
 
I think something to consider is that Apple doesn't actually ship the GTX 970 or 980 in any of their systems, so I wouldn't expect Final Cut Pro to sing on them. I also wouldn't expect third-party developers to focus on that hardware.

But if all the OpenCL benchmarks are showing NVIDIA is faster, then there is probably something to that. OpenCL is, after all, just an API to perform workloads on the GPU.
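Since we keep throwing OpenCL numbers around, here is a minimal sketch (my own toy example in C - the kernel, buffer size and names are made up for illustration, and error handling is mostly omitted) of what "just an API to perform workloads on the GPU" looks like in practice. The host code asks the driver for a GPU, compiles a tiny kernel at runtime, and dispatches it; how fast that kernel actually runs is then entirely down to the hardware and the vendor's driver/compiler, which is exactly why AMD and NVIDIA can trade places from one benchmark to the next.

```c
// Minimal OpenCL host sketch: run a trivial kernel on the first GPU found.
// Build on OS X:  cc opencl_min.c -framework OpenCL
// Build elsewhere: cc opencl_min.c -lOpenCL
#include <stdio.h>
#ifdef __APPLE__
#include <OpenCL/opencl.h>
#else
#include <CL/cl.h>
#endif

static const char *kSource =
    "__kernel void scale(__global float *data, const float factor) {\n"
    "    size_t i = get_global_id(0);\n"
    "    data[i] = data[i] * factor;\n"
    "}\n";

int main(void) {
    cl_platform_id platform;
    cl_device_id device;
    cl_int err;

    /* Grab the first platform and its first GPU device. */
    err = clGetPlatformIDs(1, &platform, NULL);
    err |= clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, NULL);
    if (err != CL_SUCCESS) { fprintf(stderr, "no GPU device found\n"); return 1; }

    cl_context ctx = clCreateContext(NULL, 1, &device, NULL, NULL, &err);
    cl_command_queue queue = clCreateCommandQueue(ctx, device, 0, &err);

    /* The kernel source is compiled at runtime by the vendor's driver. */
    cl_program prog = clCreateProgramWithSource(ctx, 1, &kSource, NULL, &err);
    clBuildProgram(prog, 1, &device, NULL, NULL, NULL);
    cl_kernel kernel = clCreateKernel(prog, "scale", &err);

    /* A small buffer of data to process on the GPU. */
    float host[1024];
    for (int i = 0; i < 1024; ++i) host[i] = (float)i;
    cl_mem buf = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                sizeof(host), host, &err);

    float factor = 2.0f;
    clSetKernelArg(kernel, 0, sizeof(cl_mem), &buf);
    clSetKernelArg(kernel, 1, sizeof(float), &factor);

    /* Dispatch 1024 work-items, then read the result back to host memory. */
    size_t global = 1024;
    clEnqueueNDRangeKernel(queue, kernel, 1, NULL, &global, NULL, 0, NULL, NULL);
    clEnqueueReadBuffer(queue, buf, CL_TRUE, 0, sizeof(host), host, 0, NULL, NULL);

    printf("host[3] = %f (expected 6.0)\n", host[3]);

    clReleaseMemObject(buf);
    clReleaseKernel(kernel);
    clReleaseProgram(prog);
    clReleaseCommandQueue(queue);
    clReleaseContext(ctx);
    return 0;
}
```

Nothing here cares whether the device is AMD or NVIDIA; the performance differences people are arguing about come from the hardware and the driver behind these calls, not from the API itself.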
 
No. Not all.
http://scr3.golem.de/screenshots/15...uxmark-v3-(complex-scene,-gpu-only)-chart.png
http://s28.postimg.org/a8bdgk7pp/Capture.png

The second one is from the best review site right now: computerbase.de.

And the performance of Final Cut Pro is not down to optimization for AMD hardware. The AMD cards are simply faster in OpenCL in real-world tasks. If you want to believe that face detection is the key to OpenCL performance, then go ahead and try transcoding a film using face detection.

Let's end it here.
 
OK, first of all, calm down. Secondly, I've not once mentioned anything about face detection; that's all you. The OpenCL benchmark I posted a few pages back is an OpenCL scene render. You can view it here if you'd care to take a look: https://i.imgur.com/wiK5ZSG.png

Obviously the Fury X is the new OpenCL king, though it has only been out a few days. Perhaps we'll find that GPU in the refreshed Mac Pro; it's clear Apple is using AMD at every opportunity now. I don't for a second believe it's because of OpenCL performance, though.
 
I did not post that link because of Fury. It also covers the Grenada/Hawaii GPUs and GM204, and Grenada turns out to be faster than the GTX Titan X in some tasks.

Also, LuxMark v2.1 is an old benchmark. LuxMark v3 is the same thing but newer and more complex. It's like comparing 3DMark Vantage scores with 3DMark Fire Strike: the second one is newer, more complex, and shows a bit more than the old one.

Third, I'm completely calm. I just don't buy everything that is in reviews. If you want to believe that Maxwell GPUs are better at compute, which they are not, that's up to you. I know that not everything that gets into reviews is true, and it becomes really outdated with every new driver and technology release that comes along.
 
I'm not saying I don't believe the benchmarks you shared. I just don't think it is as black and white as you make out; many of the differences are within a few percentage points if we discount the Fury X, which I don't think we should, because it will probably be used in the Mac Pro refresh.

And I agree the newer benchmarks should be the focus. I'm not some LuxMark fanboy - I don't know anything about it; I just googled "OpenCL Maxwell" and it was the top hit.

Sip some lemonade and enjoy the warm weather.
 
It turns out you may not be right. http://forums.anandtech.com/showpost.php?p=37519644&postcount=201
Check it out yourself. Before the OC the GPU was at a 12K overall score and a 15.5K graphics score.
A 100 MHz overclock, and the GPU starts to FLY! The score is higher than anything on the market right now.
You might want to take a closer look. The score jumped from 14K at default to 17K, while an overclocked Maxwell hits 19.4K (while being air cooled).
http://uk.hardware.info/reviews/615...-flag-ship-graphics-card-overclocking-results
Fury is great value on desktops because you get a nice and relatively quiet water-cooling system, which alone would set you back quite a few pesetas. AMD also said memory OC is pointless on those GPUs, so I have my doubts about these overclocking results. I checked a few forum threads and no one could find anything confirming them. One user did not manage such an overclock and got nothing worthwhile out of a 50 MHz memory overclock.
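(For what it's worth, taking both sets of numbers at face value: a jump from 14K to 17K is roughly a 21% gain from that overclock, and 19.4K for the air-cooled Maxwell card would still be about 14% ahead of even the overclocked result.)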
Edit: about the power efficiency...
If you have a reference Nvidia GPU, then yes, it will use less power than an AMD GPU. However, a reference GTX 970 with the BIOS clock cap will be 3 times slower than an R9 280X. If you have a non-reference GTX 970, there is no clock cap in the BIOS, and it will be only 40% slower. The problem is that it will draw 270 W of power in OpenCL work.
http://www.tomshardware.com/reviews/nvidia-geforce-gtx-980-970-maxwell,3941-12.html Here is the proof.
A 280X is, on average, a 20-30% slower card than a 970 in 3D work. OpenCL work and a stress test are not the same thing. In theory you could get such load scenarios, but in reality most CL benches don't come anywhere near it.
A lot of the weakness Nvidia shows in such compute scenarios is purely the driver.
Compare the Quadro M6000 to something like the FirePro W9100.
http://www.tomshardware.de/quadro-m...ikkarte-benchmarks,testberichte-241756-7.html
Maxwell is quite good in all these scenarios; it just so happens that Nvidia does not want its consumer cards to be any good in workstation scenarios. Since Apple is responsible for a significant part of the driver on OS X, they would not have quite such purposefully crippled performance in these kinds of load scenarios under OS X. Probably not quite Windows performance, but not the stark difference between workstation and consumer performance.
 
We are at the point where Apple does need to try and lead the market. I understand that, in terms of revenue and logistics, it does things in cycles, whereas the PC makers are more nimble. The "pro" market could be better looked after, with options for more. I don't believe having the 750M is about a certain vision of computing but rather about buying in bulk and using a cheaper, weaker chip for the power savings, price, etc. There are those who would pay for a 970M, but the designs now value form over function to an extent.
 
Dusk, tell me why LuxMark v2 shows that GM200 is faster than the Hawaii GPU, while LuxMark v3 shows that the Hawaii GPU is faster than GM200. Both are the same chips. There is no crippling of performance, because Maxwell in DP is ...t. They differ by name only. How is it possible that the older bench shows an advantage for the Maxwell GPU, and the newer one shows an advantage for the Hawaii GPU?

http://scr3.golem.de/screenshots/15...uxmark-v3-(complex-scene,-gpu-only)-chart.png
Which chart is more valid, in your opinion? The older, or the newer?
 
The 960M is 60% faster than the M370X. I don't think it's overrated, but rather a preferred card compared to the lower-end AMD part Apple chose for this refresh. The 950M is also a much better card. I do agree with the complaints: Apple is charging the same price as always, and while the new card is more powerful, it's a step down in overall class.

The 960M is also 50% more power-hungry than the M370X, which won't fly in an already tight TDP design. The best Apple would be able to do is the 950M, which is similar to AMD's offering, maybe a little better. Why they didn't go for Nvidia I have no idea.
 
I would not go that far with power consumption. The GTX 960M should be within a 35-45 W power-draw envelope. And looking at the NotebookCheck review of the MBP, the power consumption already looks pretty high, because it can exceed even the 85 W limit of the power supply by 6 W. So I would not say it's more power-hungry than the M370X.

Also, I would not say it's 60% faster than the M370X, because performance in Final Cut Pro will be much higher on the AMD chip than on any Maxwell GPU available within a 45 W power budget.

In games, the Nvidia GPU will possibly be even 60% faster. But games are not everything.
 
The GTX 960M is 75 W, the 950M is 45 W, and the M370X is 50 W (apparently).
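A rough back-of-the-envelope with those figures (plus two assumptions of mine: the roughly 47 W TDP commonly quoted for the quad-core CPU in the 15" rMBP, and its 85 W power adapter):

47 W (CPU) + 75 W (GTX 960M) ≈ 122 W
47 W (CPU) + 50 W (M370X) ≈ 97 W

Either combination can exceed the 85 W adapter under combined load (the battery covers the peaks, as the NotebookCheck measurement mentioned above suggests), but the 960M would push the power and thermal budget much further, which is presumably why it "won't fly" in this chassis while the M370X just about does.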
 