The AMD 560 Pro GPU is so far behind nVidia. Any hope for an improvement?

Discussion in 'MacBook Pro' started by whitedragon101, Apr 2, 2018.

  1. whitedragon101 macrumors 65816

    Joined:
    Sep 11, 2008
    #1
    I just checked some benchmarks and this one says it all. The current high-end MacBook Pro uses an AMD 560 Pro GPU, and compared against the nVidia offerings it's nowhere. Now yes, these GPUs are desktop ones, but remember that since the 900 series nVidia's mobile variants are within 10-20% of their desktop counterparts. Some of the nVidia mobile versions offer equal performance to the non-overclocked desktop chips.

    https://www.videocardbenchmark.net/gpu.php?gpu=Radeon+Pro+560&id=3820

    AMD are so far behind nVidia you need binoculars to see them. Are Apple going to stay the course with AMD just because they get them cheap?
     
  2. leman macrumors G3

    Joined:
    Oct 14, 2008
    #2
    You are looking at different weight categories — as in, you are comparing a mid-range GPU optimised for power efficiency to end-of-the-line desktop gaming GPUs...

    The 560 Pro is a 35W part. The GTX 1060 (mobile) draws 80W of power, and its lower-powered version (which is proportionately slower) still draws 60-70W. So yes, you get a 2x increase in performance for a 2x increase in power usage. If anything, AMD is very competitive with Nvidia at lower TDPs — and seems to surpass it with Vega.
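    The performance-per-watt claim above can be sanity-checked with rough arithmetic. A quick sketch (the relative-performance figures are illustrative assumptions based on the "2x" claim, not measured benchmarks):

    ```python
    # Rough performance-per-watt check using the TDP figures above.
    # rel_perf values are assumptions based on the post's "2x" claim.
    gpus = {
        "Radeon Pro 560": {"tdp_w": 35, "rel_perf": 1.0},
        "GTX 1060 (mobile)": {"tdp_w": 80, "rel_perf": 2.0},
    }

    for name, g in gpus.items():
        ppw = g["rel_perf"] / g["tdp_w"]
        print(f"{name}: {ppw:.4f} relative performance per watt")
    ```

    On these assumed numbers the 35W part comes out slightly ahead per watt (1.0/35 ≈ 0.029 vs 2.0/80 = 0.025), which is consistent with AMD being competitive at low TDPs.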

    P.S. Why are there no hi-end AMD cards in that list?
     
  3. meme1255 macrumors 6502a

    Joined:
    Jul 15, 2012
    Location:
    Czech Republic
    #3
    1) The R560 is over a year old.
    2) Apple uses AMD because NVIDIA ignores OpenCL.
    3) The R560 has significantly lower TDP.
     
  4. sputnikBA, Apr 2, 2018
    Last edited: Apr 2, 2018

    sputnikBA macrumors 6502

    Joined:
    Jan 2, 2018
    #4
    I don't think it's a pricing thing, it's a mix of several things:

    1) Apple & Nvidia have had bad blood between them ever since those Macs with the GPUs that had really bad issues and had to be recalled / given extended warranties.
    2) As I understand it, Nvidia also don't want to collaborate for some kind of technical reason (I think it's to do with not wanting to give away any info that could eventually help Apple in their own GPU creation efforts for iOS hardware).
    3) Apple don't really want to do too much to support Nvidia in their attempts to become a monopoly in the GPU industry. (See discussions relating to CUDA, Gameworks etc)
    4) [and the main one tbh] is power draw. The Nvidia mobile GPUs do come impressively close to their desktop counterparts, but not without a power cost. As long as Apple is pursuing thinness (even dropping the 15" MBP battery from 99.5Wh to 76Wh in the 2016/2017s), there is no way they can put in one of the higher-powered Nvidia GPUs without really hurting battery life or having to compromise on the types of CPUs they prefer to use.
     
  5. whitedragon101 thread starter macrumors 65816

    Joined:
    Sep 11, 2008
    #5

    Thanks.

    Not sure why there are no high-end AMD cards on the list. I think it's probably because their numbers in the market are small, as they were not very successful and nVidia have won the high ground for the last 5 years. There is an RX 480 on the list.

    It's almost impossible to find hard numbers like the TDP to do apples-to-apples comparisons. It's like the TDPs are secret or something. Also, sometimes I find a TDP number and then find it's totally at odds with the number on another website. E.g. I have seen a mobile 1050 quoted at both 75W and 35W TDP.
     
  6. jerryk macrumors 601

    Joined:
    Nov 3, 2011
    Location:
    SF Bay Area
    #6
    Apple is stuck without NVidia for now. And it really hurts them with people who want to use NVidia-specific features based on CUDA: people doing tasks like AI and ML, and users of some graphics programs.
     
  7. meme1255 macrumors 6502a

    Joined:
    Jul 15, 2012
    Location:
    Czech Republic
    #7
    I am sorry for these people, but nVidia has brought it on itself by its ignorance and things such as the recent GeForce Partner Program or PhysX running on nVidia GPUs only...
     
  8. jerryk macrumors 601

    Joined:
    Nov 3, 2011
    Location:
    SF Bay Area
    #8
    Why is NVidia stuck? Just compare stock prices ($9 AMD vs $220 Nvidia) and profitability. Nvidia has also made big inroads into self-driving and other fields, diversifying away from lower-margin segments like gaming.
     
  9. meme1255 macrumors 6502a

    Joined:
    Jul 15, 2012
    Location:
    Czech Republic
    #9
    AMD gives away many technologies for free - nVidia keeps them for itself. Do you still wonder why Apple cooperates with AMD?
     
  10. jerryk macrumors 601

    Joined:
    Nov 3, 2011
    Location:
    SF Bay Area
    #10
    Specifically, what are you referring to? CUDA drivers are free, the CUDA Deep Neural Network (cuDNN) libs are free, PhysX is free.

    You seem to imply you can take a high performance driver from one GPU and run it on another with no degradation. Unfortunately that is not true.
     
  11. meme1255 macrumors 6502a

    Joined:
    Jul 15, 2012
    Location:
    Czech Republic
    #11

    Free in the sense of usage, not in the sense of providing source code to the community or giving the technology away (FreeSync (AMD, free, runs via DP or HDMI) vs G-Sync (proprietary, requires a specific chip for the same functionality), for example).
     
  12. whitedragon101 thread starter macrumors 65816

    Joined:
    Sep 11, 2008
    #12
    I agree. While nVidia are storming away with performance in gaming, AI, and self-driving, I do prefer the business practices of AMD as a company. However, nVidia's business savvy and much larger R&D budget seem almost insurmountable.

    I really hope AMD have something in the works that can put them back in contention at the top. At least they have the console market sewn up (except for Nintendo) for guaranteed revenues.

    nVidia are almost a monopoly in Windows PC gaming at the moment, and they are charging ever-increasing amounts for their GPUs because there is no competition to make them do otherwise. nVidia hold the top 16 spots for the most-used graphics cards on Steam, 16 !!

    http://store.steampowered.com/hwsurvey/videocard/?sort=pct
     
  13. leman macrumors G3

    Joined:
    Oct 14, 2008
    #13
    AMD is quite competitive with Nvidia at the moment... the problem is that you can't buy any of their GPUs since they are being instantly bought up by cryptocurrency miners.

    On a more serious note, Nvidia used its superior marketing to bind a substantial part of the professional segment to its hardware, since they were the only party to offer decent quality programming tools. Unfortunately, the Khronos group failed at the practical aspect of their job regarding OpenCL, which is partly the reason why we have the proprietary mess we have today. CUDA is inherently very dangerous, since using it means voluntarily creating a monopolist who is in total control of your business — and a single mistake from Nvidia would mean a disaster for the CUDA users. And recent PR debacles show that Nvidia is getting too cocky with their status.

    As far as hardware goes, AMD's current offerings are technologically a superior product. They offer more flexible setups, proper task scheduling in hardware, adaptive execution, etc. (and you can see that in complex workflows such as raytracing, where Vega outperforms everything else). Nvidia's GPUs are simpler, which is probably also the reason why they are more energy efficient and ultimately perform so well in embarrassingly parallel tasks. Of course, technological superiority does not necessarily translate to a better product — Nvidia's success is in part good marketing and in part smart decisions as to which tradeoffs to make.
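    For anyone unfamiliar with the jargon: an "embarrassingly parallel" task is one where every work item can be computed independently, with no communication between items — exactly what wide, simple GPU designs excel at. A minimal sketch in Python (the shading function is a made-up placeholder):

    ```python
    from multiprocessing import Pool

    def shade(pixel):
        # Each output depends only on its own input; no work item
        # ever needs data from another, so the job splits perfectly
        # across however many workers (or GPU cores) you have.
        return min(255, pixel * 2)

    if __name__ == "__main__":
        pixels = list(range(8))
        with Pool(4) as pool:
            out = pool.map(shade, pixels)
        print(out)  # [0, 2, 4, 6, 8, 10, 12, 14]
    ```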

    I think my point is that your premise might be wrong: while you are correct that Nvidia has much higher market share and visibility in the pro market, it's not really the case that AMD's GPUs are inferior. AMD is in an awkward position right now, that much is sure, and it will be difficult for them to bounce back — especially since a lot of damage has been done already by users who helped create the monopolist.
     
  14. jerryk macrumors 601

    Joined:
    Nov 3, 2011
    Location:
    SF Bay Area
    #14
    Good summary. AMD failed to recognize something about GPUs early on: they can do a lot more than graphics.

    GPUs can be built from simple, dense sets of processors that are good at basic calculations. This lines up nicely with applications like cryptography, machine learning, etc. Nvidia did realize this and backed into their dominant position by making it relatively simple to access this processing power. Also, like Apple, Nvidia was shrewd and made their sources proprietary and tied to their architecture.

    However, there is hope, sort of. I have heard of recent efforts to make a variation of the CUDA API that runs on AMD hardware. The library is API-transparent to the caller, so little or no code changes are needed to use it.
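    The "API-transparent" idea — same calls, a different backend underneath — can be sketched generically. This is purely an illustration of the pattern with hypothetical names, not the actual CUDA/AMD library being described:

    ```python
    # Illustrative sketch of an API-transparent compatibility layer:
    # callers program against one interface; the vendor backend is
    # chosen at runtime. (Hypothetical names, not a real library.)

    class NvidiaBackend:
        def alloc(self, nbytes):
            return ("nvidia_buffer", nbytes)

    class AmdBackend:
        def alloc(self, nbytes):
            return ("amd_buffer", nbytes)

    def select_backend(vendor):
        return AmdBackend() if vendor == "amd" else NvidiaBackend()

    # Caller code is identical whichever backend is selected:
    backend = select_backend("amd")
    print(backend.alloc(1024))  # ('amd_buffer', 1024)
    ```

    The caller never mentions the vendor again after backend selection, which is why ports need "little or no code changes."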
     
  15. leman macrumors G3

    Joined:
    Oct 14, 2008
    #15
    I don't think that AMD failed to recognise this; after all, AMD GPUs were designed as general-purpose processors before Nvidia did it. Where AMD failed was in playing its cards right. They relied too much on a rather inefficient committee and an open standard which was not backed up by any concrete tools. In the meantime, their competitors all but ignored the open initiative and developed their own, easy-to-use framework — which, incidentally, could only be used on their hardware...

    Of course, Khronos learned from their and ARB's mistakes, and Vulkan comes in much better shape (which doesn't make it any more usable of course — that API is rather nightmare-inducing :D). But it's a bit too late for OpenCL, I am afraid...
     
  16. jerryk, Apr 3, 2018
    Last edited: Apr 3, 2018

    jerryk macrumors 601

    Joined:
    Nov 3, 2011
    Location:
    SF Bay Area
    #16
    "Only used on their hardware" has made Apple the biggest company in the world. So it obviously has a good track record as a business strategy. :)
     