RMBP users, are you disappointed...

Discussion in 'MacBook Pro' started by darksithpro, Oct 15, 2017.

  1. darksithpro macrumors 6502a

    Oct 27, 2016
    ...that Apple still chooses AMD/ATI for their graphics chipsets? I know the AMD GPUs are inferior to Nvidia's in a lot of aspects, and there have been some QC issues with those chips in the past, with some recalls. Curious, because for paying so much for such a nice machine you'd think they'd opt for Nvidia GPUs instead, wouldn't you say?
  2. Tarrant64 macrumors member

    Mar 10, 2006
    Not really. If I needed more GPU power, it probably wouldn’t have been my purchase overall. I think it’s a machine that sacrifices some performance for the sake of mobility. Also, they do support external GPUs now, so I could always go that route.

    But everyone’s needs are different.
  3. leman macrumors G3

    Oct 14, 2008
    Given the fact that Nvidia hasn't yet produced a better GPU within the given thermal bracket, not really. It is undoubtedly true that Nvidia takes the performance/watt crown in a higher TDP bracket. But in the lower TDP segment, where the 15" MBP traditionally has been positioned (30-40W GPUs), Nvidia doesn't have a better GPU than current AMD offerings.
  4. Samuelsan2001 macrumors 604

    Oct 24, 2013
    This, basically. Nvidia GPUs running at this TDP are no better. Apple have done what they always do: choose the best GPU for their thermal design and battery life, and optimise their own apps to scream on the available hardware.
  5. bjet767 macrumors 6502a

    Oct 2, 2010
    Not sure, but what’s the point of this thread?

    Most people really couldn’t care less what the GPU brand is as long as it works.
  6. mcpryon2 macrumors 6502a


    Dec 12, 2008
    I'm not disappointed in the AMD choice. I use FCPX for work and it flies on my MBP. Creative Cloud is fine, too, even without CUDA. But I'm using it for work, not as a gaming rig, though World of Warcraft is pretty sweet on my 2016 15" with a 4K display; that's just icing on the cake.

    Actually, most of the graphics problems I've had with Macs have been Nvidia, namely MBPs from the timebomb era. Thankfully Apple had their repair programs going and I had a few fixed through that.

    That said, recent desktop NVidia cards, like the GTX 980, have run like champs in my 5,1 Mac Pros.

    Things have changed a lot from when I started building and using computers. There's so much good stuff out there and it's easy to get hung up on data sheets. Of course, I remember when you needed half a million dollars in equipment to do a crossfade on video tape and you hoped you'd get the frames you wanted, so maybe I'm easily impressed with modern technology.
  7. Sterkenburg macrumors 6502

    Oct 27, 2016
    Well, it's not that simple, actually. Some apps are optimized to run well on certain cards (e.g. FCPX on AMD, the Adobe suite on Nvidia), and users of some key technologies like CUDA have no choice but to use GPUs from a specific maker.

    As others have said, Apple chose the current Radeon Pro cards for several valid reasons:

    - No better cards from Nvidia in the same TDP range
    - Apple's own apps running better on AMD
    - Better external display support than other cards in the same range (iirc)

    I personally would prefer Nvidia because of CUDA support, but it's obvious that no choice will ever please everyone, and people have been complaining about GPUs ever since the first MBP was launched on the market anyway (sometimes not without valid reasons, tbh).
  8. Queen6 macrumors 604


    Dec 11, 2008
    Flying over the rainforest at dawn - Priceless
    Apple optimises OS X and its applications for AMD GPUs, which is a major factor. As for the MBP, it doesn't have the thermal headroom for a powerful dGPU. The MBP is simply not capable of adequately cooling the likes of a GTX 1060 or greater, as Apple values the aesthetic over performance.

    If you want a notebook with a powerful dGPU you have to look outside Apple's offerings, or wait on Apple's eGPU solution, which to me is becoming less attractive given the advancements in mobile dGPUs and notebook design. As others have stated, given the MBP's form factor an Nvidia dGPU would not be significantly better, and potentially worse.

  9. Daeve macrumors member

    Sep 11, 2007
    Being stuck with a borked Nvidia dGPU in my 2012 rMBP, I'd be more than happy with an AMD dGPU. I took it in before the recall ended and it passed all tests; a month later: kernel panics, blank screens, and 20 reboots to get it to work each time.
  10. ET iPhone Home macrumors 68040

    ET iPhone Home

    Oct 5, 2011
    Orange County, California USA
    It's all in the packaging. Have you watched "HAUL" vlogs on YouTube, where the buyers are enticed more by the packaging than the product inside? Apple builds and designs a great exterior shell with mediocre internals, but heck, I bought one, a rMBPtb, so there you go.
  11. Patcell macrumors 6502

    Aug 8, 2016
    Bergen County, NJ
    As someone mentioned earlier, I think a big reason Apple went with the Radeon Pro chips was external display support... they wanted to boast dual 5K display support and, because 5K uses multi-stream transport, they needed a card that could drive 5 display signals simultaneously, which the Radeon Pro can. The GPU essentially sees each 5K display as two monitors, so when two are connected, the GPU thinks it’s driving 4 displays, plus the internal one, and you get 5. None of the current Nvidia options in the mobile space offer 5-monitor support.
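    The stream count above works out as simple arithmetic; here is a toy sketch of it (an illustration based on this post's MST description, not anything Apple publishes):

    ```python
    # Back-of-the-envelope count of simultaneous display streams for
    # dual 5K output on the 15" MBP, per the MST explanation above.
    # Each 5K display over multi-stream transport (MST) presents as
    # two tiled panels, i.e. two streams to the GPU.
    STREAMS_PER_5K_MST_DISPLAY = 2
    external_5k_displays = 2
    internal_panel = 1  # the built-in Retina display

    total_streams = (external_5k_displays * STREAMS_PER_5K_MST_DISPLAY
                     + internal_panel)
    print(total_streams)  # 5
    ```

    Which matches the post's claim that the GPU must be able to drive 5 display signals at once.
    
    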
  12. Samuelsan2001 macrumors 604

    Oct 24, 2013
    I know exactly what you are saying but disagree that it’s about aesthetics; it’s about portability. Apple know that the best computer is the one you have with you; they have always used mediocre GPUs to maintain portability in their laptops and always will.

    Nothing has changed, every version since the PowerBook has sacrificed graphical power for portability and battery life. Expecting that to change is the issue here.
  13. Queen6 macrumors 604


    Dec 11, 2008
    Flying over the rainforest at dawn - Priceless
    I would consider it both, as Apple does make compromises for the aesthetic, and all makers do for portability. I do agree that the trend is unlikely to change; equally, for some the latest MBP is more an exercise in diminishing returns. It seems a lot of sacrifice of usability and capability, yet equally Apple is meeting the needs of its target audience.

    If people need more capable hardware, or shall I say hardware more specific to their use case, then one needs to look outside of Apple. I would love to see Apple reintroduce the PowerBook as a more flexible & powerful professional tool; equally, the customer base doesn't exist in sufficient numbers, as was exemplified by the demise of the 17" MBP.

    Portability, again, depends on perspective: my own 15.6" W10 notebook is definitely thicker than the MBP, but also far more powerful & flexible, and even under full load it runs cool. For that I have no issue with a couple of extra pounds in weight, including the power pack. As ever, much depends on the use case, workflow etc.

    I'm lucky, as my hardware pays for itself very rapidly, so I can choose what I want; all it needs to do is meet my requirements, be it a Mac or Windows system, ultraportable, 2-in-1, portable workstation or even a gaming notebook, and generally I don't keep the notebooks much more than 24 months.

  14. spacebro Suspended

    Oct 1, 2015
    I'm disappointed that many years after the promise of Thunderbolt, then Thunderbolt 2, then Thunderbolt 3: STILL. NO. EXTERNAL. GPU! I thought for sure we would have this in the 2016 MacBook update, and instead they put USB-C on it for no damned reason; nothing uses it, still, a year later!
  15. Samuelsan2001 macrumors 604

    Oct 24, 2013
    There are TB3 external GPUs and enclosures that use a USB-C connection, with full support in High Sierra.

  16. The Mercurian macrumors 68000

    Mar 17, 2012
    Maybe I've become a rarity, but GPU performance is not important to me once the OS can display everyday stuff. CPU matters a lot to me, and I'm positively drooling at the prospect of extra cores, but until GPU programming becomes more easily accessible I couldn't give a monkey's what kind of graphics card is in it.
