No, I didn't "describe how AMD GPUs run". I only mentioned facts, and they're pretty darn straight.

The AMD Radeon Pro 460 runs at a 35W TDP, is far more efficient at compute, supports asynchronous compute (unlike any Nvidia GPU), and is far more efficient at OpenCL. No Nvidia GPU at 35W is as efficient.

Facts. Straight.

Nvidia 1060 TDP is 120W
Nvidia 1070 TDP is 150W
Nvidia 1080 TDP is 180W

They're like electrical furnaces compared to the Radeon Pro 460 at 35W.

Facts. Straight. The Polaris 11 GPU, which the Radeon Pro 460 is built on, was designed specifically to be an effective, efficient laptop GPU. It's not scaled down for that purpose. It's designed for that purpose. The leafblowing 10xx series from nvidia is a desktop GPU.

And you're actually whining about the lack of one of these leafblowers in the new thin-and-light MacBook Pro, you know, the one with limited battery life, questionable heat spreading, and a super-low-power CPU/RAM.

Reality doesn't much intrude in the life of the nvidiot. Whatever nvidia poops out must be moar powah efficient, and totes lol better than Ati. Lulz.

This is misleading to forum readers. You're comparing AMD's mobile GPUs to Nvidia's desktop GPUs.
 
Correct, however even Nvidia's laptop GPUs are more power hungry than AMD's. The point is that Adobe needs to get off their behind and use the AMD GPU. News flash: Apple is not going to buy Nvidia just because Adobe refuses to optimize for it.
 
Fact: Adobe refuses to optimize for AMD versus Nvidia's furnace cards. Thus, everyone you meet will tell you just how poor a choice the AMD card was for Apple to make. The users will not turn on their Adobe masters and demand that they optimize for AMD; they just cry about how poor a choice Apple made. They do not know about TDP or efficiency, and they don't care, because they want the Adobe thing to work better year over year with no effort from Adobe.

For me, it's more about the horrible track record of AMD chips in Apple laptops. After you've had a couple of several-thousand-dollar machines go bad because of the GPU, you tend to favor the other options.
 
Was that one very widespread? I don't recall hearing much about it, but I'm sure either kind can have issues.
Yep, the 2011–2012 Nvidia machines had a REP (repair extension program). I had to have the logic board swapped out. That made it better, but it will come back.

This all starts and ends with the solder and poor QA at the chip makers.

I had to fight with Apple, no less, to get the thing fixed.

So Nvidia eats more power versus AMD right now, and Nvidia is also just as likely to fail.

This whole uproar is all about Adobe and their lack of optimization. This is borne out in Final Cut vs. Premiere.

I saw somewhere that Nvidia never did CUDA for the Mac. I assume these people were rebooting into Boot Camp Windows and using CUDA and Adobe there. Thus, if you rip the Nvidia out and drop in AMD, it is suddenly worse for you, since Adobe does not care about AMD.

This is all about Adobe. I do not fault you for using Adobe; I just fault you for saying Nvidia is better because it works better for your application, rather than taking a holistic view that includes power and heat management.

Adobe could optimize and make it work just as well if they wanted to.
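
For what it's worth, here is roughly what "optimizing for the AMD GPU" means at the API level. The sketch below is a minimal OpenCL device enumeration in C (my own illustration, not anything from Adobe's code); it just asks the OpenCL runtime for whatever GPUs are present and prints them. The same binary picks up a Radeon Pro 460 or an Nvidia card, because OpenCL is vendor-neutral; the lock-in only appears when an app writes its compute paths against CUDA instead. On macOS it builds with "cc list_gpus.c -framework OpenCL" (the file name is just an example).

```c
/*
 * Minimal OpenCL sketch: enumerate every GPU the runtime exposes.
 * Illustrates that OpenCL code is vendor-neutral: it finds a
 * Radeon Pro 460 the same way it would find an Nvidia part.
 */
#include <stdio.h>

#ifdef __APPLE__
#include <OpenCL/opencl.h>
#else
#include <CL/cl.h>
#endif

int main(void) {
    cl_platform_id platforms[8];
    cl_uint num_platforms = 0;

    if (clGetPlatformIDs(8, platforms, &num_platforms) != CL_SUCCESS || num_platforms == 0) {
        fprintf(stderr, "No OpenCL platforms found\n");
        return 1;
    }
    if (num_platforms > 8) num_platforms = 8;  /* we only fetched up to 8 */

    for (cl_uint p = 0; p < num_platforms; ++p) {
        cl_device_id devices[8];
        cl_uint num_devices = 0;

        /* Ask this platform for GPU devices only; skip it if it has none. */
        if (clGetDeviceIDs(platforms[p], CL_DEVICE_TYPE_GPU,
                           8, devices, &num_devices) != CL_SUCCESS)
            continue;
        if (num_devices > 8) num_devices = 8;

        for (cl_uint d = 0; d < num_devices; ++d) {
            char name[256] = {0};
            char vendor[256] = {0};
            cl_uint compute_units = 0;

            /* Standard OpenCL queries: device name, vendor, compute-unit count. */
            clGetDeviceInfo(devices[d], CL_DEVICE_NAME, sizeof(name), name, NULL);
            clGetDeviceInfo(devices[d], CL_DEVICE_VENDOR, sizeof(vendor), vendor, NULL);
            clGetDeviceInfo(devices[d], CL_DEVICE_MAX_COMPUTE_UNITS,
                            sizeof(compute_units), &compute_units, NULL);

            printf("GPU: %s (%s), %u compute units\n", name, vendor, compute_units);
        }
    }
    return 0;
}
```

Whether Premiere is fast on a given GPU is of course about the compute kernels behind calls like these, not the enumeration itself; the point is only that the vendor-neutral path exists.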
 

Fair enough... I'm no expert on the subject (just a bit sore about getting burned a couple of times). :(

Agreed on the solder, and that's for sure why heat is such an issue. Remember all the PS3 and Xbox problems? Same thing. A friend of mine had a $12k wave solder machine and did a good bit of business fixing them for a number of years.
 
Yeah, it is not like Apple's thermals are bad. They are pretty good, and much better this cycle than last. The issue is that eco (lead-free) solder is hard to work with, and if one single pin gets a cold joint, you're done for.

I also had an AMD 2011 iMac with the same issue, so I have had it from both card makers and Apple. Apple did a decent job of sweeping most of it under the rug with REP programs to make it just go away until the machines reached EOL. This, I think, is Apple's plan: just sweep the mess under the rug.
 

I'm glad to hear you think the thermals are better this cycle. I'm still debating whether to go with an MBP (and try to get an eGPU going, and/or wait until Apple, hopefully, supports it better) or a desktop/lesser-laptop combo. My big fear of using an MBP as my primary machine is potential thermal damage when doing heavier stuff. (Having damaged a couple in the past this way!)

I like having one machine, but I suppose I'm better off doing the combo. At least it's getting easier these days to work across multiple machines.
 
The thermals are better by about 10°C under load, easily. I think it depends on what your workflow with it looks like.

I am a system admin/developer. The only reason I got a 15 was the screen size. I am, however, strongly considering downgrading to a 13 for portability.

I loved my iMac to the day it died, and the day I got it fixed was the day it left my hands. I was not going to fight with Apple over graphics card #3.

I want to drive a 5K display with a laptop. The current lineup will do that with no issue, whether 13 or 15. I do not see myself needing two 5K displays.

The only perks of the 15 for me are the screen size and four ports. The 13 with the Touch Bar is a no-go for me; the battery is too small for anything I need.
 

It's a mixture of issues for me re: thermals.

First, I'd like it to be able to handle a reasonable load without getting crazy noisy, as the noise is irritating and I'm getting into podcasting. I think I'd probably be fine there if I didn't try to fire up too much stuff, or certain apps, while recording, etc.

Second, I do a good bit of video encoding and sometimes 3D rendering as well. The video encoding usually comes in chunks, so maybe it's running full blast for 30 minutes to a couple of hours. The 3D rendering, however, could keep it running flat-out for much longer. I suppose I'm best off just offloading that, but I'd like not to have to worry about damaging a Pro machine just because I use it. :)

My friend is trying to talk me into just using a MB/MBA or entry MBP, and then building a Hackintosh. He built a couple of them several years ago (one for a server, another for a workstation) and both have been excellent. The workstation one still kicks the %#( out of anything Apple makes.... and he can keep upgrading it. (I generally don't like to mess with that stuff... but if Apple refuses to give me the hardware, I think that's where I'm going. Plus, if the OS keeps going as it has been, it will make for an easier transition to Windows.)
 
I will say the 13 non-Touch Bar is a fantastic ultraportable. I am borrowing my SO's to test. I have the 15, and I am seriously thinking of downgrading.
 

Yes, that's the only one I'd likely go with. Aside from not needing the Touch Bar (and maybe preferring regular keys?), the non-replaceable SSD is something I'd try to avoid at all costs. AFAIK, that's a part that WILL fail several years down the road, more or less, depending on usage. A laptop should be able to go for a minimum of 5 years, up to 10, IMO.
 

Honestly, I have not seen a high volume of SSDs dropping off from the 2012 line, so I will say the jury is still out on that. The SSD in the non-Touch Bar unit is not soldered on, so it is a costly fix, but a fix can be mustered.

The thing is, Apple charges a flat rate to fix the bottom case on all of their laptops, so that is somewhat reassuring.
 
This is misleading to forum readers. You're comparing AMD's mobile GPUs to Nvidia's desktop GPUs.

No, it isn't. Nvidia is not offering any mobile Pascal GPUs. Pascal GPUs are simply downclocked in laptops, so the 1070 in a laptop still burns about 120W TDP.

http://www.ultrabookreview.com/10939-laptops-nvidia-1070-1080/

1070 downclocked for mobile is at 120W TDP
1080 downclocked for mobile is at 150W TDP

These nvidia chips suck power like there's no tomorrow.

There is no mobility line in Pascal. If you want a Pascal GPU in Apple's thin-and-light laptops, you will get a desktop Pascal, downclocked.

Meanwhile, AMD offers a mobility line of Polaris, with Polaris 11, which was specifically designed for that job. That is evident in its incredible efficiency, low 35W TDP, and low power draw. Polaris 10 is AMD's desktop chip.

That's why, among many other very good reasons, nvidia is a no-show on Apple equipment.
 

Well, no wonder the Pascal cards make the AMDs look like an even bigger joke. However, I've read reports that during gaming the cards don't reach their peak TDP anyway.

I'd gladly take a proper desktop grade GPU over an AMD... unless you prefer shoddy performance.
 

AMD GPUs make Pascal look like a joke in laptops. One hour of battery life on that "dream machine" of yours? LOL, if you're lucky. You might as well just have a desktop with that Pascal GPU, because that machine is going to be permanently plugged into the wall socket, with a third-party cooling pad under the laptop so it won't melt.

Pascal is useless in MacBook Pros and a complete laughing stock. I bet you'd like some Xeons in the MacBook Pro as well. No, Xeons are useless too, as are any other desktop CPUs. Except in Nvidia la-la land, apparently.

Meanwhile, in reality, Apple has chosen not to buy Nvidia since 2012 or so. Hint: that's because Nvidia is as overpriced as it is power-hungry. It's just a bad deal.

One that Apple is not interested in, which is why they go with the best option and have for some time: AMD.
 

This is a portable, not a luggable. Nvidia people want Nvidia because their applications are optimized for it.

Nvidia is superior if you can plug it into the wall and have plenty of thermal capacity. I mean, Pascal is extremely impressive. The issue is that it is not ready for portables. AMD decided, this time around, to go for efficiency rather than pure GFLOPS.

There is a place for both, but the uproar about the chosen chipset, just because it is not Nvidia, is firmly rooted in Adobe being lame.

I had hoped the iGPU in the 15 would be faster so it would switch less often. I get that that is an Intel issue, but I still wish it had happened.
 
This is what's happening with my 15-inch MBP: a third of the screen is missing, and at some point it starts flickering. So much money for a faulty product. GRRRR
 

Attachments: IMG_9394.JPG, IMG_7875.JPG (photos of the display issue)
The 10.12.2 Sierra update did NOT fix the video problem for me :-(
It's not fixed in 10.12.3 beta either!

MANY older MacBooks have recently been showing a similar video problem with the GPU. It's not hardware!

If it's not the hardware, I guess that's good news (assuming Apple puts in enough resources to ever fix it). Hopefully it doesn't end up like the WiFi issues that took nearly a decade to fix (if they actually fixed it... maybe the hardware just moved on).
 