Hm. Curious. The official TensorFlow GitHub states TensorFlow is available for CUDA-enabled GPUs. No mention of AMD or ROCm. I know there have been unofficial builds for quite some time, but none of them were considered stable.

The question is: is the standard TensorFlow GPU install ROCm-enabled?
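Since the thread never quite settles this, here is a minimal probe (a sketch, assuming only whatever TensorFlow wheel, if any, is installed; `tf.test.is_built_with_rocm` is guarded with `getattr` because it only exists in some builds):

```python
# Quick probe of an installed TensorFlow build: was it compiled with
# CUDA (the official GPU wheels) or with ROCm (the AMD port)?
# The import is guarded so this degrades gracefully if TF is missing.
def describe_tf_build():
    try:
        import tensorflow as tf
    except ImportError:
        return "tensorflow is not installed"

    cuda = tf.test.is_built_with_cuda()
    # is_built_with_rocm() was only added in later releases; guard it.
    rocm_check = getattr(tf.test, "is_built_with_rocm", None)
    rocm = rocm_check() if rocm_check else False

    if rocm:
        return "built with ROCm"
    if cuda:
        return "built with CUDA"
    return "CPU-only build"

if __name__ == "__main__":
    print(describe_tf_build())
```

On the stock `tensorflow-gpu` pip package of this era, this reports a CUDA build; the ROCm-enabled wheels were distributed separately (as `tensorflow-rocm`).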
They announced ROCm 2.0 at the event on Nov. 6, and more information on the topic is coming soon.
 
Unfortunately, eGPU is not an option when you are traveling or want to get acceleration in Windows without the external monitor.

Agreed. It all depends on your use case. Some people might want to go with both the internal Vega AND an eGPU, but that's certainly going to be the priciest option.
 
They announced ROCm 2.0 at the event on Nov. 6, and more information on the topic is coming soon.

It's a mess. Let's assume ROCm gains momentum. Let's further assume cloud providers offer AMD options. You still have to decide beforehand which platform to develop for, vendor lock-in included. What a mess.

Edit: as it seems, ROCm is only supported on Linux. Not an option for me.
 
I JUST ran across this article while looking for the original "exposé" report I saw.

https://appleinsider.com/articles/1...olicies-are-abusive-but-proof-falls-far-short

That reads like a lengthy paid apology piece that doesn't address anything a DIYer or anyone with product life cycle experience knows.

Every product has its own common failures, and the people who know them best are the ones who made it, since each failure goes through fault analysis into a knowledge base and then on to engineering for correction.

The proper response to the customer should have been either:

1) Offer to send the equipment to the repair center for a diagnostic and quote

2) Be honest: look up the fault in the knowledge base and explain that it could be as simple as a cable replacement or something worse, but that only the repair center can determine which

Anything but falsely claiming water damage and quoting the highest price for gutting the device unnecessarily.
 
Why do you prefer the 13.3" over the 15.4"?

Portability in size (these days more about overall dimensions than weight); battery life is pretty much the same; and since I use a high resolution I never really needed a large 'desk', as I cope with Spaces. I stopped using a desktop/laptop for gaming once the Nintendo GameCube debuted Metroid Prime/Prime 2: Echoes, and before that it was the PS1 with MGS onward up to the PS3. Now it's gaming on my iPhone, since I missed out on the first two years of the PS4; if I bought one now, I might only have a year of being current in games before I'd need to shell out what, $600, for a new console?! No thanks.

The ONLY 14"+ Apple laptops (or any, for that matter) that I really liked were:
PowerBook Pismo
TiBook G4
X1 Carbon 5th generation.

Since the PowerBook G4 12" I've always preferred the smaller notebooks.
 
It's a mess. Let's assume ROCm gains momentum. Let's further assume cloud providers offer AMD options. You still have to decide beforehand which platform to develop for, vendor lock-in included. What a mess.

Edit: as it seems, ROCm is only supported on Linux. Not an option for me.

ROCm is a fantastic idea... it's just very badly implemented.

As you say, it's only available on Linux (which is fair enough IMO, as anyone digging far enough down the rabbit hole to need it is probably okay with that). Its biggest issues are a) how bloody difficult it is to get working and b) how damn buggy it is once it's running.

I've tried several times to get it working on my iMac under Ubuntu (2017, internal 580 + external Vega Frontier). Sometimes I get it working, but only on the internal GPU. Sometimes it works, but after a reboot, it stops working. Sometimes it freezes at boot randomly, forcing me to drop to a root shell to disable the driver.

So far, I'd estimate I've got around 50 hours invested in ROCm, and absolutely nothing to show for it.
 
No, because external SSD storage can always be purchased, and it is quite fast now with either USB 3.1 Gen 2 or Thunderbolt 3, and many people ordered machines with 1-2 TB, which is plenty. There is no substitute for a good internal GPU. An eGPU can be added to any of the 2016 and newer machines regardless of whether they even have a dGPU, but carrying that around is not practical! External SSDs are small and can be thrown in a backpack or notebook bag.

My analogy was iPhone storage sizes.

You can't blame Apple for simply adding another option.


While I have now gotten over it, the fact is that Apple has never before changed the GPU options between two releases. What they offered when the original models dropped is what they had through the entire cycle. This is unprecedented, and I hope it's not the new normal.

They have added new configurations and options between releases before.
Not new high-end graphics options, but still, it isn't anything completely outlandish.

The fact is that two to three years from now, when people want to upgrade and put their machines on eBay, those with the 560X will no longer fetch top dollar. People will only pay the extra cash for Vega 16 and 20. In fact, a 1 TB machine with Vega 20 will probably sell for the same as a 2 TB machine with the 560X. The GPU is that important.

It's similar to the desirability of machines with 32 GB of RAM vs. 16. Two years from now, 16 GB will not be very attractive.
Sure, a 1 TB machine with Vega 20 might sell for the same as a 2 TB one with the 560X. That doesn't make the latter worth any less, though. It just means there will be 2 TB ones with Vega that sell for even more.

Why should it make any difference to this whether Apple announced the Vega options back in August or not?
 
Quick question here - granted, I know AMD chips have had their issues - but has there ever been an Nvidia card in a MBP that didn't have premature failures?

8600 - boot failures due to poor solder joints
9600M - repair extension program
330M - Issues with automatic graphics switching
650M - Repair extension program
750M - might have been okay?

I think it's fair to say that the Nvidia chip failures in the 2008-2012 MacBook Pros likely cost Apple several hundred million dollars in replacements and repairs.

For those wondering why Apple doesn't put Nvidia in its computers: that's the reason.
 
According to Notebookcheck for the Vega 20, "The performance should be on a level with the Radeon RX Vega M GL (also 1,280 shaders) and therefore between a Nvidia GeForce GTX 1050 and 1050 Ti"
Nope. Vega 20 is on the same level as Vega M GH/Quadro P3000.
 
ROCm is a fantastic idea... it's just very badly implemented.

As you say, it's only available on Linux (which is fair enough IMO, as anyone digging far enough down the rabbit hole to need it is probably okay with that). Its biggest issues are a) how bloody difficult it is to get working and b) how damn buggy it is once it's running.

I've tried several times to get it working on my iMac under Ubuntu (2017, internal 580 + external Vega Frontier). Sometimes I get it working, but only on the internal GPU. Sometimes it works, but after a reboot, it stops working. Sometimes it freezes at boot randomly, forcing me to drop to a root shell to disable the driver.

So far, I'd estimate I've got around 50 hours invested in ROCm, and absolutely nothing to show for it.

Interesting. I looked into it a couple of months ago, but I do not recall why I did not consider it further. You probably just answered that.

Too bad it is that complicated to get running, even more so because getting it started somehow is not the only hurdle: it of course also needs to integrate into the app/framework being developed.
As an example, I use the Qt framework extensively. Integrating CUDA into Qt Creator is not exactly straightforward (it took me 3 or 4 days just to get the project file working), and it is even more complicated to get it running on all platforms (Windows, Mac, Linux) - and there are still severe limitations (not Qt-related, but C++-related).

If it is that difficult even to get ROCm started, I cannot see how it could possibly be integrated into a larger, diverse project. I fear production readiness is not due anytime soon.
 
I know these kinds of comparisons aren't exactly fair, but just for a bit of perspective: I bought a 17" Aorus with a GTX 1080, 32 GB of 2666 MHz RAM (upgradeable to 64 GB if I want), an overclockable i7-8850, a 512 GB Samsung 970 Pro SSD (upgradeable in the future), a custom build with upgraded thermal paste, no throttling whatsoever, 0.9 inches thick (pretty thin for this kind of hardware), Thunderbolt 3, USB-C, USB 3, Ethernet, HDMI, 8K DisplayPort output, a super-fast card reader, and a two-year warranty - for $4K even, with taxes.

A maxed-out MBP with a 512 GB SSD and Vega 20 (comparable to a GTX 1050 Ti at best?), no RAM or SSD upgradeability, one port type, a useless Touch Bar, a smaller screen, plus AppleCare+ and taxes: $4,600.

I jumped ship from Windows PCs to Apple about five years ago, and I'll take macOS over bloatware-riddled, glitchy Windows 10 any day of the week... but for people who need that kind of power in a portable format, Apple just doesn't cater to them, and that's their prerogative. I love my 15" 2017 MBP, but to get work done in the field, I'll have to bite the bullet with Windows PCs for now, it seems.
Nope. Vega 20 is on the same level as Vega M GH/Quadro P3000.
According to whom/what source/what reasoning?
 
I know these kinds of comparisons aren't exactly fair, but just for a bit of perspective: I bought a 17" Aorus with a GTX 1080, 32 GB of 2666 MHz RAM (upgradeable to 64 GB if I want), an overclockable i7-8850, a 512 GB Samsung 970 Pro SSD (upgradeable in the future), a custom build with upgraded thermal paste, no throttling whatsoever, 0.9 inches thick (pretty thin for this kind of hardware), Thunderbolt 3, USB-C, USB 3, Ethernet, HDMI, 8K DisplayPort output, a super-fast card reader, and a two-year warranty - for $4K even, with taxes.

A maxed-out MBP with a 512 GB SSD and Vega 20 (comparable to a GTX 1050 Ti at best?), no RAM or SSD upgradeability, one port type, a useless Touch Bar, a smaller screen, plus AppleCare+ and taxes: $4,600.

That maxed-out MBP will weigh 40% as much and be a third less thick. It's also telling that battery life isn't mentioned at all on the Aorus's end. Its battery has more capacity, but nowhere near enough to offset the power draw.

This just isn’t a market segment Apple is particularly interested in.
 
I know these kinds of comparisons aren't exactly fair, but just for a bit of perspective: I bought a 17" Aorus with a GTX 1080, 32 GB of 2666 MHz RAM (upgradeable to 64 GB if I want), an overclockable i7-8850, a 512 GB Samsung 970 Pro SSD (upgradeable in the future), a custom build with upgraded thermal paste, no throttling whatsoever, 0.9 inches thick (pretty thin for this kind of hardware), Thunderbolt 3, USB-C, USB 3, Ethernet, HDMI, 8K DisplayPort output, a super-fast card reader, and a two-year warranty - for $4K even, with taxes.

A maxed-out MBP with a 512 GB SSD and Vega 20 (comparable to a GTX 1050 Ti at best?), no RAM or SSD upgradeability, one port type, a useless Touch Bar, a smaller screen, plus AppleCare+ and taxes: $4,600.

I jumped ship from Windows PCs to Apple about five years ago, and I'll take macOS over bloatware-riddled, glitchy Windows 10 any day of the week... but for people who need that kind of power in a portable format, Apple just doesn't cater to them, and that's their prerogative. I love my 15" 2017 MBP, but to get work done in the field, I'll have to bite the bullet with Windows PCs for now, it seems.
According to whom/what source/what reasoning?
Same shader count, though we don't know the clocks on the Vega 20... I'm guessing it should land between the 1050 Ti and 1060 (take that with a grain of salt).
Some facts and stats here... https://www.notebookcheck.net/AMD-Radeon-Pro-Vega-20-GPU-Graphics-Card.361941.0.html
https://www.notebookcheck.net/NVIDIA-Quadro-P3000.191075.0.html
https://www.notebookcheck.net/NVIDIA-GeForce-GTX-1050-Ti-Notebook.168400.0.html
 
According to whom/what source/what reasoning?
Vega M GL has 1280 GCN cores but only a 1011 MHz turbo clock.
Vega 20 Pro has 1280 GCN cores at a 1.3 GHz core clock.

Here is a 3DMark 11 benchmark for Vega 20: https://www.3dmark.com/compare/3dm11/12875524/3dm11/12886777#
A GPU score of 12405 is on the same level as the Vega M GH and the Quadro P3000, which is based on the GTX 1060's chip but has lower core clocks (1243 MHz).

Overall, Vega 20 Pro should be 10-15% slower than a GTX 1060 Max-Q, but much faster than a GTX 1050 Ti.

The only thing that could bottleneck this GPU at launch is drivers.
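The core counts and clocks above can be turned into rough theoretical FP32 numbers (a back-of-the-envelope sketch; the 2-FLOPs-per-shader-per-clock FMA assumption is the standard one for GCN, and drivers/thermals will dominate real-world results):

```python
# Theoretical single-precision throughput: shaders * 2 FLOPs (FMA) * clock.
def fp32_tflops(shaders: int, clock_ghz: float) -> float:
    return shaders * 2 * clock_ghz / 1000.0

# Figures quoted in the thread.
vega_m_gl = fp32_tflops(1280, 1.011)   # ~2.59 TFLOPS
vega_20   = fp32_tflops(1280, 1.3)     # ~3.33 TFLOPS

print(f"Vega M GL: {vega_m_gl:.2f} TFLOPS")
print(f"Vega 20:   {vega_20:.2f} TFLOPS")
print(f"Vega 20 advantage: {vega_20 / vega_m_gl - 1:.0%}")
```

On those numbers alone Vega 20 has roughly a 29% raw-compute edge over Vega M GL, which is consistent with placing it nearer the Vega M GH / Quadro P3000 tier than the 1050 Ti.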

Same shader count, though we don't know the clocks on the Vega 20... I'm guessing it should land between the 1050 Ti and 1060 (take that with a grain of salt).
Some facts and stats here... https://www.notebookcheck.net/AMD-Radeon-Pro-Vega-20-GPU-Graphics-Card.361941.0.html
https://www.notebookcheck.net/NVIDIA-Quadro-P3000.191075.0.html
https://www.notebookcheck.net/NVIDIA-GeForce-GTX-1050-Ti-Notebook.168400.0.html
We know the clock speeds of both Vega GPUs.

Here is the link: 15-Inch MacBook Pro AMD Radeon Pro Vega Graphics Options Now Available to Order
 
Vega M GL has 1280 GCN cores but only a 1011 MHz turbo clock.
Vega 20 Pro has 1280 GCN cores at a 1.3 GHz core clock.

Here is a 3DMark 11 benchmark for Vega 20: https://www.3dmark.com/compare/3dm11/12875524/3dm11/12886777#
A GPU score of 12405 is on the same level as the Vega M GH and the Quadro P3000, which is based on the GTX 1060's chip but has lower core clocks (1243 MHz).

Overall, Vega 20 Pro should be 10-15% slower than a GTX 1060 Max-Q, but much faster than a GTX 1050 Ti.

The only thing that could bottleneck this GPU at launch is drivers.


We know the clock speeds of both Vega GPUs.

Here is the link: 15-Inch MacBook Pro AMD Radeon Pro Vega Graphics Options Now Available to Order
I wish this were true, but can you give us an official document (AMD or Apple or whatever) that shows the 1.3 GHz core clock of Vega 20? Because I can't find any info on that.
 
I wish this were true, but can you give us an official document (AMD or Apple or whatever) that shows the 1.3 GHz core clock of Vega 20? Because I can't find any info on that.

AnandTech reported it, based on information from AMD: https://www.anandtech.com/show/13532/amds-vega-mobile-lives-vega-pro-20-16-in-november
Vega M GL has 1280 GCN cores but only a 1011 MHz turbo clock.
Vega 20 Pro has 1280 GCN cores at a 1.3 GHz core clock.

Just a quick comment: it's quite likely that Vega M (in Kaby Lake G) is actually Polaris CUs + Vega's memory controller, so it's probably going to be slightly less efficient than "real" Vega. But who knows; benchmarks should be out soonish.
 
Just a quick comment: it's quite likely that Vega M (in Kaby Lake G) is actually Polaris CUs + Vega's memory controller, so it's probably going to be slightly less efficient than "real" Vega. But who knows; benchmarks should be out soonish.
It's not just quite likely - it's 100% correct. Vega M from the Intel SoC is actually Polaris + the Vega memory subsystem. It does not have Rapid Packed Math, for example, and is GFX family 806. Vega 12 (Vega Pro 20) is GFX 904.
 
That reads like a lengthy paid apology piece that doesn't address anything a DIYer or anyone with product life cycle experience knows.

Every product has its own common failures, and the people who know them best are the ones who made it, since each failure goes through fault analysis into a knowledge base and then on to engineering for correction.

The proper response to the customer should have been either:

1) Offer to send the equipment to the repair center for a diagnostic and quote

2) Be honest: look up the fault in the knowledge base and explain that it could be as simple as a cable replacement or something worse, but that only the repair center can determine which

Anything but falsely claiming water damage and quoting the highest price for gutting the device unnecessarily.
What "false claim" of water damage? Even Rossmann noted the triggered moisture sensors. The fact that they didn't dump a cup of water out of the MBP, or have to scrape rust off the screws to remove them, is no guarantee that the MBP wasn't operated outside its humidity limits.

I have created the software and hardware for probably a dozen embedded system-based industrial products, and for several other custom one-off measurement and control systems. I have also, at various times, worked as an electronic bench tech. I am also an electronic hobbyist and DIY-er. Suffice it to say I understand "Product Life Cycle" and "Common Problems".

BTW, I assure you that a folded-back connector finger, on a connector that normally gets plugged in ONCE during manufacture and basically never again, is NOT a "Common Problem". And if you read the comments on the AppleInsider article I linked, you will find several obviously knowledgeable posters who raise EXACTLY the same points I have.
 
That maxed-out MBP will weigh 40% as much and be a third less thick. It's also telling that battery life isn't mentioned at all on the Aorus's end. Its battery has more capacity, but nowhere near enough to offset the power draw.

This just isn’t a market segment Apple is particularly interested in.
All true. For battery life, what I've been doing is using ThrottleStop to create a specific profile that draws far less power for simple tasks like web browsing, word processing, etc. Even then I'll get 5 hours at best, because there's no integrated GPU and therefore no graphics switching (which in a way is actually a good thing, for other reasons). It's heavy, but again, far less heavy than comparable laptops in its class and much less thick. Of course, compared to a 15" MBP, a 17" Aorus is going to look huge :/ A 15" MBP with a GPU like a GTX 1080 and the possibility of 64 GB of RAM would be a dream come true for me, even if it were thicker.
 
It's not just quite likely - it's 100% correct. Vega M from the Intel SoC is actually Polaris + the Vega memory subsystem. It does not have Rapid Packed Math, for example, and is GFX family 806. Vega 12 (Vega Pro 20) is GFX 904.
It is not standard Polaris; it is custom. It has more ROPs, for example.

But it is still only DX 12.0, not 12.1. I see that as the main reason why it shouldn't be called "Vega".
 