I don’t know how you guys game on a Mac or PC.
Macs have been such poor gaming machines for so long, or at least have gone through such hills and valleys, that I gave up a long time ago, bought an Xbox One, and called it a day. I do understand that gaming is possible on a Mac and for some even desirable.
I am old...and I have been too busy to play. Also, Killer Instinct stopped at Season 3 and I have been bummed ever since.
CUDA is like Windows, not Intel.
I do not believe it's Apple trying to save the GPU market. They had no problem at all going with Intel when AMD's CPUs were so far behind that AMD's very existence was in question.
That's clearly not the reason.
I didn't say Apple deceived us, but I find it highly unlikely Apple was in the dark about the Vega chips. Apple works with AMD closely... <SNIP>
I don’t know how you guys game on a Mac or PC.
I’ve always been a console guy.
I tried to play Fortnite on my 5K using a keyboard and mouse and I was all out of sorts.
Then again, even when I use an XBONE or a PS4 and controller, I get torched online in shooter games and Madden by what I assume are 13-year-olds.
CUDA is like Windows, not Intel.
Apple only moved to x86 when they had no other choice; they did not care if people could not run Windows properly. They now give you Boot Camp, but no support.
Good, but I wonder if they shouldn't make room for better cooling first - no point in having a best-in-class GPU if it's going to be throttled like the CPUs are.
Also, isn't it a little weird to have this sort of high-end GPU option available with only 4 GB of memory?
11 pages, AND NOBODY posted the right specification for this GPU, and its performance.
I hope it's true, but it seems that:
Radeon Pro Vega 16 could be like a GeForce GTX 470 or GeForce GTX 560 Ti which are 2010-2011 cards.
https://www.techpowerup.com/gpu-specs/radeon-pro-vega-16.c3331
Radeon Pro Vega 20 could be like a GeForce GTX 660, which was released in 2012!
https://www.techpowerup.com/gpu-specs/radeon-pro-vega-20.c3263
They should include (or at least give the option of) Nvidia cards, even if they have to add a fan and a few millimeters to the case.
If Apple offers the same internal design (as I expect), this will be a hot, very hot laptop, and I would not recommend it, since these Vega chips are still on 14 nm.
That’s just bizarre. It’s good that Apple is offering this option now. It makes the MBPro a little more Pro. Simple as that.
So many people complaining that Apple upgraded a model because they recently, or relatively recently, bought one, pre-upgrade.
Are you guys completely new to the world of personal computing?
It's been the same story for 40 years now.
FFS get over it.
The only thing is, I feel sorry for Apple. If they don't update their machines regularly, they get blasted. If they do update their machines, they get blasted.
Vega 20 is on the same performance level as the Quadro P3000 (EXACTLY the same), which is based on the GP106 chip, but much more efficient.
11 pages, AND NOBODY posted the right specification for this GPU, and its performance.
Vega 16: 16 CUs, 1024 GCN5 cores, 1185 MHz core clock, 4 GB HBM2 with 192 GB/s memory bandwidth, 35W TDP power limit; 35% faster than Radeon Pro 560X.
Vega 20: 20 CUs, 1280 GCN5 cores, 1300 MHz core clock, 4 GB HBM2 with 192 GB/s memory bandwidth, 35W TDP power limit; 60% faster than Radeon Pro 560X.
Vega 20 is on the same performance level as the Quadro P3000 (EXACTLY the same), which is based on the GP106 chip, but much more efficient (the Vega Pro 20 has a 35W power limit, while the Quadro P3000 has an 80W power limit). In games, the GPU will be 10-15% slower than a GTX 1060 Max-Q. Vega 16 will be between the GTX 1050 and 1050 Ti, which are as fast as, or faster than, the desktop GTX 1050 Ti, but use more power than the desktop version.
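For what it's worth, a quick back-of-the-envelope on those specs. This is just my own arithmetic from the numbers quoted above, using the standard GCN peak-FP32 formula (shaders × 2 FLOPs × clock); nothing here is an official figure:

```python
# Rough peak-FP32 throughput from the specs quoted above
# (GCN executes 2 FLOPs per shader per clock). Illustrative only.

def peak_tflops(shaders: int, clock_mhz: int) -> float:
    return shaders * 2 * clock_mhz * 1e6 / 1e12

print(f"Vega 16: {peak_tflops(1024, 1185):.2f} TFLOPS")  # ~2.43 TFLOPS
print(f"Vega 20: {peak_tflops(1280, 1300):.2f} TFLOPS")  # ~3.33 TFLOPS
```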
For me it is truly insane, how much BS is spilled over this forum about AMD GPUs.
Tell me, but don't look at brands.
Since you seem to know this as fact, this must mean you're an Apple or AMD engineer. If we go by what you say, then we already have benchmarks...
https://www.techpowerup.com/gpu-specs/quadro-p3000-mobile.c2923
Looking at the graph, the Vega 20 would be 17% faster than a 1050 Ti and 41% slower than a 1060 6 GB. It would be roughly equal to a desktop GTX 780, which was released 5 years ago.
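To put those percentages on one scale (my own arithmetic from the figures above, with the 1060 6 GB set to 100; treat it as a sketch, not a benchmark):

```python
# Normalize the relative-performance claims above to a single scale,
# with the GTX 1060 6 GB pegged at 100. Inputs are the percentages
# from the graph cited above, not new measurements.

gtx_1060 = 100.0
vega_20 = gtx_1060 * (1 - 0.41)   # "41% slower than a 1060 6 GB" -> 59
gtx_1050_ti = vega_20 / 1.17      # "17% faster than a 1050 Ti"   -> ~50

print(f"Vega 20 : {vega_20:.0f}")
print(f"1050 Ti : {gtx_1050_ti:.0f}")
```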
https://www.notebookcheck.pl/NVIDIA-Quadro-P3000.191403.0.html
If they perform like the 1050-1060, it'd be great! I hope so.
P.S. Don't use the techpowerup benchmark list for comparisons. It's mostly wrong.
That ranking is not based on averages from their benchmarks.
How are they wrong? They take hundreds of benchmarks and divide to get the proper averages. If the P3000 is identical in performance, then those numbers should be accurate.
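If it helps the argument, here is a minimal sketch of what that kind of aggregate presumably looks like - averaging each card's performance relative to a baseline across many benchmarks. The numbers and names are made up; I don't know techpowerup's exact methodology:

```python
# Hypothetical aggregate ranking: average relative performance
# across several benchmarks. All figures are invented for illustration.

fps = {
    "card_a": [60, 45, 90],   # per-benchmark results (made up)
    "card_b": [55, 50, 80],
}

relative = [a / b for a, b in zip(fps["card_a"], fps["card_b"])]
average = sum(relative) / len(relative)
print(f"card_a vs card_b: {average:.2f}x on average")
```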
Do you have any evidence to back this up or are you a fan of being sued for libel?
Also, the Quadro P3000, according to Techpowerup, is based on GP104. It's not. It is based on GP106.
https://www.notebookcheck.net/NVIDIA-Quadro-P3000.191075.0.html
The mobile version is GP104; techpowerup is correct. Are you talking about the desktop or the mobile version when you cite Vega 20 and the P3000?
https://videocardz.net/nvidia-quadro-p3000-mobile/
Look at the scores from the benchmarks.
That seems to be the desktop version with 16 gigs of RAM. Are you saying Vega 20 is as fast as the DESKTOP card, or the mobile card? The mobile card has 6 gigs of RAM and the clocks are different.
Look at the scores from the benchmarks.
Vega Pro 20 has a 12405 GPU score in the 3DMark 11 benchmark, and the Quadro P3000 scores 12105 points in that benchmark.
And it's the 6 GB version.
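If those scores and the power limits quoted earlier (35W vs. 80W) are accurate, the efficiency gap is easy to put a number on; again, this is just arithmetic on the figures claimed in this thread, not verified data:

```python
# Points-per-watt from the 3DMark 11 GPU scores and power limits
# quoted in this thread (both taken as claimed, not verified).

vega_20_ppw = 12405 / 35   # ~354 pts/W
p3000_ppw = 12105 / 80     # ~151 pts/W

print(f"Vega 20 : {vega_20_ppw:.0f} pts/W")
print(f"P3000   : {p3000_ppw:.0f} pts/W")
print(f"Ratio   : {vega_20_ppw / p3000_ppw:.1f}x")  # ~2.3x
```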
At least they are updating chips in-cycle for new buyers.
I imagine people who bought in July are not pleased....
I have seen wrong performance data on that site before.
How are they wrong? They take hundreds of benchmarks and divide to get the proper averages. If the P3000 is identical in performance, then those numbers should be accurate.
I could see them doing that, since they seem to be focusing on the Mac a bit more...
True, I am not super excited that we didn't know the release roadmap when I bought mine, but I think I'll be plenty fine with my 560X for quite some time - I didn't have enough time in between Final Cut background renders to even breathe!
Wonder if this is the new strategy - upgrade design/enclosure for the masses, and have a way to upgrade chips for us picky pros. That way, they both win.
Yes, they did the wrong thing before. They don't need to do the wrong thing again.
Not true. Apple HAD a choice. They, however, chose the sensible thing to do and switched to the superior manufacturer. This time, the situation is similar: they should switch GPU manufacturers; it would be the sensible thing to do. But they don't, apparently.
Btw, it actually does not matter if you compare CUDA with Windows or the CPU market situation.
Point being that Apple certainly does not care at all about the GPU market. They did not care about the CPU market either, so why should they? Apple is trying to make money, looooots of money. Maybe they even care about their customers, trying to create better products (as they tend to claim). But no, they certainly have no desire to act as saviors of the GPU market; even the thought seems preposterous (no insult intended). Why would they want to sell inferior products, putting themselves in a bad, bad market position? That's the last thing they'd do.
Whatever the reason is - it seems unreasonable. They intentionally sell inferior products to customers at extremely high prices without a good cause. Very, very, very irrational.
I'm one of those people... That said, I would go back to an iMac. Without an official Apple monitor, it is way clunkier and buggier compared to my old Air on a Thunderbolt Display. I've had to hard-restart this MacBook more times in the last 3 months than I have on all previous Macs I've owned. I only have the issues when connecting to the third-party LG monitor.