Waiting for the 2019 MBP [Merged]

Small, based on the research Apple did when they decided to can the 17" (as well as the 11" Air) - if it was a profitable enough market, they would have created 17" models...


Either you KNOW and have seen the research, or you're pulling stuff out of your @rse mate... the 17" was quite a profitable machine. The problem IMHO seemed to coincide with two things:


1) lack of Retina LCD & powerful GPU

2) abandonment of Pro photo/graphic design segment


The first one was explained, and as of today we do have all the components available, just not in a MacBook Pro shape. The second one is tricky and far more difficult to fathom, especially as Apple continues to invest in Final Cut Pro X and Logic. To me photo/graphic design is an important segment that would have been wonderfully complemented by iPads running some Apple software (they could have bought Pixelmator long ago and improved it, rather than abandoning Aperture). Why they chose to leave this segment is another topic, but if they didn't care about these customers it didn't make sense to create a new laptop for them either.


R2FX wants a gaming laptop with a very decent level of performance, which is the exact point of a 17 inch laptop in this context.


Actually not a gaming laptop - a decent, portable desktop replacement :) I bought my 17" in 2006 along with a 23" ACD to get into digital photography. After some time I went for Aperture 2 (then 3). At work I spent most of my time in Excel, Access and SQL databases crunching numbers, and when time allowed in the evening I would fire up Windows in Boot Camp and play IL-2 with buddies back home (using up all the USB 2 ports for an X52 Pro, TrackIR and CH pedals :) ).


I went the expat route and needed a desktop replacement that I could carry around if needed, and the 17" was perfect for that. In addition, the 17" had better thermals - I lived in the Middle East and saw a friend's 15" MBP getting far hotter and louder than my machine ever did during normal use indoors.


As you pointed out there are things that don't necessarily scale up with larger displays, but a bigger screen certainly does benefit graphic design and photography. Someone mentioned that Apple didn't differentiate the 15" and 17" enough - maybe, but then how does one explain the iPhone Xs and Xs Max, which have identical specs and whose only difference is screen size? At work (mobile telco) I see a roughly 50:50 split of both models. Again, I couldn't care less about the spec difference; I would always prefer the 17" and gladly pay a premium for the screen real estate alone. Apple knows that screen size is important (iPad Pro 10.5" and 12.9"), they just can't be bothered to do something new in the Mac lineup.
 
I don’t know where you got those figures but let’s roll with them.

5/200 is 2.5%. Clearly niche.

So Apple can expect at best 2.5% if we go by PC trends - but probably not even 2.5%, because a lot of 17" users are probably gamers, not professionals. If you are a gamer, you get a Windows machine, period, instead of faffing about with Boot Camp. I'd also bet that gamers have far less disposable income than professional users, so they are also a more price-sensitive market. A lot of 17" laptops are cheap plastic builds too, and Apple isn't interested in that.
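Rough back-of-the-envelope, just re-using the figures quoted in this thread (all of them rough assumptions, not real data):

# Back-of-the-envelope in Python, using the numbers thrown around above (assumptions, not real data)
total_laptops = 200_000_000   # rough annual worldwide laptop sales (figure from this thread)
big_laptops   = 5_000_000     # rough annual 17"+ sales (figure from this thread)
apple_share   = 0.05          # Apple's share of laptop sales (figure from this thread)

niche_share = big_laptops / total_laptops        # 0.025 -> 2.5%
apple_units = total_laptops * apple_share        # ~10 million MacBooks a year
apple_17_at_best = apple_units * niche_share     # ~250,000 units a year, best case

print(f"{niche_share:.1%} of the market; at best ~{apple_17_at_best:,.0f} 17\" MacBooks a year")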

Saying that because 5M 17" unit sales is half of Apple's sales it isn't niche, is like saying "If Apple gets into the laptop market, it can get 200M unit sales"...

Anyway, if you want to argue about it, best you let Apple know how many sales they are missing out on, not me. Something tells me they know more than me or you on the matter, considering they became a trillion dollar company.
And Apple is not a fu***** niche? We are talking about JUST ONE SIZE OF LAPTOPS: 17 inch or bigger. As mentioned, they are more often seen in the professional world, because they deliver very good performance and the CPUs - thanks to adequate cooling and power delivery - do not throttle.

Apple has 5% of global laptop sales. If that is not a niche, I do not know what is.

An Apple-designed 17 inch laptop with current hardware would start at around $3,000-3,200, knowing Apple's price margins. That would be the SOLE reason why it would not sell at all.
I never said they were equal. And my post clearly says laptops will not have the same performance as desktops.

I think you're wrong about the Nvidia 10 series being the same as past generations. They are the reason why gaming and creative professional work are being taken seriously on laptops. The 10 series are clearly better than any mobile 600/700/800 or 900 variant. By leaps and bounds better. Nothing like past generations where you saw a big gap between the mobile and desktop dGPUs.
Nvidia's marketing is working ;). In previous generations, Nvidia mobile GPUs were always one desktop tier down in performance. This time, it's the same. People consider Nvidia mobile GPUs to be as good as desktop ones, however, because of Nvidia's marketing and the perception it creates. The tests on Max-Q GPUs show it: the GTX 1070 Max-Q is as fast as a desktop GTX 1060, the GTX 1080 Max-Q is as fast as a desktop GTX 1070, and the GTX 1060 Max-Q is 3% faster than a desktop GTX 1050 Ti.

Actually not a gaming laptop - a decent, portable desktop replacement :) I bought my 17" in 2006 along with a 23" ACD to get into digital photography. After some time I went for Aperture 2 (then 3). At work I spent most of my time in Excel, Access and SQL databases crunching numbers, and when time allowed in the evening I would fire up Windows in Boot Camp and play IL-2 with buddies back home (using up all the USB 2 ports for an X52 Pro, TrackIR and CH pedals :) ).


I went the expat route and needed a desktop replacement that I could carry around if needed, and the 17" was perfect for that. In addition, the 17" had better thermals - I lived in the Middle East and saw a friend's 15" MBP getting far hotter and louder than my machine ever did during normal use indoors.


As you pointed out there are things that don't necessarily scale up with larger displays, but a bigger screen certainly does benefit graphic design and photography. Someone mentioned that Apple didn't differentiate the 15" and 17" enough - maybe, but then how does one explain the iPhone Xs and Xs Max, which have identical specs and whose only difference is screen size? At work (mobile telco) I see a roughly 50:50 split of both models. Again, I couldn't care less about the spec difference; I would always prefer the 17" and gladly pay a premium for the screen real estate alone. Apple knows that screen size is important (iPad Pro 10.5" and 12.9"), they just can't be bothered to do something new in the Mac lineup.
I always say that if you HAVE TO have an Apple computer and want a desktop replacement:
MacBook Pro 13 inch quad core + external GPU + monitor.

There is nothing else that gives you almost everything: portability and performance.
 
Either you KNOW and have seen the research, or you're pulling stuff out of your @rse mate... the 17" was quite a profitable machine. The problem IMHO seemed to coincide with two things:


1) lack of Retina LCD & powerful GPU

2) abandonment of Pro photo/graphic design segment


The first one was explained, and as of today we do have all the components available, just not in a MacBook Pro shape. The second one is tricky and far more difficult to fathom, especially as Apple continues to invest in Final Cut Pro X and Logic. To me photo/graphic design is an important segment that would have been wonderfully complemented by iPads running some Apple software (they could have bought Pixelmator long ago and improved it, rather than abandoning Aperture). Why they chose to leave this segment is another topic, but if they didn't care about these customers it didn't make sense to create a new laptop for them either.

This makes even less sense than my theory. If it was that popular/profitable, they would have ensured they solved 1), and I don't know what you're talking about regarding 2). The demand is simply not enough for Apple to justify creating it, simple as that. Not sure why people fail to understand this simple concept.

And Apple is not a fu***** niche? We are talking about JUST ONE SIZE OF LAPTOPS: 17 inch or bigger. As mentioned, they are more often seen in the professional world, because they deliver very good performance and the CPUs - thanks to adequate cooling and power delivery - do not throttle.

Apple has 5% of global laptop sales. If that is not a niche, I do not know what is.

An Apple-designed 17 inch laptop with current hardware would start at around $3,000-3,200, knowing Apple's price margins. That would be the SOLE reason why it would not sell at all.

Why would a company which has 5% of global laptop sales invest in a market which is at best 2.5%? I hope you didn't get your maths muddled and think that this 2.5% could be added to their 5% share somehow - because that would be silly, wouldn't it?

Just because you may have a small market doesn't mean you go and create a niche within a niche. I think I mentioned somewhere in this thread it being almost niche³. And if it wouldn't sell due to cost, why the hell would they create it? You are making even less sense than when you rant about AMD/Intel/Nvidia...
 
Nvidia's marketing is working ;). In previous generations, Nvidia mobile GPUs were always one desktop tier down in performance. This time, it's the same. People consider Nvidia mobile GPUs to be as good as desktop ones, however, because of Nvidia's marketing and the perception it creates. The tests on Max-Q GPUs show it: the GTX 1070 Max-Q is as fast as a desktop GTX 1060, the GTX 1080 Max-Q is as fast as a desktop GTX 1070, and the GTX 1060 Max-Q is 3% faster than a desktop GTX 1050 Ti.

You do realize that the GTX 1080/1070/1060/1050 in laptops are the same as the desktop GPUs, just slightly clocked down? Max-Q GPUs are marketed as even further clocked-down versions of the desktop variants that sacrifice some performance for better battery life. Nvidia isn't pretending (or marketing) that the Max-Q variants run faster.

None of Nvidia's previous mobile generations ran as fast as their desktop GPUs. The 970M wasn't even close to being as good as the desktop 960. In most benchmarks the 970M was just slightly better than the 800M variants.

Not sure where you are getting your numbers from. But the 10 series found in laptops have about 90% of the performance of the desktop versions. I'm not an Nvidia fanboy, but that's a big deal. It's not marketing fluff. You can maybe say that about the new RTX GPUs, but the GTX 10 series is a remarkable GPU family for mobile devices. It's the reason why most of us on MR wanted Apple to switch to the Nvidia 10 series in all their machines. They are good for gamers and professionals. AMD still hasn't come out with a GPU that matches the performance of the GTX 1080 without drastically overclocking and increasing the TDP of the GPU.
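To put a number on "about 90%": paper FP32 throughput is roughly cores * 2 FLOPs per clock * boost clock. A quick sketch - the core counts and boost clocks below are approximate published specs quoted from memory, so treat them as assumptions, and remember that sustained clocks in a laptop chassis end up lower:

# Rough paper FP32 throughput; spec numbers are approximate and from memory (assumptions)
def tflops(cores, boost_ghz):
    return cores * 2 * boost_ghz / 1000.0   # 2 FLOPs per core per clock

specs = [
    ("GTX 1080 desktop", 2560, 1.73),
    ("GTX 1080 laptop",  2560, 1.73),   # same chip; real-world sustained clocks are lower
    ("GTX 1070 desktop", 1920, 1.68),
    ("GTX 1070 laptop",  2048, 1.64),   # mobile part actually has more cores, lower clocks
    ("GTX 1060 desktop", 1280, 1.71),
    ("GTX 1060 laptop",  1280, 1.67),
]

for name, cores, clock in specs:
    print(f"{name}: ~{tflops(cores, clock):.1f} TFLOPs on paper")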
 
I’m as multi-window as you’ll get - VMs, IDE, SQL Server, the app itself, etc. However, I really do value portability - and frankly, my workflow doesn’t improve much on a 15” over a 13”; it does improve a bit, but it comes at a portability cost. Perhaps if I was in design/photography I’d appreciate it more? It’s not even just the screen size, but also screen placement. This is why connecting it up with a separate monitor/keyboard/mouse is so desirable, although I understand it’s not always possible.

I don’t have the actual figures, but in my experience the vast majority of laptops in industry have been 13”/14”. This is on dev teams, not in design, so it probably differs elsewhere. A lot of famous programmer blogs that I follow also seem to have very compact laptops as their go-to machines, interestingly!
I guess YMMV, but I just find 1440x900 too limited for having several things on screen at once; it forces you to use Spaces and flick back and forth, rather than just setting out what you need in front of you. With the 15" you can reasonably comfortably set it to 1680x1050 and work away, whereas everything is so minuscule at that resolution on the 13" that it's not comfortable to use for a prolonged period from a normal viewing distance. For example, you can have a Pages document at 125% and see all of it at 1680x1050 on the MacBook Pro, still allowing a good amount of Safari room on the side in Split View, and still all rendered at a reasonable size...
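Rough points math for that Pages + Safari example (a sketch; assumes a US Letter page, A4 is slightly narrower):

# How much horizontal room is left next to a document at 125% zoom (assumption: Letter, not A4)
page_width_pt = 8.5 * 72               # US Letter is 612 pt wide at 100%
doc_width     = page_width_pt * 1.25   # ~765 pt at the 125% zoom mentioned above

for looks_like in (1440, 1680):        # 13" default vs 15" scaled "looks like" widths
    print(f"{looks_like} pt wide screen: ~{looks_like - doc_width:.0f} pt left beside the document")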

This makes even less sense than my theory. If it was that popular/profitable, they would have ensured they solved 1), and I don't know what you're talking about regarding 2). The demand is simply not enough for Apple to justify creating it, simple as that. Not sure why people fail to understand this simple concept.
It’s probably not even that the market isn’t there, rather that it’s more profitable to make two models (13 and 15) to split sales between them than invest further in a third size which will probably mostly cannibalise 15” sales. In this case it looks like Apple is more worried about cannibalising itself than just leaving the 17” fruit on the tree for other manufacturers.
 
Why would a company which has 5% of global laptop sales invest in a market which is at best 2.5%? I hope you didn't get your maths muddled and think that this 2.5% could be added to their 5% share somehow - because that would be silly, wouldn't it?

Just because you may have a small market doesn't mean you go and create a niche within a niche. I think I mentioned somewhere in this thread it being almost niche³. And if it wouldn't sell due to cost, why the hell would they create it? You are making even less sense than when you rant about AMD/Intel/Nvidia...
You do realize that first you said that nobody buys 17 inch laptops, and when proved wrong you suddenly moved the goalposts?

I said precisely, in one of those posts, that knowing Apple's pricing policy, a 17 inch, gaming/pro oriented, powerful laptop would cost a fortune (overpriced) and nobody would buy one. Actually, the current MacBook Pros are overpriced as ****, but everybody buys them.
You do realize that the GTX 1080/1070/1060/1050 in laptops are the same as the desktop GPUs, just slightly clocked down? Max-Q GPUs are marketed as even further clocked-down versions of the desktop variants that sacrifice some performance for better battery life. Nvidia isn't pretending (or marketing) that the Max-Q variants run faster.

None of Nvidia's previous mobile generations ran as fast as their desktop GPUs. The 970M wasn't even close to being as good as the desktop 960. In most benchmarks the 970M was just slightly better than the 800M variants.

Not sure where you are getting your numbers from. But the 10 series found in laptops have about 90% of the performance of the desktop versions. I'm not an Nvidia fanboy, but that's a big deal. It's not marketing fluff. You can maybe say that about the new RTX GPUs, but the GTX 10 series is a remarkable GPU family for mobile devices. It's the reason why most of us on MR wanted Apple to switch to the Nvidia 10 series in all their machines. They are good for gamers and professionals. AMD still hasn't come out with a GPU that matches the performance of the GTX 1080 without drastically overclocking and increasing the TDP of the GPU.
I know perfectly well that Nvidia mobile GPUs are the same as the desktop ones. THEY ALWAYS WERE THE SAME, but downclocked. Max-Q GPUs are, however, much more downclocked than mobile GPUs typically are (you lose 50% of the maximum core clocks compared to the desktop versions). The benchmarks of Max-Q GPUs I have looked at show that the GTX 1070 Max-Q is the same in performance as a desktop GTX 1060, and the GTX 1080 Max-Q is the same as a desktop GTX 1070. Mobile Nvidia GPUs are one desktop SKU down in performance, exactly the same way as they have always been. The GTX 980M (1536 CUDA cores vs 2048 in the desktop GTX 980) was exactly the same performance level as a desktop GTX 970, and the GTX 970M was exactly the same level as a GTX 960.
 
You do realize that the GTX 1080/1070/1060/1050 in laptops are the same as the desktop GPUs, just slightly clocked down? Max-Q GPUs are marketed as even further clocked-down versions of the desktop variants that sacrifice some performance for better battery life. Nvidia isn't pretending (or marketing) that the Max-Q variants run faster.

None of Nvidia's previous mobile generations ran as fast as their desktop GPUs. The 970M wasn't even close to being as good as the desktop 960. In most benchmarks the 970M was just slightly better than the 800M variants.

Not sure where you are getting your numbers from. But the 10 series found in laptops have about 90% of the performance of the desktop versions. I'm not an Nvidia fanboy, but that's a big deal. It's not marketing fluff. You can maybe say that about the new RTX GPUs, but the GTX 10 series is a remarkable GPU family for mobile devices. It's the reason why most of us on MR wanted Apple to switch to the Nvidia 10 series in all their machines. They are good for gamers and professionals. AMD still hasn't come out with a GPU that matches the performance of the GTX 1080 without drastically overclocking and increasing the TDP of the GPU.

For the average desktop card versus a notebook with the same dGPU, it's pretty much a wash these days.
Desktops are always going to have a performance edge, that's common sense; equally, having near-desktop performance in a notebook remains impressive. Unfortunately Apple's focus is solely thin & light, leaving them with only around 35W of TDP to play with for the dGPU, arguably far less now with the advent of the hex-core 8th Gen CPUs.

Personally I'm seeing more and more people with higher needs switch, as Apple simply does not produce anything comparable without a significant performance trade-off.

Q-6
 
You do realize that first you said that nobody buys 17 inch laptops, and when proved wrong you suddenly moved the goalposts?

Let’s stop here - I’m tired of your factually false posts (in general). If you actually read any of my posts instead of frothing at your mouth, you’d realise that

a) I never said no one buys them
b) I acknowledged there are people who would love a 17”.

Please save the BS “moving goalpost” rubbish, it’s insulting.
 
For the average desktop card versus a notebook with the same dGPU, it's pretty much a wash these days.
Desktops are always going to have a performance edge, that's common sense; equally, having near-desktop performance in a notebook remains impressive. Unfortunately Apple's focus is solely thin & light, leaving them with only around 35W of TDP to play with for the dGPU, arguably far less now with the advent of the hex-core 8th Gen CPUs.

Personally I'm seeing more and more people with higher needs switch, as Apple simply does not produce anything comparable without a significant performance trade-off.

Q-6

The video here is proof of what I've been trying to say in my latest posts. I'm just not going to post about this anymore, since it's getting pointless and I'd rather not argue for the sake of arguing.
 
You do realize that the GTX 1080/1070/1060/1050 in laptops are the same as the desktop GPUs, just slightly clocked down? Max-Q GPUs are marketed as even further clocked-down versions of the desktop variants that sacrifice some performance for better battery life. Nvidia isn't pretending (or marketing) that the Max-Q variants run faster.

None of Nvidia's previous mobile generations ran as fast as their desktop GPUs. The 970M wasn't even close to being as good as the desktop 960. In most benchmarks the 970M was just slightly better than the 800M variants.

Not sure where you are getting your numbers from. But the 10 series found in laptops have about 90% of the performance of the desktop versions. I'm not an Nvidia fanboy, but that's a big deal. It's not marketing fluff. You can maybe say that about the new RTX GPUs, but the GTX 10 series is a remarkable GPU family for mobile devices. It's the reason why most of us on MR wanted Apple to switch to the Nvidia 10 series in all their machines. They are good for gamers and professionals. AMD still hasn't come out with a GPU that matches the performance of the GTX 1080 without drastically overclocking and increasing the TDP of the GPU.

You are trying to argue with a die-hard AMD fanboy (who believes AMD is going to release a 15W CPU which beats Intel's 45W CPUs), it's pointless. You are right, modern Nvidia GPUs used in laptops are no longer the joke they used to be.
 
You are trying to argue with a die-hard AMD fanboy (who believes AMD is going to release a 15W CPU which beats Intel's 45W CPUs), it's pointless. You are right, modern Nvidia GPUs used in laptops are no longer the joke they used to be.
Look in the mirror first, then call anyone a fanboy.

You claim that 15W, 7 nm AMD CPUs will not beat 45W Intel CPUs, even if those 15W CPUs will have an 8C/16T cluster and around a 4 GHz maximum turbo clock (not all-core turbo)? ;)

Today some beans were spilled on the upcoming AMD CPUs, from an architectural, technical point of view. All of the 7 nm designs - APUs, AM4, SP4 - will have an 8C/16T cluster. Intel will have a very hard time competing with AMD from a technological point of view, considering that - if the rumors turn out to be true - Zen 2 is 2-5% faster than Skylake/Kaby Lake/Coffee Lake on IPC and can clock as high, but will have more cores and higher efficiency (higher clocks in lower thermal envelopes).
 
Look in the mirror first, then call anyone a fanboy.

You claim that 15W, 7 nm AMD CPUs will not beat 45W Intel CPUs, even if those 15W CPUs will have an 8C/16T cluster and around a 4 GHz maximum turbo clock (not all-core turbo)? ;)

Today some beans were spilled on the upcoming AMD CPUs, from an architectural, technical point of view. All of the 7 nm designs - APUs, AM4, SP4 - will have an 8C/16T cluster. Intel will have a very hard time competing with AMD from a technological point of view, considering that - if the rumors turn out to be true - Zen 2 is 2-5% faster than Skylake/Kaby Lake/Coffee Lake on IPC and can clock as high, but will have more cores and higher efficiency (higher clocks in lower thermal envelopes).

Basically, no substance yet again - but how else would the AMD train survive :rolleyes:. Please come back to me when we have something other than gossip, such as product launches and comparative benchmarks. Not all of us drool at the AMD rumor mill, believe me. If we went by rumors, we would have been on 10nm Cannon Lake chips two years ago...
 
No substance?

https://semiaccurate.com/2018/10/29/more-details-about-amds-rome-cpu-leak/

In our initial reveal we said there were 9 die or more specifically 8+1 dies with 8x 8-core CCXs and one IOX
Which means the cluster is now 8 cores, which will translate into 8-core APUs and at least 8-core CPUs (yes, the Ryzen 3 3200 might be an 8C/8T CPU).

https://twitter.com/BitsAndChipsEng/status/1052194745647165441

Which basically means Zen 2 is 2-5% faster than Skylake/Kaby Lake/Coffee Lake.

Your point about Cannon Lake and rumors is wrong, because it was not rumored that we would get CNL two years ago - it was blatantly the "official" Intel roadmap. The rumors said 10 nm was dead in the water and CNL was dead in the water. What turned out to be true: Intel's official roadmap, or the rumors about it?

(Heck, I can even remember arguing with people on this very subforum, in one of the Waiting for MBP threads, about this 10 nm fiasco. "There is no way Intel would lie to consumers about their product roadmap!1!1!1oneonez")

Charlie was dead right about Intel's 10 nm process woes, so there is no reason not to believe him on the matter of AMD Rome and Matisse (it is the same design), and no reason not to believe Bits and Chips considering they were also 100% right on Zen 1 (Broadwell IPC and max ~4 GHz core clocks, but with an 8C/16T design), and the IPC info comes from the same source as the previous Zen information.

P.S. Why did Charlie post it today? Because there is an AMD event on 6th November, and they will talk a lot about 7 nm products. That is why Charlie had access to this information.
 
No substance?

https://semiaccurate.com/2018/10/29/more-details-about-amds-rome-cpu-leak/


Which means the cluster is now 8 cores, which will translate into 8-core APUs and at least 8-core CPUs (yes, the Ryzen 3 3200 might be an 8C/8T CPU).

https://twitter.com/BitsAndChipsEng/status/1052194745647165441

Which basically means Zen 2 is 2-5% faster than Skylake/Kaby Lake/Coffee Lake.

Your point about Cannon Lake and rumors is wrong, because it was not rumored that we would get CNL two years ago - it was blatantly the "official" Intel roadmap. The rumors said 10 nm was dead in the water and CNL was dead in the water. What turned out to be true: Intel's official roadmap, or the rumors about it?

(Heck, I can even remember arguing with people on this very subforum, in one of the Waiting for MBP threads, about this 10 nm fiasco. "There is no way Intel would lie to consumers about their product roadmap!1!1!1oneonez")

Charlie was dead right about Intel's 10 nm process woes, so there is no reason not to believe him on the matter of AMD Rome and Matisse (it is the same design), and no reason not to believe Bits and Chips considering they were also 100% right on Zen 1 (Broadwell IPC and max ~4 GHz core clocks, but with an 8C/16T design), and the IPC info comes from the same source as the previous Zen information.

P.S. Why did Charlie post it today? Because there is an AMD event on 6th November, and they will talk a lot about 7 nm products. That is why Charlie had access to this information.

I don't care who Charlie is - until these new CPUs (or old Zen :rolleyes:) are used on a massive scale, it means nothing to me. You and I have different opinions on what "substance" is. I don't want "details", I want benchmarks. Wait for them before getting overly excited like a college kid.
 
You do realize that the GTX 1080/1070/1060/1050 in laptops are the same as the desktop GPUs, just slightly clocked down? Max-Q GPUs are marketed as even further clocked-down versions of the desktop variants that sacrifice some performance for better battery life. Nvidia isn't pretending (or marketing) that the Max-Q variants run faster.

None of Nvidia's previous mobile generations ran as fast as their desktop GPUs. The 970M wasn't even close to being as good as the desktop 960. In most benchmarks the 970M was just slightly better than the 800M variants.

Not sure where you are getting your numbers from. But the 10 series found in laptops have about 90% of the performance of the desktop versions. I'm not an Nvidia fanboy, but that's a big deal. It's not marketing fluff. You can maybe say that about the new RTX GPUs, but the GTX 10 series is a remarkable GPU family for mobile devices. It's the reason why most of us on MR wanted Apple to switch to the Nvidia 10 series in all their machines. They are good for gamers and professionals. AMD still hasn't come out with a GPU that matches the performance of the GTX 1080 without drastically overclocking and increasing the TDP of the GPU.

The problem with the RTX is that 10-bit output is not enabled... making it useless for pro apps on an external monitor, like photography or video work.
 
Sometimes you are right by expecting the unexpected: Apple actually did refresh the 2018 MacBook Pros today. Or rather, they added an additional tier with Radeon Pro Vega graphics on top of the existing lineup, which I suppose still counts as a refresh.

Sorry to all the people whose suggestion that there might be another MacBook Pro update this year only four months after the last one I so quickly dismissed. I was wrong.
 
16 and 20 CUs: 1024 and 1280 GCN cores. And yes, it actually is Vega (GCN5) with Rapid Packed Math.
 
Sometimes you are right by expecting the unexpected: Apple actually did refresh the 2018 MacBook Pros today. Or rather, they added an additional tier with Radeon Pro Vega graphics on top of the existing lineup, which I suppose still counts as a refresh.

Sorry to all the people whose suggestion that there might be another MacBook Pro update this year only four months after the last one I so quickly dismissed. I was wrong.
WHAT.

****!? MINE IS ONE MONTH OLD.

That's it, this is the last Apple computer I bought. I'm switching to Cubase and ditching Apple.
 
Might be upgrading my 2016 model to the Vega model next month.

Wanted to get an iMac, but that doesn't seem to be happening. Improved graphics and keyboard would be nice, even if my keyboard has never failed.
 
Sometimes you are right by expecting the unexpected: Apple actually did refresh the 2018 MacBook Pros today. Or rather, they added an additional tier with Radeon Pro Vega graphics on top of the existing lineup, which I suppose still counts as a refresh.

Sorry to all the people whose suggestion that there might be another MacBook Pro update this year only four months after the last one I so quickly dismissed. I was wrong.
Huh, unusual. I guess it's too big an upgrade not to do, but they couldn't really wait any longer to get the 2018s out? Their website suggests up to 60% faster than the 560X, so this is pretty significant (though I'd imagine you should be prepared for it to be priced in kidneys?)
 
Hmmm.

1.3 GHz turbo clock for the 20 CU version (1280 cores), and 1185 MHz for the 1024-core one (16 CU). The TDP target is the same as for Polaris: 35W. For comparison, Polaris 11 in the MBP, with 1024 cores, runs at around a 900 MHz core clock. Vega has more cores and a much higher clock speed. That "up to 60% faster" may be relative to full Polaris performance.

From the Linux drivers it appears to have 5 CUs per shader engine, with 4 shader engines and 8 ROPs per shader engine, totalling 32 ROPs.

3.3 TFLOPs from a 35W TDP. That is one hell of a design. That GPU should be close to the Vega M GH/GTX 1060 Max-Q in ultimate performance.

192 GB/s of bandwidth from a single HBM2 stack. Not bad at all. A 65-75W desktop version of this GPU (1.5-1.6 GHz) would be very close to a desktop GTX 1060.
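For reference, the TFLOPs figure falls straight out of cores * 2 FLOPs per clock * clock speed - a quick sketch using the numbers above:

# FP32 throughput for a GCN GPU: shader cores * 2 FLOPs/clock * clock (GHz) / 1000
def tflops(cores, clock_ghz):
    return cores * 2 * clock_ghz / 1000.0

print(f"Vega 20 (20 CU, 1280 cores @ 1.3 GHz):    ~{tflops(1280, 1.300):.2f} TFLOPs")
print(f"Vega 16 (16 CU, 1024 cores @ 1185 MHz):   ~{tflops(1024, 1.185):.2f} TFLOPs")
print(f"Vega M GH (24 CU, 1536 cores @ 1190 MHz): ~{tflops(1536, 1.190):.2f} TFLOPs")  # for comparison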
 
Huh, unusual. I guess it's too big an upgrade not to do, but they couldn't really wait any longer to get the 2018s out? Their website suggests up to 60% faster than the 560X, so this is pretty significant (though I'd imagine you should be prepared for it to be priced in kidneys?)

This is also a pretty big finger up theirs for recent 2018 buyers...
a) we were beta testing new machines plagued with issues
b) we will be stuck with a 2-year-old architecture.

If I had known one month ago, when I got mine, that this was an option, I'd have waited... now I can't return it. I was never *happy* with the GPU, I just didn't have the option of a better one. Ugh.

This whole 2018 MBP ride has been a disappointment. I'm really, really unhappy, and I'm considered a fanboy...
 
Hmmm.

1.3 GHz turbo clock for the 20 CU version (1280 cores), and 1185 MHz for the 1024-core one (16 CU). The TDP target is the same as for Polaris: 35W. For comparison, Polaris 11 in the MBP, with 1024 cores, runs at around a 900 MHz core clock. Vega has more cores and a much higher clock speed. That "up to 60% faster" may be relative to full Polaris performance.

From the Linux drivers it appears to have 5 CUs per shader engine, with 4 shader engines and 8 ROPs per shader engine, totalling 32 ROPs.

3.3 TFLOPs from a 35W TDP. That is one hell of a design. That GPU should be close to the Vega M GH/GTX 1060 Max-Q in ultimate performance.

192 GB/s of bandwidth from a single HBM2 stack. Not bad at all. A 65-75W desktop version of this GPU (1.5-1.6 GHz) would be very close to a desktop GTX 1060.
A few comparisons:

Vega M GH. 24 CUs: 1536 GCN4 cores, 1190 MHz core clock, 4 GB HBM2, 55W TDP, 3.6 TFLOPs.
Vega Pro 20: 20 CUs, 1280 GCN5 cores, 1300 MHz core clock, 4 GB HBM2, 35W TDP, 3.3 TFLOPs.

Vega 20 should be slightly faster.

AMD! We need this on desktop!
 