It's a shame they don't offer these graphics card choices for the MacBook Pro 13.

Maybe Apple does power management like Kaby Lake G.

I am surprised Apple did not use the Kaby Lake-G chipset in the 13" MBP. Perhaps the Iris Pro gfx perform better?


Yeah, the 13-inch is so far down in performance compared to the 15-inch that it's kind of deceiving on Apple's part to call both machines "pro". There is no way they could fit a Vega-class graphics card in the 13-inch's thermal budget, but surely there is a card out there with performance somewhere between an Intel Iris 650 and a monster Vega 16 that would work in the 13-inch laptop?

Where does the Kaby Lake-G slot in?


I want a 15" MBP with onboard gfx, an i5, and not a jet fighter.

That would be better for mobility while making its pairing with an eGPU more justifiable.
 
Blower-style coolers can be noisy and the card can run hotter, but they push the air to the outside of the case.

You have to check that a card works with a Mac (XFX cards, for example, usually do not because of an incompatible VBIOS).

There are some eGPU enclosure deals around $200. They are much better than what you could get for that money a few years ago, though you may not be able to fit a monster card.

Okay, yes, I gotcha on the blower-style coolers. I was looking at an MSI RX 580 that Newegg had on sale, but the common complaint was that the card ran hot due to the single blower-style fan.

It's a shame that AMD chose that configuration for the Frontier Edition of their Vega 64, as that is the only card I know of that carries 16GB of HBM. I suppose 8GB is enough, and the $1,200 that Sonnet Technology is charging for the card and the enclosure together strikes me as very reasonable, given that the card alone was somewhere north of $1,500-$2,000 about 6 months ago.

I suppose that this card might not work with the Mac; however, it is essentially the same GPU that Apple is putting in the iMac Pro, so compatibility should be Golden™.

I restrict my longings to Sapphire and MSI, as those are advertised and sold by Sonnet Technology, although I believe Asus and Gigabyte should be fine too. I have heard many bad things about XFX, and although Apple recommends the PowerColor DevilBox, I would never buy one due to PowerColor's rather dubious record here in the USA.

The Sonnet eGFX Breakaway Box 350 is $199, but it can only support a subset of cards due to power requirements, and as a result it cannot provide the 85W of charging my 2016 15" MacBook Pro needs. The 550 is a better way to go, but the 650 is the one I want, although even it is not guaranteed to support the 7nm Vega consumer GPUs slated for 2019, if the rumors about power consumption are to be believed.

Herein lies the Achilles' heel of the eGPU market, and why the Blackmagic boxes may fare better: they work right out of the box, while the roll-your-own approach is still sort of the Wild West right now.


PS - Prices for the RX570 and RX580 may go down a bit once AMD formally announces the Radeon RX590.
 
I am surprised Apple did not use the Kaby Lake-G chipset in the 13" MBP. Perhaps the Iris Pro gfx perform better?




Where does the Kaby Lake-G slot in?




That would be better for mobility while making its pairing with an eGPU more justifiable.

Kaby Lake-G is either a 65W TDP (GL) or a 100W TDP (GH) package that would not work inside the 13" MacBook Pro's thermal envelope, given that it already rocks a 28W CPU, has no separate PCH (it's in the CPU package), the two JHL7540 controllers dissipate another 5W combined, plus the NVMe drive at full speed is probably about 3W, all on a 58Wh battery. The 13" MBP would be on fire and last about 4 hours on battery if you were lucky enough for it not to, like...catch on fire.
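The arithmetic above can be sketched out quickly; all of these wattages are the post's rough estimates (not measured figures), and the display, RAM, and other components are ignored:

```python
# Back-of-the-envelope battery math for the 13" MBP power budget described
# above. All wattages are rough estimates from the post, not measurements.
cpu_w = 28.0            # 28 W TDP CPU
tb_controllers_w = 5.0  # two JHL7540 Thunderbolt controllers, combined
nvme_w = 3.0            # NVMe SSD at full speed (estimate)
battery_wh = 58.0       # 13" MBP battery capacity in watt-hours

# Sustained draw from just these components, before display etc.
sustained_draw_w = cpu_w + tb_controllers_w + nvme_w  # 36 W
hours = battery_wh / sustained_draw_w
print(f"{hours:.1f}")   # hours of battery at this sustained draw
```

Even before adding a 65W GPU package, that budget leaves well under two hours of battery at full tilt, which is the point being made.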
 
You were talking about a Vega 64 Frontier.
I am surprised Apple did not use the Kaby Lake-G chipset in the 13" MBP. Perhaps the Iris Pro gfx perform better?
No way. Kaby Lake G is faster than a 1050 Ti.
 
Having paid £2,100 for my 15" last year with a discount, the new borderline-£3k price for the higher-end model, with the upgraded graphics costing £800 more for the same spec (RAM, SSD, etc.)... yeah, the CPU is more powerful now, but I am struggling with the price hike here. The MBP seems to have experienced a far more egregious hike than other models in the line. Might just be me, and as always, to each their own...

I can't find any "price hike" in either USD or EUR. Prices for 2018 MBP are the same as 2017 or 2016 for the respective models. Maybe it's just that GBP has gotten weaker due to Brexit?
 
Louis Rossman is a cheat and a liar.
Yeah, I gotta tell ya, those are pretty strong words. Louis may be a bit high-strung, arrogant if you will, in his appraisal of Apple, but calling him a cheat and a liar needs evidence to back it up. You might want to rephrase that unless you have a link to a civil suit bearing his name and/or business name.

PSA: I am not an attorney or a spokesperson. Nor am I a member of the Louis Rossman Fan Club, but I do enjoy his videos, and I suspect there is more truth in them than some may want to believe.

PPSA: I believe that what you said could be construed as defamation of character.
 
As much as I like that AMD at least tries to keep up with nVidia: for now and the foreseeable future, nVidia is waaay ahead in terms of both efficiency and sheer performance. So AMD GPUs for now are - sorry to say - just the second choice, which is not exactly in line with Apple's pricing and premium-brand claims.

That said, I am aware that for some, the brand logo on top of their GPU just is not important.

For others, like me, nVidia GPUs are essential (which is why I actually sold my MBP 2018 and replaced it with a late 2013 MBP, the last one with nVidia GPUs).

Why, you ask?
I am a computer scientist, programming AI algorithms mostly: machine learning, deep learning, genetic algorithms. In production this software usually runs in the cloud - that is to say, on Linux VM instances - using TensorFlow, Keras, you name it. As a matter of fact, literally each and every one of those frameworks uses CUDA - mostly exclusively - under the hood.
With its "No nVidia" policy, Apple essentially shows pretty much 100% of ML researchers/practitioners (a booming industry at present) the middle finger. There is quite literally no option other than CUDA/nVidia. Just check out the usual suspects - Google, AWS, Azure - none of those cloud providers currently offers AMD GPUs. At present only Alibaba seems to offer an AMD option. Unfortunately, Chinese providers are a controversial choice due to privacy concerns.
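The lock-in described above can be illustrated with a hypothetical sketch of the device-selection fallback most deep learning stacks perform. The function name and logic here are illustrative only, not a real TensorFlow/Keras API:

```python
# Illustrative sketch (NOT a real framework API): mainstream DL frameworks
# probe for an NVIDIA/CUDA device first and otherwise fall back to CPU -
# there is no Metal or OpenCL path in the major training stacks.
def pick_device(available):
    """Return the compute backend a typical DL framework would choose."""
    if "cuda" in available:
        return "cuda"   # nVidia GPU via CUDA: the only accelerated path
    return "cpu"        # an AMD GPU in the box changes nothing here

print(pick_device({"cpu", "cuda"}))  # cuda
print(pick_device({"cpu"}))          # cpu
```

On a Mac with only an AMD GPU, the frameworks behave like the second call: the GPU might as well not exist for training purposes.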

As said, I would very much like to have a choice - that is, an open standard. Alas, there is none (at least none that is competitive).

If Apple continues this policy I'd really HATE to have to switch to some different platform. Windows maybe. Shudder
 
MagSafe has been very much SOLVED by third-party vendors. Some of the solutions are only $10. Get over it.

And the "Emoji Bar" meme is as tired as those who still insist upon using it.

And as for your signature, if Apple doesn't care about the Mac anymore, then why are they running MacBook Air commercials with at LEAST as much frequency as iPhone Xr commercials? Why have they updated FOUR Macs in the past month?

You make no sense.

How much is Apple paying you to make up excuses for them? And isn't this a 'premium' machine? Where does the 'premium' part come in?

And the emoji bar is terrible and it's been criticized since day one of its release and for good reason.

In regards to the Mac: the new Mac Mini. Check and mate.

"You make no sense."

Stop psychologically projecting.
So you know 100% that it's sensational? Recently, I experienced moronic service from Apple for a repair. The "geniuses" are far from being geniuses.

They are geniuses, at making people replace the whole machine for an insane fee. The idiots are the ones accepting this (and also the engineers who designed a riveted keyboard assembly with something like 50 rivets on the MBP).
 
Only if that's your sole reason for buying one. Most people buy a Mac Mini specifically to fulfil a particular role in their computing setup (eg Pro Audio). Adding an eGPU allows one to turn a Mac Mini into a respectable gaming machine, after hours when the work is done.

Macs have been such poor gaming machines for so long, or at least have gone through such hills and valleys, that I gave up a long time ago, bought an Xbox One and called it a day. I do understand that gaming is possible on a Mac and for some even desirable.

I am old...and I have been too busy to play. Also, Killer Instinct stopped at Season 3 and I have been bummed ever since.
 
As much as I like that AMD at least tries to keep up with nVidia: for now and the foreseeable future, nVidia is waaay ahead in terms of both efficiency and sheer performance. So AMD GPUs for now are - sorry to say - just the second choice, which is not exactly in line with Apple's pricing and premium-brand claims.

That said, I am aware that for some, the brand logo on top of their GPU just is not important.

For others, like me, nVidia GPUs are essential (which is why I actually sold my MBP 2018 and replaced it with a late 2013 MBP, the last one with nVidia GPUs).

Why, you ask?
I am a computer scientist, programming AI algorithms mostly: machine learning, deep learning, genetic algorithms. In production this software usually runs in the cloud - that is to say, on Linux VM instances - using TensorFlow, Keras, you name it. As a matter of fact, literally every framework worth mentioning uses CUDA under the hood.
With its "No nVidia" policy, Apple essentially shows pretty much 100% of ML researchers/practitioners (a booming industry at present) the middle finger. There is quite literally no option other than CUDA/nVidia.

As said, I would very much like to have a choice - that is, an open standard. Alas, there is none (at least none that is competitive).

If Apple continues this policy I'd really HATE to have to switch to some different platform. Windows maybe. Shudder

This is the direction they seem to be going, regardless of what the unpaid apologists say. 'Pro' machines that are anything but: no user upgradable parts, last gen CPUs, thermal throttling because they refuse to properly cool the machines, and the computers are almost impossible to service, with everything being soldered on.

The Radeon R9 M395X 4GB GPU is perfectly 'ok' at running the 5K screen on my Late 2015 iMac while AutoCAD and Illustrator are open (I work in vector-based 2D design at the moment for a professional project... it's nowhere near as advanced as what you are doing). However, the moment I plug in another 4K 27" screen and try to, say, use the 'trim' command in AutoCAD and use 'all' to select every line (it's really the fastest way to trim something, IMHO), it starts to get graphically wonky.

When I bought my iMac there was no iMac Pro AND this was the newest Pro-like machine Apple offered... I would have def bought a real Pro machine if they offered one (I didn't want the already outdated Trashintosh).
 
Anyone think they’ll come out with a new MBP design by next summer? So let’s say a spring update.

Debating whether to buy a high-end MBP now or wait for a redesign, since I’m not a fan of the keyboard.
I don’t see any new CPUs
I’m referring to the discrete graphics. I thought that was pretty clear.
 
I am seriously puzzled as to why Apple's got this feud with nVidia.

Yes, there were issues with some GPUs on Macbook Pros. Just as there were issues with AMD GPUs on said MBPs. And there's no doubt nVidia for now is ahead by a rather substantial margin.
So, in technical terms, AMD GPUs in top-of-the-line products just don't make any sense. Using AMD, Apple can only lose; as long as they stick with Radeons, it's easy to predict they will come second in every test against Dell XPSs and other competitors.
nVidia is also more efficient, which should please the "slimmer, slimmer & quieter" zealots around Sir Ive. So what on earth is going on there?
 
How much is Apple paying you to make up excuses for them? And isn't this a 'premium' machine? Where does the 'premium' part come in?

And the emoji bar is terrible and it's been criticized since day one of its release and for good reason.

In regards to the Mac: the new Mac Mini. Check and mate.

"You make no sense."

Stop psychologically projecting.

They are geniuses, at making people replace the whole machine for an insane fee. The idiots are the ones accepting this (and also the engineers who designed a riveted keyboard assembly with something like 50 rivets on the MBP).
My screen went on my MacBook Pro, so I tested it by shining a torch through the Apple logo and connecting it to a monitor with an HDMI cable. I could see the cursor, so it was clearly a backlight issue. After days of Apple saying they were ordering in the parts without telling me the exact issue, they said the “genius” thought the backlight had gone, so they were going to replace the whole screen. The only test they had done was exactly what I had done myself. I asked if it made sense to find out exactly what had caused the issue instead of just replacing the whole screen, thinking it could be a cheap component. They said they couldn’t do that and had to replace the whole screen. After doing exactly that, it still didn’t work, and it turned out to be the logic board. It was probably just a loose connection, but all they did was replace the entire thing. At the end of the day, it’s a retail job. I have nothing against retail jobs or the people who work at Apple stores, but Apple completely overhypes their expertise.
 
That’s just bizarre. It’s good that Apple is offering this option now. It makes the MBPro a little more Pro. Simple as that.
Bizarre? If they can make us wait 4 years for a Mac mini, they could make us wait a few months for a faster gfx option (or a year for next year's gfx option).
 
My screen went on my MacBook Pro, so I tested it by shining a torch through the Apple logo and connecting it to a monitor with an HDMI cable. I could see the cursor, so it was clearly a backlight issue. After days of Apple saying they were ordering in the parts without telling me the exact issue, they said the “genius” thought the backlight had gone, so they were going to replace the whole screen. The only test they had done was exactly what I had done myself. I asked if it made sense to find out exactly what had caused the issue instead of just replacing the whole screen, thinking it could be a cheap component. They said they couldn’t do that and had to replace the whole screen. After doing exactly that, it still didn’t work, and it turned out to be the logic board. It was probably just a loose connection, but all they did was replace the entire thing. At the end of the day, it’s a retail job. I have nothing against retail jobs or the people who work at Apple stores, but Apple completely overhypes their expertise.

Every time I ended up speaking to the 'top' genius and they saw that I knew WTF I was talking about (or at least knew more than your average Apple product consumer in 2018), they didn't treat me like a child, and often, with head nods, agreed that certain things were ridiculous re: company policy.

For example: not officially allowing TB2 Macs to use eGPUs, for no reason whatsoever. Everything I have read about eGPUs connected via TB3 vs TB2 shows that TB2 is often the same speed or just slightly slower - and by slightly I mean negligible in real-world computing. So why turn it off? So you buy a computer with TB3, since Apple no longer makes a Mac product where you can add in a new card: https://www.amazon.com/Sonnet-Thund..._rd_t=40701&psc=1&refRID=2SBFHQHNXG141PES0FQN
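For a quick sanity check on the link rates involved: these are the nominal Thunderbolt link speeds only, and real-world eGPU throughput is lower on both due to protocol overhead (which is partly why the practical TB2-vs-TB3 gap can be smaller than the headline numbers suggest):

```python
# Nominal Thunderbolt link rates (raw, before protocol overhead).
TB2_GBIT = 20   # Thunderbolt 2: 20 Gbit/s
TB3_GBIT = 40   # Thunderbolt 3: 40 Gbit/s

def gbit_to_gbyte(gbit):
    """Convert gigabits/s to gigabytes/s (8 bits per byte)."""
    return gbit / 8

print(gbit_to_gbyte(TB2_GBIT))  # 2.5 GB/s nominal for TB2
print(gbit_to_gbyte(TB3_GBIT))  # 5.0 GB/s nominal for TB3
```

On paper TB3 is 2x TB2, but since a GPU spends most of a frame working out of its own VRAM rather than streaming over the link, the real-world difference in many games and apps is far smaller, which is the complaint above.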
 
I am seriously puzzled as to why Apple's got this feud with nVidia.
I don't see it as Apple having a feud with NVIDIA, as much as trying to protect the GPU market.

You explained clearly the problem of being locked in to NVIDIA. The community created it themselves.
 
I do not believe it's Apple trying to save the GPU market. They had no problem at all going with Intel when AMD CPUs were so far behind that AMD's very existence was in question.

That's clearly not the reason
 