Apple has monopolistic tendencies, but they like to have a choice of GPU vendor. Feeding the CUDA juggernaut does not help.
All they need to do is support it. Why are you arguing with me? Apple are far worse than Nvidia in terms of being a monopoly and it's incredibly dumb to accuse Nvidia of being the juggernaut.
 
All they need to do is support it. Why are you arguing with me? Apple are far worse than Nvidia in terms of being a monopoly and it's incredibly dumb to accuse Nvidia of being the juggernaut.
You can buy Windows or Android, which have more market share. NVIDIA has a lock on certain applications and many people just buy it blindly.
 
It's a shame they don't offer these graphics card choices for the MacBook Pro 13.
Yeah, the 13in is so far down in performance compared to the 15in that it's kinda deceiving on Apple's part to call both machines "pro". There is no way they could fit a Vega-class graphics card in the 13in's thermal budget, but surely there's a card out there with performance somewhere between the Intel 650 and a monster Vega 16 that would work in the 13in laptop?
 
I use a Radeon RX 580 in a Razer Core X with my 15" MacBook Pro 2017 daily. It's great. I highly suggest going for something like this over the Blackmagic and other premade solutions, since you will want to upgrade your GPU someday, and buying a new enclosure every time doesn't make sense.

I haven't run benchmarks to see if there would be a dramatic difference between internal and external performance, but I can say it runs very well, as I'd expect.

The eGPU keeps the laptop much cooler. I do fairly heavy 3d game development. Using the internal GPU, my laptop gets to 70 degrees or so. Using the eGPU, it sits around 45 degrees instead.

All in all, I strongly recommend eGPUs for Mac users who need graphics power. It solves two of the biggest Mac hardware problems: heat and hardware options.

Honestly, I often wish I got a 13" MBP instead of the 15" - I didn't expect I'd use an eGPU this often, and the 15" seems to have more problems.
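
(Side note, a rough sketch of my own rather than anything from Apple or Razer: if you want to sanity-check that macOS actually sees the external card, the plain OpenCL C API the OS still ships can list whatever GPUs are exposed, so the RX 580 should show up next to the internal chip. Error handling is trimmed, treat it as an illustration only.)

```c
#include <stdio.h>
#ifdef __APPLE__
#include <OpenCL/opencl.h>   /* Apple ships OpenCL as a framework */
#else
#include <CL/cl.h>
#endif

int main(void)
{
    cl_platform_id platform;
    cl_device_id devices[8];
    cl_uint num_devices = 0;

    /* macOS exposes a single OpenCL platform, so one entry is enough. */
    clGetPlatformIDs(1, &platform, NULL);

    /* Ask only for GPU devices; an attached eGPU is listed alongside the internal one. */
    clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 8, devices, &num_devices);

    for (cl_uint i = 0; i < num_devices; i++) {
        char name[256];
        clGetDeviceInfo(devices[i], CL_DEVICE_NAME, sizeof(name), name, NULL);
        printf("GPU %u: %s\n", i, name);
    }
    return 0;
}
```

Build it with something like `clang list_gpus.c -framework OpenCL` (file name is just an example); System Report gives you the same information without any code, of course.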
Mind me asking how loud the Razer Core X case/fans are?

My only bugbear with the MBP is how damn noisy they are when running multiple monitors and I'm hoping a better GPU (Vega internal or eGPU) will alleviate this.
 
You can buy Windows or Android, which have more market share. NVIDIA has a lock on certain applications and many people just buy it blindly.
I'm talking about how aggressive Apple are in terms of their pricing and their views on right to repair. Developers develop purely for Nvidia for performance gains. It's simply moronic to say Nvidia forces developers to create software that exclusively works with their GPUs.

Anyway, let's just agree to disagree because this conversation is going nowhere.
 
I'm talking about how aggressive Apple are in terms of their pricing and their views on right to repair. Developers develop purely for Nvidia for performance gains. It's simply moronic to say Nvidia forces developers to create software that exclusively works with their GPUs.

Anyway, let's just agree to disagree because this conversation is going nowhere.
When did I say NVIDIA forces developers to push their GPUs? It is the same as Windows.

Some developers use NVIDIA proprietary tools because it means less work or they are sponsored.
 
It's simply moronic to say Nvidia forces developers to create software that exclusively works with their GPUs.

It's exactly what Nvidia is doing. It has been actively sabotaging open standards (OpenCL and then later Vulkan, partially) and manipulating developers into adopting its proprietary framework, which, surprise surprise, runs only on Nvidia GPUs. They started innocently enough with Cg, a convenient framework that supported all GPUs, and then, once Cg became popular enough, they migrated it to CUDA, which dropped third-party GPU support.
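
To make the lock-in concrete, here is a toy sketch of my own (nothing from Nvidia's docs): the same trivial vector-add written once in OpenCL C and once in CUDA C. In practice the two kernels live in separate source files; the point is that the first compiles at runtime on whatever vendor's driver is installed, while the second only builds with nvcc and only launches on Nvidia hardware.

```c
/* OpenCL C kernel: compiled at runtime by the installed driver, so the same
   source runs on AMD, Intel and Nvidia GPUs. */
__kernel void vec_add(__global const float *a,
                      __global const float *b,
                      __global float *out)
{
    size_t i = get_global_id(0);
    out[i] = a[i] + b[i];
}

/* CUDA C kernel: functionally identical, but it only builds with Nvidia's
   nvcc and only runs on Nvidia hardware. */
__global__ void vec_add(const float *a, const float *b, float *out)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    out[i] = a[i] + b[i];
}
```

The kernels are the easy part; the host-side code around them diverges far more, which is where most of the switching cost sits once a codebase has standardized on CUDA.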
 
Why should it be?

You paid for the product at that time, at the price it cost at that time. It's like that with everything. At some point you decide to buy or to wait. If you always wait for the next revision or model, you never buy anything :)

I also don't understand why some Apple consumers say a $2,600 product is deprecated or obsolete when the next one comes out with only a few minor changes. It's not like the iPhone X is total crap and stops working now that the XS is out...
What I want to know is how these same people deal with the next model year of their automobiles, especially if it has a more powerful engine or better suspension. I imagine they just move on with their lives, because, well... that's life.

But not when it comes to Apple; then it's pitchforks and torches, and the familiar refrain that "Apple screwed me!"

Mind you, these are probably the same people who were furious that Apple did not release the MacBook Pros at WWDC and screamed bloody murder that Apple was selling out-of-date computers, when we all know those 45W hexacore chips had only been announced and released on April 3rd and were in short supply as OEMs started receiving shipments.

Imagine if Apple had waited until November to ship the hexacore update along with the Vega GPU BTO option... damned if they do, damned if they don't!
 
Why do you prefer the 13.3" over the 15.4"?
It's just a more convenient size. It's lighter and less intrusive in small spaces. I can carry it around with one hand far more comfortably than a 15 inch. Its portability is just better. That's not to say the 15 inch is that much bigger, but the small differences matter to me.
 
It's exactly what Nvidia is doing. It has been actively sabotaging open standards (OpenCL and then later Vulkan, partially) and manipulating developers into adopting its proprietary framework, which, surprise surprise, runs only on Nvidia GPUs. They started innocently enough with Cg, a convenient framework that supported all GPUs, and then, once Cg became popular enough, they migrated it to CUDA, which dropped third-party GPU support.
Nvidia-sponsored games usually have performance issues on AMD because GameWorks is not optimized for Radeon.

AMD-sponsored games usually also run well on NVIDIA from what I have seen.
 
It's exactly what Nvidia is doing. It has been actively sabotaging open standards (OpenCL and then later Vulkan, partially) and manipulating developers into adopting its proprietary framework, which, surprise surprise, runs only on Nvidia GPUs. They started innocently enough with Cg, a convenient framework that supported all GPUs, and then, once Cg became popular enough, they migrated it to CUDA, which dropped third-party GPU support.

The GPP (https://www.anandtech.com/show/12716/nvidia-terminates-geforce-partner-program), the program they tried to institute and then had to drop after being rebuked over the bad PR, is what finally nailed the coffin shut for me with NVIDIA. Don't like competition from AMD or Intel? Well, too bad...
 
If the Vega naming continues to indicate compute unit count, then is the Vega 16 just 16 compute units, same as the Pro 560, but +$250 for HBM2 memory?

The Vega 20 is then... a standalone Vega M GL?
 
I'm good with mine, which I got in August. It's for work, so I only check email, use TeamViewer, Microsoft RDP and Safari, and every now and then fire up my VM. Don't think I need Vega for that.
 
It's exactly what Nvidia is doing. It has been actively sabotaging open standards (OpenCL and then later Vulkan, partially) and manipulating developers into adopting its proprietary framework, which, surprise surprise, runs only on Nvidia GPUs. They started innocently enough with Cg, a convenient framework that supported all GPUs, and then, once Cg became popular enough, they migrated it to CUDA, which dropped third-party GPU support.
Swap the name Nvidia with Apple, and replace the word GPU with API, and the same criticism applies to Apple and their deprecation of OpenCL and OpenGL.
 
If the Vega naming continues to indicate compute unit count, then is the Vega 16 just 16 compute units, same as the Pro 560, but +$250 for HBM2 memory?

The Vega 20 is then... a standalone Vega M GL?
No, "Vega" M is a custom Polaris with HBM.

Vega 20 would be real Vega.

16 Vega CUs would be more powerful than 16 Polaris CUs.
 