I think you need to open up a 15" 2016-2017 MBP: the amount of room between the batteries is INSANE. I'm talking over 1 cm between each cell. If Apple's focus really was battery life, they need to put their money where their mouth is. And think about what actually matters: say, for instance, 35 watts over 10 minutes compared to 40 watts over 5 minutes. Which one drains the battery less overall?
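A quick back-of-the-envelope answer to that question, using only the hypothetical wattages and durations from the post above (a minimal Python sketch, nothing measured):

```python
# Energy used = average power draw x time. Comparing the two hypothetical
# cases from the post above (numbers are illustrative, not measured).
def energy_wh(watts: float, minutes: float) -> float:
    """Return energy in watt-hours for a given average draw and duration."""
    return watts * minutes / 60.0

case_a = energy_wh(35, 10)   # 35 W sustained for 10 minutes
case_b = energy_wh(40, 5)    # 40 W sustained for 5 minutes

print(f"35 W for 10 min: {case_a:.2f} Wh")   # ~5.83 Wh
print(f"40 W for 5 min:  {case_b:.2f} Wh")   # ~3.33 Wh
# The lower peak-wattage case actually drains more energy here because it
# runs twice as long; peak power and total battery drain are different things.
```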
Gaming laptops are really heavy and still last a stupidly short amount of time on battery. I don't doubt Apple's ability to make a long-lasting laptop.
 
Apple is all about open source (hence OpenCL) and Nvidia is closed source with CUDA.
That's why Apple doesn't want to move on with Nvidia. If Nvidia opened up the source code for CUDA, Apple would 100% switch.
 
Not natively, but with a few workarounds from the good folks over at https://egpu.io/, this functionality can be enabled. I'm currently able to run an Nvidia GTX 1080 via an Akitio Node eGPU enclosure in Bootcamped Windows 10 on my 2014 15" rMBP. It can drive an external display or accelerate the internal Retina display. Totally awesome.

Yes you can, if you implement the workarounds shown over at https://egpu.io/. I'm able to run a GTX 1080 externally from Bootcamped Windows 10 on my 2014 rMBP using tricks from the eGPU.io build guides!

Has anybody tried with Windows 7?
 
I don't get it. What is it with Apple's bias towards AMD? Nvidia kills AMD in GPU rendering and 3D particle simulations. Cinema 4D integrated ProRender and frankly it just sucks.

The upcoming version of Octane (and there is an Octane for C4D) will be running on Metal, Vulkan, and CUDA. They said that when running on Metal there isn't any real difference in speed, even on Nvidia hardware. They figured out their CUDA cross-compiler. They have Octane being accelerated by AMD, Nvidia, and even Intel integrated graphics on the CPUs. They show Mac, iPad, and iPhone demos of Octane running. It's impressive.


I agree ProRender has been disappointing. Very disappointing.



I've been fooling with the free Corona demo in C4D on my Mac. It is CPU-based, and still it is lightning fast even with just my 7700K-based iMac. If you buy Corona, it comes with two render nodes. You could install those on two PCs with, say, the upcoming new Ryzen chips, and use them with Team Render to render on your main Mac's screen.

With GPUs as expensive as they are, having the option to make cheap CPU render farms just got more important.

But damn, do I want Apple and nVidia to get their crap together. I just want to use nVidia for computing without having to kludge everything together with kexts and terminal commands.
 
I'll believe it when I see it. I don't think Apple will do that.

I admit I'm being a bit glib saying that :p

But I can imagine they'd start selling their own GPU chassis in the future, which might blur the line around what a pro machine is, and allow them to upsell a huge swathe of their mobile customers and reap all the profits.

A modular future for many of their devices?
 
Whilst no Nvidia support is a bit frustrating (I use a 1070 in my PC desktop, which I was thinking about sticking in an enclosure for the wife so I could accidentally upgrade to a 1080 Ti), I don't see it as the disaster that some people here seem to think it is. At most price points AMD has a card that can match Nvidia, except when you get to the 1080/Ti level. The only real downside is that AMD cards tend to draw more power and therefore produce more heat, but the electricity cost of that is going to be minimal and the heat shouldn't be an issue if you get a decent enclosure. Anyway, if you bought a Mac with integrated graphics or low-end dedicated graphics, which most of us will have, then any eGPU is gonna be a huge step up!
 
In terms of FP16, the Vega Frontier blows pretty much any Nvidia consumer card out of the water.

Which is why I actively chose AMD rather than Nvidia.
 
Does this mean I could play some AAA games (through Windows) with a GTX 1080 Ti using my 2016 MacBook Pro?

Edit: NM... no Nvidia support :(


My MacBook Pro 2016 + Boot Camp + Akitio Node + 1080 Ti = AAA game heaven. Works perfectly. Be sure to use "refit" to enable the Intel GPU in Windows (while the eGPU is connected) for best results.
 
How does that lag manifest itself? Is it like input latency when looking around in a first-person game? I'm wondering if it's something that can be fixed (or at least improved) in software, or if it takes a move to the next version of Thunderbolt to get better.

There is no lag with the eGPU. It's just that, due to the limited TB3 bandwidth, FPS will drop during seamless level transitions or when textures change too much.
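To make that concrete, here's a rough sketch of why big texture transfers are the bottleneck. It assumes the commonly quoted figure of roughly 22 Gbit/s of Thunderbolt 3 bandwidth being usable for PCIe traffic versus ~128 Gbit/s for a desktop PCIe 3.0 x16 slot, plus an invented 2 GiB texture burst; none of these numbers come from the posts above.

```python
# Back-of-the-envelope: how long a burst of texture data takes to cross
# different links. Bandwidth figures are approximate/commonly quoted, and
# the 2 GiB burst size is an invented example.
GBIT = 1e9 / 8  # bytes per second for each Gbit/s of bandwidth

links = {
    "PCIe 3.0 x16 (desktop)": 128 * GBIT,              # ~16 GB/s
    "Thunderbolt 3 (PCIe-usable portion)": 22 * GBIT,  # ~2.75 GB/s
}

burst_bytes = 2 * 1024**3  # hypothetical 2 GiB of textures streamed at once

for name, bytes_per_sec in links.items():
    ms = burst_bytes / bytes_per_sec * 1000
    print(f"{name}: ~{ms:.0f} ms to transfer")
# The eGPU link takes several times longer for the same burst, which shows up
# as momentary FPS dips during streaming rather than constant input lag.
```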
 
It isn’t just space, though. Batteries are the heaviest part of any laptop. Where do you think the weight loss came from?

And wait a second. They couldn't have used the 1050 mobile in 2016 anyway. Not to mention I'm not finding the whole "only 5 watts higher" claim anywhere else but your post. I'm definitely not finding the 40-watt claim anywhere.

The Pro 555 is in the 2017 MBP, not the 2016. And is it not just space? Look at the 12" MacBook: the tiered battery design maximises capacity by minimising wasted space. Nvidia states the 1050 mobile at 53 watts; add the standard Apple 'underclock' and you are looking at 40-45 watts tops (rough numbers below). I'm finding it hard to believe that you are arguing FOR their Pro 555. It simply doesn't live up to the 'pro' spec anywhere. The CPUs in the lineup are great! But the GPU just lets the laptop down...
Look, for the 2018 variant I'm hoping we might see the MX150 (or its successor) or the 8th-gen Intel with RX Vega M. For a professional product either would be welcome.
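Rough numbers behind that wattage estimate: a minimal sketch, where the 53 W figure is the one quoted in the post above and the underclock percentages are assumptions for illustration, not Apple specs.

```python
# Illustrative only: scale a quoted mobile-GPU TDP by an assumed underclock.
# The 53 W figure comes from the post above; the 15-25% range is a guess,
# not an Apple specification.
quoted_tdp_watts = 53.0

for underclock in (0.15, 0.20, 0.25):
    effective = quoted_tdp_watts * (1 - underclock)
    print(f"{underclock:.0%} underclock -> ~{effective:.0f} W")
# Prints roughly 45 W, 42 W, and 40 W: the "40-45 watts tops" ballpark.
```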
 
Apple is all about open source (hence OpenCL) and Nvidia is closed source with CUDA.
That's why Apple doesn't want to move on with Nvidia. If Nvidia opened up the source code for CUDA, Apple would 100% switch.
No, it's Metal vs CUDA and also Metal vs DirectX. Closed-source or otherwise restricted platforms on both sides. Mainly because whoever was behind OpenCL/GL held things back for so long that both sides gave up. There's no way Apple can win that war on the desktop, but on mobile they probably can, so that will be their bargaining chip.

It's like the CPU wars except that there's nothing analogous to cross-compiling. It sucks.
 
How does this affect a Cheese Grater Mac Pro? Does this mean we can put these graphics cards in our old machines, plug & play now?
You already could. RX580 and certain other AMD cards work great in a cheese grater due to that native support. Of course, you could also do that with Nvidia cards, but you risked having glitches. I've got a GTX660 installed in mine, and it's fast but glitchy.

One issue is there's no boot logo if you don't flash your card for Mac, which is very difficult to do in most cases unless you pay MacVidCards to do it. For some reason, not flashing it also prevents you from using FileVault on an APFS boot drive. So I keep a crappy Apple-blessed GPU around for that. More info on that in the Mac Pro forums here.
 
Unable to use an eGPU on my 2015 4K iMac; it turns out eGPU support requires Thunderbolt 3, whereas the 2015 model only has Thunderbolt 2.

Yet more damning proof of planned obsolescence from Apple.

Disgusting, seeing as the iMac is less than three years old.
Disgusting is the right word, especially since Apple's own bidirectional Thunderbolt adapter works fine under Windows 10 with a TB2 interface. I'm pretty sick of Apple intentionally crippling their products.
 
Apple is all about open source (hence OpenCL) and Nvidia is closed source with CUDA.
That's why Apple doesn't want to move on with Nvidia. If Nvidia opened up the source code for CUDA, Apple would 100% switch.

How about Apple open-sourcing Metal? Or ditching Metal and using Vulkan instead, which is an open standard?
Apple with their proprietary Metal API isn't any better than Nvidia with their proprietary CUDA technology.
 