
Do you realize that there is NO sign of Maxwell GPUs in any OS X build?

...

AMD fanatic detected! =) As I said before, I doubt Macs will go Nvidia again, so don't stress yourself. I'm not going to answer the rest of the rhetorical questions, but if you don't mind, next time share some sources so we can all check the facts you write.

And you are right, AMD has better OpenCL performance and may be more useful for Apple's Metal API and the upcoming Vulkan. But Nvidia also has CUDA, which benefits Adobe and Autodesk software, so it's fine too.
 
I love this. If you don't have a clue about what you are writing, and someone points it out, you are a brand fanatic. Great, especially since I am an Nvidia fan.

I have already given enough links to reviews on this forum and argued enough about AMD vs. Nvidia. I will not go into it again.

All I want to say is this: AMD and Nvidia are two completely different architectures. Neither is strictly better or worse. They approach compute differently, and the benefits only come from software that exploits it: CUDA, GameWorks, DirectX 12, etc.
 

lol, everything I said was benchmarks and data from NotebookCheck and TechPowerUp; you are just a random internet dude being impolite and getting too serious about a third-party conversation. That's why I ask for sources.

PS: this is the SKYLAKE thread. We should stop the off-topic.
 
Is 250 W an accurate TDP for the M395X, in your mind? No, I'm not impolite if I'm pointing out that benchmarks differ from real-world usage.

I genuinely suggest comparing the numbers from the GPUs' release dates and looking at how performance improved on both Nvidia and AMD. You may be staggered by how it looks in reality. 3DMark scores are meaningless in this case. The best example is the R9 290X vs. the GTX 780 Ti, GTX 970, and GTX 980.

Now the R9 290X trades blows with the GTX 980, whereas at launch the GTX 780 Ti was faster than the 290X. It's the case with every AMD GPU: over time it holds up better in real-world usage than Nvidia hardware.

Edit: when compute is added to games, the R9 390X will be much closer to GTX 980 Ti performance because of how much raw compute horsepower it has under the hood. But we may have to wait months for that, because game engines must be written from the ground up for DX12.

There is plenty of data about this on the internet. I have posted enough of it on this forum myself; you can search for it.
 
The real question is which one runs our most commonly used apps better; that's what processors are for.

As for benchmarks, they used to measure that, but now they are more about showing off a score.
 
I missed the M, sorry; the benchmarks are from the 970M according to NotebookCheck. The AMD TDP seems too high, but I'm also aware of AMD going with higher TDPs to match Nvidia, so I don't know what to think. That information is from TechPowerUp's mobile GPU section.

In order: AMD // 970M // 980M

Fire Strike => 6,819 // 7,486 // 9,697

Cloud Gate => 38,490 // 50,535 // 65,993

3DMark 11 => 8,656 // 9,962 // 12,562

TDP (power) => ? // 100-120 W // 120-130 W
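The gaps in the scores above are easier to read as percentages. A quick back-of-the-envelope sketch (scores exactly as quoted in the post; the "AMD" column is not named there, so no specific chip is assumed):

```python
# Relative performance from the 3DMark scores quoted above.
scores = {
    "Fire Strike": {"AMD": 6819, "970M": 7486, "980M": 9697},
    "Cloud Gate":  {"AMD": 38490, "970M": 50535, "980M": 65993},
    "3DMark 11":   {"AMD": 8656, "970M": 9962, "980M": 12562},
}

for bench, s in scores.items():
    vs_970 = 100 * s["AMD"] / s["970M"]  # AMD score as % of the 970M
    vs_980 = 100 * s["AMD"] / s["980M"]  # AMD score as % of the 980M
    print(f"{bench}: {vs_970:.0f}% of a 970M, {vs_980:.0f}% of a 980M")
```

On these three tests the AMD part lands at roughly 76-91% of a 970M and 58-70% of a 980M, which is why the TDP question matters so much for the comparison.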
https://www.techpowerup.com/gpudb/2742/radeon-r9-m395x.html
TDP: 250 W

The power budget for the dGPU on a 15" rMBP is around 40W. None of these come anywhere close to being suitable.
 

We were talking about a hypothetical dGPU for the iMac! =( The rMBP only uses the M370X.


Well, if you insist on quoting me as being arrogant about something after I corrected it in my SECOND post, then you may not be impolite, but something worse.

The M395X, as far as I know, is the top of AMD's mobile dGPU family. Since there are laptops with SLI and CrossFire and chargers of up to 330 W, a 250 W GPU, despite being high, sounded possible to me at that moment for high-end desktop solutions built with mobile GPUs. Isn't the iMac a desktop solution? Well, that was my logic.

Although I would like to stop this off-topic right now, I have to point out that it seems you and I were both wrong, and Nvidia is not OpenCL-weak anymore...

LuxMark v2.0, which is a purely OpenCL-based test:

GTX 980M: 2290 samples
iMac M395X: 1868 samples

Sources:
http://barefeats.com/imac5k15.html
http://www.notebookcheck.net/NVIDIA-GeForce-GTX-980M.126692.0.html

Bye.
 
You know why BareFeats uses LuxMark 2.0? Because it shows Maxwell GPUs beating AMD in OpenCL. LuxMark 3.1, which is newer and more "real-worldish," shows the completely opposite picture.

Compare:
Desktop GTX 970, on which the GTX 980M is based (with 2 CUs disabled): http://www.luxmark.info/node/418 Score: 9375
With the M395X: http://www.geekunivers.com/wp-content/uploads/2015/11/LuxMarkGPU.jpg Score: 9242

So, with a lower core count and a lower core clock, how will the GTX 980M perform in this OpenCL scenario? And the GTX 980M has a higher TDP than the M395X.
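The point of the comparison is how small the gap actually is. A quick calculation from the LuxMark 3.1 scores quoted above (9375 vs. 9242, as listed in the linked sources):

```python
# LuxMark 3.1 scores quoted above: desktop GTX 970 vs. iMac M395X.
gtx970_score = 9375
m395x_score = 9242

# How far the M395X trails the desktop GTX 970, as a percentage.
gap_pct = 100 * (gtx970_score - m395x_score) / gtx970_score
print(f"M395X trails the desktop GTX 970 by {gap_pct:.1f}%")  # ~1.4%
```

A mobile AMD part landing within about 1.4% of a desktop GTX 970 is the basis for the argument that the cut-down, lower-clocked GTX 980M would fall behind in this test.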

Let's end it here.
 
How fast can Bluetooth be?
Not gonna be as fast as the wired one, but old USB 2.0 speeds or something might be good enough.
No way.

USB 2.0: 480 Mbit/s
USB 3.0: 5 Gbit/s
USB 3.1: 10 Gbit/s

You'd have to have some sick future wireless technology to reach 3.1's speed.

Edit: Bluetooth 4.0 reaches 24 Mbit/s. No one uses BT for file transfer anymore.
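The gap is easiest to see as transfer times. A back-of-the-envelope sketch using the link rates quoted above (ideal rates, ignoring protocol overhead, with 1 GB taken as a decimal gigabyte):

```python
# Ideal time to move a 1 GB file at each quoted link rate.
rates_mbit = {
    "Bluetooth 4.0": 24,
    "USB 2.0": 480,
    "USB 3.0": 5_000,
    "USB 3.1": 10_000,
}

file_bits = 8 * 10**9  # 1 GB = 8e9 bits

for name, mbit in rates_mbit.items():
    seconds = file_bits / (mbit * 10**6)
    print(f"{name}: {seconds:.1f} s")
```

Bluetooth 4.0 needs over five minutes for a file USB 3.1 moves in under a second, which is why nobody uses BT for file transfer anymore.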
 
Long-time lurker of this thread, first-time poster... Absolutely gagging for an MBP update. Typing this on my unbelievably robust 2008 unibody MacBook, which is really overdue for replacement. Despite it now seeming slow and heavy, I can't bear to plump for the current offering if an update is around the corner. Feeling foolish for clinging on so long. Please do hurry up, oh Apple!
 
And there is a lot of hardware that could launch at that time: Skylake laptops, a Broadwell-EP Mac Pro, possibly a new Thunderbolt Display. And not to forget the Apple Watch 2. That would make perfect sense of why there was no October event this year.
 
Has Apple ever released two products at once? Wouldn't that put overwhelming pressure on Foxconn (or do they not manufacture the MacBooks)?
 
Haha when pigs fly! ;)
 