
dallas112678

https://www.theverge.com/2017/5/30/15711298/nvidia-max-q-thin-light-gaming-laptops

Optimizations that allow very thin laptops to hold a GTX 1080 and get roughly 90% of its performance (which equates to roughly 7.9 TFLOPS). We are talking about laptops around 18mm thick and 5 lbs.
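As a quick sanity check on that figure (a back-of-the-envelope sketch; the 2560 cores and ~1733 MHz boost clock are the commonly quoted desktop GTX 1080 specs):

```python
# Rough check of the "~90% of a desktop GTX 1080" claim above.
cores = 2560                   # desktop GTX 1080 CUDA cores
boost_clock_hz = 1733e6        # ~1733 MHz boost clock
flops_per_core_per_clock = 2   # FP32 fused multiply-add counts as 2 FLOPs

desktop_tflops = cores * flops_per_core_per_clock * boost_clock_hz / 1e12
print(round(desktop_tflops, 2))         # ~8.87 TFLOPS for the desktop card
print(round(0.90 * desktop_tflops, 2))  # ~7.99 TFLOPS at 90%, in line with the ~7.9 above
```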

Even nVidia's Volta will be using a smaller process than Vega and therefore be more power efficient.

If Apple sticks with AMD, nobody here can keep mindlessly screaming the "you need a 10 lb brick to have a powerful GPU" excuse.
 
It seems like Apple might have signed a contract with AMD to supply GPUs for their desktops and mobiles for an unspecified amount of time. I think Apple needs to dump it, because choosing AMD has certainly been a bad bet. If it's for 5 years starting in 2013, that contract probably won't be up until next year.

This probably explains why Apple is pushing the next-generation Mac Pro back to late 2018.
 
Apple doesn't make gaming laptops.
macOS isn't a great gaming platform.
Apple doesn't care about making something that will "also work great for games in Bootcamp".

Apple will not get "destroyed" if they don't make a gaming laptop or integrate this type of gaming focused GPU architecture from NVIDIA.

Gaming performance is not Apple's concern and certainly not in their laptops.

If this architecture can greatly increase productivity for their laptop users - maybe... but not if it's going to cost too much or generate too much heat.

We'll have to wait and see on those fronts.
 

VERY much agree. Until Final Cut Pro is rewritten to support CUDA, and support it VERY well, AMD belongs in Macs.

I do not know why people are seriously OBSESSED with NVIDIA. The way people act around here makes it seem like playing even Stardew Valley on a Mac will result in 1 FPS at best JUST BECAUSE it has an AMD card in it. I don't get the obsession with NVIDIA. I use both. I like both.
 
I do not know why people are seriously OBSESSED with NVIDIA

Must be the gaming focus..

My Hackintosh FCPX build got SO much more reliable, stable and fast (at its intended purpose of FCPX) when I stuck with older but natively supported (by macOS out of the box) AMD cards (7970 x2 in there now).

The right tool for the specific job.
 
Even nVidia's Volta will be using a smaller process than Vega and therefore be more power efficient.
It is not obvious that TSMC's "12nm" FinFET process would be denser or more efficient than GloFo's "14nm" FinFET.
 
It's not just gaming anymore. The whole computing world is moving towards GPUs, from deep learning to AI to whatever demanding calculations come next. Nvidia is ruling that part of the market thanks to their much better offering.

Max-Q looks awesome and Apple should do it too.
Lower price and Max-Q, those are my high hopes.

I can live without the touch bar, doesn't really seem that useful.
 
You do understand that 50% of a 1080's TDP is still 90 watts... that's way over the thermal envelope these MBPs are allowed for the GPU (in the 30-40 watt range).

Do you understand that the TDP limit you are talking about is not a constraint imposed on Apple by some foreign superpower? It is just a dumb design choice...
 
Do you understand that the TDP limit you are talking about is not a constraint imposed on Apple by some foreign superpower? It is just a dumb design choice...

As far as we are concerned, it's enforced by a supreme being that has unbending control over Apple, and that will never change. So it's silly to discuss things like why Apple is not sticking "X" into their "Y" machine when their design mentality never has and never will allow it.

When AMD or Nvidia designs a 35W TDP GPU that has the power of a 1080, that's when they'll include it.
 
Apple has drawn its line in the sand; you won't be seeing any MacBook Pro with a decent dGPU, as the power supply is limited to 100W thanks to USB-C charging. So 35W-40W is the best expectation, while others can get a GTX 1080 to work in an 18mm chassis. As a long-term user of the Mac, Apple & innovation in 2017 equals oil & water. From a professional perspective of over 20 years: goodbye, Apple...
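For context on that 100W figure, a rough sketch of where the ceiling comes from (assuming the USB Power Delivery maximum of 20 V at 5 A; the 87 W adapter is the one Apple ships with the 15" model):

```python
# USB Power Delivery tops out at 20 V x 5 A over a single USB-C port.
usb_pd_max_w = 20.0 * 5.0
print(usb_pd_max_w)  # -> 100.0 W, the ceiling for charging over USB-C

# For comparison, Apple's 15" MacBook Pro ships with an 87 W USB-C adapter,
# already close to that limit.
```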

Q-6
 
Relative to the number of machines sold in the world, the number of high-end GPUs such as the GTX 1080/1070 sold is very small.

Nvidia and AMD make their money on the mainstream GPUs that sell in much higher numbers, because that's what the vast majority of people need (as opposed to want).

Intel is the biggest GPU vendor on Steam.

Quite a large percentage of the world can get by more than adequately on the GPUs that Apple uses. Apple seems more than content to supply the smaller market (and make the margins) rather than try to be all things to all people.

Same with Android vs iPhones. Android outsells iPhones but Apple makes more money still.
 
VERY much agree. Until Final Cut Pro is rewritten to support CUDA, and support it VERY well, AMD belongs in Macs.

I do not know why people are seriously OBSESSED with NVIDIA. The way people act around here makes it seem like playing even Stardew Valley on a Mac will result in 1 FPS at best JUST BECAUSE it has an AMD card in it. I don't get the obsession with NVIDIA. I use both. I like both.

The "obsession" with nVidia is that you can now get a GPU more than 4x as powerful as what is in the MBP, in a laptop approximately the size of a MBP.

This is an obsession because, believe it or not, GPUs are not only for games, and there is significantly more support for nVidia cards in engineering and design programs.

In other words, significantly more power and significantly more support. You'd have to be drinking some serious Apple kool-aid to think AMD is the best option.
 
You'd have to be drinking some serious Apple kool-aid to think AMD is the best option.

Power consumption, heat and cost are what matter to Apple on the laptops.

We are all simply commenting from the angle of what Apple is known to prioritize with GPU selection on their laptops.
 
Apple has drawn its line in the sand; you won't be seeing any MacBook Pro with a decent dGPU, as the power supply is limited to 100W thanks to USB-C charging. So 35W-40W is the best expectation, while others can get a GTX 1080 to work in an 18mm chassis. As a long-term user of the Mac, Apple & innovation in 2017 equals oil & water. From a professional perspective of over 20 years: goodbye, Apple...

Q-6

No, not thanks to USB-C. Thanks to the law.
The "obsession" with nVidia is that you can now get a GPU more than 4x as powerful as what is in the MBP, in a laptop approximately the size of a MBP.

This is an obsession because, believe it or not, GPUs are not only for games, and there is significantly more support for nVidia cards in engineering and design programs.

In other words, significantly more power and significantly more support. You'd have to be drinking some serious Apple kool-aid to think AMD is the best option.

Nope. A GTX 1080 Ti does not beat the AMD RX 480 in Final Cut Pro. In fact, in some tests, it is shameful how poorly the GTX 1080 Ti compares.
 
Half of the TDP of a GTX 1080 is still 90W, which is comparable to the entire TDP of any laptop Apple has ever produced. And besides, aren't we already getting the same thing as what this Max-Q is supposed to be? 80% of the desktop card's performance at half the TDP (compared to the RX 460).
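Rough numbers behind that, assuming the desktop GTX 1080's commonly quoted 180W board power and a ~35W budget for the dGPU in the 15" MBP:

```python
# Half of a desktop GTX 1080's TDP vs. the MacBook Pro's dGPU thermal budget.
gtx_1080_tdp_w = 180     # commonly quoted desktop GTX 1080 board power
half_tdp_w = gtx_1080_tdp_w / 2
mbp_dgpu_budget_w = 35   # rough budget for the Radeon Pro 4xx in the 15" MBP

print(half_tdp_w)                                # -> 90.0 W
print(round(half_tdp_w / mbp_dgpu_budget_w, 1))  # -> ~2.6x the MBP's dGPU budget
```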
 
Half of the TDP of a GTX 1080 is still 90W, which is comparable to the entire TDP of any laptop Apple has ever produced. And besides, aren't we already getting the same thing as what this Max-Q is supposed to be? 80% of the desktop card's performance at half the TDP (compared to the RX 460).

I guess as long as it doesn't say NVIDIA people will complain.
 
BTW, I've just seen that the TDP of a Max-Q GTX 1080 is still going to be 110W. It's fairly impressive and similar to what AMD has achieved with the Pro 4*0 line, but way out of the MBP's thermal budget. And the Max-Q GTX 1060 is also out of the question, as its TDP is still 70W.
 
Apple has drawn its line in the sand; you won't be seeing any MacBook Pro with a decent dGPU, as the power supply is limited to 100W thanks to USB-C charging. So 35W-40W is the best expectation, while others can get a GTX 1080 to work in an 18mm chassis. As a long-term user of the Mac, Apple & innovation in 2017 equals oil & water...

Q-6
No, not thanks to USB-C. Thanks to the law.

Wrong. The legal limit for carry-on is not the same as the 100W power supply limitation of USB-C. You could legally carry on a top-of-the-line gaming notebook with over a 300W power supply, as long as the internal battery meets the requirements...

Apple has simply neutered the MBP for the foreseeable future, fundamentally for ease of manufacture and aesthetics...
 
Guess the point is that if you look at the new Max-Q laptops and their TDP envelopes, it makes one wonder whether Apple could do the same in their thin laptops.
But I guess the fans will always complain unless it's something Apple is already doing.
 
Since every MacBook Pro and the upcoming Macs will now support eGPUs, I don't see the need to put a louder and less efficient dGPU into a thin Mac.
If you want Nvidia, now you can do it easily with an eGPU over TB3.
And BTW, that GTX 1080 Max-Q would only fit in the iMac enclosure.
 
Wrong. The legal limit for carry-on is not the same as the 100W power supply limitation of USB-C. You could legally carry on a top-of-the-line gaming notebook with over a 300W power supply, as long as the internal battery meets the requirements...

Apple has simply neutered the MBP for the foreseeable future, fundamentally for ease of manufacture and aesthetics...

Look at the FAA. Up to 100 watts. Apple didn't go from >100 watts to <100 watts JUST BECAUSE of USB-C. Which Apple laptop had over 100 watts before USB-C?

Passengers may carry all consumer-sized lithium ion batteries (up to 100 watt hours per battery). This size covers AA, AAA, cell phone, PDA, camera, camcorder, handheld game, tablet, portable drill, and standard laptop computer batteries. The watt hours (Wh) rating is marked on newer lithium ion batteries and is explained in #3 below. External chargers are also considered to be a battery.
 
Look at the FAA. Up to 100 watts. Apple didn't go from >100 watts to <100 watts JUST BECAUSE of USB-C. Which Apple laptop had over 100 watts before USB-C?
100 watt-hours of battery capacity and more than 100 watts of power delivery are two entirely different things...
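To spell out the units (a quick sketch; the 99 Wh battery and 87 W adapter are just illustrative figures):

```python
# Watt-hours measure stored energy; watts measure the rate of power delivery.
battery_wh = 99.0   # a carry-on-legal battery (FAA cap: 100 Wh per battery)
charger_w = 87.0    # e.g. an 87 W USB-C power adapter

print(round(battery_wh / charger_w, 2))  # -> ~1.14 hours to drain at a constant 87 W
# The FAA caps the battery's energy in Wh; it says nothing about the charger's wattage.
```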

That said, the complement of Moore's law indicates that 20 years from now we'll be getting the same performance from 5 watts that we get from 200+ today... assuming we ditch silicon along the way.
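Taking that claim at face value, a rough calculation of the implied efficiency-doubling rate (only the 200 W and 5 W figures above are assumed):

```python
import math

# If 200 W of today's performance fits into 5 W twenty years from now,
# how often would performance-per-watt need to double?
improvement = 200 / 5               # 40x efficiency gain
doublings = math.log2(improvement)  # ~5.3 doublings over 20 years
print(round(doublings, 1))          # -> 5.3
print(round(20 / doublings, 1))     # -> ~3.8 years per doubling
```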
 
100 watt-hours of battery capacity and more than 100 watts of power delivery are two entirely different things...

That said, the complement of Moore's law indicates that 20 years from now we'll be getting the same performance from 5 watts that we get from 200+ today... assuming we ditch silicon along the way.

I was replying to what he said. His argument was about the power supply, which is in line with the FAA regulations; power delivery has nothing to do with this argument.

He stated

Apple has drawn its line in the sand; you won't be seeing any MacBook Pro with a decent dGPU, as the power supply is limited to 100W thanks to USB-C charging.

So USB-C is the only reason we have a 100-watt power supply? No. That is what I was saying; this was the case before USB-C on the laptops.
 