
bounou

macrumors 6502
Original poster
Jun 6, 2012
354
110
As in, the GPU that Intel builds into your CPU being better than a $300+ dedicated GPU from AMD or Nvidia?

At what point does Intel stop doubling the power every generation? At the rate they have been going the last few generations, Nvidia and AMD could be in some very real trouble, if they aren't already.
 

Count Blah

macrumors 68040
Jan 6, 2004
3,192
2,748
US of A
Better than a $300 dGPU? Three+ iterations away. A $100 dGPU? As soon as the next gen, but probably 2.
 

priitv8

macrumors 601
Jan 13, 2011
4,038
640
Estonia
This was just discussed here on the forum not long ago. The decisive performance factor between an iGPU and a dGPU is video RAM: an iGPU runs off the system RAM and has to share its bandwidth with the CPU, whereas a dGPU has a wider, faster RAM interface dedicated exclusively to its own use.
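To put rough numbers on the bandwidth point, here is a quick sketch (my own illustrative figures, not from the post above): theoretical peak bandwidth is just bus width times effective transfer rate, and typical dual-channel DDR3 system RAM comes in well under even a midrange card's dedicated GDDR5.

Code:
# Illustrative only: theoretical peak memory bandwidth in GB/s.
# peak = bus width (bits) / 8 * effective transfer rate (MT/s) / 1000
def peak_bandwidth_gbs(bus_width_bits, transfer_rate_mts):
    return bus_width_bits / 8 * transfer_rate_mts / 1000

# iGPU: dual-channel DDR3-1600 system RAM (128-bit), shared with the CPU
print(peak_bandwidth_gbs(128, 1600))  # 25.6 GB/s, minus whatever the CPU uses

# dGPU: 128-bit GDDR5 at 4000 MT/s effective, dedicated to the GPU
print(peak_bandwidth_gbs(128, 4000))  # 64.0 GB/s, all for graphics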
 

dusk007

macrumors 68040
Dec 5, 2009
3,411
104
In theory they already beat quite a few dGPUs. They have basically put the entry-level dGPU segment out of business. But they will not beat a 150W+ desktop GPU for a long time; even their process advantage isn't enough for that.
In any case, I would say the yearly improvement is already slowing. Ivy Bridge to Haswell was mostly the same architecture, just bigger and with better drivers. Broadwell will be less of an improvement. The next bigger step will need some faster memory access.
Once HMC (Hybrid Memory Cube) is out and available, they might push most of the dGPU competition out of notebooks. At the same time, quite fast ARM chips will be available too, AMD has its own APUs, and Nvidia might have some faster-tuned SoC for thin notebooks that leverages the SteamOS Linux gaming push.

Eventually dGPUs will simply die out in the mobile space. On desktops they will last longer, because there is less competition in the high-performance, high-TDP space. Only SoCs (CPU + GPU + a ton of other stuff) from various sources will be left. First the mainstream will switch (which has more or less already happened); then only some gaming notebooks and workstations will be left, and at some point those will go too.
Intel, Nvidia, and AMD are all preparing for that. Possibly someone like Qualcomm or Apple will have their own licensed-IP-based SoCs that push into the somewhat higher-performance space of 10W+. I am quite sure Qualcomm is aiming for that. Apple, I think, will be a little more conservative: watch each new GPU architecture and eventually decide to scale up and move over, depending on how it compares to Intel.

I wouldn't expect that too soon, though. Apple is currently making its own chips for the huge-volume iPad and iPhone lines. The Mac products are far less important in volume, and Intel isn't going to sleep. I also doubt they move over unless they can move all of the OS X products at once.
Qualcomm, on the other hand, sells a huge number of all kinds of chips. For them it is just a matter of expanding into a new market. As soon as SteamOS fixes the gaming situation on Linux, these kinds of cheap, efficient chips stand a good chance, probably in developing markets first.
 

Starfyre

macrumors 68030
Nov 7, 2010
2,905
1,136
Those dGPU companies aren't going to give up so easily. 3+ iterations for iGPUs means 3+ iterations for dGPUs too! By the time iGPUs catch up, Nvidia will probably have come up with some competing solution of its own to stay in business, maybe even a processor of its own in the process.
 

Doward

macrumors 6502a
Feb 21, 2013
526
8
Current integrated GPUs are faster than discrete graphics of old.

Will we ever see iGPUs faster than same-generation dGPUs? Eventually, perhaps, but for now, no.
 

Asuriyan

macrumors 6502a
Feb 4, 2013
622
23
Indiana
This will never happen in the way that you are hoping for, for the same reason a phone or tablet CPU will never beat a same-generation desktop CPU: physics. Parallel workloads like GPU calculations show a fairly direct correlation between TDP and compute performance, and Intel isn't even playing that game with a chip that shares its power budget and its cooling with the CPU.

That said, mobile computing (laptops, tablets, you name it) is growing more power efficient while still making huge leaps in computational ability, so Intel's solutions are going to keep becoming practical for a larger share of users, in the same way that 80% (conjecture) of home users don't need power beyond what the iPad offers.
 

aiom

macrumors member
Jul 26, 2013
78
0
Faster? Maybe not so soon. But similar performance, yes. The recent iMac benchmarks http://www.anandtech.com/show/7399/...review-iris-pro-driving-an-accurate-display/3 showed that the integrated Iris Pro GPU is very strong.

It doesn't look that way at first glance, but when you dig into the columns you realize that performance depends mostly on resolution. The iGPU falls apart at higher resolutions but performs well at "standard" resolutions. So the iGPU would obviously be crushed if you tried to play at the native Retina resolution, but if you go with the default scaled resolution of 1440x900 (or 1366x768 in the benchmarks) it is mostly on par with a dedicated GPU. And that is quite cool.
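To put rough numbers on that (my own back-of-the-envelope arithmetic, not from the AnandTech article): shading and fill-rate cost scale with pixel count, and the 15" Retina panel pushes several times as many pixels as those benchmark resolutions.

Code:
# Back-of-the-envelope: per-frame pixel load scales with resolution.
def pixels(w, h):
    return w * h

retina = pixels(2880, 1800)  # 15" rMBP native panel
scaled = pixels(1440, 900)   # default "looks like" resolution
bench  = pixels(1366, 768)   # common benchmark resolution

print(retina / scaled)  # 4.0  -> native Retina is 4x the pixels of 1440x900
print(retina / bench)   # ~4.9 -> and about 5x the pixels of 1366x768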

I think this is a case where Apple has created a problem for itself with the Retina displays. Not that anyone needs Retina displays. You could still say a MacBook is not for gaming but for working. Well then, there are people who work professionally with 3D modelling and texturing applications, be it for movies or games. I can see performance problems arising there at full Retina resolution... so what's the point of Retina then?

But wait... maybe I am too hasty... maybe Apple sees this problem: the only logical way to solve it would be to release the iGPU-only MacBook Pro without a Retina display, with an "old" standard-resolution panel instead. Who wants to bet with me? ;)
 