
dlb253 - macrumors member, Original poster - Apr 13, 2010 - Arizona
From what I understand, Intel blocked Nvidia from making chipsets for the iX CPUs, so Nvidia had to provide discrete GPUs in addition to Intel's integrated graphics (on the 15" & 17"). On the 13", however, Nvidia integrated graphics are used, since Nvidia is still allowed to make integrated chipsets for the C2Ds. Is that correct?

Can someone explain to me why Intel integrated graphics are bad? Why is Nvidia so much better?
 
From what I understand, Intel blocked Nvidia from making chipsets for the iX CPUs, so Nvidia had to provide discrete GPUs in addition to Intel's integrated graphics (on the 15" & 17"). On the 13", however, Nvidia integrated graphics are used, since Nvidia is still allowed to make integrated chipsets for the C2Ds. Is that correct?

Can someone explain to me why Intel integrated graphics are bad? Why is Nvidia so much better?

You are correct.

They're bad because Intel doesn't put enough R&D into their graphics chips. They design them to be "just good enough" to get by, while Nvidia specializes in graphics chips and therefore makes a better product.
 
Your assumption is correct. Intel has no agreement with Nvidia that would let Nvidia provide chipsets and IGPs for the i-Core CPUs.

As for why Intel is worse than Nvidia at graphics: Nvidia simply has more experience. It has been in the GPU market far longer than Intel, which only entered that market in this millennium, as far as I recall.
 
So why even bother with Intel integrated graphics? Why not just have iX CPUs and an Nvidia discrete GPU? Why have both discrete AND Intel integrated?

Is it like a package deal with Intel or something? If you buy an iX, do you have to get the integrated graphics too? Or do computers NEED some form of integrated graphics?
 
You are correct.

They're bad because Intel doesn't put enough R&D into their graphics chips. They design them to be "just good enough" to get by, while Nvidia specializes in graphics chips and therefore makes a better product.


That's part of why Intel is playing nasty now when it comes to chipsets. They want to make a more serious move into the GPU arena, so they feel the best way to do that is to restrict the other GPU makers' moves.
 
So why even bother with Intel integrated graphics? Why not just have iX CPUs and an Nvidia discrete GPU? Why have both discrete AND Intel integrated?

Is it like a package deal with Intel or something? If you buy an iX, do you have to get the integrated graphics too? Or do computers NEED some form of integrated graphics?

It basically boils down to Intel being a big baby as per usual.

That's part of why Intel is playing nasty now when it comes to chipsets. They want to make a more serious move into the GPU arena, so they feel the best way to do that is to restrict the other GPU makers' moves.

And they're right.
 
So why even bother with Intel integrated graphics? Why not just have iX CPUs and an Nvidia discrete GPU? Why have both discrete AND Intel integrated?

Is it like a package deal with Intel or something? If you buy an iX, do you have to get the integrated graphics too? Or do computers NEED some form of integrated graphics?

I don't believe Intel is selling iX CPUs without the Intel integrated graphics; that's why.
 
So why even bother with Intel integrated graphics? Why not just have iX CPUs and an Nvidia discrete GPU? Why have both discrete AND Intel integrated?

Is it like a package deal with Intel or something? If you buy an iX, do you have to get the integrated graphics too? Or do computers NEED some form of integrated graphics?

With the i-series chips, the Intel integrated graphics solution is on the chip, meaning you can't separate them. The best you can do is disable the Intel solution.
 
It basically boils down to Intel being a big baby as per usual.


And they're right.

So true, look at Apple with HTC. Apple wants to be the only smartphone player, so when someone threatens their superiority, they try to stomp them out.
 
With the i-series chips, the Intel integrated graphics solution is on the chip, meaning you can't separate them. The best you can do is disable the Intel solution.

That's the reasoning behind it! You can't get rid of it - there is no choice!

I think it's good anyway: the new MBPs can now choose which GPU to use on the fly, so battery life will always be optimised, as will performance. Best way to go IMO. It's just like those new cars that choose the cylinders for you - except with two motors :rolleyes: :confused: :p
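For the curious, you can peek at that switching from Terminal with `pmset`, which has a `gpuswitch` setting on dual-GPU MacBook Pros. The value meanings I've noted (0 = integrated only, 1 = discrete only, 2 = automatic) are my assumption from what I've read, so treat this as a sketch rather than gospel:

```shell
#!/bin/sh
# Sketch: inspect the GPU-switching mode on a dual-GPU MacBook Pro.
# Guarded so it degrades gracefully on machines without pmset.
if command -v pmset >/dev/null 2>&1; then
    # Show live power settings; dual-GPU Macs list a "gpuswitch" line.
    pmset -g live | grep -i gpuswitch || echo "no gpuswitch setting (single-GPU Mac?)"
    # To force a mode (requires root), something like:
    #   sudo pmset -a gpuswitch 2   # 2 = automatic switching (assumed meaning)
else
    echo "pmset not found; this only applies on Mac OS X"
fi
```

On non-Mac machines it just prints a note instead of failing, so it's safe to paste anywhere.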
 
So true, look at Apple with HTC. Apple wants to be the only smartphone player, so when someone threatens their superiority, they try to stomp them out.

Except that Nvidia has a long, long record of making actually respectable GPUs. Intel is rather new to the market (in the way it's pushing now) and is forcing its way in via pretty ****** moves. Ultimately, Intel is holding things back and annoying everyone.
 
Except that Nvidia has a long, long record of making actually respectable GPUs. Intel is rather new to the market (in the way it's pushing now) and is forcing its way in via pretty ****** moves. Ultimately, Intel is holding things back and annoying everyone.

Of late, though, Nvidia seems to be lagging just a tiny bit behind ATI... I hope they pull back ahead, though; I really like them!
 
Except that Nvidia has a long, long record of making actually respectable GPUs. Intel is rather new to the market (in the way it's pushing now) and is forcing its way in via pretty ****** moves. Ultimately, Intel is holding things back and annoying everyone.

Nvidia has pulled their share of shady moves too - rebranding countless GPUs, for one example. They're also falling way behind, and are currently almost a full generation behind ATI's offerings.

Regardless, any company that uses legal means to bully another company impedes progress.
 
Of late, though, Nvidia seems to be lagging just a tiny bit behind ATI... I hope they pull back ahead, though; I really like them!

Lagging behind in getting their cards out, yes, and only in the high-end enthusiast market (which won't matter to these notebooks/laptops for 3 or 4 years). This has happened before with the FX series, though if you remember, the series after that (the 6 series) blew everything else out of the water. With that said, though they suck a lot of power and run hot, their top-tier cards are still faster than ATI's offerings.

Nvidia has pulled their share of shady moves too - rebranding countless GPUs, for one example. They're also falling way behind, and are currently almost a full generation behind ATI's offerings.

Regardless, any company that uses legal means to bully another company impedes progress.

Of course Nvidia has pulled lots of shady moves. Falling way behind? Eh, I wouldn't say so. Sure, they were late to the table with Fermi, but they still have the fastest cards on the market and the largest market share in the industry. Also, the way the next die-shrink is going, both ATI and Nvidia are going to have to wait until big issues are sorted out. The foundries are skipping 32nm (abandoning it for technical reasons) and going straight to 28nm, so both ATI and Nvidia will have to wait for the technology to even become available before they can truly get started on their next generation of cards.
 
Lagging behind in getting their cards out, yes, and only in the high-end enthusiast market (which won't matter to these notebooks/laptops for 3 or 4 years). This has happened before with the FX series, though if you remember, the series after that (the 6 series) blew everything else out of the water. With that said, though they suck a lot of power and run hot, their top-tier cards are still faster than ATI's offerings.

We are talking consumer cards here, lol - GTX 480s or whatever they are called. I recall seeing reviews where the top ATI cards (5890x2?) were faster/cooler/consumed less power?

Correct me if I'm wrong though! Any benchies?
 
We are talking consumer cards here, lol - GTX 480s or whatever they are called. I recall seeing reviews where the top ATI cards (5890x2?) were faster/cooler/consumed less power?

Correct me if I'm wrong though! Any benchies?

You're wrong; the GTX 470 and 480 are not considered consumer cards. They're high-end enthusiast cards, just like ATI's 5850, 5870 and 5970.
There is no 5890x2; I believe you're referring to the 5970, which is two GPUs on one PCB (the ones on the 5970 are two slightly hampered 5870s), so of course that card will perform better than a GTX 480, which only has one GPU on the PCB. Yes, ATI's 5000 series consumes less power and runs cooler, but overall Nvidia still has the performance crown on the high end (GTX 480).
 
You're wrong; the GTX 470 and 480 are not considered consumer cards. They're high-end enthusiast cards, just like ATI's 5850, 5870 and 5970.
There is no 5890x2; I believe you're referring to the 5970, which is two GPUs on one PCB (the ones on the 5970 are two slightly hampered 5870s), so of course that card will perform better than a GTX 480, which only has one GPU on the PCB. Yes, ATI's 5000 series consumes less power and runs cooler, but overall Nvidia still has the performance crown on the high end (GTX 480).

My bad. I'm not really in the loop anymore as far as GPUs go ;)

But yes, of course an x2 card will beat an x1 card - no matter how far the technology has come!

Has Nvidia announced any dual-GPU variants of their cards? Because if what you are saying is correct, then Nvidia would FLOG ATI - yes?
 
So why even bother with Intel integrated graphics? Why not just have iX CPUs and an Nvidia discrete GPU? Why have both discrete AND Intel integrated?

Is it like a package deal with Intel or something? If you buy an iX, do you have to get the integrated graphics too? Or do computers NEED some form of integrated graphics?

Because some people use their computers for actual work rather than playing games - and for that, the integrated Core i5 graphics are more than enough (and a real power saver).
 
Has Nvidia announced any dual-GPU variants of their cards? Because if what you are saying is correct, then Nvidia would FLOG ATI - yes?

Not at the moment, no, and judging by the heat/power issues, I wouldn't want to see a two-GPUs-on-one-PCB solution at least until a die-shrink. If you want an estimate of what a dual-GPU variant of the 480 would look like, look up benchmarks for GTX 480s in SLI. Yes, I'd guesstimate that a dual-GPU variant of the GTX 480 would be a brutally fast card.
 