Help me understand intel integrated graphics

Discussion in 'MacBook Pro' started by dlb253, Apr 16, 2010.

  1. dlb253 macrumors member

    Joined:
    Apr 13, 2010
    Location:
    Arizona
    #1
    From what I understand, Intel blocked Nvidia from making chipsets for the Core iX CPUs... so a discrete Nvidia GPU has to be paired with Intel's integrated graphics (on the 15" & 17"). On the 13", however, Nvidia integrated graphics are still used, since Nvidia is allowed to make chipsets for the C2Ds. Is that correct?

    Can someone explain to me why Intel integrated graphics are bad? Why is Nvidia so much better?
     
  2. mikeo007 macrumors 65816

    Joined:
    Mar 18, 2010
    #2
    You are correct.

    They're bad because Intel doesn't put enough R&D into their graphics chips. They design them to be "just good enough" to get by, while Nvidia specializes in graphics chips and therefore makes a better product.
     
  3. spinnerlys Guest

    Joined:
    Sep 7, 2008
    Location:
    forlod bygningen
    #3
    Your assumption is correct. Intel has no agreement with Nvidia that would let Nvidia provide the chipsets and IGPs for the Core i CPUs.

    Why Intel is worse than Nvidia at graphics probably has something to do with Nvidia having more experience: it has been in the GPU market far longer than Intel, which, as far as I recall, only entered that market in this millennium.
     
  4. dlb253 thread starter macrumors member

    Joined:
    Apr 13, 2010
    Location:
    Arizona
    #4
    So why even bother with Intel integrated graphics? Why not just have an iX CPU and a discrete Nvidia GPU? Why have both the discrete AND the Intel integrated graphics?

    Is it like a package deal with Intel or something? If you buy the iX, do you have to get the integrated graphics too? Or do computers NEED some form of integrated graphics?
     
  5. vasuba macrumors member

    Joined:
    Apr 13, 2010
    #5
    That's part of why Intel is playing nasty now when it comes to chipsets. They want a more serious presence in the GPU arena, and they feel the best way to get it is to restrict the other GPU makers' moves.
     
  6. DesmoPilot macrumors 65816

    Joined:
    Feb 18, 2008
    #6
    It basically boils down to Intel being a big baby as per usual.

    And they're right.
     
  7. Benito macrumors 6502

    Joined:
    Jan 5, 2010
    Location:
    Toronto, Canada
    #7
    Because I don't believe Intel is selling iX CPUs without the Intel integrated graphics, that's why.
     
  8. mikeo007 macrumors 65816

    Joined:
    Mar 18, 2010
    #8
    With the i-series chips, the Intel integrated graphics solution is on the CPU package itself, meaning you can't separate them. The best you can do is disable the Intel solution.
     
  9. mikeo007 macrumors 65816

    Joined:
    Mar 18, 2010
    #9
    So true, look at Apple with HTC. Apple wants to be the only smartphone player, so when someone threatens their superiority, they try and stomp them out.
     
  10. DoFoT9 macrumors P6

    Joined:
    Jun 11, 2007
    Location:
    Singapore
    #10
    that's the reasoning behind it! you can't get rid of it - there is no choice!

    i think it's good anyway: the new MBPs can now choose which GPU to use on the fly, so battery life will always be optimised, as will performance. best way to go IMO. it's just like those new cars that choose which cylinders to run for you - except with 2 motors :rolleyes: :confused: :p
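
    For anyone curious what that dual-GPU setup looks like from the software side, here is a minimal sketch in Swift using Apple's Metal API (which postdates this thread and is used purely for illustration). It just lists every GPU the system exposes; on a MacBook Pro with automatic switching you would expect to see both the integrated part, flagged as low-power, and the discrete one:

        import Metal

        // List every GPU this Mac exposes. On a MacBook Pro with automatic
        // graphics switching both show up; the OS simply picks which one
        // drives the screen at any given moment.
        for device in MTLCopyAllDevices() {
            let kind = device.isLowPower ? "integrated (battery friendly)" : "discrete (performance)"
            print("\(device.name): \(kind)")
        }

    At the time of this thread, the same information was visible in System Profiler under Graphics/Displays, which lists both the Intel IGP and the Nvidia GPU on these machines.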
     
  11. DesmoPilot macrumors 65816

    Joined:
    Feb 18, 2008
    #11
    Except that nVidia has a long, long record of making genuinely respectable GPUs. Intel is rather new to the market (at least in the way they're pushing now) and is forcing its way in via pretty ****** moves. Ultimately Intel is holding things back and annoying everyone.
     
  12. DoFoT9 macrumors P6

    Joined:
    Jun 11, 2007
    Location:
    Singapore
    #12
    of late though, nVidia seems to be lagging just a tiny bit behind ATi... i hope they pull back ahead though, i really like them!
     
  13. mikeo007 macrumors 65816

    Joined:
    Mar 18, 2010
    #13
    Nvidia has pulled their share of shady moves too, rebranding countless GPUs for one example. They're also falling way behind, currently almost a full generation behind ATI's offerings.

    Regardless, any company that uses legal means to bully another company impedes progress.
     
  14. DesmoPilot macrumors 65816

    Joined:
    Feb 18, 2008
    #14
    Lagging behind in getting their cards out, yes, but only in the high-end enthusiast market (which won't matter to these notebooks/laptops for 3 or 4 years). This has happened before with the FX series, and if you remember, the series after that (the 6 series) blew everything else out of the water. With that said, though they suck a lot of power and run hot, their top-tier cards are still faster than ATI's offerings.

    Of course nVidia has pulled lots of shady moves. Falling way behind? Eh, I wouldn't say so; sure, they were late to the table with Fermi, but they still have the fastest cards on the market and the largest market share in the industry. Also, the way the next die shrink is going, both ATI and nVidia are going to have to wait until the big issues are sorted out. The foundries are skipping 32nm (they're abandoning it for technical reasons) and going straight to 28nm, so both ATI and nVidia will have to wait for that process to even become available before they can truly get started on their next generation of cards.
     
  15. DoFoT9 macrumors P6

    Joined:
    Jun 11, 2007
    Location:
    Singapore
    #15
    we are talking consumer cards here lol. GTX480s or whatever they are called. i recall seeing reviews where the top ATi cards (5890x2?) were faster/cooler/consumed less power?

    correct me if i'm wrong though! any benchies?
     
  16. DesmoPilot macrumors 65816

    Joined:
    Feb 18, 2008
    #16
    You're wrong, the GTX470 and 480 are not considered consumer cards; they're high-end enthusiast cards, just like ATI's 5850, 5870 and 5970.
    There is no 5890x2. I believe you're referring to the 5970, which is two GPUs on one PCB (the two on the 5970 are slightly hampered 5870s), so of course that card will perform better than a GTX480, which only has one GPU on the PCB. Yes, ATI's 5000 series consumes less power and runs cooler, but overall nVidia still holds the performance crown on the high end (GTX480).
     
  17. DoFoT9 macrumors P6

    Joined:
    Jun 11, 2007
    Location:
    Singapore
    #17
    my bad. i'm not really in the loop anymore as far as GPUs go ;)

    but yes, of course an x2 card will beat an x1 card - no matter how far the technology has come!

    has nVidia announced any dual-GPU variants of their cards? because if what you are saying is correct, then nVidia would FLOG ATi - yes?
     
  18. Azathoth macrumors 6502a

    Joined:
    Sep 16, 2009
    #18
    Because some people use their computers for actual work rather than playing games - and for that, the integrated Core i5 graphics are more than enough (and a real power saver).
     
  19. DesmoPilot macrumors 65816

    Joined:
    Feb 18, 2008
    #19
    Not at the moment, no, and judging by the heat/power issues I wouldn't want to see a two-GPU-on-one-PCB solution until at least a die shrink. If you want an estimate of what a dual-GPU variant of the 480 would look like, look up benchmarks for GTX480s in SLI. Yes, I'd guesstimate that a dual-GPU variant of the GTX480 would be a brutally fast card.
     
