Intel HD 5200 vs. NVIDIA 650M

Discussion in 'MacBook Pro' started by B..., Mar 16, 2013.

  1. B..., Mar 16, 2013
    Last edited: Mar 19, 2013

    B... macrumors 68000

    B...

    Joined:
    Mar 7, 2013
    #1
    Has anyone seen the video where an Intel HD 5200 and an NVIDIA 650M are running a racing game on two different computers? The settings are the same (high, I believe) and their performance looks equal.

    Does this mean a next-gen iGPU will be as good as today's 15" Pro dGPU? Or is Intel somehow deceiving us?

    Thanks.
     
  2. joshhedge macrumors regular

    Joined:
    Sep 23, 2012
    #2
    The 950M doesn't exist, nor does the HD 5200 at present. I believe you mean the GT 650M. I don't have a link, but YouTube is your friend here. Furthermore, that video might be hugely biased towards Intel, in that the GT 650M used could be the DDR3 or GDDR3 variant rather than the much faster GDDR5 variant inside MacBooks.
     
  3. B... thread starter macrumors 68000

    B...

    Joined:
    Mar 7, 2013
    #3
    Of course, 650M.
     
  4. xShane macrumors 6502a

    xShane

    Joined:
    Nov 2, 2012
    Location:
    United States
    #4
    Sounds fishy...
     
  5. B... thread starter macrumors 68000

    B...

    Joined:
    Mar 7, 2013
    #5
    It was during an Intel conference where they presented a working Haswell chip. Search "Intel GT3 vs 650M" on Google and click on the first result from AnandTech, or the second from YouTube.
     
  6. Mr MM macrumors 65816

    Mr MM

    Joined:
    Jun 29, 2011
    #6
    The interesting part here is that Intel didn't do anything special: they took a mid-range GPU and a chip with the GT3e core, and ran the same game at the same settings on both. Here's the thing:

    1) It's an old game.
    2) The AMD 7660G (the iGPU on A10 APUs) can do that as well.
    3) They have surely hit vsync, which is good news; otherwise there would be a visible difference in that video (see the sketch at the end of this post).

    I really don't believe it will be at the 650M level, maybe at the 640M level.

    I made a thread here: http://forums.macrumors.com/showthread.php?t=1557558

    so that people can stop spreading threads about Haswell and can inform themselves.
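
    To illustrate point 3: a rough sketch in Python of why a 60 Hz vsync cap can make two very different GPUs look identical (all frame times here are hypothetical, just for illustration):

        import math

        REFRESH_INTERVAL_MS = 1000.0 / 60.0  # ~16.7 ms per refresh at 60 Hz

        def displayed_fps(render_time_ms):
            # With vsync on, a finished frame waits for the next refresh boundary,
            # so the effective frame time rounds up to a whole number of intervals.
            refreshes = math.ceil(render_time_ms / REFRESH_INTERVAL_MS)
            return round(1000.0 / (refreshes * REFRESH_INTERVAL_MS), 1)

        print(displayed_fps(9.0))   # hypothetical fast dGPU:   60.0 fps
        print(displayed_fps(15.0))  # hypothetical slower iGPU: 60.0 fps
        print(displayed_fps(20.0))  # only past the cap does the gap show: 30.0 fps

    As long as both GPUs finish each frame inside one refresh interval, the video shows the same fps for both.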
     
  7. xShane macrumors 6502a

    xShane

    Joined:
    Nov 2, 2012
    Location:
    United States
    #7
    Interesting point. One could take a very old game, run it at high fps on both a very expensive $800 graphics card and a cheap integrated GPU, and make it look like the cheap integrated GPU is just as good.
     
  8. thekev macrumors 604

    thekev

    Joined:
    Aug 5, 2010
    #8
    Exactly. Reading into a staged presentation as if it's a complete set of data is ridiculous.
     
  9. Mr MM macrumors 65816

    Mr MM

    Joined:
    Jun 29, 2011
    #9
    The interesting thing is that they didn't say anything; they just let people fiddle with it, so they could come up with their own conclusions.
     
  10. dusk007 macrumors 68040

    dusk007

    Joined:
    Dec 5, 2009
    #10
    Actually, the difference between the stock GDDR5 and DDR3 versions is pretty small, at about 5-7%. The DDR3 version makes up for it with a higher core clock.

    @OP: I think people said that a 650M can run the game at those settings at around 45 fps, and Intel probably picked a setting that looked high for promotional value but was only just fluid - probably 25-30 fps. Therefore the HD 5200 should end up at at least 60% of the performance of a 650M (rough math below).
    If those assumptions are true, the 5200 should end up at around 630M levels. Still pretty damn good, and significantly better than AMD's APUs.
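
    Rough math for that estimate (both fps figures are guesses from this thread, not measurements):

        # Back-of-the-envelope only; the fps numbers are assumptions, not benchmarks.
        GT_650M_FPS = 45.0          # fps people reported for a 650M at those settings
        HD_5200_FPS = (25.0, 30.0)  # assumed "just fluid" range for the Intel demo

        low, high = (fps / GT_650M_FPS for fps in HD_5200_FPS)
        print(f"HD 5200 at roughly {low:.0%}-{high:.0%} of a GT 650M")  # ~56%-67%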

    At the same time, Nvidia won't release anything significantly new until Maxwell at 20nm in 2014, so Intel will come within arm's reach. Half the performance used to be the gap between the upper mid-range, i.e. the 650M, and all the cheaper, lower derivatives. There would really be no reason to add a dedicated GPU unless one reserves at least 30W of TDP for it. And Intel won't put the GT3e into the quad cores; the leaked roadmaps show the GT3e parts will be 35W CPUs with a GPU that beats setups that today need a 35W CPU + a 15-20W GPU.
    Anything that isn't at least as fast as a 650M doesn't really have a reason to exist anymore outside of cheap AMD notebooks. The 650M will stay top of the class in the 30W TDP range until 20nm hits, and that won't be before 2014.
     
  11. B... thread starter macrumors 68000

    B...

    Joined:
    Mar 7, 2013
    #11
    Here's a question: who thinks Apple will implement the GT3 in their next-gen rMBPs? Might they skimp on the GPU and use the GT2?
     
  12. leman macrumors 604

    Joined:
    Oct 14, 2008
    #12
    It depends on Intel and how they build their CPUs. So far, mobile CPUs get the fastest IGPs and desktop CPUs use the slower IGPs (which also makes perfect sense). I doubt Haswell will change that.
     
  13. bill-p macrumors 65816

    Joined:
    Jul 23, 2011
    #13
    I'd just like to say that that's not true. Apple just elected to use high-end chips in some of their mobile products.

    The MacBook Air, for instance, has a slower IGP than the MacBook Pro.

    The same segregation would be true for Haswell as well.

    On the desktop, the reason most chips have a slow IGP (or none at all) is that Intel has to cut costs. A mobile i5 chip in the MacBook Air can cost as much as an i7 quad-core chip on the desktop. And that's OEM pricing.
     
  14. UBS28 macrumors 6502a

    Joined:
    Oct 2, 2012
    #14
    From what I read, people are saying that NVIDIA GPUs handle that game poorly, and that's why the Haswell GPU performed as well as the 650M in that game.

    So we need more games to see how the Haswell GPU compares to the 650M. But it's looking good so far.
     
  15. leman macrumors 604

    Joined:
    Oct 14, 2008
    #15
    It's slower because it's clocked lower to reduce the TDP. It still has the same number of computation units (16). I expect this to be the same with the Haswell CPUs - higher-end (i5/i7) mobile CPUs will get the fastest (GT3) GPU, and maybe the i3 versions will get the GT2. The actual GPU clocks are a completely different matter. I expect the ULV CPUs to have the GT3 as well - just clocked slightly lower.
     
  16. bill-p macrumors 65816

    Joined:
    Jul 23, 2011
    #16
    There's only one Intel HD 4000 part. There's no variant that has "more" or "fewer" computation units. In that sense, it's just a matter of how slow/fast Intel chooses it to be.

    And GT3 is not guaranteed for the higher-end mobile CPUs. The leaked specifications show that the highest-end quad-core i7 chip just gets HD 4600:

    [Image: leaked Haswell mobile CPU specifications]
     
  17. leman macrumors 604

    Joined:
    Oct 14, 2008
    #17
    Ivy Bridge has two GPU variants. One is called HD 4000 (also GT2) and has 12 units; the other is called HD 2500 (also GT1) and has 6 units. All mobile IB CPUs have the GT2 (HD 4000) variant; some desktop CPUs have the GT1 and others the GT2. The poster to whom I was responding was specifically asking which GPU core (GT1/2/3) will be used in mobile Haswell CPUs, so I tried to comment on that.
     
  18. dusk007 macrumors 68040

    dusk007

    Joined:
    Dec 5, 2009
    #18
    I am pretty sure it will go like this.
    The 15" with all the quads only gets the GT2 (aka 4600) + some dedicated GPU.
    The 13" rMBP will almost definitely get the GT3e, aka 5200.
    The Airs will likely also get a GT3e at lower clocks, but maybe only as an upgrade. With lower clocks it would still be the most efficient option.

    If the cMBP still exists, I guess it will look similar, though as the 13" MBP is a bit of a budget solution now, it may get a cheaper CPU.

    AFAWK, Intel now plans only three real variants: GT2, GT3, and GT3e. There should be significant price differences, so the best might only come in the optional i7 configs and not the default i5 configs.
    The new single-SoC ULV Haswell will surely get the best possible version, I think, and a GT3e version should be the most energy efficient.

    Ergo, nothing great in the 15". In the lower ranges it will probably come as an expensive upgrade and not in the default config. The 13" rMBP might be an exception, given the display, the 35W power envelope, and the high starting price. On the other hand, if they offer a 35W quad core, it may get confusing for non-techy people, as one might have to trade a quad core against a better GPU.
     
  19. bill-p macrumors 65816

    Joined:
    Jul 23, 2011
    #19
    Are you sure you are not confusing the HD 3000 with the HD 4000? Because as far as I know, the HD 4000 comes with 16 computation units (unified shaders), and that's it. There's no other version of it.

    And the HD 2500 is a completely different beast... it was made to cut costs rather than to provide a viable IGP solution.
     
  20. vatter69 macrumors 6502a

    vatter69

    Joined:
    Feb 4, 2013
    #20
    The HD 4000 in the ULV CPUs only clocks at 650 MHz, compared to 1100 MHz in the normal CPUs. So even if they are both branded HD 4000, one is faster than the other.
     
  21. leman macrumors 604

    Joined:
    Oct 14, 2008
    #21
    The HD 2500 is essentially an HD 4000 'cut in half'. Both are part of the Ivy Bridge architecture. You are correct on the rest, of course. If you want a source beyond my word, you could look at this article, for example: http://www.xbitlabs.com/articles/graphics/display/intel-hd-graphics-4000-2500_3.html
     
  22. bill-p macrumors 65816

    Joined:
    Jul 23, 2011
    #22
    Oh no, I do believe you.

    I'm just saying... Intel doesn't seem to rely on an "if it's high-end and mobile, then it has top GPU performance" philosophy to determine IGP tiers for their CPUs.

    And Haswell is about to demonstrate just that. It is looking very likely that only ULV and select mobile chips will get the HD 5200 (I think we should drop GT2/GT3 because it's confusing). The highest-end mobile chips still get just the HD 4600, which is a minimal improvement over the HD 4000, but the TDP has increased, so it's just like an overclocked Ivy Bridge chip.
     
  23. leman macrumors 604

    Joined:
    Oct 14, 2008
    #23
    Did you even try to read the table?

    http://en.wikipedia.org/wiki/Ivy_Bridge_(microarchitecture)#Mobile_processors

    ULV CPU (i5-3337U) max GPU frequency - 1.1 GHz (http://ark.intel.com/products/72055/)
    'normal' CPU (i7-3820QM) max GPU frequency - 1.25 GHz (http://ark.intel.com/products/64889)

    Yes, the ULV will be considerably slower (because both the CPU and the GPU are slower, and because it is not a quad core), but there is no 50% difference in clocks - 1.1 GHz vs. 1.25 GHz is only about a 12% gap.

    ----------

    Agreed. I would just find it logical if they put the best IGP options into the mobile CPUs, where they are needed the most. Your leaked roadmap is surprising, though; I really expected to find the GT3 core in the mobile CPUs. Maybe they will at least integrate it into their ULV CPU line...

    Agreed again. Still, Intel is getting there. Slowly, but they are catching up. And I suppose it's at least something. I would appreciate having a reasonable minimum GPU performance spec to code against.
     
  24. bill-p macrumors 65816

    Joined:
    Jul 23, 2011
    #24
    I think the reason they're not in every product is that they consume quite a bit more power than the good ol' HD 4000, so putting them on high-end quad-core chips would squash any chance for the OEM to pair those chips with a discrete GPU.

    And I think discrete GPUs still make more sense for the higher end of the market.

    Yeah, but while Intel is playing the catch-up game, the competition just widens the gap further. Here's AMD's upcoming IGP solution:

    [Image: AMD's upcoming IGP solution]

    On a side note, I think AMD has a very good chance of outperforming the GT 650M with their IGP, since they're already matching GT 670M performance with their HD 7870M mobile GPU while keeping its TDP in the same envelope as the GT 650M.
     
  25. vatter69 macrumors 6502a

    vatter69

    Joined:
    Feb 4, 2013
    #25
    Sorry, it was the base frequency that was different: 350 MHz vs. 650 MHz.
     
