Intel claims Haswell iGPU will be as fast as a GT650M, does a comparison demo

Discussion in 'MacBook Pro' started by pgiguere1, Jan 9, 2013.

  1. pgiguere1 macrumors 68020

    pgiguere1

    Joined:
    May 28, 2009
    Location:
    Montreal, Canada
    #1
    Intel is showing off a mobile Haswell prototype (reference board currently in a desktop form factor) at CES and decided to test its integrated GPU (GT3 in this case).

    They claim its performance is on par with an NVIDIA GT 650M and ran a comparison demo to prove it. Both systems are running Dirt 3 at 1080p with these graphics settings:

    [image: screenshot of the Dirt 3 graphics settings used on both systems]

    The Haswell computer is on the left, GT 650M computer on the right.

    Here's the video in which you can see both in action:
    http://www.youtube.com/watch?v=VPrAKm7WtRk

    Both look pretty fluid.

    Thought this may be of interest for anyone waiting for Haswell to buy a 13" rMBP.
     
  2. N19h7m4r3 macrumors 65816

    N19h7m4r3

    Joined:
    Dec 15, 2012
    #2
    Until there are independent reviews I'm very skeptical, especially since AnandTech wasn't allowed to see or report actual performance numbers and could only judge performance by observation.

    If Haswell matches the GT 650M, perfect, as long as that performance isn't limited to only the top-end CPU models.
     
  3. Erasmus, Jan 9, 2013
    Last edited: Jan 9, 2013

    Erasmus macrumors 68030

    Erasmus

    Joined:
    Jun 22, 2006
    Location:
    Hiding from Omnius in Australia
    #3
    Comparing the HD 4000 to the 650M using numbers on notebookcheck.net, it looks like the 650M is close to 4 times as fast as the HD 4000 presently available. The GT3 iGPU will likely be a little over 2.5 times as fast as the present HD 4000. So no, I don't see them being equal, but I guess they would be close. The GT3 is probably running Dirt 3 at 30 fps, and the 650M at something like 45 fps.

    Also, just to make things worse, the 650M in the present MBPs is clocked over 10% higher than the standard 650M. And by the time Haswell is released with its GT3, a likely MBP candidate GPU will probably be a good 20-50% faster again.
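    A rough back-of-the-envelope version of that estimate (a sketch only, assuming the 4x and 2.5x multipliers above and roughly 45 fps for the 650M in this demo):

    Code:
    # Illustrative numbers only, taken from the guesses in this post.
    fps_650m = 45.0                # assumed 650M frame rate in Dirt 3 at these settings
    ratio_650m_vs_hd4000 = 4.0     # 650M roughly 4x the HD 4000 (notebookcheck-based guess)
    ratio_gt3_vs_hd4000 = 2.5      # GT3 roughly 2.5x the HD 4000 (speculation)

    fps_hd4000 = fps_650m / ratio_650m_vs_hd4000  # ~11 fps
    fps_gt3 = fps_hd4000 * ratio_gt3_vs_hd4000    # ~28 fps, i.e. roughly 30 fps
    print(f"HD 4000: ~{fps_hd4000:.0f} fps, GT3: ~{fps_gt3:.0f} fps")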
     
  4. thedarkhorse macrumors 6502a

    Joined:
    Sep 13, 2007
    Location:
    Canada
    #4
    Funny they chose a game that was developed favouring AMD GPUs to compare their GPU to NVIDIA's offering.
     
  5. pgiguere1 thread starter macrumors 68020

    pgiguere1

    Joined:
    May 28, 2009
    Location:
    Montreal, Canada
    #5
    Yeah, this is to be taken with a grain of salt, especially following Intel's shady marketing tactics around their new "7W" mobile CPUs.
     
  6. Erasmus, Jan 9, 2013
    Last edited: Jan 9, 2013

    Erasmus macrumors 68030

    Erasmus

    Joined:
    Jun 22, 2006
    Location:
    Hiding from Omnius in Australia
    #6
    Of course, if Intel have decided to increase the number of cores in GT3 from 40 to 60, or increase the frequencies by 50% over the HD 4000, then perhaps it would truly match that level of performance.

    Something else they could do is decrease the clock speeds and voltage of the 650M until it draws a similar amount of power to the GT3, and then do the comparison. Depending on your point of view, that could be a legitimate showcase.
     
  7. Blaine macrumors 6502a

    Blaine

    Joined:
    Dec 3, 2007
    Location:
    Abilene TX
  8. DisMyMac macrumors 65816

    DisMyMac

    Joined:
    Sep 30, 2009
    #8
    MacBook Air is probably the next thing to update. They better hurry with Haswell, or else we'll have to keep waiting.
     
  9. bill-p macrumors 65816

    Joined:
    Jul 23, 2011
    #9
    60 fps and 200 fps both look "pretty fluid" on a 60Hz monitor, y'know...

    Push 2560 x 1600 on those screens and let's see if both still look "pretty fluid".
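    For context, a quick pixel-count comparison (nothing assumed here beyond the arithmetic):

    Code:
    # 2560x1600 pushes roughly twice as many pixels as 1080p.
    pixels_1080p = 1920 * 1080      # 2,073,600
    pixels_2560x1600 = 2560 * 1600  # 4,096,000
    print(f"{pixels_2560x1600 / pixels_1080p:.2f}x the pixels")  # ~1.98x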
     
  10. throAU macrumors 601

    throAU

    Joined:
    Feb 13, 2012
    Location:
    Perth, Western Australia
    #10
    Intel will eventually get there, "there" being "good enough". Nvidia competing on performance is not going to work when the low end is good enough. And they can't compete on power, because a GPU built into the CPU will trump an add-on card every time, assuming "good enough" performance.

    Once they are "good enough", the number of people buying discrete will drop and the R&D budget for Nvidia will dry up, which means the development pace at the high end will slow down or stop.

    Intel eventually integrates everything. In the past there was a market (or in some cases, a much larger market than today) for discrete:

    - FPUs
    - NICs
    - Wifi chipsets
    - non-accelerated VGA cards (vs onboard video)
    - sound cards
    - caching storage controllers
    - etc.


    Intel has previously killed the market for FPUs, consumer NICs, WiFi chipsets, non-3D video cards, sound cards (except for the extreme high end), low-end cache/RAID controllers, etc.

    The market for discrete 3D will be killed in a similar manner. If I were Nvidia, I'd be planning an exit strategy. Unfortunately, GPGPU tech is a very small niche.

    They need to find the next big thing, or die/be absorbed by Intel or AMD.
     
  11. duervo macrumors 68000

    duervo

    Joined:
    Feb 5, 2011
    #11
    I sure hope nobody believes this, or else my hope for humanity will be greatly diminished.
     
  12. throAU, Jan 9, 2013
    Last edited: Jan 9, 2013

    throAU macrumors 601

    throAU

    Joined:
    Feb 13, 2012
    Location:
    Perth, Western Australia
    #12
    Why is that?

    It has happened with every other add-on industry intel has tackled.

    It killed the Amiga indirectly (the Amiga had custom processors for everything; eventually the PC way of doing it in software and throwing CPU horsepower at the problem was cheaper in the long term, and it won). A faster general-purpose CPU is also more versatile; custom hardware is much harder to repurpose. For example, the shift to 3D rendered all of the Amiga's custom 2D sprite hardware useless.


    It happens with everything:

    - new tech comes out (sound, 3D, signal processing, compression, etc.)
    - the CPU is too weak to support it well, so dedicated purpose-built hardware is developed
    - CPU horsepower catches up (maybe some new instructions are added) and the CPU does it


    I forgot to include MPEG decoding. That used to be done in hardware, first via a dedicated card and later by your video card.

    CPUs are doing it now.

    Ditto for AES encryption. Previously you'd use a crypto accelerator; now the Core i-series CPUs can do it in hardware.
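    A minimal sketch of how you might check for that (not from the post; assumes Linux and its /proc/cpuinfo layout, Python used purely for illustration):

    Code:
    # Minimal sketch (Linux only): check whether the CPU advertises the AES-NI
    # instruction set that the Core i-series uses for hardware AES.
    def has_aesni(cpuinfo_path="/proc/cpuinfo"):
        with open(cpuinfo_path) as f:
            for line in f:
                if line.startswith("flags"):
                    return "aes" in line.split()
        return False

    print("AES-NI available:", has_aesni())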

    It's just a matter of time. Nvidia and the other add-on card guys need to come up with a new technology that CPUs can't do yet.
     
  13. cirus macrumors 6502a

    Joined:
    Mar 15, 2011
    #13
    The two really are not comparable. The 650M gets over 60 fps at high settings in 1080p.
    The Haswell could get 30 fps (as long as there are no major drops) and you could not tell the difference by watching (you might feel it if you are playing, but seeing it is harder).

    The 650M gets about 2200 points in 3DMark 11; the HD 4000 gets around 700. I doubt they are going to get more than three times Ivy Bridge's performance out of Haswell, especially as Haswell is memory-bandwidth handicapped (or does this Haswell have the special high-bandwidth chip?).
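    A quick sanity check on those numbers (just the arithmetic on the scores quoted above):

    Code:
    # Ratio implied by the quoted 3DMark 11 scores.
    score_650m = 2200   # GT 650M, as quoted above
    score_hd4000 = 700  # HD 4000, as quoted above
    print(f"650M is ~{score_650m / score_hd4000:.1f}x the HD 4000")  # ~3.1x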
     
  14. Xgm541 macrumors 6502a

    Joined:
    May 3, 2011
    #14
    I don't buy your theory. Intel GMA chips were "good enough" for most consumers. Do you need an AMD 7970 to use your Excel spreadsheets in your cubicle? I doubt it.

    Remember the good old quote attributed to Bill Gates, "640K of memory should be enough for anyone" (or however it went)?
     
  15. krravi macrumors 65816

    Joined:
    Nov 30, 2010
    #15
    The next MBA must be a beast, if this is true.
     
  16. duervo macrumors 68000

    duervo

    Joined:
    Feb 5, 2011
    #16
    Because most of what you listed is not integrated into modern Intel CPUs.

    1. FPUs are integrated into CPUs. This one is correct. However, heavy physics/mathematical processing is still done much, much better by a dedicated graphics card (e.g., SETI@home and Folding@home are the two most popular examples of this).
    2. NICs are not integrated into CPUs. They require separate chips as well. If you are referring to LAN-on-board (LoB), they still need a separate chip to be installed on the motherboard. This is not the same as being integrated into the CPU.
    3. WiFi is not integrated into CPUs. It also requires a separate chip.
    4. Current CPUs do not have integrated audio. Audio still requires a separate chip for processing.
    5. Storage controllers still require a separate chip for processing, and these chips can come from one or more manufacturers. They are not integrated into the CPU.

    I think what you were trying to convey was that motherboards have a lot of features on-board these days that previously required separate physical adapters. That is not the same thing as having those features integrated into the CPU.

    A more accurate statement would probably have been that separate FPU and memory controller chips are a thing of the past (because those two functions are indeed now integrated into Intel's CPUs), but saying that all of those other functions and features are integrated into the CPU is misleading.

    Edit: Sorry for sounding like I have a huge stick up my butt. What I've said here (more specifically "how" I've said it) makes me sound like a bit of a prick... rest assured, that was not my intention.
     
  17. el-John-o macrumors 65816

    Joined:
    Nov 29, 2010
    Location:
    Missouri
    #17
    I don't think people are being very fair here. Apples to apples, this is pretty good. No, it won't beat a dedicated GPU (duh), but the world of integrated graphics is changing.

    I think it's cool, especially for those of us who like smaller-form-factor notebooks that generally don't have dedicated graphics. With Haswell, we may not have to give up as much performance as we have before.

    There was a time when I would never consider a notebook with Intel integrated graphics. Though I prefer the smaller form factor, I'd have gone with a bigger, clunkier notebook in order to have usable graphics. However, the Intel HD 4000 works quite well, and I've been extremely happy with its performance. It seems like the next gen will be as much of a leap as HD 3000 to HD 4000 was, which is great!
     
  18. throAU, Jan 9, 2013
    Last edited: Jan 9, 2013

    throAU macrumors 601

    throAU

    Joined:
    Feb 13, 2012
    Location:
    Perth, Western Australia
    #18


    No they weren't.

    They were good enough for business, but you couldn't game on them, even casually.

    The HD 4000 is already good enough for casual gaming if you drop back to games a few years old, or are willing to drop detail even with most new mainstream games. Most games now are also ported to consoles (all of which are very mediocre 3D-wise) and even mobile devices. The requirements of 3D games have not been increasing anywhere near as quickly lately.

    I have an old 8800GTS in a spare PC that will run most things in a playable manner and that card is what... 5 years old now? That was unheard of in the 90s.


    It already happened with a heap of other market segments. 3d is just next in line.

    ----------

    True. I never said they were all built into the CPU....

    However, a lot of the PROCESSING that was previously done by dedicated hardware is now done on the CPU, and the low-cost on-board chip is only possible in many cases because of that. The add-on cards used to put a lot less CPU load on the driver side, but as CPU power improves that becomes less important. Modems are a prime example: a winmodem (remember those?) used to need quite a hefty CPU to drive it when they first came out. Now? It doesn't even register on any CPU load meter.

    Ditto for sound. Add-on cards like the SB Audigy do/did a lot more processing in DSP hardware on the card. Onboard audio makes more use of the CPU, but CPUs are so fast now that it doesn't matter; the percentage of CPU load is minimal.


    End result is this: dedicated add-on hardware eventually dies out in every non-niche field.


    Nvidia should be scared.
     
  19. el-John-o macrumors 65816

    Joined:
    Nov 29, 2010
    Location:
    Missouri
    #19
    Exactly. GMA struggled even with HD YouTube playback and more graphically intensive applications. GMA was good enough for a point-of-sale machine at the grocery store or for making Word documents in a cubicle, but it was largely underpowered.

    Now, things are different. On an HD 4000 I can play old games at max settings (like Age of Empires III maxed out at 2560x1440; impressive? No, not really, but way better than previous IGPs). HD playback, even with clunky Flash players, is just fine. Fluid animations in software, etc. It's not a gaming machine, naturally, but no longer do you have to sacrifice everyday performance either! It's finally what integrated graphics SHOULD be: something that performs well in day-to-day tasks, handles basic 3D, and uses little power and produces little heat. I don't game on my laptop (well, some casual gaming perhaps), but I expect it to be able to do the stuff I do on a daily basis. I've got only good things to say about the HD 4000... for what it is. No, it doesn't outdo the HD 5870 CrossFire system in my desktop, but I don't expect it to. I just expect the tasks I perform on my notebook to be fluid and manageable, and they are.
     
  20. Muscle Master macrumors 6502a

    Muscle Master

    Joined:
    Oct 15, 2010
    Location:
    Philadelphia
    #20
    Question

    Why won't Intel dive into the discrete GPU market? Surely they don't think integrated GPUs are the future, unless they know something we don't.
     
  21. Xgm541 macrumors 6502a

    Joined:
    May 3, 2011
    #21
    The components you listed that no longer have a big third-party market are ones that never really had a top-notch performance/enthusiast market, but maybe I'm too young to remember that far back. A 100 Mbps Ethernet adapter was a 100 Mbps Ethernet adapter, nothing more, nothing less. A WiFi chip is a WiFi chip; you need it to perform one function and they all pretty much do it the same. I believe GPUs will continue to exist for a while longer.

    Nvidia is looking at other markets, by the way: they have their Tegra line of mobile processors, and I believe they target professional graphics work with their Quadro line.
     
  22. thedarkhorse, Jan 9, 2013
    Last edited: Jan 9, 2013

    thedarkhorse macrumors 6502a

    Joined:
    Sep 13, 2007
    Location:
    Canada
    #22
    Not to say it isn't impressive for integrated, but I have a feeling the integrated/dedicated graphics gap for gaming will widen this year when new consoles are supposed to be released. Being on par with today's mid-range mobile GPU will not be enough for next-gen ports to PC/Mac, or maybe only enough for the first wave of launch-generation games.
     
  23. el-John-o macrumors 65816

    Joined:
    Nov 29, 2010
    Location:
    Missouri
    #23
    Right, but I don't think integrated GPUs should be considered or expected to be a gaming platform. Rather, they should handle the latest video standards, fluid animations within apps and operating systems, and casual gaming (Flash games, small time wasters, etc.).

    I'm not disappointed to learn that the next-gen IGP chips are not going to run the latest games, because I don't expect them to. I mean, sure, it'd be cool, but it's also a bit unreasonable.

    ----------

    It's not quite that easy to just jump into it, unfortunately. AMD did, but only after purchasing a company (ATI) that was already producing them.

    I don't think Intel thinks integrated is the future, but they are comfortable with their partnership with NVIDIA and their compatibility with ATI chips. What they are producing is graphics capable of making x86 tablets relevant (one area where Intel is trying to compete with ARM), and graphics that improve the performance of smaller-form-factor devices. Face it, that's the largest part of the market right now.
     
  24. Erasmus macrumors 68030

    Erasmus

    Joined:
    Jun 22, 2006
    Location:
    Hiding from Omnius in Australia
    #24
    They tried. It was called Larrabee, and it failed. I think it was because they were trying to jump in at the deep end, creating a huge chip from scratch. I think Intel are definitely trying again, but with a bottom-up approach, starting with small iGPUs and gradually making them bigger and better, with more and more cores.

    I agree. "Good enough" being at least equal to all but the most powerful of cards, i.e. everything below the "enthusiast" level. And I think Intel will get there in a few years.

    Intel already sells the most GPUs of any company, while AMD and NVIDIA (mostly NVIDIA) have to rely on the gaming and graphics design markets. Clearly Intel need only continue to push their iGPU performance up, and they will swallow more and more of the market share. Then NVIDIA will only be a small company catering to the most extreme gamers and other graphics application users. It's pretty much inevitable.
     
  25. el-John-o macrumors 65816

    Joined:
    Nov 29, 2010
    Location:
    Missouri
    #25
    Technology in the high-end realm has slowed dramatically. Clock speed has come to a SCREECHING halt, but of course, it's less relevant with new technologies.

    I'm not sure if this is because all of the innovation is shifting towards the mobile sector, or because we have just hit a wall where software developers haven't figured out how to push the limits, and thus there is less of a need to improve hardware. You mentioned the '90s, sheesh! You'd drive home from the store with a brand-new computer and there'd already be a new model before you got home.
     
