Intel Details Upcoming GPU Project (Larrabee) Due in 2009

Discussion in 'MacRumors.com News Discussion' started by MacRumors, Aug 4, 2008.

  1. MacRumors macrumors bot

    MacRumors

    Joined:
    Apr 12, 2001
    #1

    Intel has started releasing details about its upcoming GPU project, code-named Larrabee. New graphics cards based on this technology will compete with the NVIDIA and ATI video cards that currently dominate the market. Larrabee appears to be a hybrid design between existing CPUs and GPUs, according to ExtremeTech:
    The advantage of such a design is said to be improved scalability as additional processor cores are added. Intel claims an almost linear improvement in gaming performance as the number of processor cores increases:

    [Graph: Intel's claimed near-linear scaling of game performance as cores increase]

    Intel claims that existing programming APIs such as DirectX and Open CL can be used, so existing games should be able to take advantage of Larrabee. While Apple has made no announcements about adopting Larrabee, Ars Technica's Jon Stokes claims that Apple will be adopting it:
    Intel is quick to point out that describing Larrabee as just a GPU is misleading, in that it expects Larrabee multi-core processors could be used in a number of applications outside of gaming.

    Larrabee is expected to be released in 2009-2010 and will initially be targeted at "the personal computer market". Apple should be well equipped to leverage this technology with the introduction of Snow Leopard sometime in 2009. Snow Leopard will incorporate tools such as Grand Central and OpenCL to harness both multi-core CPUs and GPUs.
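
    Neither tool is public yet, but the basic idea, farming a pure function out across every available core, can be sketched with Python's multiprocessing module as a stand-in (the shade function here is purely illustrative, not Apple's API):

    ```python
    # Stand-in sketch for the kind of data-parallel work Grand Central and
    # OpenCL are meant to spread across cores; not Apple's actual API.
    from multiprocessing import Pool

    def shade(pixel: int) -> int:
        # Illustrative per-pixel work; any pure function parallelizes this way.
        return (pixel * 977 + 13) % 256

    if __name__ == "__main__":
        with Pool() as pool:              # one worker process per core by default
            out = pool.map(shade, range(1_000_000))
        print(len(out), out[:4])
    ```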



    Article Link
     
  2. scotty56 macrumors regular

    Joined:
    Feb 12, 2008
    Location:
    Oregon
  3. elmateo487 macrumors 6502a

    Joined:
    Jun 12, 2008
    #3
    This is REALLY good news. It's about time someone did something with the GPU. I'm really excited about where laptops in 2009-2010 will be.

    Perfect time for me to upgrade! :)
     
  4. andiwm2003 macrumors 601

    andiwm2003

    Joined:
    Mar 29, 2004
    Location:
    Boston, MA
    #4
    I somehow doubt that Apple will use it. Does Apple use the hardware acceleration for H.264 encoding on current graphics cards?

    I somehow feel Apple doesn't want to deviate from a standard technology platform, because that would lead to a mix of systems using Larrabee, others using integrated GPUs, and still others using NVIDIA cards. Too complicated and unpredictable.
     
  5. fintler macrumors newbie

    Joined:
    Aug 5, 2003
    #5
    That graph is really misleading. Unless nearly 85% of that program runs in parallel, you're not going to see that kind of speedup. Mr. Amdahl says that normal programs (i.e., 50% parallel) stop seeing meaningful benefit from additional cores at around 16 cores, since their speedup can never exceed 2x no matter how many cores you add.
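
    For reference, Amdahl's law puts the speedup of a program with parallel fraction p on n cores at 1 / ((1 - p) + p/n). A quick Python sketch, using just the fractions cited above, shows how fast the p = 0.5 curve flattens:

    ```python
    # Amdahl's law: speedup on n cores for a program whose parallel fraction is p.
    def amdahl_speedup(p: float, n: int) -> float:
        return 1.0 / ((1.0 - p) + p / n)

    for n in (2, 4, 8, 16, 32, 64):
        print(f"{n:2d} cores:  p=0.50 -> {amdahl_speedup(0.50, n):.2f}x"
              f"   p=0.85 -> {amdahl_speedup(0.85, n):.2f}x")
    ```

    At p = 0.50 the speedup is already 1.88x at 16 cores against a hard ceiling of 2x, so doubling cores past that point buys almost nothing.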
     
  6. The Tall One macrumors regular

    Joined:
    Aug 1, 2008
    #6
    Graphics cards

    I'm very disappointed in the graphics card in the MacBook. I would think that Apple would have matched the power of the Intel chip with a good graphics card. But even simple graphics often cause my MacBook to crash or throw errors. I'm totally not impressed.

    This is good news, especially if they incorporate it into the MacBook; it needs a serious graphics update.
     
  7. jholzner macrumors 65816

    jholzner

    Joined:
    Jul 24, 2002
    Location:
    Champaign, IL
    #7
    This does sound pretty damn cool! I'll be upgrading my C2D 2.16 MacBook in 2010, so I'm looking forward to this and Snow Leopard.
     
  8. iSee macrumors 68040

    iSee

    Joined:
    Oct 25, 2004
    #8
    Interesting: GPUs are evolving into general-purpose processors on a card.

    When you realize you need more computational horsepower in your system you'll be able to upgrade your "graphics" card.

    Even though this isn't a traditional design, Intel is going to have to perform on the traditional benchmarks to make headway in this market.
     
  9. iMacmatician macrumors 601

    Joined:
    Jul 20, 2008
    #9
    Awesome!

    Now I'm really interested in seeing how this turns out.

    Especially with the nearly-perfect linear scaling and the driver support for future APIs.

    Also, the possibility of a Larrabee-based Mac without a separate CPU could mean a big jump in multi-threaded performance in the Mac Pro.
     
  10. Apple Ink macrumors 68000

    Apple Ink

    Joined:
    Mar 7, 2008
    #10
    Nice for Intel, Apple, and some of us.
    But I definitely don't want Intel to prove to be a tough competitor in the graphics market! If it does... it'll be the hardest (and probably the last) blow to AMD/ATI, which is actually staying afloat thanks to its graphics business!

    So why worry about AMD? Because Intel has the worst product pricing in the industry, and history is proof of it. With AMD gone it'll be a field day for Intel but a death curse for us..... considering that: sole processor manufacturer + horrible pricing decisions = poorer consumers!
     
  11. andrewdale macrumors 6502a

    Joined:
    Jan 28, 2008
    Location:
    Memphis, TN
    #11
    The only problem is, if Intel keeps coming out with such awesome products, when is the possible monopoly going to arrive?
     
  12. gnasher729 macrumors P6

    gnasher729

    Joined:
    Nov 25, 2005
    #12
    Amdahl's law (not the law of the poster here who calls himself Amdahl, but Gene Amdahl) is a law for vector processors, not for multi-processor machines, and quoting it in a current context is misleading.

    On a vector processor, all the vector capability was useless and wasted as soon as an application couldn't make use of it. However, if an application cannot make use of multiple cores, then _that_ application will be limited in speed, but the cores it cannot use are still available to other applications. You can make 100 percent use of an eight-core Mac Pro by running eight applications that are each totally incapable of using multiple cores.
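
    The contrast in numbers, sketched in Python (the 0.5 parallel fraction is just the figure from the earlier post):

    ```python
    # One application is capped by Amdahl's law, but n independent
    # single-threaded applications scale aggregate throughput by n.
    def amdahl_speedup(p: float, n: int) -> float:
        return 1.0 / ((1.0 - p) + p / n)

    cores = 8
    print(f"one app (p=0.5) on {cores} cores: {amdahl_speedup(0.5, cores):.2f}x faster")
    print(f"{cores} independent single-core apps: {cores}.00x aggregate throughput")
    ```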

    Larrabee won't work that way. Larrabee is about 32 cores, each roughly comparable to a ten-year-old Pentium 2 processor, with a 256-bit vector unit bolted on. It will _not_ perform very well on a traditional benchmark at all. It will absolutely _scream_ at anything written specifically for it.
     
  13. fintler macrumors newbie

    Joined:
    Aug 5, 2003
    #13
    http://en.wikipedia.org/wiki/Image:AmdahlsLaw.svg

    Unfortunately, it's not that linear when you zoom out the graph a wee bit. :rolleyes:
     
  14. arn macrumors god

    arn

    Staff Member

    Joined:
    Apr 9, 2001
    #14
    If your program supports OpenCL, it will support Larrabee, NVIDIA, and ATI.
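
    OpenCL hasn't actually shipped yet, but the pitch is that one kernel source targets whatever device the vendor's driver exposes. A minimal sketch of that portability using the Python bindings (pyopencl, assumed here for brevity) would look roughly like this:

    ```python
    # Sketch of OpenCL's portability: the kernel below is compiled at run time
    # for whichever device the platform driver exposes (GPU, CPU, or otherwise).
    import numpy as np
    import pyopencl as cl

    ctx = cl.create_some_context()        # picks any available OpenCL device
    queue = cl.CommandQueue(ctx)

    a = np.arange(16, dtype=np.float32)
    mf = cl.mem_flags
    a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
    out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

    prg = cl.Program(ctx, """
    __kernel void twice(__global const float *a, __global float *out) {
        int i = get_global_id(0);
        out[i] = 2.0f * a[i];
    }
    """).build()

    prg.twice(queue, a.shape, None, a_buf, out_buf)
    out = np.empty_like(a)
    cl.enqueue_copy(queue, out, out_buf)
    print(out)                            # same result on any vendor's device
    ```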

    arn
     
  15. Gasu E. macrumors 601

    Gasu E.

    Joined:
    Mar 20, 2004
    Location:
    Not far from Boston, MA.
    #15
    Not that I am an expert on this, but isn't 3-D rendering nearly 100% parallelizable? I don't know the specifics but I would not be surprised if rendering constituted more than 85% of the computation of certain games-- the graph label does say games.
     
  16. diamond.g macrumors 603

    diamond.g

    Joined:
    Mar 20, 2007
    Location:
    Virginia
    #16
    The graph was indicating graphics performance, not CPU performance. So, comparing within each line of the graph, it could be very accurate.

    AMD/ATI isn't going anywhere just yet. R700 is proving to be quite a bit nicer than GT200 thus far. Plus Intel has quite a hill to climb in the GPU market.
     
  17. iSee macrumors 68040

    iSee

    Joined:
    Oct 25, 2004
    #17
    I think the point is that this is for tasks that can be highly parallelized (> 85%, for sure).

    Game graphics rendering is one area where this can be and is being done today--notice the graph is showing games. Intel is clearly expecting there to be demand for large amounts of parallel processing power in the future. Perhaps video processing. Simulations of various sorts (both for games and science, finance, etc.). Much, much more is possible.
     
  18. bobrik macrumors member

    Joined:
    Apr 13, 2007
    Location:
    Prague, Czech Republic
    #18
    Typo

    should be OpenGL, not OpenCL here
     
  19. diamond.g macrumors 603

    diamond.g

    Joined:
    Mar 20, 2007
    Location:
    Virginia
    #19
    Bolding mine...

    Well, some folks over at Beyond3D are having a discussion on what Larrabee is, and they're pretty sure it isn't just a Pentium core with stuff bolted on. The discussion is progressing, and a couple of folks who work at Intel are even chiming in on the thread.

    EDIT: Anandtech has an article up on Larrabee...
     
  20. Michael73 macrumors 65816

    Joined:
    Feb 27, 2007
    #20
    Does this mean that those of us with upgradeable machines, e.g. a Mac Pro, could (some day) swap out the NVIDIA 8800GT for a Larrabee-based card while running Snow Leopard and extend the lifespan of our machines?
     
  21. CWallace macrumors 603

    CWallace

    Joined:
    Aug 17, 2007
    Location:
    Seattle, WA
    #21
    NVIDIA is developing general-purpose CPUs and now Intel is developing advanced GPUs. It stands to reason the two technologies will start to merge in the future to maximize the resources available to the consumer, especially as pricing becomes more and more important.
     
  22. daneoni macrumors G4

    daneoni

    Joined:
    Mar 24, 2006
    #22
    I would imagine you'd need a new motherboard or BIOS update
     
  23. Rorikynn macrumors member

    Joined:
    Dec 24, 2007
    #23
    Everything hinges on the scale of that y-axis. Yes, Larrabee could possibly scale linearly on rasterized graphics (I'll believe it when I see it), but if 16 cores get me 15 fps then 32 will only get me 30 fps... I'll need at least 64 cores to get near the target 60 fps. And that's assuming perfect linear growth, which the graph doesn't show.

    Where Larrabee will really shine is with ray-tracing-based graphics. Developers could use the main CPU to run an algorithm that estimates the scene complexity for every frame, then use that information to split the frame into different-sized buckets, so that Larrabee can ray trace the buckets in parallel and each bucket finishes at approximately the same time.
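
    A toy sketch of that bucket idea in Python; the per-scanline cost model and the greedy split are illustrative assumptions, not anything Intel has published:

    ```python
    # Hypothetical sketch: split a frame into buckets sized by estimated
    # complexity, so each core finishes its bucket at roughly the same time.
    from concurrent.futures import ProcessPoolExecutor

    WIDTH, HEIGHT, CORES = 1024, 768, 32

    def estimate_cost(y: int) -> int:
        # Stand-in for the CPU-side per-scanline complexity estimate.
        return 1 + (y % 7)

    def trace_rows(rows: list) -> int:
        # Stand-in for ray tracing a bucket of scanlines; returns pixels done.
        return len(rows) * WIDTH

    # Greedy partition: hand each scanline to the currently cheapest bucket.
    buckets = [[] for _ in range(CORES)]
    loads = [0] * CORES
    for y in sorted(range(HEIGHT), key=estimate_cost, reverse=True):
        i = loads.index(min(loads))
        buckets[i].append(y)
        loads[i] += estimate_cost(y)

    if __name__ == "__main__":
        with ProcessPoolExecutor(max_workers=CORES) as pool:
            done = sum(pool.map(trace_rows, buckets))
        print(f"traced {done} pixels across {CORES} balanced buckets")
    ```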

    If Intel can ray trace Quake Wars at 30 fps on a 16-core, 2.93 GHz Tigerton system, then having 48+ Larrabee cores would be great, even if they are clocked much lower.
     
  24. Yvan256 macrumors 603

    Yvan256

    Joined:
    Jul 5, 2004
    Location:
    Canada
    #24
    Power required

    From what I've read so far, it seems that Larrabee requires 150 to 300 watts. We won't be seeing these things in MacBooks, Mac minis, or even iMacs.

    Unless they lower the power requirements by about 90-95%, we won't see Larrabee in anything but Mac Pros.
     
  25. iSee macrumors 68040

    iSee

    Joined:
    Oct 25, 2004
    #25
    That's too bad. I think this will have a hard time getting off the ground, then. It's a classic software/hardware chicken-and-egg problem: Developers won't spend the resources to support Larrabee unless a significant number of customers have it, and a significant number of customers won't buy Larrabee unless the software they want to use supports it.

    Slipping this into the GPU market is a way to resolve this. By supporting DirectX and OpenGL, Larrabee is instantly supported by tons and tons of games. But for gamers, the question is: should I get (A) a Larrabee card, (B) an ATI card, or (C) an nVidia card? For Larrabee to be widely deployed, the answer for many gamers is going to have to be (A). So it's got to compete head-to-head with those guys.

    Once widely deployed, it will get attention from developers of all sorts of apps.

    If it can't compete with traditional GPUs on existing apps, I think we'll wonder in a few years, "whatever happened to that Larrabee thing Intel was working on?"
     
