
MacRumors

macrumors bot
Original poster
Apr 12, 2001
52,473
14,166

Intel has started releasing some details about its upcoming GPU project, code-named Larrabee. New graphics cards based on this technology will compete with the NVIDIA and ATI video cards that currently dominate the market. Larrabee appears to be a hybrid design between existing CPUs and GPUs, according to ExtremeTech:
Each of these Larrabee cores is quite distinct from the execution "cores" in many current graphics processors. It's not the x86 instruction set that makes it special (though that is certainly unique as well). Rather, it is the support for full context switching and preemptive multitasking, virtual memory and page swapping, and full cache coherency. These are features developers have come to expect in modern x86 CPUs, but don't yet exist in modern GPUs.
The advantage of such a design is said to be improved scalability as additional processor cores are added. Intel claims an almost linear improvement in gaming performance as the number of processor cores increases:

[Graph: Intel slide showing near-linear scaling of game performance as the number of Larrabee cores increases]


Intel claims that existing programming APIs such as DirectX and OpenCL can be used, so existing games should be able to take advantage of Larrabee. While Apple has made no announcements surrounding the adoption of Larrabee, Ars Technica's Jon Stokes claims that Apple will be adopting it:
And I've heard from a source that I trust that Apple will use Larrabee; this makes sense, because Larrabee, as a many-core x86 multiprocessor, can be exploited directly by GrandCentral's cooperative multitasking capabilities.
Intel is quick to point out that describing Larrabee as just a GPU is misleading, in that they expect Larrabee multi-core processors could be used in a number of applications outside of gaming.

Larrabee is expected to be released in 2009-2010 and will initially be targeted at "the personal computer market". Apple should be well equipped to leverage this technology with the introduction of Snow Leopard sometime in 2009. Snow Leopard will incorporate tools such as Grand Central and OpenCL to harness both multi-core processors and GPUs.



Article Link
 

elmateo487

macrumors 6502a
Jun 12, 2008
747
376
This is REALLY good news. It's about time someone does something with the GPU. I'm really excited about where laptops in 2009-2010 will be.

Perfect time for me to upgrade! :)
 
Comment

andiwm2003

macrumors 601
Mar 29, 2004
4,353
414
Boston, MA
I somehow doubt that Apple will use it. Does Apple use the hardware acceleration for H.264 encoding on current graphics cards?

I somehow feel Apple doesn't want to deviate from a standard technology platform, because that would lead to a mix of systems using Larrabee, others using integrated GPUs, and again others using Nvidia cards. Too complicated and unpredictable.
 
Comment

fintler

macrumors newbie
Aug 5, 2003
13
0
That graph is really misleading. Unless near 85% of that program is running in parallel, you're not going to see that kind of speedup. Mr. Amdahl says that normal programs (i.e. 50% concurrency) will cease to see the benefits of multiple cores at around 16 cores.
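
For anyone who wants to plug numbers into the law being cited, here is a minimal Python sketch of Amdahl's formula. The 50% and 85% figures are the poster's examples, not anything Intel has published.

Code:
# Amdahl's law: speedup(n) = 1 / ((1 - p) + p / n),
# where p is the fraction of the program that runs in parallel.

def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    """Theoretical speedup on `cores` cores for a given parallel fraction."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

for p in (0.50, 0.85, 0.99):
    speedups = [amdahl_speedup(p, n) for n in (1, 2, 4, 8, 16, 32, 64)]
    print(f"p = {p:.2f}: " + ", ".join(f"{s:4.1f}x" for s in speedups))

# p = 0.50 flattens out near 2x well before 16 cores, and even p = 0.99
# falls short of linear at 32+ cores -- near-linear scaling on a graph like
# Intel's implies a workload that is almost entirely parallel.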
 
Comment

The Tall One

macrumors regular
Aug 1, 2008
150
0
Graphics cards

I'm very disappointed in the graphics card in the MacBook. I would have thought that Apple would match the power of the Intel chip with a good graphics card. But even the simplest graphics often cause my MacBook to crash or have errors. I'm totally not impressed.

This is good news, especially if they incorporate it into the MacBook; it needs a serious graphics card update.
 
Comment

jholzner

macrumors 65816
Jul 24, 2002
1,384
0
Champaign, IL
This is REALLY good news. It's about time someone does something with the GPU. I'm really excited about where laptops in 2009-2010 will be.

Perfect time for me to upgrade! :)

This does sound pretty damn cool! I'll be upgrading my C2D 2.16 Macbook in 2010 so I'm looking forward to this and Snow Leopard.
 
Comment

iSee

macrumors 68040
Oct 25, 2004
3,527
253
Interesting: GPUs are evolving into general-purpose processors on a card.

When you realize you need more computational horsepower in your system you'll be able to upgrade your "graphics" card.

Even though this isn't a traditional design, Intel is going to have to perform on the traditional benchmarks to make headway in this market.
 
Comment

iMacmatician

macrumors 601
Jul 20, 2008
4,249
55
Awesome!

Now I'm really interested in seeing how this turns out.

Especially with the nearly-perfect linear scaling and the driver support for future APIs.

Also, the possibility of a Larrabee Mac without an extra CPU could mean a big jump in multi-threaded performance in the Mac Pro.
 
Comment

Apple Ink

macrumors 68000
Mar 7, 2008
1,918
0
Nice for Intel, Apple, and some of us.
But I definitely don't want Intel to prove to be a tough competitor in the graphics market! If it does... it'll be the hardest (and probably the last) blow to AMD/ATI, which is only still standing thanks to its graphics business!

So why bother about AMD? Because Intel has the worst product pricing in the industry, and history is proof of it. With AMD gone it'll be a field day for Intel but a death curse for us..... considering that: sole processor manufacturer + horrible pricing decisions = poorer consumers!
 
Comment

gnasher729

Suspended
Nov 25, 2005
17,980
5,542
That graph is really misleading. Unless near 85% of that program is running in parallel, you're not going to see that kind of speedup. Mr. Amdahl says that normal programs (i.e. 50% concurrency) will cease to see the benefits of multiple cores at around 16 cores.

Amdahl's law (not the law of the poster here who calls himself Amdahl, but Gene Amdahl) is a law for vector processors, not for multi-processor machines, and quoting it in the current context is misleading.

On a vector processor, all the vector capability on a computer was useless and wasted as soon as an application couldn't make use of it. However, if an application cannot make use of multiple cores, then _that_ application will be limited in speed, but the cores that it cannot use are then available to other applications. You can make 100 percent use of an eight core Mac Pro by running eight applications that are each totally incapable of using multiple cores.
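
To make that concrete, here is a rough Python sketch (toy workload, purely illustrative): eight completely serial jobs still keep eight cores busy, and the gain is throughput rather than any single job running faster.

Code:
import os
import time
from concurrent.futures import ProcessPoolExecutor

def serial_job(n: int) -> int:
    """A deliberately single-threaded, CPU-bound task."""
    total = 0
    for i in range(5_000_000):
        total += (i * n) % 7
    return total

if __name__ == "__main__":
    jobs = list(range(1, 9))          # eight independent "applications"
    start = time.perf_counter()
    with ProcessPoolExecutor(max_workers=os.cpu_count()) as pool:
        results = list(pool.map(serial_job, jobs))
    elapsed = time.perf_counter() - start
    # With eight or more cores, wall time is close to the cost of a single
    # job, even though no individual job ever uses more than one core.
    print(f"{len(results)} serial jobs finished in {elapsed:.2f}s")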

Even though this isn't a traditional design, Intel is going to have to perform on the traditional benchmarks to make headway in this market.

Larrabee won't do that. Larrabee is about 32 cores, each core about the same as a ten year old Pentium 2 processor, with a 256 bit vector unit bolted on. It will _not_ perform very well on a traditional benchmark at all. It will absolutely _scream_ at anything written specifically for it.
 
Comment

arn

macrumors god
Staff member
Apr 9, 2001
15,712
4,556
I somehow doubt that Apple will use it. Does Apple use the hardware acceleration for H.264 encoding on current graphics cards?

I somehow feel Apple doesn't want to deviate from a standard technology platform, because that would lead to a mix of systems using Larrabee, others using integrated GPUs, and again others using Nvidia cards. Too complicated and unpredictable.

If your program supports OpenCL, it will support Larrabee, NVIDIA, and ATI.

arn
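
As a rough illustration of arn's point, here is a minimal sketch using the pyopencl bindings (an assumed, illustrative choice; any OpenCL host API works the same way). The host code never names a vendor, so the same kernel runs on whichever OpenCL-capable device the installed driver exposes.

Code:
import numpy as np
import pyopencl as cl

ctx = cl.create_some_context()        # picks whatever OpenCL device is available
queue = cl.CommandQueue(ctx)

a = np.arange(1024, dtype=np.float32)
mf = cl.mem_flags
a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

# Vendor-neutral kernel source: nothing here names a particular GPU.
program = cl.Program(ctx, """
__kernel void square(__global const float *a, __global float *out) {
    int gid = get_global_id(0);
    out[gid] = a[gid] * a[gid];
}
""").build()

program.square(queue, a.shape, None, a_buf, out_buf)
result = np.empty_like(a)
cl.enqueue_copy(queue, result, out_buf)
print(result[:5])   # [ 0.  1.  4.  9. 16.]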
 
Comment

Gasu E.

macrumors 601
Mar 20, 2004
4,681
2,732
Not far from Boston, MA.
That graph is really misleading. Unless near 85% of that program is running in parallel, you're not going to see that kind of speedup. Mr. Amdahl says that normal programs (i.e. 50% concurrency) will cease to see the benefits of multiple cores at around 16 cores.

Not that I am an expert on this, but isn't 3-D rendering nearly 100% parallelizable? I don't know the specifics but I would not be surprised if rendering constituted more than 85% of the computation of certain games-- the graph label does say games.
 
Comment

diamond.g

macrumors 604
Mar 20, 2007
6,933
685
Virginia
That graph is really misleading. Unless near 85% of that program is running in parallel, you're not going to see that kind of speedup. Mr. Amdahl says that normal programs (i.e. 50% concurrency) will cease to see the benefits of multiple cores at around 16 cores.
The graph was indicating graphics performance, not CPU performance. So, when comparing within a given line, it could be very accurate.

Nice for Intel, Apple, and some of us.
But I definitely don't want Intel to prove to be a tough competitor in the graphics market! If it does... it'll be the hardest (and probably the last) blow to AMD/ATI, which is only still standing thanks to its graphics business!

So why bother about AMD? Because Intel has the worst product pricing in the industry, and history is proof of it. With AMD gone it'll be a field day for Intel but a death curse for us..... considering that: sole processor manufacturer + horrible pricing decisions = poorer consumers!
AMD/ATI isn't going anywhere just yet. R700 is proving to be quite a bit nicer than GT200 thus far. Plus, Intel has quite a hill to climb in the GPU market.
 
Comment

iSee

macrumors 68040
Oct 25, 2004
3,527
253
That graph is really misleading. Unless near 85% of that program is running in parallel, you're not going to see that kind of speedup. Mr. Amdahl says that normal programs (i.e. 50% concurrency) will cease to see the benefits of multiple cores at around 16 cores.

I think the point is that this is for tasks that can be highly parallelized (> 85%, for sure).

Game graphics rendering is one area where this can be and is being done today--notice the graph is showing games. Intel is clearly expecting there to be demand for large amounts of parallel processing power in the future. Perhaps video processing. Simulations of various sorts (both for games and science, finance, etc.). Much, much more is possible.
 
Comment

diamond.g

macrumors 604
Mar 20, 2007
6,933
685
Virginia
Larrabee won't do that. Larrabee is about 32 cores, each core about the same as a ten year old Pentium 2 processor, with a 256 bit vector unit bolted on. It will _not_ perform very well on a traditional benchmark at all. It will absolutely _scream_ at anything written specifically for it.

Bolding mine...

Well, some folks over at Beyond3D are having a discussion on what Larrabee is and are pretty sure it isn't just a Pentium core with stuff bolted on. The discussion is still progressing, and a couple of folks who work at Intel are chiming in on the thread.

EDIT: Anandtech has an article up on Larrabee...
 
Comment

Michael73

macrumors 65816
Feb 27, 2007
1,081
39
Does this mean that those of us with upgradeable machines e.g. a Mac Pro could (some day) swap out the nvidia 8800GT for a Larrabee-based card while running Snow Leopard and further the lifespan of our machines?
 
Comment

CWallace

macrumors G3
Aug 17, 2007
8,470
5,012
Seattle, WA
NVIDIA is developing general-purpose CPUs and now Intel is developing advanced GPUs. It stands to reason the two technologies will start to merge in the future to maximize the resources available to the consumer, especially as pricing becomes more and more important.
 
Comment

daneoni

macrumors G4
Mar 24, 2006
10,807
79
Does this mean that those of us with upgradeable machines e.g. a Mac Pro could (some day) swap out the nvidia 8800GT for a Larrabee-based card while running Snow Leopard and further the lifespan of our machines?

I would imagine you'd need a new motherboard or a BIOS update.
 
Comment

Rorikynn

macrumors member
Dec 24, 2007
44
0
Everything hinges on the scale of that y-axis. Yes, Larrabee could possibly scale linearly on rasterized graphics (I'll believe it when I see it), but if having 16 cores gets me 15 fps, then 32 will only get me 30 fps... I'll need at least 64 cores to get near the target 60 fps. And that's assuming perfect linear growth, which the graph doesn't actually show.

Where Larrabee will really shine is with ray tracing based graphics. Developers could use the main CPU to run an algorithm that calculates the scene complexity for every frame and then use that information to optimally split up the frame into different sized buckets so Larrabee can ray trace each bucket in parallel and each bucket will get done approximately at the same time.

If Intel can ray trace Quake Wars at 30 fps on a 16-core, 2.93 GHz Tigerton system, then having 48+ Larrabee cores would be great, even if they are clocked much lower.
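
A toy Python sketch of the bucket idea described above (fixed-size tiles and a placeholder per-pixel function stand in for the adaptive buckets and the actual ray tracer): each bucket renders independently, so a pool of workers can process them in parallel.

Code:
from concurrent.futures import ProcessPoolExecutor

WIDTH, HEIGHT, TILE = 640, 480, 64

def render_tile(tile_origin):
    """Render one bucket; returns (origin, list of pixel values)."""
    x0, y0 = tile_origin
    pixels = []
    for y in range(y0, min(y0 + TILE, HEIGHT)):
        for x in range(x0, min(x0 + TILE, WIDTH)):
            # Placeholder for "cast a ray through pixel (x, y) and shade it".
            pixels.append((x * 255 // WIDTH, y * 255 // HEIGHT, 0))
    return tile_origin, pixels

if __name__ == "__main__":
    tiles = [(x, y) for y in range(0, HEIGHT, TILE)
                    for x in range(0, WIDTH, TILE)]
    with ProcessPoolExecutor() as pool:
        for origin, pixels in pool.map(render_tile, tiles):
            pass  # a real renderer would copy each finished bucket into the framebuffer
    print(f"rendered {len(tiles)} buckets of {TILE}x{TILE} pixels")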
 
Comment

Yvan256

macrumors 603
Jul 5, 2004
5,049
927
Canada
Power required

From what I've read so far, it seems that Larrabee requires 150 to 300 watts. We won't be seeing these things in MacBooks, Mac minis, or even iMacs.

Unless they lower the power requirements by about 90-95%, we won't see Larrabee in anything but Mac Pros.
 
Comment

iSee

macrumors 68040
Oct 25, 2004
3,527
253
Larrabee won't do that. Larrabee is about 32 cores, each core about the same as a ten year old Pentium 2 processor, with a 256 bit vector unit bolted on. It will _not_ perform very well on a traditional benchmark at all. It will absolutely _scream_ at anything written specifically for it.

That's too bad. I think this will have a hard time getting off the ground, then. It's a classic software/hardware chicken-and-egg problem: Developers won't spend the resources to support Larrabee unless a significant number of customers have it, and a significant number of customers won't buy Larrabee unless the software they want to use supports it.

Slipping this into the GPU market is a way to resolve this. By supporting DirectX and OpenGL, Larrabee is instantly supported by tons and tons of games. But for gamers, the question is: should I get (A) a Larrabee card, (B) an ATI card, or (C) an NVIDIA card? For Larrabee to be widely deployed, the answer for many gamers is going to have to be (A). So it's got to compete head-to-head with those guys.

Once widely deployed, it will get attention from developers of all sorts of apps.

If it can't compete with traditional GPUs on existing apps, I think we'll wonder in a few years, "whatever happened to that Larrabee thing Intel was working on?"
 
Comment