Intel's Larrabee Graphics Chip Delayed Indefinitely

MacRumors

macrumors bot
Original poster
Apr 12, 2001
47,602
9,386


Ars Technica reports that Intel's Larrabee graphics chip technology has been delayed indefinitely:
Specifically, Larrabee v1 is so delayed that, at the time it eventually launches, it just won't be competitive as a discrete graphics part, so Intel plans to wring some value out of it by putting it out as a test-bed for doing multicore graphics and supercomputing development. Intel will eventually put out a GPU, but may not be the one we've been calling "Larrabee" for the past few years.
Larrabee had been the codename for a new graphics card technology that would compete head to head with NVidia and ATI. Larrabee had a unique hybrid design that was said to scale incredibly well with multiple cores. Apple's Snow Leopard was well poised to take advantage of this multi-core design and Apple had been rumored to be planning on adopting the chip upon its release:
And I've heard from a source that I trust that Apple will use Larrabee; this makes sense, because Larrabee, as a many-core x86 multiprocessor, can be exploited directly by GrandCentral's cooperative multitasking capabilities.
Obviously, with this development, we're not going to be seeing this technology in Macs anytime soon. In fact, Intel is not even planning to announce details about its follow-up graphics product until 2010.
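The Grand Central angle in the quote above is worth unpacking: the pitch was that a many-core x86 part needs no exotic programming model, because a task queue can fan work out across however many cores the chip happens to have. A minimal sketch of that fan-out pattern, using Python's standard library rather than Grand Central Dispatch itself (the per-tile workload and tile count here are hypothetical stand-ins):

```python
# Illustrative only -- not Grand Central Dispatch. The scaling argument is
# that the scheduler, not the code, decides how tasks map onto cores.
from concurrent.futures import ProcessPoolExecutor

def shade_tile(tile_index):
    """Stand-in for per-tile rendering work (hypothetical workload)."""
    return sum(i * i for i in range(tile_index * 100, (tile_index + 1) * 100))

def render(num_tiles=8):
    # The pool fans tiles out across available cores; the code never
    # names a core count, which is why more cores means more throughput.
    with ProcessPoolExecutor() as pool:
        return list(pool.map(shade_tile, range(num_tiles)))

if __name__ == "__main__":
    print(render())
```

The same queue-and-workers shape is what GCD exposes on OS X, which is why a many-core x86 chip was seen as a natural fit for it.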


Article Link: Intel's Larrabee Graphics Chip Delayed Indefinitely
 

xlii

macrumors 68000
Sep 19, 2006
1,854
64
Millis, Massachusetts
Not good news. Just goes to show that it isn't easy to design a top quality graphics chip. There is a learning curve here. Even for experienced CPU designers.
 

dernhelm

macrumors 68000
May 20, 2002
1,640
117
middle earth
One word: Wow.

I knew they were struggling a bit getting traction under Larrabee, but I didn't quite understand the situation as well as I thought, I guess.

More competition is always better, so it's a loss from that perspective.

It's also odd given Intel's strained relationship with NVidia; I figured part of the reason for that was Larrabee. Apparently not.

I can see it being released in the future on an OpenCL-compatible add-on card. That could be killer stuck in an Xserve or Mac Pro, doing high-end video work. It would also obviate the need for a "mature software stack" in order for the technology to be useful.
 

Data

macrumors 6502
Dec 20, 2006
389
11
So what does this mean for macs and graphics for 2010, with the whole Nvidia not getting licenses and such ?
 

iMacmatician

macrumors 601
Jul 20, 2008
4,249
55
Waiting for info on the "new Larrabee."

I can see it being released in the future on an OpenCL-compatible add-on card. That could be killer stuck in an Xserve or Mac Pro, doing high-end video work. It would also obviate the need for a "mature software stack" in order for the technology to be useful.
I can also see that happening.
 

arn

macrumors god
Staff member
Apr 9, 2001
14,655
2,086
So what does this mean for macs and graphics for 2010, with the whole Nvidia not getting licenses and such ?
The Nvidia thing shouldn't affect much with respect to graphics as that's an issue with the chipset. Apple is still free to use their graphics cards in their computers.

arn
 

borcanm

macrumors regular
Nov 4, 2008
177
0
Wow. This means we're not going to see Larrabee in Macs for a very long time. If Intel is delaying it this long, it's obvious that when it comes out it will lag behind other GPUs.
 

andiwm2003

macrumors 601
Mar 29, 2004
4,340
400
Boston, MA
Yep, you can't just walk in and be better than ATI or NVIDIA. They have years of GPU experience.

Well, I hope that Intel still makes progress toward eventually providing a great integrated solution that can improve on the 9400. Light notebooks and netbooks, and their integrated GPUs, are the future.
 

eastcoastsurfer

macrumors 6502a
Feb 15, 2007
600
27
Not good news. Just goes to show that it isn't easy to design a top quality graphics chip. There is a learning curve here. Even for experienced CPU designers.
We don't know the constraints under which the engineers had to work. Intel is a CPU seller first and foremost. They don't want anything they sell to allow you to use a cheaper CPU (many systems nowadays only need graphics card updates to stay competitive). Anything coming out of Intel will be processor-dependent. See USB vs. FireWire.
 

Speedy2

macrumors 65816
Nov 19, 2008
1,135
203
This could mean that Intel is buying Nvidia now. Strategically, it would be a wise move for both companies.
 

iPhysicist

macrumors 65816
Nov 9, 2009
1,331
969
Dresden
This could mean that Intel is buying Nvidia now. Strategically, it would be a wise move for both companies.
...and a slap for AMD/ATI. I would like to see that happen. If Intel can manage to take advice from nVidia's brains, it could work out so that we have a well-performing integrated graphics chip in 2011 (and a decent chip for you PRO consumers with the real MacBook Pros :rolleyes: )
 

J the Ninja

macrumors 68000
Jul 14, 2008
1,824
0
I'm glad they canceled it, cool as it would've been. It was very quickly turning into another Itanium. I heard tell it ran as hot as the GTX 295 or the 5970, yet could barely keep up with the single GPU versions of those cards. Looking forward to what else the project puts out.
 

Riemann Zeta

macrumors 6502a
Feb 12, 2008
662
0
This certainly isn't surprising. Given that this Larrabee GPU would have consisted of a bunch of simplified (in-order) x86 cores, it would have been utterly incompatible with any existing GPU architectures. Hence, the drivers must have been a bitch to write--and it is unlikely that the first few iterations would have yielded performance anywhere near modern ATi and NVIDIA hardware.
 

HONDAxACURA

macrumors regular
Sep 6, 2008
119
0
California
If Intel can truly make a decent graphics card, that would be great news, but I will still buy a product with an nVidia graphics card inside.

An Intel GPU will be a turn-off to computer people. Stick with nVidia, Apple.
 

MorphingDragon

macrumors 603
Mar 27, 2009
5,160
5
The World Inbetween
This could mean that Intel is buying Nvidia now. Strategically, it would be a wise move for both companies.
No it wouldn't be, nVidia has pushed themselves into irrelevance. The promised Fermi seems like vapourware at this point. They would probably be pouring money into something that wouldn't work out.


 

Eidorian

macrumors Penryn
Mar 23, 2005
29,085
288
Indianapolis
Was anyone expecting a model outside of GPGPU usage?

There's no relation to Intel GMA. I wonder what the IGP core on Sandy Bridge will be like since it is on die now.
 

Len Da Hand

macrumors newbie
Dec 8, 2009
1
0
Get Smart!

With a name like Larrabee, what was Intel thinking!!!

In Get Smart he was even more incompetent than Max - I wonder what Larrabee's agent number was!!!

Sounds like someone at Intel has lost CONTROL and KAOS is taking over. Under a cone of silence perhaps?
 

iMacmatician

macrumors 601
Jul 20, 2008
4,249
55
I'm glad they canceled it, cool as it would've been. It was very quickly turning into another Itanium. I heard tell it ran as hot as the GTX 295 or the 5970, yet could barely keep up with the single GPU versions of those cards. Looking forward to what else the project puts out.
Fudzilla said 300 W some time ago.
 

Speedy2

macrumors 65816
Nov 19, 2008
1,135
203
No it wouldn't be, nVidia has pushed themselves into irrelevance. The promised Fermi seems like vapourware at this point. They would probably be pouring money into something that wouldn't work out.

What the hell are you talking about? Nvidia might be a few months behind but that doesn't mean they are out of the game. Did you say the same thing about ATI when their HD 2000 and 3000 series couldn't even nearly beat Nvidia's competing cards? Look where they are now.

I'm just saying that Intel needs a new strategy if they want to compete in the graphics / GPU computing market. Buying Nvidia would give them a very good CPU platform AND a good GPU. I'm pretty sure Nvidia has a Nehalem chipset running in their labs, but they can't release it without a license. AMD has had the better chipset platform for ages now, which partly made up for the weaker CPU. Time for Intel to do something about it. Intel's chipsets + IGP suck big time compared to ATI + Nvidia. (That's why Apple ditched them.) And don't forget about Tegra; that might be a worthy acquisition for Intel, too.
 

Data

macrumors 6502
Dec 20, 2006
389
11
The Nvidia thing shouldn't affect much with respect to graphics as that's an issue with the chipset. Apple is still free to use their graphics cards in their computers.

arn

Ty for clearing that up for me ;-).
 

MorphingDragon

macrumors 603
Mar 27, 2009
5,160
5
The World Inbetween
What the hell are you talking about? Nvidia might be a few months behind but that doesn't mean they are out of the game. Did you say the same thing about ATI when their HD 2000 and 3000 series couldn't even nearly beat Nvidia's competing cards? Look where they are now.
Nope... See, there WERE actual ATi chips by now, and they were very good value for money. If Fermi EVER comes out and doesn't turn out to be another 2900XTX, it'll be overpriced and probably one huge letdown. See, all we have so far of Fermi is an engineering sample, and that was at nVision. nVidia's chips usually go through the A3 revision before they're released.

I'm just saying that Intel needs a new strategy if they want to compete in the graphics / GPU computing market. Buying Nvidia would give them a very good CPU platform AND a good GPU. I'm pretty sure Nvidia got a Nehalem chipset running in their labs, but they can't release it without a license. AMD has had the better CPU platform for ages now, which partly made up for the weaker CPU. Time for Intel to do something about it. Intel's chipsets + IGP suck big time compared to ATI + Nvidia. (That's why Apple ditched them) And don't forget about Tegra, that might a worthy acquisition for Intel, too.
Intel admitting defeat? Funny, man. Funny.

You do realise that Tegra is just a re-branded ARM chip?
 

macerroneous

macrumors regular
Jan 13, 2008
134
0
USA
Law suit

Maybe Intel sued NVidia to try to get license to use some NVidia IP to make Larrabee work. NVidia didn't bite, evidently.
 

holmesf

macrumors 6502a
Sep 30, 2001
526
24
What the hell are you talking about? Nvidia might be a few months behind but that doesn't mean they are out of the game. Did you say the same thing about ATI when their HD 2000 and 3000 series couldn't even nearly beat Nvidia's competing cards? Look where they are now.
Seriously, and outside of gaming it's ATI that is becoming less relevant. Nvidia offers some damn nice high-performance computing solutions (e.g. Tesla), and Fermi (when it materializes) will cement this advantage over ATI. OpenCL was also based on Nvidia's CUDA, as opposed to ATI's Stream, and in general CUDA provides a great cross-platform programming environment. For these reasons Nvidia is gaining rapid adoption in the scientific computing community and academia -- perhaps not the most important markets, but it's certainly winning the mindshare of some super elite nerds.
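The CUDA/OpenCL kinship mentioned above is easy to see at the kernel level: both express data-parallel work as a scalar body run once per element, and mostly differ in how that element's index is obtained. A hedged sketch, with plain Python standing in for the device (SAXPY is a hypothetical workload chosen for illustration, not something from the thread):

```python
# Why OpenCL reads as "based on" CUDA: the per-element kernel body is
# the same in both APIs, and only the index lookup differs --
#   CUDA:    i = blockIdx.x * blockDim.x + threadIdx.x
#   OpenCL:  i = get_global_id(0)
# Here a Python loop plays the role of the GPU handing out indices to
# threads/work-items.

def saxpy(a, x, y):
    """Per-element body y[i] = a * x[i] + y[i], identical in CUDA and OpenCL."""
    return [a * xi + yi for xi, yi in zip(x, y)]

if __name__ == "__main__":
    print(saxpy(2.0, [1, 2, 3, 4], [10, 20, 30, 40]))  # [12.0, 24.0, 36.0, 48.0]
```

The shared per-element model is also why code written against one API tends to port to the other with mostly mechanical changes.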
 