No it will not. Larrabee is a discrete GPU that is separate from Intel's IGPs. IGPs are expected to exist at least through 2010.

The Cell is well suited for only a selection of tasks, and not general purpose tasks like a CPU. I believe Larrabee is much more powerful than Cell too.

Hmm, AnandTech's article seems to imply that Larrabee is also an in-order CPU, much like Cell. If that's the case, then it will be about as useful at general-purpose code as Cell is, with respect to branch prediction and loop unrolling. Either way you look at it, Intel is following IBM in its own odd way. I would bet that a 24-SPU Cell BE would hang with a 24-core Larrabee.
 
No it will not. Larrabee is a discrete GPU that is separate from Intel's IGPs. IGPs are expected to exist at least through 2010.
You mean just because the initial implementations of Larrabee will be as a discrete accelerator card, it will never appear as an IGP? Obviously, Havendale and Auburndale will be integrating a GPU based on the GMA series, but by the time Westmere or Sandy Bridge rolls around in 2010, it seems likely that Larrabee will replace the current GMA IGPs integrated into CPUs.
 
Each core consists of a really primitive original-Pentium-class processor (which apparently has the best performance per watt when built on a modern process), plus SSE through SSE4 and 64-bit extensions to be software-compatible with the latest CPUs, PLUS at least 256-bit vector units for the massive throughput a GPU needs.
The vector units are apparently 512-bit.
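For a rough sense of what that width means, here's a back-of-the-envelope sketch in C. The 32-core count and the fused multiply-add assumption are taken from the early coverage, so treat the numbers as guesses, not spec:

```c
/* Back-of-the-envelope only: what a 512-bit vector unit implies.
   Core count (32) and one fused multiply-add per lane per cycle are
   assumptions from early articles, not confirmed specs. */
#include <stdio.h>

int main(void) {
    const int vector_bits = 512;
    const int float_bits = 32;
    const int lanes = vector_bits / float_bits;  /* 16 single-precision lanes */
    const int cores = 32;

    printf("floats per vector op: %d\n", lanes);
    /* an FMA counts as 2 floating-point ops per lane */
    printf("peak FLOPs per cycle across the chip: %d\n", cores * lanes * 2);
    return 0;
}
```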

You mean just because the initial implementations of Larrabee will be as a discrete accelerator card, it will never appear as an IGP? Obviously, Havendale and Auburndale will be integrating a GPU based on the GMA series, but by the time Westmere or Sandy Bridge rolls around in 2010, it seems likely that Larrabee will replace the current GMA IGPs integrated into CPUs.
Yeah, IGPs will exist through 2010. Beyond that we don't know, and we can't necessarily assume Larrabee will replace them.
 
From what I've read so far, it seems that Larrabee requires 150 to 300 watts. We won't be seeing these things in MacBooks, Mac minis or even iMacs.

Unless they lower the power requirements by about 90-95% we won't see Larrabee in anything but Mac Pros.

One of the nice things about it is that it is really just a bunch of light cores. You could theoretically cut it down to 1 or 2 cores for a low powered system, and while it wouldn't be particularly fast it would still perform basic tasks. Whether this is practical would depend on the power/performance ratio with traditional graphics solutions, of course.
 
One of the nice things about it is that it is really just a bunch of light cores. You could theoretically cut it down to 1 or 2 cores for a low powered system, and while it wouldn't be particularly fast it would still perform basic tasks. Whether this is practical would depend on the power/performance ratio with traditional graphics solutions, of course.
Well, it has been reported that Larrabee works in multiples of 8 cores, and the 150 W+ figure is probably for the 32+ core variants. But Larrabee is still a year or two away, and the 45 nm process continues to evolve. Plus, AnandTech reports that the first chips haven't even come out of the fab yet, so it's hard to draw conclusions about power usage.
 
Hmm, AnandTech's article seems to imply that Larrabee is also an in-order CPU, much like Cell. If that's the case, then it will be about as useful at general-purpose code as Cell is, with respect to branch prediction and loop unrolling.

It is in order, but not like Cell. Each core runs four in-order threads, so if one thread stalls because of branch misprediction, latency and so on, the core just executes instructions from the other threads. You don't need heavily optimised code like you do with the Cell processor, just code that isn't horribly bad. Even if each thread executes dependent instructions with four cycles of latency, four threads together can use one hundred percent of what the core can deliver.
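To make that arithmetic concrete, here's a toy model in C. It is not Larrabee's real scheduler, just the latency-hiding idea: four round-robin threads, each with a four-cycle dependency between instructions, still keep an in-order core issuing every cycle:

```c
/* Toy model of the latency-hiding argument above: an in-order core
   round-robins between 4 hardware threads. Even if every thread's next
   instruction depends on a result with 4 cycles of latency, the core
   still issues one instruction per cycle. Purely illustrative. */
#include <stdio.h>

#define THREADS 4
#define LATENCY 4

int main(void) {
    int ready_at[THREADS] = {0};   /* cycle when each thread can issue again */
    int issued = 0;
    const int cycles = 100;

    for (int cycle = 0; cycle < cycles; cycle++) {
        int t = cycle % THREADS;           /* round-robin thread select */
        if (cycle >= ready_at[t]) {
            issued++;
            ready_at[t] = cycle + LATENCY; /* dependent result ready later */
        }
    }
    printf("utilization: %d%%\n", issued * 100 / cycles);  /* prints 100% */
    return 0;
}
```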
 
The Cell is well suited for only a selection of tasks, and not general purpose tasks like a CPU. I believe Larrabee is much more powerful than Cell too.

If you actually read the AnandTech article, it seems as if Larrabee will excel at applications specifically optimized for it (same as Cell). As a general-purpose processor it will undoubtedly also be very fast for its time, but nothing really out of this world. The reality is that Intel is following IBM's lead.

And I would expect Larrabee to be faster than the original Cell processor (released in 2005), but the problem for Intel is that IBM has a Cell2 processor on the horizon at about the same time Larrabee will be released. From early indications, the next-generation Cell will have a multicore PPE and up to 32 SPEs, which on paper looks like more than enough to battle Larrabee. And you can't really ignore that Cell was built from the ground up on a modern architecture, whereas Larrabee still has to follow x86 standards, which clutters the processor design for the sake of compatibility with older software and technology. I guess that's an advantage on one side, but it also hinders future development and real potential.

Anyway, my point was that Apple decided to go mainstream and switch to Intel, and now it's showing that Intel isn't really ahead (a leader) in processor technology, far from it.
 
No it will not. Larrabee is a discrete GPU that is separate from Intel's IGPs. IGPs are expected to exist at least through 2010.

The Cell is well suited for only a selection of tasks, and not general purpose tasks like a CPU. I believe Larrabee is much more powerful than Cell too.

Each SPU in Cell is probably reasonably close to the per-core vector unit in Larrabee, however. Maybe Larrabee's is twice as wide overall and has better double-precision support, and Larrabee will have ~32 cores instead of Cell's 8, but that's because it's coming out 4 years later...

However, Larrabee also has 32 in-order CPUs that are 4-way multithreaded, which is very much like Sun's Niagara CPUs. Cell only has one in-order CPU that is 2-way multithreaded; it is most likely a lot more powerful than the old Pentium-derived core in Larrabee, but not enough to make up for there being only one of it. Again, this is because Cell came out in 2006 and Larrabee will be lucky to make it out by late 2009.

In addition due to the design decisions for Larrabee, it too "is well suited for only a selection of tasks".

The biggest issue regarding Larrabee isn't the hardware, which will probably be adequate, and Intel will use their might to push it. It is the drivers. Intel have a terrible reputation for graphics drivers, and Larrabee is a whole new ballgame that demands a lot more than the DirectX and OpenGL support they can barely get right now.
 
As stated, the big thing about Larrabee will be drivers. The compiler has to be damn good to support this stuff, since it's essentially as general-purpose as a GPGPU can get. That, and performance: no matter how cool it looks, if it can't live up to the performance that NVIDIA and ATI deliver, it won't fly with general consumers.

And that's the big uphill battle Intel will have to face: both NVIDIA and ATI have immense experience in GPUs, and their staffs are primarily software engineers who write drivers. Keep in mind Larrabee is due for 2009-2010, so it'll be a while before it's out. NVIDIA and ATI will certainly have new stuff out by then as well, and this will appear in discrete GPUs first, not IGPs.

As far as AMD-ATI goes, AMD still has an x86 license and has long been talking about combining CPU cores with GPU cores on a single die. I have no doubt that AMD-ATI has thoughts of Larrabee-like projects, since they could do essentially the same thing: string a bunch of old Athlon cores together, use ATI's know-how in vector processing and so on, and get the same functionality. It actually appears that NVIDIA is the odd man out in this situation, since they have no x86 license and have relied on brute-strength computing power in GPUs recently.
 
That's too bad. I think this will have a hard time getting off the ground, then. It's a classic software/hardware chicken-and-egg problem: Developers won't spend the resources to support Larrabee unless a significant number of customers have it, and a significant number of customers won't buy Larrabee unless the software they want to use supports it.

Look for Intel to push for Larrabee in the next Xbox / PS4. Once that is done, it's golden.
 
Look for Intel to push for Larrabee in the next Xbox / PS4. Once that is done, it's golden.

<strike>not likely</strike> no way in hell. <strike>why would a game console need intel?</strike> that's like putting hemi engines in porsches. it doesn't fit.
 
<strike>not likely</strike> no way in hell. <strike>why would a game console need intel?</strike> that's like putting hemi engines in porsches. it doesn't fit.

Would you mind elaborating a bit, or posting links so I may better inform myself?

edit:

Here's what a quick Google search provided... these seem to expect Larrabee to perform well in consoles:

http://www.ubergizmo.com/15/archive...t_could_be_a_dream_console_processor-low.html

http://seek.cc/news/gadgets/larrabee-that-could-be-a-dream-console-processor/

http://www.pcreview.co.uk/forums/thread-3253819.php

http://www.gamesetwatch.com/2008/03/trends_intels_larrabee_to_comb.php
 


a porsche is a performance car built inside and out to deliver a unique driving experience.

http://www.porsche.com/usa/models/cayman/cayman-s/gallery/?gtabindex=5

game consoles should be similarly devoted. porsches are such that curvy roads are actually more fun to drive on than straightaways. if, for example, life threw curves at you, a porsche would be more than grateful to take them on. it would be better for the porsche, of course, or any other car for that matter, if life were a simple straightaway; then the porsche could just go as fast as it possibly can and get to its destination faster. but, as we know, life is complex. and what i like about game consoles is that they're kind of like complex little machines, much like life. if you put an intel in there, then game consoles would lose that complexity. they wouldn't be as exotic as game consoles are now and were in the past. do you know what i mean? the game industry doesn't need to get on intel's "roadmap." it needs, by any means possible (for diversity's sake), to stay off it. even if it means not having an "intel inside" sticker.
 
Very nice... my powermac g5 is getting old :rolleyes:

So is mine! I hope they make this available as a PCI-Express card you can add to Mac Pro models down the line, alongside an ATI or NVIDIA GPU.
 
Lots of Porsche stuff... and what i like about game consoles is that they're kind of like complex little machines, much like life. if you put an intel in there, then game consoles would lose that complexity. they wouldn't be as exotic as game consoles are now and were in the past. do you know what i mean? the game industry doesn't need to get on intel's "roadmap." it needs, by any means possible (for diversity's sake), to stay off it. even if it means not having an "intel inside" sticker.

So it appears your argument is that consoles shouldn't use intel because you do not consider intel to be "exotic" enough? Do you share the same concern for Apple's switch to intel?

Your original statement seemed to imply that Larrabee, for some technical / hardware-related reason, would not be able to perform meaningfully in a console..

Is that the case or would it just not seem as "special" if a console was to have intel inside?
 
Inefficient GPUs

My 8800 amounts to several hundred million wasted transistors when I am not running a 3D game or possibly encoding video.

Larrabee (and CUDA, to a lesser extent) is meant to move the GPU into a position where it can be used for other things. IMO, Larrabee gets the nod because, as a fairly normal x86 part, it will be much easier for coders to integrate into all sorts of things.

Combine Grand Central technology with this and you get massively parallel Mac Pros: 8-16 normal CPU cores plus 32 cores (128 hardware threads) per Larrabee expansion board installed. 2011 will be a fun year.
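As a hedged sketch of that combination: Grand Central hasn't shipped yet, so the dispatch_apply interface below is an assumption about how Apple's announced technology might look. The idea is that the runtime, not the app, decides how many cores (CPU or accelerator) to throw at a data-parallel loop:

```c
/* Hedged sketch: farming a data-parallel job out via a Grand
   Central-style API. dispatch_apply and the global queue are
   assumptions about the announced (unshipped) technology. */
#include <dispatch/dispatch.h>
#include <stddef.h>

void scale_pixels(float *pixels, size_t count, float gain) {
    dispatch_queue_t q =
        dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);

    const size_t chunk = 4096;                 /* work unit per block */
    size_t chunks = (count + chunk - 1) / chunk;

    /* The runtime decides how many cores run these blocks in parallel. */
    dispatch_apply(chunks, q, ^(size_t c) {
        size_t end = (c + 1) * chunk;
        if (end > count) end = count;
        for (size_t i = c * chunk; i < end; i++)
            pixels[i] *= gain;
    });
}
```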
 
From what I've read so far, Larrabee will only be able to compete with today's GPUs, and at a much higher power consumption, so by the time it gets out it will already be outdated. My guess is that the first-generation Larrabee will probably
1) not break even (but Intel most probably doesn't plan for it to), and
2) end up in graphics/video workstations. There it will compete VERY well with NVIDIA's sly boxes in terms of flops per dollar. Most important: it will NOT be such a big deal to use it for software that is already highly parallel, like any renderer for example (see the sketch below). NVIDIA's solution so far demands large adaptations of the software because it's not x86 or even RISC.
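Here's the sketch referred to above: an already-parallel renderer loop in ordinary C with pthreads. Everything in it (shade_row, the 32-way split) is hypothetical; the point is that code like this recompiles unchanged for an x86 accelerator, whereas a CUDA port means rewriting the inner loop as kernels.

```c
/* Hypothetical already-parallel renderer: plain C + pthreads, the kind
   of code that would recompile as-is for an x86 part like Larrabee. */
#include <pthread.h>
#include <stdio.h>

#define ROWS 1080
#define NTHREADS 32            /* say, one per accelerator core */

static float image[ROWS];      /* stand-in for a real framebuffer */

static void shade_row(int row) {
    image[row] = row * 0.5f;   /* stand-in for real per-row shading */
}

static void *worker(void *arg) {
    int id = (int)(long)arg;
    for (int row = id; row < ROWS; row += NTHREADS)
        shade_row(row);        /* same C code as on any x86 CPU */
    return NULL;
}

int main(void) {
    pthread_t t[NTHREADS];
    for (long i = 0; i < NTHREADS; i++)
        pthread_create(&t[i], NULL, worker, (void *)i);
    for (int i = 0; i < NTHREADS; i++)
        pthread_join(t[i], NULL);
    printf("rendered %d rows\n", ROWS);
    return 0;
}
```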

I also don't quite see the competition between Larrabee and Cell, because IBM is mostly targeting the server market, while Intel seems to be heading for the consumer market.
 
Well, Larrabee may not have to compete with NVIDIA or AMD GPUs graphically. As long as it can offer decent gaming performance, the fact that it can be used for pretty much everything else on the computer as well makes it far more useful (overall) than its competitors. Like the poster above said, unless you're playing games or decoding video, current GPUs are essentially useless. Larrabee has the potential to speed up pretty much every app you have. Combined with Grand Central and OpenCL, Larrabee will be amazing.

As for IGPs, why would Intel continue pumping R&D into non-Larrabee-based ones? They will sink all their R&D into Larrabee, which is scalable from low end (e.g. 8 cores, lower clock) to high end. IGPs will simply be lower-speed, fewer-core Larrabees, which means Intel won't be wasting resources developing two graphics systems and two sets of drivers.

Larrabee will be a revolution. It isn't a GPU, although it can perform the functions of one. Those simply saying it can't compete with NVIDIA/AMD cards are being very short-sighted. The fact that Apple seems to be developing OpenCL and Grand Central specifically for such an application means we'll see this in future Macs.
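For what that might look like in practice, here's a minimal OpenCL-style kernel. The spec isn't final yet, so this follows the C-like kernel language that's been described publicly; the same source is meant to run on a GPU, a multicore CPU, or a Larrabee-style part.

```c
/* Minimal OpenCL-style kernel (spec not final; syntax per public
   descriptions). Each work-item computes one element of y = a*x + y,
   and the runtime maps work-items onto whatever cores are available. */
__kernel void saxpy(__global float *y,
                    __global const float *x,
                    const float a)
{
    size_t i = get_global_id(0);   /* this work-item's element index */
    y[i] = a * x[i] + y[i];
}
```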
 
The reason why is that Intel won't let anyone else build their stuff, so you can't take it to another fab for a cheaper price. MS found this out the hard way with the first Xbox. Notice how the 360 uses IP that they can take to anyone to fab.

If Intel allows the game console companies to take the IP and find another fab to create them, then maybe we could see their stuff in consoles. But I wouldn't hold my breath.
 
Well, Larrabee may not have to compete with NVIDIA or AMD GPUs graphically. As long as it can offer decent gaming performance, the fact that it can be used for pretty much everything else on the computer as well makes it far more useful (overall) than its competitors.
..
Those simply saying it can't compete with NVIDIA/AMD cards are being very short-sighted.
I'm not sure whether you're referring to me, but anyway: I think you're a little too optimistic there. The driving market for highly parallel computing in consumer PCs is clearly the gaming market, and gamers simply won't buy an expansion card that needs much more cooling and a new power supply when it's also significantly slower than current GPUs. Most gamers aren't that interested in HD encoding, rendering, or any other application that would really gain something from Larrabee. They only want one thing: to play. And I also don't see what you mean by 'everything else'. Tell me one killer app out there in the consumer market where quad-core CPUs are still too slow. The only thing that comes to mind is realtime HD encoding, which would extend the functionality of media-center PCs. Only problem: people don't want loud media-center PCs, so they would need water cooling and at least a midi-tower-sized case for this. Not very handy.
So, as I said, the big market I can see for it is graphics/video workstations. For Apple, with Grand Central, it sure seems suitable for a new generation of Mac Pros. This could get Apple right back to the top of that market.
 
So it appears your argument is that consoles shouldn't use intel because you do not consider intel to be "exotic" enough? Do you share the same concern for Apple's switch to intel?

Your original statement seemed to imply that Larrabee, for some technical / hardware-related reason, would not be able to perform meaningfully in a console..

Is that the case or would it just not seem as "special" if a console was to have intel inside?

i don't share the same concern w/ apple switching to intel, although i was wary at the time, since i thought the powerpc architecture was just fine and stable running os x and the programs i needed it to run (fce, photoshop). my aluminum imac has had more kernel panics (3 or 4) since i got it in november of 2007 than my 12" powerbook has had (1 panic) since i started using it in 2004. i'm no techie, though, and won't try to convince anyone which chip should go in which systems. i don't care. what i care about is real-world performance, optimization, stability, all that good stuff, instead of the number of cores or gigahertz.

but i do think that the gaming market, and consoles in particular, are in a unique position compared to the pc (while still benefiting from pc technology), in that consoles have a longer expected lifespan than a pc and more specific needs, so game and hardware developers should really be able to squeeze the performance out of each console. but i think people are already beginning to learn that software needs to catch up to hardware. in game console history, it used to be the reverse: you saw hardware add-ons, for example, to play certain games. i think intel is just trying to shop-talk this concern now, although i don't think it has the solution, since the solution already exists - powerpc or cell, which i hear is better at graphics stuff. am i correct to make this assumption?
 
The vector units are apparently 512-bit.

Yeah, IGPs will exist through 2010. Beyond that we don't know, and we can't necessarily assume Larrabee will replace them.

Someone at Beyond3D claimed that the X4500 is the last in that line of IGPs. Moving forward, the IGP will be based on Larrabee.
 
i don't share the same concern w/ apple switching to intel, although i was wary at the time, since i thought the powerpc architecture was just fine and stable running os x and the programs i needed it to run (fce, photoshop). my aluminum imac has had more kernel panics (3 or 4) since i got it in november of 2007 than my 12" powerbook has had (1 panic) since i started using it in 2004. i'm no techie, though, and won't try to convince anyone which chip should go in which systems. i don't care. what i care about is real-world performance, optimization, stability, all that good stuff, instead of the number of cores or gigahertz.

but i do think that the gaming market, and consoles in particular, are in a unique position compared to the pc (while still benefiting from pc technology), in that consoles have a longer expected lifespan than a pc and more specific needs, so game and hardware developers should really be able to squeeze the performance out of each console. but i think people are already beginning to learn that software needs to catch up to hardware. in game console history, it used to be the reverse: you saw hardware add-ons, for example, to play certain games. i think intel is just trying to shop-talk this concern now, although i don't think it has the solution, since the solution already exists - powerpc or cell, which i hear is better at graphics stuff. am i correct to make this assumption?

I believe I understand what you mean now, though I'd more likely blame Apple or random software devs for the kernel panics on the iMac than Intel..

You make good points with the console argument, but they seem to ignore what Intel is trying to get across (and what has been stated throughout this thread): that optimizing specifically for Larrabee is where it shows massive benefits over a current GPU.

One thing with recent consoles (which unfortunately have been more and more like desktop computers lately) is that as the console ages and the developers become more familiar with its nuances, they produce much more beautiful games. This is what I find interesting about Intel's claims with Larrabee and where I believe you're finding fault.

I looked at it from the point of view of developers being able to learn Larrabee inside and out, so that they may fully utilize the many-core aspect, as well as give it the inroads it would need for uniform acceptance among platforms, and yadda yadda idealism..

I believe you were looking at it from the idea that once developers learn it, there is nothing more they can do... until Intel churns out a card with more cores. But isn't that the case already? And this would also have the ability to make old games look new again, as the extra cores would beef up the graphics of games from the previous generation (to an extent)..

I'm not trying to make a strawman, so please correct me if I have, but I understand th(e/is) argument, and I believe it has interesting aspects on both sides.
 