
View Full Version : Intel refocuses on integrated graphics




jhvander
May 26, 2010, 05:08 AM
AnandTech reports on Intel's graphics statement posted yesterday, which notes that there will be no Intel discrete GPU (duh) in the short term and that Intel is focusing with "laser like precision" on integrated graphics.

http://www.anandtech.com/show/3738/intel-kills-larrabee-gpu-will-not-bring-a-discrete-graphics-product-to-market

Not too much here, except that it indicates that Intel will be sticking to their own integrated graphics strategy for the long haul.

As the author notes, it also validates AMD's approach with ATI.

The article says to expect a 2x improvement over current Intel graphics for the early 2011 chips, and another 2x for the following chips.

Just food for thought.



Jayomat
May 26, 2010, 05:50 AM
really not too surprising..

iMacmatician
May 26, 2010, 06:31 AM
They are also refocusing Larrabee (or whatever it turns out to be) on HPC. AFAIK Larrabee was always better suited to HPC than to graphics.

Intel's new strategy for integrated graphics looks promising, at least compared to what they have now and in the past.

iSpoody 1243
May 27, 2010, 05:36 AM
why must intel shove their rubbish integrated gpus into their processors?
so every intel mobile cpu will have a built-in gpu?

iMacmatician
May 27, 2010, 07:41 AM
From 2011, every CPU except high-end desktop and most server/workstation CPUs.

iSpoody 1243
May 27, 2010, 08:13 AM
omg.....
intel own the processor market place and are using their monopoly to force their mediocre gpus into every machine.

jdechko
May 28, 2010, 06:16 PM
omg.....
intel own the processor market place and are using their monopoly to force their mediocre gpus into every machine.

Pretty much, yeah.

On the other hand, I dare say that the majority of PCs are being shipped with crappy Intel graphics anyway, and that most people don't or won't care. For day-to-day computing, I don't even care about it. My workstation at the office has Intel graphics, but my home desktop has a 7300GS in it. Both machines function fairly well for how they're used. That's the beauty of having switchable graphics: battery life when needed and performance when needed. Look for hybrid graphics systems like Optimus to become increasingly common.

I'd also guess that many Mac owners don't care what graphics cards are used in their computers either. I'd bet that upwards of 80% of what a typical Mac user (i.e. a non-professional) does can be accomplished just fine using Intel graphics.

applesupergeek
May 29, 2010, 07:42 AM
How does he know we should expect 2x graphics performance from intel in early 2011? Does he take intel's word for it? (Because I think he's just taking intel's money to say that, since no concrete evidence is there.)

Even if that is the case, seeing as intel is 2x worse (and more) than current igfx from nvidia and ati, 2x in a year will mean that they'll still be 2x behind whatever nvidia and ati have on the horizon. Even if ati and nvidia somehow stood completely still, then at best, by the most optimistic projection (i.e. press office claims from intel), they would only marginally come close to them.

And how about some news on when intel is going to support one of the cornerstones of Snow Leopard (and future computing for that matter), OpenCL? There's a whole area of the cpu being assisted by the gpu with OpenCL that intel isn't supporting. So this means a double hit, because not only are the igfx bad, they can't even serve as adjuncts to the cpu in applications that might demand this.

Intel has royally screwed low-thermal machines such as the Air with their igfx and their monopoly tactics. There was absolutely no reason why they should stick their current igfx there; it was too early, and they knew they had the worst tech by far of anyone, yet they went along and did it anyway.

AMD will have a superior product with Llano in 2011, or at least a far more well-rounded and future-proof one. No more sticking crap igfx next to the cpu on the die; this is the real deal: four x86 cpu cores with a fully integrated gpu. If the 25W TDP AMD quotes is right, then this will be ideal for the Air, which has (I might be off a little here) about a 35W TDP budget for cpu and gpu. This package will offer four cpu cores and a gpu that blows everything intel will have then out of the water. 25W for an excellent gpu, and four cores, for the Air, eh? Not bad at all.

And then apple will inform intel to eff off.

The Llano part will be the notebook-market APU (accelerated processing unit) that combines four x86 processor cores with a Redwood-class (~400 stream processors) DirectX 11 capable performance GPU, all on the same monolithic die. Both functional units will access the same DDR3 memory controller (nothing specific on number of channels) and each of the four cores will have a 1MB L2 cache to itself but these APUs will not have any L3 cache. Clock speeds are expected to reach the 3.0+ GHz mark. AMD claims Llano will have a maximum TDP of 25-50W (remember this is a mobility processor) and will operate at voltages ranging from 0.8v to 1.3v. AMD says that the entire chip will have drastically improved power management features and clock gating to improve power efficiency.

http://www.pcper.com/images/news/roadmap.jpg

Have a look here too for a far more detailed analysis/discussion:

http://www.bit-tech.net/news/hardware/2010/05/15/amd-fusion-cpu-gpu-will-ship-this-year/1

http://www.xbitlabs.com/news/cpu/display/20100512150105_Second_Iteration_of_AMD_Fusion_Chips_Due_in_2015_AMD.html

iSpoody 1243
May 29, 2010, 07:56 AM
intel are good at processors, not gpus
i personally think they should leave gpus to the experts (nvidia, ati)
i just think it's terrible how they are abusing their market position for their own advantage.
their gpus are not competitive at all
ati/nvidia low power range gpus beat them completely in every aspect.
imo intel should leave their gpus separate from their cpus.
and shouldn't force their gpu into every system.
well i guess being fair doesn't earn monies!

applesupergeek
May 29, 2010, 08:14 AM
It was a bad (and greedy) decision on their part, and an immensely stupid one for that matter, that's going to come and bite them on the ass real soon.

They could have decided, since they don't have the right tech at the moment, to leave their really great cpus as they are and allow both ati and nvidia to offer igfx options; god knows how strapped for cash those two are, and they'd gladly offer the best they could.

That would have satisfied their partners (apple, nvidia), and given themselves a good two-year window of opportunity to develop at least some half-decent graphics for future integration. At the same time, they would be beating amd because they'd have a better cpu, with the option to be fitted with a choice of igfx, maybe at only a slight hit on tdp.

So amd's fusion concept apu would be irrelevant when one could get igfx of one's choice and better cpus from intel. And then by late 2012 or 2013 they could throw in their own take on the apu, and be done with amd for good (if amd hadn't already dissolved).

But instead, they piss off good customers such as apple and force them to come up with all sorts of workarounds and product delays to cater for those crap igfx and the lack of OpenCL.

And now by 2011 amd will be offering a solution that might be, say, about 15% worse in terms of raw cpu power or battery (Phenom cores will go in the first-gen apus) but will have igfx that dwarf anything intel has in their own igfx. Intel's current greed has revived the competition!!!

And guess which company has the better platform and doesn't give a rat's ass about supposed spec sheets, when they can get a deal with a cpu manufacturer, able and willing to go into customisations, able and willing to support open cl, able and willing to offer exclusivity: that's right, it's APPLE.

And that's why they are giving serious consideration to amd. When their ecosystem is in full swing in 6 months with the iPad, the iPhone and iPods, and lots more people want to translate their great experiences on those devices into a desktop or laptop equivalent, not many of them are going to care (since they don't know or care about the cpus and gpus in the iDevices) whether apple has amd or intel cores. And guess who young (or old) gamers are going to choose in terms of gpu performance on a computer.

A macbook pro with an amd apu in a year say, and (because of the volumes apple will buy from them) a top spec discrete ati gpu too, will be absolutely smoking for gamers, creative professionals etc. etc. The fact that intel will have even (at most) a 20% edge in terms of raw cpu power, or power management (battery life) will be irrelevant.

NintendoFan
May 29, 2010, 08:23 AM
Not to divert the topic, but I think if Apple is indeed looking at AMD for CPUs, I would have to imagine their goal is for a single chip solution, like one found in the iPhone/iPod/iPad. They would ideally want a SoC for the X86 world.

Scottsdale
May 29, 2010, 08:25 AM
How does he know we should expect 2x graphics performance from intel in early 2011, does he take intel's word for it?

Well, actually I believe Intel on it because what they're doing is using the 22nm process on both the CPU and GMA DIEs. Since they use less energy, the clock speed can be "revved up." So if the GMA DIEs clock speed is double, one would expect that's "double" the performance.

I don't believe much of anything Intel says when it comes to graphics. However, if we look at the basics of the GMA DIE, it's currently a 45nm DIE while the CPU DIE is 32nm. If both the CPU DIE and the GMA DIE are 22nm and integrated into one DIE, using less energy and allowing a greater clock speed at the same TDP, then 2x is probable. But there's no real hope for the Intel GMA to magically do better than the current 45nm IGP would if it were overclocked to run at the speed the 22nm on-DIE GMA will run at.
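To put that clock-scaling argument in rough numbers (a naive sketch; the clocks below are made-up placeholders, not Intel specs):

```python
# Naive back-of-envelope: if graphics performance scaled linearly with
# clock, a die shrink that lets the GMA clock double at the same TDP
# would double throughput. All numbers below are illustrative only.

def scaled_performance(base_perf, base_clock_mhz, new_clock_mhz):
    """Linear-with-clock scaling model (ignores memory bandwidth etc.)."""
    return base_perf * (new_clock_mhz / base_clock_mhz)

# Hypothetical GMA at 500 MHz with relative performance 1.0, shrunk to a
# process that sustains 1000 MHz in the same power envelope:
doubled = scaled_performance(1.0, 500, 1000)
print(doubled)  # 2.0, i.e. the claimed "2x", under this naive model
```

Of course real graphics workloads also depend on memory bandwidth and architecture, which is exactly why the linear model is optimistic.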

So it's a matter of how the GMA gets better than it is on the current Arrandale CPUs. I think we can look at the current GMA and see that it's not even half as capable as the Nvidia 9400m in the current MBA. No matter what, Intel SUCKS for graphics. In addition, the 320m is an 80% boost over the 9400m. So we're not going to be well off in an MBA with an Intel GMA for graphics now or in Sandy Bridge Core i7 CPUs. I certainly would prefer Apple figure out some way around the Intel GMA. I don't know if Apple could fit a discrete GPU in the MBA even if it weren't a matter of cost. We can certainly reason that a 7W ATI 5430 GPU along with an 18W Core i7-6x0UM would fit the total TDP limits of the MBA.
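Spelling out that TDP budget check (the wattages are the figures quoted in this thread, not confirmed specs):

```python
# Quick TDP budget check for the hypothetical MBA pairing above.
# Wattages are the poster's figures, not confirmed Apple/Intel specs.

ATI_5430_TDP_W = 7        # discrete GPU
CORE_I7_6X0UM_TDP_W = 18  # ULV CPU
MBA_BUDGET_W = 35         # rough CPU+GPU envelope assumed in-thread

total = ATI_5430_TDP_W + CORE_I7_6X0UM_TDP_W
print(total, total <= MBA_BUDGET_W)  # 25 True
```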

At the end of the day, I don't want Intel's worthless GMA in the MBA. However, if it's the only way to get a current, relevant, capable CPU in the MBA, I certainly hope it comes with Sandy Bridge and a 22nm CPU/GMA and not the current Arrandale CPUs. I believe Apple would probably go to an AMD/ATI solution first. I really don't believe Apple feels it can go forward with Intel's GMA, at least not right now. SJ even personally came out and bashed the GMA and said 13" MBP customers were much better off with an 80% GPU boost and 10% CPU boost rather than a Core i3/i5 with only the Intel GMA for graphics. It certainly makes it seem like Apple was trying to convey that the GMA is crap and a discrete GPU wouldn't fit. Although it could be that cost limitations were part of the cause... along with a worthless, ancient dinosaur of a "SuperDrive" taking up usable space that could give a discrete GPU the room it needs in the 13" MBP.

applesupergeek
May 29, 2010, 08:33 AM
Well, actually I believe Intel on it because what they're doing is using the 22nm process on both the CPU and GMA DIEs. Since they use less energy, the clock speed can be "revved up." So if the GMA DIEs clock speed is double, one would expect that's "double" the performance.
For sure, I agree with that, but doubling "performance" is also subjective, and that's why intel are touting how they are working on HD video for their igfx, because they aren't working on OpenCL. ;)


SJ even personally came out and bashed the GMA and said 13" MBP customers were much better off with an 80% GPU boost and 10% CPU boost rather than a Core i3/i5 with only the Intel GMA for graphics. It certainly makes it seem like Apple was trying to convey that the GMA is crap and a discrete GPU wouldn't fit.
Thanks for reminding me of this, I had forgotten about it; this is a clear sign of intent by apple. Btw, if you get a chance to read my post above, which you probably haven't as we've been writing our posts concurrently, I'd like your take on it.

Dont Hurt Me
May 29, 2010, 08:49 AM
If recent history means anything, Intel graphics suck. Weeee, they use less power. This means little; turn down any decent graphics chip to Intel levels and it will use less power. Putting their graphics with a new high-power CPU is like buying a Ferrari that has no wheels. Intel graphics are horrible. It's a scam from Intel to try to take over GPUs, and so we are faced with machines with two GPUs, one that sucks and one that doesn't, but you must take the GPU they force on you. Greedy corporations wanting more and more, what's new.

iMacmatician
May 29, 2010, 12:03 PM
Well, actually I believe Intel on it because what they're doing is using the 22nm process on both the CPU and GMA DIEs.

Don't you mean 32 nm?

Scottsdale
May 29, 2010, 06:24 PM
Don't you mean 32 nm?

Actually, there are 32nm and 22nm variants of Sandy Bridge (may be called Ivy). I believe that I read the 2X graphics are with the move to 22nm process. It could be less than 2X with whatever the 32nm is, but either way it's not going to be anywhere near what an Nvidia 320m can bring us with a Core 2 Duo.

The important part of the picture is Apple is not just moving the CPU from 32nm to 32nm/22nm and GMA from 45nm to 32nm/22nm, they're moving both onto the same die. Right now the Core i7 has a 32nm CPU die and 45nm GMA die on the same chip which is part of the problem for TDP if I remember correctly.

I have read multiple articles stating that Intel has multiple processes for the Sandy Bridge, and could end up being 22nm instead of the 32nm process depending on the application of the CPU (server, mobile, desktop, etc).

I don't know for sure what the "end results" will be. I think it could be either 32nm or 22nm depending on which chips Apple uses. But my assumption on the 2X GMA performance comes when the 22nm variants come. I remember Wiki not being as revealing as a few other articles about the latest roadmaps. Just do a Google search to figure out the latest roadmap information.

It could be that 32nm is late 2010 and 22nm is early 2011?

Maybe Apple will just move us over to AMD and give us even more incredible ATI graphics! I could be really happy with AMD and ATI. I could be really happy with C2D and Nvidia's 320m. I could be really happy with a Core i7 CPU at 2+ GHz and an ATI 5430 GPU. I could NOT be happy in any way with an Intel GMA as the sole graphics solution... it doesn't matter whether it's a 45nm, 32nm, or 22nm GMA die! I don't want Apple doing this and justifying it with the consumer doesn't need a better GPU and would benefit more from an extra three hours of battery between charges.

iMacmatician
May 29, 2010, 07:16 PM
Actually, there are 32nm and 22nm variants of Sandy Bridge (may be called Ivy).

Sandy Bridge (the line of variants, not the microarchitecture itself) is 32 nm, Ivy Bridge is 22 nm. 22 nm isn't coming until early 2012 (probably late 2011 production). All segments that are getting the new microarchitecture except high-end server and high-end desktop will have both 32 nm (some time in 2011) and 22 nm (some time in 2012) variants.

I believe that I read the 2X graphics are with the move to 22nm process. It could be less than 2X with whatever the 32nm is, but either way it's not going to be anywhere near what an Nvidia 320m can bring us with a Core 2 Duo.

AFAIK it's 32 nm; also the performance estimates could be theoretical performance instead of actual performance (that was the case with RV770/RV870 targets).

This (http://translate.google.com/translate?hl=en&sl=ja&u=http://pc.watch.impress.co.jp/docs/column/kaigai/20100428_364200.html&ei=XasBTKPqNZHSNdy0qeMO&sa=X&oi=translate&ct=result&resnum=5&ved=0CC0Q7gEwBA&prev=/search%3Fq%3Dpc%2Bwatch%2Bsandy%2Bbridge%2Bsite:pc.watch.impress.co.jp/%2Bivy%2Bbridge%2Bex%2Ba%26hl%3Den%26safe%3Doff) (and the links at the bottom of the page) is the most detailed roadmap I know of…

http://pc.watch.impress.co.jp/img/pcw/docs/364/200/04.jpg

Scottsdale
May 29, 2010, 08:40 PM
Sandy Bridge (the line of variants, not the microarchitecture itself) is 32 nm, Ivy Bridge is 22 nm. 22 nm isn't coming until early 2012 (probably late 2011 production). All segments that are getting the new microarchitecture except high-end server and high-end desktop will have both 32 nm (some time in 2011) and 22 nm (some time in 2012) variants.

AFAIK it's 32 nm, also the performance estimates could be theoretical performance instead of actual performance (that was the case with RV770/RV870 targets).

This (http://translate.google.com/translate?hl=en&sl=ja&u=http://pc.watch.impress.co.jp/docs/column/kaigai/20100428_364200.html&ei=XasBTKPqNZHSNdy0qeMO&sa=X&oi=translate&ct=result&resnum=5&ved=0CC0Q7gEwBA&prev=/search%3Fq%3Dpc%2Bwatch%2Bsandy%2Bbridge%2Bsite:pc.watch.impress.co.jp/%2Bivy%2Bbridge%2Bex%2Ba%26hl%3Den%26safe%3Doff) (and the links at the bottom of the page) is the most detailed roadmap I know of…

http://pc.watch.impress.co.jp/img/pcw/docs/364/200/04.jpg

Theoretical in terms of faster clock speed for GMA equals faster IGP?

Nice information there. I have seen a few different variants of the roadmaps.

Well, I guess double graphics could be in late 2011 then? Heck, I don't know.

This is all trusting in Intel which we're better off not doing anyways. Intel hasn't been good to any of its customers as we would all be better off by Intel not playing bully and forcing us to use Intel chipsets instead of Nvidia's alternative. In the end, I hope that Intel loses BIG for its moves against Nvidia and against its own customers.

applesupergeek
May 29, 2010, 10:39 PM
I was under the impression they'd do the 22nm die shrink by 2011. If they have it on their roadmap as a 2012 thing, then by the first half of 2011 the AMD apu will be manufactured on the same 32nm process, and despite lacking in terms of the cpu cores it will be head and shoulders above the intel igfx, thus a really good option. Of course the next apu iteration, where each core will serve as both a gpu and a cpu, will take a lot of api development by amd (nvidia is well ahead with the proprietary CUDA), who are fully committed to supporting OpenCL. I expect that we'll soon hear of intel attempting to buy out nvidia. Maybe that was their original intention in strangling them in the gfx marketplace: to lower their market value so they could buy them out later on. The whole larrabee fiasco isn't getting them anywhere, and as per wiki:
On May 25, 2010 the Technology@Intel blog announced that Larrabee would not be released as a GPU, but instead would be released as a product for High Performance Computing competing with the Nvidia Tesla.
What does that leave them with ultimately? Do they expect to be taken seriously with a roadmap that includes their gma line? They'll never get there with their gfx; they'll always be pathetically behind. That's why they had a separate team trying to deliver larrabee instead, but this proved to be a disappointing effort. It's not a stretch then to think, like I said, that their second-best option was "stick those igfx in there, hit nvidia as much as we can, so we can buy them out eventually."

iMacmatician
May 30, 2010, 12:10 AM
Theoretical in terms of faster clock speed for GMA equals faster IGP?

Yeah, and also in terms of "2x the SPs = 2x the performance" (RV870).

What does that leave them with ultimately? Do they expect to be taken seriously with a roadmap that includes their gma line? They 'll never get there with their gfx, they 'll always be pathetically behind, that's why the had a separate team trying to deliver larrabee instead, but this proved to be a disappointing effort. It's not a stretch then to think, like I said, that their second best option was stick those igfx in there hitting nvidia as much as we can, so we can buy them out eventually.

There's speculation that Intel doesn't see enough of a (future) market for discrete graphics for them to push out a discrete GPU, given Larrabee's performance disadvantage against AMD/NVIDIA GPUs. Instead they'll let their integrated graphics focus on the video stuff and aim Larrabee at HPC computing areas (where it's actually quite good).

Rumors say that we'll still see the "GenX" integrated graphics in Intel's CPUs through at least 2014 (as opposed to 2011 before the December Larrabee "cancelation"). Also Larrabee might show up as a GPU in 2012.

applesupergeek
May 30, 2010, 07:56 AM
There's speculation that Intel doesn't see enough of a (future) market for discrete graphics for them to push out a discrete GPU, given Larrabee's performance disadvantage against AMD/NVIDIA GPUs.

There's seldom a market for something that's crap compared to the competition.


Instead they'll let their integrated graphics focus on the video stuff and aim Larrabee at HPC computing areas (where it's actually quite good at).

This will be a Larrabee without graphics, but with data-parallel cores, as an HPC accelerator, so it might be quite good at this, but it's a purpose unrelated to its original development.


Rumors say that we'll still see the "GenX" integrated graphics in Intel's CPUs through at least 2014 (as opposed to 2011 before the December Larrabee "cancelation").

Let me just point out, for some people who might not know, that GenX is not a new line, but the gma graphics intel has had and has been "developing", so to speak, all this time. That's why I am saying that they are in a very tight spot if they don't buy nvidia; if they think they can cut it against any gpgpu/apu efforts by amd with these, they are in for another P4 fiasco.

Also Larrabee might show up as a GPU in 2012.
You mean a discrete GPU? That was rumored, but according to the latest intel blog a few days ago it won't.

This makes for some interesting reading, and I quote a few paragraphs off it (interesting to note how intel licensed Imagination's SGX535 (yes, the one in the iPhone!) for the gma500 for the Atom, but had some other company write drivers for it):

When Intel announced that the company was working on Larrabee as their next-generation graphics part, mostly everybody thought that Intel would kill ATI/nVidia with ease. After all, the company knocked AMD from its feet with Core architecture, and Intel felt as secure as ever.

Over the course of the last couple of years, I have closely followed Larrabee with on and off-the-record discussions with a significant number of Intel employees. As time progressed, the warning lights stopped being blips in the distance and became big flashing lights right in front of our faces. After discussing what happened at the Intel Developer Forum and the Larrabee demo with Intel's own engineers, industry analysts and the like, there was no point in going back.

This article is a summation of our information on Larrabee, hundreds of e-mails and chats, lot of roadmaps and covert discussions. When we asked Intel's PR about Larrabee, his comment was that this story was "invented" and has nothing to do with truth. We were also told that our sources were "CRAP", which was duly forwarded to the sources themselves. We will cherish the comments that ensued afterwards for the remainder of our days, including a meeting that followed a comment "since [Intel] PR claims we don't work on LRB, this is a blue cookie". Also, there were some questionable statements about our integrity, but here at Bright Side of News* we are going to continue doing what we did in the past - disclose the information regardless of how good or bad it is. We hope that it is good, but if it's not - don't expect us to stay put.

Unfortunately for the PR, marketing, and sales divisions, every company owes its existence to engineers who pour their hearts in projects and if wasn't for that - you would not have chips with hundreds, and now billions of transistors in. Engineers don't speak in Ronspeak language, but rather are quite open. This is what we would call, a real inconvenient truth.

For a company of Intel's stature, we did not expect that a project such as Larrabee would develop in the way it has. In fact, according to information gathered over the years - LRB just doesn't look like an Intel project at all [read: sloppy execution, wrong management decisions]. The amounts of leaks we received over the course of the past years simply surprised us; on several occasions, we had the opportunity of seeing internal roadmaps and hearing about frustrations regarding unrealistic expectations from the management. First and foremost, the release dates: Intel's internal roadmaps originally cited "sampling in 2008", then "Release in Q4 2008", "Release in 2009". By summer 2008, we saw "mid-2009", "H2 2009," changing to "2010", "H2 2010" and after a recent conversation with several engineers that mentioned "we need another 13-18 months to get [Larrabee] out"; the time came [unfortunately] to complete this story.


"GMA500 suffers from utterly crappy drivers. Intel didn't buy any drivers from Imagination Technologies for the SGX, but hired Tungsten Graphics to write the drivers for it. Despite the repeated protest from the side of Imagination Technologies to Intel, Tungsten drivers DO NOT use the onboard firmware of the chip, forcing the chip to resort to software vertex processing."

"Intel's integrated graphics just don't work. I don't think they will ever work." But that statement could be considered as courtesy compared to his latter statement "[Intel] always say 'Oh, we know it has never worked before, but the next generation ...' It has always been the next generation. They go from one generation to the next one and to the next one. They're not faster now than they have been at any time in the past."

Now, first and foremost, we have to disclose that it is excruciatingly hard to create a graphics processor. Even though some skeptics will say that "you just build one shader unit and then multiply it by an X factor", that is frankly, a load of bull. Today's graphics processors are massively parallel beasts that require two factors to work: drivers and massively parallel hardware. This was confirmed to us by engineers at ATI, nVidia and Intel - so forget about picking sides here.

Even though it owns 50% of the world-wide graphics market, again - we cannot consider Intel to be customer-oriented "GPU" vendor, given the performance of their parts. Just ask Microsoft how many waivers Intel hardware has in their certification process [hint: the number is higher than nVidia's and ATI's worst non-compliant hardware combined]. We know the number and sure thing is, it ain't pretty.

Thus, Intel knew what the company has to do - or risk becoming a dinosaur in increasingly visual world. Now, the company knew the road to Larrabee would be difficult. The only problem is that Intel's old-school thinking underestimated the size of the task at hand and time that it will take to complete such a project.

much more interesting story continues here:
http://www.brightsideofnews.com/print/2009/10/12/an-inconvenient-truth-intel-larrabee-story-revealed.aspx

gyus
May 30, 2010, 08:04 PM
With all this talk about intel IGPs, could someone tell me if any of them support OpenCL?

NintendoFan
May 30, 2010, 08:36 PM
With all this talk about intel IGP's could someone tell me if any of them support open CL?

I don't believe they do.

iMacmatician
May 31, 2010, 07:27 AM
There's seldom a market for something that's crap compared to the competition.

There are two ways discrete graphics can go. One is the way of the discrete sound card. The other is GPGPU/HPC.

This will be a Larrabee without graphics, but with data-parallel cores, as an HPC accelerator, so it might be quite good at this, but it's an unrelated purpose to its development.

That is the Larrabee microarchitecture.

You mean a discrete GPU? That was rumored, but according to the latest intel blog a few days ago it's won't.

Read the blog again.

Also:

http://www.semiaccurate.com/2010/05/27/larrabee-alive-and-well/

I AM NOT sure why the technical world suddenly all came to the same misunderstanding that Intel's Larrabee is dead, it most assuredly is not. In fact, if you actually read the not-an-anouncement from Intel yesterday, you will see that it simply does not put the knife in, but rather brings a lot of clarity to the chip's position.

What am I talking about? Last December, Intel did the right thing and pulled the plug on the consumer version of Larrabee. They explicitly did NOT pull the plug on the whole program, and left the door open to market a version of it for HPC use. As we wrote at the time, "In a statement today, Intel said that the chip will be a development platform and an HPC part, but there will be no retail version, at least not any time soon."

Today, Intel announced that Larrabee would be released as an HPC part, with an announcement next week, but there would still be no consumer part. In effect, they said EXACTLY what they said in December, but added in more specificity about when it would be released as an HPC part. Basically they confirmed that it was alive, and that it would be out soon in the form that they promised it would be.

Several sites, and several authors that I respect, all lead with headlines about Larrabee being dead. How they got that from Bill Kirkos' post is beyond me, it is actually significantly more alive than it was before that post. Bullet point three said, "We will not bring a discrete graphics product to market, at least in the short-term. As we said in December, we missed some key product milestones. Upon further assessment, and as mentioned above, we are focused on processor graphics, and we believe media/HD video and mobile computing are the most important areas to focus on moving forward."

If you stopped reading there, you might have a pretty grim outlook about Larrabee. If you continued to bullet point 4 however, it read, "We will also continue with ongoing Intel architecture-based graphics and HPC-related R&D and proof of concepts." Now, if a program was dead, why would they continue to do R&D and proof of concepts? Doesn't that sound like an ongoing development program to you?

If it does, that is because there is one, and it is going full steam ahead. SemiAccurate laid it all out a few weeks ago, but the short story is that there will be a Larrabee GPU from Intel, currently set for 2012. Ongoing R&D, HPC chips on the market soon, and a solid roadmap for several generations sure doesn't sound like a dead program, now does it?

However, there were three casualties of last December's retrenchment. The first was obviously the first generation GPU, AKA Larrabee 1. The second was Intel's console dreams and the proposed Larrabee based PS4 architecture, something that will cost Medium Blue in the long term. (Note: For what it's worth, the leading candidate at Sony right now is an internal design that several describe as "Emotion Engine based". Sony should have waited for Larrabee three.....)

The third casualty was the integrated GPU in Haswell, the Intel chip that succeeds 2012's Ivy Bridge. Haswell was set to finally take the GenX architecture that debuted in the i965G out behind the shed and put a long overdue bullet in it's pipelines. Sadly, that was not the case, Haswell and it's successor, Rockwell, will still use GenX, and it is a potential candidate for the next generation CPUs as well. SIGH.

These casualties aside, the idea that Larrabee is dead is ludicrous. Development is ongoing, and nothing has changed. The only new detail put forth by Intel is that it will be a real product in short order. How anyone paying even the slightest bit of attention to events can come to any other conclusion is beyond us. S|A

Scottsdale
May 31, 2010, 12:37 PM
With all this talk about Intel IGPs, could someone tell me if any of them support OpenCL?

I have read conflicting reports. I think the problem is that the way the chip is designed, it really wouldn't benefit a Mac OS X computer. With an NVIDIA GPU, the full power of the GPU is always available. With the Intel Core i5/i7, the CPU is really two dies on one package, and thermal headroom is shared between them. If the GMA die is being overclocked, there is no extra headroom left for the CPU die; if the CPU is "boosting", the GMA die runs at its lowest possible clock speed.

What I always think of too, and never really use as an argument against the Core i7 CPUs, is that the ULV CPUs only use 800 MHz memory. I wonder if Apple would go backwards there too? It just seems like Intel has made every chip mediocre by forcing the GMA die onto the CPU package. The Intel replacements for the SL9x00 CPUs are the Core i7-6x0LM CPUs, which run with 1066 MHz RAM. There are so many problems with going to these Intel Core i7 ULV CPUs that it just doesn't seem worth it. So Apple would have to give its MBA users a GMA IGP that is far less capable in standard graphics and that virtually eliminates, or severely reduces, the machine's OpenCL capabilities.
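The shared-budget behavior described above can be sketched as a toy model. This is purely illustrative: the wattage numbers and the simple subtraction are made up for the example, not Intel's actual power-management algorithm.

```python
# Toy model of a shared thermal budget between the CPU die and the GMA die
# on a two-die package. All numbers are hypothetical.

PACKAGE_TDP_W = 18.0   # assumed total package budget for a ULV part
CPU_BASE_W = 10.0      # assumed CPU draw at base clocks
GMA_BASE_W = 4.0       # assumed GMA draw at its lowest clock

def cpu_headroom(gma_draw_w, package_tdp_w=PACKAGE_TDP_W):
    """Watts left for the CPU die to boost with, given current GMA draw."""
    return max(package_tdp_w - gma_draw_w, 0.0)

# GMA at its lowest clock: the CPU die has budget to boost with.
print(cpu_headroom(GMA_BASE_W))   # 14.0

# GMA overclocked and soaking up the budget: little or nothing left
# beyond the CPU's base draw, so no boost is possible.
print(cpu_headroom(12.0))         # 6.0
```

The point is only that one die's gain is the other die's loss, which is why a GPU sharing a package looks less attractive for OpenCL work than a discrete part with its own budget.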

iMacmatician
May 31, 2010, 02:45 PM
And the Larrabee HPC part has been announced, as Intel said it would be.

http://www.intel.com/pressroom/archive/releases/2010/20100531comp.htm

Aubrey Isle = 1st generation Larrabee chip

Knights Ferry = development card with Aubrey Isle

32 cores (128 threads)
1.2 GHz
300 W
45 nm
1-2 GB GDDR5
1.23 TeraFLOPS peak compute performance
H2 2010


Knights Corner = production card

>50 cores
22 nm
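The quoted Knights Ferry peak lines up with simple arithmetic, assuming Larrabee's 512-bit vector units (16 single-precision lanes) and a fused multiply-add counted as two FLOPs per lane per cycle. A back-of-the-envelope check, not an Intel-published formula:

```python
cores = 32
clock_hz = 1.2e9
sp_lanes = 16    # 512-bit vectors / 32-bit floats
flops_per_lane = 2  # fused multiply-add = 2 FLOPs per cycle

peak = cores * clock_hz * sp_lanes * flops_per_lane
print(peak / 1e12)  # 1.2288, i.e. the quoted ~1.23 TFLOPS
```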

halledise
Jun 14, 2010, 04:09 PM
AnandTech reports on Intel's graphics card statement posted yesterday, which notes that there will be no Intel discrete GPU (duh) in the short term and that Intel is focusing with "laser like precision" on integrated graphics.

http://www.anandtech.com/show/3738/intel-kills-larrabee-gpu-will-not-bring-a-discrete-graphics-product-to-market

Not too much here, except that it indicates that Intel will be sticking to their own integrated graphics strategy for the long haul.

As the author notes, it also validates AMD's approach with ATI.

The article says to expect a 2x improvement over current Intel graphics for the early 2011 chips, and another 2x for the following chips.

Just food for thought.

Moore's Law ain't dead, just modified with the SoP (System on Package) approach.

JoeG4
Jun 14, 2010, 11:08 PM
Apple should just buy AMD

halledise
Jun 15, 2010, 02:47 AM
Maybe Apple will just move us over to AMD and give us even more incredible ATI graphics! I could be really happy with AMD and ATI.

+111

YESSSSSSSSSS! that would be awesome (or should that be awefull - i.e. 'full' of awe rather than just 'some' awe)

My mate, who's got his own successful small business as a PC box builder (and who's also a closet Mac man since I gifted him a coupla old iMacs), absolutely loves AMD and ATI and intensely dislikes the Intel monopoly.

He refused to install and sell Vista on his machines unless customers were really insistent, but kinda likes Windows 7.
But he tells all his customers that AMD is the way to go.
A superior product, in his opinion.

He's got a dedicated machine with an AMD processor and ATI graphics running Linux on one partition and some kind of Hackintosh on another, and the thing fair flies, he sez.
He reckons Intel should lodge their processors, and Microsoft should insert Vista, in a place where the sun don't shine. :rolleyes:

peakchua
Jun 15, 2010, 10:52 AM
Intel SUCKS at graphics. Intel should dream on. Though Intel GMA is improving, it SHOULD BE offered by OEMs, or to buyers, as a CHEAPER option, since it's so bad. I don't mind GMA, but knowing my MacBook Pro has 330M graphics instead of HIGHER-end graphics because this **** fills the space makes me really unhappy. If Intel can get the GMA up to the performance of the 9600M GT or the 320M, then I won't mind Intel GMA. BUT RIGHT NOW it is at the level of the nvidblow 9400murderer. :mad::mad::mad:

applesupergeek
Jun 16, 2010, 06:39 PM
iMacmatician's post is irrelevant to this thread and misleading. The HPC project stemming out of Larrabee is Intel's failure to deliver the x86-based consumer GPU they intended and had high hopes for. Of course, if you put lots of x86 cores in parallel you might get a good parallel computing device, but as it turns out, the pipe dreams of a consumer GPU remain just that: pipe dreams.

Their reissuing of this failed project as an HPC variant is of course irrelevant to the prospects of their consumer GPUs, which they have saddled with inferior graphics, directly hurting the refresh cycle and future of the Air.

I will not hypothesize on why he would post something misleading and irrelevant, because apparently close to the whole second page of this thread dissipated (was expurgated, that is) when I did. It's still inexplicable to me how my comments on the irrelevancy of this post vanished, while a post that has no reason to be here stayed.

Now let me go to the parallel computing subforums and tell them about AMD releasing ultra-low-power chips with great integrated API graphics in a few months that could fit the Air. That should interest them and elicit plenty of "wt?" responses. ;)

bigwig
Jun 17, 2010, 06:03 PM
I wonder why Intel decided to do a highly parallel floating-point oriented x86 (Larrabee) when Itanium was their floating-point optimized architecture, plus unlike x86 it was designed from the start for highly parallel applications (witness its use in the 1024-CPU SGI Altix). Itanium also has a tiny transistor count compared to x86. Shrink it to 32nm and let it fly.

applesupergeek
Jun 19, 2010, 07:42 PM
I wonder why Intel decided to do a highly parallel floating-point oriented x86 (Larrabee) when Itanium was their floating-point optimized architecture, plus unlike x86 it was designed from the start for highly parallel applications (witness its use in the 1024-CPU SGI Altix). Itanium also has a tiny transistor count compared to x86. Shrink it to 32nm and let it fly.

Because Itanium was going down the road of abandonware, and the Larrabee team (and the top execs who green-lit it) had to justify the millions of dollars spent with at least some semblance of an applied product, even though that product was not the intended one.

Kind of like throwing out a well-prepared dinner that's two days old, setting out to make a new dish, only to burn it, and then serving up the half-burned remains anew.

iSpoody 1243
Jun 19, 2010, 07:47 PM
Well, if Apple moved to AMD and ATI, we would get quad-core CPUs in the MBP a hell of a lot sooner than from Intel. I would so love a quad-core Mac mini with a dedicated ATI GPU :D