I believe I understand what you mean now, though I'd more likely blame Apple or random software devs for the kernel panics on the iMac than Intel..

You make good points with the console argument, but they seem to ignore what Intel is trying to get across (and what seems to have been stated throughout this thread): that optimization specifically for Larrabee is where it shows massive benefits over current GPUs.

One thing with recent consoles (which unfortunately have been more and more like desktop computers lately) is that as the console ages and the developers become more familiar with its nuances, they produce much more beautiful games. This is what I find interesting about Intel's claims with Larrabee and where I believe you're finding fault.

I looked at it from the point of view of developers being able to learn Larrabee inside and out, so that they may fully utilize the mass-core aspect, as well as give it the inroads it would need for uniform acceptance among platforms and yadda yadda idealism..

I believe you were looking at it from the idea that once the developers learn it, there is nothing more they can do..until Intel churns out a card with more cores..but isn't that the case already? And this would also have the ability to make the old games look new again, as the more cores would beef up the graphics on the games from the previous generation (to an extent)..
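To make that "the same code just gets faster with more cores" idea concrete, here's a minimal, hypothetical sketch in plain C++ (nothing Larrabee-specific; the `shade` function and frame size are made up for illustration): once a renderer splits its per-frame work across however many cores the chip reports, the same binary picks up the extra cores on a bigger part without a rewrite.

```cpp
// Minimal sketch (not Larrabee code): a software "shading" pass split across
// however many hardware threads the chip reports. The same binary simply
// spreads its work across more cores on a bigger part, which is the
// "old games get faster on the next card" idea in miniature.
#include <cstdint>
#include <cstdio>
#include <thread>
#include <vector>

// Hypothetical per-pixel work; stands in for whatever a renderer does.
static uint32_t shade(uint32_t x, uint32_t y) {
    return (x * 31 + y * 17) & 0xFF;
}

int main() {
    const uint32_t width = 1920, height = 1080;
    std::vector<uint32_t> framebuffer(width * height);

    // Ask the runtime how many cores we have instead of hard-coding it.
    unsigned cores = std::thread::hardware_concurrency();
    if (cores == 0) cores = 1;

    std::vector<std::thread> workers;
    for (unsigned t = 0; t < cores; ++t) {
        workers.emplace_back([&, t] {
            // Each worker shades an interleaved slice of scanlines.
            for (uint32_t y = t; y < height; y += cores)
                for (uint32_t x = 0; x < width; ++x)
                    framebuffer[y * width + x] = shade(x, y);
        });
    }
    for (auto& w : workers) w.join();

    std::printf("shaded %u x %u across %u threads\n", width, height, cores);
    return 0;
}
```

(That's the rosy version, of course; in practice memory bandwidth and serial sections keep it from scaling forever.)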

I'm not trying to make a strawman, so please correct me if I have, but I understand the/this argument..and I believe it has interesting aspects on both sides.

i am a strawman but know a thing or two about computers, as well as consoles. and what strawmans like are fast computers. not fast talking. fast talking hurts strawman's head.
 

Hehe, I wasn't calling you a strawman..I meant that I hoped I wasn't making a strawman out of your argument..in the logical fallacy form:

"a logical fallacy of ethos in which the arguer greatly simplifies an opponent's argument in order to make it easier to attack"

but I suppose we're getting far off topic, it was nice chatting :D
 
i don't share the same concern w/ apple switching to intel, although i was wary at the time since i thought the powerpc architecture was just fine and stable running os x and the programs i needed it to run (fce, photoshop). my aluminum imac, which i've had since november of 2007, has had more kernel panics (3 or 4) than my 12" powerbook (1 panic), which i've been using since 2004. i'm no techie, though, and won't try to convince anyone which chip should go in which systems. i don't care. what i care about is real-world performance, optimization, stability, all that good stuff, instead of the number of cores or gigahertz.

but, i do think that the gamer's market and consoles in particular are in a different position than the pc (yet benefit from pc technology) in that they have a longer expected lifespan than a pc and have more specific needs, so game and hardware developers should really be able to squeeze the performance out of each console. but i think people are already beginning to learn that software needs to catch up to hardware. in game console history, it used to be the reverse - you saw hardware add-ons, for example, to play certain games. i think intel is just trying to shop-talk this concern now, although i don't think it has the solution, since the solution already exists - powerpc or cell, which i hear is better at graphics stuff. am i correct to make this assumption?

What do you do with your computers? In the space of 5 years, I haven't had a single kernel panic - not a single one. I'm either the luckiest guy on earth or I'm doing something that you're not doing.
 
This should really suit Snow Leopard well.

Sadly, only the Mac Pro would get to use any sort of Larrabee GPU (assuming it does come out in 2009). Plus as of right now it looks like Intel has to pretty much hand tune performance for the apps that run on it. But based on the SIGGRAPH papers it is looking pretty good, so people are becoming more optimistic.
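For a sense of what "hand tuning" can mean in practice, here's a rough, hypothetical before/after in plain C++ (not actual Larrabee or Intel code): the same multiply-add over a buffer, first written as a naive scalar loop, then restructured into fixed-width, dependency-free blocks of the kind that map cleanly onto wide vector units like the ones described in the Larrabee papers.

```cpp
// Hypothetical illustration of "hand tuning": the same multiply-add over a
// buffer, first as a naive scalar loop, then restructured so the data layout
// and loop shape are friendly to wide vector units. Plain C++, not real
// Larrabee/LRBni code.
#include <cstdio>
#include <vector>

// Naive version: easy to write, leaves the compiler to guess.
void scale_add_scalar(const std::vector<float>& in, std::vector<float>& out,
                      float scale, float bias) {
    for (size_t i = 0; i < in.size(); ++i)
        out[i] = in[i] * scale + bias;
}

// "Tuned" version: process fixed-width blocks with no cross-iteration
// dependencies, the pattern compilers map onto SIMD registers; the leftover
// elements are handled by a scalar tail loop.
void scale_add_blocked(const float* in, float* out, size_t n,
                       float scale, float bias) {
    const size_t W = 16;               // notional vector width (e.g. 16 floats)
    size_t i = 0;
    for (; i + W <= n; i += W)
        for (size_t j = 0; j < W; ++j) // inner block is easy to vectorize
            out[i + j] = in[i + j] * scale + bias;
    for (; i < n; ++i)                 // scalar tail
        out[i] = in[i] * scale + bias;
}

int main() {
    std::vector<float> in(1000, 1.0f), out(1000);
    scale_add_scalar(in, out, 2.0f, 0.5f);
    scale_add_blocked(in.data(), out.data(), in.size(), 2.0f, 0.5f);
    std::printf("out[0] = %f\n", out[0]);
    return 0;
}
```

The point isn't this particular loop; it's that getting full speed out of a wide, many-core part tends to require reshaping loops and data layouts by hand rather than just recompiling and hoping.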
 