Intel refocuses on integrated graphics

Discussion in 'MacBook Air' started by jhvander, May 26, 2010.

  1. macrumors newbie

    Joined:
    Apr 7, 2006
    #1
    AnandTech reports on Intel's graphics statement, posted yesterday, which notes that there will be no Intel discrete GPU (duh) in the short term and that Intel is focusing with "laser-like precision" on integrated graphics.

    http://www.anandtech.com/show/3738/...t-bring-a-discrete-graphics-product-to-market

    Not too much here, except that it indicates that Intel will be sticking to their own integrated graphics strategy for the long haul.

    As the author notes, it also validates AMD's approach with ATI.

    The article says to expect a 2x improvement over current Intel graphics for the early 2011 chips, and another 2x for the following chips.

    Just food for thought.
     
  2. macrumors 6502a

    Jayomat

    Joined:
    Jan 10, 2009
  3. macrumors 601

    Joined:
    Jul 20, 2008
    #3
    They are also refocusing Larrabee (or whatever it turns out to be) on HPC. AFAIK Larrabee was always better suited to HPC than to consumer graphics.

    Intel's new strategy for integrated graphics looks promising, at least compared to what they have now and in the past.
     
  4. macrumors 6502

    Joined:
    Jun 29, 2008
    Location:
    Australia
    #4
    why must intel shove their rubbish integrated gpus into their processors?
    so every intel mobile cpu will have a built-in gpu?
     
  5. macrumors 601

    Joined:
    Jul 20, 2008
    #5
    From 2011, every CPU except high-end desktop and most server/workstation CPUs.
     
  6. macrumors 6502

    Joined:
    Jun 29, 2008
    Location:
    Australia
    #6
    omg.....
    intel owns the processor marketplace and is using its monopoly to force its mediocre gpus into every machine.
     
  7. macrumors 68040

    Joined:
    Jul 1, 2004
    #7
    Pretty much, yeah.

    On the other hand, I dare say that the majority of PCs ship with crappy Intel graphics anyway, and that most people don't and won't care. For day-to-day computing, I don't even care about it myself. My workstation at the office has Intel graphics, but my home desktop has a 7300GS in it. Both machines function fairly well for how they're used. That's the beauty of having switchable graphics: battery life when needed and performance when needed. Look for hybrid graphics systems like Optimus to become increasingly common.

    I'd also guess that many Mac owners don't care what graphics their computers use either. I'd bet that upwards of 80% of what a typical Mac user (i.e., a non-professional) does can be accomplished just fine with Intel graphics.
     
  8. macrumors 6502a

    Joined:
    Nov 20, 2009
    #8
    How does he know we should expect 2x graphics performance from Intel in early 2011? Does he just take Intel's word for it? (Because I think he's taking Intel's money to say it; there's no concrete evidence.)

    Even if that is the case: seeing as Intel is 2x worse (and more) than the current IGPs from NVIDIA and ATI, 2x in a year means they'll still be 2x behind whatever NVIDIA and ATI have on the horizon by then. Even if ATI and NVIDIA somehow stood completely still, and taking the most optimistic projection (i.e., Intel's own press-office claims), Intel would only marginally close the gap.

    And how about some news on when Intel is going to support one of the cornerstones of Snow Leopard (and of future computing, for that matter), OpenCL? There's a whole class of work where the GPU assists the CPU through OpenCL that Intel isn't supporting (a rough sketch of that offload is below). So this is a double hit: not only are the IGPs bad at graphics, they also can't serve as adjuncts to the CPU in applications that demand it.
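    To make that concrete, here's a minimal sketch of the kind of CPU-to-GPU offload I mean, using the OpenCL C API. It's illustrative only: error checking is omitted and the little "scale" kernel is made up for the example; it assumes a runtime with an OpenCL-capable GPU, which is exactly what these Intel IGPs don't provide.

    /* Build on Mac OS X 10.6 with: gcc offload.c -framework OpenCL */
    #include <stdio.h>
    #include <OpenCL/opencl.h>   /* <CL/cl.h> on other platforms */

    static const char *src =
        "__kernel void scale(__global float *v, float k) {"
        "    v[get_global_id(0)] *= k;"
        "}";

    int main(void) {
        float data[1024];
        int i;
        for (i = 0; i < 1024; i++) data[i] = (float)i;

        cl_platform_id plat;
        cl_device_id dev;
        clGetPlatformIDs(1, &plat, NULL);
        /* This is the step that fails when the IGP has no OpenCL driver. */
        if (clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, NULL) != CL_SUCCESS) {
            fprintf(stderr, "no OpenCL-capable GPU found\n");
            return 1;
        }

        cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, NULL);
        cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, NULL);
        cl_program prog = clCreateProgramWithSource(ctx, 1, &src, NULL, NULL);
        clBuildProgram(prog, 1, &dev, NULL, NULL, NULL);
        cl_kernel k = clCreateKernel(prog, "scale", NULL);

        /* Hand the array to the GPU, run the kernel, read the result back. */
        cl_mem buf = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                    sizeof(data), data, NULL);
        float factor = 2.0f;
        size_t n = 1024;
        clSetKernelArg(k, 0, sizeof(cl_mem), &buf);
        clSetKernelArg(k, 1, sizeof(float), &factor);
        clEnqueueNDRangeKernel(q, k, 1, NULL, &n, NULL, 0, NULL, NULL);
        clEnqueueReadBuffer(q, buf, CL_TRUE, 0, sizeof(data), data, 0, NULL, NULL);
        printf("data[10] = %.1f (expect 20.0)\n", data[10]);
        return 0;
    }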

    Intel has royally screwed low-thermal machines such as the Air with their IGPs and their monopoly tactics. There was absolutely no reason to stick their current IGP in there; it was too early, and they knew they had by far the worst tech of anyone, yet they went ahead and did it anyway.

    AMD will have a superior product with Llano in 2011, or at least a far more well-rounded and future-proof one. No more sticking a crap IGP next to the CPU on the die; this is the real deal: four x86 CPU cores with a fully integrated GPU. If the 25 W TDP AMD quotes is right, this would be ideal for the Air, which has (I might be off a little here) about a 35 W TDP budget for CPU and GPU combined; the rough numbers are below. This package would offer four CPU cores and a GPU that blows everything Intel will have by then out of the water. 25 W for an excellent GPU and four cores in the Air, eh? Not bad at all.
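    Here's that back-of-the-envelope arithmetic spelled out. The 25 W figure is the AMD quote mentioned above; the per-part numbers for the current Air are my own assumptions (a 17 W SL9x00-class CPU plus roughly 12 W for the 9400M chipset), not measured values.

    #include <stdio.h>

    int main(void) {
        const double air_cpu_w = 17.0;  /* assumed: SL9x00-class CPU TDP */
        const double air_gpu_w = 12.0;  /* assumed: NVIDIA 9400M chipset TDP */
        const double llano_w   = 25.0;  /* AMD's quoted APU TDP */

        double budget = air_cpu_w + air_gpu_w;  /* ~29 W; close to the ~35 W guess above */
        printf("current Air CPU+GPU budget: ~%.0f W\n", budget);
        printf("Llano headroom under that budget: ~%.0f W\n", budget - llano_w);
        return 0;
    }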

    And then Apple can inform Intel to eff off.


    Have a look here too for a far more detailed analysis/discussion:

    http://www.bit-tech.net/news/hardware/2010/05/15/amd-fusion-cpu-gpu-will-ship-this-year/1

    http://www.xbitlabs.com/news/cpu/di...tion_of_AMD_Fusion_Chips_Due_in_2015_AMD.html
     
  9. macrumors 6502

    Joined:
    Jun 29, 2008
    Location:
    Australia
    #9
    intel is good at processors, not gpus.
    i personally think they should leave gpus to the experts (nvidia, ati).
    i just think it's terrible how they are abusing their market position for their own advantage.
    their gpus are not competitive at all;
    ati/nvidia low-power gpus beat them completely in every respect.
    imo intel should leave their gpus separate from their cpus
    and shouldn't force their gpu into every system.
    well, i guess being fair doesn't earn monies!
     
  10. macrumors 6502a

    Joined:
    Nov 20, 2009
    #10
    It was a bad (and greedy) decision on their part, and an immensely stupid one at that, which is going to come back and bite them real soon.

    Since they don't have the right tech at the moment, they could have decided to leave their really great CPUs as they are and let both ATI and NVIDIA offer IGP options; god knows those two are strapped for cash, and they'd gladly offer the best they could.

    That would have satisfied their partners (Apple, NVIDIA) and given them a good two-year window of opportunity to develop at least some half-decent graphics for future integration. At the same time, they would be beating AMD, because they'd have a better CPU with the option of being paired with a choice of IGPs, maybe at only a slight hit on TDP.

    So AMD's Fusion-concept APU would be irrelevant when one could get the IGP of one's choice and a better CPU from Intel. And then by late 2012 or 2013 they could throw in their own take on the APU and be done with AMD for good (if AMD hadn't already dissolved).

    But instead, they piss off good customers such as Apple, forcing them into all sorts of workarounds and product delays to cater for those crap IGPs and the lack of OpenCL.

    And now, by 2011, AMD will be offering a solution that might be, say, about 15% worse in terms of raw CPU power or battery life (Phenom cores will go into the first-gen APUs) but will have an IGP that dwarfs anything Intel has of its own. Intel's current greed has revived the competition!!!

    And guess which company has the better platform and doesn't give a rat's ass about supposed spec sheets when it can get a deal with a CPU manufacturer that's able and willing to do customizations, able and willing to support OpenCL, able and willing to offer exclusivity: that's right, it's APPLE.

    And that's why they are giving serious consideration to AMD. In six months, when their ecosystem is in full swing with the iPad, the iPhone, and iPods, lots more people will want to translate their great experiences on those devices into a desktop or laptop equivalent. Not many of them are going to care whether Apple uses AMD or Intel cores (since they don't know or care about the CPUs and GPUs in the iDevices). And guess which way young (or old) gamers are going to lean when it comes to GPU performance on a computer.

    A MacBook Pro with an AMD APU in a year, say, and (because of the volumes Apple will buy from them) a top-spec discrete ATI GPU too, will be absolutely smoking for gamers, creative professionals, etc. The fact that Intel will have at most a 20% edge in raw CPU power or power management (battery life) will be irrelevant.
     
  11. macrumors regular

    Joined:
    Apr 14, 2006
    Location:
    Massachusetts
    #11
    Not to divert the topic, but if Apple is indeed looking at AMD for CPUs, I'd imagine their goal is a single-chip solution, like the one found in the iPhone/iPod/iPad. They would ideally want an SoC for the x86 world.
     
  12. macrumors 601

    Scottsdale

    Joined:
    Sep 19, 2008
    Location:
    U.S.A.
    #12
    Well, actually, I believe Intel on this one, because what they're doing is using the 22nm process on both the CPU and GMA dies. Since those use less energy, the clock speed can be "revved up." So if the GMA die's clock speed is doubled, one would expect roughly "double" the performance.

    I don't believe much of anything Intel says when it comes to graphics. However, looking at the basics of the GMA die: it's currently a 45nm die, while the CPU die is 32nm. If both the CPU and the GMA are 22nm and integrated into one die, using less energy and allowing a greater clock speed within the same TDP, then 2x is probable (a toy version of that power math is below). What I don't see is any real hope of the Intel GMA magically exceeding what the 45nm IGP would deliver if it were simply overclocked to the speed of a 22nm GMA sharing a die with the CPU (that doesn't read very well, but it's the only way I know how to state it).
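    For what it's worth, here's the toy dynamic-power model behind that "same TDP, higher clock" reasoning: switching power is roughly proportional to C * V^2 * f. The per-shrink scaling factors below are my illustrative assumptions, not Intel's numbers.

    #include <stdio.h>

    int main(void) {
        double c_scale = 0.70;  /* assumed capacitance reduction from the shrink */
        double v_scale = 0.85;  /* assumed supply-voltage reduction */

        /* Holding power P = C * V^2 * f constant, frequency can grow by
           1 / (c_scale * v_scale^2). */
        double f_gain = 1.0 / (c_scale * v_scale * v_scale);
        printf("clock headroom at the same TDP: ~%.1fx\n", f_gain);  /* ~2.0x */
        return 0;
    }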

    So it's a matter of how the GMA gets better than it is on the current Arrandale CPUs. Looking at the current GMA, it's not even half as capable as the NVIDIA 9400M in the current MBA. No matter what, Intel SUCKS at graphics. In addition, the 320M is an 80% boost over the 9400M. So we're not going to be well off in an MBA with an Intel GMA for graphics, now or in Sandy Bridge Core i7 CPUs. I would certainly prefer Apple figure out some way around the Intel GMA. I don't know whether Apple could fit a discrete GPU in the MBA even if cost weren't an issue, but we can certainly reason that a 7W ATI 5430 GPU along with an 18W Core i7-6x0UM would fit within the total TDP limits of the MBA.

    At the end of the day, I don't want Intel's worthless GMA in the MBA. However, if it's the only way to get a current, relevant, capable CPU in the MBA, I certainly hope it comes with Sandy Bridge and the 22nm CPU/GMA, not the current Arrandale CPUs. I believe Apple would probably go to an AMD/ATI solution first. I really don't believe Apple feels it can go forward with Intel's GMA, at least not right now. SJ even personally came out and bashed the GMA, saying 13" MBP customers were much better off with an 80% GPU boost and a 10% CPU boost than with a Core i3/i5 and only the Intel GMA for graphics. It certainly makes it seem like Apple was trying to convey that the GMA is crap and a discrete GPU wouldn't fit. Although it could be that cost limitations were part of the cause... along with a worthless, ancient "SuperDrive" taking up space that could otherwise give a discrete GPU the room it needs in the 13" MBP.
     
  13. macrumors 6502a

    Joined:
    Nov 20, 2009
    #13
    For sure, I agree with that, but "doubling performance" is also subjective, and that's why Intel is touting how they are working on HD video for their IGPs: because they aren't working on OpenCL. ;)

    Thanks for reminding me of this; I had forgotten about it. It's a clear sign of intent by Apple. BTW, if you get a chance to read my post above (which you probably haven't, as we've been writing our posts concurrently), I'd like your take on it.
     
  14. macrumors 603

    Dont Hurt Me

    Joined:
    Dec 21, 2002
    Location:
    Yahooville S.C.
    #14
    If recent history means anything, Intel graphics suck. Weeee, they use less power. That means little: turn down any decent graphics chip to Intel's level and it will use less power too. Putting their graphics with a new high-power CPU is like buying a Ferrari that has no wheels. Intel graphics are horrible. It's a scam by Intel to try to take over GPUs, so we're faced with machines with two GPUs, one that sucks and one that doesn't, but you must take the GPU they force on you. Greedy corporations wanting more and more; what's new?
     
  15. macrumors 601

    Joined:
    Jul 20, 2008
    #15
    Don't you mean 32 nm?
     
  16. macrumors 601

    Scottsdale

    Joined:
    Sep 19, 2008
    Location:
    U.S.A.
    #16
    Actually, there are 32nm and 22nm variants of Sandy Bridge (the 22nm one may be called Ivy). I believe I read that the 2x graphics comes with the move to the 22nm process. It could be less than 2x with whatever the 32nm version brings, but either way it's not going to be anywhere near what an NVIDIA 320M can give us with a Core 2 Duo.

    The important part of the picture is that Intel is not just moving the CPU from 32nm to 32nm/22nm and the GMA from 45nm to 32nm/22nm; they're moving both onto the same die. Right now the Core i7 has a 32nm CPU die and a 45nm GMA die on the same chip, which is part of the TDP problem, if I remember correctly.

    I have read multiple articles stating that Intel has multiple processes for Sandy Bridge, and that it could end up being 22nm instead of 32nm depending on the application of the CPU (server, mobile, desktop, etc.).

    I don't know for sure what the "end results" will be; it could be either 32nm or 22nm depending on which chips Apple uses. But my assumption is that the 2x GMA performance arrives with the 22nm variants. I remember the wiki not being as revealing as a few other articles about the latest roadmaps; just do a Google search to find the latest roadmap information.

    It could be that 32nm is late 2010 and 22nm is early 2011?

    Maybe Apple will just move us over to AMD and give us even more incredible ATI graphics! I could be really happy with AMD and ATI. I could be really happy with a C2D and NVIDIA's 320M. I could be really happy with a Core i7 CPU at 2+ GHz and an ATI 5430 GPU. I could NOT be happy in any way with an Intel GMA as the sole graphics solution, and it doesn't matter whether it's a 45nm, 32nm, or 22nm GMA die! I don't want Apple doing this and justifying it with "the consumer doesn't need a better GPU and would benefit more from an extra three hours of battery between charges."
     
  17. macrumors 601

    Joined:
    Jul 20, 2008
    #17
    Sandy Bridge (the line of variants, not the microarchitecture itself) is 32 nm, Ivy Bridge is 22 nm. 22 nm isn't coming until early 2012 (probably late 2011 production). All segments that are getting the new microarchitecture except high-end server and high-end desktop will have both 32 nm (some time in 2011) and 22 nm (some time in 2012) variants.

    AFAIK it's 32 nm. Also, the performance estimates could be theoretical rather than actual performance (that was the case with the RV770/RV870 targets).

    This (and the links at the bottom of the page) is the most detailed roadmap I know of…

    [roadmap image]
     
  18. macrumors 601

    Scottsdale

    Joined:
    Sep 19, 2008
    Location:
    U.S.A.
    #18
    Theoretical in terms of faster clock speed for GMA equals faster IGP?

    Nice information there. I have seen a few different variants of the roadmaps.

    Well, I guess the doubled graphics could come in late 2011 then? Heck, I don't know.

    This all relies on trusting Intel, which we're better off not doing anyway. Intel hasn't been good to any of its customers; we would all be better off if Intel weren't playing the bully and forcing us to use Intel chipsets instead of NVIDIA's alternatives. In the end, I hope Intel loses BIG for its moves against NVIDIA and against its own customers.
     
  19. macrumors 6502a

    Joined:
    Nov 20, 2009
    #19
    I was under the impression they'd do the 22nm die shrink by 2011. If they have it on their roadmap as a 2012 thing, then by the first half of 2011 the AMD APU will be manufactured on the same 32nm process, and despite lagging in CPU cores it will be head and shoulders above the Intel IGP, and thus a really good option. Of course, the next APU iteration, where each core will serve as both a GPU and a CPU, will take a lot of API development by AMD (NVIDIA is well ahead with its proprietary CUDA), who are fully committed to supporting OpenCL. I expect we'll soon hear of Intel attempting to buy out NVIDIA. Maybe that was their original intention in strangling them in the graphics marketplace: to lower their market value so they could buy them out later on. That whole Larrabee fiasco isn't getting them anywhere.
    What does that leave them with, ultimately? Do they expect to be taken seriously with a roadmap that includes their GMA line? They'll never get there with their graphics; they'll always be pathetically behind. That's why they had a separate team trying to deliver Larrabee instead, but that proved a disappointing effort. It's not a stretch, then, to think, like I said, that their second-best option was to stick those IGPs in there, hurting NVIDIA as much as possible, so they could buy them out eventually.
     
  20. macrumors 601

    Joined:
    Jul 20, 2008
    #20
    Yeah, and also in terms of "2x the SPs = 2x the performance" (RV870).

    There's speculation that Intel doesn't see enough of a (future) market in discrete graphics to push out a discrete GPU, given Larrabee's performance disadvantage against AMD/NVIDIA GPUs. Instead they'll let their integrated graphics focus on the video stuff and aim Larrabee at HPC areas (where it's actually quite good).

    Rumors say that we'll still see the "GenX" integrated graphics in Intel's CPUs through at least 2014 (as opposed to 2011, before the December Larrabee "cancellation"). Also, Larrabee might show up as a GPU in 2012.
     
  21. macrumors 6502a

    Joined:
    Nov 20, 2009
    #21
    There's seldom a market for something that's crap compared to the competition.

    This will be a Larrabee without graphics but with data-parallel cores, as an HPC accelerator. It might be quite good at that, but it's unrelated to the purpose it was developed for.

    Let me just point out, for some people who might not know, that GenX is not a new line; it's the GMA graphics Intel has had and has been "developing," so to speak, all this time. That's why I'm saying they're in a very tight spot if they don't buy NVIDIA: if they think these can cut it against any GPGPU/APU efforts by AMD, they're in for another P4 fiasco.

    You mean a discrete GPU? That was rumored, but according to the latest Intel blog post a few days ago, it won't happen.

    This makes for some interesting reading, and I quote a few paragraphs from it (it's interesting to note how Intel licensed Imagination's SGX535 (yes, the one in the iPhone!) for the Atom's GMA 500 but had some other company write the drivers for it):

    The much more interesting story continues here:
    http://www.brightsideofnews.com/pri...ient-truth-intel-larrabee-story-revealed.aspx
     
  22. macrumors member

    Joined:
    Oct 26, 2007
    #22
    With all this talk about Intel IGPs, could someone tell me if any of them support OpenCL?
     
  23. macrumors regular

    Joined:
    Apr 14, 2006
    Location:
    Massachusetts
    #23
    I don't believe they do.
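    If you want to check for yourself, here's a quick sketch of my own (hedged: it assumes an OpenCL runtime like the one in Mac OS X 10.6, with error checking omitted) that lists the devices the runtime exposes. If the Intel IGP doesn't show up, OpenCL work can't be sent to it.

    /* Build on Mac OS X 10.6 with: gcc listcl.c -framework OpenCL */
    #include <stdio.h>
    #include <OpenCL/opencl.h>   /* <CL/cl.h> on other platforms */

    int main(void) {
        cl_platform_id plat;
        cl_device_id devs[8];
        cl_uint i, n = 0;
        char name[256];

        clGetPlatformIDs(1, &plat, NULL);
        clGetDeviceIDs(plat, CL_DEVICE_TYPE_ALL, 8, devs, &n);
        for (i = 0; i < n; i++) {
            clGetDeviceInfo(devs[i], CL_DEVICE_NAME, sizeof(name), name, NULL);
            printf("device %u: %s\n", i, name);  /* e.g. CPU and/or GPU names */
        }
        return 0;
    }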
     
  24. macrumors 601

    Joined:
    Jul 20, 2008
    #24
    There are two ways discrete graphics can go. One is the way of the discrete sound card. The other is GPGPU/HPC.

    That is the Larrabee microarchitecture.

    Read the blog again.

    Also:

    http://www.semiaccurate.com/2010/05/27/larrabee-alive-and-well/

     
  25. macrumors 601

    Scottsdale

    Joined:
    Sep 19, 2008
    Location:
    U.S.A.
    #25
    I have read conflicting reports. I think the problem is that, the way the chip is designed, it really wouldn't benefit a Mac OS X computer. With an NVIDIA GPU, the full power of the GPU is always available. With the Intel Core i5/i7, the CPU is really two dies on one chip, and the power budget is available to either the CPU die or the GMA die. If the GMA is being overclocked, there is no extra headroom for the CPU die; if the CPU is "boosting," the GMA die is operating at its lowest possible clock speed (a crude illustration of the shared budget is below).
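    A crude way to picture that shared budget (my numbers are purely illustrative assumptions, not Intel's actual turbo algorithm):

    #include <stdio.h>

    int main(void) {
        const double package_tdp = 18.0;  /* assumed ULV package budget, watts */
        double gma_draw = 6.0;            /* assumed GMA draw while overclocked for graphics */

        /* Whatever the GMA takes comes straight out of the CPU's turbo headroom. */
        double cpu_budget = package_tdp - gma_draw;
        printf("CPU headroom while the GMA is busy: %.0f W of %.0f W\n",
               cpu_budget, package_tdp);
        return 0;
    }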

    What I always think of too (though I never really use it as an argument against the Core i7 CPUs) is that the ULV CPUs only use 800 MHz memory. I wonder if Apple would go backwards there too. It just seems like Intel has made every chip mediocre by forcing the GMA die onto the CPU. Intel's replacement for the SL9x00 CPUs is the Core i7-6x0LM line; those CPUs run with 1066 MHz RAM. There are so many problems with going to these Intel Core i7 ULV CPUs that it just doesn't seem worth it. Apple would have to give its MBA users a GMA IGP that is far less capable for standard graphics and that virtually eliminates, or at least severely reduces, OpenCL capability.
     
