After reading this thread it is clear that there are those who bought a 2011 MBA and are defending their purchase by ignoring facts.

Not at all.

I know exactly where Intel's IGP stands in the realm of performance. I bought it fully knowing what to expect from its performance, and it has performed exactly as expected.

If I wanted a gaming computer, I'd have spent my money more wisely than on an MBA. But no, I didn't want a gaming laptop, I wanted an ultraportable. If it also happens to be able to do some light gaming, that's a bonus.

We have every right, though, to question why Intel has historically been a generation or two behind its competitors in performance.

We are all consumers who look for the best "bang for our buck". It's not unreasonable to ask Intel: if AMD and Nvidia can create a competent GPU, why can't you?
 
We are all consumers who look for the best "bang for our buck". It's not unreasonable to ask Intel: if AMD and Intel can create a competent GPU, why can't you?
Strange question. ;)
It is the wrong question in any case. The right question is "Why won't you?", and that one has been sufficiently answered in this thread.

Intel HD 3000 - ~115 Million transistors
AMD Radeon HD 3450 - 181 Million transistors - 8 SIMDs
AMD Radeon HD 4550 - 242 Million transistors - 16 SIMDs
AMD Radeon HD 5450 - 292 Million transistors - 16 SIMDs
AMD Radeon HD 6450 - 370 Million transistors - 32 SIMDs
Nvidia 320M - based on the GT216, which is a whopping 486 Million transistors with 48 shaders. This is NOT the transistor count of the MCP89 (aka 320M), which is probably a lot higher.

Even though the others pack their transistors more tightly, Intel needs a fraction of the transistor count to get almost comparable speed. It is a tiny GPU, and if they ever wanted it to be anything else, it just wouldn't be so tiny.
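As a rough illustration (simple arithmetic on the counts listed above, not part of the original post; as noted later in the thread, the Intel figure leaves out the IMC and shared cache, so these ratios flatter Intel somewhat):

```python
# Quick ratio check using the transistor counts listed above (HD 3000 ~115M as baseline).
# These are the thread's own figures; the Intel number excludes the memory controller,
# shared LLC and power management that a dedicated GPU has to carry itself.

counts_millions = {
    "Intel HD 3000": 115,
    "Radeon HD 3450": 181,
    "Radeon HD 4550": 242,
    "Radeon HD 5450": 292,
    "Radeon HD 6450": 370,
    "Nvidia GT216": 486,
}

baseline = counts_millions["Intel HD 3000"]
for name, count in counts_millions.items():
    print(f"{name}: {count}M transistors, {count / baseline:.1f}x the HD 3000")
```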

They want to maximize their profit, and I can perfectly understand that they don't think etching huge CPU dies just to get mainstream gaming performance will earn them any more money than they lose in yield and manufacturing costs. Most people who game have a dedicated GPU anyway, with which Intel cannot compete, and the rest buy their CPUs in any case.
They just won't waste millions in profit so the few people who care about gaming on an MBA are happy, since those people buy the thing either way.
 
Typo on my part. I meant Nvidia. So if AMD and Nvidia can both create competent-performing low-end parts, why can't Intel? Especially since Intel's actual real-world performance seems to be on par with a generation-old device from the other two.

I said it earlier: I understand the approach that Intel is going for. It's the "good enough" approach of making extremely cheap and efficient parts. But the fact still remains that while they're "good enough", they're not really close to the competitors' offerings at that performance level.
 
I hope whoever typed that transistor count up realized that it does not include the memory controllers, nor the shared cache the HD 3000 IGP has access to. It also doesn't include the power control module (though, to be fair, the figure above is only an estimate at ~100 million transistors).
 
I said it earlier: I understand the approach that Intel is going for. It's the "good enough" approach of making extremely cheap and efficient parts. But the fact still remains that while they're "good enough", they're not really close to the competitors' offerings at that performance level.

No, they're not close to their competitors' offerings at that performance level because they're not designed to be performance-level parts.

And you can't compare transistor counts and relate them directly to performance. For example, when the Pentium 4 was first released it had 42 million transistors, yet it still got beaten by the Pentium III, which had 26 million. The difference was architectural: the Pentium 4 had a deep processing pipeline, I think 20-something stages, while the Pentium III had around 10. The CPUs we use are designed to predict what the next set of instructions will be, so if the Pentium 4 guessed wrong, it had to go back and refill its long pipeline, which took more time.
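To make that pipeline point concrete, here is a rough sketch of how misprediction cost scales with pipeline depth. The branch frequency and misprediction rate are illustrative assumptions, and the penalty is crudely modeled as one cycle per pipeline stage:

```python
# Rough sketch: why a deeper pipeline pays a bigger price for a mispredicted branch.
# All numbers here are illustrative assumptions, not measured CPU data.

def effective_cpi(base_cpi, branch_fraction, mispredict_rate, pipeline_depth):
    """Approximate cycles per instruction when mispredicted branches
    force a flush whose refill cost scales with pipeline depth."""
    mispredict_penalty = pipeline_depth  # crude: one cycle per stage to refill
    return base_cpi + branch_fraction * mispredict_rate * mispredict_penalty

# Assumed workload: ~20% branches, ~10% of them mispredicted.
for name, depth in [("~10-stage pipeline (P-III-like)", 10),
                    ("~20-stage pipeline (P4-like)", 20)]:
    cpi = effective_cpi(base_cpi=1.0, branch_fraction=0.20,
                        mispredict_rate=0.10, pipeline_depth=depth)
    print(f"{name}: effective CPI ~ {cpi:.2f}")
```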

Anyway, enough rambling :). I think my assessment of the HD 3000 as a very decent IGP stands, and that Intel's Core i5 ULV offers an incredible low-power solution.
 
After reading this thread it is clear that there are those who bought a 2011 MBA and are defending their purchase by ignoring facts.

Even if the HD 3000 IGP were only equal to the 320M, that alone shows how poorly designed the Intel IGP is, and in some cases the 320M is better despite being a year older.

All other arguments around what an ultrabook should be used for and other personal opinions are completely null.

Were I unhappy with my MBA I would be very vocal about it. I would NEVER justify a bad purchase by saying it's better than it is. My MBA is my fav computer I have ever owned... and it's a great little gamer!
 
Drivers?!?

Okay, I appreciate the hardware banter and reasons pro and con, but one thing I haven't read about is drivers.

One of the things that makes Nvidia and ATI/AMD stand out is the constant improvement of drivers, which can make or cripple brand-new hardware. Intel has been very slow to update its drivers regularly.

If they expect to compete, they need to start making driver updates that address issues with new software (games). Otherwise, it doesn't matter whether they have the best IGP or not.
 
I can't speak for Intel now, but back in 2002 Intel was constantly updating their drivers. Their 845G had a list of compatible games, and they updated it constantly to improve compatibility.

AMD, however, has left ATI to rot. ATI had horrible drivers until they released the legendary Radeon 9700; Catalyst was actually pretty damn good. However, after AMD bought them and released the HD 5 series, the drivers started sucking. I own a Radeon HD 5770, and one driver issue annoyed me so much that I got a new graphics card because of it: whenever you play Flash video, the UVD (video decoder) kicks in and locks the clock speed down from 815 MHz to 400 MHz. AMD never fixed this driver issue.

For now I can say that the only company that releases any halfway decent drivers is nVidia.
 
Yup. nVidia screws up big sometimes; however, they at least normally don't leave glaring flaws like the silly UVD/Flash downclocking around forever. It was annoying as hell, especially as a Steam user (on Windows, of course :p). My HD 5870 CrossFire setup was sold off at dirt prices because of the drivers.
 
This discussion raises the fundamental question of how the end user is employing his/her device, and I wonder if the non-technically inclined actually meter their expectations as we do (should?). After all, transistor counts and graphics chip manufacturers mean much less to folks in the real world than to us.

I only say that because if I take my tech glasses off and look at what the lesser iPad or iPhone can do (with such visually impressive feats as Infinity Blade II, for example), I don't think it's illogical to then look at a MacBook Air and expect (whether technically realistic or not) that it will wow me with a game or two as well. Especially if a certain precedent for graphical performance has been established by a prior generation of the device (as we could debate the 2010 MacBook Airs did).

And with more and more individuals buying laptops like the MacBook Air as primary machines to replace desktops, there's a growing expectation that these machines can do anything you need them to do ... and more! I think the risk of exposure for Intel and what some have called the "good enough" graphics philosophy grows as the lines between devices and performance expectations blur.

While I didn't buy my 2010 MacBook Air to game on (I'm a writer, so of course in that regard the laptop is overkill for my needs), I love the fact that I can squeeze relatively good performance out of games like Batman: Arkham Asylum (OS X) and NBA 2K12 (Wineskin port). Regardless of technical jargon, this adds value to the device (metered in oh-so-precious FPS) that I would lose with the current Intel graphics offerings.

Just humble food for thought ...
 
I hope whoever typed that transistor count up realized that it does not include the memory controllers, nor the shared cache the HD 3000 IGP has access to. It also doesn't include the power control module (though, to be fair, the figure above is only an estimate at ~100 million transistors).
You are right. The only thing that annoys me is that you are the only one who seems to even realize such a thing; most just argue benchmarks and look at a GPU like a black box.
It is a bit of an unfair comparison, yet counting the entire LLC toward the GPU wouldn't be fair either. Adding the IMC, power management and PCIe controllers that a dedicated GPU would have, one might have to add some 100-150 million transistors. Which just shows how ridiculous it is to cry for a new 520M chipset when the GPU on the die can save so much.
Yet compare the 6450, where the 32 SIMDs are well over 200 million transistors once you count the SIMDs plus the parts they cannot live without, like the ROPs and the logic that feeds them.
Yet the performance difference is small at still roughly 2:1 in transistor count.

I am aware that AMD packs its transistors much more tightly, which more than makes up for that seemingly bad number, and that they run at lower clocks, which means it doesn't draw more power.
It still means, though, that Intel is not as far behind AMD/Nvidia in "what they can do" as some people here insist is a fact.
Now with the Intel HD 4000 and the 100% increase in power efficiency of the EUs (due to 22 nm and design changes), it looks like the HD 4000 will beat anything from AMD/Nvidia in performance per watt with Ivy Bridge. And the most efficient little GPU that is fast enough is, IMO, the perfect on-die GPU.

Intel won't invest in drivers nearly as much as AMD/Nvidia will, even if they put something on the next die that could really rival the 400-core GPU in a Fusion APU (and I doubt they would). It would probably just draw more power than necessary and still make few people really happy. I prefer a small integrated GPU for all the normal stuff and a big dedicated GPU, with much better cared-for drivers, for actual gaming.
It is also not entirely clear how fast a GPU you can put on a CPU die before it needs a third 64-bit memory channel. The Fusion GPU sits next to a comparably slow CPU that runs at low clock speeds when the GPU is in use. How much memory bandwidth remains, and at what point would it just become far too big a chip? Also, past a certain performance point, a GDDR5-equipped dedicated GPU is probably much more efficient.
There are many things one must account for in designing a decent on-die GPU, and a great many people in this thread who complain about the performance don't answer a single one of them.
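As a rough illustration of that bandwidth question (typical/assumed figures, not measurements from the post above): a dual-channel 64-bit DDR3-1333 interface, which the CPU and IGP have to share, versus a modest dedicated card with 128-bit GDDR5 at an assumed 4 GT/s effective:

```python
# Back-of-the-envelope memory bandwidth comparison (illustrative assumptions only).

def bandwidth_gb_s(bus_width_bits, transfer_rate_mt_s, channels=1):
    """Peak theoretical bandwidth in GB/s: bus width (bytes) * transfers/s * channels."""
    return (bus_width_bits / 8) * transfer_rate_mt_s * channels / 1000

# Shared system memory: dual-channel 64-bit DDR3-1333 (typical for a 2011 ultraportable).
shared = bandwidth_gb_s(bus_width_bits=64, transfer_rate_mt_s=1333, channels=2)

# Dedicated GPU: 128-bit GDDR5 at an assumed 4000 MT/s effective.
dedicated = bandwidth_gb_s(bus_width_bits=128, transfer_rate_mt_s=4000)

print(f"Dual-channel DDR3-1333 (CPU + IGP share it): ~{shared:.1f} GB/s")
print(f"128-bit GDDR5 @ 4 GT/s (GPU alone):          ~{dedicated:.1f} GB/s")
```

By that rough arithmetic the dedicated card has about three times the bandwidth, and it has it all to itself.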

In theory Intel could put a bigger GPU into the LV dual-core Ivy Bridge, because that has more memory bandwidth to spare and it is a smaller chip than the quad-core. Quad-core notebooks are most of the time equipped with dedicated GPUs, so nobody would complain. The only problem is that Intel would want to charge quite a bit more for such a CPU, and they want to use one and the same die for the Core i3 and Core i5 in very cheap PC notebooks. I am pretty sure they know all the costs involved and just don't think it is worth it to make, say, more variants of the chip. They want to bin the LV chips from all the rest and sell that stuff cheap. If they made higher-end dual-cores they probably couldn't match the yields and demand as well.
 
I'm really torn between waiting for the 2012 MacBook Air and switching back to a PC ultrabook after having used PowerBooks/MacBooks for so long...

I'm tired of the under-specced, underpowered and overpriced MacBooks you can get.
My friend has a three-year-old PC laptop that can emulate the Wii pretty well; my MacBook Air can't even properly emulate the GameCube, a 10-year-old console.

As for games, well, most of them still aren't released for the Mac, and they wouldn't work anyway.
 
If you care so much about gaming, Windows is by far the better option.
I personally would not buy an MBA, or if I did, only for the touchpad with BetterTouchTool. On the go, a bright matte ultrabook is the better platform, and Windows touchpads have gotten better, I have heard. At home I wouldn't be happy with such a small, thin notebook in any case.
 
When I was a college kid I cared about white papers and specs, thought Windows XP was the best, and wasn't interested in any alternative to what I thought was right. Now, as an adult working in a very technical industry, using high-end computers every day and switching between all the OSes daily, I have come to ask one question when I buy a computer: does it work? If the answer is yes, I am happy... if it's no, I am not happy.
 
I'm really torn between waiting for the 2012 MacBook Air and switching back to a PC ultrabook after having used PowerBooks/MacBooks for so long...

I'm tired of the under-specced, underpowered and overpriced MacBooks you can get.
My friend has a three-year-old PC laptop that can emulate the Wii pretty well; my MacBook Air can't even properly emulate the GameCube, a 10-year-old console.

As for games, well, most of them still aren't released for the Mac, and they wouldn't work anyway.
The MacBook Air isn't an ordinary laptop, it's an ultrabook/ultraportable. Its size commands a premium because it's lighter and smaller than most laptops. This is one cost I could justify, because other ultrabooks by ASUS/Toshiba/Acer were similarly priced, if not more expensive.
 
I've seen identically spec'd devices from Samsung for more money than the MBA.
 
If you care so much about gaming, Windows is by far the better option.
I personally would not buy an MBA, or if I did, only for the touchpad with BetterTouchTool. On the go, a bright matte ultrabook is the better platform, and Windows touchpads have gotten better, I have heard. At home I wouldn't be happy with such a small, thin notebook in any case.

They haven't gotten better. Relative to everything not Apple, they are better, but still not anywhere near the same. On an Apple, you can glide across the touchpad. On Windows... you've got to smash into the pad and take it by the reins.


Something like that.
 
If you care so much about gaming, Windows is by far the better option.
I personally would not buy an MBA, or if I did, only for the touchpad with BetterTouchTool. On the go, a bright matte ultrabook is the better platform, and Windows touchpads have gotten better, I have heard. At home I wouldn't be happy with such a small, thin notebook in any case.

The pro/con argument of 320M vs HD 3000 vs HD 4000 is splitting hairs - from a serious gaming standpoint (I can post benchmarks from popular sites) they all suck, IMHO.

Unless Intel licenses its chipsets to Nvidia, making GPUs fit into small packages will be expensive and difficult. Based on benchmarks of the FX series, AMD has dropped the ball on its first set of IGP-integrated APUs (hate the name), but there may be hope in the near future.

Apple isn't out to screw its customers - it's trying to make the most cost-efficient business decision that will end in a good product that sells.
The MBA is all about compromises - like most laptops. Size and energy efficiency trump speed.

I game on my MBA and up to a point it does the job!

I have Civ 5 and Skyrim (both in Boot Camp) on my 2011 MBA i7, and while it's not the fastest system, I'm level 56, about 2/3 done with Skyrim, and enjoying Civ 5. I have quite a few mods installed for Skyrim to improve the textures, and it's a beautiful game.

In an ideal world I'd have built myself a gaming desktop (like I did in years past) or purchased another iMac with the best GPU - but for what it's worth I can do most of what I want.
 
True, they all, relatively speaking, suck. However, finding out which sucks less... is just as much fun as the joy of the IGP even being able to load the game (especially true for people who still remember the Intel GMA 950/945GM, and the "Intel Extreme Graphics... 2").
 
Haha "sucks less..."

The "Intel Extreme Graphics" - You'll find it as a proper example of an oxymoron in the dictionary.

AMD may have made a mistake in swallowing ATI - they now have to fight on two fronts: building proper CPUs that are profitable and competitive with Intel, and GPUs/IGPs that are competitive with Nvidia. It's going to be interesting to watch how this all unfolds. Intel could surprise us and actually bring a game-changing IGP to market... A fat pig just flew past my apartment... Sorry, Intel deserves a little ***** for the last 10 years of "extreme graphics" IGPs :p

If you wanna game on the 2011 MBA you can (first-hand experience) - it just depends on your expectations.
 
Haha "sucks less..."

The "Intel Extreme Graphics" - You'll find it as a proper example of an oxymoron in the dictionary.

AMD may have made a mistake in swallowing ATI - they are now having to fight two fronts - building proper CPUs that are profitable/competative with Intel and GPUs/IGPs that are competative with Nvidia. It's going to be an interesting time watching how this all unfolds. Intel could surprise us and actually bring a game changing IGP to market... A fat pig just few outside my apartment... Sorry, Intel deserves a little ***** for the last 10 years of "extreme graphics" IGPs :p

If you wanna game on the 2011 MBA you can (first hand experience) - Just depends on your expectations.

Lol, there is a bit of history on that. Originally, AMD wanted to acquire nVidia, but the nVidia CEO had only one requirement: he would become CEO of the new company. Hector Ruiz, then CEO of AMD (just after he ruined Motorola), said no. nVidia's shareholder voting rules are extremely strict, and nVidia actually has a protection clause from MS (it has to do with the original Xbox license and backwards-compatibility licensing), which made a hostile takeover of nVidia near impossible.

However, when Intel announced Larrabee, AMD panicked and bought ATI for an astronomical sum. Whether that sum was worthwhile we'll know later on, since the fruits of that relationship are only now becoming available.

So I hope Trinity (the Llano successor) will be awesome for its power consumption. It covers, in power consumption, every point a mobile Ivy Bridge should.

Though, sadly, Wichita (the Brazos/Zacate successor) got canned :( It had a lot of integration and technologies that Intel wasn't even planning on implementing until Haswell. Oh well. Hopefully AMD's long switch to going full-on with TSMC will work out in the end :)


Anywho, have fun! :D I'm now patiently waiting on the next 13" laptops from Apple :)
 
I think the only reason AMD is staying afloat is ATI. I don't know how long that will last, though. AMD does a lot of half-assing to save on manufacturing costs. They switched to an earlier version of VLIW (what the HD 6 series uses) because they said a few instructions "weren't needed", which to me reads as "we are cutting costs to make it cheaper", but what it ended up being was AMD playing a confusing name game: the 6 series was no better than the 5 series, or the upgrade wasn't worth the price.

Yeah, have fun guys. While you might be able to play Battlefield 3 on your MacBook Airs, I'll enjoy some Battlefield 3 action at maxed-out settings and 1920x1080 :cool:, and hopefully my MacBook Air won't cry over heat issues...
 
Lulz

After reading this thread it is clear that there are those who bought a 2011 MBA and are defending their purchase by ignoring facts.

All other arguments around what an ultrabook should be used for and other personal opinions are completely null.


Sorry, I need to post a response to this comment (puts on smart ass hat) :D

*****************************************

So cupcake, I was apparently ignoring facts and wanted to justify my purchase - maybe you can help me out - I need a laptop that lets me do:

  1. WORK (so I can eat and have a roof to protect my PS3 and 3D TV)
  2. CLASS WORK
  3. weighs about 3lbs
  4. fits in my briefcase
  5. lets me work on the train to NYC (that's a place on the map, fun to visit)
  6. fits on a tiny seat tray in economy on a plane.
  7. watch Twilight Sparkle on YouTube
  8. enjoy OS X
  9. oh yeah, play video games such as Skyrim and Civ 5 anywhere I want

Please enlighten me, oh PC GURU, as to what system will allow me to do all of the above... I am an awful person and can't defend my purchase anymore.

I'm not sure what planet you live on that makes you think people buy an MBA just to game... Gaming, while important, is not high on my list of priorities... Life is about compromises, lol

xoxo

P.S. As I mentioned in the previous post, the 320M/HD3000/HD4000 all SUCK for gaming, so even waiting for a 2012 model would mean it still sucks as a gaming machine :D

(takes off smart ass hat) :D
 
All I know is I feel like I bought a couple of SB Macs expecting Toyota Prius performance from the GPU, and instead I feel like I have the heart of a Mazdaspeed6 under the hood, at that Prius price. I think that's what the anti-Intel crowd fails to understand.
It's good enough for most users, and Intel is balancing performance and power consumption against a price point.
If you don't like the HD 3000, then buy a different ultrabook with a dedicated GPU that meets your performance requirements.
 
I think the only reason AMD is staying afloat is ATI. I don't know how long that will last, though. AMD does a lot of half-assing to save on manufacturing costs. They switched to an earlier version of VLIW (what the HD 6 series uses) because they said a few instructions "weren't needed", which to me reads as "we are cutting costs to make it cheaper", but what it ended up being was AMD playing a confusing name game: the 6 series was no better than the 5 series, or the upgrade wasn't worth the price.

Yeah, have fun guys. While you might be able to play Battlefield 3 on your MacBook Airs, I'll enjoy some Battlefield 3 action at maxed-out settings and 1920x1080 :cool:, and hopefully my MacBook Air won't cry over heat issues...

Ya know... if Trinity is all it's cracked up to be, it will actually match Ivy Bridge at the 17 W level :eek:
 