Look at a 13" MBP logic board from 2010 and you should see how ridiculous that assumption is.

That is not much different from the PCH/CPU combo on a Sandy Bridge logic board:

[Attached image: Sandy Bridge logic board, showing the PCH/CPU combo]


Remember, the PCH doesn't contain the GPU anymore; it's on the CPU die. Yet there we are with about the same apparent sizes.

Two things you still fail to grasp:

- the 320M is a throwback to April 2010. Sandy Bridge shipped a year later. Comparing a year-old IGP to a current one, and claiming the new one is "good enough" when it can't even touch the year-old one, is asinine. The 4xx series of nVidia chips was shipping when Sandy Bridge was introduced; if anything, we should be comparing it to a 410M (released in January 2011, along with the 5xx-series GPUs; there are no nVidia IGPs anymore since nVidia bowed out of the chipset business because of Intel).

- the Intel HD 3000 is built on a 32 nm process, whereas the MCP89 is built on a 40 nm process.

In other words, your die-size comparison is moot, and Intel is being compared to technology from the previous generation. You claim they are competent because they managed to achieve what the competition did the year prior. Congratulations. Meanwhile, if they hadn't been a bunch of crybabies and litigators, we'd be running 5xx-series IGPs from nVidia that would crush the Intel HD 3000, as has been proven time and again each generation.

But please, do keep apologizing for Intel. I don't get why you feel so strongly about defending them on this.
 
Okay, we can all agree to a few things:

1. Intel's integrated graphics are bad for gaming and other 3D workloads. But they have improved a lot since the GMA days.

2. It's a pity that Apple doesn't launch a MacBook Air with an AMD Fusion chip, but chips like the E-350 offer better graphics performance at the cost of much lower CPU performance.

However:

No matter how much we bicker, Apple isn't going to turn away from Intel because most consumers value better CPU performance over GPU performance.

Meanwhile, if they hadn't been a bunch of crybabies and litigators, we'd be running 5xx-series IGPs from nVidia that would crush the Intel HD 3000, as has been proven time and again each generation.

Intel isn't being a crybaby. They recognized that Nvidia was a threat to their chipset business, and cut them out. If you were Intel, and you were on the verge of losing millions of dollars due to a competitor, wouldn't you use your patent portfolio to shut them out? It's perfectly logical. Remember: It's about the bottom line, not the consumer. And apparently, the consumer doesn't even care about "better graphics" all that much, outside of the nerds who bother to play video games on their MacBook Airs with outsize expectations about performance.

There won't be third-party graphics on the MacBook Air for the foreseeable future. And might I add that AMD's graphics chips offer even better performance per watt than Nvidia's? There's a reason Apple dumped Nvidia lately.
 
Remember, the PCH doesn't contain the GPU anymore; it's on the CPU die. Yet there we are with about the same apparent sizes.
Just that they are not: the big one here is 150 mm². Which means, by my previous calculation, the 320M in 40 nm would take up almost two thirds of that die; in 32 nm it would be about one third, not any different from the HD 3000.
In any case it already looks pretty cramped with the SFF packaging. A quad core wouldn't even fit, and neither would a 32 EU Intel HD 3000.
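For reference, here is the back-of-envelope math behind that (a rough sketch: the 150 mm² die and the two-thirds share are my estimates from earlier, and ideal quadratic area scaling is assumed, which real shrinks never quite reach):

```python
# Back-of-envelope die-area scaling: assume area shrinks with the
# square of the feature size (ideal scaling; real shrinks do worse).

def scaled_area(area_mm2, old_nm, new_nm):
    """Estimate die area after a process shrink."""
    return area_mm2 * (new_nm / old_nm) ** 2

sandy_bridge = 150.0               # dual-core Sandy Bridge die, ~150 mm^2
mcp89_40nm = sandy_bridge * 2 / 3  # 320M/MCP89 estimate at 40 nm, ~100 mm^2

mcp89_32nm = scaled_area(mcp89_40nm, old_nm=40, new_nm=32)

print((32 / 40) ** 2)             # 0.64: a shrink needs ~64% of the area
print(mcp89_32nm)                 # ~64 mm^2
print(mcp89_32nm / sandy_bridge)  # ~0.43, i.e. roughly a third of the die
```

That 0.64 factor is also where my "about 60% of the space on a smaller process" figure below comes from.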
Two things you still fail to grasp:
I grasp both of your remarks; you either deliberately ignore my answers or don't understand them.
The 320M is 40 nm, and up till now that is what Nvidia has had to work with. A 520M is no smaller. It doesn't matter that it is a 2010 GPU; it is 40 nm and big, and that didn't change in the meantime. It will change with 28 nm, but by that time Intel will already be at 22 nm.
Compare it to a 410M, a 520M, or anything you want. Those usually just have higher clocks; the shaders and die space didn't change. Also, a 520M with a 17 W TDP is way too hot for an IGP; it would need to run much slower.

- the Intel HD 3000 is built on a 32 nm process, whereas the MCP89 is built on a 40 nm process.
Now where did I fail to grasp that? I mentioned it in the calculations more than sufficiently. It doesn't change a thing. Even normalized to the same process node, Intel is about on par with Nvidia in die size vs. performance.

You claim they are competent because they managed to achieve what the competition did the year prior. Congratulations. Meanwhile, if they hadn't been a bunch of crybabies and litigators, we'd be running 5xx-series IGPs from nVidia that would crush the Intel HD 3000, as has been proven time and again each generation.
No, I claim they are competent because they can cram the same kind of performance into the same space, regardless of the production year.
You claim they are incompetent because they didn't dedicate more space to the GPU.
Check out the specs of a 520M. That thing is no smaller than a 320M; it has a faster clock speed but also too high a TDP, and thus would have to run at somewhat reduced clocks. It is almost the very same architecture, and the performance it gets from all that additional clock is not exactly crushing. Most games are still unplayable on anything but low settings.
Compare that one to the AMD Fusion GPU on the A8 Llano: much more impressive, but also a much bigger GPU. That thing is almost half the die, but that is also the selling point.
I think, comparing die space used and power efficiency, that in terms of pure gaming power the AMD VLIW5/VLIW4 architecture is the most efficient of them all. Not very good for GPGPU, but great for gaming.

It sounds as if you'd have to give up your religion of Intel hating. Look at the facts and explain why I am wrong in my claim that Intel is very much competent at what they are doing, because they fit a roughly equally capable GPU into an equally big space (even when you normalize the data to the same process node).
And stop ignoring stuff I have already explained. Either ask if I am unclear or point out actual errors.

Besides, whining about the fact that there are no Nvidia IGPs anymore doesn't help. Intel has the process node advantage, and currently that more than makes up for an architecture that might not be quite as efficient were it on the same process node. An Nvidia 520M IGP wouldn't really change anything. It would only make power management really complicated, with CPU and GPU turbos (and any implementation would be much slower and less reliable). It would also require dedicated memory, exactly like a real dedicated GPU (which anybody is free to implement), because taking the IMC away from the CPU and moving it back would cripple performance a lot. A dedicated IMC would mean additional space, additional VRAM chips somewhere on the logic board, and additional power consumption.

What, then, is the benefit over just putting a real second dedicated GPU on the logic board instead of fusing it with the MCH? The space requirements wouldn't be all that different considering all the IMC stuff. And once you think it worthwhile to add a 520M to the MBA, it would be a waste to have a 32 EU Intel HD 3000 on the chip, which would hardly fit into the SFF package anyway.
Also, the good thing about an on-die GPU is that it shares the LL cache, which kills off the GPGPU bottleneck of data transfer. That is probably one of the reasons why QuickSync is so much better than anything comparable on CUDA with so little hardware. This Intel GPU has the potential to be orders of magnitude more efficient at executing some OpenCL workloads than any Nvidia/AMD GPU.
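To make the data-transfer point concrete, here is a minimal PyOpenCL sketch (a toy kernel of my own; it assumes a driver that exposes the IGP through OpenCL, and real zero-copy additionally wants page-aligned buffers, which I gloss over here):

```python
import numpy as np
import pyopencl as cl

ctx = cl.create_some_context()
queue = cl.CommandQueue(ctx)
mf = cl.mem_flags

a = np.arange(1 << 20, dtype=np.float32)

# On an on-die GPU that shares system memory (and the LLC) with the
# CPU, USE_HOST_PTR lets the driver work on the host allocation in
# place: no staging copy across a bus as with a discrete card.
buf = cl.Buffer(ctx, mf.READ_WRITE | mf.USE_HOST_PTR, hostbuf=a)

prg = cl.Program(ctx, """
__kernel void scale(__global float *x) {
    int gid = get_global_id(0);
    x[gid] *= 2.0f;
}
""").build()

prg.scale(queue, a.shape, None, buf)

# Mapping the buffer back (instead of copying it) is likewise nearly
# free on shared-memory hardware.
out, _ = cl.enqueue_map_buffer(queue, buf, cl.map_flags.READ,
                               0, a.shape, a.dtype)
print(out[:4])  # [0. 2. 4. 6.]
```

On a discrete card the same code pays for two trips across the bus; that transfer is exactly the bottleneck an on-die GPU avoids.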


You can complain all you want that the GPU is too small for you and that you want an Intel HD 3500, which would be a doubled HD 3000. I just think you lack arguments to support your claim of Intel's incompetence in that field.
 
Intel IGPs have always, ALWAYS, been behind the ball in regards to performance. This isn't anything new. They're typically one or two generations behind the similarly priced counterparts from the two major houses (Nvidia and ATI).

This dates back to the earliest computers I remember ever touching. It's more a testament to their mindset.

While NVIDIA and ATI constantly battle for the performance crown and the gaming environment, pushing boundaries with their tech, Intel has gone a completely different route.

Intel used to completely ignore the ongoing battle for performance superiority. Instead, they have always touted their IGPs as the low-cost "free" alternative for those who have no need to push their computer. It's why they've managed to get their IGPs into virtually every single market: servers, desktops, laptops, integrated devices. If you need a display, but one that doesn't need to push any boundaries, Intel's integrated parts tend to be the best and cheapest solution.

This has worked wonders for Intel. We, the people who love performance, harp on how poorly they handle modern games. While they may care a little, it's not their top priority. And historically, Intel's direction with their IGPs HAS worked.

Look at the market share. These are numbers that will absolutely shock you (not the most up to date, but they don't vary greatly).

AMD = 24.8%
Intel = 54.4%
Nvidia = 20%
Matrox = 0.05%
SiS = 0%
Via/S3 = 0.07%
Total ≈ 100%

Reports have Intel actually taking about 80% of the PC market share, with 66% of all new computers shipping with Intel's IGP.

AMD and Intel are both showing growth in market share, and AMD is NOW getting into the IGP market with their Llano architecture.

I think with Llano finally bringing decent performance to the IGP market, you will see Intel give their next few chips a boost. It might be too late for Ivy Bridge, but the following ones might start seeing some reasonable performance. Until then, expect the status quo from Intel's parts.

The only real loser in all this seems to be Nvidia. Without the ability to make their own CPUs with integrated graphics on chip, it's an entire market segment they're unable to compete in.
 
I'm with BOTH of you on this.

An Nvidia option was out of the question. The space, the added circuitry, the extra cooling for two separate chips, and whatever else is needed to fit a secondary graphics chip would likely mean we lose out on something else. Perhaps no Thunderbolt. Perhaps only one USB port (like the first Air).

The future of integrated graphics is CPU+GPU combinations like Intel and AMD both have. For the ultraportable market, it's a no-brainer.


But KnightWRX is right in the statement that Intel's "good enough" model is frustrating. It's a cheap cop-out. They've done this traditionally for years. They do NOT push their development teams to make top-performing parts. They have always subscribed to the "good enough" mantra.

This is frustrating because yes, their graphics performance, while "good enough" for desktop use and whatnot, doesn't give you any added bonus. It's exactly what you get and that's it. Without the push to make a top-performing part, we've always gotten subpar performance out of Intel each generation compared to the competitors. The AMD Llano chip has managed to create an all-in-one package that not only crushes Intel's offerings, but keeps up with Nvidia's parts that require extra space.

I've said it for years: I WISH Intel would try harder. But given what I said in my previous post, it's going to take a while before they realize the threat AMD poses to them.
 
I wonder why Intel won't just buy NVIDIA and be done with it. It's obvious Intel struggles at building integrated graphics, as well as SoCs, which are two things that NVIDIA does pretty well.
 
A year later. On a smaller process. Keep on apologizing. End discussion.
It is sad. You cannot so much as argue against a single one of my responses. The only thing you do is repeat your stubborn, stupid, uneducated prejudice and dismiss an entire post with an "end of discussion."
At the beginning I was under the wrong assumption that I was not arguing with one of these new-age Internet kids who cannot comprehend a whole paragraph if it has more than three sentences.

A year later, but on the same process. On a smaller process they would only need about 60% of the space for similar performance.

But KnightWRX is right in the statement that Intel's "good enough" model is frustrating. It's a cheap cop-out.
He never actually stated that. He keeps insisting on other things. Such a statement would be valid, and I tried pushing him toward it, but it is not his.

This is frustrating because yes, their graphics performance, while "good enough" for desktop use and whatnot, doesn't give you any added bonus. It's exactly what you get and that's it. Without the push to make a top-performing part, we've always gotten subpar performance out of Intel each generation compared to the competitors. The AMD Llano chip has managed to create an all-in-one package that not only crushes Intel's offerings, but keeps up with Nvidia's parts that require extra space.
True. Imagine they made a part that is almost equal to Llano, say 60% of the GPU speed of an A8. They probably could, even if it became a big chip (maybe make variants so not everybody has to buy them), but what happens next? The A8 is currently still limited by its low-clocked, poor CPU, but with Sandy Bridge cores such a part would eat pretty much all the market share that AMD and Nvidia have in the low end and entry midrange.
That would hurt them badly. I think it is good that Intel leaves the midrange as a cash cow for the GPU makers, so they can still earn enough for affordable high-end GPU development. Otherwise there wouldn't be any more 7970 Graphics Core Next architectures, or they might require prices in the workstation card range.
Intel earns **** loads of money, but if AMD/Nvidia lost the valuable midrange they would be in serious trouble. With the HD 3000, Intel for the first time pretty much made all those low-end cards like the 5450 useless.
Intel can afford just about any R&D they care about. For AMD/Nvidia it is a big chunk of their costs.
If Intel wanted to make the best GPU around, they would just put a team on it bigger than AMD and Nvidia combined can afford. It is not worth it for them, and it would certainly require more money than the competition needs for the same output.

I am not sure that (a GPU as fast as a midrange card) would be a good thing. Intel already has too little competition as it is, IMO. They operate as a quasi-monopoly. The only reason to move forward is so they can sell new stuff and people don't say the old is still good enough. They already charge near-monopoly prices, dependent only on how much they can charge to keep total earnings at a maximum; competition affects their prices very little.
The whole Ultrabook initiative is only there to make more money from their expensive chips. If there were more competition, those would be much cheaper. And the Atom would probably be much more prominent, as it is a really small chip they can sell for very little money.
 
I wonder why Intel won't just buy NVIDIA and be done with it. It's obvious Intel struggles at building integrated graphics, as well as SoCs, which are two things that NVIDIA does pretty well.

I've been saying this since the whole lawsuit mess started. Buy the competence you don't have and be done with it. Great solution for the consumer, win-win for both Intel and nVidia.
 
I've been saying this since the whole lawsuit mess started. Buy the competence you don't have and be done with it. Great solution for the consumer, win-win for both Intel and nVidia.

I'm sure Intel has thought about it, but concluded that it was better to spend more on R&D and create better graphics chips that way.

Furthermore, Intel's real problem isn't creating good hardware; it's writing good drivers. That is why the disparity between the 320M and the HD 3000 is so much greater in Windows than it is in OS X: Apple writes the drivers for the HD 3000 on their OS.
 
I'm sure Intel has thought about it, but concluded that it was better to spend more on R&D and create better graphics chips that way.
I think it has more to do with the kind of graphics chip they meant to create.
Their game had been to create something fast but not traditional. They wanted something better at GPGPU, something that is easier to program, and something more suited to ray tracing than state-of-the-art GPUs were.
Nvidia realized that a huge problem with their GPGPU effort is that it is so hard to write code for. They have tried a lot, but little has changed. Intel always went for taking as much effort away from the programmer as possible, and they have been quite successful at that.
On the other side, they only wanted to make a specialized small GPU, which is the HD x000: something small and very efficient at the specific tasks it might encounter, not an all-round, easily scalable, good-for-everything architecture.

Buying Nvidia might have helped them in the mobile space, but in the standard GPU space Nvidia didn't offer anything that fit their strategy. Back when it was reasonable to buy Nvidia, they also still thought Larrabee might be something decent to gain market share in big computing. By now they have sunk too much cost into their own GPU division for it to make sense to change course.

Also, Intel earns some decent margins; buying something as expensive as Nvidia would have needed some serious advantages to justify it.
 
Hey man, can I trouble you to post a detailed reply on how to go through these optimizations? I would love to implement them myself, and I'm sure other forum members would too!

Thank you in advance!

I just realized something: Skyrim on the MBA runs MUCH better on 32-bit Windows than on 64-bit!

I had been running it on 32-bit for a while, then I got Rage, which won't run properly under 32-bit. So I installed 64-bit and Rage worked great (even though it is a sucky game!), but when I went back to Skyrim I realized it was choppy and slow, with weird mouse lag issues. I reinstalled 32-bit Windows and now it's running great again: nice and smooth, no mouse lag, and I only get some choppiness when there are a few NPCs doing something in the frame. Otherwise it runs great!
 
What a splendid bunfight this thread has turned into. Mind you, I could just be posting this to nudge my post count nearer that magic 500 number, so I too am allowed a little picture.
 
Here's the thing. I think we can all agree that the industry is moving towards ULV. Purchasing a laptop with a dedicated graphics chip is going to become extremely hard in the near future.

So where does this leave Intel and their plan to lock out other IGPs on their chipsets? I think it leaves them in a great position before Windows 8 and the next generation of gaming consoles hit. After that, unless they make some serious changes, ARM and others will clean their clock.

If we all get trained not to need the power we once had on the road, and if content and game delivery can be handled efficiently by ARM designs, what's left for Intel in the consumer realm?

I personally cannot wait for this.
 
I don't see why everyone is making a fuss about the performance of Intel IGPs. I think we're forgetting what the point of an integrated chipset is.

I've had plenty of gaming experience with integrated chipsets. My first taste was the 845G "Extreme" graphics from Intel, and then an integrated Radeon VE. Both sucked horribly. I don't even want to think about how I played Return to Castle Wolfenstein: Enemy Territory on those chips, which could only churn out 4-20 fps at 640x480.

I've definitely had my fair share of IGP bashing... but the reality is this:

You have to realize most people don't care about 3D gaming performance. Intel caters to the bottom line: the family that uses a computer mostly to browse the web and write email; the corporate user who only cares about email, typing up documents, and browsing the web.

In that respect, Intel's IGP is a very affordable and cost-effective solution. It gets the job done: nothing more, nothing less. For the current HD 3000 to even rival the low-end Radeon HD 5xxx series is a shock to me. Never did I think Intel would achieve performance that good.

Everyone can sit here and complain about how AMD and nVidia are both going to beat Intel's head into the curb. It simply isn't going to happen. Talk like that goes as far back as 2002, and I still don't see Intel dwindling in the market share department :)

But you guys need to remember that the MacBook Air is in the ultrabook category. The only people who will use them are people who don't want to do 3D gaming. Ultrabooks are supposed to be severely underpowered but offer good battery life and mobility (i.e., it won't feel like you tossed bricks into your laptop bag). They're not for gaming. My uncle's ultraportable has a 9" screen and a Transmeta CPU. Definitely not for gaming.

So why all the hate? I just don't understand. Intel's IGP/CPU combo is a very efficient solution. The wattage ratings Intel quotes cover both the IGP and the CPU. My desktop Core i5 is rated for 95 W but uses (or rather, dissipates) only about 60 W, since the GPU is turned off. So I would not find it surprising that the 17 W rating of the Core i5 ULV in the MacBooks covers both the IGP and the CPU. In that respect, AMD's combo package will never be able to match that spec.

The Core i5 ULV is a very power-efficient chip. You guys should learn to appreciate it more ;)

I guess you could argue that the previous-gen MBAs had a GeForce + Core 2 Duo combo with better power efficiency. But the Core i5s in the MBA have Turbo Boost; the 1.7 GHz version can top out at 2.7 GHz. Quite a performance jump over a Core 2 Duo running at 1.6 GHz! Also, without the CPU horsepower to match the GPU's capabilities, it's useless to pair a good graphics adapter with an underpowered CPU. It's like expecting Skyrim to run well on a machine that has a Pentium 4 paired with a GTX 590.

In the end, a faster CPU translates to a smoother user experience when using the MBA for everyday tasks, and that is more important than gaming.

So if you guys want better GPU performance, keep a Windows desktop like mine, loaded with a GTX 470 with a beastly 448 cores. Keep the MBA as your work-and-browsing computer.
 
After reading this thread it is clear that there are those who bought a 2011 MBA and are defending their purchase by ignoring facts.

Even if the HD 3000 were merely equal to the 320M, that would show how poorly designed the Intel IGP is, and in some cases the 320M is better despite being a year older.

All other arguments around what an ultrabook should be used for and other personal opinions are completely null.
 
After reading this thread it is clear that there are those who bought a 2011 MBA and are defending their purchase by ignoring facts.

Agreed. Also, what the pro-HD 3000 crowd hasn't considered is that the weaker HD 3000 GPU increases the CPU's reliance on Turbo Boost, at the expense of more heat and lower battery life.
 
After reading this thread it is clear that there are those who bought a 2011 MBA and are defending their purchase by ignoring facts.

Even if the HD 3000 were merely equal to the 320M, that would show how poorly designed the Intel IGP is, and in some cases the 320M is better despite being a year older.

All other arguments around what an ultrabook should be used for and other personal opinions are completely null.

Of course I'd like a better IGP. I've been advocating that Intel buy NVIDIA. That said, I think it's perfectly reasonable for someone to conclude that the jump to Ivy Bridge isn't big enough for a 2011 MacBook Air owner to purchase a 2012 MacBook Air. I'll evaluate it when it comes out, but my current plan is to wait until 2013 or 2014. Haswell will bring a bigger boost to the CPU, as well as the GPU.
 
Of course I'd like a better IGP. I've been advocating that Intel buy NVIDIA. That said, I think it's perfectly reasonable for someone to conclude that the jump to Ivy Bridge isn't big enough for a 2011 MacBook Air owner to purchase a 2012 MacBook Air. I'll evaluate it when it comes out, but my current plan is to wait until 2013 or 2014. Haswell will bring a bigger boost to the CPU, as well as the GPU.

My main issue with Intel IGPs is not really FPS, since that cannot be dealt with easily. It's image quality. Even at the same settings vs. an AMD GPU (let's say...), Intel's texture filtering leaves much to be desired. This is especially true on Windows laptops with a switchable GPU, where the difference can be seen almost instantly (and idiot reviewers question companies that pair a switchable HD 64xx GPU with an Intel HD 3000 IGP). Of course, this would require Intel to actually implement texture HW... (given this is something AMD excels at vs. nVidia, I'd guess there are too many patents for Intel to bother).
 
After reading this thread it is clear that there are those who bought a 2011 MBA and are defending their purchase by ignoring facts.

Even if the HD 3000 were merely equal to the 320M, that would show how poorly designed the Intel IGP is, and in some cases the 320M is better despite being a year older.

All other arguments around what an ultrabook should be used for and other personal opinions are completely null.

Not really. I'd understand your argument if we were talking about the 13" MBP, but not the MBA. The MBA is NOT meant to be a gaming machine, and for anything other than gaming, you don't really need a dedicated GPU.
 