I've heard a lot of complaints from annoyed users about the Intel HD Graphics 4000 being terrible.

How "bad" is it? How would it compare if NVidia were able to make integrated graphics?

What are the things you can't do because of Intel HD Graphics?
 
The Intel integrated GPU is on the CPU die; that's why NVidia doesn't make integrated GPUs anymore...

Integrated GPUs are meant for low power consumption, hence they drain the battery less than a discrete GPU. On the other hand, the Intel integrated GPU can't beat NVidia's discrete GPUs in terms of performance...
 
The HD 4000 is actually a pretty solid performer; not sure what people are complaining about. It can pretty much drive the Retina display on its own. I think a lot of people underestimate that thing.

Edit: as far as what you can do with it, think of it like a hybrid car. You run on the batteries for puttering around and cruising; when you need power, the engine kicks in. Same thing here.
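The hybrid-car analogy above can be sketched as a toy switching policy in Python. This is purely illustrative (the task names and the rule are made up, not how OS X actually decides): stay on the low-power integrated GPU until something demanding shows up.

```python
# Toy model of automatic graphics switching: the integrated GPU (the
# "batteries") handles everyday work; the discrete GPU (the "engine")
# kicks in only for demanding workloads. Task names are assumptions.

def pick_gpu(active_tasks):
    """Return which GPU this simple switching policy would use."""
    demanding = {"3d_game", "video_editing", "cad"}
    if any(task in demanding for task in active_tasks):
        return "discrete"    # heavy load: the engine kicks in
    return "integrated"      # puttering around: stay on the batteries

print(pick_gpu(["web", "email"]))    # integrated
print(pick_gpu(["web", "3d_game"]))  # discrete
```

The real policy on a dual-GPU MacBook Pro is driven by which APIs an app touches (e.g. OpenGL), but the shape of the decision is the same.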
 
It's absolutely fine. But, like most integrated graphics, it just can't handle all the 3D rendering. So when you're playing Portal 2 or whatever, it switches over to the discrete GPU seamlessly. The sad thing is that the current Boot Camp only runs on the dedicated graphics, which eats battery. While I don't mind a 7+ graphics score in the Windows Experience Index, I don't need it for most real Windows apps, and it would be better if we had the choice. Sadly, we don't (currently). I imagine that as more Macs get dual graphics cards, the Boot Camp drivers may be updated.

When you're at work (unless your work is playing or writing graphics-heavy games), the Intel graphics do everything you might want.
 
Currently I'm running a mid-2009 13" MBP with nVidia 9400M integrated graphics. The HD 4000 is 2-6 times faster. That doesn't sound bad at all to me!
 
The difference is that the nVidia 9400M is integrated on the motherboard, while the HD 4000 is integrated on the CPU.
 
Intel has made huge improvements to their integrated graphics, especially the new HD 4000. I wouldn't call it bad by any means. Obviously it doesn't compare to dedicated GPUs in 3D rendering tasks, but it still gets the job done.
 
I understand that they're placed in different locations. When someone says "integrated graphics," they're usually talking about a low-end graphics processor that shares system memory. Both the nVidia 9400M and the Intel HD 4000 have this in common.

Anyway, my original point is that "bad" is subjective. I think the HD 4000 would be great, but that's just because it's three years newer than what I'm using. On the other hand, it's junk compared to the GeForce GTX 670 SLI setup in my gaming rig.
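Since both chips share system memory, a quick bit of arithmetic shows what that costs. Both figures here are assumptions for illustration (8 GB of RAM, a 512 MB dynamic allocation), not measured values for any particular machine.

```python
# Shared-memory graphics: the integrated GPU borrows its video memory
# from system RAM, so that memory is unavailable to the OS and apps.
# The sizes below are assumed example values.

total_ram_mb = 8192        # 8 GB of system RAM (assumption)
gpu_allocation_mb = 512    # dynamic allocation the driver might reserve (assumption)

remaining_mb = total_ram_mb - gpu_allocation_mb
print(f"RAM left for the OS and apps: {remaining_mb} MB")  # 7680 MB
```

The bigger hidden cost is bandwidth: the GPU and CPU compete for the same memory bus, which is one reason shared-memory parts trail discrete cards with dedicated VRAM.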
 
I've been fairly impressed so far with the Intel HD 4000. It's obviously not going to be as powerful as a dedicated GPU, but it's fared pretty well with most tasks, including some light gaming.
 
What if Intel acquires NVidia :D?... Not sure if it would be approved by antitrust regulators though...
 
NVidia's integrated option was better at the time. For a while Intel paid virtually no attention to integrated GPUs. AMD also has some relatively strong integrated options, yet this is arguably the first credible one from Intel. A big problem when this comes up is that it's commonly misunderstood.

I've seen the board differences. Both would be classified as integrated graphics, and neither used its own RAM. NVidia is out of that business due to Intel's lawsuit and their eventual settlement.

It's not like what you would get from discrete graphics, yet it should support OpenCL 1.2, although I don't think that's working yet under OS X. It should also support the latest DirectX if you're running Boot Camp at all.
 
Intel simply has a very bad reputation.
The Intel GMA 950 was crap.
The GMA 3100 was bad, and the 4500 was still abysmal.
Intel HD Graphics (Arrandale) was a huge jump in performance and the first one usable for driving everything in 2D. Still not great in 3D.
Intel HD 3000 was another huge jump. Now you could play almost all 3D titles on low settings. The GPU was especially good at low settings, even compared to Nvidia and AMD; Intel optimized for that because they never intended to ship a GPU that could handle high settings anyway.
Intel HD 4000 was another big jump and finally beats all the low-end GPUs.

The thing is, Intel is still judged by that old GMA crap, even though they put a huge effort into the HD series. If you compare the rate of improvement to AMD and Nvidia, they are almost equal now.
They also have the edge in process technology: they already use 22 nm.
Whatever AMD or Nvidia could cram into such a small space next to the CPU wouldn't be any better.
Look at AMD's Fusion on-die GPUs for what AMD is capable of: they spend more die area on a more decoupled GPU and only win with their best incarnation.
With or without the lawsuit, Nvidia couldn't compete on power efficiency anymore with a separate chip that includes the memory controller hub, or only just barely. You wouldn't see any significantly better performance.

The 9400M from Nvidia was the first really decent integrated GPU, and it was hailed by users who, even today, don't do anything more demanding on OS X. It got smoked by the HD 3000, which many still complained about.
The 320M was the first integrated GPU that could beat some available dedicated GPUs, and it is only about as fast as the HD 3000. Yet everybody was proud of the 320M because it was oh so great.
The HD 4000 is faster, and a 620M would consume more power than it is worth.
Especially with Intel's ULV chips, it isn't worth using anything else anymore; that would mean a 15 W CPU plus a 12-18 W GPU. The integrated GPU simply rules here.

If you need more, get serious and go for a 650M, which really makes a huge difference.
People complain because they don't look at the facts; they only repeat the reputation picked up from forums, which takes years to change.
In fact, the video acceleration hardware in the Intel GPUs has been more than on par with AMD and Nvidia, at extremely low power consumption. Just going over the PCIe bus would have wasted more.
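The wattage argument above is easy to put in numbers. A rough sketch: using the 15 W ULV CPU figure from the post and a 15 W discrete GPU (mid-range of the quoted 12-18 W), with a 60 Wh battery as an assumed capacity, under the simplifying assumption that the package power is the whole system draw.

```python
# Back-of-the-envelope battery runtime. The 15 W CPU and 12-18 W GPU
# figures come from the discussion above; the 60 Wh battery capacity
# and "package power = total draw" are simplifying assumptions.

battery_wh = 60.0  # assumed battery capacity

def runtime_hours(watts):
    """Hours of runtime at a constant power draw."""
    return battery_wh / watts

integrated_only = runtime_hours(15.0)        # ULV CPU with on-die GPU
with_discrete = runtime_hours(15.0 + 15.0)   # add a mid-range discrete GPU

print(f"integrated GPU only: {integrated_only:.1f} h")  # 4.0 h
print(f"with discrete GPU:   {with_discrete:.1f} h")    # 2.0 h
```

Even with these crude assumptions, adding a discrete GPU of comparable wattage halves the runtime, which is the post's point about ULV machines.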
 
Good to hear that. I was really worried when people told me the MacBook Pro's graphics card was horrible as hell. It almost made me buy a different computer, since I'll be doing video editing.
 
My little Lenovo has HD 3000 graphics. It'll run anything I throw at it on the 1366x768 screen, it'll play 1080p video out the HDMI port, and the whole machine still only uses 18 W of power. Fantastic.
 
+1, very well written. It's crazy to think that in just two years, Intel's HD 4000 has almost matched the performance of the GT 330M in the 2010 MacBook Pro 15". When Haswell comes out, I wouldn't be surprised if it reaches GT 540M or 550M levels of performance.
 
Another thumbs up for the Intel HD series from me. I have a late 2011 MBP, and unless I'm doing serious 3D work on it, I usually set it to just use the HD 3000, as it's plenty good enough for Photoshop, Illustrator and surfing. Since the HD 4000 is even better, it should be sufficient for anyone unless they're into really demanding applications or cutting-edge gaming.
 
[Image: Ivy Bridge die layout, showing the GPU on the CPU die]



To me it's very impressive that a GPU located ON THE DIE of the CPU is capable of running Skyrim at 30+FPS.
[Image: Ivy Bridge gaming benchmarks]
 
I've personally never gotten to use the HD 3000 much in my 17-inch MBP, as opening more than 4-5 windows automatically switches to the discrete GPU; I guess it doesn't like the native resolution, as Spaces is pretty clunky on it too. Unreal Tournament 2004 gets about 30 FPS at 1280x800, but on the 6750M it gets roughly 7-8 times that at 1920x1200.

Like Dusk007 said, Intel has put a fair amount of effort into the HD series. I was using a MacBook with an X3100 for years, and just plugging in an external monitor made Exposé painful.
 
They're pretty bad. Dangerous, even. Best to get rid of it NOW.

Please send your Macs with intel HD graphics to:

Environmentally Friendly Intel HD Mac Disposal Service
c/o Clyde Anderson
6636.....:D
 
The Intel HD 4000 is approximately 60% faster overall than the previous integrated graphics king in MacBooks, the 320M by Nvidia.
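To put that "60% faster" claim in concrete terms, here is a tiny worked example. The 25 FPS baseline is a made-up illustrative number, not a measured 320M result; only the 1.6x factor comes from the post above.

```python
# What "60% faster" means in frame-rate terms. The speedup factor is
# from the claim above; the 25 FPS baseline is an assumed example value.

speedup = 1.60       # "approximately 60% faster"
fps_320m = 25.0      # hypothetical 320M frame rate in some game

fps_hd4000 = fps_320m * speedup
print(f"{fps_320m:.0f} FPS on the 320M -> {fps_hd4000:.0f} FPS on the HD 4000")
```

In other words, a game that was just below a playable 30 FPS on the 320M lands comfortably above it on the HD 4000.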
 
If someone claims that it is "bad," they most likely tried to run a modern game on it (with a model like the 13" MBP that doesn't have switchable graphics), saw poor performance, and then went on the internet to complain about it.

It's a fine GPU, within reason of what you're trying to do with it.
 
Sadly, I discovered that the Intel HD graphics are worse than expected.
https://forums.macrumors.com/threads/1434079/

It can't even handle a normal task on Apple's own Thunderbolt Display.

Intel has failed... again.

----------

I wouldn't say that, especially when it can't handle normal tasks on Apple's own Thunderbolt Display.
 