ATI and NVIDIA tried to leverage the coupling of GPUs to memory controllers into dominance of the chipset business and into selling more GPUs.
But Intel is not "fast enough." I am no pro by any stretch, but something basic like BoinxTV requires a discrete graphics card just to run.
I'd prefer to see AMD graphics replace it rather than Intel graphics. But I hate NVIDIA.
I haven't had one NVIDIA GPU that hasn't gone bad. True story. My MacPro1,1, MacBookPro4,1, and MacBookPro6,2 all had bad GPUs which caused kernel panics and needed replacing.
Maybe you should tell them.
http://www.boinx.com/boinxtv/systemrequirements/
Listed are a 9400M (not a discrete graphics card) and an ATI X800X, and that discrete part isn't particularly faster than the HD4000.
Uh? Intel and AMD have been offering integrated graphics for the longest time.
BoinxTV requires a discrete ATI card, or an NVIDIA integrated chipset or discrete card. N.B., Intel's integrated GMA graphics are not supported. The trend of not supporting Intel's integrated graphics will continue, given their lackluster performance compared to discrete graphics chips.
ROFLMAO!!!
Pixelmator is the industry standard, dontchaknow?
They've even got a corporate licensing program for all the professionals and corporations who might not be able to afford the steep $59 buy-in price and have "outgrown the limited image editing capabilities of iPhoto, but don't need the full-blown approach of Photoshop, not to mention its steep learning curve..."
It's so exclusive (think McLaren, but for photo editing instead of cars) that I had to look it up just to be blessed with the knowledge of its existence - http://macs.about.com/od/applications/gr/pixelmator-review.htm
I'm gonna have to go back to school to become a real professional and learn Pixelmator.
I still remember the faulty NVIDIA GPUs from a few years ago. No thanks, even if they have the best GPU-switching system, "Optimus". AMD seems to be a much more stable partner.
I am against a dGPU in thin laptops like ultrabooks or the MBA, due to overheating problems in an aluminium case, which conducts heat very quickly. Current Sandy Bridge components with the integrated Intel GPU already cause serious cooling/noise problems.
There are rumours that the new MacBook Pros will have the same design as the MacBook Airs, so thermal issues will be even more noticeable. The new Intel HD4000 is more than enough for the MacBook Airs. Maybe an APU from AMD could be a great choice from a TDP perspective.
It is quite funny that Apple pretends that MacBook Pros are gaming machines. This is just marketing in my opinion. If you are a real gamer, instead of Macs and OS X you have Alienware, Clevo, the Asus Gxx, or similar products.
What are you talking about? The GMA graphics line dead-ended about 2-3 years ago (discounting the "retread" products where Intel slaps a new label, 'Celeron', on an old product line). The HD3000 and HD4000 are not GMA graphics.
The GMA 3xxx series (e.g., the 3600) are rebranded Imagination Technologies PowerVR parts. The same stuff in the iPad/iPhone that is sooooooo horrific at games that nooooooooobody wants to play them ... *cough*. The GMA stuff is relegated to the ATOM line-up at this point anyway.
Intel has demoed HD4000-powered chips running 4K TV output. For stuff like decoding/decompressing video, it is not that hard to add dedicated logic just for that task. What is missing is higher-end 3D throughput, which requires more general, flexible function units.
Care to point out the non-ATI-descended chipset with graphics and large market share that AMD was offering?
What the GPU is attached to is immaterial.
Who sells the most GPUs?
Who generates the most revenue selling GPUs?
Selling GPUs is a business. Those selling the most (not the fastest) have leverage.
ATI and NVIDIA tried to leverage the coupling of GPUs to memory controllers into dominance of the chipset business and into selling more GPUs. Intel and AMD absorbed the memory controller into the CPU die (for additional reasons that have to do with decreasing latency and increasing integration) and took away that leverage (a happy side effect, for them).
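Tangent, but the latency point is easy to see for yourself. Below is a minimal pointer-chasing sketch in C that measures the average cost of a dependent memory load; the 64 MiB buffer size and the iteration count are arbitrary assumptions on my part. On a machine with an on-die memory controller the number it prints should come out noticeably lower than on the old northbridge designs.

/* Minimal pointer-chasing latency sketch. Buffer size and iteration
   count are arbitrary; compile with: cc -O2 chase.c */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define N     (64u * 1024 * 1024 / sizeof(size_t)) /* ~64 MiB, bigger than cache */
#define ITERS (16u * 1024 * 1024)                  /* number of dependent loads */

int main(void) {
    size_t *next = malloc(N * sizeof *next);
    if (!next) return 1;

    /* Sattolo's shuffle: builds one big cycle over all N slots, so the
       chase visits every element in a cache- and prefetcher-hostile order. */
    for (size_t i = 0; i < N; i++) next[i] = i;
    for (size_t i = N - 1; i > 0; i--) {
        size_t j = (size_t)rand() % i;
        size_t t = next[i]; next[i] = next[j]; next[j] = t;
    }

    struct timespec t0, t1;
    clock_gettime(CLOCK_MONOTONIC, &t0);
    size_t p = 0;
    for (size_t i = 0; i < ITERS; i++)
        p = next[p];                 /* each load depends on the previous one */
    clock_gettime(CLOCK_MONOTONIC, &t1);

    double ns = (t1.tv_sec - t0.tv_sec) * 1e9 + (t1.tv_nsec - t0.tv_nsec);
    printf("~%.1f ns per dependent load (sink: %zu)\n", ns / ITERS, p);
    free(next);
    return 0;
}

Printing the final index keeps the compiler from optimizing the whole loop away.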
In the distant past, the classic PCIe-slot graphics card was the key to the graphics market. That era is over. Embedded graphics, whether integrated into the chipset/CPU package or placed directly onto the motherboard, is the dominant market now.
Sorry, but a modern Intel IGP is absolutely decent (for an IGP). The IB HD4000 is sufficient for playing modern games at low resolution (read: 13" MB) with low-medium settings. Moreover, the HD4000 is faster than some so-called dedicated cards out there. Intel has come a long way from the GMA graphics, which were indeed totally and utterly horrible.
To call an IGP that can run Skyrim at over 34 fps at 1680x1050 "crappy" is, IMHO, a bit stupid. That is only 20-30% less performance than the 6750M found in the current MacBook Pros and iMacs. Source: http://www.engadget.com/2012/03/07/fresh-ivy-bridge-benchmarks/ (the ATI 5570 is more or less comparable with the 6750M)
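For anyone who wants to sanity-check that "20-30% less" claim against the 34 fps figure, here is a back-of-envelope C snippet; the implied 6750M numbers are pure arithmetic, not measurements.

/* Back-of-envelope check: what 6750M fps would make the HD4000's
   34 fps a 20-30% deficit? Purely illustrative arithmetic. */
#include <stdio.h>

int main(void) {
    const double hd4000 = 34.0;            /* Skyrim @ 1680x1050, per the link */
    const double gaps[] = { 0.20, 0.30 };  /* claimed performance deficit */
    for (int i = 0; i < 2; i++)
        printf("%.0f%% gap implies a 6750M at ~%.1f fps\n",
               gaps[i] * 100.0, hd4000 / (1.0 - gaps[i]));
    return 0;
}

That works out to an implied 6750M somewhere around 42-49 fps, which is in the right ballpark for that chip.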
Isn't the AMD 7xxx series supposed to be faster than the NVIDIA 7xx series?
I don't understand why Apple flip-flops between GPU manufacturers, even when the other brand has superior GPUs. When Apple put the 330M in their notebooks, ATI's 4xxx and 5xxx series were blowing away NVIDIA's midrange cards. When Apple put the ATI X1600 in, NVIDIA's 7xxx series was blowing ATI's GPUs out of the water.
Can someone explain this to me?
Yup!
I dunno. Maybe something to do with yield/availability/cost... I'm sure someone knows. They do have to keep their prices pretty stable, and obviously they don't like shortages... but personally I've always preferred ATI.
If you don't compare it to the other IGPs out there, sure, maybe. The problem is modern Intel IGPs are the same story as older Intel IGPs: 1 or 2 generations behind the competition.
Intel just sucks at graphics. Always have, always will it seems.
It has come a long way, and the HD4000 surely is a big step. But it's still a mediocre solution, just like everything integrated: audio chips, RAID cards, and so on.
Also, you are comparing an unreleased product against screen resolutions that will be outdated in the next generation or the one after. Look what happened with the iPad, and what's happening with phones. Huge resolutions are coming to laptops as well.
Maybe, but in this example there is barely an improvement over the 3000.
That may be true, but the point of that post was that today's Intel graphics offerings are decent. The StarCraft 2 chart confirms it: StarCraft 2 at medium settings with ~40 FPS is absolutely playable, and better than many mid-range discrete offerings from 2 years ago.
No, it did not (if you are using "dedicated" to imply discrete).
http://support.apple.com/kb/SP541
There was a 9400M, which is an IGP. The memory for the GPU was system RAM, not VRAM, and the memory controller feeding the CPU was the same one feeding the GPU.
If that counts as a "fast enough" IGP, then perhaps. Intel's IGPs have always traded performance for lower power draw in ways that NVIDIA's didn't. Now that the process technology has "caught up" (22nm), Intel can afford to add performance without making a relatively large power trade-off.
Technically, dedicated (meaning "tasked for that purpose") covers any of these systems where there is just one inside the box. There is no "other" GPU that could be doing the work.