Some interesting news... I guess time will tell what it will be.

But I smell a refresh coming soon...
 
I'd prefer to see AMD graphics replace it rather than Intel graphics. But I hate NVIDIA.

I haven't had one NVIDIA GPU that hasn't gone bad. True story. My MacPro1,1, MacBookPro4,1, and MacBookPro6,2 all had bad GPUs which caused kernel panics and needed replacing.

On the flip side, Nvidia is the only GPU maker I feel confident using across all of my systems, whether laptops, custom desktops, etc. That said, I don't use Macs enough to know whether there's an Nvidia/Mac issue there. Your experience versus mine suggests there's definitely something, though, since the common denominator is "Mac."
 
Maybe you should tell them.

http://www.boinx.com/boinxtv/systemrequirements/

Listed are a 9400M (not a discrete graphics card) and an ATI X800X, which, while discrete, isn't particularly faster than the HD4000.

Actually, from my "Read Me" file for BoinxTV:

"BoinxTV requires either an Intel–based Mac with discrete graphics from ATI or NVIDIA and a minimum of 2 GB of RAM. BoinxTV runs on all current notebook and desktop Macs (including the MacBook with NVIDIA GeForce 9400M) with the exception of the Macs with integrated Intel GMA graphics (the earliest Mac mini, and the white or black MacBook models)."


So the way they worded it threw me for a loop. BoinxTV requires a discrete ATI card or an NVidia integrated chipset or discrete card. N.B., Intel integrated GMA graphics are not supported. The trend of not supporting Intel's integrated graphics will continue due to their lackluster performance compared to discrete graphics chips.
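For anyone curious whether their own machine clears that bar, the GPU name is easy to read out of system_profiler on OS X. The snippet below is just an unofficial sketch, not anything Boinx-sanctioned; the helper name and the crude "GMA" substring test are mine, simply mirroring the Read Me wording.

import subprocess

# Unofficial sketch: list GPU chipset names via OS X's system_profiler and
# flag anything that looks like the unsupported Intel GMA parts.
def installed_gpus():
    out = subprocess.check_output(
        ["system_profiler", "SPDisplaysDataType"]
    ).decode("utf-8")
    return [line.split(":", 1)[1].strip()
            for line in out.splitlines()
            if "Chipset Model" in line]

for gpu in installed_gpus():
    # The "GMA" check is only a stand-in for the Read Me's wording.
    verdict = "unsupported (Intel GMA)" if "GMA" in gpu.upper() else "looks OK"
    print(gpu, "->", verdict)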
 
Uh? Intel and AMD have been offering integrated graphics for the longest time.

Care to point out the non-ATI-descendant chipset with integrated graphics that AMD was offering and that had a large market share? AMD, being a smaller player, tended to let third parties roll out the chipsets (at first vendors like VIA and SiS, and later ATI and Nvidia). The whole "Fusion" concept came after the ATI acquisition; getting graphics in-house so they could eventually be merged onto the CPU die was one of the driving factors for that acquisition. There were AMD-CPU-powered solutions on the market with integrated graphics, but hardly any purely AMD (CPU + GPU + chipset) products.


As for Intel, it was very similar. Intel certainly started down the "home grown" integrated graphics path earlier, but not particularly seriously or quickly. Until the level of integration got to the point where it was clear they needed to "wipe out" the third-party chipset vendors, it was a slow growth path. Nvidia and ATI moved to "wipe out" those vendors before Intel did.
 
I still remember the faulty NVIDIA GPUs from a few years ago. No thanks, even if they have the best GPU switching system, "Optimus". AMD seems to be a much more stable partner.

I am against a dGPU in thin laptops like ultrabooks or the MBA due to overheating problems in an aluminium case, which conducts heat very quickly. Current components based on the Sandy Bridge platform with an integrated Intel GPU still generate serious cooling/noise problems, and it remains a big challenge for engineers.

There are rumours that the new MacBook Pros will have the same design as the MacBook Airs. Thermal issues will be much more noticeable. The new Intel HD4000 is more than enough for the MacBook Airs. Maybe an APU from AMD could be a great choice from a TDP perspective.

It is quite funny that Apple pretends that MacBook Pros are gaming machines. This is just marketing in my opinion. If you are a real gamer you have a Windows machine like an Alienware, Clevo, Asus Gxx or similar product.
 
BoinxTV requires a discrete ATI card or an NVidia integrated chipset or discrete card. N.B., Intel integrated GMA graphics are not supported. The trend of not supporting Intel's integrated graphics will continue due to their lackluster performance compared to discrete graphics chips.

What are you talking about? The GMA graphics line dead-ended about 2-3 years ago (discounting the "retread" products where Intel slaps a new label, 'Celeron', on an old product line). The HD3000 and HD4000 are not GMA graphics.

The GMA 3xxx (e.g., 3600) series are rebranded Imagination Technologies PowerVR stuff. Same stuff in the iPad/iPhone that is sooooooo horrific at games that nooooooooobody wants to play them... *cough*. The GMA stuff is relegated to the Atom line-up at this point anyway.

Intel has demoed HD4000-powered chips driving 4K TV output. For stuff like decoding/decompressing video it is not that hard to add dedicated logic just for that task. What is missing is higher-end 3D throughput, which requires more general, flexible function units.
 
ROFLMAO!!!

Pixelmator is the industry standard, dontchaknow? :p

They've even got a corporate licensing program for all the professionals and corporations who might not be able to afford the steep $59 buy-in price and have "outgrown the limited image editing capabilities of iPhoto, but don't need the full-blown approach of Photoshop, not to mention its steep learning curve..."

It's so exclusive (think McLaren, but for photo editing instead of cars) that I had to look it up just to be blessed with the knowledge of its existence - http://macs.about.com/od/applications/gr/pixelmator-review.htm

I'm gonna have to go back to school to become a real professional and learn Pixelmator.


A real professional could likely work circles around you with Pixelmator. It's the skillz, not the toolz. Maybe you should go back to school after all... both of ya.
 
I still remember the faulty NVIDIA GPUs from a few years ago. No thanks, even if they have the best GPU switching system, "Optimus". AMD seems to be a much more stable partner.

I am against a dGPU in thin laptops like ultrabooks or the MBA due to overheating problems in an aluminium case, which conducts heat very quickly. Current components based on the Sandy Bridge platform with an integrated Intel GPU still generate serious cooling/noise problems.

There are rumours that the new MacBook Pros will have the same design as the MacBook Airs. Thermal issues will be much more noticeable. The new Intel HD4000 is more than enough for the MacBook Airs. Maybe an APU from AMD could be a great choice from a TDP perspective.

It is quite funny that Apple pretends that MacBook Pros are gaming machines. This is just marketing in my opinion. If you are a real gamer you don't buy Macs and OS X; you have an Alienware, Clevo, Asus Gxx or similar product.

I second that - Ivy Bridge + HD 4000 GPU + OpenCL support will easily cater for the midrange. The only thing I'd really love to see is some more TLC spent on improving the OpenGL and driver optimisations, because I'm sure there is more performance that can be squeezed out of the GPU than what is being delivered right now.

You're right about nVidia - from the faulty GPUs to the horribly unreliable switching between the Intel GPU and the nVidia one, compared to the current crop of MacBook Pros switching between the Intel GPU and the AMD/ATI one.

IMHO, if I were Apple I'd keep away from nVidia for the foreseeable future - both ATI/AMD and Intel are on the right track, and god knows nVidia needs a good punishing after failing to provide a real long-term solution to all those customers who were burnt by their shipment of crappy GPUs a few years back.
 
What are you talking about? The GMA graphics line dead-ended about 2-3 years ago (discounting the "retread" products where Intel slaps a new label, 'Celeron', on an old product line). The HD3000 and HD4000 are not GMA graphics.

The GMA 3xxx (e.g., 3600) series are rebranded Imagination Technologies PowerVR stuff. Same stuff in the iPad/iPhone that is sooooooo horrific at games that nooooooooobody wants to play them... *cough*. The GMA stuff is relegated to the Atom line-up at this point anyway.

Intel has demoed HD4000-powered chips driving 4K TV output. For stuff like decoding/decompressing video it is not that hard to add dedicated logic just for that task. What is missing is higher-end 3D throughput, which requires more general, flexible function units.

All I was stating were the requirements as stipulated in the "Read Me" file. I concede that I read the "Read Me" file in a way that made me think BoinxTV required dedicated graphics, when in fact the program does allow for the NVidia 9400M.

I realize the GMA chips are essentially EOL'd, but my point is that Intel HD graphics will not have the longevity of, or perform as well as, dedicated graphics chips. I assume the Intel HD graphics will work with BoinxTV, but I do not know this for a fact, as I have no experience with BoinxTV running on the latest integrated chips from Intel.

[Edit: Yes, I realize there is a difference between GMA and HD integrated graphics chips.]
 
What the GPU is attached to is immaterial.

Who sells the most GPUs?
Who generates the most revenue selling GPUs?


Selling GPUs is a business. Those selling the most (not the fastest) have leverage.

ATI and NVidia tried to leverage coupling GPUs to memory controllers into dominating the chipset business and selling more GPUs. Intel and AMD absorbed the memory controllers into the CPU die (for additional reasons that have to do with decreasing latency and increasing integration) and took away that leverage (a happy side-effect, for them).

In the distant past, the classic PCI-e slot graphics card was key to the graphics market. That era is over. Embedded graphics, whether integrated into the chipset/CPU package or soldered directly onto the motherboard, is the dominant market now.

GPUs that are inadequate for any kind of serious use. Volume will always be king, but Intel doesn't provide anything other than basic functionality and a little consumer eye candy.

If that weren't the case, we wouldn't even have this article. But we do, because Intel is currently a weakling in the graphics business, judged product by product. And graphics processors are needed that are an order of magnitude more powerful than what Intel has to offer.
 
Sorry, but modern Intel IGP is absolutely decent (for an IGP). The IB HD4000 is sufficient for playing modern games at low resolution (read: 13" MB) with low-to-medium settings. Moreover, the HD4000 is faster than some so-called dedicated cards out there. Intel has come a long way from GMA graphics, which were indeed totally and utterly horrible.

To call an IGP that can run Skyrim at over 34 fps at 1680x1050 "crappy" is, IMHO, a bit stupid. That is only 20-30% less performance than the 6750M, which can be found in the current MacBook Pros and iMacs. Source: http://www.engadget.com/2012/03/07/fresh-ivy-bridge-benchmarks/ (the ATI 5570 is more or less comparable with the 6750M)
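Just to sanity-check what that claim implies - illustrative arithmetic only; the only measured number here is the quoted 34 fps, the 6750M figures are back-calculated, not benchmarked:

# Illustrative arithmetic only: if the HD 4000 manages ~34 fps and that is
# really 20-30% below a 6750M, the 6750M would land somewhere around 42-49 fps.
hd4000_fps = 34.0                  # figure quoted above
for deficit in (0.20, 0.30):       # "20-30% less performance"
    print("%.0f%% deficit -> 6750M approx %.0f fps"
          % (deficit * 100, hd4000_fps / (1 - deficit)))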
 
Sorry, but modern Intel IGP is absolutely decent (for an IGP).

If you don't compare it to other IGPs out there, sure, maybe. The problem is modern Intel IGPs are the same thing as older Intel IGPs, 1 or 2 generations behind the competition.

Intel just sucks at graphics. Always have, always will it seems.
 
Sorry, but modern Intel IGP is absolutely decent (for an IGP). The IB HD4000 is sufficient for playing modern games at low resolution (read: 13" MB) with low-to-medium settings. Moreover, the HD4000 is faster than some so-called dedicated cards out there. Intel has come a long way from GMA graphics, which were indeed totally and utterly horrible.

To call an IGP that can run Skyrim at over 34 fps at 1680x1050 "crappy" is, IMHO, a bit stupid. That is only 20-30% less performance than the 6750M, which can be found in the current MacBook Pros and iMacs. Source: http://www.engadget.com/2012/03/07/fresh-ivy-bridge-benchmarks/ (the ATI 5570 is more or less comparable with the 6750M)

Maybe, but in this example there is barely an improvement over the HD 3000.
[attached benchmark chart: 44731.png]
 
Isn't AMD 7xxx series supposed to be faster than the nVidia 7xx series?

I don't understand why Apple flip-flops between GPU manufacturers, even when the other brand has superior GPUs. When Apple put the 330M in their notebooks, the 4xxx and 5xxx series were blowing away nVidia's midrange cards. When Apple put the ATI X1600 in, nVidia's 7xxx series was blowing ATI's GPUs out of the water.

Can someone explain this to me?

Yup!

I dunno. Maybe something to do with yield/availability/cost... I'm sure someone knows. They do have to keep their prices pretty stable, and obviously they don't like shortages... but personally I've always preferred ATI.
 
Yup!

I dunno. Maybe something to do with yield/availability/cost... I'm sure someone knows. They do have to keep their prices pretty stable, and obviously they don't like shortages... but personally I've always preferred ATI.

I remember that generation because the X1600 in my Asus was clocked higher than the MBP version, and 256MB was standard as opposed to being yet another necessary upgrade. FWIW my Asus at the time was literally a desktop replacement in a 14" body, so it's understandable that Apple underclocked it in the Pro.
 
If you don't compare it to other IGPs out there, sure, maybe. The problem is modern Intel IGPs are the same thing as older Intel IGPs, 1 or 2 generations behind the competition.

Intel just sucks at graphics. Always have, always will it seems.

Well, yes, but you can't slap an IGP from the competition on the same board with an Intel CPU, because the only competitor is AMD/ATI (and ATI has been doing graphics for far longer than Intel). The Intel IGP is slower, but it's still fast enough for most tasks you would expect to perform on a 13" laptop, gaming included.
 
Sorry, but modern Intel IGP is absolutely decent (for an IGP). The IB HD4000 is sufficient for playing modern games at low resolution (read: 13" MB) with low-to-medium settings. Moreover, the HD4000 is faster than some so-called dedicated cards out there. Intel has come a long way from GMA graphics, which were indeed totally and utterly horrible.

To call an IGP that can run Skyrim at over 34 fps at 1680x1050 "crappy" is, IMHO, a bit stupid. That is only 20-30% less performance than the 6750M, which can be found in the current MacBook Pros and iMacs. Source: http://www.engadget.com/2012/03/07/fresh-ivy-bridge-benchmarks/ (the ATI 5570 is more or less comparable with the 6750M)

It has come a long way, and the HD4000 surely is a big step. But it's still a mediocre solution, just as with everything integrated: audio chips, RAID cards and so on.

Also, you are comparing an unreleased product at screen resolutions that are on their way out in the next generation or the one after. Look what happened with the iPad and what's happening with phones. Huge resolutions are coming to laptops as well.
 
It has come a long way, and the HD4000 surely is a big step. But it's still a mediocre solution, just as with everything integrated: audio chips, RAID cards and so on.

Also, you are comparing an unreleased product at screen resolutions that are on their way out in the next generation or the one after. Look what happened with the iPad and what's happening with phones. Huge resolutions are coming to laptops as well.

Don't get me wrong. I am only saying that HD4000 is adequate for the 13" line as it currently stands (with its resolutions). All >15" absolutely require a dedicated GPU.

I would rage just as much as anyone here if Apple indeed releases the 15" model without a dedicated GPU ;) That would be a setback compared to what they already have on the market.

Anyway, don't trust Charlie.

P.S.

Huge resolutions are coming to laptops as well.

It's still a long way off. Currently, there is no indication of huge resolutions on laptops in the near future (and there is no mobile GPU powerful enough to drive 3D applications at ultra-high resolutions). Not that it is needed anyway. The iPad is a different thing; I own the first-gen iPad and am often distracted by pixelated graphics and text. This never happens with a laptop/desktop. The screen size and viewing distance are just too different. One could say that the average Apple laptop/desktop already has a Retina display, because we are not able to distinguish individual pixels at an average viewing distance.
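To put rough numbers on that viewing-distance point, here is a minimal back-of-the-envelope sketch. The panel specs are the common ones of the day (13" 1280x800 laptop, 9.7" 1024x768 first-gen iPad), but the viewing distances are just assumed typical values, not measurements:

import math

# Rough pixels-per-degree at the centre of a screen, from diagonal size (inches),
# native resolution and viewing distance (inches). Higher = less visible pixels.
def pixels_per_degree(diagonal_in, res_w, res_h, distance_in):
    ppi = math.hypot(res_w, res_h) / diagonal_in                   # pixels per inch
    pixel_angle = 2 * math.atan((1.0 / ppi) / (2 * distance_in))   # radians per pixel
    return 1.0 / math.degrees(pixel_angle)

# Assumed typical viewing distances: ~24" for a laptop, ~15" for a hand-held iPad.
print("13\" 1280x800 laptop @ 24\":", round(pixels_per_degree(13.3, 1280, 800, 24), 1), "px/deg")
print("9.7\" 1024x768 iPad 1 @ 15\":", round(pixels_per_degree(9.7, 1024, 768, 15), 1), "px/deg")
# ~60 px/deg is a common rule of thumb for 20/20 vision; the laptop comes out
# well ahead of the first-gen iPad at these distances, which is the point above.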
 
It has come a long way, and the HD4000 surely is a big step. But it's still a mediocre solution, just as with everything integrated: audio chips, RAID cards and so on.

Also, you are comparing an unreleased product at screen resolutions that are on their way out in the next generation or the one after. Look what happened with the iPad and what's happening with phones. Huge resolutions are coming to laptops as well.

Hey now, the nForce APU (back in the nForce2 days) was pretty awesome...
 
Maybe, but in this example there is barely an improvement over the HD 3000.

That may be true, but the point of that post was to show that today's Intel graphics offerings are decent. The StarCraft 2 chart confirms that. StarCraft 2 on medium settings at ~40 FPS is absolutely playable and is better than many mid-range discrete offerings from 2 years ago.
 
That may be true, but the point of that post was to show that today's Intel graphics offerings are decent. The StarCraft 2 chart confirms that. StarCraft 2 on medium settings at ~40 FPS is absolutely playable and is better than many mid-range discrete offerings from 2 years ago.

Meh, my mid-range desktop from 2008 with its GTX 260 and Q9550 still destroys it: silky smooth at 1080p on ultra with everything on. Yes, I realize I'm comparing a desktop to mobile, but we're talking about 4 years, which is an eternity in the computer world. It wouldn't be a problem if it weren't for the fact that these integrated solutions will soon be replacing discrete cards. If that's the future, then the integrated solutions are insufficient as they stand.
 
No, it did not (if by "dedicated" you are trying to imply discrete).

http://support.apple.com/kb/SP541

There was a 9400M, which is an IGP. The memory for the GPU was system RAM, not VRAM. The memory controller feeding the CPU was the same one feeding the GPU.

If that counts as a "fast enough" IGP, then perhaps. Intel IGPs have always traded performance for lower power draw in a way that Nvidia's didn't. Now that the process technology has "caught up" (22nm), Intel can afford to put performance in without making a relatively large power trade-off.


Technically, "dedicated" (meaning "tasked for that purpose") covers any of these systems where there is just one GPU inside the box. There is no "other" GPU that could be doing the work.

Aha! I thought it was "dedicated" when plugged in, hence the "discrete" label.

But the point still stands! It is possible to get a proper dedicated GPU in a current MBP and still get enough run time, especially if the optical drive is gone.
 