WHY are Intel's graphics chipsets so inferior to other manufacturers' graphics chipsets? Intel is very, very good at manufacturing a zillion different types of chips, from tiny component chips all the way up to main CPU processors. What's so different about graphics chips? Why is it so hard for Intel to create good graphics chips when they're good at making every other type of chip?

because they are designed for max battery life and low cost...that's all..
 
because they are designed for max battery life and low cost...that's all..

Okay well, assuming that's true, then why is that the only thing they're designed for? Clearly there is (and has been) a huge demand for integrated Intel graphics chips that deliver at least moderate performance rather than just absolutely longest battery life.
 
That's too bad ... I wonder if the failure rate of AMD/ATI graphics products compares ...

Which graphics are better for the Mac on the whole, anyway: ATI or nVidia? Or is the better question: which company inked the sweeter deal with Apple this year? Considering all of Apple's recent marketing hype embracing nVidia across the MacBook family, I'm not terribly surprised at their announcement. No one likes a pie in the face ... :rolleyes:

Let's put it this way: one time I couldn't get some really random kind of thermal grease off a Phenom heat spreader (not the heatsink; the heat spreader is the large metal plate directly on the CPU). Out of desperation, and probably stupidity, I removed the spreader from the CPU, put both the CPU and the spreader in spirits, let them soak for 2 hours, rinsed with alcohol and water, and let everything dry for 24 hours. The Phenom still works, and runs cooler now for some reason. Maybe because the spreader has Arctic Silver 5 instead of that crap that comes with it.
Whereas I've had Pentium 4s die because they've fallen off my workbench.

Haven't tried it with a GPU yet ;)
 
The major impediment to nVidia chipset adoption post-Nehalem really isn't reliability or licensing issues but the fact that mainstream Nehalems don't have QPI links, only a low-speed DMI link to a southbridge. Any IGP using the DMI link would by definition be sharing bandwidth, since any IGP for mainstream Nehalem would have to be integrated on the southbridge. DMI has the bandwidth of PCIe 1.0 x4, and the IGP would be fighting for bandwidth against the SATA hard drives, optical drives, FireWire, USB, gigabit ethernet, etc. Even quad-core Clarksfield, which doesn't include an Intel IGP, has no QPI links, only DMI, making a viable IGP hard to build for it.
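To put rough numbers on the DMI ceiling described above, here is a back-of-envelope sketch; the per-device figures are illustrative assumptions, not measurements:

```python
# Back-of-envelope sketch of the DMI bottleneck. DMI is roughly
# equivalent to PCIe 1.0 x4: 4 lanes x ~250 MB/s per lane per
# direction, shared by everything hanging off the southbridge.
DMI_BANDWIDTH_MBPS = 4 * 250  # ~1000 MB/s each direction

# Illustrative peak demands of other southbridge clients (assumed
# figures for the sketch, not measured values):
peripherals = {
    "SATA HDD (sustained)": 100,
    "Gigabit Ethernet": 125,
    "USB 2.0": 60,
    "FireWire 800": 100,
}
remaining_for_igp = DMI_BANDWIDTH_MBPS - sum(peripherals.values())
print(f"Bandwidth left for an IGP: ~{remaining_for_igp} MB/s")
```

With everything busy at once, an IGP behind DMI would be left with well under a gigabyte per second, a fraction of what even a shared-memory part like the 9400M can draw through its own memory interface.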

The alternative of putting a separate IGP on the PCIe links or giving it a small amount of dedicated VRAM may work well for desktop solutions but not for what Apple is targeting in notebooks. Specifically, Apple's major support for IGPs in notebooks is the reduction in PCB space by not having to accommodate a separate GPU chip and VRAM. If an IGP is attached to the PCIe links and has a bit of dedicated VRAM basically acting as a crippled discrete GPU then you might as well just put in a true low-end GPU since the board space wouldn't be much more and the performance would be a lot better.

In terms of Intel's IGP in Arrandale, there is some hope that it at least won't be a downgrade from the 9400M. The GMA X4500MHD is about half the speed of the 9400M. Intel never took full advantage of the die shrink from 90nm in the X3100 to 65nm in the X4500: they could have doubled the shader count but only increased it from 8 to 10. Arrandale's IGP will be further shrunk from 65nm to 45nm, so Intel could easily put in 16 shaders or more, which, coupled with higher clock speeds, could at least match the 9400M. I doubt Apple would go back to Intel IGPs unless they were competitive with existing solutions, although Apple did go from the Mobility Radeon 9550 in the last iBook to the GMA950, so it isn't unheard of. Admittedly, the problem with Intel IGPs hasn't been the raw theoretical hardware capability but poor driver support that takes a long time to exploit the hardware fully, if it ever does.
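The shader-count argument above can be sketched as simple arithmetic. Assuming, as a rough model only, that IGP throughput scales with shader count times clock speed, and using an assumed ~533 MHz clock for the X4500 and a hypothetical 667 MHz for a 45nm Arrandale part:

```python
# Rough scaling model: throughput ~ shaders * clock. This ignores
# memory bandwidth and driver quality (which, as noted above, is
# Intel's real weak spot), so treat the result as an upper bound.
def relative_throughput(shaders: int, clock_mhz: int) -> int:
    return shaders * clock_mhz

x4500 = relative_throughput(10, 533)      # GMA X4500: 10 shaders, assumed clock
arrandale = relative_throughput(16, 667)  # hypothetical 16-shader 45nm IGP
print(f"Hypothetical Arrandale IGP vs X4500: {arrandale / x4500:.1f}x")
```

Since the X4500MHD is about half the speed of the 9400M, a roughly 2x gain would put such a part in 9400M territory, at least on paper.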
 
Okay well, assuming that's true, then why is that the only thing they're designed for? Clearly there is (and has been) a huge demand for integrated Intel graphics chips that deliver at least moderate performance rather than just absolutely longest battery life.

GPUs are different beasts - Intel never had the technology to do high-end GPUs the way ATI and Nvidia do - and to make things even more difficult for Intel, a lot of GPU tech is patented by ATI and nVidia.

The reasons Intel chips are still in the market are: a) the majority of people do not care about graphics performance the way gamers or pros do - they just want something that doesn't look too shabby; b) for servers, people want something that does not get in the way or require a lot of power; and c) Intel almost bundles the GPU for free with their chipsets, which means lower manufacturing costs for Dell/HP/Lenovo etc. (they don't have to pay separately to ATI/Nvidia for the GPU - remember Intel heavily discounts their entire CPU/chipset/GPU package).
 
So what you're actually complaining about is the fact that you went out and purchased a laptop completely unsuitable for what you need to do - and it is Apple's fault because of it.

??? Confused on that one. I was just replying to your comment that the X3100 cards are just as good as the NV9400 cards. As I work making games on the Mac, I can attest they are not as good when it comes to high-end performance. As games are often a good test of how well a card can handle things, I thought I would pass comment that the X3100 is vastly inferior in performance.

You might not notice it when opening a new Finder window or an Excel spreadsheet, but run the iTunes visualizer or iMovie at a high resolution, or use an external screen, and you might start to notice.

If what you required was something with sufficient grunt and you're a developer, why didn't you go for the Pro range? MacBooks are consumer-level laptops - you're trying to use your BMW Mini as a bulldozer.

I agree; in fact we don't develop on MacBooks with X3100 cards (although that is more down to the CPU and HD speeds than the card). But as we sell games for the Macintosh, we have to try to support ALL the Macs we can, including X3100 and NV9400 based machines, so we have examples of all the cards in the office. For development we can use an 8-core Mac Pro, but you still have to support lower-spec machines, not just the fastest one money can buy.

As for the jump of assuming Intel - why? I have an iMac with an ATI GPU, why would it be strange for MacBooks to have an ATI graphics card?

Personally I really like the guys in the Mac department at ATI; they are very helpful and give us great support when we need help with driver issues etc. They have had their cards in the MacBook Pros with the X1600, the standard card in the Mac Pro with the 2600, and the iMac with the 2400. In fact the 3xxx and 4xxx series Mac cards are amazingly fast in the Mac Pro.

I did not assume Intel in any way; my comment was a reply to yours, which was in the context of comparing the current NV9400 solution to the Intel X4500, currently unused on the Mac.

I don't mind what card is placed in the Apple machines, NVIDIA, ATI or Intel. But as my job is based on games (that need as high a performance as possible) I would prefer a card that performs well so more people can play our games. :)

This at the moment favors the cards from ATI and NVIDIA, but Intel have Larrabee coming, and from the news it looks like this new card series will be competitive against the ATI and NVIDIA offerings.

The only point I was trying to make is that the Intel integrated cards right now are underpowered compared to the ATI and NVIDIA integrated solutions. Although this affects games first, it can also affect other applications like iTunes, iPhoto, iMovie and even the OS. This is because the OS is increasingly offloading tasks onto your graphics card (via Quartz Extreme, for example), and as such a faster card will give you a nicer experience.

If you don't play games it does not matter as much, but if you offered me a Mac with an X3100 or an NV9400, even if I did not play games I would choose the NV card, as it is four and a half times more powerful. (ATI don't have an integrated card available on the Mac, so that is not an option right now.)

Edwin
 
It doesn't matter which GPU they use as long as it's not a downgrade in performance.

Intel has been working on GPUs for a long time, so they might finally be able to come up with something as powerful as the GeForce 9400.

Anyway, there's always ATi if Intel cannot make a better graphics chip.
Hahahahahahahahahahahaha. Intel really needs to get Larrabee out. That is their only hope for competing with Nvidia/ATI.
 
I'll go on record to say that Larrabee will suck.

Intel can't make a GPU that doesn't suck. If Apple uses it, in 12 months, it will switch to something else because it sucks.

God, I hope I am wrong.
 
So don't just talk the talk, buy a Dell and see how it really is on the other side of the fence.
Err... I walked the PC walk for 20 years. In fact I'm typing on a Dell right now, since my iMac is dead (for the 2nd time) and my 2-week-old MBP 17" is currently running the extended Apple Hardware Test to see if I get any error codes -- it won't start in either OS X or Boot Camp/Win7 64-bit.

My last 5 machines were from Dell and I had no issues with any of them except one, which was a total piece of crap. It was not a normal Dell machine, however; it was an XPS700 gaming PC based on a chipset from... drumroll... NVidia. Some of their video cards are OK, but don't EVER let NVidia take control of other aspects of the machine like the chipset; it will only end in misery.
 
We just got a new 17.3" HP laptop in the store last week with the ATI 4650 1GB video card. We loaded Crysis on it to see how it would run. :eek: :D I hope the new MacBook Pro gets the 4650 soon. The HP is a quad-core 9000, 6GB RAM, 500GB 7,200rpm HD and HDMI out. I can get one for $1,200 with my employee discount. Has anybody ever hacked one of the high-end Pavilions?
 
Let's put it this way: one time I couldn't get some really random kind of thermal grease off a Phenom heat spreader (not the heatsink; the heat spreader is the large metal plate directly on the CPU). Out of desperation, and probably stupidity, I removed the spreader from the CPU, put both the CPU and the spreader in spirits, let them soak for 2 hours, rinsed with alcohol and water, and let everything dry for 24 hours. The Phenom still works, and runs cooler now for some reason. Maybe because the spreader has Arctic Silver 5 instead of that crap that comes with it.
Whereas I've had Pentium 4s die because they've fallen off my workbench.

This is not an accurate comparison of the quality of the chips. One was carefully taken apart and cleaned, while the other had to withstand the shock of being dropped. It's like saying that you carefully detailed your Geo Metro and it runs great, but you took a sledgehammer to a Mazda 6 and now it's not running, and from this it is obvious that the Geo Metro is of higher quality than the Mazda 6. It just doesn't follow.
 
What's this about Apple making their own GPUs??:rolleyes:

Interesting. I wonder how that would affect Boot Camp compatibility. I guess it is possible for them to come up with some decent GPU hardware. I wouldn't expect them to be able to outrun Nvidia or ATI, but they should be able to outdo Intel...
 
What's this about Apple making their own GPUs??:rolleyes:


I wouldn't read too much into that, given that they've upped their investment in Imagination (which makes the GPUs in the iPhone) - I don't see Apple entering the GPU market, simply because GPUs, along with CPUs, have become so complex that you need massive economies of scale to ever recoup just the investment alone.
 
We just got a new 17.3" HP laptop in the store last week with the ATI 4650 1GB video card. We loaded Crysis on it to see how it would run. :eek: :D I hope the new MacBook Pro gets the 4650 soon. The HP is a quad-core 9000, 6GB RAM, 500GB 7,200rpm HD and HDMI out. I can get one for $1,200 with my employee discount. Has anybody ever hacked one of the high-end Pavilions?


Hang on, you're comparing a desktop replacement with a laptop? Good lord; they aren't even the same beast. A desktop replacement is something that is minimally moved, whereas a MacBook Pro is designed to be used on the go. As I said, the two are completely different beasts.
 
Err... I walked the PC walk for 20 years. In fact I'm typing on a Dell right now, since my iMac is dead (for the 2nd time) and my 2-week-old MBP 17" is currently running the extended Apple Hardware Test to see if I get any error codes -- it won't start in either OS X or Boot Camp/Win7 64-bit.

My last 5 machines were from Dell and I had no issues with any of them except one, which was a total piece of crap. It was not a normal Dell machine, however; it was an XPS700 gaming PC based on a chipset from... drumroll... NVidia. Some of their video cards are OK, but don't EVER let NVidia take control of other aspects of the machine like the chipset; it will only end in misery.

I've actually found Dell really reliable; they aren't sexy, they're not fancy, but if you want to upgrade components or swap parts out, it's a walk in the park compared to the proprietary-ridden crap that HP puts out. I swear, when I was fixing up a customer's HP computer, I had never seen so much crap in a single box - custom and proprietary everything.

If I were told to purchase a computer and it could only be a PC, I'd settle for either a Dell or a Lenovo; probably Dell, given that I prefer to get my computers online so that I am not hassled by sales drones in stores.
 
What's this about Apple making their own GPUs??:rolleyes:

You know, this isn't too far-fetched, honestly. Apple's PPC beat the Intel chips back in the day, and we never thought we'd see Intel chips in a Mac. I don't think Apple made the change for any reason other than to get Windows on it for the people who, prior to Vista, wanted to use both OSes.

Rumors are that Apple is making their own chips for future iPhones and Touches because they don't want the chips they use sold to other phone manufacturers. Making their own GPUs would allow Apple to exceed what the other companies may build for the Mac, as well as shut down Hackintosh capability.

I just hope we don't go from good GPUs to OK GPUs. It's the shortcoming in Apple's lineup that everyone harps about. :(
 
I didn't read all posts, but...

Apple makes smart decisions, and is making one by dropping a company producing a historically defective chipset.

A no-brainer.

Remember, Intel's 32nm chips are within a 1.5-year reach.
Every time a die shrinks, speeds get faster and chips get more efficient: less heat, faster, smaller, less power needed, etc.

If Intel gets their GPU act together we could see some powerful new mobile devices (and desktops).

PS: Apple didn't work hard on Grand Central to abandon it ;)
Intel wants the business - ATI is still an option - Nvidia, well, IMO they have never represented stability.

Have faith :apple: knows how to play ball:)
 
Hang on, you're comparing a desktop replacement with a laptop? Good lord; they aren't even the same beast. A desktop replacement is something that is minimally moved, whereas a MacBook Pro is designed to be used on the go. As I said, the two are completely different beasts.


No, I'm not. I know they are two very different computers. I sell both of them to two very different types of customers.

I was just hoping that Apple would some day put an ATI 4000 series vid card in at least one Macbook Pro.
 
I wonder if this will have any bearing on NVIDIA graphics cards being available on new Mac Pros later on, like when they revise them in 2010. As it stands now, I may have to switch back to a PC, because Apple does not update their graphics cards often enough with good ones, and when they do it is not the top end - the 285 rather than the 295. Given they may drop NVIDIA for a while, I would not expect to see NVIDIA support Apple with new cards.

I work mostly with 3D apps like Modo, Maya, ZBrush, Vue, etc. and this is REALLY going to hurt me because I MUST have the best cards possible. In fact, NVIDIA is often the favored card for most software like that.

Keeping fingers crossed though, have to wait and see.
 
NVIDIA was a step forward - if it is to be replaced it will be hard, especially on the lower end.
 
I've actually found Dell really reliable; they aren't sexy, they're not fancy, but if you want to upgrade components, swap parts out - its a walk in the park compared to the proprietary ridden crap that HP puts out.
That's pretty much my feeling as well. I started using Dell machines about 10 years ago, and the perspective was very different back then... everyone else made beige junk that was horribly messy on the inside (once you had gone through the pain of removing 4-6 screws just to open a minitower), while Dell had started making nice black enclosures that were easy to open and service; everything was neat, color-coded and tool-less on the inside. In 2001 they were the Macs of the PC universe.

Time has caught up with Dell since then, and now there are plenty of brands with sexier designs, but if I were to buy a PC again, I'd definitely get a Dell (just not one with an NVidia chipset). Also, their support is phenomenal. I understand this isn't the case in the U.S., but here in Sweden we've got sort of a Bizarro-world situation... Dell's support is uber-professional; they bend over backwards to fix any problem on-site within 24 hrs, no matter if you live smack in the middle of Stockholm or in some remote mountain shack, while Apple's support is a sad joke... snooty, lazy, unhelpful amateurs.

NVIDIA was a step forward - if it is to be replaced it will be hard, especially on the lower end.
Performance wise it was a step forward. Quality wise it was two steps back.
 