Meh, nothing we say here is going to change anything. Only Apple, Intel, AMD and nVidia know about the politics of the whole thing. I for one think ATi, while they don't offer the strongest graphics cards, beat nVidia hands down on bang for buck, and their naming scheme is saner too. On top of that, Intel are upping their game on the integrated graphics front.
 
If the dispute between Intel and Nvidia doesn't get resolved (which it probably will), Macs will have to use Intel chipsets. I find it highly unlikely that Apple would go and contract AMD to make an alternative chipset to the Intel one. With the Intel chipset they'd probably be stuck with an Intel IGP.

For the discrete GPU they could go either way, Nvidia or ATI. I'm guessing Apple will stay with Nvidia to unify their platform and cut development costs. A couple of weeks ago Nvidia announced a bunch of new mobile GPUs which consume quite a bit less power than their ATI counterparts.
 
As annoying as it is to go back to Intel, Apple has been riding the nVidia hype train to peddle iGPUs in "Pro"-designated machines as some supposedly inflation-busting, price-lowering tactic that really only raises the bar of entry for a decent machine, at a time when people are considering upgrading less often.

If Intel's iGPUs returning to the low end is the price we have to pay for dGPUs in the MBPs and iMacs across the board, like it used to be, then I'm all for it.

But knowing Apple, they will simply keep the iGPU-only models; their snowballing sales figures will suggest to them that the market has accepted the ruse and that nobody really needs a dedicated graphics unit anyway, even if they're spending £1500 on a computer.
 
As annoying as it is to go back to Intel, Apple has been riding the nVidia hype train to peddle iGPUs in "Pro"-designated machines as some supposedly inflation-busting, price-lowering tactic that really only raises the bar of entry for a decent machine, at a time when people are considering upgrading less often.

If Intel's iGPUs returning to the low end is the price we have to pay for dGPUs in the MBPs and iMacs across the board, like it used to be, then I'm all for it.

But knowing Apple, they will simply keep the iGPU-only models; their snowballing sales figures will suggest to them that the market has accepted the ruse and that nobody really needs a dedicated graphics unit anyway, even if they're spending £1500 on a computer.

Yes... DO IT. Bring back QuadroFX.
The only good, stable piece of technology nVidia are capable of.

If this works out, then it gives Apple a good slap in the face for such a stupid thing. I like Mac OS X, but this stupidity is what is keeping me on this eMac.

Why can't they put the GTX 285 and Quadro in by default instead of making you buy and install them separately? Oh wait, they don't have ****** Mini DisplayPort. :/
 
But for NVIDIA (or any other manufacturer) not to be able to buy a license to manufacture/produce a Nehalem-compatible chipset would put Intel squarely in the crosshairs of an anti-trust lawsuit.

Nvidia can buy a license to produce Nehalem chipsets. They are not going to do that, though (yet), as they believe their current contract with Intel already allows them to produce Nehalem chipsets.

The dispute is quite simple, really: Intel wants more $$$ from Nvidia, but Nvidia doesn't want to give it to them that easily.
 
Good riddance to bad rubbish. As long as we don't get the X4500 in the 13" (probably won't happen since it's now a Pro).
 
Wait...
A GT 120 is just a 9500 GT, which is an 8500 GT,
and the GT 130 is a 9600 GT, which is an 8600 GT... :eek:

Nvidia ran out of numbers for the GPUs. Instead of going to 10,000 they decided to start over. In order to show where their older GPUs compare to the newer ones, they renamed them according to the new scale. Hence why they went to 2xx instead of 1xx with the new GPUs. Anything in the 2xx series will be a new chip; below that is older.
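
To make the renaming concrete, here's a minimal sketch of that rebrand chain as a lookup table. The pairings are just the ones claimed in this thread (GT 120 = 9500 GT = 8500 GT, GT 130 = 9600 GT = 8600 GT), not an official nVidia list:

```python
# Rebrand chains as claimed in this thread -- illustrative only,
# not an official nVidia list.
REBRANDS = {
    "GeForce GT 120": "GeForce 9500 GT",
    "GeForce 9500 GT": "GeForce 8500 GT",
    "GeForce GT 130": "GeForce 9600 GT",
    "GeForce 9600 GT": "GeForce 8600 GT",
}

def underlying_chip(name: str) -> str:
    """Follow the rename chain back to the oldest known part."""
    while name in REBRANDS:
        name = REBRANDS[name]
    return name

print(underlying_chip("GeForce GT 120"))  # GeForce 8500 GT
```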
 
The difference between an X3100 and an NV 9400 is like night and day! My day job is in computer gaming, and the new NV 9400-based machines can play newer Mac games in development at decent frame rates.

In many cases the X3100 needs a special low-end rendering path written for it just to run at all.

On the PC, the 3DMark05 scores for the three chips are below:

NV 9400 - 3692 (4.6x faster)
X4500 - 1332 (1.6x faster)
X3100 - 797 (baseline)

The newer X4500 is a lot better than the X3100, but that does not mean it is in the same league as the 9400. (All of these chips use shared system memory.)
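
(For what it's worth, those "x faster" figures are just the score ratios; a quick sketch of the arithmetic, using the scores as posted:)

```python
# 3DMark05 scores as posted above; speedup is relative to the X3100.
scores = {"NV 9400": 3692, "X4500": 1332, "X3100": 797}

baseline = scores["X3100"]
for chip, score in scores.items():
    print(f"{chip}: {score} ({score / baseline:.1f}x the X3100)")

# Note: 1332 / 797 is about 1.67, so the 1.6x quoted above looks
# truncated rather than rounded.
```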

Edwin

So what you're actually complaining about is the fact that you went out and purchased a laptop completely unsuitable for what you need to do - and somehow that's Apple's fault. If what you required was something with sufficient grunt and you're a developer, why didn't you go for the Pro range? MacBooks are consumer-level laptops - you're trying to use your BMW Mini as a bulldozer.

As for the leap to assuming Intel - why? I have an iMac with an ATI GPU; why would it be strange for MacBooks to have an ATI graphics card? As for Intel's chipsets, why not? Apple doesn't use the Intel wireless or ethernet controllers, and yet Intel restrain themselves from throwing a wobbly.
 
Nvidia ran out of numbers for the GPUs. Instead of going to 10,000 they decided to start over. In order to show where their older GPUs compare to the newer ones, they renamed them according to the new scale. Hence why they went to 2xx instead of 1xx with the new GPUs. Anything in the 2xx series will be a new chip; below that is older.

I know that. But the technology being sold in a "Pro" version is actually entry-level!!! That was my point. nVidia has a recent history of just renaming things.
 
It doesn't make sense for Apple to drop nVidia now for a few reasons:

1. All Macs come standard with an nVidia GPU and many of them only have nVidia GPU options.
Hence the switch. Besides, this just refers to the chipset; Apple can still use Nvidia GPUs.

2. OpenCL mostly supports nVidia GPUs.
OpenCL supports the current Radeon cards in the iMac and Mac Pro just fine; the API is vendor-neutral (see the sketch after this list).

3. ATi is an AMD brand and Apple uses Intel chips.
Doesn't make a bit of difference. ATI chips run on Intel chipsets (whilst providing more power per cost, might I add). That statement is exactly the same as Nvidia making their own chipsets yet their cards running on Intel chipsets. Both AMD and Nvidia would be screwed if they only made GPUs for their own chipsets.

4. Intel integrated GPUs (if you can call them that) suck, though Apple has switched to them before.
No one said they were going to use them. An Intel X4500 in a "pro" laptop would be ridiculous even by Apple's standards. I could see an X4500 (or whatever IGP Nehalem uses) in a Mini, though.
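
On the OpenCL point in item 2: the API is vendor-neutral, so you can just ask the runtime what's available. A minimal sketch using the third-party pyopencl bindings (assumed installed; the equivalent C calls are clGetPlatformIDs and clGetDeviceIDs):

```python
# List every OpenCL platform/device the system exposes, with vendor.
# Requires pyopencl (pip install pyopencl).
import pyopencl as cl

for platform in cl.get_platforms():
    for device in platform.get_devices():
        # On a Radeon-equipped Mac this reports AMD/ATI; on a GeForce
        # machine it reports NVIDIA. OpenCL itself doesn't care.
        print(f"{platform.name}: {device.name} ({device.vendor})")
```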
 
Nvidia ran out of numbers for the GPUs. Instead of going to 10,000 they decided to start over. In order to show where their older GPUs compare to the newer ones, they renamed them according to the new scale. Hence why they went to 2xx instead of 1xx with the new GPUs. Anything in the 2xx series will be a new chip; below that is older.

So by 2015, we'll be back to GeForce FX 5200, oh wait I mean GeForce GTX 5200. nVidia's naming scheme is a mess and it doesn't help that they keep rebranding, renaming and re-everythinging their old cards over and over again.
 
So by 2015, we'll be back to GeForce FX 5200, oh wait I mean GeForce GTX 5200. nVidia's naming scheme is a mess and it doesn't help that they keep rebranding, renaming and re-everythinging their old cards over and over again.

Well, they don't have to do it for at least seven generations now. If they decide to go to 1000-names again, they have even more generations, unless they decide to drop the number scheme. ATI is no better in its top segment with the 47x0 and 48x0 series cards.

Better to look at the actual specs than the name of the card.
 
If Apple does this it would be a shame. I have used nVidia chipset-based motherboards since they first came out; they're fast, reliable and very tweakable, and I'm sure the engineers at Apple love making tweaks to the nVidia chipset to give Apple an edge.

There have been some problems with some nVidia products, but that's ultimately TSMC's fault, not nVidia's. TSMC make chips for practically everyone except Intel. But AMD have started a new independent company called Global Foundries that is starting to make their chips, and nVidia is rumored to be in talks with GF to make their chips there instead of at TSMC.
 
I hope it's true, myself. I'm really not a fan of Nvidia - especially their drivers. Then there's their dodgy re-branding of old GPUs to make customers think they're buying new technology when they aren't.

Would be good, IMHO, to see Apple buying a share of the ATI operations of AMD. ;)
 
Why doesn't Apple just stop this and go with the people that integrate the solutions together? That would really piss the two companies off, and AMD would have the revenue to create another Athlon 64 Goliath. The Opterons still outperform the Nehalem Xeons, so why not? Or they run really cool for a bit of a speed drop.

(Yes, it's fun in my world, isn't it :apple:)
 
I had a feeling this wouldn't last long. Defects = NVidia's middle name. It was on the verge of becoming Apple's middle name too.
 
Assuming Wikipedia is accurate, the affected 8600M parts were in computers manufactured between May 2007 and September 2008. I wonder, therefore, about the replacement motherboards: are they still bundling the faulty 8600M, or have they done a new run of 8600M chips so that the replacements don't have the manufacturing flaws?
 
Here's a relevant question...

WHY are Intel's graphics chipsets so inferior to other manufacturers' graphics chipsets? Intel is very, very good at manufacturing a zillion different types of chips, from tiny component chips all the way up to main CPUs. What's so different about graphics chips? Why is it so hard for Intel to create good graphics chips when they're good at making every other type of chip?
 
It's kind of ironic, especially 'cause ATI graphics cards are far better optimized for Mac OS and always have been. Yet Apple decided to go with Nvidia.

I'd welcome a good old ATI/AMD card in all Macs on all fronts. Mac OS X flies on ATI cards.
 
It doesn't matter which GPU they use as long as it's not a downgrade in performance.

Intel has been working on GPUs for a long time, so they might finally be able to come up with something as powerful as the GeForce 9400.

Anyway, there's always ATi if Intel can't make a better graphics chip.
 
That's too bad ... I wonder how the failure rate of AMD/ATI graphics products compares ...

Which graphics are better for the Mac on the whole, anyway? ATI or nVidia? Or is the better question: which company inked the sweeter deal with Apple this year? Considering all of Apple's recent marketing hype embracing nVidia across the MacBook family, I'm not terribly surprised at this announcement. No one likes a pie in the face ... :rolleyes:
 