I am not so sure it would be a mistake to walk away from NVidia. There is growing evidence that NVidia knew about the problems with the 8600M parts it sold to Apple and lied about them. Heck, NVidia is even trying to avoid answering inquiries from its own insurance company, which is covering reimbursement for the bad chips. There was also a suggestion that the new 9400 series had heat problems early on (this partially caused the delay in its release) and that NVidia was cherry-picking parts for Apple.

Most of this information is covered by the Inquirer and no matter what you think about the site, its information makes sense. If true, look for Apple to change graphics (to ATI?) when Nehalem is introduced. Hopefully Apple will also use a low-end ATI card and not Intel for integrated graphics.

BTW, it also sounds like Dell showed the door to NVidia, as well. This gives more credence to Apple doing the same.

It makes me happy that I am hobbled with a 'crap Intel graphics card and chipset' given the fiasco that is unfolding with the 8600, and mark my words, it's going to happen to the 9400M customers as well.

I have an iMac and I deliberately bought the high end of the 20-inch model - I could have bought the 24-inch model with an Nvidia chip but decided against it; I'm happy I did, given all the overheating problems I see on the Mac forums.

When it comes to refreshing my MacBook at the end of next year or possibly the year after - if Apple decides to keep Nvidia for their MacBook and MacBook Pro range, they've lost a customer. It's just that simple. I couldn't care less if they're going to replace it - if it means I have to put up with my main board being replaced four times, as coocooforcocoap has experienced, it isn't worth it. The replacement may be free, but my time and loss of productivity isn't.
 
the day Apple switches to AMD is the day hell freezes over, and that only happened when Apple made iTunes for Windows lol
 
Heck, NVidia is even trying to avoid answering the inquiries by its own insurance company which is covering reimbursement for bad chips.
There was also a suggestion that the new 9400 series had problems with heat early on (this was what partially caused the delay in its release)

Errr, a part had heat problems prior to release. This is a newsflash?
The 9400 part has a northbridge (with integrated graphics) and a southbridge all inside one package. Heat problems wouldn't be an expected issue? Seriously? Likewise, wouldn't you expect more steppings before the bugs were worked out, given that functionality normally split across two chips (each of which typically goes through several steppings of its own before reaching stability) has been collapsed into one?

Similarly, name an insurance company that pays out on a multimillion-dollar policy without moaning about it and/or looking at the disqualifying conditions attached to it?

It also doesn't seem to be the Inquirer itself. Look at the byline on all these doom-and-gloom stories about Nvidia. Notice the same name constantly reappearing? Notice that the same name, since bounced from the Inquirer, is still stuck on the very same tune at ?


... Dell shown the door ...

Go back to post #32 in this thread. Not going to repeat it here, but there are different reasons why Nvidia is disappearing from many of Dell's offerings (e.g., pre-Nehalem systems being shown the door.)




and mark my words, it's going to happen to the 9400M customers as well.
Do you have any evidence to support that or is it just in the "nvidia is bad so therefore it is true" camp?


given all the overheating problems I see on the Mac forums.

What overheating problems? Overheating can be caused by Apple not getting the thermal balance right inside the device.

It's just that simple. I couldn't care less if they're going to replace it -

There is a problem with the 8600M, and multiple replacements are bad. But it isn't as if Apple hasn't had "multiple replacements" on other problems either.
 
Do you have any evidence to support that or is it just in the "nvidia is bad so therefore it is true" camp?

Four years of crap Nvidia support and product quality under my belt - I'm not expecting earth-shattering, desk-shaking, orgasm-producing graphics at speeds that would make an old lady collapse on her zimmer frame, but I do expect a reasonable degree of stability.

There is a reason why I have avoided Nvidia for almost a decade - if it isn't for their shonky product quality it is for their shonky driver quality.

What overheating problems? Overheating can be caused by Apple not balancing out the thermal balance inside the device.

Bull, that is the same pathetic and lame excuse that Nvidia used when their GPUs started to fail - blaming everyone else except themselves.

There is a problem with the 8600M and multiple replacements is bad. But it isn't like Apple hasn't had "multiple replacements" either on other problems.

The supply chain should have been flushed the moment that the fault was found and then the chain should have been resupplied. Either Apple isn't pulling all the faulty products out of the supply chain or Nvidia is still sending out shonky products. I'll let you be the judge.
 
Four years of crap Nvidia support and product quality under my belt - I'm not expecting earth-shattering, desk-shaking, orgasm-producing graphics at speeds that would make an old lady collapse on her zimmer frame, but I do expect a reasonable degree of stability.

There is a reason why I have avoided Nvidia for almost a decade - if it isn't for their shonky product quality it is for their shonky driver quality.
Driver issues from nVidia? Are you sure it's not ATI you're talking about? On Linux? I've been a strong purchaser of ATI hardware for the past decade but nVidia still edges ahead in drivers.

Bull, that is the same pathetic and lame excuse that Nvidia used when their GPUs started to fail - blaming everyone else except themselves.
Spend a few minutes looking at Apple's cooling solutions and razor's edge thermal behavior.
 
Eidorian said:
If AMD/ATI does have a license to make chipsets/controllers and IGP's for Intel's DMI base platforms I could imagine them providing a solution for Apple.
True, but that would raise an interesting question regarding 'conflict of interest': Intel would need to transfer large amounts of information about their CPUs to a vendor who not only makes chipsets and GPUs but also a product that competes with Intel. I wonder how they will get around that.

Outer quote then on to inner one.

How do they get around that? Well, Intel already competes with the chipset vendors, so they have relationships with competitors now. It is called a Non-Disclosure Agreement. It can be written so that you have to put up a firewall between parts of your company for limited disclosure. Besides, Intel's roadmap is usually about 12 months out in front of delivery.

Furthermore, Intel doesn't really have to give out all the details of the core; they only have to give out the details of DMI. If the DMI standardization details are not enough to build compatible parts, then it kind of sucks as a standard, doesn't it? The whole point of DMI is that it can deal with multiple northbridge implementations. Sure, you'll eventually need to do QA to make sure the south/north bridges actually work together, but that's pretty late in the design process, by which point some aspects of the CPU the northbridge is going to be packaged with are out in the press anyway. If Intel has a "neutral" lab where pre-release CPUs and chipsets can be tested, problem solved (or a firewalled lab at AMD where only the chipset folks get pre-release parts).

All very large tech companies do this. HP has operating systems that compete with Windows. IBM has stuff that competes with just about everyone else of any size. Apple routinely enters spaces where its smaller partners operate.

The bigger reason for AMD/ATI not to do business with Intel is that it would be a distraction from getting maximum synergies out of their combined company. Once AMD fully digests ATI into a smooth integration, they may have spare time to do something with Intel. In the meantime, it would be very dubious to split efforts before the drama of integration is over. That is just more complexity than is required.


Now the inner quote.

Putting upper-end integrated graphics on the other side of DMI is going to create problems. DMI was designed as a northbridge-to-southbridge communications channel. All of the other integrated graphics implementations place the IGP next to the memory controller. This makes tons of sense because you need to keep the graphics cores filled with data.

With DMI you are putting the graphics cores farther away from the memory they leverage to get graphics work done.

Furthermore, the typical high-end PCIe channels are also usually tied to the northbridge. Those are the channels that higher-end graphics usually attach to. Again, that is on the other side of the DMI connection.

The chip package sitting on the other side of the DMI link from the Nehalem CPU package is really the southbridge. Do chipset vendors really want to be in the business of just selling southbridges? Being squeezed down to a lower-margin part isn't a good track to be on.

They could go down the road of merging discrete graphics (with its own memory controller) with the rest of the I/O. There is resistance to that in the PC world because folks are fixated on discrete graphics that aren't motherboard-mounted (at least outside the laptop space). Once you can throw more power at the problem under looser thermal constraints, you can crank out higher numbers. The non-IGP/max-cores CPU packages from Intel are all upper-end, performance-first offerings that have the wide/fast PCIe connections. The normal approach is to just leverage those PCIe connections for discrete graphics. (Otherwise, what else are you going to put on those links? Who is going to buy an i7-class part and not use some of its interfaces?)
 
Driver issues from nVidia? Are you sure it's not ATI you're talking about? On Linux? I've been a strong purchaser of ATI hardware for the past decade but nVidia still edges ahead in drivers.

I used neither; since I don't do gaming - guess what I used? :) A Matrox G550 w/ 32MB VRAM; it was a marvelous device. The drivers were stable, and the hardware was reliable for over 8 years.

Spend a few minutes looking at Apple's cooling solutions and razor's edge thermal behavior.

I've had a look, but they're within the specifications Nvidia gives out - if it were me, though, I'd be questioning why they put 'thinness' ahead of 'making sure the damn thing works'. Then again, one only needs to look at the people here who go on and on about how an extra 0.00001 pound is a massive amount and how thin is incredibly important.
 
I used neither; since I don't do gaming - guess what I used? :) A Matrox G550 w/ 32MB VRAM; it was a marvelous device. The drivers were stable, and the hardware was reliable for over 8 years.
Which is quite nice but Apple isn't going to be considering Matrox as a GPU vendor at this point. The last time I considered Matrox was back when the G200 came out. Now I feel old.

I've had a look but they're within the specifications Nvidia gives out - if it were me, though, I'd be questioning why they put 'thinness' ahead of 'making sure the damn thing works'. Then again, one only needs to look at the people here who go on and on about how an extra 0.00001 pound is a massive amount and how thin is incredibly important.
Oh so true.
 
Which is quite nice but Apple isn't going to be considering Matrox as a GPU vendor at this point. The last time I considered Matrox was back when the G200 came out. Now I feel old.

Imagine if they did consider Matrox :) I would be a happy lad with a MacBook sporting a Matrox graphics card and an Intel chipset and CPU - it would definitely be something out of left field if it happened. I guess then people would only have the performance to complain about rather than the host of issues that plague the Nvidia and Intel drivers (and hardware).
 
And if they go with something other than Nvidia, I won't buy it. So that balances out nicely. ;)

Nvidia's days of making Apple IGPs are numbered.

The Inq merely stated the obvious, albeit in a somewhat misinformed way. The new range of Intel CPUs will have an integrated memory controller; and except for the high-end LGA1366, they will also contain integrated PCIe and a dual DMI + FDI interconnect to the southbridge. The dual-core versions will also have Intel graphics integrated into the same MCM as the CPU and the aforementioned components.

So unless you are talking about pairing the high-end LGA1366 with integrated graphics, NV is never going to be an option. Only LGA1366 (Core i7 9xx) would have a high enough bandwidth off-package connection for a halfway-decent graphics solution to be worthwhile; it has the QPI connection to a PCIe bridge, but that requires a QPI license (which Intel disputes NV having access to).

The LGA1156 platform will have the PCIe controller integrated on-package, and the I/O interconnect is DMI, which only operates at 2.0 GT/s - hardly sufficient for any remotely decent graphics with UMA access to system memory. And on top of that, it's not even certain that NV has a license for interfacing through Intel's DMI either.
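The bandwidth gap the post is alluding to can be sketched with a back-of-envelope calculation. The numbers here are assumptions for illustration only: DMI treated as a 4-lane link at the post's quoted 2.0 GT/s with 8b/10b line coding, compared against dual-channel DDR3-1066 memory of the era.

```python
# Rough comparison: usable DMI link bandwidth vs. the system memory
# bandwidth a UMA graphics core would want to tap.
# Assumptions (not from the post): 4 lanes, 8b/10b encoding, DDR3-1066.

DMI_LANES = 4
DMI_GT_PER_S = 2.0            # giga-transfers/s per lane (post's figure)
ENCODING_EFFICIENCY = 8 / 10  # 8b/10b line-coding overhead

# Usable DMI bandwidth in one direction, GB/s (8 bits per byte)
dmi_gb_s = DMI_LANES * DMI_GT_PER_S * ENCODING_EFFICIENCY / 8

# Dual-channel DDR3-1066: 2 channels * 8 bytes wide * 1.066 GT/s
ddr3_gb_s = 2 * 8 * 1.066

print(f"DMI usable bandwidth: {dmi_gb_s:.1f} GB/s")
print(f"DDR3-1066 dual-ch:    {ddr3_gb_s:.1f} GB/s")
print(f"memory/DMI ratio:     {ddr3_gb_s / dmi_gb_s:.0f}x")
```

Under these assumptions the graphics core sits on the far side of a link roughly twenty times narrower than the memory bus it would need to feed from, which is the crux of the "hardly sufficient" argument.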

And as said before, the lower-end dual cores will have Intel graphics integrated, which certainly precludes any NV IGP solution. So the answer to the question of whether NV or ATI is going to make IGPs for Apple's Intel platforms is clearly NEITHER. The migration over to these new platforms should be well underway by later this year (LGA1156 arrives first with Lynnfield and Clarksfield) and early next (when Arrandale and company arrive). And beyond that, I wouldn't be surprised if Apple dumps even the discrete NV GPUs for Larrabee when that arrives early next year.

NV's days with Apple are certainly numbered; and it doesn't really take any insider information to figure that out.
 
Nvidia's days of making Apple IGPs are numbered.

The Inq merely stated the obvious, albeit in a somewhat misinformed way. The new range of Intel CPUs will have an integrated memory controller; and except for the high-end LGA1366, they will also contain integrated PCIe and a dual DMI + FDI interconnect to the southbridge. The dual-core versions will also have Intel graphics integrated into the same MCM as the CPU and the aforementioned components.

So unless you are talking about pairing the high-end LGA1366 with integrated graphics, NV is never going to be an option. Only LGA1366 (Core i7 9xx) would have a high enough bandwidth off-package connection for a halfway-decent graphics solution to be worthwhile; it has the QPI connection to a PCIe bridge, but that requires a QPI license (which Intel disputes NV having access to).

The LGA1156 platform will have the PCIe controller integrated on-package, and the I/O interconnect is DMI, which only operates at 2.0 GT/s - hardly sufficient for any remotely decent graphics with UMA access to system memory. And on top of that, it's not even certain that NV has a license for interfacing through Intel's DMI either.

And as said before, the lower-end dual cores will have Intel graphics integrated, which certainly precludes any NV IGP solution. So the answer to the question of whether NV or ATI is going to make IGPs for Apple's Intel platforms is clearly NEITHER. The migration over to these new platforms should be well underway by later this year (LGA1156 arrives first with Lynnfield and Clarksfield) and early next (when Arrandale and company arrive). And beyond that, I wouldn't be surprised if Apple dumps even the discrete NV GPUs for Larrabee when that arrives early next year.

NV's days with Apple are certainly numbered; and it doesn't really take any insider information to figure that out.

Regarding Intel and integrated video: I'm wondering whether we'll see an attempt by video card vendors to block GPU/CPU integration - not that the integration is a bad thing, but having seen ridiculous claims by competitors gain traction in the past, it will be interesting to see how it is approached.
 
A lousy 15% improvement of the RS880 (or 890? variant) over the 9400M to switch to AMD? I don't think so. If there is not a minimum 40% improvement in performance and a significant power consumption reduction (potentially 20% or more), it would not be worth it. The successor to the 9400M is coming and will bring at least a 20% increase in speed and a 20% decrease in power consumption!

AMD would need cross-licensing agreements on CPU and chipset core technologies to be able to make chipsets for Nehalem-class CPUs. They prefer not to, as the Istanbul-class Opterons and their successors are competitive enough for the price they charge.

ATI needs to make serious concessions on pricing in order to win the business, assuming they have a part good enough to sell to Apple. I am holding out on this because none of us wants that crappy Intel IGP chip. Oh, give up on that already, Intel; why waste the R&D money?
 
And beyond that, I wouldn't be surprised if Apple dumps even the discrete NV GPUs for Larrabee when that arrives early next year.

There are no mobile Larrabee parts on the horizon early next year. (I've seen earlier posts that try to twist http://www.intel.com/pressroom/archive/releases/20090408corp_a.htm into Larrabee mobile. The first couple of paragraphs are about mobile/MID; the end, where Larrabee gets mentioned, is not focused on that space.) More likely, it would appear as a card for the Mac Pro (there has to be a first time Mac Pros get a bleeding-edge card; they are kind of due ;-) ).

Speculation on the web about the TDP of a 30-core Larrabee put it around 300 W. [Those estimates might have been a bit high because folks thought it would come out at 45 nm. It looks like Intel is going to skip 45 nm and come out at 32 nm, though that won't be an order-of-magnitude drop.] Perhaps they won't use 30 cores, but they would have to cut that by around 10 times, to 30 W TDP, to get into the 30-60 W TDP range of the other mobile solutions. If there is a linear ratio between cores and power, you're now down in the 3-6 core range. How much "graphics juice" is it going to have at that point?
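The scaling argument above can be made explicit. Note that the 300 W / 30-core figures are web speculation quoted by the post, and linear power-per-core scaling is an assumption for the sake of the estimate:

```python
# Sketch of the Larrabee mobile-power argument from the post.
# Speculated inputs: ~300 W TDP for a ~30-core part; assume power
# scales linearly with core count (an idealization).

SPECULATED_TDP_W = 300.0
SPECULATED_CORES = 30
watts_per_core = SPECULATED_TDP_W / SPECULATED_CORES  # 10 W per core

# Mobile GPU power budgets cited in the post: roughly 30-60 W TDP.
cores_at_30w = 30 / watts_per_core
cores_at_60w = 60 / watts_per_core

print(f"A 30-60 W mobile budget fits only "
      f"{cores_at_30w:.0f}-{cores_at_60w:.0f} cores under these assumptions")
```

With only 3-6 cores left inside a mobile power envelope, the question of how much graphics performance would remain answers itself.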

Discrete on the motherboard? Not likely. Part of the Larrabee graphics solution is software. They need to make that work well first. Then they can start making refinements that keep it working while decreasing the power requirements. It is going to be easier to make it work with a larger power budget.


Intel has to first ship something that competes in the discrete card space. Then they can do tweaks/shrinks on that to play in the discrete mobile space.
Intel already has the GMA stuff in the "limited power consumption" space. It would be very surprising if Larrabee were targeted to kill that off early in its development.
 
Regarding Intel and integrated video; I'm wondering whether we'll see an attempt by video card vendors to try and block GPU/CPU integration -

The driving force here is "Moore's Law". With an increasingly larger transistor budget you can get more into the same slice of silicon. The other factor is physics/timing (and complexity): you don't want to devote too many transistors to a single "unit". That's one of the driving factors splitting things into multiple cores. Swimming upstream against Moore's Law is doomed. They all know that, or should if they have a clue. The whole System on a Chip (SoC) concept has been yelped about for the last 6-9 years. That's why Nvidia binds the I/O around the GPU. That worked short term, but long term the CPU is the bigger black-hole-like gravitational source.


Throw in the increased use of Multi-Chip Module (MCM) packaging in the commodity chip space, and it is even more of a fight against technological progress.

There used to be more discrete storage controller, FireWire, and USB chip vendors/solutions too. The north/southbridge chips have been consolidating those for the last couple of years.

The discrete graphics folks can aim for the niches or get into the CPU business (through GPGPU, OpenCL, or combo CPU+GPU parts). Matrox is aiming at a niche: multi-monitor (3-4 screens) with reasonable power consumption. That is a relatively low-volume slice of the overall graphics market. If you target a 2-3% niche and give up on high growth, that is survivable over the long term. Folks who want to be geared for volume had better figure out how to deliver teraflops to end users, though.

As for consumer choice... there's a higher chance of getting screwed here. As long as AMD and VIA stay viable options in various niches, they'll keep Intel somewhat honest in the PC space. The smaller/embedded space still has lots of diversity.
 