
Umbongo

macrumors 601
Original poster
Sep 14, 2006
It seems that four board partners are seeing G92 and G94 chips going bad in the field at high rates. If you know what failures look like statistically, they tend to follow a predictable pattern: the failures start out small, and ramp up quickly - very quickly. If you know what you are looking for, you can catch the signs early on. From the sound of the backchannel grumblings, the failures have been flagged already, and NV isn't playing nice with their partners.
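The early-warning idea described above can be sketched with a simple Poisson model: if a partner knows its normal monthly RMA rate for a SKU, it can flag a month whose failure count would be wildly improbable under that baseline. This is a minimal illustration with entirely hypothetical numbers, not anything from the article:

```python
import math

def poisson_sf(k: int, lam: float) -> float:
    """P(X >= k) for X ~ Poisson(lam): the chance of seeing
    k or more failures if the baseline rate still holds."""
    # Sum the PMF over 0..k-1, then take the complement.
    cdf = sum(math.exp(-lam) * lam**i / math.factorial(i) for i in range(k))
    return 1.0 - cdf

def looks_elevated(observed: int, expected: float, alpha: float = 0.001) -> bool:
    """Flag a month whose RMA count is implausible under the baseline."""
    return poisson_sf(observed, expected) < alpha

# Hypothetical data: a partner normally sees ~20 RMAs/month for one SKU.
baseline = 20.0
for month, rmas in [("May", 22), ("Jun", 27), ("Jul", 41), ("Aug", 63)]:
    status = "elevated" if looks_elevated(rmas, baseline) else "normal"
    print(month, rmas, status)
```

Small bumps stay within normal Poisson noise, but once counts climb several standard deviations past the baseline the tail probability collapses, which is why a ramping failure mode is catchable early if anyone is watching the numbers.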

Why wouldn't they? Well, the G92 chip is used in the 8800GT, 8800GTS, 8800GS, several mobile flavours of 8800, most of the 9800 suffixes, and a few 9600 variants just to confuse buyers. The G94 is basically only the 9600GT. We are told all G92 and G94 variants are susceptible to the same problem - in other words, they are all defective. Any guesses as to how much this is going to cost?

From the look of it, all G8x variants other than the G80, and all G9x variants are defective, but we have only been able to get people to comment directly on the G84, G86, G92 and G94, and all variants thereof. Since Nvidia is not acknowledging the obvious G84 and G86 problems, don't look for much word on this new set either - if they can bury it, it will drop their costs.

In the end, what it comes down to is that the problem is far bigger than they are admitting, and crosses generational lines, process lines, and OEM lines. Nvidia is quick to point the finger at everyone but themselves, but after a while, the facts strain those cover stories well past breaking point. There is a common engineering failure here - this problem is far too widespread for it to be anything else. The stonewalling, denials and partner gagging is simply a last-ditch attempt at wallet covering.

The full article is an interesting read:

http://www.theinquirer.net/gb/inquirer/news/2008/08/12/nvidia-g92s-g94-reportedly

We can't say the 8800GT for the Mac Pro is definitely going to fail, but it is definitely worth considering the ATI 3870 based on the current state of affairs. Especially if you are buying one after the original purchase.
 
Crap. This is especially bad since the new MacBook Pros will probably use 9600 GTs. Son of a *****.
 
Hmm.. that's an interesting find. Perhaps I should consider returning my newly purchased 8800 GT card, because at the end of the day I don't wanna be wasting money on a defective bit of hardware.
 
Crap. This is especially bad since the new MacBook Pros will probably use 9600 GTs. Son of a *****.

Why do you think Apple will keep using Nvidia cards in their new Macs?
Give the new MacBook Pros an ATI card.

Apple should stop using faulty Nvidia from now on. That's what I call them: "Faulty Nvidia".
 
With all the very recent developments with nVidia's graphics chips, things don't look good to say the least. :rolleyes: Hopefully, they'll get it sorted, but if they don't work with vendors to replace defective products, it may cost them future sales.

Definitely steering me towards ATI products. ;) HD 3870 to be specific. :D
 
Why do you think Apple will keep using Nvidia cards in their new Macs?
Give the new MacBook Pros an ATI card.

Apple should stop using faulty Nvidia from now on. That's what I call them: "Faulty Nvidia".

Like the faulty ATI X1900 cards? Or the 2600s that
showed artifacts in both the Mac Pros and Alu iMacs?
 
Most of what they get right isn't actually their scoop. Their big claim to fame is "leaking" XP SP3 and SP1 before anyone else did, which is more akin to "predicting rain in Seattle", as one Giz poster wrote. They are notoriously pro-ATI, and are not in the same league as the big boys at Ars, Anandtech, and Tom's. Their posts are made of FUD; rather than hard evidence to back up their claims, they have hearsay. Hardly a reasonable substitute.

Is it obvious that Nvidia has some defective chips? Yes. Is it obvious that the number of defective chips is higher than usual? Yes. Are 100% of these chips defective? Only if you write for a sensationalist blog that blows everything out of proportion without regard for good, thorough journalism.
 
Those were individual cards having problems. The Nvidia case is a systemic failure of all cards in a series. A slight difference. :rolleyes:

You would prefer a brand that has been shown to
have faults rather than one that is rumoured to?

[NOTE]
This remark pertains to cards actually used in Macs.
 
This one is pretty bad!
http://online.wsj.com/article/SB121859523254035725.html?mod=googlenews_wsj

Reuters:
"Separately, the company said it would take a charge of $150 million to $200 million in its second quarter to cover anticipated warranty, repair and return costs associated with a defect on some of its chips.

The charge would be to cover expected warranty, repair, return, replacement and other costs, arising from a weak packaging material used in some of its previous generation graphics chip products sold in notebook PCs."

In theory, in the current economic climate, this is the kind of problem that can end even a medium-sized business. Fortunately, nVidia should be big enough to bounce back, especially with their new product line-up due... Hopefully those won't be defective!

More Links:
http://www.theregister.co.uk/2008/07/03/nvidia_forecast_glitch/
http://online.wsj.com/article/SB121503619200224335.html?mod=googlenews_wsj
 
You would prefer a brand that has been shown to
have faults rather than one that is rumoured to?

[NOTE]
This remark pertains to cards actually used in Macs.

Don't you see that these NVidia cards are faulty? You must be an Nvidia fan I guess.
 
You see I'm a huge Nvidia fan... but in the iMac I'm purchasing in 2 weeks it looks like I'm going to have to get an ATi, as the 8800GS is also part of this latest rumor. Although I'm waiting for Nvidia or Apple to get back to me regarding this issue! (Not that they will!)

My second ever ATi... my first one was appalling and I ended up chucking it. That was a good 5 years ago, and anyway, it couldn't be as bad as a defective Nvidia :D :D
 
Don't you see that these NVidia cards are faulty? You must be an Nvidia fan I guess.

There is no strong evidence the 8800GTs in Macs will prove to be faulty.
The HD2600XTs, by way of contrast, have already suffered problems.

http://www.xlr8yourmac.com/feedback/ATI_2600_firmwareUpdate.html

Note: the rumoured HP/NVidia solution is the same as the rumoured
Apple/ATI solution. Namely, to increase the minimum fan speed. Both
of these rumours point to heat problems. But then, I don't place much
faith in gossip & rumours.

However, I do think it's fair to say that both these graphics cards
manufacturers are struggling to come to terms with power usage
and heat dissipation. The current path of development is possibly
unsustainable.
 
I've had no problems with my 8800GT, and I'll swallow my words if something goes wrong (and probably come back here to bitch on the forums :)), but seriously, it's quiet, reliable and I've had no problems so far. Why all this doom and gloom? :confused: Anyway, if things go wrong I always have my flashed 8800GTS 512MB :D, or you could just up the fan speed on your 8800GT if you're that worried.
 
Why do you think Apple will keep using Nvidia cards in their new Macs?
Give the new MacBook Pros an ATI card.

Apple should stop using faulty Nvidia from now on. That's what I call them: "Faulty Nvidia".

That would be good. Unfortunately, the best we could hope for is the 3650, barely any faster than the 8600 GT, and actually not faster in all cases. Not really an upgrade.

Like the faulty ATI X1900 cards? Or the 2600s that
showed artifacts in both the Mac Pros and Alu iMacs?

Artifacts are one thing... sounds like overheating memory, fixed by better cooling or reduced clocks.

GPU death is quite another.

People with MBPs are having their 8600s drop like flies. It's all over the MBP forum; go have a look.

This is quite a serious problem.
 