A chipset is the interconnecting processor on the motherboard (MLB); it controls the transfer of data between all of the other components.

NVIDIA, as a whole, doesn't suck, nor do most of their products. The GeForce 8600M GT is a certifiably bad product, as are the early-revision GeForce 9400M and early-revision GeForce 9600M GT. They fail, and NVIDIA made horrible choices in making that hardware. It doesn't mean that they suck, have always sucked, and always will suck. They made a series of lemons in close proximity to each other. You just happened to have terrible timing. :-(

It seems only Apple gets the short end of the stick. Every single NVIDIA card I've had, in either desktop or laptop form, has never failed or had any strange issues, even overclocked its entire life. I've still never had issues with either my GeForce 7800 GTs in SLI or my 9800 GT.
 
Get ready, guys: AMD Phenoms and ATI discrete graphics in the next MBPs and MBs.

Wouldn't be bad as long as Apple lowers the price; right now you can get a quad-core Phenom II laptop for under $600.

This laptop has:

AMD quad-core @ 1.7GHz
4GB RAM
500GB HDD
ATI 4250 video card

AMD is always the "budget" player and the bang-for-the-buck option.
 
Wouldn't be bad as long as Apple lowers the price; right now you can get a quad-core Phenom II laptop for under $600.
AMD's mobile quads aren't really much to talk about.
 

It seems only Apple gets the short end of the stick. Every single NVIDIA card I've had, in either desktop or laptop form, has never failed or had any strange issues, even overclocked its entire life. I've still never had issues with either my GeForce 7800 GTs in SLI or my 9800 GT.

Agreed.... I've only had one Nvidia GPU failure, due to abuse (after a laptop CPU replacement the heat pipe no longer contacted the GPU, and the heat buildup fried it).

I've never had hardware problems with ATI, but their drivers have been a nightmare since my first ATI card in 1997.

I currently only have one ATI GPU, the HD 4600 in my Core i7 Studio XPS. (All the laptops are either Quadros or Intel IGP, and the other desktops are Quadros.) The XPS occasionally (every month or two) will hang for a minute, go black, and come back with all windows intact and a popup that says "Your ATI driver failed and has been restarted." (Yes, Windows 7 can unload, reload, and restart the graphics driver without a reboot.)

Anything But ATI is my motto.
 
This is what I did with my 13" MacBook with an Intel GMA X3100. This computer is almost 3 years old and its GPU was outdated at launch. I'm in the market for a new laptop, and I am really hoping Apple comes up with some better option for the lower-end MacBooks soon.

I mean, at least an i3... that would be enough for a purchase for me.
 
It seems only Apple gets the short end of the stick. Every single NVIDIA card I've had, in either desktop or laptop form, has never failed or had any strange issues, even overclocked its entire life. I've still never had issues with either my GeForce 7800 GTs in SLI or my 9800 GT.

I've seen an EVGA NVIDIA GeForce 8800 GT desktop card, a Sparkle NVIDIA GeForce 9500 GT, and a plethora of laptops with the GeForce 8600M GT (both MacBook Pro and PC laptops) go down. The earlier-revision GeForce 9400M cards were also known to fail on occasion. That said, that's all within a couple of generations. Things both before and since haven't failed as epically.

As far as Apple's use of NVIDIA stuff, I'd agree, it has been lacking in terms of competitive performance, though, for what it is, the 320M is pretty great. But yeah, nothing anywhere near as amazing as the stuff they make for PCs.
 
Agreed.... I've only had one Nvidia GPU failure, due to abuse (after a laptop CPU replacement the heat pipe no longer contacted the GPU, and the heat buildup fried it).

I've never had hardware problems with ATI, but their drivers have been a nightmare since my first ATI card in 1997.

I currently only have one ATI GPU, the HD 4600 in my Core i7 Studio XPS. (All the laptops are either Quadros or Intel IGP, and the other desktops are Quadros.) The XPS occasionally (every month or two) will hang for a minute, go black, and come back with all windows intact and a popup that says "Your ATI driver failed and has been restarted." (Yes, Windows 7 can unload, reload, and restart the graphics driver without a reboot.)

Anything But ATI is my motto.

If Dell supplied the driver, then it's their fault. Otherwise disregard that comment. I've had three ATI cards, two in a Mac, one in a PC. I've since recommended their cards to others and to the best of my knowledge they still work fine (drivers included). I think it's just a case of "I've had one bad experience and therefore I distrust this brand", which is fair as that is what we typically do as consumers. I like the price-tag on ATI's cards. It doesn't break the bank and I can still enjoy the games I want to play.
 
Well, if Apple did go with ATI chipsets, does anyone have a clue what they might use? I'm not tech-savvy about what fits on the motherboard. I would think they would have choices that would be good.
 
Well, if Apple did go with ATI chipsets, does anyone have a clue what they might use? I'm not tech-savvy about what fits on the motherboard. I would think they would have choices that would be good.

It is doubtful that we'd see ATI branded chipsets supporting Intel's processors for the same reason we won't be seeing NVIDIA chipsets past the 320M in Apple machines. Even if we saw ATI chipsets, they'd likely be branded as AMD and come along with a processor to match it. That said, I have no idea what's in their pipeline; it's safe to say that whatever is next could be considered if Apple is to go that route.

Otherwise, some form of Mobility Radeon HD 5700 or 5800 series chip, as I haven't seen the 6 series on a mobile GPU yet.
 

I'm suspicious that this whole discussion about MacBook Pros and such by NVIDIA and Intel is meant to prepare us for the upgrade.

In the end, I think what NVIDIA and Intel say are just tricks to get the audience's attention.

I mean, if Apple chooses C2D again... they really must be crazy. Or do they want to push potential buyers to the upper models?

Oh, I don't know... we'll see
 
You don't need good graphics in Apple products. They will use "old" chips for as long as it takes for Intel to catch up (about 100 years at the current rate I think :p ) and OSX machines will suffer as a result going back to their well deserved reputation of being utter CRAP for graphics and gaming, but it doesn't matter folks! Apple doesn't really care because they want everyone to buy iOS devices and dump the Mac once and for all in the long run anyway. This will simply hasten all that and before you know it, you will only be able to buy the iPad, iPhone, iPod Touch and the new iCrap desktop, featuring the ability to watch a movie on your screen and listen to music and even use Skype just like the iPhone can already do! Wow! We couldn't do ANY of that on a Mac before! Hooray! :)
 
I'm suspicious that this whole discussion about MacBook Pros and such by NVIDIA and Intel is meant to prepare us for the upgrade.

In the end, I think what NVIDIA and Intel say are just tricks to get the audience's attention.

Doubtful. NVIDIA's CEO is known for his frankness and not beating around the bush about topics of discussion. It could be a ploy on Intel's part, though. Although even that sounds at least slightly ridiculous.

I mean, if Apple chooses C2D again... they really must be crazy. Or do they want to push potential buyers to the upper models?

Oh, I don't know... we'll see

No, if they choose C2D again, it means they recognize that they can get away with it. The vast majority of Mac mini, MacBook Air, and white MacBook customers don't even know what processor they have, or care for that matter. Not to say that EVERYONE doesn't, or that YOU don't, but most don't. The only exception, with regard to C2D models still shipping, is the 13" Pro, which I'm guessing will be discontinued before too long, both for that reason and due to its similarity to the white MacBook in terms of on-board (as in MLB) features and its similarity to the 13" MacBook Air in benchmarks. The only things customers would be missing would be the 13" Pro's FireWire port, backlit keyboard, aluminum enclosure, and IR sensor, and only the first of those four is even practically useful.
 
Yebubbleman said:
No, if they choose C2D again, it means that they recognize that they are able to get away with it.

I hope Apple knows what they're doing and does it well.

That's all I'm saying.
 
I hope Apple knows what they're doing and does it well.

That's all I'm saying.

I'll put it to you this way. They're very much aware of the NVIDIA/Intel contention, and anything they can do to get away from that, and from the critical reception they'll get when system performance is still stuck in yesteryear, is something they'll do. Lucky for them, the majority of MacBook Air users don't care. The majority of Mac mini users don't care. The majority of white MacBook users don't care. The only users who'll have a serious gripe with it are the 13" Pro customers, and I can safely guarantee you that they WILL do something about it. They'll either discontinue it or merge it with the white MacBook. Either way, they won't have "MacBook Pro" customers griping about the continued use of a Core 2 Duo.
 
They'll either discontinue it or merge it with the white MacBook. Either way, they won't have "MacBook Pro" customers griping about the continued use of a Core 2 Duo.

Discontinue my next planned purchase? :eek: Yeah, that'll shut me up.

You make reasonable assumptions though, and while the writing is on the wall, I still think the MBP13 will stay for at least another round, albeit with no discrete GPU. (Which is fine by me as I'm not a gamer.)

Something's got to give sooner or later though.
 
Discontinue my next planned purchase? :eek: Yeah, that'll shut me up.

Hey man, I'm not thrilled about it either, as it was also MY next planned purchase. Though to be fair, it SHOULD have a Core i and either an NVIDIA (or ATI/AMD) IGP or a discrete GPU, but neither of those is gonna happen, and that's pissing a lot of people off.

You make reasonable assumptions though, and while the writing is on the wall, I still think the MBP13 will stay for at least another round, albeit with no discrete GPU. (Which is fine by me as I'm not a gamer.)

Something's got to give sooner or later though.

Well, here's the thing, if it sticks around, does it have an Intel IGP and a Core i or the 320M and a Core 2? And which option is going to suck more for (a) that machine, (b) Apple (in terms of critical reception) and (c) the 13" MacBook Pro target market audience?
 
You make reasonable assumptions though and while the writing is on the wall

The interesting thing here is also that Apple seems to go through cycles where its Pro laptop line has a 12" or 13" model almost as an in-betweener between the entire lower-end line (MacBook or iBook G4) and the rest of the Pro line (15" or 17" MacBook Pro or PowerBook G4) and then it merges that model with the lower-end line. Then it makes a differentiating in-betweener, then it merges it back into the lower-end line. I'm wondering if we'll see it happen a third time.
 
Well, here's the thing, if it sticks around, does it have an Intel IGP and a Core i or the 320M and a Core 2? And which option is going to suck more for (a) that machine, (b) Apple (in terms of critical reception) and (c) the 13" MacBook Pro target market audience?

It all depends on which implementation brings the least amount of "embarrassment" to Apple. If indeed, that even bothers them?

Rationalizing the continuation of C2D+320M in the 13MBP might prove too difficult even *if* it performs better than Sandy Bridge+iGPU. I think Apple got its tit caught in the wringer on this one and yes, somebody's going to be pissed. Doesn't matter as long as it ain't me. :D

I still say kill off the 17" MBP.

Then it makes a differentiating in-betweener, then it merges it back into the lower-end line. I'm wondering if we'll see it happen a third time.

The MBSP is born. (Semi-Pro) :)
 
AMD's mobile quads aren't really much to talk about.

Course not, they're budget chips. But in the context of this thread, if Apple did decide to ditch NVIDIA and go with AMD for everything, they couldn't possibly sell a base-model laptop for $1,000 without it being ridiculously overpriced (more so than they are right now); Apple would have to drop their prices well below $1K.
 
It all depends on which implementation brings the least amount of "embarrassment" to Apple. If indeed, that even bothers them?

Rationalizing the continuation of C2D+320M in the 13MBP might prove too difficult even *if* it performs better than Sandy Bridge+iGPU. I think Apple got its tit caught in the wringer on this one and yes, somebody's going to be pissed. Doesn't matter as long as it ain't me. :D

I doubt they're worried about embarrassment, though they do want to please their customers, and it does seem like more customers will be pissed regardless of whether the next refresh is C2D+320M or Sandy Bridge + Intel IGP. Again, may I note that this is only really a problem with the 13" MacBook Pro's target customer base. For the most part, the majority of the customers for the other C2D+320M machines won't care and will be fine either way. Regardless of what happens to the 13" Pro, it'll be an interesting move.

I still say kill off the 17" MBP.

It's kind of like the Mac Pro in that it's not for everyone, but it's definitely for SOMEONE. 17" laptops are now fairly common. The only thing annoying about Apple's (to me at least) is that it doesn't have, in addition to an optical drive and a single hard drive, a second hard drive. Most 17" laptops have that and frankly, that's too awesome.

Otherwise, I may be the only person out there who'd be down for something like this, but I think a 17" MacBook Air would be pretty freakin' cool. It'd be the weight of a 13" MacBook Pro, or somewhere between that and the weight of a 15" MacBook Pro. It'd have more room for venting and therefore have room for maybe a faster processor or a larger battery or even removable RAM. It'd still be all flash blade-based. Large freakin' screen (with matte options). Sounds like win to me, though I'll bet I'm one of few, if any others, who think so.



MBSP is born. (Semi-Pro) :)

If they converge the white MacBook and 13" MacBook Pro correctly, the resulting "MacBook" essentially becomes Semi-Pro on its own, just without the "Pro" moniker.
 
Course not, they're budget chips. But in the context of this thread, if Apple did decide to ditch NVIDIA and go with AMD for everything, they couldn't possibly sell a base-model laptop for $1,000 without it being ridiculously overpriced (more so than they are right now); Apple would have to drop their prices well below $1K.

Not to mention, you'd probably have a CPU/graphics/chipset combo that, even with IGPs instead of discrete graphics, is more solid than Intel's. I mean, don't get me wrong, the Core i3/i5/i7 is RAD! But the graphics options on the low end BLOW. Even having some form of NVIDIA Optimus or an AMD/ATI equivalent with an Intel IGP seems pointless anyway if the IGP itself is utter crap and the computer always defaults to the better GPU.
 
Again, may I note that this is only really a problem with the 13" MacBook Pro's target customer base.

Well, I certainly have nothing to back it up but I would think the 13" MBP outsells the 15" & 17" varieties, no? I know the white book is #1 overall.

13.3" seems to be Apple's sweet spot otherwise they wouldn't have introduced another one.

Somebody do the math and see what going from a 16:10 display to 16:9 in the MBP measures out to while keeping the same width (I'm too lazy). Of course that means it won't be as deep. Is there room to do this?
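To save anyone the napkin math, here's a quick sketch of that 16:10 → 16:9 conversion, assuming the current 13" MBP's 13.3-inch diagonal (the `panel_dims` helper is just for illustration):

```python
import math

def panel_dims(diagonal, ratio_w, ratio_h):
    """Width and height (in inches) of a panel with the given diagonal and aspect ratio."""
    scale = diagonal / math.hypot(ratio_w, ratio_h)
    return ratio_w * scale, ratio_h * scale

# Assumed current panel: 13.3" diagonal at 16:10.
width, height = panel_dims(13.3, 16, 10)

# Keep the same width, switch to 16:9.
new_height = width * 9 / 16
new_diagonal = math.hypot(width, new_height)

print(f"16:10 -> {width:.2f} x {height:.2f} in")
print(f"16:9  -> {width:.2f} x {new_height:.2f} in (diagonal {new_diagonal:.2f} in)")
```

So at the same ~11.28" width, the panel goes from about 7.05" deep to about 6.34" deep, i.e. it loses roughly 0.7" of depth and ends up around a 12.9" diagonal.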
 