disappointed as well...

but there are 4 things i need:

1) backlit keyboard
2) slim design
3) display port output (don't care about the new audio version)
4) FW 800


What options does the PC competition offer with those features and a better CPU/GPU combo that will run a Hackintosh?
 
For me this paragraph derailed any point you were trying to make.

"Apple doesn't want people like you using their laptops and promoting their brand. They want trend setters, they want the creative class. They want people who wear Elie Tahari jackets and sit in hotel lounges and coffee shops tapping away on MacBooks — both in real life, and on the big screen. The creative class sets the standard for what the consumer class wants. By having the creative class on lock down, they create icons of cool who use Apple products; whether in music videos, tv shows, etc... and this makes people want to go out and buy iPhones, and iPads, and buy their media using iTunes."

To me this sort of attitude is the definition of elitist thought.

Not just elitism. I'm all for elitism, when it makes any kind of sense. It's shallow elitism -- focused entirely on admiration of status symbols for the sake of status symbols, without any reference to anything with any kind of merit other than being overpriced.

Sounds dismal.

Hang on.

Why are people pointing to battery life? The entire point of the GPU-switching thing is that you don't pay for the faster GPU unless you're using it.

So people who want to do non-GPU-intensive things all day could do them just as long if the "high-power" GPU were higher powered.
 
disappointed as well...

but there are 4 things i need:

1) backlit keyboard
2) slim design
3) display port output (don't care about the new audio version)
4) FW 800


What options does the PC competition offer with those features and a better CPU/GPU combo that will run a Hackintosh?

other than FW800 and possibly hackintosh, maybe this: http://www.dell.com/us/en/business/notebooks/precision-m4500/pd.aspx?refid=precision-m4500&s=bsd&cs=04

Though it's thicker and heavier, it does have eSATA, 3G, Blu-ray, FHD, a smartcard reader, a fingerprint reader, a quad-core i7, an FX1800M, ExpressCard (argh, why did it have to go away in the 15" MBP?!), and docking stations (a huge, huge advantage if you have a lot of stuff to plug in). It can also take a mini-card SSD, a hard drive, and a Blu-ray drive all in one system.
 
I'm not sure what all the negativity around the MBP GPU is all about. Apple laptops have ALWAYS had mid range graphics cards. I'm a creative professional and a gamer. I do almost all of my gaming on my PC desktop or Alienware M11X laptop. All of my work is done on my MBP and the newest example (15" i7 2.66Ghz) has been awesome in the 24 hours that I've used it. I have been waiting for an update for 6 months and what Apple has provided meets my expectations. Last month I almost bought a Alienware M15X because I was tired of waiting for the MBP update. The M15X was $3300 when I had it all spec'ed out. Given that, I am not disappointed with my MBP. In fact, I NOTICE the performance increase from my 2009 MBP and that is all I wanted.
 
Looks good to me.

Those benchmarks are not accurate because most of the 8600M GTs tested use DDR2 memory, while Apple has always used GDDR3 memory with their 8600M GTs which is roughly 30% faster. The 330M is at the very best only 65% better than the 8600M GT, and roughly 55% better than the 9600M GT.

The 9600M GT was a pathetic upgrade over the 8600M GT and was only about 10 - 15% faster. My laptop scores 4400 on 3DMark06 on the default clock speeds, and 5200 when overclocked.

I'm not sure what all the negativity around the MBP GPU is all about. Apple laptops have ALWAYS had mid range graphics cards. I'm a creative professional and a gamer. I do almost all of my gaming on my PC desktop or Alienware M11X laptop. All of my work is done on my MBP and the newest example (15" i7 2.66Ghz) has been awesome in the 24 hours that I've used it. I have been waiting for an update for 6 months and what Apple has provided meets my expectations. Last month I almost bought a Alienware M15X because I was tired of waiting for the MBP update. The M15X was $3300 when I had it all spec'ed out. Given that, I am not disappointed with my MBP. In fact, I NOTICE the performance increase from my 2009 MBP and that is all I wanted.

Because Apple decided to use ****** mid-range graphics cards when there are far better mid-range cards, like the ATI 5650 and the ATI 5830, that completely dominate the Nvidia 330M. Not only is the Nvidia 330M weaker than the ATI cards, it's also based on 3-year-old technology.

The Nvidia 330M which you are using now is basically the 9600M GT with more cores and a higher clock speed. The 9600M GT which you had is basically the same as the 8600M GT, which was produced in 2007. In short, the Nvidia 330M is basically an 8600M GT with 16 more cores, shrunk down to 40nm.

Obviously you will notice a performance increase of roughly 50% with the additional 16 cores; however, it's extremely pathetic that that is all Nvidia managed to accomplish in 3 whole years. A mere 50% boost is crap, especially since the 5650 offers at least a 100% boost while drawing significantly less power. Oh, and it has DX11 too.
 
Nvidia: The Way It's Meant To Be Renamed™
 

I'm not sure what all the negativity around the MBP GPU is all about. Apple laptops have ALWAYS had mid range graphics cards.

Right. That's what the negativity is about -- they're charging top-of-the-line prices for mid-range hardware.

I'm a creative professional and a gamer. I do almost all of my gaming on my PC desktop or Alienware M11X laptop.

Nice for you. I'm not interested in running Windows, so I want Apple to produce some hardware a little above the mid-range, which I would happily pay for.

In fact, I NOTICE the performance increase from my 2009 MBP and that is all I wanted.

If you're not gaming on it, sure. Me, I'd like to be able to do some gaming without a Windows machine. :)
 
Right. That's what the negativity is about -- they're charging top-of-the-line prices for mid-range hardware.



Nice for you. I'm not interested in running Windows, so I want Apple to produce some hardware a little above the mid-range, which I would happily pay for.



If you're not gaming on it, sure. Me, I'd like to be able to do some gaming without a Windows machine. :)

A comparable Dell XPS 16 goes for $1700 before all the Dell discounts, and it has a Radeon 4670 GPU which is comparable to the 330M. We all know that Apple products cost more than their PC counterparts, so what is the big deal? Personally, I don't think most MBP users would benefit from a better GPU; however, I think at the very least it should be a CTO option. I'm running an ATI Radeon 5870 GPU in my PC and couldn't give a rat's @ss about the nvidia crap that is running in Apple laptops.

I guess I am tired of all the b!tching. When Apple didn't release the new MBP everyone was complaining that they were being slow. When they finally did release the MBP updates many people are complaining that they didn't do more. Yes, I would have liked to see quad core i7 processors and TOL ATI cards in my MBP, but at what cost? My MBP is about $100 shy of a similarly equipped Alienware M15X with a Nvidia 240M GPU.

I am installing Modern Warfare 2 and Bad Company 2 on my MBP right now. I am curious on how it performs compared to my Alienware M11X.
 
Huh. My $800 Windows 7 machine is capable of playing Crysis at max graphics at 30+ fps.
And I have Photoshop and Illustrator. Weird how Apple makes themselves think they are the best at everything. Steve Jobs just wants to get richer than Bill Gates.
 
Huh. My $800 Windows 7 machine is capable of playing Crysis at max graphics at 30+ fps.
And I have Photoshop and Illustrator. Weird how Apple makes themselves think they are the best at everything. Steve Jobs just wants to get richer than Bill Gates.

You can keep your crappy windoze box with its crappy software and crappy workflow.
But you get 30+ fps in Crysis, which makes everything A-OK!!

Another troll. :mad:
 
I hate the word troll and the constant use of it to label someone just because you disagree with them. The term needs to die.

Apparently you can tell this guy is a troll. I wouldn't be surprised if he just signed up on this forum to make this statement.

Actually, I've checked the 3 posts he has made thus far on these forums and they are all windoze-related. Nothing at all to do with Macs.

FWIW, IMHO I think Windows 7 is the same crappy OS as Vista. I don't see much of a difference between the two (except a few new features).

I use Windows for some of my courses via VMware Fusion and I think Windows XP is still the best.
 
Huh. My $800 Windows 7 machine is capable of playing Crysis at max graphics at 30+ fps.
And I have Photoshop and Illustrator. Weird how Apple makes themselves think they are the best at everything. Steve Jobs just wants to get richer than Bill Gates.

@ 800x600? What is the point of your post?
 
What's funny is that the GeForce GT 330M in the MacBook Pro runs at 500MHz, as opposed to the 575MHz it is capable of.

AnandTech is reviewing the base model.

However, this is still faster than the GeForce 9600M GT (450MHz) used in last year's model, not to mention far faster than the GeForce 9400M used in last year's base model ;)

So we have:

65nm vs 40nm
450MHz vs 500MHz
32 "cores" vs 48 "cores"

Left 4 Dead (1440 x 900 - High Quality):
15-inch MacBook Pro (Late 2009), GeForce 9400M: 16.9 fps
15-inch MacBook Pro (Mid 2010), GeForce GT 330M: 44.9 fps

World of Warcraft (800 x 600 - High Quality):
15-inch MacBook Pro (Late 2009), GeForce 9400M: 19.1 fps
15-inch MacBook Pro (Mid 2010), GeForce GT 330M: 52.3 fps
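A quick back-of-the-envelope check of those numbers, treating cores × clock as a very crude proxy for shader throughput (it ignores memory bandwidth and architectural differences entirely), lines up reasonably well with the benchmark gaps:

```python
# Rough sanity check of the GT 330M numbers quoted above.
# cores * clock is only a crude proxy for shader throughput;
# it ignores memory bandwidth and architectural differences.

cores_9600m, clock_9600m = 32, 450   # GeForce 9600M GT, MHz
cores_330m, clock_330m = 48, 500     # GeForce GT 330M, MHz

throughput_ratio = (cores_330m * clock_330m) / (cores_9600m * clock_9600m)
print(f"330M vs 9600M GT (theoretical): {throughput_ratio:.2f}x")  # ~1.67x

# Observed speedups vs the 9400M integrated GPU in last year's base model,
# using the AnandTech fps figures quoted above.
l4d = 44.9 / 16.9   # Left 4 Dead, 1440 x 900, High
wow = 52.3 / 19.1   # World of Warcraft, 800 x 600, High
print(f"L4D: {l4d:.2f}x, WoW: {wow:.2f}x")  # ~2.7x in both games
```

So on paper the 330M has about two thirds more raw shader throughput than the 9600M GT, and the measured jump over the old integrated 9400M is closer to 2.7x.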
 
The 330M in the MBP is not 1024MB; it is 512MB or, even worse, 256MB.

I'm listing the maximum memory the card can handle, not necessarily what is in the laptops.

Why do you say that the 330M is significantly better than the 8600M GT, but relatively on par with the 9600M GT?

It's quite the opposite actually. The 8600M GT and 9600M GT are both quite near each other since they both have 32 stream processors, and the 330M will be significantly better.

But not that much better anyway, and certainly nowhere near 2x. Probably around 50%. However a mere 50% boost is pretty sad seeing how it's been 3 years already.

Benchmarks put the 9600M GT at 50% faster/better than an 8600M GT, but the 330M at only 33% better than the 9600M GT. Given that, the 330M is 100% faster/better than the 8600M GT (or twice as fast).
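For what it's worth, those two figures really do compound to roughly 2x, because relative speedups multiply rather than add. Taking the quoted 50% and 33% numbers at face value (the figures themselves are a separate argument):

```python
# Relative speedups compound multiplicatively, not additively:
# if the 9600M GT is 50% faster than the 8600M GT (1.50x), and the
# GT 330M is 33% faster than the 9600M GT (1.33x), then overall:
ratio = 1.50 * 1.33
print(f"GT 330M vs 8600M GT: {ratio:.2f}x")  # ~2x, i.e. roughly 100% faster
```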

The Apple graphic switching works with any graphic card.

How do you know that the apple graphic switching works with any graphics card?

Are you an apple engineer? Did you help design the system? Know of an implementation that utilizes an ATI based card?

I'm guessing no, no and no. The point is, Apple's technology is most likely based on Nvidia's Optimus technology rather than written from the ground up by Apple, so it is most likely tailored to Nvidia GPUs.
 
Christ... this was depressing back when there really wasn't anything else you could squeeze into a MBP enclosure, but now, with the 5XXX series just sitting there waiting to be used...

Really embarrassing...
 
A comparable Dell XPS 16 goes for $1700 before all the Dell discounts and it has a Radeon 4670 GPU which is comparable to the 330m. We all know that Apple products cost more than their PC counter parts, so what is the big deal?

Dell, like every other vendor, has units that have a GPU that isn't, fundamentally, about three years old in design, just upgraded in process.

Personally, I don't think most MBP users would benefit from a better GPU; however, I think at the very least it should be a CTO option. I'm running an ATI Radeon 5870 GPU in my PC and couldn't give a rat's @ss about the nvidia crap that is running in Apple laptops.

That's pretty much my thought.

I guess I am tired of all the b!tching. When Apple didn't release the new MBP everyone was complaining that they were being slow. When they finally did release the MBP updates many people are complaining that they didn't do more.

Right. If they're gonna wait this long for a refresh, the least they could do is do a refresh that isn't way sub-par.

Yes, I would have liked to see quad core i7 processors and TOL ATI cards in my MBP, but at what cost? My MBP is about $100 shy of a similarly equipped Alienware M15X with a Nvidia 240M GPU.

How about, say, another $100 or $200? The high end could be $2,499 base (they've charged that much or more in the past, no big deal), and that would more than cover the total cost of a halfway-decent GPU, let alone the difference in price between that and the 330M.

I am installing Modern Warfare 2 and Bad Company 2 on my MBP right now. I am curious on how it performs compared to my Alienware M11X.

I am too! I would love to find out that the 330M is actually usable, despite being sorta low-end.
 
Saying the entire thread is silly implies that the thread, and those who bother to post in it commenting negatively about the lackluster MBP GPU offerings, are silly to even mention such lackluster GPU parts. It may be pointless (since Apple will not change), but silly? No.

For me this paragraph derailed any point you were trying to make.

"Apple doesn't want people like you using their laptops and promoting their brand. They want trend setters, they want the creative class. They want people who wear Elie Tahari jackets and sit in hotel lounges and coffee shops tapping away on MacBooks — both in real life, and on the big screen. The creative class sets the standard for what the consumer class wants. By having the creative class on lock down, they create icons of cool who use Apple products; whether in music videos, tv shows, etc... and this makes people want to go out and buy iPhones, and iPads, and buy their media using iTunes."

To me this sort of attitude is the definition of elitist thought. I can agree on Apple wanting to create and maintain a certain look, feel and vibe for their products. Helps justify the premium price but in my experience there's more to the Apple buying decision process than trying to look cool.

Even saying TV networks and studios like Turner or Cartoon Network don't do anything creative feels elitist in and of itself. The activities they perform are not strictly operational tasks bereft of any creativity.

Alas, as you say, much of this is academic. In many ways I agree these past two days have been an exercise in futility, but perhaps a cathartic one.

The tone of your missive rubbed me the wrong way, and my reaction is thus. You are of course free to express your opinion in any manner you wish. Tis indeed a free country.

Cheers,

The paragraph you excerpted: did you read it? None of that is my opinion; it is factually what Apple does. If there is elitism, it's Apple's, not mine. But honestly, I don't see anything wrong with elitism if you have some substance to back it up, and I think you're misattributing it. This is just Apple's business model; it's the way of the world, and they have a $200 billion valuation that says it's working. Regardless, it's like anything else. I have some designer clothes which I think look and feel fantastic. Is their retail price outrageously high for what it is? Sure. Should they be using better-quality leather to justify the price? Probably. Is anyone else out there making anything comparable? No. So is there any point to complaining? No, it's silly. Same with Apple: to a certain extent they're a luxury brand, but they back it up with great quality and features/capabilities no one else offers.

The people who make the shows on Comedy Central and Turner (what is Turner? Do you mean TBS? Turner owns Comedy Central, so I'm confused) are people I KNOW use Macs, and network operations and network management really just don't fit within any definition of what a "creative" is. It's not elitist of me to say that; it's fact. I'm sure Turner has an organizational chart you can look at that will make it clearer. I'm not saying those job functions don't involve any creativity, but we're not talking about creativity as an abstract term; we're talking about it the way CAA, an ad agency, or a studio would use it. I'm using it as a noun, not an adjective.

I think this might be part of the confusion: I'm not saying normal people aren't capable of creativity. I'm saying there is a specific class of people who are defined as "creatives" and create the content that Apple targets its high-end hardware to.
 
Benchmarks put the 9600M GT at 50% faster/better than an 8600M GT,

I've never seen a benchmark put the 9600M above 20% faster than the 8600M. 15% is more likely.

How do you know that the apple graphic switching works with any graphics card?

That's a good point -- it could be that the way they're doing this does actually require the nVidia hardware, in which case, they'd have to develop a more complete alternative implementation, or be stuck with nVidia's mobility options, which are pretty crappy in the low-power side of things.

I'm guessing no, no and no. The point is, Apple's technology is most likely based on Nvidia's Optimus technology rather than written from the ground up by Apple, so it is most likely tailored to Nvidia GPUs.

Actually, this part's definitely wrong; it's not optimus. But! Since they used an nVidia chipset, they may still be stuck having both GPUs be of the same basic design, meaning both nVidia.
 
How do you know that the apple graphic switching works with any graphics card?

Are you an apple engineer? Did you help design the system? Know of an implementation that utilizes an ATI based card?

I'm guessing no, no and no. The point is, Apple's technology is most likely based on Nvidia's Optimus technology rather than written from the ground up by Apple, so it is most likely tailored to Nvidia GPUs.

Because there is nothing in the explanation of the technology that says it is vendor-specific.

If it were, Apple would be considered married to NVIDIA for their computer lineup. That doesn't make sense at all.
 
Because there is nothing in the explanation of the technology that says it is vendor-specific.

But there's nothing saying it isn't.

It would be very unsurprising for the technology to require that both GPUs have basically similar architectures. For instance, you might be able to use it with an ATI discrete GPU and an ATI integrated GPU.

If it were, Apple would be considered married to NVIDIA for their computer lineup. That doesn't make sense at all.

Yeah, because Apple would never do something crazy like committing to only one of two vendors for compatible chips, even though the other one often has better price/power/performance, right? Like, say, they'd never commit to using Intel only even when AMD chips are better. Or to using AT&T only when many people get better coverage from Verizon or T-Mobile. No, Apple would never make such a deal. If by "never" we mean "pretty much any given year they'll probably make one".
 
Or to using AT&T only when many people get better coverage from Verizon or T-Mobile. No, Apple would never make such a deal. If by "never" we mean "pretty much any given year they'll probably make one".

Hollow argument. The rest of the civilized world has more choices and better coverage. And this does not apply to their hardware at all.

AMD may have had better processors back in 2005 but things have been different since 2006.
 