
I am debating whether or not to upgrade my laptop. I feel as though my laptop is really sluggish, especially compared to the new laptops that Apple put out not too long ago. However, I could only afford the 13", which doesn't have a dedicated graphics card, per se (only integrated graphics).

So, would the Intel HD 3000 integrated graphics compete with the performance of the GeForce 8600M GT with 512 MB?
 

No. The 8600M would still be a lot faster than the HD 3000 in games.
 
The 8600M GT is more powerful than both the 320M and the HD 3000, but don't forget that you'd be gaming at a lower, less demanding resolution on the 13", so that may offset some of the performance deficit.
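To put that in numbers, here's a quick back-of-the-envelope sketch (assuming the 13" runs at its native 1280x800 and the 15" at 1440x900, and that GPU load scales only roughly with pixel count):

# Rough pixel-count comparison between the two panels (assumed resolutions).
# GPU load only scales approximately with pixel count, so this is a ballpark
# figure, not a benchmark prediction.
px_13 = 1280 * 800   # 13" panel: 1,024,000 pixels
px_15 = 1440 * 900   # 15" panel: 1,296,000 pixels
print(f"The 13\" pushes {1 - px_13 / px_15:.0%} fewer pixels per frame")  # ~21%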
 
I'm not so much a gamer in terms of fast-paced, high-frame-rate games. I occasionally play some lighter games such as Angry Birds, Minecraft, CityVille (xD Yeah, it's addicting), and maybe even Civilization IV (which would be the most demanding). Would the HD 3000 handle those well?

Mostly what I do is heavy Photoshop work; 18 MP pictures with lens blur effects take a while to render, but isn't that mostly done on the processor, so wouldn't the 13" actually be faster than my current computer?

Some web design, Pages, and virtual machines... and that's pretty much it. All my video editing is done on my desktop.

@grahamnp You do make a very interesting point, although if I plug it into an external monitor, which I often do, it wouldn't make a difference.
 

Then the HD 3000 is enough.
 
No. The 8600M would still be a lot faster than the HD 3000 in games.

I'm curious where this assessment comes from. I've been looking into this a lot recently and while the 320M is clearly superior to the HD 3000, the 8600M (I have the same card as the OP) is nearly identical in performance to the Intel chip.

The 8600M GT is more powerful than both the 320M and the HD 3000, but don't forget that you'd be gaming at a lower, less demanding resolution on the 13", so that may offset some of the performance deficit.

The 8600M is definitely not more powerful than the 320M. I think that people tend to get into the habit of assuming that a discrete card will always be better than an integrated one. The 8600M is nearly four years old. The much more recent, but integrated, 320M has surpassed it.

In the complete list that the attached screenshot comes from, the 320M, 8600M GT and HD 3000 are ranked 131, 154 and 156 respectively, out of 317 laptop GPUs. Note that the HD 3000 handily beats the 8600M in every benchmark except the severely outdated 3DMark01. I suspect that, in reality, the HD 3000 will consistently outperform the 8600M by a small margin.

grahamnp is correct in that the lower resolution of the 13" will aid graphics performance as well.

In any case, the conclusion that you'll be fine with the HD 3000 is correct but incomplete. It will be an upgrade from what you've got. :)
 

Attachments

  • Screen shot 2011-04-13 at 1.18.38 PM.png
In any case, the conclusion that you'll be fine with the HD 3000 is correct but incomplete. It will be an upgrade from what you've got. :)

This thread will explain a bit more:

https://forums.macrumors.com/threads/897409/

The 8600M GT used in the MBP used GDDR3 memory, and it's definitely quite a lot faster than the 320M.

E.g., the screenshot you provided says the 8600M gets 32xx in 3DMark06, but when it has been tested on the MacBook Pro with the 8600M GT it gets closer to 5000.
 

If you read the results the previous poster posted, you can see that the GeForce 8600M beats the 320M in some areas but not all. That said, the results in that table don't show any obvious reason why the 8600M is being beaten by the GT 320M or the HD 3000, apart from those chips supporting DirectX 10.1 instead of 10. One could guess that even though the HD 3000 and the GT 320M have to use system memory, the newer chips may run at higher clock speeds, etc. I'm not 100% sure about what I just said, but based on those results I feel it's a reasonable guess without researching it further.
 

You shouldn't read too much into that table from the previous poster. It compiles a list of results for the 8600M and averages them. The 2007-2008 MacBook Pro uses the latest GDDR3 version of the chip, which is a lot faster than the DDR2 version.
 

I hadn't noticed that. Thank you for correcting me. The 3DMark06 score on that table is clearly wrong for the MBP.

The thread you linked, and the thread linked within it, have people reporting around 4200-4400, which is indeed quite a lot better than the 3200 in the table I linked, but the only people getting as high as 5000 appear to be those who have overclocked their GPUs. And that's for the 512 MB 8600M; the OP may have the 128 or 256. The scores for the 320M and the HD 3000 are right in that neighbourhood. I suspect the only reason the HD 3000 is higher than the 320M is the boost it gets from the Sandy Bridge processor.
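Just to put a rough number on "quite a lot better" (the midpoint below is my own illustrative pick):

# How far the table's averaged 8600M GT figure sits below the MBP reports
# in the linked thread. The 4300 midpoint is an illustrative assumption.
table_score = 3200   # 8600M GT entry in the benchmark table I posted
mbp_score = 4300     # middle of the 4200-4400 range reported for the MBP
print(f"The MBP reports run about {(mbp_score - table_score) / table_score:.0%} higher")  # ~34%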

Are there other reasons I'm overlooking that would make the benchmarks I posted misleading? These specific MBP benchmarks do reveal the impact the RAM speed has. Other than the fact that we can't use my benchmark table for things like ULV versions of the HD 3000, due to their lower clock speeds, is there something else they're obscuring?
 
Some MBPs in that thread do seem to push up toward the high 4000s. As far as I can tell, though, it might still be a bit of a stretch to call the 8600M "quite a lot" faster. The benchmarks coming from MacBooks with 320Ms in that list show around 4300, and up to 4700. Perhaps a 10% difference?
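For what it's worth, here's roughly how that gap works out with the scores mentioned in this thread (the exact endpoints are my own assumptions):

# Rough check of the "perhaps 10%" estimate; the endpoints are illustrative.
score_8600m = 4800            # "high 4000s" reported for some 8600M GT MBPs
scores_320m = (4300, 4700)    # range reported for the 320M MacBooks
for s in scores_320m:
    print(f"8600M GT vs 320M at {s}: {(score_8600m - s) / s:+.1%}")
# -> about +12% against the lower 320M scores, only ~+2% against the best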

The only report I see on there from a MacBook with an HD 3000 does indeed show a rather embarrassing 2550.

Something tells me there are more pressing matters for me to attend to than this right now but I can't think of them :p
 

The 4200-4400 numbers were for 3DMark runs at a higher resolution.

"No, the 8600M GT did 5561 at 1280x800 in 3DMark06. The 320M did 4700 at 1280x800. The gap is definitely closing, which is not surprising because it has been 3 years already. However at the moment the 320M is still unable to pass the mid-range card from 3 years old."

There is an overclocked result for the 8600M GT that is over 6000.

 
We are looking at the same thread, right? :)

https://forums.macrumors.com/threads/451456/

First responder posted a screenshot showing 4414 at 1280x800.

Second responder:

"I have the 2,5 Ghz with 2GB RAM and 7200 RPM HD.

3dmark06
4100 3DMarks
SM2.0 Score: 1623
HDR/SM3.0 Score 1517
CPU Score 2193

I ran it without messing with any options so the resolution was set to 1280x1024, and not the native 1440x900"

Third:

"I ran 3DMark06 Basic Edition on my 2.5 GHz with 512 VRAM and 2 GB of RAM running on Vista Ultimate. Everything set to default of course like 1280 x 854 and no AA. Got 4200"


Those are from the first page, from people reporting that they used the default 1280x800 resolution. I'm not sure why we'd call these invalid. The scores get higher later in the thread, but the 5500 score is definitely an outlier among those reporting stock clocks.
 

Attachments

  • 083b722d.png

The screenshot clearly says 1280x1024, getting 4414.

Another quick data point here:
http://www.barefeats.com/mbpp25.html

There the 8600M GT clearly outperforms the 320M in some real-world gaming benchmarks.
 
Mmmmhmm... And the other two are at a lower resolution with similar or lower scores. I thought you were saying that at 1280x800 they're up to 5000. If anything, the screenshot shows the best result of the three I quoted: a higher score AND a higher resolution.

I'm not sure what you're suggesting by pointing out that the screenshot shows 1280x1024, except that it's not 1280x800. Is it that, if that first guy had run it at 1280x800, he or she would have gained 600 points?
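For perspective, here's the raw pixel-count difference between those two settings (this says nothing about how the composite 3DMark06 score would actually change, since it doesn't scale linearly with resolution):

# Pixel-count arithmetic only; the 3DMark06 total does not scale linearly
# with resolution, so treat this as a rough upper bound on the advantage.
px_screenshot = 1280 * 1024   # resolution shown in the screenshot: 1,310,720 px
px_default = 1280 * 800       # the "default" 1280x800: 1,024,000 px
print(f"1280x1024 pushes {px_screenshot / px_default - 1:.0%} more pixels per frame")  # ~28%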
 
Sorry, I'll stop feeding this fire. The OP's question has certainly been answered.
 