
henrikrox
Original poster
Wirelessly posted (Mozilla/5.0 (iPhone; U; CPU iPhone OS 4_2_1 like Mac OS X; nb-no) AppleWebKit/533.17.9 (KHTML, like Gecko) Version/5.0.2 Mobile/8C148 Safari/6533.18.5)

http://www.techyalert.com/2011/02/25/macbook-pro-2010-vs-macbook-pro-2011/

Here you go.

Scroll down the page: the i5 averaged 12 fps behind (a lot) compared to a Core 2 Duo with a 320M.


Left 4 Dead

2011 MBP with Intel HD 3000 (min/max/avg) = 38 / 90 / 63
2010 MBP with nVidia 320m (min/max/avg) = 53 / 92 / 75
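For what it's worth, the gap in those averages works out like this (a quick Python sketch, using only the fps figures quoted above):

```python
# Left 4 Dead average fps from the comparison above
hd3000_avg = 63   # 2011 MBP, Intel HD 3000
gt320m_avg = 75   # 2010 MBP, Nvidia 320M

gap = gt320m_avg - hd3000_avg            # absolute difference in average fps
pct_slower = 100 * gap / gt320m_avg      # HD 3000 shortfall relative to the 320M

print(f"{gap} fps gap; HD 3000 about {pct_slower:.0f}% slower on average")
# → 12 fps gap; HD 3000 about 16% slower on average
```

So the "12 fps behind" claim is an average-to-average comparison, roughly a 16% deficit in this one game.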
 

It's a start
 
Geekbench

2011 MBP with 2.3GHz i5 Sandy B. = 5962

2010 MBP with 2.4GHz Core 2 Duo = 3388
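A quick check on what those Geekbench scores actually imply (Python sketch, scores as quoted above):

```python
# Geekbench scores quoted above
i5_2011 = 5962    # 2011 MBP, 2.3GHz Sandy Bridge i5
c2d_2010 = 3388   # 2010 MBP, 2.4GHz Core 2 Duo

ratio = i5_2011 / c2d_2010
print(f"{ratio:.2f}x the Core 2 Duo score")   # ~1.76x, i.e. about 76% faster
```

So by this benchmark the new CPU is roughly 1.8x the old one, not 2-3x.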



Is anyone actually seriously considering this a gaming computer? The CPU improvements are significant. The GPU is probably adequate for average use, and probably faster than the last-gen MacBook with the 9400M.
 
People need to stop complaining. I'd take a 10-20% loss in graphics over a 200-300% increase in CPU power any day of the year.
 

Exactly my point. The CPU is so much better than a Core 2 Duo, but it still loses heavily in games.

And it's even worse when you turn the detail levels up.

I don't care about 3DMark etc. This is actual GPU performance.
 

TMRaven said:
People need to stop complaining. I'd take a 10-20% loss in graphics over a 200-300% increase in CPU power any day of the year.

So you're saying the CPU is three times as good? I think you need to work on your math.

Maybe 30-40%.

And I'm not complaining, just laying the facts on the table.
 

Evil Spoonman said:
Exactly my point. The CPU is so much better than a Core 2 Duo, but it still loses heavily in games.

And it's even worse when you turn the detail levels up.

I don't care about 3DMark etc. This is actual GPU performance.

You really have no idea what you're talking about, do you?


This is already being discussed in another topic: https://forums.macrumors.com/threads/1102927/

That topic talks about synthetic results. Who cares about a score in a program? What matters is how your GPU performs in games.

And I'm just posting benchmarks. The numbers don't lie.
 



So you're saying the CPU is three times as good? I think you need to work on your math.

Maybe 30-40%.

And I'm not complaining, just laying the facts on the table.

Sorry, I confused the 3.1x number for the AMD graphics with the 2.2x number for the CPU on Apple's website, but my point still stands. The Intel integrated graphics are only marginally slower than the 320M, while the Sandy Bridge CPU is 2x faster than the dated Core 2 Duo in the previous low-end MacBook Pros. There's really no comparison as to which laptop is the better machine overall.

And I never said you were the one complaining, but if you take the word "people" to mean yourself all the time, that's not my problem.

And no, that notebookcheck link does give some actual real-world tests too, given that they included fps benchmarks from actual games.
 
Nvidia has CUDA; Intel and ATI don't. So even if you don't play games, if you work with Adobe CS5, for example, an Nvidia GPU is much better.

But Apple had no choice, so be happy you got the i5/i7 in the 13" MBP.

If you think the GPU is too weak, there are always the 15" models. They have a higher resolution too.
 
That topic talks about synthetic results. Who cares about a score in a program? What matters is how your GPU performs in games.

And I'm just posting benchmarks. The numbers don't lie.

The numbers don't lie, but you are interpreting them totally incorrectly. Comparing a GPU/OS combination that has been out for ONE DAY to one that has been out for years with a model range and driver history to back it up well before that is really damn foolish. For all you know, next month and a system patch later the HD 3000 could be screaming 25% faster than the 320M ever was in every single game.

I talked about the hardware potential in the other thread. There are a lot of variables on the table for the HD 3000. What I said stands. We can't definitively call performance yet. We can make estimates and generalizations, but until the HD 3000's drivers have gained some maturity it is going to stay up in the air. No matter how convinced you are that this is the way things are, it doesn't make you right.


Nvidia has CUDA; Intel and ATI don't. So even if you don't play games, if you work with Adobe CS5, for example, an Nvidia GPU is much better.
I am still curious whether or not the HD 3000 supports OpenCL. Haven't been able to find any actual verified data yet.


Disclaimer: I'm typing this from an 8 core Nehalem Mac Pro with a Radeon 4870 in it. I have no interest in buying a 13" MBP.
 

Yes, you are partially right. This is a reply to TMRaven.

The notebookcheck benchmark isn't comparable, because in those tests they are using a high-end quad core, as found in the 15"/17" MBP.

I would not call a high-end quad core comparable to the i5 found in the 13". That's why the Intel IGP is winning in the notebookcheck test: it gets so much help from the quad core.

That's why I posted. Sorry, I'm just trying to post some proper benchmarks.
 

Evil Spoonman said:
henrikrox said:
That topic talks about synthetic results. Who cares about a score in a program? What matters is how your GPU performs in games.

And I'm just posting benchmarks. The numbers don't lie.

The numbers don't lie, but you are interpreting them totally incorrectly. Comparing a GPU/OS combination that has been out for ONE DAY to one that has been out for years with a model range and driver history to back it up well before that is really damn foolish. For all you know, next month and a system patch later the HD 3000 could be screaming 25% faster than the 320M ever was in every single game.

I talked about the hardware potential in the other thread. There are a lot of variables on the table for the HD 3000. What I said stands. We can't definitively call performance yet. We can make estimates and generalizations, but until the HD 3000's drivers have gained some maturity it is going to stay up in the air. No matter how convinced you are that this is the way things are, it doesn't make you right.


mark28 said:
Nvidia has CUDA; Intel and ATI don't. So even if you don't play games, if you work with Adobe CS5, for example, an Nvidia GPU is much better.
I am still curious whether or not the HD 3000 supports OpenCL. Haven't been able to find any actual verified data yet.


Disclaimer: I'm typing this from an 8 core Nehalem Mac Pro with a Radeon 4870 in it. I have no interest in buying a 13" MBP.

Well, dude, I'm just trying to show how the games are performing right now. I don't know what happens in 1, 2, or 9 months, or what's happening with drivers.

What matters is performance right now. Intel has always been bad at drivers.

And right now the Intel HD 3000 gets 12 fps less than a 320M running Left 4 Dead.

Also, you are misunderstanding the HD 3000 benchmark, because it's running with a quad core, and 3DMark also factors in the CPU score. That's why the HD 3000 is "winning."

If you actually check the links, you'll see the 3DMark run with the HD 3000 paired with a dual core instead of a quad core.
 
Well, dude, I'm just trying to show how the games are performing right now. I don't know what happens in 1, 2, or 9 months, or what's happening with drivers.

What matters is performance right now. Intel has always been bad at drivers.
I don't think that's true at all. People tend to own computers a fair bit longer than 1-2-9 months. I think they would like to know the full story. Right now things are showing a slight lead in some games for the 320M, and a slight lead in other games for the HD 3000. This with immature drivers and some bugs showing up. Also Intel isn't doing the drivers for the HD 3000 under OS X, Apple is doing them. You can buy a 13" MBP today with confidence that your GPU is going to be relatively similar in performance to the 320M.

Also, you are misunderstanding the HD 3000 benchmark, because it's running with a quad core, and 3DMark also factors in the CPU score. That's why the HD 3000 is "winning."

If you actually check the links, you'll see the 3DMark run with the HD 3000 paired with a dual core instead of a quad core.
So? Aren't we measuring machine performance here, not just raw GPU performance? It's not like we can take the 320M out of the last gen and transplant it into this one anyway. We are comparing the 2010 13" MBP to the 2011 13" MBP: CPU, IGP, memory bus bump, the whole package. If the 2010 MBP is performing 10 fps better across the board, it doesn't matter all that much where it comes from, does it?
 
I notice that the guy in your link said he didn't qualitatively sense any frame rate difference in actual play, described the performance as 'not far off' and ultimately said he's keeping the 2011.

Also, what you're getting is 2x performance on installation, 2x performance on encoding, etc (this impressed me and will be the core of my testing). I guess it's fair to say that the 2011 MBP is not a good choice as a primary gaming machine, but superb for working on. What a surprise!

I'd be interested to see what the scores are for gaming under Mac Steam btw... anyone able to link?
 
To add some curiosity to this question, I figured something out in another thread: https://forums.macrumors.com/threads/1103007/

Presuming I am correct about the chips (not verified).

Common
- 4 Threads
- 12 EU Intel HD 3000 Graphics
- 35W
- Dual Channel 1333MHz memory
- DMI 2.0
- 384MB Shared Memory for IGP

i5-2410M (base 13")
- CPU Base Clock: 2.3GHz
- CPU Turbo Clock (1C/2C): 2.9GHz/2.6GHz
- Cache: 3MB
- Graphics Standard Clock: 650MHz
- Graphics Max Turbo Clock: 1,100MHz

i7-2620M (best 13")
- CPU Base Clock: 2.7GHz
- CPU Turbo Clock (1C/2C): 3.4GHz/3.2GHz
- Cache: 4MB
- Graphics Standard Clock: 650MHz
- Graphics Max Turbo Clock: 1,300MHz



Based on these numbers: due to the way Sandy Bridge works, the IGP has access to the same shared L3 cache as the CPU. It can also clock itself up to consume available TDP headroom when needed for graphics workloads. This means the graphics potential of the i7 13" is greater, especially in applications that do not consume very much CPU. Games like StarCraft II, which tend to use reasonable amounts of CPU, will benefit less from this.

Best graphics performance for the 13" MBP can be had by matching SODIMMs in both slots for proper dual channel access, providing more cache for the whole chip, and purchasing the top-shelf CPU for maximum IGP turbo potential.


Edit: This appears to also be the case for the 15" models.

Common:
- 8 Threads
- 12 EU Intel HD 3000 Graphics
- 45W
- Dual Channel 1333MHz memory
- DMI 2.0
- 384MB Shared Memory for IGP

i7-2630QM (base 15")
- CPU Base Clock: 2.0GHz
- CPU Turbo Clock (1C/2C/4C): 2.9GHz/2.8GHz/2.6GHz
- Cache: 6MB
- Graphics Standard Clock: 650MHz
- Graphics Max Turbo Clock: 1,100MHz

i7-2720QM (best 15")
- CPU Base Clock: 2.2GHz
- CPU Turbo Clock (1C/2C/4C): 3.3GHz/3.2GHz/3.0GHz
- Cache: 6MB
- Graphics Standard Clock: 650MHz
- Graphics Max Turbo Clock: 1,300MHz

i7-2820QM (BTO 15")
- CPU Base Clock: 2.3GHz
- CPU Turbo Clock (1C/2C/4C): 3.4GHz/3.3GHz/3.1GHz
- Cache: 8MB
- Graphics Standard Clock: 650MHz
- Graphics Max Turbo Clock: 1,300MHz


In this light, the upgrade to the best model 15" makes a lot more sense. You end up getting a lot more clock when you take into account Turbo. It isn't just an extra 200MHz, it's an extra 400MHz even with all cores active. You get a decent bump in the IGP maximum speed as well. Not to mention the vastly better dGPU and other goodies.

The i7-2820QM BTO option looks less attractive. It is truly only 100MHz across the board, and a 2MB L3 cache increase.
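To make the turbo comparison concrete, here is a small Python sketch computing the all-cores-active clock deltas from the spec lists above:

```python
# All-cores-active (4C) turbo clocks in GHz, from the spec lists above
turbo_4c = {
    "i7-2630QM": 2.6,   # base 15"
    "i7-2720QM": 3.0,   # best 15"
    "i7-2820QM": 3.1,   # BTO 15"
}

base = turbo_4c["i7-2630QM"]
for chip, clk in turbo_4c.items():
    delta_mhz = round((clk - base) * 1000)   # round to absorb float error
    print(f'{chip}: +{delta_mhz}MHz over the base 15" with all cores active')
```

That gives +400MHz for the 2720QM over the base model with all four cores loaded, and only another +100MHz for the 2820QM, matching the point above.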
 
Apple does not use CUDA but OpenCL, which is vendor-independent and able to support Intel and AMD/ATI equivalent technologies.

Nvidia has CUDA; Intel and ATI don't. So even if you don't play games, if you work with Adobe CS5, for example, an Nvidia GPU is much better.

But Apple had no choice, so be happy you got the i5/i7 in the 13" MBP.

If you think the GPU is too weak, there are always the 15" models. They have a higher resolution too.
 