Search for cTDP up or cTDP down. Manufacturers could set the TDP of the 17W U-series CPUs down to 13W, or up to 20-something watts. That is exactly what they now call SDP. The only difference is that they gave it a new name instead of configurable TDP (up/down).
It does the same thing: lowering the TDP only works by lowering clocks. It is still the same chip with the same name for the same money.
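To illustrate why lowering clocks is all it takes: dynamic power scales roughly as C·V²·f, and since voltage can usually drop along with frequency, power falls faster than linearly. A minimal sketch of that model, with made-up numbers (not real Ivy Bridge figures):

```python
# Rough sketch of why cTDP-down works through clocks alone.
# Simplified CMOS dynamic power model: P = C * V^2 * f.
# All constants are illustrative, not real chip figures.

def dynamic_power(capacitance, voltage, freq_ghz):
    """P = C * V^2 * f (simplified dynamic power model)."""
    return capacitance * voltage ** 2 * freq_ghz

base = dynamic_power(capacitance=10.0, voltage=0.95, freq_ghz=1.7)  # nominal
down = dynamic_power(capacitance=10.0, voltage=0.85, freq_ghz=1.3)  # cTDP-down

print(f"nominal:   {base:.1f} (arbitrary units)")
print(f"cTDP-down: {down:.1f} ({down / base:.0%} of nominal)")
```

A ~24% clock drop paired with a small voltage drop cuts the modeled power by nearly 40%, which is roughly the 17W-to-13W step the cTDP-down spec describes.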
cTDP was a media spin on what should just be "TDP".
Essentially, starting with Sandy Bridge and its Turbo Boost 2.0, TDP is no longer the right metric for power consumption or thermals, because the processor may kick Turbo in and raise its frequency until the internal thermal sensors report a high value. So your 17W TDP only applies at the max non-Turbo frequency. On the MacBook Air, for instance, that's 1.7GHz for the CPU and 1.1GHz for the GPU.
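The behaviour described above can be sketched as a toy thermostat loop: clock up while there is thermal headroom, back off when the sensor nears its limit. All constants here are invented for illustration; real Turbo is far more complex (power limits, time windows, per-core bins, etc.):

```python
# Toy model of Turbo: run above base frequency until the thermal
# sensor nears its limit, then throttle back. Numbers are invented.

BASE_MHZ = 1700    # max non-Turbo frequency (e.g. the MBA's 1.7GHz CPU)
TURBO_MHZ = 2600   # hypothetical Turbo ceiling
T_LIMIT = 100.0    # junction temperature limit in deg C

def next_frequency(freq_mhz, temp_c):
    """Step frequency up while cool, step down when at the limit."""
    if temp_c < T_LIMIT - 5.0 and freq_mhz < TURBO_MHZ:
        return min(freq_mhz + 100, TURBO_MHZ)
    if temp_c >= T_LIMIT:
        return max(freq_mhz - 200, BASE_MHZ)
    return freq_mhz

# Simulate: heat input grows with frequency, cooling removes a fixed amount.
freq, temp = BASE_MHZ, 60.0
for _ in range(50):
    freq = next_frequency(freq, temp)
    temp += 0.004 * freq - 8.0   # crude heat-in minus heat-out balance

print(f"settled around {freq} MHz at {temp:.0f} C")
```

The loop ramps to the Turbo ceiling while cool, then oscillates down toward whatever frequency the cooling can actually sustain, which is exactly why the same chip ends up slower in a chassis with less cooling headroom.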
And yeah, so the Intel HD 4000 can perform better or worse depending on which CPU it's coupled with. If it's with a CPU that has more cooling headroom, it'll run faster, and vice versa.
In most MacBooks, it doesn't get that much headroom compared to some Windows laptops, and this is especially the case with the MacBook Air. So the HD 4000 may actually run much faster on a Windows laptop than on a MacBook. Here's a chart to demonstrate that:
BTW, all chips are binned; only four different actual dies leave the fabs. Everything else is binning and locked multipliers.
I meant "binning" as in these chips are "specifically cherry-picked" to reduce frequencies and run at slower clocks.
Sure, GFLOPS doesn't say much about actual performance, but it does show how big the gap to older chips is. These aren't little afterthought graphics anymore, vastly subpar as they used to be. It's not like comparing a processing monster against a half-hearted attempt at GPU performance; these IGPs push quite a bit of performance.
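For reference, the theoretical numbers behind that claim are simple spec-sheet arithmetic: execution units × FLOPs per unit per clock × clock. The unit counts and clocks below are the commonly quoted figures for these parts; treat them as approximate, and remember theoretical GFLOPS ≠ game performance:

```python
# Back-of-envelope theoretical GFLOPS, to show the generational gap
# rather than real-world performance. Figures are the commonly quoted
# spec-sheet values; treat them as approximate.

def gflops(units, flops_per_unit_per_clock, clock_ghz):
    return units * flops_per_unit_per_clock * clock_ghz

# HD 4000: 16 EUs, commonly credited with 16 FLOPs/EU/clock, ~1.15 GHz turbo
hd4000 = gflops(16, 16, 1.15)
# GeForce GT 330M: 48 CUDA cores x 2 FLOPs (FMA), ~1.265 GHz shader clock
gt330m = gflops(48, 2, 1.265)

print(f"HD 4000: ~{hd4000:.0f} GFLOPS (theoretical)")
print(f"GT 330M: ~{gt330m:.0f} GFLOPS (theoretical)")
```

On paper the IGP comes out well ahead, which is the point being made; actual frame rates depend on drivers, memory bandwidth, and the thermal constraints discussed above.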
Like I stated above, it really depends on many factors. The problem is this: not all HD 4000 are made equal.
And when it comes down to it, aside from the chips you get in quad-core MacBooks (15" and above; only the 15" for 2012), most of the others run at slower clock speeds and under more constraints (thermal, Turbo clocks, etc.), so it's absurd to compare them to the 330M GT. Hell, even the ones inside quad-core MacBooks aren't guaranteed to beat the 330M GT.
As for my current games: I tried Civ V in OSX, and it doesn't work at any setting. I don't game much anymore because I gave up on anything newer, like Bad Company, which isn't really playable even at low settings. MW3 works without AA and smoke edges, with high textures. I tried one of the newer Total War games and gave up on it.
Otherwise my gaming is Re-Volt (which is 16-bit), CSS, and C&C Generals. Newer games just drive me crazy. I would like to play ARMA 2 or Assassin's Creed, and neither would yield even the bare minimum to be playable. The 330M is effectively an IGP by modern standards. All the games I can play run well on the HD 4000 too, as far as I know, and on the rest you can forget either one.
I don't have a single game in OSX now and aside from my MW3 settings in Windows I cannot remember anything.
That is: no soft edges, no AA, high textures, native 1680x1050, no spec maps - basically everything turned off so that I get native resolution, which I prefer over higher details and AA. That yields me 30-40 fps, indoors sometimes above 40, some scenes below 20.
C&C Generals runs at all-high settings, CSS too. The imperial Total War (the pre-Napoleon one) had such poor texture performance it was full of artifacts; even at low it was annoying. Naval battles didn't really work. Eventually I just stopped playing it.
IMO a GPU at this performance level is worth no more than the HD 4000.
A 620M or 710M is where it starts to make a difference IMO.
I forgot to mention that I usually run my 330M overclocked to 600MHz rather than the usual 500MHz. That makes the difference in multiplayer, where frames usually drop.
Effectively I run all games at low/med settings and forget new demanding games altogether. Starcraft 2 might be the exception, as 25fps should be good enough there and the 330M does a lot better at medium/high.
I don't have Civ V, but I have Modern Warfare 3. With the HD 4000 inside my MacBook Air 13", I'm limited to 1280 x 800 with all settings at low except textures. Framerate was between 30 and 40fps.
With the Radeon HD 6490M, I could go to 1440 x 900 with all settings maxed and no AA. Framerate was between 40 and 50fps.
With Radeon HD 6630M (Mac Mini), I could go to 1920 x 1080 without AA and soft edge smoke.
Also don't have C&C but I have StarCraft 2.
Medium settings have the HD 4000 chugging along at 20-30fps.
Radeon HD 6490M gets 40-60fps at 1440 x 900.
Radeon HD 6630M gets 50-60fps at 1600 x 900.
And just for kicks, I compared those with Crysis 2 as well:
HD 4000 1024 x 600 at lowest settings - 20-25fps
Radeon HD 6490M at lowest settings 1440 x 900 - 25-40fps
Radeon HD 6630M at lowest settings 1600 x 900 - 25-40fps
Colleague's 330M GT (Alienware M11x) at lowest settings 1366 x 768 - 32-50fps
And needless to say, my rMBP eats all of those for breakfast. Crysis 2 alone gets 35-60fps at 1920 x 1200.
If Apple exposed HD 4000 on the rMBP to Windows, I'd have gotten some results for you. But sadly, it's dGPU only for Windows.
My initial post was uneducated, I grant you that. I extrapolated performance from some older memories and performance factors: I compared the HD 3000 to the HD 4000 and guessed it would come out about where the 330M is, given how the HD 3000 fared against its older and slower competition. It is still not a GPU of any use for actual gaming.
I think your MacBook, or whatever system you are running, may be limited by some other factor. The 330M GT, from what I can remember, isn't that bad. I mean, look squarely at the Crysis 2 results and tell me whether the 330M GT is "equal" to the HD 4000 or not. And I don't think Crysis 2 loses to any modern game in terms of requirements.
Yes, you are right about OpenGL. OSX is terrible.
http://www.phoronix.com/scan.php?page=article&item=intel_sandy_threesome&num=1
I remember some Steam games that Anand tested in OpenGL on both platforms, where especially the HD 3000 drivers on Windows seemed to be poor. Maybe they improved a lot, or it was bugs. That Phoronix article is quite new.
OpenGL has been worse on OSX for a long while. Even now, not all OpenGL 3.0 features are available, and we already have OpenGL 4.0.
3D performance suffers quite noticeably under OSX as a result. I always have to resort to Windows for gaming or for AutoCAD/Maya even though I much prefer to work under OSX.
I don't know how the HD 3000 does, since I don't have any computer that I can test that with (the 2011 MBP 15" above doesn't expose HD 3000 under Windows either), but HD 4000 does get better OpenGL performance under Windows compared to OSX.
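As an aside, the practical consequence of OSX lagging behind is that applications have to gate features on whatever version the driver reports. A minimal sketch of that idea, using hypothetical version strings (on a real system you'd read `glGetString(GL_VERSION)` from a live GL context):

```python
# Minimal sketch of gating features on a reported GL_VERSION string.
# The example strings below are hypothetical driver reports.

def gl_version(version_string):
    """Parse the leading 'major.minor' out of a GL_VERSION string."""
    major, minor = version_string.split()[0].split(".")[:2]
    return int(major), int(minor)

def supports(version_string, required):
    """True if the reported version meets the required (major, minor)."""
    return gl_version(version_string) >= required

# Hypothetical reports: an OSX driver stuck at 3.2, a Windows one at 4.0
osx_report = "3.2 INTEL-8.0.61"
win_report = "4.0.0 - Build 9.17.10.2867"

print("GL 3.3 on OSX-like driver:    ", supports(osx_report, (3, 3)))
print("GL 3.3 on Windows-like driver:", supports(win_report, (3, 3)))
```

Tuple comparison handles the major/minor ordering for free, which is why the required version is passed as `(3, 3)` rather than a float.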
And why are they only rebrands? Because Nvidia and AMD both know these don't sell anymore. People either want more GPU capability or a thin notebook, and the market in between isn't really served. These entry-level cards simply don't matter anymore; they aren't worth the trouble for most manufacturers. There is a handful of notebooks with a 620M out there, and that one is faster than my 330M. Intel and AMD killed this low-performance market, and Nvidia and AMD simply gave up on it. Even if they finally sell up-to-date entry hardware 18 months down the road, it won't change the picture.
Today they need a certain performance level to make it worth it and that usually requires some 60% of the CPU it is paired with at least.
520M isn't a rebrand. It's just a very old GPU.
And now the 620M is old as well. nVidia is moving on to Kepler, so all Fermi chips are old news at this point.
If you're looking for a low-power GPU, try GeForce GT 640M LE.
http://www.notebookcheck.net/NVIDIA-GeForce-GT-640M-LE.72199.0.html
It has the same TDP as the 520M, but it has much better performance.
As for ATI's rebranding strategy, I have no idea. It may just be that they don't have a new GPU tech to bank on. That would also explain why Apple went nVidia this generation.