Edit:
Just to be clear, I believe Apple will shoot themselves in the foot if they drop the dGPU, for the simple reason of software: the code isn't there yet to handle different vendors.
Unless it's a strategic move to make everyone with rendering/grid computing needs buy a new Mac Pro...

Everyone with serious rendering/grid computing needs should buy a Mac Pro or a dedicated render farm.

They have to check things like Photoshop performance, stuff that is used by freelancers etc. who don't have access to huge resources. I don't have much experience there, except that Keynote sometimes complains about a lack of VRAM when I'm on the iGPU.

One would hope that the consumer would be given a choice.

Yeah, choice would be nice, but this is Apple. There have been a few cases in the past where the cheaper model would only come with integrated graphics, while the more expensive one offered a dGPU. But I wouldn't bet on it.

Don't forget that apparently Apple is getting an exclusive custom Iris Pro with more power. Ask yourselves: how do you get such a deal? By agreeing to order large quantities. How can you achieve that? By only using one type of CPU in those machines.
 
Don't forget that apparently Apple is getting an exclusive custom Iris Pro with more power. Ask yourselves: how do you get such a deal? By agreeing to order large quantities. How can you achieve that? By only using one type of CPU in those machines.
Who said anything about such a deal?
Even if there were such a deal, it would be neither exclusive nor custom. The only thing Intel would change is clock speed or TDP headroom, both of which Intel offers to anybody. Every one of these chips sold allows custom TDP-up settings; there is absolutely nothing exclusive about it, only how you'd like to use them.
Intel can only offer so much at a given TDP, and ever since Ivy Bridge they have shipped CPUs with three different TDP settings that only need to be activated.
You can buy just a single one of these chips and have fun with it.

grid computing needs
What the hell is that supposed to mean on a notebook? You are setting up a grid of $2000 notebooks?
The only time you'd do something related to grid computing on a notebook is when controlling some grid remotely, for which a smartphone CPU is fine, and you most likely won't need a GPU at all.
 
Who said anything about such a deal?
Even if there were such a deal, it would be neither exclusive nor custom. The only thing Intel would change is clock speed or TDP headroom, both of which Intel offers to anybody. Every one of these chips sold allows custom TDP-up settings; there is absolutely nothing exclusive about it, only how you'd like to use them.
Intel can only offer so much at a given TDP, and ever since Ivy Bridge they have shipped CPUs with three different TDP settings that only need to be activated.
You can buy just a single one of these chips and have fun with it.

http://semiaccurate.com/2013/07/25/apple-gets-a-special-haswell-for-the-macbook-pro/
https://forums.macrumors.com/threads/1614807/

"unique", "something that no one else will get", "ultra-high performance part"...

Yes it's just a rumor, and a vague one. Maybe those are just a special selection that is more stable under overclocking. But the "Apple only" part seems pretty clear.
 
Everyone with serious rendering/grid computing needs should buy a Mac Pro or a dedicated render farm.

Except you can't lug any of those along with you when you need to go on a business trip to another facility.

They have to check things like Photoshop performance, stuff that is used by freelancers etc that don't have access to huge resources. I don't have much experience there, except that Keynote sometimes complains about a lack of vRAM when I'm on the iGPU.

Photoshop will see a significant performance drop because it relies more on OpenGL than it does on OpenCL.

OpenCL is used for very specific features, but it's not a necessity for a large number of tools while OpenGL actually is a necessity for some things to work.

Also, without OpenGL, manipulating the canvas on a high-resolution project is slow as molasses...

It's easy to check. Just disable GPU acceleration in Photoshop and then try to do things.
 
Iris isn't going to save anything on battery life. You are already using the iGPU with the current rMBP for basically all tasks, gaming aside. What Iris may reduce is power use when gaming, and thus heat.

Thus increasing battery life.

My dGPU fires up for Flash, VLC and various other applications which really shouldn't warrant it, too.
 
Except you can't lug any of those along with you when you need to go on a business trip to another facility.


Photoshop will see a significant performance drop because it relies more on OpenGL than it does on OpenCL.

OpenCL is used for very specific features, but it's not a necessity for a large number of tools while OpenGL actually is a necessity for some things to work.

Also, without OpenGL, manipulating the canvas on a high-resolution project is slow as molasses...

It's easy to check. Just disable GPU acceleration in Photoshop and then try to do things.

Finally someone that knows what they're talking about.




Everyone with serious rendering/grid computing needs should buy a Mac Pro or a dedicated render farm.


Where did I state "serious rendering"? See my comment below the next quoted text.




What the hell is that supposed to mean on a notebook? You are setting up a grid of $2000 notebooks?
The only time you'd do something related to grid computing on a notebook is when controlling some grid remotely, for which a smartphone CPU is fine, and you most likely won't need a GPU at all.


I occasionally do some full-HD editing in FCPX or Adobe Premiere at home, and sometimes when I'm out in the field or on a trip, etc.
But now imagine being a small firm, say 5-10 employees, all with rMBPs so they can do some rendering on the go or at a client's. Nothing wrong with that; one still needs a dGPU in those applications, as stated before.

But during the holidays or vacations, the laptops/notebooks are left at the office docked, locked and unused.
I have a project that needs to be done; why not hook up all of the unused computers and use them?
Can't really afford any extra render farms because they wouldn't be used that often.
See where I'm going with this?
Likely or unlikely scenario, it is doable, and people are creative.
 
I occasionally do some full-HD editing in FCPX or Adobe Premiere at home, and sometimes when I'm out in the field or on a trip, etc.
But now imagine being a small firm, say 5-10 employees, all with rMBPs so they can do some rendering on the go or at a client's. Nothing wrong with that; one still needs a dGPU in those applications, as stated before.

But during the holidays or vacations, the laptops/notebooks are left at the office docked, locked and unused.
I have a project that needs to be done; why not hook up all of the unused computers and use them?
Can't really afford any extra render farms because they wouldn't be used that often.
See where I'm going with this?
Likely or unlikely scenario, it is doable, and people are creative.
Ah yes, I see. Still, if you just build a render grid out of scrap material that's currently lying around unused, does a 20% performance difference really matter? You are probably going to have enough of a problem with heterogeneous architectures all over the place. The Iris Pro, with the right code, is probably really good in theory. I think the whole scenario is so unlikely that it can be pretty much ignored, or at least the difference between a dGPU and the Iris Pro can.
The hardware is designed for what people need most of the time, just as smartphone CPUs are designed to be primarily good at the code they tend to run most often, like JavaScript.
Setting up such a grid requires so much know-how that this alone makes it a niche problem, and once you overcome that hurdle the difference in performance isn't worth talking about.

Seriously, Amazon EC2 instances (or some specialized cloud render service) aren't that expensive, and you can have a bunch of them whenever you need them, for only as long as you do. There you get far better tool support, and if you account for the time spent setting it up, I think an on-demand cloud render grid will probably win out against some notebooks lying around in your office.
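For anyone curious, here is a minimal sketch of that "render grid on demand" idea using boto3, the AWS Python SDK. The AMI ID, instance type and key name are placeholders made up for illustration, not recommendations.
Code:
# Rough sketch only: spin up a few EC2 instances as a temporary render farm,
# then terminate them when the project is done.
import boto3

ec2 = boto3.resource("ec2", region_name="us-east-1")

# Launch a handful of render nodes (hypothetical image with your render
# tools pre-installed).
instances = ec2.create_instances(
    ImageId="ami-xxxxxxxx",      # placeholder: your render-node image
    InstanceType="c4.4xlarge",   # placeholder: pick a CPU- or GPU-heavy type
    MinCount=1,
    MaxCount=5,                  # roughly one node per idle notebook you'd have used
    KeyName="render-key",        # placeholder SSH key pair
)

# ... push frames/jobs to the nodes with whatever queue or tooling you use ...

# Shut everything down once the project has rendered, so you only pay for
# the hours you actually needed.
for instance in instances:
    instance.terminate()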

thundersteele said:
http://semiaccurate.com/2013/07/25/a...e-macbook-pro/
https://forums.macrumors.com/threads/1614807/

"unique", "something that no one else will get", "ultra-high performance part"...

Yes it's just a rumor, and a vague one. Maybe those are just a special selection that is more stable under overclocking. But the "Apple only" part seems pretty clear.
Looks to me like the typical nonsense that always shows up eventually. It is probably just Apple using a 55W TDP setting or something, and the fanboys call it a special custom part.
I doubt Apple gets any specially binned parts. They are already the biggest customer for those high-end parts, and there are only so many good chips Intel has. The 4950HQ is the coolest-running and best at overclocking. Intel wouldn't even be able to supply anything better in any quantity sufficient to satisfy Apple.
 
Thus increasing battery life.

My dGPU fires up for Flash, VLC and various other applications which really shouldn't warrant it, too.

Seems more like a driver problem. None of that requires a dGPU.
 
Sigh...

Again, for the umpteenth time, Iris Pro inherently has hardware problems. You can't remedy that with a software "fix".

Its performance falls off a cliff on any benchmark when the resolution is increased past 1680 x 1050 because there is just not enough high-bandwidth memory for the iGPU to access.

I'm not sure why so much is expected of the chip when reality has always pointed in the opposite direction.

Of course it's a driver problem. Apple's GPU switching is utterly retarded.

However, a 35-45 watt CPU + a 45 watt GPU will always consume more power than a CPU+GPU that stays within 45 watts or less (in total).

Not when the 45W GPU is not running at all. That's where Mavericks comes in.

Under light load, it doesn't matter if you have a 35-45W CPU + 45W GPU vs 47W CPU+GPU. Both should consume about the same amount of power.

And you forgot the display. That thing still sucks up battery like crazy.
 
If the special Apple Iris Pro comes close to the performance of the 650M, then they'll need to use at least a 765M as the dGPU, or else it would just be a waste of space to include a 750M, which doesn't offer a great performance boost over a 650M. For some reason I can't see Apple using a 765M, so I feel as though they may ditch the dGPU entirely. As a result, everyone will freak out, and then the next refresh of the rMBP will feature a dGPU again, albeit a middling one, for $2000+.
 
Of course it's a driver problem. Apple's GPU switching is utterly retarded.

However, a 35-45 watt CPU + a 45 watt GPU will always consume more power than a CPU+GPU that stays within 45 watts or less (in total).

Again, Intel generally seems to overshoot the TDP mark in tests. If you load up a mobile quad-core i7 with FurMark + Prime95 and cool it adequately, it will use MUCH more power than 45 watts.

It's hard to find benchmarks for Iris, but here's what I did find.

http://www.computerbase.de/artikel/grafikkarten/2013/intel-iris-pro-5200-grafik-im-test/6/

Comparison between the i7-4750HQ and the i7-4700 + 750M (an Asus machine). The Asus was also tested with the iGP.

(Note these are different systems, so power consumption cannot be compared directly. For example, the Iris Pro system has a 14.1" 1080p screen while the Asus has a 15.6" 1080p screen.)

Idle

Iris: 18 watts
Asus (iGP): 20 watts
Asus (Optimus): 20 watts

(Not a significant difference given the different platforms; idle seems roughly equal between the two given the screen difference. These values are very high and seem to be with full brightness in maximum performance mode.)

Prime95

Iris: 81 watts
Asus (iGP): 79 watts
Asus (Optimus): 79 watts

Crysis 3:

Asus (iGP): 79 watts
Iris: 82 watts
Asus (Optimus): 89 watts

(The Crysis 3 test appears to be broken; the 750M should be way higher in terms of fps when you look at the review. In fact, all the 750M tests seem broken: a 750M at 967/1250 (1085/1250 boost) is about equal to a desktop 650 or a 7750, which should smack the crap out of the A10-6800K, and here it does not.) Take these tests with a grain of salt, but it does not appear that Iris is really that much more power-efficient.

(The game performance index shows the 750M as 24% faster than the A10-6800K and Iris at the same speed as the A10-6800K, which clearly isn't the case looking at TechReport and AT.)
 
Not when the 45W GPU is not running at all. That's where Mavericks comes in.

Under light load, it doesn't matter if you have a 35-45W CPU + 45W GPU vs 47W CPU+GPU. Both should consume about the same amount of power.

And you forgot the display. That thing still sucks up battery like crazy.

I'm running Mavericks here right now, kiddo. The GPU switching is STILL brain-damaged, though admittedly far less so than before.

However, it STILL fires the GPU up for stuff that quite happily runs with minimal CPU/GPU load on an MBA, which doesn't even HAVE a discrete GPU to fire up.

And as soon as the GPU becomes active, I lose at least 1.5-2 hours of battery life. Often for stuff the HD 3000 in my machine could handle without breaking a sweat, and HAS handled just fine when I've forced the GPU to stay off with gfxCardStatus.

However, I don't like 3rd-party drivers like that on my machine.
 
I'm running Mavericks here right now, kiddo. The GPU switching is STILL brain-damaged, though admittedly far less so than before.

However, it STILL fires the GPU up for stuff that quite happily runs with minimal CPU/GPU load on an MBA, which doesn't even HAVE a discrete GPU to fire up.

And as soon as the GPU becomes active, I lose at least 1.5-2 hours of battery life. Often for stuff the HD 3000 in my machine could handle without breaking a sweat, and HAS handled just fine when I've forced the GPU to stay off with gfxCardStatus.

However, I don't like 3rd-party drivers like that on my machine.

I'm not sure what you're running that fires up the dGPU, but I'm only seeing the dGPU firing off in Photoshop, VMware or AutoCAD. It's dead all the time elsewhere.
 
I'm not sure what you're running that fires up the dGPU, but I'm only seeing the dGPU firing off in Photoshop, VMware or AutoCAD. It's dead all the time elsewhere.

VLC does it, for one.
Neither VMware nor Photoshop should require the dGPU (when on battery, unless I ask for it); both run just fine on the HD 3000 or even the GMA 950 in my Mac mini.
 
VLC does it, for one.
Neither VMware nor Photoshop should require the dGPU (when on battery, unless I ask for it); both run just fine on the HD 3000 or even the GMA 950 in my Mac mini.

VLC doesn't do it for me. It may be because I have an HD 4000 instead of the HD 3000.

And the dGPU needs to be on when connected to an external display, so if you plug in an external display while running VMware, it may crash. I have had that happen to me before. That's why I'm not using gfxCardStatus.
 
VLC doesn't do it for me. It may be because I have an HD 4000 instead of the HD 3000.

And the dGPU needs to be on when connected to an external display, so if you plug in an external display while running VMware, it may crash. I have had that happen to me before. That's why I'm not using gfxCardStatus.

Yeah, same reason I got rid of it. That, and it occasionally seemed to get confused about what state it was in; I can't remember the exact circumstances.

I think it's the GPU switch that kills VMware. So long as you pick one state or the other, VMware is fine, but obviously an external display "needs" the dGPU, so...

But again, HD 3000-only machines can run an external display just fine.


What Apple needs is a big tick box in System Preferences, under Energy Saver: "don't run the dGPU on battery". Is that too much to ask? I don't want the "Enable GPU switching" option, which basically just runs the dGPU all the time if I turn it off...
 
Yeah, same reason I got rid of it. That, and it occasionally seemed to get confused about what state it was in; I can't remember the exact circumstances.

I think it's the GPU switch that kills VMware. So long as you pick one state or the other, VMware is fine, but obviously an external display "needs" the dGPU, so...

But again, HD 3000-only machines can run an external display just fine.


What Apple needs is a big tick box in System Preferences, under Energy Saver: "don't run the dGPU on battery". Is that too much to ask? I don't want the "Enable GPU switching" option, which basically just runs the dGPU all the time if I turn it off...

It's a hardware limitation from what I can see. All external display ports are routed through the dGPU. If you plug in any display, the dGPU is forced either way, so they can't use just the iGPU.
 
Sigh...

Again, for the umpteenth time, Iris Pro inherently has hardware problems. You can't remedy that with a software "fix".

Its performance falls off a cliff on any benchmark when the resolution is increased past 1680 x 1050 because there is just not enough high-bandwidth memory for the iGPU to access.

I'm not sure why so much is expected of the chip when reality has always pointed in the opposite direction.

I suppose Intel could add more memory to fix this problem, but as far as I understand it, the memory used in the Iris Pro is one of its most expensive components, and increasing it would likely lead to a large spike in unit costs.
 
I'd prefer the discrete graphics. If I wanted GREAT battery life, I'd go with the MBA. Seven hours for an rMBP is long enough battery life (compared to a few years ago). I would prefer the power over the battery life.
 
Sigh...

Again, for the umpteenth time, Iris Pro inherently has hardware problems. You can't remedy that with a software "fix".

Its performance falls off a cliff on any benchmark when the resolution is increased past 1680 x 1050 because there is just not enough high-bandwidth memory for the iGPU to access.
Notebookcheck got around to testing the 4750HQ. While that notebook test is useless for any info other than benchmarks, it does show something about the resolution problem you claim exists.
There are only two real cases, in BioShock and BF3, where it is pretty much true that resolution becomes a defining issue. In either case, with all the bandwidth in the world the Iris Pro wouldn't yield playable frame rates at those settings: 9.5 fps vs a potential 12.5 fps ;) or 10 fps vs a potential 12 in BF3 (assuming the same performance degradation the 650M shows at the same settings). It doesn't look like a problem with any real-world impact.
Most games show no problem with high res or even perform better.

low/med./high are all at 768p
ultra is 1080p
high uses 2xMSAA
ultra uses 4xMSAA
The rest are the usual low/med./high/ultra quick-setting presets in each game, which adjust most detail settings appropriately.
For the comparison the absolute 650M numbers aren't important; I only care about comparing the drop in performance it sees at the respective settings vs. the Iris Pro facing the same problem (the last column is the 650M's relative advantage; see the short sketch after the table discussion below).
Code:
                                  650M      HD 5200   650M/5200 - 1
Company of Heroes 2 (2013)
    low                           25.9       25.9       -0.00%
    med.                          20.7       21.5       -3.72%
    high                          13.3       12.3        8.13%
    ultra                          4.8        5.8      -17.24%
GRID 2 (2013)
    low                          113.8      116.2       -2.07%
    med.                          82.1       61.7       33.06%
    high                          59.7       42.5       40.47%
    ultra                         18.9       15.3       23.53%
Metro: Last Light (2013)
    low                           42.72      47.1       -9.30%
    med.                          37.22      34.9        6.65%
    high                          20.72      17.8       16.40%
    ultra                         12          9.6       25.00%
BioShock Infinite (2013)
    low                           98         78.5       24.37%
    med.                          56         40.1       39.48%
    high                          48         35.2       36.34%
    ultra                         17          9.8       78.08%
SimCity (2013)
    low                           40.9      219.9      -81.40%
    med.                          25         34.3      -27.11%
    high                          20.6       24.4      -15.57%
    ultra                         11.8       11          7.27%
Tomb Raider (2013)
    low                          113.93     123.1       -7.45%
    med.                          62.73      58.6        7.05%
    high                          37.83      35.3        7.17%
    ultra                         14.43      18        -19.83%
Crysis 3 (2013)
    low                           51.3       50.7        1.18%
    med.                          32.6       31.4        3.82%
    high                          21.9       20.9        4.78%
    ultra                          8.4        7.7        9.09%
Dead Space 3 (2013)
    low                          184.9      164.6       12.33%
    med.                          89.6       73.2       22.40%
    high                          69.3       55.3       25.32%
    ultra                         41.7       32.7       27.52%
Hitman: Absolution (2012)
    low                           45.8       44.6        2.69%
    med.                          40.4       34.3       17.78%
    high                          22.9       14.4       59.03%
    ultra                          9.9        6.9       43.48%
Call of Duty: Black Ops 2 (2012)
    low                           91         92         -0.70%
    med.                          71         67.7        4.51%
    high                          44         28.4       55.82%
    ultra                         27.82      18.3       52.02%
Dishonored (2012)
    low                           93.92      89.9        4.47%
    med.                          85.62      74.5       14.93%
    high                          82         67.1       21.99%
    ultra                         54         40.3       32.88%
Fifa 13 (2012)
    low                          261.7      317        -17.44%
    med.                         195.8      197.6       -0.91%
    high                         182.9      149.6       22.26%
    ultra                        132.8       80.3       65.38%
Counter-Strike: GO (2012)
    low                          161.3      240.4      -32.90%
    med.                         136.8      181.3      -24.54%
    high                         111.9       88.4       26.58%
    ultra                         72.3       51.9       39.31%
Diablo III (2012)
    low                          164.37     133.8       22.85%
    med.                         105.7       86.2       22.62%
    high                          86.9       75         15.87%
    ultra                         58.19      46         26.50%
Anno 2070 (2011)
    low                          141.491    134.7        5.04%
    med.                          61         64.4       -5.88%
    high                          39         39.2       -0.21%
    ultra                         18.9714    18.1        4.81%
The Elder Scrolls V: Skyrim (2011)
    low                           75.3       70          7.57%
    med.                          56.83      36.6       55.27%
    high                          41.96      21.4       96.07%
    ultra                         24.15      12.1       99.59%
Battlefield 3 (2011)
    low                           64.43      46.1       39.76%
    med.                          42.84      33.8       26.75%
    high                          33.65      26.5       26.98%
    ultra                         15.25      10         52.50%

Copied and pasted from [URL="http://www.notebookcheck.com/Welche-Spiele-laufen-auf-Notebook-Grafikkarten-fluessig.13827.0.html?sort=b_160_515&deskornote=0&or=0&search=&month=&benchmark_values=&gpubenchmarks=0&professional=0&archive=1&dx=0&multiplegpus=0&showClassDescription=0&itemselect_2986=2986&itemselect_4457=4457&condensed=0&showCount=0&showBars=0&gameselect%5B%5D=223&gameselect%5B%5D=220&gameselect%5B%5D=217&gameselect%5B%5D=214&gameselect%5B%5D=212&gameselect%5B%5D=210&gameselect%5B%5D=208&gameselect%5B%5D=204&gameselect%5B%5D=202&gameselect%5B%5D=193&gameselect%5B%5D=191&gameselect%5B%5D=188&gameselect%5B%5D=186&gameselect%5B%5D=176&gameselect%5B%5D=170&gameselect%5B%5D=168&gameselect%5B%5D=166&gameselect%5B%5D=162&gameselect%5B%5D=160&gameselect%5B%5D=142&gameselect%5B%5D=110&gameselect%5B%5D=112&gameselect%5B%5D=105&gameselect%5B%5D=90&gameselect%5B%5D=52&gameselect%5B%5D=49&gpu_fullname=1&codename=0&architecture=0&pixelshaders=0&vertexshaders=0&corespeed=0&shaderspeed=0&memoryspeed=0&memorybus=0&directx=0&technology=0&daysold="]notebookcheck.com[/URL]
Skyrim obviously has a problem, but it is not really a resolution problem, and compared to the HD 4600 the Iris Pro improves as much as it should, so it does get fed with data. It is just not terribly good with that engine.
FIFA 13 is still at 80 fps, and the picture looks the same across the lower-resolution setting changes. It is probably just GPU/CPU turbo juggling.
In most situations resolution won't be an issue, nor will bandwidth in general. Intel seems to see it that way, as even mobile Broadwell won't come with DDR4, which could be ready if they wanted it to be.
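For what it's worth, the last column of the table is just the 650M's frame rate divided by the Iris Pro's, minus one. A tiny sketch, using two Crysis 3 rows copied from the table above, in case anyone wants to redo the comparison with other numbers:
Code:
# Relative advantage of the 650M over the HD 5200 at the same settings,
# i.e. the "650M/5200 - 1" column of the table above.
def relative_advantage(fps_650m, fps_5200):
    """How much faster the 650M is, as a fraction (0.25 == 25% faster)."""
    return fps_650m / fps_5200 - 1

print(f"Crysis 3 low:   {relative_advantage(51.3, 50.7):+.2%}")  # +1.18%
print(f"Crysis 3 ultra: {relative_advantage(8.4, 7.7):+.2%}")    # +9.09%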


@Cirus I wouldn't take any power measurements seriously until you see some quality products built with the stuff.
Notebookcheck tested that 4750HQ, and the package itself does stay within the 47-55W range. The rest of the notebook isn't exactly top-notch, and probably the power supply isn't either. Razer, I think, showed what Haswell can do for battery life if done right; many others showed how easily one can do it wrong.
I know it is German, but look under "Energieverwaltung" (power management) in the screenshots posted. http://www.notebookcheck.com/Test-Schenker-S413-Notebook-Clevo-W740SU.97789.0.html
The notebook itself adds something like 20W, along with the power supply.

The big benefit won't be load power anyway, because 70W is still enough to drain the battery in an hour or close to it. It is in being able to always run at the right power-saving level. No Photoshop in the background draining battery life. In split seconds it can ramp up for 3D rendering and then drop back down to iGPU levels immediately. With Apple's graphics switching, anything in the background can keep the dGPU active and sucking power for no reason. Under low load, dGPUs are just wasteful. Nobody is gaming on battery power anyway, but many people might launch a few apps here or there that need a fast GPU for a burst, but not all the time.

@all: VLC activates my dGPU too on my 2010 model, but it works fine and saves about 5W in playback (720p material) when forced at launch onto the Intel GPU. Hardware acceleration is obviously working, and it can handle the frames and load with ease. It is just Apple being retarded with their implementation.
Even if they fixed that in Mavericks and it doesn't turn to the dedicated GPU quite as often, the problem remains that any app in the background that does need the dGPU keeps that GPU active, even if it isn't in focus or is currently doing a task that doesn't need any GPU muscle.
 
Notebookcheck got around to testing the 4750HQ.

Mind posting the link to said Notebookcheck test? I couldn't find this no matter how hard I tried. The only reference was to an external review done by a German website.

And I know there are 2 German reviews of Iris Pro. The one done with 4750HQ had comparisons to GeForce GT 750M, but it's super dubious at best.

If they took those "results" from this chart:

http://www.hardwareluxx.de/images/stories/galleries/reviews/schenker_s413/spieleleistung.png

On this website:
http://www.hardwareluxx.de/index.php/artikel/hardware/notebooks/27246-schenker-s413-im-test.html

Then some of them just don't look "right". And said website has demonstrated a favorable attitude toward Intel and less so toward the laptop that runs the 750M.

Even if those look "right", the comparison chart on Notebookcheck actually takes the GT 650M w/ DDR3 results into account as well. The 650M w/ DDR3 is very definitely bandwidth-starved.

I know because my rMBP with 650M w/ GDDR5 gets much better results for some of those.

For instance, in BioShock Infinite at all-High settings and 1366 x 768 under Boot Camp, I get 60-75 fps.

In Battlefield 3, at all-High settings and 1366 x 768, I get 55-62 fps.

In Diablo 3 at Ultra (as in everything maxed out) and 1920 x 1080, I get 70 fps.

In Call of Duty: Black Ops with everything maxed out and at 1920 x 1080, I get 38-45 fps.

Those numbers look way too low.

In fact, in Skyrim at "ultra" and 1920 x 1080, I get 45 fps. Almost double what they're showing.

In most situations resolution won't be an issue, nor will bandwidth in general. Intel seems to see it that way, as even mobile Broadwell won't come with DDR4, which could be ready if they wanted it to be.

Resolution is an issue... when you consider bandwidth and memory capacity.

If not, why would high-end graphics cards like the GTX 680 come with 1 GB of VRAM or more instead of 512 MB or 256 MB, seeing as the 128 MB in Iris Pro is supposedly enough?

Honestly, I think you're just wishing at this point that Iris Pro won't be too bad, but... I'm sorry to burst your bubble: it is.
 
<snip>

@Cirus I wouldn't take any power measurements seriously until you see some quality products built with the stuff.
Notebookcheck tested that 4750HQ, and the package itself does stay within the 47-55W range. The rest of the notebook isn't exactly top-notch, and probably the power supply isn't either. Razer, I think, showed what Haswell can do for battery life if done right; many others showed how easily one can do it wrong.
I know it is German, but look under "Energieverwaltung" (power management) in the screenshots posted. http://www.notebookcheck.com/Test-Schenker-S413-Notebook-Clevo-W740SU.97789.0.html
The notebook itself adds something like 20W, along with the power supply.

The big benefit won't be load power anyway, because 70W is still enough to drain the battery in an hour or close to it. It is in being able to always run at the right power-saving level. No Photoshop in the background draining battery life. In split seconds it can ramp up for 3D rendering and then drop back down to iGPU levels immediately. With Apple's graphics switching, anything in the background can keep the dGPU active and sucking power for no reason. Under low load, dGPUs are just wasteful. Nobody is gaming on battery power anyway, but many people might launch a few apps here or there that need a fast GPU for a burst, but not all the time.

That laptop seems okay, but there are major problems in the implementation. (If Apple uses it, then I expect them to be largely fixed.)

Limited clocks on battery? (1.3 GHz CPU and 200 MHz GPU.)

We are comparing to the regular 650M. Apple's 900/1250 GPU is a little (~10-15%) stronger.

The Iris Pro graphics may be the fastest processor graphics ever, but it is also the most power-hungry: 73 watts in 3DMark 06 is, relative to the performance, only an average result. The stronger K56-3F with a Core i7-4700MQ and GeForce GT 750M requires only about 10 watts more. The maximum power consumption of 78.3 watts is in turn relatively restrained, but this is mainly due to the limited Turbo Boost (see stress test).
It's okay, but it's definitely not using 47 watts of power (73 - 47 = 26 watts for the rest of the system, which is rather high). For the i7 + 750M to only use 10 watts more, Iris really isn't good in terms of efficiency (considering that the non-Iris chip has higher clocks and the 750M is 40%+ better than Iris on average).

It's using a little less power, but not much less. The 750M is much stronger in gaming.
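Putting those figures into a quick back-of-the-envelope check (the 73 W and "about 10 watts more" numbers come from the quoted review; the 40% gaming advantage is the rough claim above, not something measured here):
Code:
# Back-of-the-envelope check of the efficiency argument, under the
# assumptions stated above.
iris_system_w = 73.0   # whole system under 3DMark 06 load (quoted review)
dgpu_system_w = 83.0   # assumption: "about 10 watts more" for the i7 + 750M system
cpu_tdp_w     = 47.0   # nominal package TDP of the Iris Pro chip

# Whatever isn't the CPU package: 73 - 47 = 26 W, which is rather high.
print(f"Implied non-package draw: {iris_system_w - cpu_tdp_w:.0f} W")

# If the 750M setup is ~40% faster for ~14% more system power, it comes out
# ahead on performance per watt under load.
relative_perf = 1.40
relative_power = dgpu_system_w / iris_system_w
print(f"750M perf/W vs Iris Pro: {relative_perf / relative_power:.2f}x")  # ~1.23x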
 
I posted the link, but here it is again.
http://www.notebookcheck.com/Test-Schenker-S413-Notebook-Clevo-W740SU.97789.0.html
notebookcheck is a German site, so the German page usually gets tests first. Later they are translated into the other languages. Select German in the top left to see the most recent tests. If you aren't interested in the in-between blabla, it shouldn't really matter that it is German.

The PCB of that notebook looks ridiculously huge.
Schenker's Clevo barebones are great for the budget gamers out there, but they aren't very refined.
 