Cost, reliability, and battery life are all good reasons to eliminate the discrete GPU. As the number of devices that fit on a single die increases, ultimate performance will become a fourth reason in two or three years. For a given power (heat dissipation) budget, integrated graphics already offers better performance than discrete graphics.

I think my argument is that the construction is there; the heat dissipation is already engineered to handle a discrete graphics card. It's not something that would require radical re-engineering, so why would they leave it out? I can only see them selling a non-dedicated 15" as an alternative, which they have done previously.
 
This is like saying a truck is better for moving things than an 18-wheeler because it gets better gas mileage. The fact that your new MacBook Pro with integrated graphics is running cooler and lasting longer on a charge won't matter much when it can't perform the same tasks nearly as well as your old MBP with a discrete graphics card.
That's a false analogy followed by a straw-man argument.

The MBP is supposed to be a portable workstation. Size, heat, and battery should be secondary concerns in relation to its main task of getting things done.
That's your supposition, which I do not share.

I think my argument is that the construction is there; the heat dissipation is already engineered to handle a discrete graphics card. It's not something that would require radical re-engineering, so why would they leave it out?
Argument based on a false implicit premise, specifically that the heat dissipation budget cannot be re-allocated from a dGPU to the CPU, which of course it can. I would much rather have a 60W CPU with iGPU (with all the processing power that comes with) than a 45W CPU + 45W dGPU.
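
To make that re-allocation concrete, here's a toy sketch (every wattage below is an assumption for illustration, not an actual Apple thermal spec):

```python
# Toy illustration of re-allocating a fixed thermal budget (all wattages are
# assumptions, not actual Apple specs). The chassis can only sustain so many
# watts of dissipation; the choice is just how to split them between chips.
CHASSIS_LIMIT_W = 90  # assumed sustained dissipation of a 15" chassis

configs = {
    "45 W CPU + 45 W dGPU": 45 + 45,
    "60 W CPU with iGPU":   60,
}

for name, watts in configs.items():
    headroom = CHASSIS_LIMIT_W - watts
    print(f"{name:22s} -> {watts:3d} W drawn, {headroom:3d} W left for turbo/fan headroom")
```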
 
That's a false analogy followed by a straw-man argument.


That's your supposition, which I do not share.


Argument based on a false implicit premise, specifically that the heat dissipation budget cannot be re-allocated from a dGPU to the CPU, which of course it can. I would much rather have a 60W CPU with iGPU (with all the processing power that comes with) than a 45W CPU + 45W dGPU.

... lol you're trying too hard bro. Do you even know how heat dissipation works? You don't just rebudget heat dissipation like that.
 
That's a false analogy followed by a straw-man argument.

No, it's pretty apt. You're sacrificing power and speed for better battery life and heat management. I could understand this being tempting for Apple, since it'll allow them to make an even thinner MBP, but it's not necessarily the best move for some of their end users.

And how is that a strawman argument? Anyone who relies heavily on GPGPU tasks on their MacBook Pro will find the Iris Pro in the newest rev a downgrade from the previous generation.

That's your supposition, which I do not share.

And what exactly do you think the MBP is used for?

edit: Okay, I take what I said above back. It looks like the Iris Pro 5200 is better at OpenCL tasks than I initially thought.
 
It's just a benchmark, but...

Notebookcheck- GT 650M (GT 750M is the same chip, better clocks, but nothing major)

[benchmark chart: GT 650M]


And Iris Pro 5200

[benchmark chart: Iris Pro 5200]

Well that "benchmark" has a lot of real world 3d applications which show that Iris pro will in some cases be twice as fast as the nvidia part.
Both Solidworks and Pro/E (which I use both especially SW) seem to be twice as fast and will prove to be significant enhancments...
I don't understand why many posters here confuse 3d pro apps (like spaceclaim, solidworks, Maya, LW) with 3d graphics for games.
Also the 3d rendering that many proclaim will be better with a dgpu, is in fact a funtion of the CPU and not the GPU. There's only a handful of gpu renderers out there and none of them are robust enough to use for all tasks! 95% of all 3d rendering is done by CPU rendering engines.

There are dedicated laptops for gaming, and the MBP is definitely the wrong choice for one even though it's fairly competent....
Bottom line: I would rather be able to fluidly view, create, and edit my 3D models in SpaceClaim and SW (that's how I make my money when I'm on the go) than play Crysis 3 at max res (that's how I waste my time when I'm on the go)..
 
..but what about shared memory?...
the discrete GPU has its own memory...maybe it's just not so important, but for me this
would be worse...not better.
 
You have earned my respect.

..but what about shared memory?...
the discrete GPU has its own memory...maybe it's just stupid, but for me this
would be worse...not better.
Shared memory eliminates the need for copying data back and forth between the main memory and video memory, but make sure you have enough to avoid swapping. I won't buy another Mac with only 4GB -- even though it's probably enough for many users.
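
For what it's worth, here's a minimal sketch of what that saving looks like from an OpenCL program's point of view (pyopencl used purely as an illustration; whether the driver truly avoids the copy depends on the implementation and on buffer alignment):

```python
import numpy as np
import pyopencl as cl

ctx = cl.create_some_context()
mf = cl.mem_flags

# A few hundred MB of host data, as in the texture example discussed above.
host_data = np.zeros(64 * 1024 * 1024, dtype=np.float32)  # 256 MB

# Discrete-GPU style: COPY_HOST_PTR stages the data into the card's own VRAM,
# i.e. an explicit copy across the PCIe bus before a kernel can touch it.
staged = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=host_data)

# Shared-memory style: USE_HOST_PTR asks the driver to work from the host
# allocation directly; on an integrated GPU this can be zero-copy, since the
# CPU and GPU sit behind the same memory controller.
shared = cl.Buffer(ctx, mf.READ_ONLY | mf.USE_HOST_PTR, hostbuf=host_data)
```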
 
No, it's pretty apt. You're sacrificing power and speed for better battery life and heat management. I could understand this being tempting for Apple, since it'll allow them to make an even thinner MBP, but it's not necessarily the best move for some of their end users.

And how is that a strawman argument? Anyone who relies heavily on GPGPU tasks on their MacBook Pro will find the Iris Pro in the newest rev a downgrade from the previous generation.



And what exactly do you think the MBP is used for?

edit: Okay, I take what I said above back. It looks like the Iris Pro 5200 is better at OpenCL tasks than I initially thought.

I see the Iris Pro as better for the new retina 13" (which IMO is a MacBook, not a Pro)
or for the future of the Air in 2014-2015, while
the MacBook Pro is a different beast and cannot become a QUAD AIR...
at a MacBook Pro price....
sorry.:mad::apple:
 
I see the Iris Pro as better for the new retina 13" (which IMO is a MacBook, not a Pro)
or for the future of the Air in 2014-2015, while
the MacBook Pro is a different beast and cannot become a QUAD AIR...
at a MacBook Pro price....
sorry.:mad::apple:


....so to summarize: the cMBP, or any model with the GT 650M, will have better performance at the expense of higher energy consumption and more heat.....?
 
....so to summarize: the cMBP, or any model with the GT 650M, will have better performance at the expense of higher energy consumption and more heat.....?

I don't see your point..you prefer to have less power and less heat..
take the Air then..why wait..

----------

Shared memory eliminates the need for copying data back and forth between the main memory and video memory, but make sure you have enough to avoid swapping. I won't buy another Mac with only 4GB -- even though it's probably enough for many users.

yes, but the all-aluminum body is screwed and glued together like hell to make the insides as thin as possible, which doesn't make that operation (upgrading later) very easy, does it...? So you have to order the max you can up front, at Apple prices..
I prefer to have a real GPU with its own memory...lots of it..not 128, 256, 512 MB, etc. shared with the system...
 
Is the shared memory really fast enough to put hundreds of MB of texture data through? This was always the strong side of a dGPU.
 
I understand that fear but if this Iris Pro ends up being faster than the dedicated card Apple would have used anyway then does it really matter?

Intel's Haswell chips are manufactured on a 22nm process. Broadwell is set to be produced on a 14nm process. NVIDIA and AMD are stuck right now at 28nm, and they may well still be at 28nm when Broadwell launches. Intel is ahead of the game in power consumption and gate size.

1. That would be interesting if that was the case. I somehow don't see that happening but who knows.

2. You don't think nVidia will at least be on a 20nm process? We'll see. It's a race to be the best.
 
If you care about GPU performance, wait for Broadwell.

There's some principle I read once but can't recall about a type of paralysis about pulling the trigger on buying a tech item based on, "no, the next one will BE THE one I've been waiting for," and then having the same feeling kick in as soon as THAT one is released ("No, now I'm gonna wait for _____" [insert tech in development here, e.g., TB 2, 802.11ac, "cold fusion" powered, etc.]) - and so repeatedly being unable to buy.

Anybody know the name and how it was stated?

Anyway, I've been in that loop for the last three revs at least. So it's only the sheer obsolescence of my gear that's going to force my hand after Apple's fall products are released this year.

And I'm thinking the next crop is sounding pretty sweet. So I'll still have those feelings, but for sure I'm going to have gear that's a buncha bumps up from what I'm limping along with.....
 
There's some principle I read once but can't recall about a type of paralysis about pulling the trigger on buying a tech item based on, "no, the next one will BE THE one I've been waiting for,"...

It's worse than it's ever been right now, because the changes being made aren't just a slightly faster version of X, or a slightly improved version of Y. They're huge upgrades that can make for an entirely different laptop experience. Haswell brought significant battery life upgrades to laptops, to the point that an MBA can last 13 hours on a charge. Broadwell is supposed to bring the same improvements to integrated GPUs. Intel isn't taking baby steps here. These are huge leaps in two very important areas.
 
It's worse than it's ever been right now, because the changes being made aren't just a slightly faster version of X, or a slightly improved version of Y. They're huge upgrades that can make for an entirely different laptop experience. Haswell brought significant battery life upgrades to laptops, to the point that an MBA can last 13 hours on a charge. Broadwell is supposed to bring the same improvements to integrated GPUs. Intel isn't taking baby steps here. These are huge leaps in two very important areas.

And, as you pointed out above, the benchmark results are showing that the latest (Iris Pro 5200) integrated GPUs now have decent performance. That is going to be good enough for most people, which may cause a death spiral for discrete graphics in notebooks. Sure, gamers and graphics professionals will always want more on the desktop, but, sadly to me, the real laptop (think late 2007 through unibody MBP) seems to be going bye-bye. We have already lost the 17" MBP. You have to love the new battery life trend, though.
 
Thunderbolt 1 is 10Gb/s
Thunderbolt 2 is 20Gb/s
PCIe x8 v2.0 like the Mac Pro you referenced is 32Gb/s
PCIe x16 v2.0 is 64Gb/s
PCIe x16 v3.0 is 128Gb/s

Keep in mind that when Intel and Apple discuss Thunderbolt they are talking in gigabits, not gigabytes like the PCIe spec pages. To make it simpler I have converted all gigabyte figures into gigabits (GB -> Gb).

Thunderbolt 2 doesn't even reach the same performance as PCIe 2.0 x8 - it is closer to x4 (16 Gb/s), and many sites have shown that modern graphics chips greatly diminish in performance when run at PCIe 2.0 x4 speeds. The problem is compounded for GPGPU workloads like those created by OpenCL and CUDA, as those technologies heavily exchange data with the CPU and system memory.
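
The conversion is easy to reproduce; here's a quick sketch using the usual per-lane rates (encoding overhead included, rounded the same way as the list above):

```python
# Reproduce the Gb/s figures above from the per-lane PCIe rates. PCIe 2.0 moves
# ~500 MB/s per lane after 8b/10b encoding; PCIe 3.0 roughly doubles that with
# 128b/130b encoding. Thunderbolt figures are the nominal per-direction rates.
GBIT_PER_GBYTE = 8

links_gbps = {
    "Thunderbolt 1":  10,
    "Thunderbolt 2":  20,
    "PCIe 2.0 x4":    4 * 0.5 * GBIT_PER_GBYTE,   # 16 Gb/s
    "PCIe 2.0 x8":    8 * 0.5 * GBIT_PER_GBYTE,   # 32 Gb/s
    "PCIe 2.0 x16":  16 * 0.5 * GBIT_PER_GBYTE,   # 64 Gb/s
    "PCIe 3.0 x16":  16 * 1.0 * GBIT_PER_GBYTE,   # ~128 Gb/s
}

for name, gbps in links_gbps.items():
    print(f"{name:14s} {gbps:6.0f} Gb/s")
```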

Here is one benchmark showing an AMD HD 5870. This card launched in September 2009. That makes it almost 4 years old. Now take a look at the performance benchmarks:

[benchmark chart: HD 5870 average FPS at PCIe x16, x8, and x4]

As you can see, by dropping down from PCIe x16 to x4 the average frames per second fell by 13%. That may not seem like a lot, but remember this testing was done with a 4-year-old graphics card that is much slower than the modern-day GPUs one might wish to connect over Thunderbolt in an external chassis.

In fact, my own testing with my GTX 780s has confirmed this hypothesis, and not even at x8 PCIe 2.0 but at x16 PCIe 2.0. I saw a 300-point increase in the Unigine benchmark just by changing from PCIe 2.0 x16 to 3.0 x16 - about a 7% increase in graphics performance.


Sure, it has an impact, though I'll refer you to the front page :p
 
If Apple drops dedicated graphics (with its own dedicated RAM) they should also drop the word 'Pro' from the name and call it the MacBook Casual Consumer

Right, because graphics performance is the sole indicator of whether or not a laptop can be used professionally. :rolleyes:

It has nothing to do with processing power, durability/reliability, efficiency, or any of that. Nothing.
 
Sure, it has an impact, though I'll refer you to the front page :p

GTX 570. A card from 2010. At 1366x768.

To put this in perspective, a modern card is 4-8x as bandwidth-hungry as the one used in the front-page article. Even the 650M in the Retina MacBook Pro is faster than a GTX 570.
 
mouse and touchscreen complement each other; it is a GOOD combination, allowing users to operate the tablet comfortably both while holding it and when it is placed on a flat surface.
No, they don't. It's like the Amphicar: not a good boat, not a good car, and not even better than owning a car and a boat separately.
the Surface does not restrict you to only a single input mode at a time.
Instead it provides two input methods, which interfere with each other and create a huge mental overhead. There are better tablets and there are better PCs. The Surface is merely the best Tablet PC so far - a category of products people have not been buying since 2002.


It's (still) not! :cool:
The Microsoft Surface is about wrong execution, not about fusing incompatible technologies.
Microsoft is about bad design decisions, like carrying all the old desktop ballast over into the new mobile world. No wonder it doesn't work and Microsoft had to write off $900M in unsaleable Surfaces. It's impossible to build a great car that is also a great boat. The requirements are much too different. If you want to build a great mobile computer, it cannot also be a great desktop.
you do not get the word "complement", do you?
Do you want to teach me about set theory? Then look up the meaning of "intersection". Windows 8 even comes with two kinds of Internet Explorer, one for touch and one for mouse. That's insane! Shortcuts and gestures are different input methods which do complement each other, because they don't go for the same click targets. Touch targets need to be bigger; pointer targets can be smaller and closer together. The combination of both doesn't work well.
the HD 5200 is NOT as good as the GT 650M; you'd be an idiot to buy a machine that has lower performance than its predecessor. And 2012 to 2013 is ONE year; Apple has been doing it for FIVE years.
Can you hear that? That's the sound of me not caring. I will happily trade my 2010 MBP for a 2013 rMBP - in case you actually believe performance has been going down for five years in a row. :D
 
No, they don't. It's like the Amphicar: not a good boat, not a good car, and not even better than owning a car and a boat separately.

Instead it provides two input methods, which interfere with each other and create a huge mental overhead. There are better tablets and there are better PCs. The Surface is merely the best Tablet PC so far - a category of products people have not been buying since 2002.

Nobody tells you to use the mouse and touch screen at the same time.
Just switch between them according to your needs: if the tablet is sitting on the table, use the mouse; if you're holding it, use your fingers.

Is that REALLY a mental overhead? One would have to have below-average cognitive capacity to get overwhelmed by just choosing whether to use mouse or touch input.


It's (still) not! :cool:
Microsoft is about bad design decisions, like carrying all the old desktop ballast over into the new mobile world. No wonder it doesn't work and Microsoft had to write off $900M in unsaleable Surfaces. It's impossible to build a great car that is also a great boat. The requirements are much too different. If you want to build a great mobile computer, it cannot also be a great desktop.

People aren't buying it because customers are not exactly well informed about its uses; it can be a great laptop and a great tablet as well.

Microsoft's 2002 tablet was a bad design: it's huge, heavy, and not ergonomic.
But the Surface is different; it runs on hardware good enough to be considered a laptop and is lightweight enough to be held as a tablet.
Do you even own a Surface to judge its design?

Can you hear that? That's the sound of me not caring. I will happily trade my 2010 MBP for a 2013 rMBP - in case you actually believe performance has been going down for five years in a row. :D

Yeah, I can hear that ... the sound of a frog inside a well who thinks his needs always equal everybody else's needs ...
great job guessing the minds of the rest of the population ...
 
Is that REALLY a mental overhead? One would have to have below-average cognitive capacity to get overwhelmed by just choosing whether to use mouse or touch input.

Haven't we heard for decades that Apple users can't deal with the complexity of a two-button mouse?

How could Apple users possibly deal with choosing between a three-button mouse and touch input?

;)
 
1. That would be interesting if that was the case. I somehow don't see that happening but who knows.

2. You don't think nVidia will at least be on a 20nm process? We'll see. It's a race to be the best.

OK, maybe..but let's stay in the present.. I don't understand: in order to accept the dGPU cut, I see comparisons with the old 650M and with the future Broadwell..you won't have Broadwell in the fall MacBook Pro..and at the same time it's acceptable to have the same 650M speeds but at the higher price of the Pro line...if it still exists...
What about the Nvidia 7xx series versus the Iris 5200?

----------

Right, because graphics performance is the sole indicator of whether or not a laptop can be used professionally. :rolleyes:

It has nothing to do with processing power, durability/reliability, efficiency, or any of that. Nothing.

Basically you want a 13-15 inch with just a quad-core in it, but what about the price?
Would it be good for you to pay for an overpriced Air Pro?
Sorry, I don't buy it..for 2000 bucks I want a real GPU in there, simply.
ps.
Edited.
 
The MBP is supposed to be a portable workstation. Size, heat, and battery should be secondary concerns in relation to its main task of getting things done. With an integrated GPU alone, it's only one step above being a souped-up Air with a nicer screen.

It has never been this, and you've never been able to make decent use of the discrete GPU when on the move due to battery life issues, unless your job consists of working 1-2 hours per day.

If what you want is a machine you can carry from desk to desk, and run on AC power (i.e., a portable workstation), the new Mac Pro fits that niche.
 
It has never been this, and you've never been able to make decent use of the discrete GPU when on the move due to battery life issues, unless your job consists of working 1-2 hours per day.

If what you want is a machine you can carry from desk to desk, and run on AC power (i.e., a portable workstation), the new Mac Pro fits that niche.

You're missing the point: not everyone can afford (or has space for) "a desktop and a laptop together", but they still need to work everywhere as professionally as possible, so the MacBook Pro was a good compromise for that. Without it, good luck Apple, because the old classic/retina Ivy Bridge 650M models will skyrocket!!!! Without at least a quad-core BTO for the 13" and a discrete GPU BTO for the 15", it stays too expensive and too close to the Air.
ps.
Are you really suggesting carrying a dustbin Pro from desk to desk?..be careful not to grab the trash when you're in a rush, then.
 
for 2000 bucks I want a real GPU in there, period.

A discrete GPU is not any more "real" than an integrated GPU. Your desire for the GPU to be discrete doesn't make any more sense than any of the following:

"I won't pay 2000 bucks for a laptop unless it has a discrete memory controller, not one integrated with the CPU."

"I won't pay 2000 bucks for a laptop unless it has a discrete north bridge, not one integrated with the CPU."

"I won't pay 2000 bucks for a laptop unless it has a discrete L2 cache, not one integrated with the CPU."

"I won't pay 2000 bucks for a laptop unless it has a discrete floating point processor, not one integrated with the CPU."

"I won't pay 2000 bucks for a laptop unless every core has its own die, not all the cores integrated into one CPU."

Integration has always been the way forward since the beginning of integrated circuits, is the way forward today, and will continue to be the way forward long after discrete GPUs have joined 14 inch hard drives in the dustbin of history.
 