Also, and this is a big also here: the iMac's Iris Pro configuration has a TDP of 65W (because it is a desktop variant), while the 15" rMBP's would only be 47W (for mobile). So that is partly why the iMac's Iris Pro looks even stronger. The rMBP's would not score that high due to the lower TDP.

Although with Broadwell, there's more power savings promised on top of Haswell, so the difference probably wouldn't be as great.
 

Difference between what? TDP? If so, the iMac doesn't need a low TDP because it's a desktop; that's why Apple is able to use the 65W desktop configuration. A Broadwell rMBP would be a ~45-50W part, with additional battery savings of probably 20-40%, and likely an iGPU performance increase of ~40% too. I don't understand your comment :confused:

Just because you get x amount of power savings doesn't mean your performance gains have to suffer; the two don't trade off one-for-one.
 
You all are outdated. I'm waiting for the one with 128GB DDR6 RAM, an Nvidia 8GB 950M, a 2TB PCIe 2.0 SSD that reads and writes at 3000MB/s, a 3D Retina display, Wi-Fi 802.11z with a 5-mile range, 3 days of nonstop battery, and 6 months of standby time. All while being able to play Battlefield 4 in Boot Camp at max resolution, a constant 60FPS with everything on ultra settings. Nah, I'll wait for the one after that. I heard they will make wireless charging available... after more than 100 years since Nikola Tesla discovered wireless power. Now that's innovation!

LOL.
 
You won't see DDR4 RAM support in consumer parts until at least Skylake. Apple won't even necessarily implement it as soon as it's supported, either.

This has been confirmed multiple times. Don't get your hopes up.
 

Stop. Many people don't care about Haswell, and Broadwell might be what they're after, particularly those who actually care about GPU performance. If you just browse the web on your MBP then by all means, continue to mock this thread. But if that's the case, you, sir, are a noob.

LOL.
:D
 

Even if Intel comes out with Roswell, a CPU + GPU invented from reverse engineering an alien spacecraft, someone is going to complain it's not good enough.

I remember when I built a new PC and stuck 128MB of RAM in it. Everyone said 32MB was more than enough.
 

Cool story bro
 
An integrated GPU doesn't need to be better in every way for Nvidia to lose a customer. It only needs to be better overall (including price, reliability, power consumption, etc.).

But do you see what I did there? I made a statement that is easy to defend. I'll let you debate the nuances for hundreds of posts... ;)
 

I saw that you made an argument which is unassailable. I made an argument which is easy to defend. ;)

When Apple dropped the discrete GPU from the 13" MBP, integrated GPUs were not better in every way than discrete GPUs. They were only better overall.
 

Indeedy. It's more out of defiance that people argue against it, imo. Apple dropped the dGPU from the 13" and I'm sure there was an avalanche of complaints, but there was still a dGPU option in the 15". This time around, if they drop it, there should be a much more significant outcry. They just need to stop messing around and give us the 750M though :D

Either way, if they drop it in the 15" people will bitch and moan for 3-4 months and that'll be it. In 1-2 years people won't care anymore. That is assuming that Skylake beats other mid-level dGPUs at that point.

mcarling, what would you say the state of mid-level (45W-75W) dGPUs will be around the time of Broadwell/Skylake? My best guess is that at least at Broadwell, dGPUs (NVIDIA's next gen) will still be significantly faster, and that Skylake might present a situation similar to what we have today with Iris Pro vs. the 650M.
 
Which is another reason to dispense with this no-dGPU nonsense in the upcoming Haswell refresh. It behooves Apple to keep the dGPU this go-round and closely monitor Intel's progress along these lines. Of course, that's assuming Nvidia sits on its ass throughout all this. In a nutshell, I, along with a few laws of physics, don't see at least an option of a dGPU going away anytime soon. If that does happen with Apple, I see them finally leaving the general laptop market and going all MBA (Steve's "MacBook of the future").

And come on folks... only 9900 more posts and we pass those Haswell wankers. Slackers! Let's go!
 

That's exactly what I've been thinking. I think it'll be at least 3-4 years.
 
Lol, I specifically asked for your opinion on something two paragraphs below, and that's the response I get?
I didn't see the question mark. I should read more carefully.

If I understand the question, even a speculative answer would depend on what process is available to Nvidia at the time, adding a second layer of speculation. With Intel, we know that every second year linear feature size shrinks by a factor of about the square root of two, allowing a doubling of the number of transistors. With Nvidia, such progress still occurs, but at a much less predictable rate. As I've written before, Intel's integrated graphics will force Nvidia out of the laptop GPU market before it forces them out of the desktop GPU market. When exactly either will happen is difficult to say. Apple's choice of GPU in the 15" Haswell MBP will give us a good indicator. Ask me again after that has been revealed.
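The √2 shrink arithmetic above can be sketched numerically (a back-of-the-envelope illustration of the claim, not anything from Intel's actual roadmaps): if linear feature size shrinks by ~1/√2 each two-year generation, area per transistor halves, so transistor count at a fixed die area doubles per generation.

```python
import math

def density_multiplier(generations: int) -> float:
    """Density gain after N two-year shrinks of ~1/sqrt(2) in linear size."""
    linear_shrink = 1 / math.sqrt(2)   # each dimension scales by ~0.707
    area_shrink = linear_shrink ** 2   # area per transistor scales by ~0.5
    return (1 / area_shrink) ** generations

print(round(density_multiplier(1), 6))  # one shrink -> ~2x the transistors
print(round(density_multiplier(3), 6))  # three shrinks -> ~8x the transistors
```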

And you also said that if the 15" lost its dGPU it would only affect half a percent of potential buyers:

https://forums.macrumors.com/threads/1646337/
No, that is absolutely false. If the 15" rMBP were to lose its discrete GPU, 100% of buyers would be affected (positively or negatively or both) though most wouldn't know that they had been affected. What I wrote in the post to which you linked was that only a fraction of 1% of potential buyers would refuse to buy the product if Apple were to drop the discrete GPU.
 
By the time Broadwell comes out, Nvidia will shock us all and release their own competing CPU with a GPU on the die and take on Intel.
 


add that with crack and you get that post
 
Urgh, I don't know. Battery life is a big thing for me but I need a new machine soon.

Is it a new processor every release with Macs or every second release?
 

Intel release a new generation of processors annually. Apple release a new generation of each Mac annually, based on and following the release of the new Intel processors. In addition to the annual updates, Apple also do a major redesign every three to five years (the last two for the MBP were the unibody redesign and the Retina MBP). There are also, in some years, minor updates such as the February 2013 update.
 