If they gave it 1 GB of GDDR5 VRAM in addition to its 128 MB L4 cache, I have a feeling it would come a lot closer to 650M levels.
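To put rough numbers on that, here's a back-of-the-envelope sketch. The 128-bit bus / 4 GHz effective clock is the commonly shipped GT 650M GDDR5 configuration, and the ~50 GB/s figure is the widely reported estimate for Crystalwell's eDRAM; treat both as approximate:

```python
# Rough bandwidth comparison: GT 650M GDDR5 vs. Crystalwell's 128 MB eDRAM.
# Figures are commonly quoted approximations, not benchmarks.

def gddr5_bandwidth_gb_s(bus_width_bits, effective_clock_ghz):
    """Peak bandwidth in GB/s: bytes per transfer x effective transfer rate."""
    return (bus_width_bits / 8) * effective_clock_ghz

gt650m = gddr5_bandwidth_gb_s(128, 4.0)  # 128-bit bus, 4 GHz effective -> 64.0 GB/s
edram = 50.0                             # Crystalwell eDRAM, ~50 GB/s each direction

print(f"GT 650M GDDR5: ~{gt650m:.0f} GB/s, Crystalwell eDRAM: ~{edram:.0f} GB/s")
```

So the eDRAM is actually in the same ballpark for bandwidth; the bigger differences are shader throughput and the fact that the cache is only 128 MB.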
 
Showing your ignorance here buddy. The Iris 5200 has the same performance as a dGPU, you don't need a chip from nVidia any more. That's the whole point of Iris, it's not like the traditional idea of an iGPU.

Seriously, for everyone laughing off integrated graphics and saying gamers will be annoyed etc., you really don't understand what Iris is all about. It can match laptop dGPUs. That's why Codemasters (who make Grid 2) have been advertising it so much; it can play their games at full pelt.

No, the Iris Pro is far behind the GT 650. For example, go read Anandtech. I remember you from the forums, where this was already presented to you. I guess you are the ignorant one, buddy.
 
The benefits of dropping the dGPU will outweigh the cons and the negative feedback from users who are expecting more from the GPUs.

Less overhead from maintaining two GPUs, less circuitry required on the motherboard, space freed up for more battery, longer battery life, and so on.

I doubt that removing the dGPU alone would make it thinner, but if the iGPU maintains the same performance, it's possible they won't increase battery volume and will instead thin out the battery so they can make the laptop thinner.

So far, folks have loved the size/efficiency focus Apple has had going for the past few years (the MBA has been very successful largely due to its profile). So hopefully the thinner rMBP will work out as well.

So, Dell's top of the line laptops will have 11 hours battery life and discrete GPUs whilst MBPs will have 12 hours battery life and IGP. Which one is better? I'd say anything beyond 9 hours matters to very few people outside rural Africa.
 
It's a mistake to drop the dGPU. Iris Pro, special version or not, will never be as good... And Intel will never catch up with Nvidia, who have really nice Logan chips coming next. People who want real graphics performance, including decent gaming, will be left behind if they buy MacBook Pros.

Likely true. Also, Apple can't play AMD and Nvidia off each other to get the lowest prices; now they're bound to Intel forever. In a few generations, once they get Apple to rely on Intel's GPUs, they can jack up the price to whatever they want.
 
Integrated graphics have evolved incredibly in recent years and are no longer a major drawback when it comes to purchasing an everyday notebook. And in the case of this extreme Iris version, it's safe to say that Apple can even drop the dedicated parts in their rMBP line-up for the sake of a more robust battery architecture, such as the one present in the current Air.

Very interested in what Apple and Intel can come up with in the future.
 
My advice: You're heading off to school - focus like a madman on the subject matter, and not on the machine.

University tuition is astronomically overpriced as it is (thanks to the fact it's subsidized and considered "mandatory"), so use the resources they have on campus.

I always found that those that focused on technology and having "the latest and greatest" - and I was in that group - didn't do nearly as well as those who focused on the core foundation of their field. (I wasn't too bad there either, but I definitely got distracted by always trying to apply the latest technology.)

For your needs, any MacBook made in the last 2 years should do fine. If you're in engineering, get a big screen and/or an external monitor (they're dirt cheap). And study, study, study!

I didn't make it clear that I'm in Britain, and my tuition is on a loan which I don't have to pay back until I'm earning £25k (I think) a year, and even then I pay it back incrementally.

Also, a laptop is *vital* for my subject matter. I'm doing computer science, you see. I need a good laptop which I can code on—especially for the fourth year, in which I will be doing my dissertation. I'm also doing a sandwich course in the third year, so I will be earning ~£25k in that year.

Add to that the fact that I've been saving up money (1) since birth and (2) since the start of college (tertiary education, roughly the equivalent of America's high school), and have accumulated a lot.

I have the funds to pay for this, as well as the bursaries and money I will be receiving elsewhere. The new MacBook Pros will (apparently) be priced the same as the current MacBook Pros, so there's no harm in waiting.
 
Reminds me of 1st gen Mac Pros, and those things were astonishing machines when they were released.
 
Likely true. Also, Apple can't play AMD and Nvidia off each other to get the lowest prices; now they're bound to Intel forever. In a few generations, once they get Apple to rely on Intel's GPUs, they can jack up the price to whatever they want.

Except that prices are increasing all around, and Intel is making it harder (by design) to include a discrete GPU in a laptop while maintaining the needed bandwidth for other connectivity. Apple isn't stupid, and they'll bargain where they need to. They are looking at the future themselves, and Intel would be stupid to assume they'll have Apple against the wall for very long, if ever.
 
**** that. I could justify paying more for quality in the details, but a >$2000 computer without dedicated graphics is just obscene.
 
This is bad news actually. For the 15" flagship retina, we need a real GPU, and this looks like we're not going to get one.

It means that someone at Apple is so concerned with battery life that they are willing to completely sacrifice graphics performance. Yes, Iris is better than the old HD was, but that's like saying a Honda Accord is better than a VW Beetle. Neither is a Ferrari.
 
This is bad news actually. For the 15" flagship retina, we need a real GPU, and this looks like we're not going to get one.

It means that someone at Apple is so concerned with battery life that they are willing to completely sacrifice graphics performance. Yes, Iris is better than the old HD was, but that's like saying a Honda Accord is better than a VW Beetle. Neither is a Ferrari.

Neither are the mid-range graphics in the current crop.

MBPs have meh graphics, and if this is true, they'll have meh graphics and better battery life.
 
If they phase out the non-retina models, I'm very curious to see where the retina price point will end up.

Many people need more than 128gb of storage, and I think it would be very painful for most to have to pay $1700 for a 256gb retina (and only 13" !!!) vs. $1200 for a non-retina.

I'd be pretty thrilled if I could get a 13" retina, 256gb ssd msata version for around $1200-$1300 !!! That said, I don't expect this to happen. MAYBE we'll see a $100 price drop for retina, but more than likely it will stay the same.

Did you notice how the cMBP has stayed around? They will not phase it out at current rmbp price points.

The future is OpenCL, no matter how hard Nvidia fights it.

It has nothing to do with fighting it. They've been ahead in terms of development devoted to their own GPUs, regardless of the frameworks involved.
 
Will be interesting to see how Apple spins this if they present these new MacBook Pros at an event. They'll likely handpick a few artificial benchmarks where their Iris Pro barely matches the DDR3 version of the GT 650M, then claim that integrated graphics are as good as dedicated. It will be a blatant lie, since there's so much more to the picture. I can already see Phil Schiller doing it, with his smile...
 
I didn't make it clear that I'm in Britain, and my tuition is on a loan which I don't have to pay back until I'm earning £25k (I think) a year, and even then I pay it back incrementally.

Also, a laptop is *vital* for my subject matter. I'm doing computer science, you see. I need a good laptop which I can code on—especially for the fourth year, in which I will be doing my dissertation. I'm also doing a sandwich course in the third year, so I will be earning ~£25k in that year.

OK, waiting is good and healthy. I'm just saying that, in terms of priorities, saving your pennies and focusing on your subject is way more important than "latest technology". And never EVER get seduced by attractive loan terms; life has a funny way of making those terms mighty ugly when it comes time to pay the piper, especially with how the West's economy is going.

Given your field - not too far from mine of "chip design", which has me doing advanced programming all the time - I'd focus on having a bigger hard drive and more display space rather than the latest processor. When I work locally, I have to use virtual Linux machines a lot. In that regard, having throw-away (or school) servers running Linux and VNC-ing into them makes a LOT more sense.

Keep things simple, keep things cheap, and focus on the core problem at hand.... Just some honest advice from someone who's often learned the hard way!
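For anyone curious, the VNC-into-a-Linux-box workflow is only a couple of commands. A minimal sketch - the hostname, username, and display number below are placeholders, and it assumes a VNC server is already running on the remote machine's display :1:

```shell
# Placeholders for illustration only; substitute your own server details.
REMOTE="student@devbox.example.ac.uk"  # hypothetical school server
VNC_PORT=5901                          # display :1 listens on 5900 + 1

# 1) Tunnel the VNC port over SSH so the session is encrypted (and works
#    through firewalls that only allow SSH):
#      ssh -L ${VNC_PORT}:localhost:${VNC_PORT} "${REMOTE}"
# 2) In another terminal, point a viewer at the local end of the tunnel;
#    macOS's built-in Screen Sharing handles vnc:// URLs:
#      open "vnc://localhost:${VNC_PORT}"

# Print the assembled tunnel command as a sanity check:
echo "ssh -L ${VNC_PORT}:localhost:${VNC_PORT} ${REMOTE}"
```

The heavy lifting happens on the server, so the laptop only needs to run an SSH client and a VNC viewer.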
 
How about people wait until Apple announces these things? How accurate is SemiAccurate anyway? Are they known for getting scoops in the past?
 
Worth selling my current rmbp 2.6ghz model and upgrading?

Ah, no. Don't quite get the folks who start to feel inadequate when the next bit of shiny kit comes along.

Buy your girlfriend/wife/mother something special if you've got money burning in your pocket.



Re the prospect of dropping the dGPU altogether, put me in the sceptical boat. There just doesn't seem to be a big enough advantage here if battery life is the main factor.

Might we see a return of the 17" in a rMBP form as the flagship dGPU model..? That would be a nice surprise IMHO.
 
Perhaps for some extra $$$, Apple may add an extra graphics card option (as a custom configuration)... an AMD Radeon HD 8970M perhaps... hopefully... please. :eek:
 
How about people wait until Apple announces these things? How accurate is SemiAccurate anyway? Are they known for getting scoops in the past?

They are very accurate. So much so, in fact, that it costs $1,000 just to get access to their content for a year, and they are still in business. They seem to have a lot of industry sources.
 
I wish. It's almost sad for me to say this, but I'm just looking for a typical incremental update here. I don't currently own an MBP, but I'd like to see Haswell and the typical bump in dGPU. Integrated, "almost similar level of performance" or not, is just a shot to the gut. No matter how smartly others try to persuade you, an iGPU is never going to be the same as a dGPU. Plain and simple. Iris won't ever beat a 660M in real-life performance. Unless, somehow, "as much GPU power as possible" means it's well over 40% faster than the chip should be capable of.

Edit: Yes I'm a gamer. No, I don't game religiously. I would be satisfied with a current gen laptop dGPU
 
No, the Iris Pro is far behind the GT 650. For example, go read Anandtech. I remember you from the forums, where this was already presented to you. I guess you are the ignorant one, buddy.

Sorry, what? This is the first time I've commented on iGPU vs. dGPU stuff, and I've certainly never been presented with Anandtech material. Guess you're the ignorant one; you just assumed I was someone else.

However, I'd like to take the time to remind you that this particular Iris GPU has not yet been benchmarked, and as such its performance relative to a 650M is currently unknown. Next argument, plz.
 
OK, waiting is good and healthy. I'm just saying that, in terms of priorities, saving your pennies and focusing on your subject is way more important than "latest technology". And never EVER get seduced by attractive loan terms; life has a funny way of making those terms mighty ugly when it comes time to pay the piper, especially with how the West's economy is going.

Given your field - not too far from mine of "chip design", which has me doing advanced programming all the time - I'd focus on having a bigger hard drive and more display space rather than the latest processor. When I work locally, I have to use virtual Linux machines a lot. In that regard, having throw-away (or school) servers running Linux and VNC-ing into them makes a LOT more sense.

Keep things simple, keep things cheap, and focus on the core problem at hand.... Just some honest advice from someone who's often learned the hard way!

Trust me, the loan system is fine. The government in Britain provides loans for every student so that we are able to afford higher education. It's been going for a very long time (as far as I know).

I am also incredibly good with money considering I'm 18. Squirreling money away is a part of my nature.

As far as hard drive space goes, I'm absolutely A-OK in that regard. The MacBook Pro I have at the moment isn't quite up to scratch, so I'm anticipating a newer model (for moderate gaming as well).

Given that the new GPU might not have dedicated memory, it raises the question: should I wait, or get the most recent model available?
 
Tuition fees were introduced in the UK in '98 by Blair at £1k per year, tripled by him in 2004, then tripled again by Cameron in 2010. So not wonderfully long, and the system has yet to prove it works well, tbh.
 
One processor to rule them all..

It began with the forging of the Great Processors. Three were given to the Workstations, immortal, wisest and fairest of all computers. Seven to the Desktops, great gamers and craftsmen of the office documents. And nine, nine processors were gifted to the race of Laptops, who above all else desire power.
For within these processors was bound the strength and the will to govern each system. But they were all of them deceived, for another processor was made. Deep in the land of Intel, in the Fires of Mount St. Helens, the Dark Lord Otellini forged a master processor in secret, and into this processor he poured his cache, his malware and his will to dominate all systems. One processor to rule them all....
 
Tuition fees were introduced in the UK in '98 by Blair at £1k per year, tripled by him in 2004, then tripled again by Cameron in 2010. So not wonderfully long, and the system has yet to prove it works well, tbh.

I think I'm okay with the debt though. Investing in my future and whatnot.
 
Also, a laptop is *vital* for my subject matter. I'm doing computer science, you see. I need a good laptop which I can code on—especially for the fourth year, in which I will be doing my dissertation....

I've done a CS degree in England, and trust me, the coding you do at that level (barring an optional module or two that you might or might not take, e.g. 3D or game programming) will not require anything more powerful than an Atom... Better to save your money for something else...
 