Keep in mind that the 2012 rMBP has no problems running 1920x1200@2x (9.2 megapickles) via the integrated HD4000 graphics, which is ridiculously underpowered compared to the baseline R9 M290X.

2012 rMBP *has had* no problems running 1920x1200@2x (9.2 megapickles) via the integrated HD4000 graphics.
Now, thanks to Yosemite, it has.
 
Trouble is, in anything that's a 3D game or even playing a full-frame 4-5K video in full screen, how the hell is a mobile GPU going to perform when even some higher-end desktop cards struggle at those resolutions?

You guys keep repeating these old arguments even though the retina MBP has been around for several years. Haven't you learned anything about HiDPI resolutions from it? You are not going to play a 3D game at 5K resolution with those machines! You will play at 2560x1440, just as with the old iMac. Expecting anything more is silly, unless we are talking about really old games. The retina display is all about resolution independence and improved quality of fonts and the user interface, not about playing games at ultra-high resolutions!

As to the second part: playing video is not an issue. My rMBP with an iGPU can decode and play 4K video without hiccups. Besides, most video you will ever play is Full HD, which will be upscaled to the 5K resolution. And your concerns about 'full-frame in full-screen' are baseless. The base M290X has a memory bandwidth of 154 GB/s. A full 5K frame takes around 60 MB of memory (in practice less, because of color compression). This means the GPU can copy that frame around hundreds of times per second without breaking a sweat. So playing 5K video at 60 fps is a simple enough task, as you need 'only' around 4 GB/s to do that. You can even easily stream it from system memory.
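A quick back-of-the-envelope check of those numbers (a sketch assuming 4 bytes per pixel and no color compression; the exact frame size depends on the pixel format):

```python
# Frame size and bandwidth needed to push a 5K panel at 60 fps.
width, height = 5120, 2880          # 5K panel resolution
bytes_per_pixel = 4                 # 8-bit RGBA, uncompressed

frame_mb = width * height * bytes_per_pixel / 1e6
print(f"One 5K frame: {frame_mb:.1f} MB")              # ~59 MB

fps = 60
needed_gbs = width * height * bytes_per_pixel * fps / 1e9
print(f"Bandwidth for 60 fps: {needed_gbs:.1f} GB/s")  # ~3.5 GB/s

m290x_bandwidth = 154.0             # GB/s, per the spec sheet
print(f"Fraction of M290X bandwidth: {needed_gbs / m290x_bandwidth:.1%}")
```

So scanning out 5K at 60 fps consumes only a few percent of the card's quoted 154 GB/s, consistent with the claim above.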
 
You are not going to play a 3D game at 5K resolution with those machines! [...]

You're basing all those sweet presumptions on a spec sheet, we're merely saying nobody has seen real usage from that new machine yet - just press lackeys reposting press release material.

If you'll recall, the first rMBP with a Geforce 650M wasn't without issues.

It's entirely possible this machine is awesome and has no performance problems whatsoever, let's at least wait until some people have actually used it?

Or order away and see for yourself, you can tell us later on.
 
You are not going to play a 3D game at 5K resolution with those machines! [...]

Thank you, sir. It's ridiculous that people think any company, let alone Apple, would release a product where the user experience is anywhere near as bad as what is claimed in this thread. It's an incredible machine that is going to be beautiful for 98% of tasks. Get something else if you want 3D 4K gaming.
 
Trouble is, in anything that's a 3D game or even playing a full-frame 4-5K video in full screen, how the hell is a mobile GPU going to perform when even some higher-end desktop cards struggle at those resolutions?
The R9 M295X has the same chip as the desktop R9 285, which is an upper mid-range desktop GPU. It's not a high-end part and you certainly are paying a disproportionate amount of money for it, but it's certainly not slow.

I also don't see video playback being problematic. Even my 2011 iMac can do 4K video decoding just fine, and the actual displaying is not much of a problem for any modern GPU, including scaling.

In historical context: display resolutions stagnated between 2004 (when the first 2560x1600 screens became available) and recent years. Now we finally have 4K and even 5K (which is 3.6x the number of pixels). In the same time, GPU performance has increased by roughly two orders of magnitude.

Yes, it won't run Call of Duty 17 at native res, but pretty much no machine can do that. Even nVidia's flagship GTX980 is struggling to hit just 30fps at 4K resolution in many modern games. Should Apple have postponed the iMac by several years just because of that?


2012 rMBP *has had* no problems running 1920x1200@2x (9.2 megapickles) via the integrated HD4000 graphics.
Now, thanks to Yosemite, it has.
I didn't notice it yet, but I have only looked at the betas occasionally.

The problem with iOS7 and Yosemite is the blur, which is very costly in terms of memory bandwidth. iGPUs generally have the problem that they only have as much memory bandwidth as the system has (25.6 GB/s in case of DDR3-1600 and a dual-channel controller, which is the most common configuration in recent years).

In comparison, dedicated GPUs usually have more than 100 GB/s of memory bandwidth at their disposal.

In the case of the rMBP: the Intel Iris Pro tries to circumvent that problem by providing additional eDRAM (128 MB) directly on the chip. No help for the 2012 rMBP, though.
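For reference, the 25.6 GB/s figure quoted above follows directly from the DDR3-1600 dual-channel math (a sketch of peak theoretical bandwidth; real-world sustained bandwidth is lower):

```python
# Peak bandwidth of a dual-channel DDR3-1600 memory system,
# which is all an iGPU like the HD4000 has to work with.
transfers_per_sec = 1600e6   # DDR3-1600 = 1600 MT/s
bytes_per_transfer = 8       # 64-bit channel = 8 bytes per transfer
channels = 2                 # dual-channel controller

peak_gbs = transfers_per_sec * bytes_per_transfer * channels / 1e9
print(peak_gbs)  # 25.6
```

That pool is shared with the CPU, which is why bandwidth-hungry effects like Yosemite's blur hit iGPUs so much harder than dedicated cards with 100+ GB/s of their own.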

let's at least wait until some people have actually used it?
Good advice for any new product.
 
If you'll recall, the first rMBP with a GeForce 650M wasn't without issues. [...]

I have the first rMBP with the GeForce 650M. I haven't had any issues. I play a variety of games now. I mean, I'm not discounting that maybe other users had issues with whatever they were doing, but I haven't in the 2+ years I've had it.
 
This isn't a Rev. A machine, it's using the exact same design as the previous model. It has a new display panel inside. Apple is churning out a ton of these at launch, and so far the reviews are awesome.

What's with all the negativity, people? Are you hurt that you bought the Mac Pro? Or have you invested heavily in a 4K Sharp display? Or is it simply out of reach money-wise? No matter what it is, stop the lame complaining, it's annoying.

This is a $2,500 USD 5K IGZO display with a state-of-the-art computer embedded into it. It's running super fast without any lag. For what you are getting, I'd say it's an amazing steal! And that's coming from someone who currently has a nMP/64GB/D700 with a 4K display. I'm getting this as well; in short, it's the best display/computer any photographer could dream about.

So cut back on the unfounded complaining about lag, Rev. A, price, GPU... And for once be happy for Apple, which has managed to pull through and deliver something truly unique and innovative.


Some people are into the techie part of it; they have certain expectations of what they would like in a machine before they pony up $2,500+ for a computer. Apple failed to meet some expectations.
It's also annoying when people have the thought process that Apple walks on water, so now we are both annoyed.

I have not posted anything negative in this thread. I am also not satisfied with the release, so I took that money and purchased a different model.

People all handle it differently, but everyone has the right to complain about it or praise it.

Rob
 
Just saw the new iMac 5K with the 295X in an Apple Store. I tested it with the Apple folks for around 30 min; they even installed Diablo 3. I don't play that game, but they have the late 2013 with the 780M in stock.

So the bottom line is this: Final Cut Pro, web surfing, mail, videos (4K) all look stunning and work seamlessly. (This is the AMD 295X driving 2880p, not an HD4000 driving 1800p, to make a comparison.) Everything works/feels/looks gorgeous/stunning/mind-blowing. The brightness at max level is above the last generation (I have a late 2013 27" iMac at home). I tried D3 at 1440p, all ultra, on both: on the 295X I had a fixed 60 fps everywhere; on the 780M, in town (or whatever it's called when the game begins) I had 58-59 fps, and in battles it dropped to 39-40 fps. They have a GPU test app (I don't remember what it's called): the 780M scored 77 fps and the 295X scored 92 fps.

After, let's say, 10 min of playing with Final Cut Pro 4K editing or Diablo 3, the heat is about the same as my iMac. So it's a very good thing that it doesn't heat up from all those pixels.

In other words, Georgio's criticism below was uninformed and stupid? Nice to know.

Well there goes my money; I'll stick with my existing iMac thanks.

No one is going to buy this iMac except the uninformed and stupid.

Apple going with this GPU with this display is the biggest mistake they've ever made.
 
If you'll recall, the first rMBP with a Geforce 650M wasn't without issues.

I have the first rMBP, and it runs Yosemite on its 3360x2100 frame buffer without any performance problems (yes, individual applications can exhibit lag, which is the fault of bad programming within those applications). So, based on my experience, my knowledge of the technology and the specs of those GPUs, I am fairly sure that they will work out fine. Now, don't get me wrong, I am not trying to sell my opinion as the only correct one. Still, I was replying to claims like 'the M290X can never work with a 5K screen', which, as far as I am concerned, are simply ridiculous from a technical standpoint.

Or order away and see for yourself, you can tell us later on.

I would, but I don't need a desktop. My work is done on a laptop and for heavy lifting I have an SGI supercomputer :p
 
The R9 M295X has the same chip as the desktop R9 285, which is an upper mid-range desktop GPU. It's not a high-end part and you certainly are paying a disproportionate amount of money for it, but it's certainly not slow.

I don't think anyone knows the absolute specs of this chip yet, do they? I have been scouring the net since yesterday, and everything I come across is rumours and, in some cases, conflicting information. I think we might have to wait for an in-depth review to be sure. Do you have a link to anything more than a rumour? I can't find anything official from AMD.

I believe this is the first appearance of this chip in the wild; it looks like Apple has bought up all the stock.

Thanks.
 
Only 2GB VRAM...

I'm a little shocked that they would put time and money into developing a proprietary chip to assist with screen and pixel fidelity and still cheap out on something as simple as VRAM. 4GB should have been the minimum, no matter the GPU, and 6GB should have been the BTO.

This thing costs $2,500, pushes 5K, and only has 2GB of VRAM. How disappointing.
 
If it's engineered well, which I assume it was, it will have internal thermal management to prevent this. There would be lawsuits if they made a chip that was destined to "melt".

And Gav, Mr. Engineer, heat transfer is not so much thermodynamics; it is formally considered time-dependent energy transport. Sure, the component might have the potential to get hot, but whether it actually does depends on how well Apple designed the heat sink around the unit. I'm waiting for real-life demonstrations before claiming that they poorly engineered their computer.

I look forward to the teardown and the technicians' guide for the thermal system with interest; if they are using the same internals, as is likely, I do have cause for concern. Anyone who had their AMD GPU swapped out for free in the 2011 iMac was lucky, but that was a daughter card and easy to do. The thousands of users who are being left to foot the bill for a replacement GPU in the 15"/17" 2011 are testament to the fact that Apple engineering is not perfect in any way, shape or form regarding thermal problems over a period longer than a year. I imagine they used the same airflow modelling simulations when they stuck a red-hot Sandy Bridge and an AMD 6XXX into an existing cooling system originally designed for silicon less than half their combined die sizes, and that has proved totally inadequate in the real-life long-term scenario.

Enjoy your new toys, everyone. I'm looking forward to having a go on one this weekend, but make sure you get that AppleCare with the purchase.
 
If it's engineered well, which I assume it was, it will have internal thermal management to prevent this. [...]

I certainly hope they've learned after the iPad 3. No?
 
I look forward to the teardown with interest [...] Apple engineering is not perfect in any way, shape or form regarding thermal problems over a period longer than a year. [...]

The 2012 iMac's (new 5mm thick design) had new thermal designs relative to the 2011's (as do the rMBP's) so I don't see how you could possibly compare them.
 
Honestly, I don't know what you're rambling on about, mate.

I trust that the Apple engineers know what they are doing. I have heard the old adage that says to get Rev. B of products, but meh - :shrugs:

As another poster so aptly put it

After taking every single Intel Mac to bits, I wish my trust was as blind and loyal as yours when it comes to Apple's engineering, but the school of hard knocks has left me feeling very far from that, I'm afraid. Between the dies of the CPU and GPU and the plates which bind them to the heat sink, they are nowhere near perfect enough.

----------

The 2012 iMac's (new 5mm thick design) had new thermal designs relative to the 2011's (as do the rMBP's) so I don't see how you could possibly compare them.

I know - I've had four or five 2012s apart this year. I am comparing that chassis with this new 2014, and only using the 2011 as an example of when they were far from perfect.
 
I don't think anyone knows the absolute specs of this chip yet, do they? [...]
That's true, there is no confirmation yet. What I've read was AMD Tonga @ 850 MHz. The Tonga chip is quite new (released in September), which could explain why there are no other M295Xs around yet.

I guess we'll have confirmation soon.
 
In other words, Georgio's criticism below was uninformed and stupid? Nice to know.

One person's 30 minutes of playing around isn't what I would call a benchmark.
Let's not make this personal, as anyone less experienced than I am might take that as a direct confrontation.

Bottom line is that this machine could have been epic with the logical components, and now, for the sake of trying to maximise profits, it's just another refresh with a better screen.
 
One person's 30 minutes of playing around isn't what I would call a benchmark.
Let's not make this personal, as anyone less experienced than I am might take that as a direct confrontation.

Didn't you call anyone who bought one uninformed and stupid?
 
Should have gone with the 256 SSD... much better performance, double the flash storage, same price...

I suppose the guy was stupid with storage as well, not just with the Retina display... Unless just maybe he has a few hundred gigabytes of data which don't fit on that 256 GB SSD.

----------

Unless I'm missing something, can the eye even really see 5k?

As posted in many other threads, text on a 5K display is a lot easier to read than on a 2560x1440 display, even if the eye can't resolve the individual pixels.

The 5K iMac is basically a Dell monitor with a crappy computer, or a 2009 1440p iMac Core 2 Duo.

You mean a Dell monitor selling for the same price as the iMac, with a £1,000 Dell computer thrown in for free, except it comes with a real OS instead of Windows 8, a Fusion Drive instead of a plain old hard drive, a nice design instead of crappy black plastic, and all the rest of it.
 
This thread title really annoys me, and so I wish it would die.

The 5K iMac is clearly NOT going to lag like crazy. When has ANY Apple device lagged like crazy on release?

Right.

As has been mentioned earlier, even the crappy GPUs in the rMBP 13" can handle that res just fine for everyday things.
 
I agree. When i saw the price tag during the keynote I had to pick my lower jaw off the floor!

$2,499 was what the Macintosh cost in 1984. 512 x 342 pixels; the iMac has 84 times more, and they are all in color instead of black and white! 128 KB of RAM; the iMac has 65,536 times more. A 400 KB drive; the iMac has 2.5 million times more. The Macintosh did about 1,000 floating-point operations per second; the iMac does 3.5 TFlops, which is 3.5 billion times more.
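Those multipliers check out; here is a quick sanity check (assuming the 8 GB base RAM and 1 TB Fusion Drive configurations for the 5K iMac):

```python
# Verifying the Macintosh (1984) vs. Retina 5K iMac comparisons above.
mac_pixels  = 512 * 342            # original Macintosh display
imac_pixels = 5120 * 2880          # 5K panel
print(round(imac_pixels / mac_pixels))   # 84x the pixels

print((8 * 1024 * 1024) // 128)          # RAM: 8 GB vs 128 KB -> 65,536x

print(int(1e9 // 400))                   # storage: 1 TB vs 400 KB -> 2.5 million x

print(3.5e12 / 1e3)                      # FLOPS: 3.5 TFLOPS vs ~1 kFLOPS -> 3.5 billion x
```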
 
The 5K iMac is clearly NOT going to lag like crazy. When has ANY Apple device lagged like crazy on release? [...]

I agree it's the wrong title, but the sentiments are pretty sound.

It probably won't lag like crazy, but I can cite the 2012 rMBP on launch and the iPad 3 as certainly being problematic as far as scrolling is concerned. The 2012 retina, in this case, was the first of its generation of bleeding-edge technology, just like this Retina iMac is. There were also many issues with the quality of the display on the retina MBPs.

It will certainly get hot with OpenCL acceleration on the AMD GPU using FCPX, that I'm pretty sure of. I'm wondering just how hot it will get past a year, going on three years!
 
For everyone who is doubting the GPU:
Nvidia GTX 980M: 3,189 GFLOPS
AMD R9 M295X: 3,500 GFLOPS

I Think it will do just fine.

http://www.techpowerup.com/gpudb/2622/geforce-gtx-980m.html

I won't be as radical as saying it will lag, but I would have bought one yesterday if only the 980M was there; I had the card ready :)

Now I will still buy one if the M295X proves to be close to that performance-wise.
I really don't need 5K, even though I would love to have it (it looks amazing), but I will not trade performance for looks.

Yesterday I knew exactly what I wanted to buy, and today I'm completely lost just because of this GPU that we don't know the real deal on yet. (Really hoping for the best for all the ones that already purchased, and also I would love to get one in a few days.)

By the way, I don't play any games at all. All I do is AE, Premiere, FCPX, Motion 5, and that's it. Oh! And Mail and iPhoto :)

Let's hope for the best. Really anxious to see its performance.
 