Do Macs only last a few years? I want to buy an iMac, but I see people replacing their computers every 3 years or so... I can't do that... I want something that will last 7-8 years. I'm using a Dell XPS 8300 at the moment (Core i7-2600) and it's running very well.

Anything will last 8 years if you're willing to deal with old technology. If you're thinking that you can continue to upgrade that Dell for 8 years, dream on.
 
A 4K display is only approximately 8.29 megapixels, and that's the entire screen. A 1080p monitor is about 2 megapixels. So if you've been taking pictures at 5 megapixels and above, they should look fantastic on a 4K monitor, and far better than on your 1080p one.

The rumored 5K resolution, on the other hand, would be an astounding 14.75 megapixels. So yeah... But then again, no one really uses every pixel on the display to look at a picture.
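For anyone who wants to check the arithmetic, here's a quick back-of-the-envelope calculation in Python (the 5K entry assumes the rumored panel is 5120 x 2880; the other resolutions are the standard ones):

[CODE]
# Megapixel counts for common display resolutions (width * height / 1,000,000)
resolutions = {
    "1080p":        (1920, 1080),
    "1440p (iMac)": (2560, 1440),
    "4K UHD":       (3840, 2160),
    "5K (rumored)": (5120, 2880),  # assumption: rumored 5K iMac panel
}

for name, (w, h) in resolutions.items():
    print(f"{name:14s}{w * h / 1e6:6.2f} megapixels")

# 1080p           2.07 megapixels
# 1440p (iMac)    3.69 megapixels
# 4K UHD          8.29 megapixels
# 5K (rumored)   14.75 megapixels
[/CODE]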

Always something... my first digital pics are from 1999, at 1.3 megapixels... (I think I have some from an Apple camera too, at 320 x 240... :eek:)
 
Great to read this kind of rumor.

Now, for those like me who are not aware of the AMD 'disaster' that seems to be creating a lot of frustration, can someone please summarize it? Thanks.

What would be a better discrete video card for this rumored 5K panel?

Will :apple: release something like Metal for Macs???
 

Currently, Adobe power users (such as myself) benefit from the CUDA technology exclusive to Nvidia cards for certain features of their programs. Although fairly specialized, and possibly seldom used, CUDA takes what would be about 10-30 seconds per frame of rendering (using only the CPU) down to milliseconds per frame. When dealing with a 60-second video at 30 frames per second, that is an enormous difference, one that makes it essentially impossible to perform the task without CUDA. CUDA is a very promising and seemingly efficient means of using the GPU for processing-intensive tasks (though it generates a lot more heat than most anything else, which is not a strong point of the iMac design). In any case, no matter how much you use an aspect of software that can take advantage of CUDA, the fact is, if you don't have Nvidia, you can NEVER use it. And that is a major factor, for me at least, in deciding on a future machine.
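To put rough numbers on that, here's a quick sketch; the per-frame times are just the ballpark figures from the post above, not measurements:

[CODE]
# Back-of-the-envelope render time for a 60-second clip at 30 frames per second
frames = 60 * 30                  # 1,800 frames

cpu_sec_per_frame = 20            # assumed midpoint of the "10-30 seconds" CPU-only range
cuda_sec_per_frame = 0.05         # "milliseconds per frame" -- assume roughly 50 ms

print(f"CPU only:  {frames * cpu_sec_per_frame / 3600:.1f} hours")    # ~10.0 hours
print(f"With CUDA: {frames * cuda_sec_per_frame / 60:.1f} minutes")   # ~1.5 minutes
[/CODE]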
 

Thanks for the details provided about NVIDIA and CUDA.

Some people claim that with Thunderbolt it is possible to adapt external video cards; would this be a potential approach?
Even if it defeats the purpose of an all-in-one, it might serve the purpose of future upgrades, if feasible.
 
AMD? Oh boy... It's time for Apple to find a better solution.

These comments really irk me. Read up on contemporary GPU options and performance, please, before you comment based on out-of-date information.

"Hurrr, AMD had an inferior offering in a certain space a year ago, they must still be terrible"
 
Is this good for gamers?

As a Mac gamer I don't think this is good news for me. The 27" iMac's display is sharp enough as it is now. Having 4 times as many pixels to push around will hurt the performance. And I prefer Nvidia to AMD Radeon.
 
I have my 15" rMBP set up with a 24" 1920x1200 monitor alongside. A 27" iMac would take up less space than those combined.

I ran my old 15" unibody in closed-lid mode with a cool vertical stand out of the way, and had the 24" Apple Cinema Display. I used a BT mouse/keyboard, and that was perfect for me. For me, 27" is too large. I just don't need it, and as I said, my workspace is crowded as it is. I've used a 27" before and found myself moving my head rather than my eyes across the display.
 
Would there be AAAAAAAAAANY chance of upgrading the displays in my 2012 or 2013 27" iMacs to one of these 5k displays? :confused:

Yes you can .. It's called "sell the old iMac and buy the new one."

----------

These comments really irk me. Read up on contemporary GPU options and performance, please, before you comment based on out-of-date information.

"Hurrr, AMD had an inferior offering in a certain space a year ago, they must still be terrible"

I'm not really worried about AMD performance; Radeon graphics are doing just fine. But I would worry about long-term reliability and durability. Radeon chips in Macs have been well known to become troublesome after 1 or 2 years of use: total failure, weird color rendering, and artifacts caused by graphics chip failure. That's what I'm worried about.

Note that I had a bad experience with the Radeon 6970M in a 2011 iMac. It was decent, but it overheated so often that it had to be replaced. So I've been there, and it wasn't good.
 
Double the resolution with double-sized assets. Everything is legible but sharper. That's the point of these screens.

So I hope that means sharper graphics and text at arm's-length viewing with my current applications.
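For anyone unfamiliar with how Retina-style scaling works, here's a rough sketch of the idea, assuming the rumored 5120 x 2880 panel and the same 2x scale factor Apple uses on its current Retina machines:

[CODE]
# HiDPI in a nutshell: the UI is laid out in "points", and each point is drawn
# with scale x scale physical pixels, so elements stay the same size but look sharper.
panel_w, panel_h = 5120, 2880    # assumption: the rumored 5K panel
scale = 2                        # Apple's usual Retina scale factor

logical_w, logical_h = panel_w // scale, panel_h // scale
print(f"UI laid out as if the screen were {logical_w} x {logical_h},")
print(f"but each element is drawn with {scale * scale}x as many pixels.")

# UI laid out as if the screen were 2560 x 1440,
# but each element is drawn with 4x as many pixels.
[/CODE]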

I wish Apple would also include a better DAC (digital-to-analog converter) for the audio out port.
 
As a Mac gamer I don't think this is good news for me. The 27" iMac's display is sharp enough as it is now. Having 4 times as many pixels to push around will hurt the performance. And I prefer Nvidia to AMD Radeon.

Just run games at 2560 x 1440; native resolution won't be possible for demanding games at all. I'm more worried about a consistently smooth UI (Mission Control animations, etc.). This new 27" iMac will have an insane resolution to push; keeping animations smooth will be difficult.
 
That's about 14.7 million pixels...

Wow! I guess it's not unprecedented; some people are already pushing 14+ million pixels by hooking up four 2560x1440 displays (4 x 3,686,400 = 14,745,600 pixels, which is exactly 5120 x 2880). But it requires a pretty high-end video card to do anything serious with that many pixels.
 
A 5K Retina iMac is a dream come true for those of us who own 4K video cameras like the Panasonic GH4, FZ1000 and LX100 - the latter two models cost only $899.00, so 4K is becoming the new affordable standard and conventional HD (1080p) obsolete.

Apple will sell a ton of 5K iMacs for Christmas if they actually do launch one in late October.
 
Better than Intel graphics.

There are few things I hate more in life than Intel graphics. Their drivers are crap, the performance is mediocre, and the unit cost definitely reflects the quality.

The driver problems are less noticeable in OS X, but I will never buy a system where I'm stuck with Intel graphics, Mac or otherwise.
 
As a Mac gamer I don't think this is good news for me. The 27" iMac's display is sharp enough as it is now. Having 4 times as many pixels to push around will hurt the performance. And I prefer Nvidia to AMD Radeon.


When you play games, you can just run the game at 2560x1440, or 1080p if you prefer. 1440p is still vastly higher than what most PC gamers play at, let alone consoles. As Mac gamers this is great news: we get a killer fast GPU to play games, and when we are not playing games we have a gorgeous 5K display to view all other content.

Also, I am not convinced the iMac will have AMD graphics, but it wouldn't be the end of the world if it did.
Just because the 2011 MacBook Pro had issues with Radeon GPUs doesn't mean Macs will always have issues with AMD/ATI GPUs. I owned a 2011 MacBook Pro and had no issues with it, and I played PC games like Crysis 2 and Mass Effect 3 all the time.

People should not be concerned about Apple finding a notebook GPU to push 5K. There are plenty of Nvidia and AMD mobile GPUs that would be able to push 5K if they were just tweaked slightly by Apple or Nvidia/ATI.


Overall, this sounds like an awesome machine. Too bad I am not in the market for an iMac right now.
 
I think Apple's motivations in the past have been thermal- and power-related. That seems to be the overwhelming qualifier for GPU choice in the MBPs: the best performance at a given thermal point.

Anything is possible. I just have to dismiss the idiots whining about AMD GPUs like they know what they are talking about. Some may prefer Nvidia due to proprietary software, but even that is ill-advised these days.

This point seems to have come up several times in this thread and a lot of the people who've expressed concern about the move to AMD graphics chips over Nvidia (myself included) haven't sufficiently elaborated on why they are so opposed, so let me give it a quick shot.

1. Performance: Nvidia's new 900 series graphics cards (970 and 980) are, for many applications, the most powerful graphics chips on the market, and leaked benchmarks show the mobile versions (both the 970M and 980M) decimating all previous mobile graphics cards, with the 980M approaching the performance of a desktop 970. To put it another way, the 980M may end up being as much as twice as fast as the 680MX, which was introduced in the 2012 iMacs. AMD's current mobile flagship is roughly the same speed as that 680MX. In other words, AMD is about two years behind in terms of mobile performance, so unless they completely overhaul their architecture, performance may not be all that much better than the 780M in the current iMacs.

2. Resolution: if this rumor is true, Apple is going to skip 4K in the iMacs and go straight to 5K. This is a huge increase in the number of pixels the machine needs to push (4x the 1440p resolution of the current iMacs), and thus it would be ideal to have a correspondingly huge improvement in GPU performance to go along with it. Going back to performance: unless AMD is able to dramatically increase the speed of their offerings, their options just don't make sense to power a 5K display when Nvidia has far more powerful options available.

3. Power/Thermals: As you mentioned in your post above, one of Apple's primary concerns is thermal constraints. It is because of the thermal (and power) constraints that Apple has been forced to use mobile-class graphics chips in the iMac. Nvidia's Maxwell architecture provides far superior performance per watt compared with their previous mobile (and desktop) graphics chips, as well as anything AMD offers. Again, unless AMD is able to dramatically increase their performance in this area, we could be looking at a Retina iMac that runs hotter and louder, with a potentially higher failure rate over time.

Those are the primary reasons I'm concerned about the shift to AMD if it's true.

That said, if AMD and Apple have an ace up their sleeve that allows them to equal or exceed the performance of the 980M, then I have no problem with Apple switching to AMD :)
 
Great 5K....

How soon can we actually take advantage of it? 4K is only just starting to show up on new displays, and I have not seen any movie in 4K yet. Not upscaled, but "true 4K" or "true 5K", since those are the real deal. Content is limited to the clips that come with the display and home movies, but that's about it.

Nothing spectacular but the resolution.
 
Dual GPUs, and yes, it's possible. They've been in notebooks for a year already; I saw some dual 750s in a notebook in August 2013.
 
Thanks for the details provided about NVIDIA and CUDA.

Some people claim that with Thunderbolt it is possible to adapt external video cards; would this be a potential approach?
Even if it defeats the purpose of an all-in-one, it might serve the purpose of future upgrades, if feasible.

You'd take a major hit to the performance. Thunderbolt 2 has a bandwidth of 20 Gbps (that's gigabits; a gigabit is 1/8 of a gigabyte, so about 2.5 GB/s). A PCI-e 3.0 x16 slot has a bandwidth of about 32 GB/s (or 256 Gbps). You can do the math on that: you get roughly 1/13 of the bandwidth. Although not all of the PCI-e bandwidth is used anyway, typically a little more than half. Some websites suggest you can pull it off with about a 30% performance loss, although the added equipment is going to add at least $300 to the cost of whatever graphics card you buy. Possibly worth it if you're a wealthy individual.
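If you want to sanity-check those numbers (taking Thunderbolt 2 at its 20 Gbit/s and a PCI-e 3.0 x16 slot at the nominal 32 GB/s quoted above):

[CODE]
# Thunderbolt 2 vs. PCI-e 3.0 x16, compared in the same units (gigabits per second)
tb2_gbps = 20                     # Thunderbolt 2
pcie_gbytes_per_s = 32            # PCI-e 3.0 x16, gigabytes per second (nominal)
pcie_gbps = pcie_gbytes_per_s * 8 # = 256 gigabits per second

ratio = pcie_gbps / tb2_gbps
print(f"A PCI-e 3.0 x16 slot has about {ratio:.1f}x the bandwidth of Thunderbolt 2")
# A PCI-e 3.0 x16 slot has about 12.8x the bandwidth of Thunderbolt 2
[/CODE]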

----------

Great 5K....

How soon can we actually take advantage of it? 4K is only just starting to show up on new displays, and I have not seen any movie in 4K yet. Not upscaled, but "true 4K" or "true 5K", since those are the real deal. Content is limited to the clips that come with the display and home movies, but that's about it.

Nothing spectacular but the resolution.

In case no one has ever bothered to tell you, computers generate their own content. Like the text that you're reading right now. It would be rendered at 5K, and thus be far clearer, if your monitor supported that resolution.
 
Why would they limit the screen size to 27"? Please, please, please offer a decent-sized screen, 30" or over, for a top-end machine.
 
Heck. I better hurry and buy a current 27" iMac!

Heck. I better hurry and buy a current 27" iMac with the 4GB NVIDIA card before Apple goes with AMD. My 2009 iMac with 256 MB of vRAM can't handle the new games.

NVIDIA GeForce GTX 780M 4GB GDDR5
 