5120x2880 may seem great if all you intend to do is admire your wallpapers all day or use Mac apps, but outside of that, most users won't be experiencing native 5K, since no 5K content is even available.

By "content" I guess you mean movies? Photographers will be able to use the resolution immediately. Even some point and shoot cameras have close to this resolution. Current high-end DSLRs exceed this substantially. (Nikon D800/D800E: 7,360 x 4,912).

Are users really going to zoom in all day on their photos to say "this is gorgeous"? Or would everyone rather just get their work done?

Photographers will use it to view and edit their pictures in native, or at least closer to native, resolution.

Yes, fonts are nice, but it's nothing to lose sleep over.

Nice-looking fonts are nice, but it is much more useful than that. It is more real estate. More lines of text. Bigger, better spreadsheet views. 5120x2880 will make it much easier to view and compare two documents side by side.
 
Can you
a) use this iMac in the HiDPI mode, meaning same screen real estate as in a 2560x1440 display but sharper everything
b) use it with a 5120x2880 resolution meaning huge screen real estate but tiny UI elements? Could you actually even use the menus at this resolution?
 
Assuming Apple does offer a 5K display, I'd gather it'd be around a Thunderbolt 3.0 announcement and release alongside a new Mac Pro (?).

Well, I've just been googling around and can't find any evidence that Thunderbolt 3 will actually support DisplayPort 1.3. The TB 3 rumours pre-date the announcement of DP 1.3. Nor are there any details of DP 1.3 capable GPUs - however, at least TB 3 has the bandwidth to actually shift enough data to drive a 5k display...

Also, I still suspect that the Thunderbolt Display's most important market is as a MacBook peripheral, and that Mac Pro customers are more likely to buy third-party "professional" displays.

I'm not sure that anybody has ruled out the possibility of running the forthcoming Dell 5k monitor - which supposedly uses two DisplayPort cables - off a Mac Pro, given that (a) nobody seems to have any details from Dell yet and (b) nobody has found any details of how the display in the 5k iMac is connected (it might be using the same approach).

Could Apple release a 4K display in the interim? It may not be business-savvy, though. :(

Problem is, under OS X, UHD is simply not the optimal resolution for a 27" display. Hence the 5k iMac.
 
Can you
a) use this iMac in the HiDPI mode, meaning same screen real estate as in a 2560x1440 display but sharper everything

I assume this is the default.

Bear in mind, though, that "real estate" is a piece of string. At least with Retina-compliant apps, any app with a "zoom" function will let you pack as much content into a window as your eyesight will cope with - it's only the menus, icons, window furniture and dialogs that will be the "same size" as on the existing 27" - and the existing 27" already has oodles of real estate.

b) use it with a 5120x2880 resolution meaning huge screen real estate but tiny UI elements? Could you actually even use the menus at this resolution?

There were hacks to do this on the rMBP so I'm sure they'll appear for the riMac. Probably unusable, though - icons and menus are not huge on the regular 27" display, and they'd be half the size in this mode.

Also, with other retina Macs & 4k displays, Apple has offered intermediate "scaled" modes that give you more "real estate" at the expense of some extra GPU load. So, on a 4k display, you can choose "2560x1440", which (I think) renders at 5120x2880 (pixel-doubled) internally and then downsamples to the native 3840x2160 - so it's far, far better quality than you're used to seeing when running a non-retina display at a non-native resolution.

We'll have to wait for reviews to see whether the i5k offers this (it would involve rendering at even more than 5k internally, so it might not be possible).
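
For what it's worth, here's a rough sketch of the arithmetic behind those scaled modes as I understand it (nothing official from Apple, and the resolutions are just examples):

```python
# A minimal sketch (assumed behaviour, not Apple's actual implementation) of
# how a HiDPI "scaled" mode works: the chosen "looks like" resolution is
# rendered pixel-doubled internally, then downsampled to the panel's
# native resolution.

def scaled_mode(looks_like, native):
    """Return the internal render size and the downsampling factor."""
    render = (looks_like[0] * 2, looks_like[1] * 2)   # 2x backing store
    factor = render[0] / native[0]                    # >1.0 means downsampling
    return render, factor

# "Looks like 2560x1440" on a UHD (3840x2160) panel:
print(scaled_mode((2560, 1440), (3840, 2160)))  # ((5120, 2880), ~1.33)

# The same mode on the 5K iMac panel is a straight 2x pixel doubling:
print(scaled_mode((2560, 1440), (5120, 2880)))  # ((5120, 2880), 1.0)

# A hypothetical "looks like 2880x1620" mode on the 5K panel would need a
# 5760x3240 backing store - more than 5K - which is why it might not be offered:
print(scaled_mode((2880, 1620), (5120, 2880)))  # ((5760, 3240), 1.125)
```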
 
IMHO, Maxwell kills these AMD chips while being much more power-efficient to boot. 2011 MBPs with AMD GPUs are dying left and right, while my 2012 rMBP with an Nvidia GPU (yes, they learned their lesson from 2008) is still running strong despite heavy gaming.

Well, it's great you have an opinion.

The takeaway from your opinion is that you're a dedicated Nvidia ... uhh "enthusiast" and that you like to make unsubstantiated claims in order to veil your opinion as some sort of fact.

No Maxwell mobile chip can run games properly on the 5K retina iMac. It's literally pointless to have included them *even* if it was an option for Apple, which it isn't.

The R9 series destroys Maxwell in OpenCL, that's a fact. And an important one to Apple.

I also find it amusing that you're gaming on a laptop. :D:cool::p
 
AMD GPUs and CPUs use a ridiculous amount of power compared to their Intel and Nvidia counterparts, which is fine for desktops but impractical for laptops. I personally think it was a mistake to go with AMD, as the R9 290X for desktops has trouble with 4K displays, so I'm guessing the mobile variant won't be any better. The better alternative would've been the 970 and 980, as they use less power and perform better than both of those AMD GPUs respectively.

No, that's actually just you talking out of your nether regions. To begin somewhere, let's just dismiss Intel "GPUs" from this discussion. They're incredibly inadequate and not even in the same league as AMD.

Also you seem to have gotten things backwards when it comes to the R9 290X and 4K displays, because that's actually where the R9 290X is the strongest, beating any Nvidia card that was available at the time of release.

There was and perhaps still is a problem with all Nvidia cards and 4K.

The "less power" thing is so dumb that you may want to reconsider your words. Less heat dissipation could matter, less efficiency could matter, but not less power.

It's like the idiots (not you of course) who claim AMD CPUs use "so much power" when, calculated out, it's a few extra cents a month in electricity compared to an Intel CPU, even if allowed to run 24/7.

Also the 980m and 970m were not an option for Apple to use in the retina iMac. So presenting that as a viable "option" doesn't make sense.
 
Well, it's great you have an opinion.

The takeaway from your opinion is that you're a dedicated Nvidia ... uhh "enthusiast" and that you like to make unsubstantiated claims in order to veil your opinion as some sort of fact.

No Maxwell mobile chip can run games properly on the 5K retina iMac. It's literally pointless to have included them *even* if it was an option for Apple, which it isn't.


I also find it amusing that you're gaming on a laptop. :D:cool::p

What I find amusing is you probably work for AMD. I do game while on the road, and I have a dedicated 780ti machine at home that no stock single-GPU AMD card can beat, even almost a year after release. Last time I checked, AMD is laying off a significant number of employees to boot, so I wonder who's more successful currently? Yes, while the 980M can't run 5K natively, it's still almost 50% faster than Tonga.

The R9 series destroys Maxwell in OpenCL, that's a fact. And an important one to Apple.

BS
Rubbish
The 290X winning 3/9 tests does not equal 'destroy'
Tests here
And Anandtech has usually been more of a Team Red website in the past...

It's not so much the extra power that AMD uses that's the issue. It's the amount of extra heat it produces in the tiny, cramped space the iMac has.
 
What I find amusing is you probably work for AMD.

Because anyone who likes AMD products must work for AMD? That is a silly statement.

I do game while on the road, and I have a dedicated 780ti machine at home that no stock single-GPU AMD card can beat, even almost a year after release. Last time I checked, AMD is laying off a significant number of employees to boot, so I wonder who's more successful currently?

Can you post a chip-for-chip comparison?

I'm not familiar with the mobile chips, but I am somewhat familiar with the high-end desktop chips from one to two years ago, when I did purchase comparisons, and, overall, the AMD 7970 beat Nvidia's then-current high end, and the high-end follow-on.

I assume that there is some reason that many FPS gamers (I'm not a gamer) love Nvidia. I'm just not sure what the reason is. It can't be raw performance, because, for quite a while, AMD was ahead of Nvidia overall. So, I assume that it is something else. I've been told that Nvidia drivers were designed to make the frame rate smoother and steadier (like a movie), while AMD drivers looked choppier (until recently). Is that it, or is it something else? In any case, why do Nvidia partisans hate AMD? Seems very counterproductive to me. Any consumer market benefits when there are at least two competing companies. You the consumer benefit from Nvidia and AMD competing -- you should hope that AMD stays in that market, even if you buy Nvidia.
 
By "content" I guess you mean movies?


All say "by "content" you mean movies . but then most of the post is about personal content (pictures/photos took by photograph's own camera's (which could be close to 4K, still jot native by any mean)

I'm talking here specifically not about "near to" anything, but about actual native 5K.

Which brings me to my next question: what is Apple thinking by releasing a great product (even if it's acceptable for lower-quality images) when there are no 5K movies you can get, and there won't be for years to come, since there aren't even 4K movies yet.....

So really, I look at this iMac and say, "this is of limited use."

By that logic, you could say the same about any "Retina" display and/or iOS too; in fact, anything that doesn't have Retina apps widely available could be called "limited use."

However, when I say "limited use" here, I MEAN limited use, because no one can watch a 5K movie at all for a few years yet.

Therefore, I could say this 5K iMac is much more limited in 5K usage than, say, an iPhone 5s/6/6 Plus, where you have a Retina display but at least you have a lot of "Retina" apps... Compared to this thin 5K iMac, how many 5K apps have you got? Apart from maybe Apple's own...


And how many developers will recode their apps to take advantage of 5K? We haven't seen this happen yet.

Although the quality is great, I would like to see 5K apps make an appearance before I change my mind.
 
Do you think Apple will bring out a Retina Cinema Display to go along with the Mac Pro? I really, really thought they would update the one they have.
 
All say "by "content" you mean movies . but then most of the post is about personal content (pictures/photos took by photograph's own camera's (which could be close to 4K, still jot native by any mean)

I'm talking here specifically not about "near to" anything, but about actual native 5K.

I apologize, but, I'm not quite following you.

The resolutions of two well-known premium, but not top-of-the-line, "professional" DSLRs that are widely available:

Canon EOS 5D Mark III: 5760 × 3840
Nikon D800/D800E: 7360 × 4912

Now, if you want to go to medium format cameras, which, in the digital world, are now all professional and very expensive, the Hasselblad H5D-50, for example, has a 6132 x 8176 sensor, and the H5D-60 is 6708 x 8956. But you could buy a luxury car for the price of these cameras. The Canon and Nikon, while not cheap, cost less than a good used car, and lots of amateur photographers have these or similar cameras.
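
To put rough numbers on it, here's a quick back-of-the-envelope comparison of those sensors against the 5K panel (pixel counts only, using the figures quoted in this thread):

```python
# Pixel counts of the cameras mentioned above vs. the 5K iMac panel.
# Figures are the ones quoted in this thread.
sensors = {
    "Canon EOS 5D Mark III": (5760, 3840),
    "Nikon D800/D800E": (7360, 4912),
    "Hasselblad H5D-50": (8176, 6132),
}
panel = 5120 * 2880  # ~14.7 million pixels

for name, (w, h) in sensors.items():
    megapixels = w * h / 1e6
    ratio = (w * h) / panel
    print(f"{name}: {megapixels:.1f} MP, {ratio:.1f}x the 5K panel's pixel count")
```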

Oh, and, in case you are wondering-- you can see the difference. Anyone with decent eyesight can tell the difference. Lots of people who say you can't tell the difference are just not visually-oriented people. They are the same people who looked at HD content on their VHS tape deck and said that there is no difference. ;)
 
Understood.



But the Mac Pro is the flagship Mac, and if an iMac is able to run a 5K display, so should a Mac Pro.


It doesn't work that way.

Apple was able to 'hack' a DisplayPort connection into supporting 5K resolution precisely because of the iMac's all-in-one nature. Since the iMac's screen doesn't have to serve as an external display, Apple is more or less free to ignore existing industry standards, which is exactly what they have done here.

Companies like Dell can't do this because the technology simply isn't there (DisplayPort 1.3 will only be out next year), so they must use DisplayPort 1.2 cables. The Mac Pro technically can support a 5K display; the caveat is that you have to plug in two Thunderbolt cables to run the display.
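
A rough back-of-the-envelope on the bandwidth involved (my own approximate figures; blanking intervals and protocol overhead are ignored, so the real requirement is a bit higher):

```python
# Approximate pixel-data rate for 5K@60Hz vs. what a single DisplayPort link
# can carry. Blanking and protocol overhead are ignored, so treat these as
# ballpark figures only.
def payload_gbps(width, height, refresh_hz, bits_per_pixel=24):
    return width * height * refresh_hz * bits_per_pixel / 1e9

need_5k60 = payload_gbps(5120, 2880, 60)   # ~21.2 Gbit/s of raw pixel data
dp12_link = 4 * 5.4 * 8 / 10               # DP 1.2: 4 lanes x HBR2, 8b/10b -> ~17.3 Gbit/s
dp13_link = 4 * 8.1 * 8 / 10               # DP 1.3: 4 lanes x HBR3 -> ~25.9 Gbit/s

print(need_5k60, dp12_link, dp13_link)
# A single DP 1.2 link falls short of 5K@60, hence the two-cable approach
# (or something outside the current standard); DP 1.3 would have the headroom.
```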
 
It doesn't work that way.

Apple was able to 'hack' a DisplayPort connection into supporting 5K resolution precisely because of the iMac's all-in-one nature. Since the iMac's screen doesn't have to serve as an external display, Apple is more or less free to ignore existing industry standards, which is exactly what they have done here.

Companies like Dell can't do this because the technology simply isn't there (DisplayPort 1.3 will only be out next year), so they must use DisplayPort 1.2 cables. The Mac Pro technically can support a 5K display; the caveat is that you have to plug in two Thunderbolt cables to run the display.

I totally understand this.

Remember, the original Apple Cinema Display used the ADC connector on both the Mac and the display. In the same way, Apple could push the technology through to enable 5K on the Mac Pro, but I guess they're OK with waiting for someone else to do it for them.

Apple made the Mac 'Pro' less 'Pro' than the 5K iMac with regard to display capability.
 
I totally understand this.



Remember, the original Apple Cinema Display used the ADC connector on both the Mac and the display. In the same way, Apple could push the technology through to enable 5K on the Mac Pro, but I guess they're OK with waiting for someone else to do it for them.



Apple made the Mac 'Pro' less 'Pro' than the 5K iMac with regard to display capability.


Not by any fault or malice. Apple too is constrained by the technologies available. I am not familiar with the ADC connector you talk about. Care to elaborate?
 
Not by any fault or malice. Apple too is constrained by the technologies available. I am not familiar with the ADC connector you talk about. Care to elaborate?

That's my point. Apple is letting external technologies dictate features of their product lineup, when in the past they developed their own. In reality, though, it has been to Apple's benefit to use industry standards for connections (DVI, USB, etc.). The challenge is that a 5K display is clearly possible; we're just waiting on the 'industry' to standardize the connection.

Here is a link to the ADC connector: http://en.wikipedia.org/wiki/Apple_Display_Connector
 
So, 5K came to a "non-pro" machine first? What's wrong with this picture?

It's a great display now; it just didn't really cut it before. At least that's how users will see the previous iMac models anyway.

Apple should have invested the money in an external 5K display instead, which is what pro users would be more likely to use for the most part, and integrated a Thunderbolt 3 port into a Mac Pro update...

Pro users would see more benefit from 5K there... I would have liked the iMac as a Retina display more than as a 5K one... It's too early for that yet.

I reckon Apple stuffed up big time here.

They like to stick better stuff in everything, don't they, without ever asking "Does this actually make sense, and how many users will really experience it?"
 
So, as 2013 Mac Pro owners, we now need:

  • Apple Thunderbolt 2, 4K display
  • Option to upgrade TB2 on our Mac Pros (in-store or user-fit) with DP 1.3
  • If above, then external 5K Apple display

Otherwise we would feel somewhat left out. In fact, I already felt somewhat disappointed when Apple showed their device lineup in the event video, which didn't have the Mac Pro in it; it started with the iPhone and ended with an iMac?? What does that mean? Is the 2013 Mac Pro a one-off?
 
What I find amusing is you probably work for AMD. I do game while on the road, and I have a dedicated 780ti machine at home that no stock single-GPU AMD card can beat, even almost a year after release. Last time I checked, AMD is laying off a significant number of employees to boot, so I wonder who's more successful currently? Yes, while the 980M can't run 5K natively, it's still almost 50% faster than Tonga.



BS
Rubbish
The 290X winning 3/9 tests does not equal 'destroy'
Tests here
And Anandtech has usually been more of a Team Red website in the past...

It's not so much the extra power that AMD uses that's the issue. It's the amount of extra heat it produces in the tiny, cramped space the iMac has.

What I find amusing is that you're so self-indoctrinated that you'd rather fantasize about some guy on the internet being paid by or working for some company instead of you being... wrong.

Surely, that makes sense!

Disclosure: I do not work for AMD, nor have I ever.

If you find vindication in the layoffs of people, some 700 people, then you are truly a sick, sick fanboi beyond any reasonable debate. :confused:

You'll find that despite your best efforts at fabricating imagined benefits of one chipmaker over another (and frequently confusing mobile chips with desktop chips in order to make your logic work), there is an AMD chip in the new Retina iMac.

That is the reality. The only reality that matters. :cool::apple:

And the benchmarks are so incredibly inconsistent with the 980 that it's downright suspicious. For instance the R9 290X is consistently a high-scoring chip for OpenCL. The 980? Sometimes.

Sometimes it is, sometimes curiously not. And this is the best Nvidia has to offer.
 
Buy AppleCare. When it breaks, take it in for repairs.
TB has established that the future will be every feature in its own box with its own power supply. Given that, it's much nicer that when your display breaks, you don't have to take your computer in for repairs; you can just keep using it with another screen.
Why isn't DP 1.3 available yet? The standard was approved a year ago. New designs should be using DP 1.3.
Nope, DP 1.3 was finalized one month ago.
There is no STANDARD in the industry that is officially out yet that supports 5K displays.
Try DP 1.3. Maybe it's standard enough even for you?
 
Nope, DP 1.3 was finalized one month ago.
You are correct that it was adopted a month ago, although I think most of the specs were out in September of 2013. And, they say don't expect products until 2015. I don't understand why there hasn't been more urgency on getting DP 1.3 to market. It is silly that graphics chips, and monitors, have the raw capability, but, the standard interconnect is lagging. And HDMI is lagging even more.
 
I could then go out and buy a 32" 4K display for $500 and only have $1300 in it for what amounts to the exact same computer (I realize it'd be 4K display vs 5K). But fully upgradeable and future proofed.

You should also realise that $500 buys you a crappy TN 4K display, not a wide-viewing-angle IPS that can be calibrated. Won't matter for gaming, but for anything serious you'd want to avoid a TN.
 
You are correct that it was adopted a month ago, although I think most of the specs were out in September of 2013. And, they say don't expect products until 2015. I don't understand why there hasn't been more urgency on getting DP 1.3 to market. It is silly that graphics chips, and monitors, have the raw capability, but, the standard interconnect is lagging. And HDMI is lagging even more.
I'd guess the reasons are pretty much the same ones that always come up here on MR when talking about Macs vs. other computers: why Apple adopted USB 3 last and Thunderbolt first, etc.
Different players in the industry have different needs. Some need to be first just for PR reasons, others to gain market share; some can wait for the optimal economic and technical balance.
Display makers usually want to be first, and that usually requires specific graphics cards. Now that Apple has adopted no-upgrading-at-all, they will be late in adopting new standards in their accessories, since they need lots of main products (like Macs) to benefit from those standards.

One thing that is changing product development is the trend toward controlling a product's lifespan. Macs are designed to be replaced in three years, and that's why it's not reasonable to make them capable of adopting some new standard that will only become mainstream two years in the future.
Another thing is that manufacturers try to control how their products are used. Samsung wants you to buy another screen for your computer, and because of that, doesn't put a DisplayPort input in their bigger screens.
 
- Is there any advantage to having both pairs match each other? Is there anything wrong with 2x4GB next to 2x8GB for a total of 24GB?

No, I believe that shouldn't cause any issues. But if you do plan to max out later on, you will get stuck with pretty useless 4GB chips (surprisingly hard to resell).
There is no cost involved, though, so giving them away would be fine - it's just an issue of whether or not to contribute to landfill.

Cost of 16GB Apple factory upgrade: $200
Total RAM: 16GB

Cost of 16GB 3rd party upgrade: $200
Total RAM: 24GB

The 24GB will be "enough" for a bit longer than the 16GB will, so the opportunity cost of spending the remaining $200 (or maybe even less in the future) to do the final upgrade to 32GB will be lower for the second option as well, since you can wait longer before you feel the need to do it. Having useless 4GB chips when all is said and done involves no cost and is a fairly minor nuisance compared to the advantages. Also, you probably want to keep those chips for warranty repairs until your machine is out of warranty at least (and maybe longer if you plan to continue to use Apple service).

On the other hand, if your goal is to have 32GB very soon, then the best option is to do the factory upgrade to 16GB (to save the landfill) and then add 16GB 3rd party. It's still a $400 total expenditure (half to Apple, half to 3rd party), but why bother with wasting chips.

The other issue to consider is warranty service. If you ever have to send the machine in then you pretty much have to remove all 3rd party RAM. The least hassle of having to juggle static-sensitive chips in that case would be:

8GB:
- stock RAM
- total cost $0
- no swapping issues for repairs
- wasted chips only if you later go up to 32GB

16GB (option A):
- 8GB stock + 8GB aftermarket
- total cost $100 (cheapest day 0 cost for 16GB)
- have to remove second 8GB for repairs
- wasted chips (and money) for any upgrade beyond 16GB

16GB (option B):
- 16GB factory upgrade
- total cost $200 (same as next 24GB option)
- no swapping issues for repairs
- no wasted chips for future upgrades

24GB:
- 8GB stock + 16GB aftermarket
- total cost $200 (same as previous 16GB option)
- have to remove 16GB chips for repairs
- minor waste if you ever upgrade to 32GB

32GB:
- 16GB factory upgrade + 16GB aftermarket
- total cost $400 (cheaper than factory 32GB upgrade)
- have to remove second 16GB chips for repairs
- no wasted chips

I don't think I'd recommend anyone do the 16GB option B above, since it is more expensive than option A if today's dollars matter to you and/or you think you will never need more than 16GB, and it leaves you with less memory than the 24GB option with little advantage other than not having to pull some chips for the off chance of a repair. The only time it would make sense is if you are actually using it as an ordering-configuration stepping stone to get to 32GB immediately or very soon, as in the last option. If you are going to "hold off" on getting to 32GB for any appreciable amount of time, then the 24GB option offers you more time to "hold off" with only minor waste. And any time you can add to the "hold off" period just makes the eventual upgrade even cheaper (whether by price drops or by the opportunity cost of holding on to the money).
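
If it helps, here's the same comparison boiled down to dollars per added gigabyte, using the prices assumed above ($200 for Apple's 16GB factory upgrade, roughly $100 per aftermarket 8GB pair):

```python
# Summary of the upgrade options above. Prices are the ones assumed in this
# thread, not current market prices.
options = {
    "8GB stock": {"cost": 0, "total_gb": 8},
    "16GB A (8 stock + 8 aftermarket)": {"cost": 100, "total_gb": 16},
    "16GB B (factory upgrade)": {"cost": 200, "total_gb": 16},
    "24GB (8 stock + 16 aftermarket)": {"cost": 200, "total_gb": 24},
    "32GB (16 factory + 16 aftermarket)": {"cost": 400, "total_gb": 32},
}

for name, opt in options.items():
    added = opt["total_gb"] - 8                       # GB added beyond the stock 8GB
    per_gb = opt["cost"] / added if added else 0.0    # dollars per added GB
    print(f"{name}: ${opt['cost']}, {opt['total_gb']}GB total, ${per_gb:.1f}/added GB")
```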
 