
MacSumo

macrumors regular
Original poster
Nov 26, 2013
129
0
But it's 2880x1800 natively; it gets scaled down to look nice and have an appropriate size.

The problem is the hardware can't handle doing that. Simple tasks like web browsing cause stutters and lags. Trackpad becomes unresponsive. And the retina gimmick blurs UI assets on various non-retina applications.

A 2880x1800 resolution is most useful when the hardware can handle it, is energy efficient, and the display is large enough to display the text legibly at native resolutions.

Otherwise, Apple is playing tricks to sample visual data across resolutions so it can be packed into 1920x1200. The end result is incompatibility with a lot of software.

The introduction of "Retina" has, on average, decreased quality more than it has increased it. If all you do is use Safari (or an equivalent browser), look at pictures, and type in iWork Pages, then you're not the type of user who will care.
 

Macshroomer

macrumors 65816
Dec 6, 2009
1,301
730
I've been here 12 years, as you can see from the join date on my avatar, and I don't think it's crazy to call it a "gimmick," although it is a bit of an exaggeration. It's actually absurd that it took retina resolutions for us to have a 1920x1200 resolution, and given the graphics overhead, I would be very tempted by a MBP that offered that resolution natively. I don't mind a little pixelation in my fonts, and I do mind the lag and headaches (e.g., tiny images, windows and the like that don't appear right in virtual machines without tweaking, etc.).

I'm pretty in touch with reality. I just have a view that's a bit more in line with the guy you were dogging.

I kind of agree with this. I am halfway in between: I like the Retina display, and it is impressive, but I question whether I really need it.

For what it is worth, I have had no issues getting pro work done on MacBook Pros since before they were called that. I made thousands of dollars of income from a Ti-400, if that gives you any idea.

The origin of this post is silly. If you are asking this question now, not in 1995, you are being provocative and nothing more.

----------

The problem is the hardware can't handle doing that. Simple tasks like web browsing cause stutters and lags. Trackpad becomes unresponsive. And the retina gimmick blurs UI assets on various non-retina applications.

I'm getting NONE of this on my i7/16/1TB 13". And I do mean none.
 
Last edited:

Quu

macrumors 68040
Apr 2, 2007
3,420
6,792
The problem is the hardware can't handle doing that. Simple tasks like web browsing cause stutters and lags. Trackpad becomes unresponsive. And the retina gimmick blurs UI assets on various non-retina applications.

A 2880x1800 resolution is most useful when the hardware can handle it, is energy efficient, and the display is large enough to display the text legibly at native resolutions.

Otherwise, Apple is playing tricks to sample visual data across resolutions so it can be packed into 1920x1200. The end result is incompatibility with a lot of software.

The introduction of "Retina" has, on average, decreased quality more than it has increased it. If all you do is use Safari (or an equivalent browser), look at pictures, and type in iWork Pages, then you're not the type of user who will care.

It's true and it's sad. And this is from the perspective of someone who bought the absolute best, highest-spec 15" Retina available. My only criticism of the system, honestly, is with the screen.

I will say this: at the native "best for Retina" setting it looks fantastic. But that working area is just too small to get meaningful work done.
 

sinc26

macrumors newbie
Nov 25, 2013
16
0
The problem is the hardware can't handle doing that. Simple tasks like web browsing cause stutters and lags. Trackpad becomes unresponsive. And the retina gimmick blurs UI assets on various non-retina applications.
From the many owners who've commented here and dozens of online reviews, it seems you're overstating how much the machine slows down with scaling.
People think you're trolling because you call Retina a "gimmick." It's absurd to think high DPI displays are "gimmicky" in any sense of the word. "Oh, some applications don't yet support it!" does not make high DPI suddenly useless.
A 2880x1800 resolution is most useful when the hardware can handle it, is energy efficient,
The hardware apparently can handle it and is energy efficient, given the MBP's decent battery life.
and the display is large enough to display the text legibly at native resolutions.
You mean a non scaled 2880x1800 resolution on a >25" display? I'm not sure how that's even vaguely related to laptop discussion.
Otherwise, Apple is playing tricks to sample visual data across resolutions so it can be packed into 1920x1200. The end result is incompatibility with a lot of software.
Apple wrote their own scaling algorithms to best ensure compatibility with applications not yet made for Retina. I really don't know what you're trying to say here. Would you prefer non-scaled, tiny UI elements until every developer updated their applications to support high DPI? This is a major issue Windows 8.1 and Linux DEs are having.
The introduction of "Retina" has, on average, decreased quality more than it has increased it. If all you do is use Safari (or an equivalent browser), look at pictures, and type in iWork Pages, then you're not the type of user who will care.
Oh please, anyone who reads text will appreciate the significantly better text rendering. As far as I'm aware, the only people who shouldn't use the Retina display for work are those who do photo work, since they need a wide-gamut display for print.
 

PDFierro

macrumors 68040
Sep 8, 2009
3,932
111
Oh please, anyone who reads text will appreciate the significantly better text rendering. As far as I'm aware, the only people who shouldn't use the Retina display for work are those who do photo work, since they need a wide-gamut display for print.

Bingo. I'm a writer and as such work with lots of text. No way has this decreased quality for me. It has made everything so much better.
 

Supercell

macrumors member
Feb 19, 2011
48
0
Are the heat and screen "issues" any problem when doing development on the new rMBP 15"? Mainly iOS and Android development. Having multiple projects open at once? Multiple browser tabs etc.

I'm considering buying the new rMBP 15" soon.
 

Wishbrah

macrumors regular
Oct 20, 2013
235
8
What's the point of his post? His argument quickly boils down to this: not all of a professional printer's work can be accomplished with only one laptop.

Definitely more of a rant than a legitimate comparison.
 
Last edited:

MacSumo

macrumors regular
Original poster
Nov 26, 2013
129
0
Are the heat and screen "issues" any problem when doing development on the new rMBP 15"? Mainly iOS and Android development. Having multiple projects open at once? Multiple browser tabs etc.

I'm considering buying the new rMBP 15" soon.

The screen might not bother you. You may not even notice if you haven't ever studied the technology behind monitors.

As for development, if you utilize your CPU resources beyond, say 60%, your CPU temps will definitely reach 80C.

With general use, it will hover around 70C. Even at this temp, it is not comfortable to be placed on your lap, and it should never be placed there if it is "burning" at 90C+, which is very close to its Tjunction of 100C.

Anything above 70C, and the keyboard becomes too warm and uncomfortable. 100C WILL be hot to the touch if you place your fingers below the vent, and the number keys will also be very, very warm. It is an uncomfortable experience, and one shouldn't leave their hands on the top portion of the body. It can become a hazard.

----------

From the many owners who've commented here and dozens of online reviews, it seems you're overstating how much the machine slows down with scaling.
People think you're trolling because you call Retina a "gimmick." It's absurd to think high DPI displays are "gimmicky" in any sense of the word. "Oh, some applications don't yet support it!" does not make high DPI suddenly useless.

The hardware apparently can handle it and is energy efficient, given the MBP's decent battery life.

You mean a non scaled 2880x1800 resolution on a >25" display? I'm not sure how that's even vaguely related to laptop discussion.

Apple wrote their own scaling algorithms to best ensure compatibility with applications not yet made for Retina. I really don't know what you're trying to say here. Would you prefer non-scaled, tiny UI elements until every developer updated their applications to support high DPI? This is a major issue Windows 8.1 and Linux DEs are having.

Oh please, anyone who reads text will appreciate the significantly better text rendering. As far as I'm aware, the only people who shouldn't use the Retina display for work are those who do photo work, since they need a wide-gamut display for print.

I've personally tested many rMBPs (late 2013), and they suffer from stutter issues, especially when browsing. I cannot make my statement simpler than that.

It's even on the front page of arstechnica, right now: http://arstechnica.com/apple/2013/1...h-out-our-biggest-gripes-with-os-x-mavericks/

So please stop saying that I'm lying or exaggerating.
 

Meister

Suspended
Oct 10, 2013
5,456
4,310

Ryan1524

macrumors 68020
Apr 9, 2003
2,093
1,421
Canada GTA
I have not experienced this stutter.

The only time scrolling is a bit laggy is when I'm using a Logitech wireless mouse, which supposedly has a bad driver.

When scrolling with the trackpad, it's smooth as butter.
 

sinc26

macrumors newbie
Nov 25, 2013
16
0
I've personally tested many rMBPs (late 2013), and they suffer from stutter issues, especially when browsing. I cannot make my statement simpler than that.

It's even on the front page of arstechnica, right now: http://arstechnica.com/apple/2013/1...h-out-our-biggest-gripes-with-os-x-mavericks/

So please stop saying that I'm lying or exaggerating.
Your arstechnica link is not about rMBP-specific stuttering. It's about a stuttering issue in Mavericks and developers not yet implementing Mavericks' responsive scrolling feature, which reportedly works very well. This is a software issue and should be improved upon.

Earlier versions of Safari/Webkit did have framerate issues with scrolling, but these framerates have been dramatically increased via software updates (e.g. http://anandtech.com/show/6495/late...erformance-on-macbook-pro-with-retina-display) and by Haswell's GPU improvements. Many people report that scrolling performance is fine on the Haswell rMBP, which is why I get the impression you're exaggerating for the sake of argument.
 

Wishbrah

macrumors regular
Oct 20, 2013
235
8
1) The screen might not bother you. You may not even notice if you haven't ever studied the technology behind monitors.

2) As for development, if you utilize your CPU resources beyond, say 60%, your CPU temps will definitely reach 80C.

3) With general use, it will hover around 70C. Even at this temp, it is not comfortable to be placed on your lap, and it should never be placed there if it is "burning" at 90C+, which is very close to its Tjunction of 100C.

4) Anything above 70C, and the keyboard becomes too warm and uncomfortable. 100C WILL be hot to the touch if you place your fingers below the vent, and the number keys will also be very, very warm. It is an uncomfortable experience, and one shouldn't leave their hands on the top portion of the body. It can become a hazard.

----------



5) I've personally tested many rMBPs (late 2013), and they suffer from stutter issues, especially when browsing. I cannot make my statement simpler than that.

6) It's even on the front page of arstechnica, right now: http://arstechnica.com/apple/2013/1...h-out-our-biggest-gripes-with-os-x-mavericks/

7) So please stop saying that I'm lying or exaggerating.

contribootin' to roll bread

1) Let me translate what sumo is saying: "Only if you have studied the technology behind the monitor can you actually see the monitor. If you haven't, then you're ignorant." Lol?

2) "Beyond, say 60%" and "will definitely reach 80C"... please give a source? If not, then the hot air you're blowing out is certainly hotter than the Mac's.

3) What's general use? Where do you come up with 70C? Your own, one-of-a-kind experience? Also, comfortable is relative, and if the CPU is at 90C, the case is obviously not at 90C.

"Very close to its Tjunction of 100C"? Define "very close." I guess you're implying that being 10 degrees away from its Tjunction max is supposed to degrade the cores? If the CPU is WITHIN its thermal specifications (which, at 90C, it is), then why wouldn't it get a 10-year lifespan? If 90C were bad for the CPU, then its Tjunction max, which is calculated by Intel, would be 90C. Please link a study showing temperature vs. long-term reliability.

4) Your opinion, cool story.

5) How many is "many"? 3? 5?

6) Not relevant.

7) Misinformed?
 

john123

macrumors 68030
Jul 20, 2001
2,581
1,535
As for development, if you utilize your CPU resources beyond, say 60%, your CPU temps will definitely reach 80C.

With general use, it will hover around 70C. Even at this temp, it is not comfortable to be placed on your lap, and it should never be placed there if it is "burning" at 90C+, which is very close to its Tjunction of 100C.

So please stop saying that I'm lying or exaggerating.

I wanted to see if you were, in fact, lying or exaggerating. So, I triggered about a dozen instances of yes > /dev/null to max out every core on my 2012 15" Retina MacBook Pro. (For anyone who wants to replicate this at home, eight processes should do it—two for each core, due to HyperThreading, but there's no harm in kicking off a couple extras.)
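
Roughly, the whole test looks like this in Terminal (a sketch of the idea rather than my exact session; it only uses the stock yes and killall commands):

# Kick off eight busy-loop processes, one per logical core on a quad-core CPU with HyperThreading.
for i in 1 2 3 4 5 6 7 8; do
    yes > /dev/null &
done

# ...let it run and watch the fans and temperatures for a while, then stop the load:
killall yes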

I let it run for a while too. On average, the fans spun up between 4800rpm and 5300rpm. The CPU generally stayed around 46 degrees Celsius, +/- 2 degrees. That's a 34 degree difference from what you claimed. In other words, you alleged that the CPU runs 78% hotter than it actually does.

At "general use," the temperature came down to 39-40 degrees Celsius.

I'm not going to weigh in on whether you're lying, even though you brought it up. I'll just say that the evidence suggests you are either woefully misinformed or in possession of a defective computer.

----------

7) Misinformed?

This one word really seems to sum it all up, doesn't it?
 

leman

macrumors Core
Oct 14, 2008
19,183
19,030
Heck, to the last point, the rMBP 15" is 1920x1200. That is in no way "Retina" the way most people think of high resolution.

You have no idea what you are talking about. Please go read up on the OS X developer documentation and how graphics work (it's all free on Apple's website) and then come back. Furthermore:

The problem is the hardware can't handle doing that. Simple tasks like web browsing cause stutters and lags. Trackpad becomes unresponsive. And the retina gimmick blurs UI assets on various non-retina applications.

You are either consciously lying or just blatantly misinformed. I have been a user of a 2012 rMBP for over a year now. What you say is definitely wrong. The trackpad issues were a driver bug and have been fixed, AFAIK. Retina is supported by virtually every properly coded Cocoa application. Legacy apps (which do not use Cocoa) might be problematic, and Cocoa apps that do custom rendering would need an update. Again, if the application is implemented following the resolution-independence guidelines (which have been available since Tiger and were more or less finalised by 10.7), it will display correctly. As to 'hardware can't handle that': drawing to a desktop basically involves copying textured rectangles around. A last-gen Intel IGP can play Crysis at OK-ish settings with acceptable performance. Yet you claim that this hardware - which can push millions of textured triangles with advanced per-fragment computations per second - has problems with several hundred textured rectangles per second? Ridiculous.

Is rendering a HiDPI desktop more taxing for the hardware? Undoubtedly. Does a non-HiDPI mode have better performance? Certainly. Should application devs take more care to optimise their applications in HiDPI mode? Absolutely. That is all.
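
To put rough numbers on the "packed to 1920x1200" scaled mode being argued about here (a back-of-the-envelope sketch of how OS X's HiDPI scaled modes are commonly described, not a claim about Apple's internal implementation): the desktop is drawn into a 2x backing store and then downsampled to the physical panel.

# The "looks like 1920x1200" setting on the 15" rMBP, back-of-the-envelope:
echo "backing store:  $((1920 * 2)) x $((1200 * 2))"          # drawn at 2x: 3840 x 2400
echo "physical panel: 2880 x 1800"
echo "downsample:     $(echo "scale=2; 2880 / 3840" | bc)"    # prints .75 - each dimension shrunk to 3/4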

A 2880x1800 resolution is most useful when the hardware can handle it, is energy efficient, and the display is large enough to display the text legibly at native resolutions.

Again, you are blatantly ignoring what Retina screens are all about - true supersampling. It is the only long-term solution to the dreaded pixel aliasing. People have been using hacks like subpixel rendering to combat it; high-resolution screens provide the real, simple way to eliminate it altogether. Make a 30" 2880x1800 screen and you will have jagged lines everywhere. Apple is building the path to a new generation of display technology - where resolution stops being a core spec and turns into a less relevant technical detail.

With general use, it will hover around 70C. Even at this temp, it is not comfortable to be placed on your lap, and it should never be placed there if it is "burning" at 90C+, which is very close to its Tjunction of 100C. Anything above 70C, and the keyboard becomes too warm and uncomfortable. 100C WILL be hot to the touch if you place your fingers below the vent, and the number keys will also be very, very warm. It is an uncomfortable experience, and one shouldn't leave their hands on the top portion of the body. It can become a hazard.

I am pushing my machine to 100% CPU (every core) every day for multiple hours, and I have yet to experience any failure or instability. I also play games while having the laptop on my lap. Yes, it gets warm. If you get burns from that level of warmth, however, then you should get your skin checked out by a professional - you might suffer from some sort of rare hypersensitivity. I admit - occasionally the laptop can get to around 40-45C (with heavy GPU usage on a hot summer day), which can give you a second-degree burn - after several hours of uninterrupted exposure.

I've personally tested many rMBPs (late 2013), and they suffer from stutter issues, especially when browsing. I cannot make my statement simpler than that.

Oh, yes, the irrefutable argument. Well, I have also tested many rMBPs and they are performing satisfactorily. See what I did there?

It's even on the front page of arstechnica, right now: http://arstechnica.com/apple/2013/1...h-out-our-biggest-gripes-with-os-x-mavericks/

Now, this bit is really funny. You never even READ the article. They complain about Mavericks' new scrolling implementation bugging out in third-party applications. There is no mention of the Retina display. They are not even using an rMBP; they are on a 2012 iMac. You should go into politics, honestly.

So please stop saying that I'm lying or exaggerating.

Your posts are a combination of wishful thinking, incorrect technical claims, false citations (like the arstechnica article above), and boasting about having tested an unrealistically large number of laptops - so please do tell us whether you are lying or exaggerating.
 
Last edited:

joe-h2o

macrumors 6502a
Jun 24, 2012
997
445
It used to be a "pro" when

- we had the choice to upgrade them ourselves (HD, RAM)
- we had a line-in port (this was very useful for music producers)
- we had a matte screen (perfect for photographers and Photoshop people)

Now you get a MacBook Air i7 for cheaper than the Pro and it outperforms a Retina Pro... so what's the point of buying a heavier rMBP with less battery life?

A waste of money, considering that Apple is getting rid of Retinas due to old stock before they move to the new IGZO displays. Not to mention Retina is very bad for the eyes.


Haha. A line-in port being useful for professional music producers. You're funny!

(For those not getting that, professional music producers will be using external sound cards).

Retina display bad for the eyes? News just in! High PPI displays are bad for you for... reasons.

Good troll, bro. 4/10, would read again.
 

FrozenDarkness

macrumors 68000
Mar 21, 2009
1,727
968
I can't speak to tinting, as none of my MacBook Pros have it, and image retention has always been a problem.

A true photographer, or anybody who cares about color accuracy, for example, would own a $1k-$2k super-accurate display. That in itself is already half the cost of a MacBook Pro. There are no laptops in production today that are color-accurate enough for a real photographer to rely on. None.
 

joe-h2o

macrumors 6502a
Jun 24, 2012
997
445
I can't speak to tinting, as none of my MacBook Pros have it, and image retention has always been a problem.

A true photographer, or anybody who cares about color accuracy, for example, would own a $1k-$2k super-accurate display. That in itself is already half the cost of a MacBook Pro. There are no laptops in production today that are color-accurate enough for a real photographer to rely on. None.

To further this point, OP, show us what laptop you would recommend that has the necessary colour accuracy (you earlier confused colour gamut with colour accuracy, but I'll be generous and assume you know what you're talking about) for a professional photographer to use.

Hard mode: do it for the same price as the current Macbook Pro.

Spoiler: there aren't any.
 

MacSumo

macrumors regular
Original poster
Nov 26, 2013
129
0
I wanted to see if you were, in fact, lying or exaggerating. So, I triggered about a dozen instances of yes > /dev/null to max out every core on my 2012 15" Retina MacBook Pro. (For anyone who wants to replicate this at home, eight processes should do it—two for each core, due to HyperThreading, but there's no harm in kicking off a couple extras.)

I let it run for a while too. On average, the fans spun up between 4800rpm and 5300rpm. The CPU generally stayed around 46 degrees Celsius, +/- 2 degrees. That's a 34 degree difference from what you claimed. In other words, you alleged that the CPU runs 78% hotter than it actually does.

At "general use," the temperature came down to 39-40 degrees Celsius.

I'm not going to weigh in on whether you're lying, even though you brought it up. I'll just say that the evidence suggests you are either woefully misinformed or in possession of a defective computer.

----------



This one word really seems to sum it all up, doesn't it?

I said you must run After Effects CC, and render a complex project.

If you don't want to do that, at least run a benchmark like Geekbench.
 

john123

macrumors 68030
Jul 20, 2001
2,581
1,535
I said you must run After Effects CC, and render a complex project.

If you don't want to do that, at least run a benchmark like Geekbench.

I guess you don't know what yes > /dev/null does.

FrozenDarkness hit the nail on the head. What I described hits the CPU as hard or harder than anything you can throw at it—around 99% user utilization, with the remaining 1% just keeping critical system processes going.

I stand by my previous claim: you either have no idea what you're talking about, or you're deliberately making stuff up.

EDIT: It occurs to me that the OP maybe doesn’t understand the difference between CPU and GPU, or perhaps how they are related—although it still doesn’t make his claim accurate.

Using my 650M GPU by playing a game, the heatsink temperature rises to about 62 degrees C. Note that the game itself is only using about 15-20% of my total CPU power. The high temperature is an artifact of the spillover from the GPU.

Add on some yes > /dev/null processes, and those fans will spike to the max 6000 rpm, but that actually causes the CPU temperature to drop to around 57-58 degrees C. (The GPU is around 65 degrees, and the GPU diode is around 73 degrees).

So, pretty much no matter what, I can’t come anywhere close to the CPU temperature allegations he makes.

EDIT #2: Maybe he Googled temperature stuff and found threads pertaining to older models? Older models did heat up a lot more than the Retinas. Or maybe I should just stop trying to find a rational explanation for irrational nonsense.
 
Last edited:

Acronyc

macrumors 6502a
Jan 24, 2011
905
392
Thank you! Finally, someone who chimes in with experience, too.

I've tested tons of MacBook Pro Retinas, and they suffer from the problem you described. The unibody of the laptop is either uncomfortably cold when not in use or uncomfortably hot when in actual use, like you're doing.

I bet your temps are near 70C when working in Photoshop and Illustrator. You are very lucky you don't have to do any video editing work, because your temps will skyrocket.

Anyone reading this post should test it. Go download a trial version of After Effects CC. Then create a new composition and add a complex fractal that generates frames for up to 10 minutes. Now hit render and watch your MacBook Pro BURN. I will pay anyone here $20 via PayPal if you can render a project file I give you on a MacBook Pro without your CPU temps going above 90C. You have to take a video of it and include your username while videotaping.

The terms of the contest are:

1) You must completely render the project file I give you in After Effects CC. You're only allowed to load it, add it to the render queue, and hit render.
2) You must have Temperature Gauge Pro running while recording. I must see your temps in real time. If any one of your temps exceeds 90C for more than 30 seconds, you are disqualified. I will be looking at the temperatures of all of your cores, along with the average core temp.
3) Any editing of the recorded video will lead to disqualification
4) If any of your temps reach or exceed 100C, you are disqualified

Wow, you sure have a lot of time on your hands.

If you don't like Apple's computers so much, why don't you start a computer company and research, design, test, and build notebook computers yourself? That way you could get just what you want.
 

MacSumo

macrumors regular
Original poster
Nov 26, 2013
129
0
FrozenDarkness hit the nail on the head. What I described hits the CPU as hard or harder than anything you can throw at it—around 99% user utilization, with the remaining 1% just keeping critical system processes going.

I stand by my previous claim: you either have no idea what you're talking about, or you're deliberately making stuff up.

EDIT: It occurs to me that the OP maybe doesn’t understand the difference between CPU and GPU, or perhaps how they are related—although it still doesn’t make his claim accurate.

Using my 650M GPU by playing a game, the heatsink temperature rises to about 62 degrees C. Note that the game itself is only using about 15-20% of my total CPU power. The high temperature is an artifact of the spillover from the GPU.

Add on some yes > /dev/null processes, and those fans will spike to the max 6000 rpm, but that actually causes the CPU temperature to drop to around 57-58 degrees C. (The GPU is around 65 degrees, and the GPU diode is around 73 degrees).

So, pretty much no matter what, I can’t come anywhere close to the CPU temperature allegations he makes.

EDIT #2: Maybe he Googled temperature stuff and found threads pertaining to older models? Older models did heat up a lot more than the Retinas. Or maybe I should just stop trying to find a rational explanation for irrational nonsense.

I obviously know the difference between a CPU and a GPU.

And I know what the "yes" command does; I use a variety of Linux distros, after all.

You can run 8 terminals with "yes", and then run Geekbench separately. Check the temperature of your system. You will see that Geekbench increases the temps more than the yes command does.
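
If you want to watch the temps from the terminal while the load or benchmark runs, something like the sketch below works. It assumes your OS X install includes powermetrics with the smc sampler; if it doesn't, any GUI temperature utility does the same job.

# Sample CPU die temperature and fan speed once a second, ten times,
# while the yes processes or Geekbench run in another window.
sudo powermetrics --samplers smc -i 1000 -n 10 | grep -iE "die temperature|fan"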

A more real-world benchmark is to use After Effects. Then you will see the difference.

I'm purposefully simplifying things so non-technical people can understand.
 

FrozenDarkness

macrumors 68000
Mar 21, 2009
1,727
968
I obviously know the difference between a CPU and a GPU.

And I know what the "yes" command does; I use a variety of Linux distros, after all.

You can run 8 terminals with "yes", and then run Geekbench separately. Check the temperature of your system. You will see that Geekbench increases the temps more than the yes command does.

A more real-world benchmark is to use After Effects. Then you will see the difference.

I'm purposefully simplifying things so non-technical people can understand.

Lol. Real cute.
 