The fact is no laptop should be getting hotter than 90C. Can the CPU handle it? Yes. Should the CPU have to handle it? No.

No. The fact is that MOST laptops (especially workstation ones) will hit 90C under the heaviest load they can handle.

And MOST would idle at around 50-60C.

You can't apply desktop logic to laptops here.

If you don't want to take my word for it, please feel free to look things up. Laptops are very prone to getting dangerously close to their junction temperature.

100 degrees is problematic and will increase wear on the system. Heat is bad for electronics, period. 100 degrees is neither normal nor good.

Any amount of heat is bad for the system. Period. 100C will just wear it out faster, but it's wearing either way. Depending on the materials, though, the wear can go on for years or even decades before the processor stops operating within specifications. At that point, you'd already be looking at a new laptop.

And people seem to be operating under the mistaken belief that if it's not shutting off due to heat, then it's fine and working as intended. The fact is that pretty much no modern laptop shuts off from heat anymore; it just downclocks. If the CPU is running below base frequency under sustained heavy loads, then there is a problem: the CPU is not hitting its intended frequencies and thus is not working as intended. Running at base frequency is okay but not ideal. A perfectly working CPU will run at max boost.

And here's another misconception.

Turbo Boost should only kick in when the thermal headroom allows it to.

In most cases, if the processor is operating at its intended specifications (as in... a 2.3GHz quad-core Haswell running at 2.3GHz with all 4 cores stressed), that's still within spec.

And I have yet to experience said thermal throttling with an rMBP. The only thing I have seen is an EFI bug that forces the rMBP into a throttled state indefinitely.

I had a Dell that would downclock to 1.2GHz when playing Skyrim (from a 2.0GHz base and 2.6GHz boost). Did the Dell shut off? No. Was it working properly? No. Was there a problem with it? Yes.

That's a Dell, though. Have you observed an rMBP doing the same thing?

Is no one concerned that Apple is selling you an expensive CPU upgrade and then potentially crippling that CPU by not supplying the thermal headroom? Or the appropriate power envelope (85W for the whole system isn't enough)?

It's not that 85W isn't enough. The power supply seems to "throttle" itself when its temperature reaches a critical point because the system is trying to draw too much current. I suspect that in certain environments (with high ambient temperature), this means it does not deliver the full 85W it's rated for. But that's all good, because it should protect itself against overheating. Imagine what would happen if your power supply went bang on you.

In a nutshell, the rMBP does have the potential of drawing more power than the power supply can provide, but you would have to stress it far beyond the point of simply playing Skyrim for it to do so.

And as an aside, I have had access to 4 rMBPs by now. Besides the fact that I regularly run computational workloads and algorithms that eat the 16GB of RAM it offers for breakfast while stressing all cores to their max, I also have a Boot Camp partition in which I... overclocked the GPU to maximize the framerate I'd get from the machine.

Result? Ever seen Skyrim run at 2880 x 1800 at 30fps?

If I can do that, then why should I bother worrying about temperatures or frequencies?

And if you really want to know how the rMBP does with a heavy workload, why don't I run PCSX2 (PS2 emulator) with one of the most intensive games and see how the rMBP fares?
 
Define "professional"

You will encounter a plethora of different classifications of what one deems professional.

If it gets the job done in a timely manner and is beneficial to your own workflow, then by all means use the machine that best matches those criteria. Playing the min/max game with specifications and/or theoretical performance is an exercise in futility.

A professional is someone who does hours-long video renders in After Effects under Ubuntu on a 15" laptop balanced on their lap.

Really, what the original thread starter needs is a desktop with an external display.
 
Any amount of heat is bad for the system. Period. 100C will just wear it out faster, but it's wearing either way. Depending on the materials, though, the wear can go on for years or even decades before the processor stops operating within specifications. At that point, you'd already be looking at a new laptop.

So this is the heart of the argument.

Any amount of heat is bad for electronics. So the question is: how much does the MacBook Pro running at a sustained 95C hurt the electronics inside?

That's what we can't gauge right now. We have no measurements on it, and there are few technical details available. Zenbooks, the Razer Blade, and others get just as warm.

Are newer processors more resistant to heat? Or is Apple's success simply surfacing more problems than before, because more MacBooks out in the wild will yield more reported problems?
 
I'll bite on this.

3. I occasionally do crazy stuff with the stats end of it and run models that take hours to days to complete. Again, this was the fastest machine I could get at the time. Also, this software is not optimised for multicore, so single-core speed matters to me.

Because of number 3, I would like to upgrade (though I can't afford to right now): the 2013 model gets about a 20% better benchmark score, which to me equates to 20% less time waiting on models to run. I haven't looked exhaustively, but from what I can see the 2013 rMBP is again the fastest laptop on the market.

If you know otherwise - do please enlighten me

Finally got this software running multi-core. Happy now to stick with my 2011 machine for a while longer since I can put more of its power to use. Happy with that!
 
I tried Ubuntu and another Linux distribution.
Didn't like it at all.
Way too complicated, IMO.
I am not a computer scientist. I just want things to work.

If Ubuntu is too complicated for you, then I just don't know what to say. My 64-year-old father-in-law, a Handwerksmeister (master craftsman), has been on Ubuntu for 2 years now.

I've been here 12 years, as you can see from the join date on my avatar, and I don't think it's crazy to call it a "gimmick," although it is a bit of an exaggeration. It's actually absurd that it took retina resolutions for us to have a 1920x1200 resolution, and given the graphics overhead, I would be very tempted by a MBP that offered that resolution natively. I don't mind a little pixelation in my fonts, and I do mind the lag and headaches (e.g., tiny images, windows and the like that don't appear right in virtual machines without tweaking, etc.).

I'm pretty in touch with reality. I just have a view that's a bit more in line with the guy you were dogging.

Retina/HiDPI is in essence a gimmick on portables, and the scaling is pretty bad.

Serious question: which notebooks in this price category (~$2000-2500) DO offer a full AdobeRGB display? I think the HP DreamColor displays and maybe some Precisions?

Yes, the Dell and HP portable workstations offer full AdobeRGB and are priced comparably to the rMBP.

I wanted to see if you were, in fact, lying or exaggerating. So, I fired off about a dozen instances of yes > /dev/null to max out every core on my 2012 15" Retina MacBook Pro. (For anyone who wants to replicate this at home, eight processes should do it—two for each core, due to Hyper-Threading, but there's no harm in kicking off a couple extras.)
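For anyone who does want to replicate it, the load test above can be wrapped in a small script. A minimal sketch, assuming a quad-core machine with Hyper-Threading (adjust NPROC for other CPUs):

```shell
#!/bin/sh
# One `yes > /dev/null` busy loop per logical core pegs the whole CPU.
# NPROC=8 is an assumption (quad-core with Hyper-Threading); adjust to taste.
NPROC=8
PIDS=""
i=0
while [ "$i" -lt "$NPROC" ]; do
  yes > /dev/null &
  PIDS="$PIDS $!"
  i=$((i + 1))
done
echo "started $NPROC busy loops"
# A real thermal test should run for minutes; 2 seconds is just a demo.
sleep 2
kill $PIDS
echo "done"
```

Watch the fan speed and reported temperatures while it runs; the script kills its own busy loops when the timer expires.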

I let it run for a while, too. On average, the fans spun between 4800rpm and 5300rpm. The CPU generally stayed around 46 degrees Celsius, +/- 2 degrees. That's a 34-degree difference from what you claimed. In other words, you alleged that the CPU runs roughly 74% hotter than it actually does.

Under "general use," the temperature came down to 39-40 degrees Celsius.

I'm not going to weigh in on whether you're lying, even though you brought it up. I'll just say that the evidence suggests you are either woefully misinformed or in possession of a defective computer.

----------



This one word really seems to sum it all up, doesn't it?

You maxed out the CPU, but you didn't also max out the GPU, which rendering would do. The CPU and GPU share a heat pipe, so you in essence tested the cooling system at half power.

quoting myself. Anyone got an answer?!

See above..

That ThinkPad with 8GB RAM and a K2000 (which is about on par with Iris Pro, btw) is $2500 outside of the current BF rebate. An rMBP with the same CPU is $200 cheaper. It's also over 600g lighter. Why would I choose the ThinkPad again?



I'm confused now. Apple offers you the fastest ports and fastest storage currently available on a laptop. I would hardly call a drive that surpasses the capabilities of a SATA3 interface 'anemic'. If having fast ports and fast storage is your definition of professional, then the current MBPs must be the most professional laptops ever ;)

Quadros, spec-wise, are anemic; the driver makes up for the specs. Plus, Nvidia supports its Quadros longer than anything else. I'm on the absolute latest Nvidia driver on my almost-6-year-old mobile Quadro. There are many comparisons out there between workstation cards and their consumer counterparts; if you are doing something that benefits from a Quadro/FirePro, there is no substitute, even if the substitute is "more powerful".

=======

I type this from a 17" matte MacBook Pro hooked to a 30" ACD, but I carry a Latitude running Arch, because if I ****** need to connect to something, my Arch box will not let me down. My definition of professional is connecting to anything, any time, and making it work; OS X doesn't do that.
 
You maxed out the CPU, but you didn't also max out the GPU, which rendering would do. The CPU and GPU share a heat pipe, so you in essence tested the cooling system at half power.

You didn't continue to read the thread, it seems. I ramped up the GPU later, although probably not to its maximum. Things still didn't get uncomfortably warm. We probably could test both at 100%, but I don't expect we're going to get dramatically different results.
 
If Ubuntu is too complicated for you, then I just don't know what to say. My 64-year-old father-in-law, a Handwerksmeister (master craftsman), has been on Ubuntu for 2 years now.
I've been using computers for different purposes over the last 25 years.
I programmed my first video game at the age of thirteen and sold it to my playmates.

I tried Ubuntu via rEFIt on a Mac mini, and the drivers didn't all work.
If something doesn't work in Linux, you are basically on your own. I see no advantage in that.
I could get it to work eventually, but I don't have time for this!

I tried some older Linux system on an Acer some years back. Same problem.
Tried to hook up a wifi adapter, and it came with Windows and Mac drivers.
It also gave instructions for Linux on how to write the driver yourself! ... In Japanese!

I don't know why you discriminate against your father-in-law just because he's 64.
You also don't seem to hold carpenters in high regard.
Of course, if Ubuntu is already up and running, I can use it too. But why bother?

The older I get, the more I just want my computers to work easily and quickly.
Mac OS does this!
 
Since you didn't read the previous discussion either, I'll recap:
• Different programs report different values for the temperature(s), and it's not entirely clear which are "correct" and which are "incorrect."
• iStat Pro and others reported values at 100% CPU utilization around 46 degrees Celsius. However, my suspicion is that some of those lower values are around the heat sink.
• The utility used by the OP shows values closer to 100 degrees Celsius. I poked around, and that's the on-core reported value.
• Here's the kicker with everything you said: having an on-core temperature of 100 degrees is NOT the same as having a "laptop" temperature of 100 degrees. If the internal temperature is in the 40s or 50s (which it is, even with the CPU and GPU together), then we don't have a problem. You're talking about 100 degrees as if it's some system-wide temperature. It isn't. Not even close. And again, I can only get 100 degree on-core readings while pushing the CPU to 100% utilization across all cores and with HyperThreading (i.e., 8 processes).
• We don't know if, at maximum load, the CPU is running below base frequency. It's a good bet that Turbo Boost isn't kicking in. Again, this is something that someone running Boot Camp would have to test.
• The last point is a theoretical/hypothetical one. Let's say the laptop were magically doing a better job of cooling (more on that in a minute). In that hypothetical world, would the CPU still be hitting 100 degrees? (It's distinctly possible.) Would it be Turbo Boosting, and if so, how much? There's no way for us to answer these questions. Ostensibly, if there's a fantastically cooled PC laptop out there, we could run an experiment comparing the same 100% utilization on the Mac and the PC and see... but again, that's something someone else would have to do.

TL;DR: I see no problem whatsoever. Could cooling be improved with some Arctic Silver and by not glopping the paste on as if more were better? Absolutely. Does that poor conductivity probably shorten the life of some Apple laptops? I'm sure. Is it really a "problem" per se? There's no compelling evidence whatsoever to suggest that it is. There's just a bunch of speculation and supposition.

First, any program reporting 46 degrees under load is clearly wrong; just apply some common sense. Even on a desktop with water cooling, 46 degrees under max load is pretty good. And no, I never said (or at least never intended to imply) that it's the entire laptop heating up to 100 degrees. But even a CPU operating at 100 degrees is bad news.

AnandTech checked this in its review of the 2012 rMBP 15" and saw that Turbo Boost was not performing optimally on a short Cinebench run.

[attached AnandTech Cinebench chart: 47675.png]


That score is about right for a 3630QM at 3.2GHz turbo, not for a chip that turbos to 3.4GHz on all cores.

http://www.notebookcheck.net/Review-Apple-MacBook-Pro-15-Retina-2-3-GHz-Mid-2012.78959.0.html

Notebookcheck, which did a pretty good and possibly less biased review (HL2 is not a stress/throttling-test game), found that the Cinebench score was 5.52 on the 2.3GHz model, identical to the 2011 model.

In the newer CineBench R11.5 the Intel Core i7-3615QM scored 5.52 points. The predecessor reached 5.49 points - an almost identical score. The newer processor doesn't manage to pull ahead in this test. ......With the two top processors, Apple manages to cover all needs. We are not sure if the additional money for the Intel Core i7-3820QM over the i7-3720QM is money well spent. The difference in performance to the models of the previous Sandy Bridge generation is pretty minor.

As for throttling

This test wouldn't be complete if we didn't address the subject of throttling. Prime 95 and Furmark can extract the maximum performance out of the hardware. When both programs are run simultaneously, the core frequency drops (caused by Prime 95) to 1.2 Ghz. Running Prime 95 alone, we measured temperatures between 92 and 104 degrees Celsius (197.6 and 219.2 degrees Fahrenheit), and the word "Throttling" flashes repeatedly in HWiNFO. Furmark on its own cycles through all available Turbo Boost steps. After about one hour, the temperature settled at a fairly consistent 85 degrees Celsius (185 degrees Fahrenheit) with the fan system running. We never measured above 100 degrees Celsius (212 degrees Fahrenheit) with Furmark. This limitation might be due to the power adapter: in the above scenario, the system would require 86.6 watts - but the power adapter can only supply a maximum of 85 watts.
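That last figure is worth pausing on: the quoted worst case draws more than the adapter can deliver, and the difference has to come from the battery, which matches the battery-drain-under-load reports elsewhere in the thread. A quick check of the arithmetic, using the numbers from the quote above:

```shell
#!/bin/sh
# Worst-case system draw vs. adapter rating, per the Notebookcheck quote.
DRAW_W="86.6"    # measured requirement under Prime 95 + Furmark
ADAPTER_W="85"   # adapter's maximum rated output
# POSIX shell arithmetic is integer-only, so use awk for the decimal math.
awk -v d="$DRAW_W" -v a="$ADAPTER_W" \
  'BEGIN { printf "deficit: %.1f W must come from the battery\n", d - a }'
```

A 1.6W shortfall is small, but it only appears under a combined synthetic load, which is why most real workloads never notice it.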



Notebookcheck runs every notebook it tests under a sustained (several-hour) Prime + Furmark load. Many do not break 90 degrees (the big gaming notebooks or AMD notebooks), or if they do break 90 degrees, it's because they are configured to run at max turbo. But the fact remains that Furmark + Prime is unrealistic. A render is not, and the OP specifically stated that he was reaching 95+ degrees on render projects using consumer software.

I agree that running at base clock is not throttling. However, it is far from ideal, and shame on Apple for selling CPU upgrades that are never felt in the real world (and on any manufacturer who does this). There are multiple threads about the battery draining under load on the rMBP 15"; simply put, the power supply is not large enough. If the power supply is throttling because it's in danger of exploding, then there is something wrong with the power supply. The power supply should be able to take the load applied by the computer it is sold with. Apple needs to make the power supply a little larger and a little more powerful. As it is, it's a poor match for the rMBP 15".
 
First any program reporting 46 degrees under load is clearly wrong. I mean just apply some common sense.
Again, you seem not to be reading, so I'll repeat what I said in bold: my suspicion is that some of those lower values are around the heat sink. Yay reading!

it is far from ideal and shame on apple for selling CPU upgrades that are never felt in the real world
This remains a supposition on your part. The rest of what you wrote is not corroborating evidence. Benchmarks are not an accurate reflection of what a processor is doing. Again, Intel provides PC-native utilities that do provide this information.

----------

But notice Anand's temperature readings from his HL2 torture stress test: http://www.anandtech.com/show/6023/the-nextgen-macbook-pro-with-retina-display-review/12


2012 rMBP
Max CPU Temp 63C
Max GPU Temp 72C



that doesn't sound like the temps most people here are getting.

I think this gets back to what I was trying to say a few pages ago, and what the OP and cirus and others seem to be ignoring: the numbers are wildly different depending upon what you measure. If you use Apple's sensors on the heat sink, that's consistent with Anand's values. That's different from the thermal readings on the cores of the CPU themselves. Architecturally, I don't know exactly where those readings come from. Intel's literature implies they come from sensors on the die itself, but if someone actually knows the answer there, I would be quite curious.
 
Again, you seem not to be reading, so I'll put it in bold what I said: my suspicion is that some of those lower values are around the heat sink. Yay reading!

Yet you posted

Different programs report different values for the temperature(s), and it's not entirely clear which are "correct" and which are "incorrect."

It's entirely clear and obvious that the CPU temperature is not 46 degrees. The heatsink temperature is not the CPU temperature; neither is the temperature around the motherboard. With all the complaints about heat, do you think for one minute that the CPU temperature (on the cores, because that is what controls whether there is throttling or not) is only 46 degrees?

This remains a supposition on your part. The rest of what you wrote is not corroborating evidence. Benchmarks are not an accurate reflection on what a processor is doing. Again, Intel provides PC-native utilities that do provide this information.

If a benchmark is getting lower results than it should, then it's not difficult or illogical to extrapolate that lower results will be had in the real world. And calling Cinebench just a 'benchmark' is pushing it, as there is perfectly usable software that runs the same basic code (Maxon Cinema 4D).

I think this gets back to what I was trying to say a few pages ago, and what the OP and cirus and others seem to be ignoring: the numbers are wildly different depending upon what you measure. If you use Apple's sensors on the heat sink, that's consistent with Anand's values. That's different from the thermal readings on the cores of the CPU themselves. Architecturally, I don't know exactly where those readings come from. Intel's literature implies they come from sensors on the die itself, but if someone actually knows the answer there, I would be quite curious.

Numbers are different, yes. The heatsink temp can be 0 degrees while the CPU core is at 100, but do you think for one minute the heatsink value is going to be the factor triggering throttling?

And Anand's test does not make sense, in that if you compare the numbers, the CPU is running about 100-200MHz faster in the multicore test than in the single-threaded test. And whatever utility Anand used to measure temperatures is wrong. The 2011 MBP runs at about the same temperature under load but throttles during the HL2 demo. And look at the fps-over-time graphs: the 2011 and the 2012 run at similar fps at the beginning of the test despite the 2012 model being significantly faster on the GPU (and a little faster on the CPU). Something seems weird there.
 
But notice Anand's temperature readings from his HL2 torture stress test: http://www.anandtech.com/show/6023/the-nextgen-macbook-pro-with-retina-display-review/12


2012 rMBP
Max CPU Temp 63C
Max GPU Temp 72C



that doesn't sound like the temps most people here are getting.
What's the point? Half-Life 2 is an 8-year-old PC game; it's considered ancient by video game standards. The 2012 rMBP producing so much heat rendering this game's graphics doesn't really argue in the rMBP's favor.
 
What's the point? Half-Life 2 is an 8-year-old PC game; it's considered ancient by video game standards. The 2012 rMBP producing so much heat rendering this game's graphics doesn't really argue in the rMBP's favor.

You're right. I read Anand's reviews of the early- and late-2011 MBP 15, and it was the same: he used HL2 to bench.

Maybe someone should tweet him about using some other torture tests for the Haswell model in his upcoming review.
 
It's entirely clear and obvious that the CPU temperature is not 46 degrees. The heatsink temperature is not the CPU temperature; neither is the temperature around the motherboard. With all the complaints about heat, do you think for one minute that the CPU temperature (on the cores, because that is what controls whether there is throttling or not) is only 46 degrees?
I think you have things backward, or are reading what I'm saying backward, which is why you're arguing with me.

Of course the temperature on the CPU itself is not 46 degrees. The whole point of the OP's (baseless) claim is about what is "too hot" for the CPU and what isn't. With the Tjunction value being 100 degrees C, if (big if) the readings of 100 degrees are accurate, there is not a thermal problem on the CPU itself.

Similarly if (big if) the heatsink values ranging from 45-60 degrees are correct, then we still don't have a problem, because heat is dissipating as it should.

If a benchmark is getting lower results than it should, then it's not difficult or illogical to extrapolate that lower results will be had in the real world. And calling Cinebench just a 'benchmark' is pushing it, as there is perfectly usable software that runs the same basic code (Maxon Cinema 4D).
Since you demonstrated respect for Anand, I'll point out two things. First, he didn't allege that the benchmarks were lower than expected. Second, what he did say about them directly contradicts what you think is a logical extrapolation. I'll quote him directly from the very same page with the benchmarks you linked:

The improved thermal characteristics may allow mobile Ivy Bridge to operate in turbo modes for longer than Sandy Bridge, however I don’t have any data to actually support that claim. That doesn’t mean it can’t happen, it’s just complex to test and model.

[snip] Something seems weird there.
Yeah there's a lot of stuff that seems weird, which is why I'm a bit frustrated that you and others keep jumping to conclusions as if you're sure you know what's going on. For about the tenth time, at this point it's nothing more than idle speculation. There are simply too many variables at play. It's great to have a hypothesis, but proving a hypothesis requires a level of rigor not seen whatsoever in this thread. Three cheers for the scientific method.
 
Professional writer: yes, the macbook pro is suitable as long as the keyboard does not heat up and burn the writer's fingers.

Fair point, but any laptop that gets that hot wouldn't make it past testing from pretty much any company. Therefore, this point is moot.

Professional digital artist who does print work: no, the macbook pro has a bad display

This is true of any laptop out there. If you need your colors to be that incredibly accurate, then you need a professional external monitor that you can calibrate with a device; no laptop screen will cut it. Therefore, this point is moot.

Professional digital artist who does video work: no, the macbook pro over heats when rendering.

This, in my opinion, is the most viable point. However, has anyone actually determined that it is indeed overheating? I understand that the temperatures are nearing the thermal limits of the components in question, but are they actually going beyond those limits? Do we know that the computer is scaling back the CPU and/or GPU to stay within thermal limits? If, for example, you added some fans or something else to significantly lower the temperature inside the casing, would you find that those complex AE projects would render that much faster?

I realize I'm replying to the first page, and that these points may have been hashed out already, but I wanted to submit my thoughts.
 
You didn't continue to read the thread, it seems. I ramped up the GPU later, although probably not to its maximum. Things still didn't get uncomfortably warm. We probably could test both at 100%, but I don't expect we're going to get dramatically different results.

I did, and it was not at maximum.

I'm not going to argue whether they get too hot, because if they do, they shut down.

----------


I hold him in higher regard than you, if you think Ubuntu is too complicated.
 

I don't have any rMBP with me to test right now (returned them), but I can create a sample project file for you to test if you have one available.

What I do remember are the fans spinning down after ~10 or ~20 minutes of rendering. The behavior varied; sometimes the screen would even turn black (while still logged in, with AE running). The fans would spin back up again a few minutes later.

Someone should watch the clock speeds in real time during a test render and post their results. The CPU can barely be kept below 100C, so to avoid shutting down, it probably does throttle performance to hold that 93C-100C average.
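One way to do that on OS X (10.9 and later) is the bundled `powermetrics` tool: `sudo powermetrics --samplers cpu_power -i 1000` prints per-second CPU frequency and die-temperature samples that can be logged during a render. Here is a sketch of the filtering step against a canned sample of that output; the exact field names are from memory and may differ between OS releases, so treat them as assumptions:

```shell
#!/bin/sh
# During a real render you would capture the log with something like:
#   sudo powermetrics --samplers cpu_power -i 1000 > render.log
# A canned sample stands in for render.log here so the filtering can be shown.
cat > render.log <<'EOF'
CPU Average frequency as fraction of nominal: 98.50% (2265.44 Mhz)
CPU die temperature: 97.25 C
GPU Busy: 12.3%
EOF
# Keep just the frequency and temperature samples.
grep -E 'frequency|die temperature' render.log
rm render.log
```

In Boot Camp, HWiNFO or Intel's own monitoring utilities report the equivalent per-core numbers.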

I've only had the rMBP shut down on me once while rendering, and that's when I started a test benchmark to see if it could handle it. One run in ten, it shut down; the rest, it just kept the CPU at a sustained near-Tjunction temperature. The keyboard was INCREDIBLY hot during the render process. I don't understand anyone who would want to be typing on their laptop while rendering. For me, personally, it basically becomes useless, since it has to be treated as a desktop when rendering.

It would've been nice to render 30-second to 5-minute comps without worrying about how badly the rMBP is being tortured. My aim is to see how well the next-gen rMBP holds up. I want to move some of my professional work to a mobile environment, but I won't do it if the hardware can't keep up with a decent amount of the workload.

As far as next year's possible IGZO display, I hope it's 10bit with good uniformity.

ASUS' IGZO 4K display isn't so bad in terms of uniformity. Of course it's not perfect, but I'll accept something that reaches that.
 
Render farms aren't always feasible, so when that's the case, I rely on my own custom-built machine (fairly modest build: 32GB RAM, quad-core Haswell i7). My normal CPU temps range from 26C to 33C, and I never peak above 55-60C under full load thanks to the cooling system I have. I'm already outgrowing this build and am eyeing a better one once 16GB DDR4 sticks become more affordable. Then I'll consider 16GB x 8.

As for the rMBP, being my first Mac, I was under the impression that it would be suitable for some of the work I do. Unfortunately, I can't even do light digital editing work that relies on color accuracy, since the display is so bad. And if I try to do visual FX and rendering, the system becomes too hot (sustained 95C-100C is not acceptable).

I completely understand that I'm wrong. I should never have thought that Apple was capable of making a decent portable computer that met professional standards, in terms of color accuracy and use of cpu intensive applications.

Apple is for the masses. They cater to average folk who don't understand how computers work. Why else would they offer One to One services explaining how to do basic tasks? Or why can't the Apple "geniuses" answer my technical questions in store? They are simply sales-driven, and that's understandable. But to call their laptops "Pro" is false advertising.

That's primarily why I don't try to go into technical details on this forum as it'll just be a waste of time.

Thanks for your input.

rMBP does not equal pro.

I think you might struggle to find a portable machine that meets your needs then. I have been through a Dell Precision (lasted a week, display was horrible), a Lenovo W530 for a year and then back to a Macbook pro. Each of these machines is the top "professional" machine made by that manufacturer.

Regarding heat, processors, etc., I found them all to be much the same on mains power. When running off battery the Lenovo would throttle down to a ridiculous level that made it all but unusable.

The work I do doesn't require colour accuracy so I can't comment on that, I do know that my MBP display is much nicer to "look at" than either of the other two but that is purely subjective :)

The Dell and Lenovo used to become far hotter than my MBP running my workloads. I can still work with my MBP on my knees but not the other two, they get too hot.

The Lenovo was a pain, driver-wise. Whether I used the official Lenovo drivers for the NVIDIA Quadro K2000M graphics or the latest NVIDIA ones, it would frequently cause Explorer crashes, and for a period of about 6 months it would blue-screen simply from plugging a projector into the VGA port, until a driver update eventually came along and sorted it out. "Professional"? More like a bad joke.

Finally the new PCI-E storage on the MBP makes it extremely quick. Even taking into account that I only have 16GB vs the 32GB that was available on my Lenovo.

I am a professional, I run a business. I need a tool that does the job quickly with minimum of fuss. The MBP fits that bill. As an added bonus, after running it for 18 months I'll be able to sell it at a decent price and get the new model. The total cost of ownership of the machine being far lower than the other two.

It's a no-brainer to me.
 
I had a Dell that would downclock to 1.2GHz when playing Skyrim (from a 2.0GHz base and 2.6GHz boost). Did the Dell shut off? No. Was it working properly? No. Was there a problem with it? Yes.

Is no one concerned that Apple is selling you an expensive CPU upgrade and then potentially cripples that CPU by not supplying the thermal headroom? Or the appropriate power envelope (85 W for the system isn't enough)?

Anyway......we are discussing here if a Macbook Pro is Pro or Not....what do games have to do with Pro usage??????
 
I think you have things backward, or are reading what I'm saying backward, which is why you're arguing with me.

Of course the temperature on the CPU itself is not 46 degrees. The whole point of the OP's (baseless) claim is about what is "too hot" for the CPU and what isn't. With the Tjunction value being 100 degrees C, then if (big if) the readings of 100 degrees are accurate, there is not a thermal problem on the CPU itself.

Similarly if (big if) the heatsink values ranging from 45-60 degrees are correct, then we still don't have a problem, because heat is dissipating as it should.

You never want to run anything close to its breaking point. If you go anywhere else (other forums, etc) and say your (new) computer is running at 100 degrees during a video encode everyone will tell you that that is pretty high and that you should do something about it. Heatsink value means zilch because the heatsink temperature won't determine throttling or system shutoff. 100 degrees is simply too hot.

(Alternatively, Notebookcheck measured the temperatures under Windows and saw 95+ degree temperatures. The heatsink is not the on-die temperature sensor.) Also, relatively low heatsink values compared to relatively high CPU temperatures indicate a poorer cooling mechanism. I'm not sure what's normal, but you do want the delta between heatsink and air to be as large as possible, and the delta between heatsink and heat source to be as small as possible.
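That last point can be put in numbers. A rough sketch of the arithmetic (the wattage and temperatures below are made up for illustration, not measurements from any rMBP): thermal resistance between two points is just delta-T divided by power, so a big die-to-heatsink gap at the same wattage means a poor junction-to-sink path (bad paste or mount), while a big heatsink-to-air gap means the heatsink is actually moving heat.

```python
def thermal_resistance(t_hot_c, t_cold_c, power_w):
    """Thermal resistance in C/W between two points at a given power draw."""
    return (t_hot_c - t_cold_c) / power_w

# Illustrative numbers only: die at 100C, heatsink at 55C, ambient air at
# 30C, CPU package drawing 45W.
r_die_to_sink = thermal_resistance(100, 55, 45)  # 1.0 C/W -- high; poor contact
r_sink_to_air = thermal_resistance(55, 30, 45)   # ~0.56 C/W
```

Under these made-up numbers, most of the total resistance sits between die and heatsink, which is the "poorer cooling mechanism" scenario described above.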

Since you demonstrated respect for Anand, I'll point out two things. First, he didn't allege that the benchmarks were lower than expected. Second, what he did say about them directly contradicts what you think is a logical extrapolation. I'll quote him directly from the very same page with the benchmarks you linked:

Yeah there's a lot of stuff that seems weird, which is why I'm a bit frustrated that you and others keep jumping to conclusions as if you're sure you know what's going on. For about the tenth time, at this point it's nothing more than idle speculation. There are simply too many variables at play. It's great to have a hypothesis, but proving a hypothesis requires a level of rigor not seen whatsoever in this thread. Three cheers for the scientific method.

There are multiple threads about high temperatures on this forum. It's a well-known problem. Perhaps I cannot directly test it, but I can certainly assimilate known information and make an informed statement.

----------

Anyway......we are discussing here if a Macbook Pro is Pro or Not....what do games have to do with Pro usage??????

Because no one games on a computer labeled "macbook pro"?

Or maybe you are an art designer and you make models/animation/AI or programming for a game (or video/photo/render) and you need to test it out?

Or perhaps it's indicative of a GPU and CPU load which, being a 'pro' machine, it should be able to handle?

(And before someone jumps on the fact that it's a Dell: it's just used for comparison, to illustrate that throttling below base clock is bad even though the machine isn't shutting off, and even though people on this forum would have you believe nothing is wrong with it.)
 
But we also need to look at this from another perspective.


Other performance thin-and-light notebooks like the Asus Zenbook 51 and Razer Blade reach the same temperatures. No chassis this thin can handle the cooling. It's simply impossible.



Now, my previous laptop had a 640M LE and a 2.9GHz C2D processor, and it never got above 80C on either the CPU or GPU.

But you're talking about a lower-mid-tier DDR3 graphics chip and 4 threads (2 cores). It's half the power, so even in a small envelope it's much easier to keep under control with passive cooling.


Maybe we just need to accept, as far as heat goes, that quads aren't ready for passive cooling?


Hyper-Threading and Turbo Boost do nothing to cool it, right?
 
Hi,

Can you stick to the topic? There are clear issues outlined.

No. There aren't. Nothing but innuendo and internet gossip. Meanwhile plenty of professionals are quite happy with their rMBP's.

His first proof is a forum post. Hardly proof and laughable.

Then he quotes an Apple discussion post saying there is burn-in, which was an early issue with the first rMBPs only. Same with the yellow tinting. Simply idiotic. Even if you do have a problem, Apple will give you a new one. Try that with... any other laptop manufacturer.

He claims Apple uses stock SSDs, which is widely known to not be true, and even more foolishly says they have no control over manufacturing.

Then he claims the cooling is defective to the point it affects cpu life. Right.

He is also wrong about maximum usable resolution. I'm only surprised that someone who has $3000 to spend is this ignorant. His claims are silly and easily dismissed.

----------
 
I did and it was not at maximum..

I'm not going to argue whether they get too hot because if they do they shut down..

Then I don't see what point you have. That there might be a small bit of throttling going on? A claim that has no evidence to back it whatsoever, other than benchmarks where there is always variance? Yeah, uh no.

----------

You never want to run anything close to its breaking point. If you go anywhere else (other forums, etc) and say your (new) computer is running at 100 degrees during a video encode everyone will tell you that that is pretty high and that you should do something about it. Heatsink value means zilch because the heatsink temperature won't determine throttling or system shutoff. 100 degrees is simply too hot.
Running at Tjunction for a while probably isn't really a problem. Running there consistently probably is. What's the threshold? I'm sure there have been studies to estimate Bayesian probabilities of failure, but I haven't read them, and I get the feeling you haven't either. But running hotter in general isn't a good thing. That's just life, and physics, and it's the reason why people who hit CPUs super hard very consistently are better off remoting into a server than using their laptop. Again, I'm not sure what point you have.
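For a rough feel of what "running hotter wears it faster" means quantitatively, here's a back-of-envelope Arrhenius sketch. The 0.7 eV activation energy is a textbook ballpark for silicon failure mechanisms, not a measured value for this chip, and real reliability models are considerably more involved:

```python
import math

K_BOLTZMANN_EV = 8.617e-5  # Boltzmann constant in eV/K

def arrhenius_af(t_use_c, t_stress_c, ea_ev=0.7):
    """Acceleration factor: how much faster temperature-driven wear
    proceeds at t_stress_c than at t_use_c under the Arrhenius model."""
    t_use_k = t_use_c + 273.15
    t_stress_k = t_stress_c + 273.15
    return math.exp((ea_ev / K_BOLTZMANN_EV) * (1.0 / t_use_k - 1.0 / t_stress_k))

# With these assumptions, sustained 100C ages the silicon roughly an
# order of magnitude faster than 60C -- possibly still years of life,
# but wear is decidedly not linear in temperature.
accel = arrhenius_af(60, 100)
```

This is exactly why "it hasn't failed yet" and "it will last as long as it would have at lower temperatures" are two different claims.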

There are multiple threads about high temperatures on this forum. It's a well-known problem. Perhaps I cannot directly test it, but I can certainly assimilate known information and make an informed statement.
And I've participated in them and agreed with some. However, your idea of an "informed statement" is based on assumption and conjecture, not facts, and you have blurred the line, and that's my problem.

A fact is that the thermal paste application and materials done by Apple are mediocre at best and probably sub-par. A conjecture, albeit a reasonable one, is that this increases the probability of failure. Is that a "problem"? It all depends on what that probability is.

This is logic and statistics 101. I can't state it any more clearly and simply.
 