
https://www.notebookcheck.net/Apple-MacBook-Pro-15-2018-2-6-GHz-560X-Laptop-Review.317358.0.html

TL;DR
>Only able to maintain the 6-core turbo of 4.0 GHz for about a second before the CPU hits 100°C
>Clocks dip down to 1.8 GHz on the first run, fluctuate between 1.8 GHz and 3.5 GHz, and average 2.7 GHz. It cannot maintain a steady base clock of 2.6 GHz.
>The XPS with a LOWER CPU (8750H) is on average 27% ahead in Cinebench multi-core; its 50th consecutive run still beats the best MBP run
>During the Cinebench loop, it is only 13% ahead of the 2017 MBP with the 7700HQ
And this is all with the dGPU completely inactive.
 

Notebookcheck does the best reviews! They are one of the very few reviewers who actually look at the machines properly.

A few comments:
- They say the average frequency over 42 minutes was 2.7 GHz. That is more or less the base frequency and what is expected when running such a CPU with all cores loaded. The 4.0 GHz figure is the max turbo boost for a single active core, which is a very different situation. In fact, if it can maintain an average of 2.6 GHz, then it is operating strictly within specification (that is the point at which the CPU is expected to reach its TDP).
- They run Cinebench under Boot Camp. Is there any reason for that?
- It is not true that the dGPU is inactive. First of all, under Boot Camp the dGPU is always active (since it's the graphics card that drives the display). Second, Cinebench creates an OpenGL context during tests.
- The XPS leading here is strange (as I mentioned in other threads), since the MBP had it beaten in 2016 and 2017. As far as I know, neither the XPS nor the MBP had any updates to their cooling solutions. I find it a bit strange that Dell throttles more than the MBP with the 2016 and 2017 CPUs, but throttles less with a hotter 2018 CPU...
 

Dell must have quietly changed something.
 
> They say the average frequency over 42 minutes was 2.7 GHz. That is more or less the base frequency and what is expected when running such a CPU with all cores loaded.

That's really not true. Both the i7 in my gaming notebook and the i5 in my 2016 13" MacBook Pro can run a full all-core load at their respective max all-core turbo speeds indefinitely.

My 7700HQ (2.8 GHz base) machine routinely runs benchmarks, and even the Prime95 torture test or OCCT, with the turbo pegged 100% steadily at 3.4-3.5 GHz depending on whether the test is more CPU- or GPU-intensive. With the turbo pegged that way and the fans under automatic control running nowhere near max, the temperature never spikes above 70°C. This CPU in this chassis has lots of thermal headroom, and the machine runs cool and quiet even under extreme load.

The 2.9 GHz i5 in my Mac will run at a steady, unwavering 3.1 GHz no matter the CPU load. With the fans under automatic control, they ramp up to 4500-5000 RPM and the temps hover in the low 90s. This CPU in this chassis is just about at its limits, but has just enough room to stretch its legs.

On both machines my CPU frequency line in Intel Power Gadget is perfectly flat and continuous as long as the CPU is under full load. That's the way it's supposed to work.

A CPU that is bouncing its clock speed all over the place because it's continuously running into its thermal safety limits and having to downclock is just not for me, though I'm sure some people wouldn't mind it. When I look at all these Power Gadget graphs that look like crazy zig-zags, I see a CPU that wants to work but just can't, because it's in a chassis that simply isn't capable of handling it.
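If anyone wants to reproduce this kind of flat-line test themselves, here is a minimal sketch of an all-core load generator in Python (the workload and the 60-second duration are arbitrary choices of mine; the idea is to watch the frequency trace in Intel Power Gadget or a similar tool while it runs):

```python
# Minimal all-core load generator: pins every logical core at 100%
# while you watch the frequency trace in a monitoring tool.
import multiprocessing as mp
import time

DURATION_S = 60  # how long to hold the load (arbitrary)

def burn(stop_at: float) -> None:
    """Busy-loop doing floating-point work until the deadline."""
    x = 1.0001
    while time.time() < stop_at:
        for _ in range(100_000):
            x = x * 1.0000001 % 10.0

if __name__ == "__main__":
    stop_at = time.time() + DURATION_S
    procs = [mp.Process(target=burn, args=(stop_at,))
             for _ in range(mp.cpu_count())]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
    print("Done: check whether the frequency line stayed flat.")
```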
 
This is about the new MBP with a Coffee Lake CPU. Are you using Internet Explorer?
 

My response was a general one to the false notion that a CPU isn't expected to run at steady boost clocks under full multi-core load. With proper cooling in a proper chassis, it should be able to do exactly that, and for extended periods of time. I have two machines that can do exactly that.

A CPU should absolutely not be bouncing continuously between boiling hot, hitting its thermal safety cutoff, and falling back to frequencies well below stock to cool off, in some sort of crazy frequency zig-zag.
 
- Yeah, it's running above the base clock ON AVERAGE. But there are dips to way below it, which could cause severe stuttering.
- By "inactive" I really mean not under load, so it should only be drawing below 10 W.
- I have a 9560 XPS and it's catastrophically undercooled, so I'm surprised the new one can cool the 8750H and a 1050 Ti too. There is no way they didn't enhance the cooling.
> A CPU should absolutely not be bouncing continuously between boiling hot, hitting its thermal safety cutoff, and falling back to frequencies well below stock.
It shouldn't when properly cooled. But the cooling on the MBP is way too weak.
 
> My 7700HQ (2.8 GHz base) machine routinely runs benchmarks and even the Prime95 torture test or OCCT with the turbo pegged 100% steadily at 3.4-3.5 GHz. This CPU in this chassis has lots of thermal headroom.

I have no doubts about it. Still, it's a gaming laptop: it's designed with a hot GPU in mind and will of course give the CPU a lot of headroom.

> On both machines my CPU frequency line in Intel Power Gadget is perfectly flat and continuous as long as the CPU is under full load. When I look at all these Power Gadget graphs that look like crazy zig-zags, I see a CPU that wants to work but just can't.

Yeah, that is also something I find strange. I've never seen the frequency change so quickly; usually there is a certain granularity to it. Maybe Coffee Lake is much more agile in changing its frequency, making many more small adjustments? Another indication of this is that you can see similar frequency bounces on another Coffee Lake machine they tested (it's a quad-core, though): https://www.notebookcheck.net/Dell-XPS-15-2018-9570-8300H-GTX-1050-97Wh-Laptop-Review.308420.0.html
 
> I've never seen the frequency change so quickly; usually there is a certain granularity to it. Maybe Coffee Lake is much more agile in changing its frequency?

The newer CPUs no longer use SpeedStep; they have switched to something called Speed Shift, which is supposed to be a lot more agile and responsive than the old method, which is probably why the graphs of the throttling look so freakishly jagged. Essentially, if I recall correctly, it offloads the decision-making to the CPU itself, which cuts out a lot of latency.

Just looked it up:

> Compared to SpeedStep / P-state transitions, Intel's new Speed Shift terminology changes the game by having the operating system relinquish some or all control of the P-states, handing that control off to the processor. This has a couple of noticeable benefits. First, it is much faster for the processor to control the ramp up and down in frequency, compared to OS control. Second, the processor has much finer control over its states, allowing it to choose the most optimum performance level for a given task, and therefore use less energy as a result. Specific jumps in frequency are reduced to around 1 ms with Speed Shift's CPU control, from 20-30 ms on OS control, and going from an efficient power state to maximum performance can be done in around 35 ms, compared to around 100 ms with the legacy implementation. As seen in the images below, neither technology can jump from low to high instantly, because to maintain data coherency through frequency/voltage changes there is an element of gradient as data is realigned.
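For what it's worth, you can check whether Speed Shift (hardware P-states, HWP) is actually enabled on a given machine. A minimal sketch, assuming a Linux box with the msr kernel module loaded (`modprobe msr`) and root access; bit 0 of the IA32_PM_ENABLE MSR (0x770) reports whether P-state control has been handed to the processor:

```python
# Read IA32_PM_ENABLE (MSR 0x770) on CPU 0. Bit 0 set means HWP
# ("Speed Shift") is on and the processor manages its own P-states.
# Assumes Linux with the msr module loaded and root privileges.
import os
import struct

IA32_PM_ENABLE = 0x770  # per the Intel SDM

def read_msr(msr: int, cpu: int = 0) -> int:
    fd = os.open(f"/dev/cpu/{cpu}/msr", os.O_RDONLY)
    try:
        raw = os.pread(fd, 8, msr)  # MSRs are 64-bit; file offset = MSR address
    finally:
        os.close(fd)
    return struct.unpack("<Q", raw)[0]

if __name__ == "__main__":
    hwp_on = bool(read_msr(IA32_PM_ENABLE) & 1)
    print("HWP (Speed Shift) enabled:", hwp_on)
```

(On Windows, ThrottleStop shows an equivalent indicator when Speed Shift is active, if I remember right.)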
 
With all these benchmark tests that people are obsessing over, how many people have actually just, you know, USED the machine normally to see how it is? This is pixel peeping, but with processors. As long as you aren't editing an Avengers movie or something, most people won't know the difference whether you're turboing for two seconds or two days.

High-end users doing heavy stuff will want to know this, but the other 95 percent of people, who are browsing and checking their email, won't know the difference. For me personally, it's the 32GB RAM and 4TB storage that make this the machine to get. But others may need all that processor horsepower at full steam, and so they will rightfully care about all this. But I'll bet most of you who are going crazy over this don't need to.
 
You sound exactly like those guys who used to say that nobody needs 32GB of RAM, right up until a week ago...
 
> But I'll bet most of you who are going crazy over this don't need to.

First of all, Apple is charging people $300 for an "upgraded" i9 CPU that just plain doesn't work. It's not only not faster than the thing it's meant to "upgrade," it's often slower. There was a story on 9to5Mac yesterday where the guy used Xcode to disable 2 of the CPU's 6 cores and his video render went faster. That's how badly unsuited these CPUs are to these chassis designs.

Apple is ripping people off.

That's just wrong, as a matter of principle and basic decency.

I'm having a hard time accepting how this could have happened. How the heck did no one at Apple who mattered simply ask, "Shouldn't this be working better than this?"
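That core-disabling anecdote is easy to sanity-check on your own workload. Here is a rough sketch that times the same fixed pile of work at different worker counts (plain Python, not the 9to5Mac methodology; the work function and sizes are placeholders):

```python
# Time a fixed pile of work split across N workers, for several N.
# If the 6-worker run comes out slower than the 4-worker run, the
# chip is losing more to throttling than it gains from extra cores.
import multiprocessing as mp
import time

TASKS = 240  # total units of work, divisible by all worker counts

def work(_: int) -> float:
    x = 1.0001
    for _ in range(2_000_000):
        x = x * 1.0000001 % 10.0
    return x

if __name__ == "__main__":
    for workers in (2, 4, 6):
        start = time.perf_counter()
        with mp.Pool(workers) as pool:
            pool.map(work, range(TASKS))
        print(f"{workers} workers: {time.perf_counter() - start:.1f}s")
```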
 
> You sound exactly like those guys who used to say that nobody needs 32GB of RAM, right up until a week ago...

I still don't NEED 32, but I got it anyway, just in case; you can't use RAM that isn't there if you need it. But in the end, it is about need, isn't it? Do you NEED to turbo to 4.8 GHz? Probably not; most don't. Should it work as Apple advertises? Yes. But if it doesn't, don't buy it and quit whining. And there is a thin line between noting an issue and being consumed by it, and what we have here about the processors is obsession. But I guess this is just a normal day in the tech world.
 
> The newer CPUs no longer use SpeedStep; they have switched to something called Speed Shift, which is supposed to be a lot more agile and responsive.

Wasn't Speed Shift introduced with Skylake? I never saw this kind of bouncing there. I also tried to make it bounce by running brief packets of intense computation with short breaks in between, but all it did was confuse the system, and it didn't even turbo properly. The packets and pauses had to be rather long to have any noticeable effect on frequency.
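For anyone who wants to try the same experiment, here is a rough sketch of that bursty workload (single-threaded Python; the burst and pause lengths are arbitrary knobs to play with while watching the frequency trace):

```python
# Bursty load: compute hard for BURST_S, idle for PAUSE_S, repeat.
# Very short bursts may be absorbed by the frequency governor;
# longer ones should move the clocks visibly.
import time

BURST_S = 0.05   # length of each compute burst (arbitrary)
PAUSE_S = 0.05   # length of each idle gap (arbitrary)
CYCLES  = 600    # total run: CYCLES * (BURST_S + PAUSE_S) seconds

def burst(stop_at: float) -> None:
    x = 1.0001
    while time.perf_counter() < stop_at:
        x = x * 1.0000001 % 10.0

for _ in range(CYCLES):
    burst(time.perf_counter() + BURST_S)
    time.sleep(PAUSE_S)
```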
 
> Apple is charging people $300 for an "upgraded" i9 CPU that just plain doesn't work. Apple is ripping people off.

Well, there are two issues: raw speed and core count. Apps that use multiple cores can run faster on more cores at a lower clock speed than on fewer cores at a faster clock speed. So, for many apps, 6 slower-clocked cores will be faster than 4 cores at a higher speed. So it is not really a "ripoff," though the clock speeds Apple advertises should be accurately portrayed. Which is why I asked: what about ACTUAL apps, not benchmark tools? From the tests a few people have done on the internet, it seems apps do tend to be faster with the 6 cores, regardless of clock speeds.
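As a back-of-the-envelope illustration (the sustained clocks below are hypothetical, purely to show the arithmetic): for a perfectly parallel task, six throttled cores can still out-run four faster ones.

```python
# Hypothetical sustained all-core clocks, same architecture assumed.
six_core  = 6 * 2.7   # aggregate throughput in GHz-equivalents
four_core = 4 * 3.5
print(f"6-core advantage: {six_core / four_core - 1:.0%}")  # ~16%
```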
 
I know that on the PC side, some OEMs weren't enabling support for it in UEFI/BIOS by default. Some (laptops, mostly, I think) weren't even exposing it as an option, and I recall reading about people using ThrottleStop to undervolt and enable it. That's what I did with my 7700HQ: there was no option in the BIOS, so I just flipped the switch in ThrottleStop, just in case.

Anyway, maybe Apple still had macOS handling the power/frequency transitions until recently?
> So it is not really a "ripoff," though the clock speeds Apple advertises should be accurately portrayed.

If Apple is selling someone a "2.9 GHz" i9 CPU, then that CPU's minimum operating speed under load should be 2.9 GHz.

That's the minimum. That's why it's called a base clock. In a properly designed machine with sane cooling, it should also be able to turbo, even with all cores under heavy load. I have two notebooks that can do it. One of them is a Mac.

If they're selling people a 2.9 GHz i9 that can't run all day and all night at 2.9 GHz, then they're ripping people off, the same way Ford would be ripping people off if it sold someone a truck with an "upgraded" drivetrain rated for a 7,500 lb towing capacity that overheated and shut down if you actually tried to tow 7,500 lbs.
 
> High-end users doing heavy stuff will want to know this, but the other 95 percent of people, who are browsing and checking their email, won't know the difference.

What I will say is: if we go back to the issue with the Nvidia GTX 970, where effectively only 3.5GB of the advertised 4GB of VRAM was usable, many people were not affected by it, maybe even the majority, and some were. However, if someone advertised 4GB and you paid for 4GB, you should get just that.

If you upgraded to the 32GB model and found out later that only 28GB is usable, would you just be like "Well, it doesn't affect my usage" and carry on? Because that is the wrong attitude when you pay for something, in my opinion.

It isn't just about "don't buy it then"; this is valid criticism, and you shouldn't just silently accept companies advertising and selling products which don't work as expected. Companies only get away with it if you stay silent on the issue. Do you think we would have got a keyboard repair programme, the free case after Antennagate, or the strengthened aluminium after Bendgate if we all followed your view of "don't whine and just don't buy"?

Nope...
 
> If you upgraded to the 32GB model and found out later that only 28GB is usable, would you just be like "Well, it doesn't affect my usage" and carry on?

What, then, in your opinion, do you think you are paying for? Do you expect that CPU to maintain max boost indefinitely? That's not really how it works.

I kind of miss the good old days, when you knew that a CPU sold as X GHz would run at X GHz and that's it. Now you buy CPUs that can run somewhere between A and B, or maybe below that, all as part of their normal operation.
 
> If someone advertised 4GB and you paid for 4GB, you should get just that.

We all do that now with storage. We know that a 256GB iPhone doesn't really have 256GB of storage available. I've said a few times that Apple should advertise accurately. But this isn't the first time a product spec has been over-advertised.
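Part of that storage gap, depending on how a given OS counts, is just decimal-vs-binary units (drives are sold in decimal gigabytes while some OSes report binary gibibytes), on top of the space the OS itself takes. A quick worked example:

```python
# 256 GB as sold (decimal bytes) expressed in binary GiB,
# before subtracting anything the OS itself occupies.
advertised_bytes = 256 * 10**9
print(f"{advertised_bytes / 2**30:.1f} GiB")  # ~238.4 GiB
```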
 
> Do you expect that CPU to maintain max boost indefinitely? That's not really how it works.

Indefinitely? No, but maybe more than a couple of seconds, and perhaps a stable base clock too. I mean, I am not asking for the world. I am sure even Apple aren't happy with how these chips are performing versus last year's, in terms of the chips' capabilities. Again, I know this may be partly Intel's fault; my post was more to highlight that shushing criticism isn't the way to go. It won't help consumers.
 
I see a lot of you are learning about thermal throttling for the first time with this release of 2018 MBPs, probably only because it's in the media so much. Nothing new to see here.
 
> We know that a 256GB iPhone doesn't really have 256GB of storage available.

I think it is valid criticism. I remember a huge deal being made, years ago, about some Windows or Android product where 40%+ of the space was used up by the OS itself.

That said, with storage there is usually some fine print advising that some of it will be used by the OS. I doubt there will be any for over-the-top throttling of the CPUs.
 
> It isn't just about "don't buy it then"; this is valid criticism, and you shouldn't just silently accept companies advertising and selling products which don't work as expected.

I am such an Apple fanboy. Well, not in the cheerleading sense, but in the sense that I have had so many iPhones, iPads, MacBooks, PowerBooks, even ... I have a desktop and a gaming notebook that run Windows, but for anything that isn't gaming, I reach for an Apple device. That's just where I'm comfortable; it's what I like.

I'm an Apple guy. I was really excited about this generation of MacBook Pros because I'd been planning to replace my 2016 13" with a 15" in the hopes of dual booting it with Windows + eGPU to enable me to get rid of my desktop machine, which is still super capable (4790K + GTX 1080) but takes up a lot of space and is getting on in years.

So, this throttling thing bums me out, because for my use case it's a real killer. But I'm still an Apple guy, and I have confidence that at some point they'll sort this crap out and right the ship.

But all these people who reflexively defend them, who mock, dismiss, or berate folks who are disappointed when things like this happen, I'll never understand it.

It is what it is. Apple has seriously, thoroughly screwed up with these devices. That's just reality, and I really don't get the point in denying it.
 