I'm afraid you don't have much knowledge of thermodynamics or the physics of silicon.

A 45W TDP means this: 45W of power consumed is turned into 45W of heat. A 45W CPU from the Nehalem generation will consume exactly the same amount of power, and generate EXACTLY the same amount of HEAT, as a 45W CPU from the Skylake generation. It has nothing to do with how old the CPU is. Efficiency today is achieved by adding more performance within THE SAME thermal envelope as before. It does not make CPUs use less power, unless we are talking about light loads or idle states.

And again: a 45W CPU from any past generation does not produce more heat than a 45W CPU from the current generation. This goes for every CPU generation there is.

That's an absurdly simplistic way of trying to explain away something without giving it much thought.

True, 45W is 45W, and if processors ran at their highest clock speed constantly, then yes, heat generation would be roughly the same. Fortunately, that's not all there is to the story.

A more efficient architecture and multi-core design both help newer models achieve the same performance at a lower power draw. This directly translates to lower heat generation for newer processors with the same TDP as older ones.
 
No. 45W of heat is 45W of heat, regardless of the CPU's generation. That's physics. Efficiency brings higher performance within the same thermal envelope, or a lower one.

But 45W of heat is still 45W of heat. Intel has offered the same 45W TDP CPUs for years now, and they are not cooler, just more efficient than previous generations. Efficient means higher performance in the same thermal envelope. This applies to every CPU generation with the same or similar power envelope.
 

1) You can't have a Watt of heat. You can have a Watt-Hour of heat, but not a Watt.

2) 45W Processors don't run at a power draw of 45 Watts at idle, or when running at a lower clock speed.

I won't reply again if you don't understand this.
 
Well, I'm guessing you know better, even if you didn't understand my post from the beginning...

Why do people have such big problems with reading comprehension? It's the details that make the difference...

It is quite funny. You made me "wrong" even though what you have written, I already wrote in my first post. So who is the one who doesn't understand?
 
1) You can't have a Watt of heat. You can have a Watt-Hour of heat, but not a Watt.

We all know the watt is a unit of rate (joules per second), but yes, you can have a device that dissipates heat at 45W, and all 45W CPUs will by definition dissipate heat at the same rate (at 100% utilization).
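To put rough numbers on the power-versus-energy distinction, here is a few lines of Python; the figures are purely illustrative and not from anyone's post:

# power (watts) is a rate; heat energy is power multiplied by time
power_w = 45.0                      # a 45W part dissipating at its rated TDP
hours = 1.0
energy_wh = power_w * hours         # 45 watt-hours of heat over one hour
energy_kj = power_w * hours * 3.6   # 1 Wh = 3.6 kJ, so 162 kJ
print(energy_wh, energy_kj)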
 
Let me fix this for all of you...

A 45W CPU AT FULL LOAD generates the same amount of excess heat as it did 10 years ago (we'll ignore for a second that the ratings are never 100% accurate and a 45W CPU can also exceed 45W temporarily...).

Since CPUs have become more powerful over the years, though, what required a CPU at 100% back in the day may now only require... say... two cores at 25%, thus lowering the power required for the given task. So in essence you are BOTH correct.

Technically, 45W is 45W. Whether the full amount of power is required, though... that is a totally different story.
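A quick Python sketch of that point, with made-up figures (the wattages and durations are hypothetical, not measurements):

# same task, same 45W TDP ceiling, but the newer chip finishes it
# faster and at partial load, so the task costs less energy overall
old_power_w, old_seconds = 45.0, 100.0    # older CPU pegged at full load
new_power_w, new_seconds = 20.0, 40.0     # newer CPU, partial load, done sooner
old_energy_j = old_power_w * old_seconds  # 4500 J for the task
new_energy_j = new_power_w * new_seconds  # 800 J for the same task
print(old_energy_j, new_energy_j)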
 
lol, they are hotter because they are older and therefore have older tech, made on 28nm, and the CPU and GPU are less efficient, so more energy goes to heat. The heat/compute ratio gets worse for every generation you go back, for both CPUs and GPUs. It has little to do with the aging of the Mac itself, other than some dust in the heatsinks and aging thermal paste.


Just so we're clear about what this argument was all about: "Newbie" Meldalinn was correct.
 
And now we all love each other again and focus our anger on Apple or Intel instead of each other :) They are the truly guilty ones - they didn't deliver what we hoped and prayed for, whether that's USB-C, 45 watt-hours, or Kaby Lake in Q2 2016!
 
Had it been the power draw, they wouldn't have called it Thermal Design Power.

It is self-explanatory: it is the thermal power produced by design, i.e. at maximum, worst-case load - the figure the cooling and power delivery have to be designed for.

New generations consume less at idle than older ones.
Performance per watt is also higher in newer chips.
That also means that a given TASK needs less ENERGY to be completed.
The strategy for how to accomplish this can vary (e.g. bursts at high clock vs. continuous, prolonged load at base clock).
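As a rough Python illustration of those two strategies - the wattages and durations below are made up, only the bookkeeping matters:

# the same task done two ways, compared over the same 60-second window
idle_w = 2.0
burst_w, burst_s = 45.0, 10.0          # turbo burst: finish in 10 s, then idle
sustained_w, sustained_s = 20.0, 30.0  # base clock: slower, but lower power
window_s = 60.0
burst_energy_j = burst_w * burst_s + idle_w * (window_s - burst_s)                   # 550 J
sustained_energy_j = sustained_w * sustained_s + idle_w * (window_s - sustained_s)   # 660 J
print(burst_energy_j, sustained_energy_j)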

Peace.
 

I heard your mic drop all the way over here...
 
I do not have Delete key in my book

Try CTRL+Alt+Canc


FTFY too

 
Razer Core finally shipping, customer photos:
https://imgur.com/a/jpSyd

Unboxing video:

So very unbelievably jealous right now. I'm stuck on 8 GB of RAM and my elderly 330M can't even play Dota 2. Meanwhile these Razer d00ds are high-fiving themselves with their hotrod eGPUs, playing Crysis multi-screen 4K at 240 FPS on max settings.
Jealous of what? This Razer Core has an insane price of $500, and you still have to buy a dGPU (another $300 or more?). For $500 I can buy a whole desktop computer without a dGPU. Just buy a PlayStation :)
 
Integrated graphics share the same total TDP as the CPU. So a 45W chip can either give most of the power to the CPU (e.g. 40W CPU / 5W iGPU) or shift it toward the iGPU (e.g. 25W CPU / 20W iGPU). It will actually go over its 45W TDP for a time, but as soon as it hits 100°C it will start throttling and the power will pull back closer to its max TDP of around 45W.

For normal workloads (no CAD/gaming/3D graphics), this isn't a problem because the load is generally intermittent, so the CPU/iGPU will hit its turbo speeds and you'll have a smooth, fast computer.

However, as soon as you run a game that demands constant high power, the power has to be shared between your CPU and iGPU. Your 3.5GHz (turbo-boosted) CPU will underclock itself to keep the heat within limits and will run well below its specified frequency, around 1.5GHz, and the GPU is limited as well, since it operates under the same power and heat envelope. This is why iGPUs still suck for games.

Benchmarks will usually look okay because they only run for a short time, but real workloads run much longer and the heat eventually slows things down.

Have a look at the Intel Power Gadget to see what I'm talking about in action: https://software.intel.com/en-us/articles/intel-power-gadget-20 - If you want to game on it, the external TB3 enclosure from Razer is a good option.
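A toy Python sketch of the budget-sharing idea described above - this is not Intel's actual power-management algorithm, just the arithmetic of a fixed package limit with hypothetical demand figures:

# toy model: CPU and iGPU demands are scaled back to fit one shared package budget
def share_budget(cpu_demand_w, igpu_demand_w, package_limit_w=45.0):
    total = cpu_demand_w + igpu_demand_w
    if total <= package_limit_w:
        return cpu_demand_w, igpu_demand_w          # everything fits, full turbo
    scale = package_limit_w / total                 # cut both back proportionally
    return cpu_demand_w * scale, igpu_demand_w * scale

print(share_budget(30.0, 5.0))    # light, bursty load: (30.0, 5.0), no throttling
print(share_budget(40.0, 25.0))   # a game loads both: roughly (27.7, 17.3), clocks drop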

Thanks, I understand everything you are saying, but that didn't answer my question. I was asking why the Iris 580 performs slower than the last-gen iGPU, the Iris Pro 6200, since that one would be subject to the same throttling and heat issues.

Also, you are wrong: iGPUs do not suck for gaming. On mobile devices, iGPUs are ideal for gaming.

First of all, find me a 20W dGPU that performs anywhere near the Iris 580.

Gaming is a spectrum, from light/casual gaming, to medium/multiplayer gaming, to high-end and enthusiast gaming.

For most of that spectrum, say 70-80%+, an iGPU is ideal, because it can play the games just as well, but cooler, quieter, and with less battery drain. Only the really high-end games benefit from a dGPU, and with each generation that goes by, iGPUs become ideal for a larger part of this spectrum.


Oh, and I have been trying to get a Razer Core since February, but their site still just says "notify me".


IINM, Razer stated the Core currently only works with their Blade laptops. No third-party support for the time being.

Wrong, they work with any TB3 laptop.
 
So I think the speed of the SSDs Apple is using right now corresponds to the Samsung SM951. As Samsung has announced the SM961 with even better speeds and availability in 2H 2016, do you think Apple might include these in the next 15" MacBook Pro?

http://www.anandtech.com/show/10168/samsung-shows-off-sm961-and-pm961-ssds-oem-drives-get-a-boost

Faster is better, right? :D

Also, with 1TB SSDs now available, this opens up the option of a 2TB upgrade - doesn't it? I'm just hoping the 15" comes standard with 512GB, otherwise I'll need to upgrade... even then I might upgrade for the hell of it ;)
So excited. I've never used a computer with an SSD, so the new MBP will probably blow my socks off.
Jealous of what? This Razer Core has an insane price of $500, and you still have to buy a dGPU (another $300 or more?). For $500 I can buy a whole desktop computer without a dGPU. Just buy a PlayStation :)
I would still buy it, since I plan on doing a lot of gaming on it.
Hopefully Apple comes out with its own solution for this.
 
Thanks, I understand everything you are saying, but that didn't answer my question. I was asking why the Iris 580 performs slower than the last-gen iGPU, the Iris Pro 6200, since that one would be subject to the same throttling and heat issues.

Also, you are wrong: iGPUs do not suck for gaming. On mobile devices, iGPUs are ideal for gaming.

First of all, find me a 20W dGPU that performs anywhere near the Iris 580.

Gaming is a spectrum, from light/casual gaming, to medium/multiplayer gaming, to high-end and enthusiast gaming.

For most of that spectrum, say 70-80%+.
First you write that the Iris 580 is slower than the Iris 6200 in real-life situations (!), then you write that the performance is OK for 80% of games. Nice ********s :D If an iGPU is OK for 80% of your games, that means you are not a gamer and probably play casually - FIFA or LoL. For gamers, an iGPU is a piece of garbage 90% of the time.
 
Look for Karabiner; you can customize shortcuts or key combinations.
Love Karabiner, it rescued me from the stupid @ symbol being in the wrong place (from the PoV of a Windows convert). But... when I remote into my Windows work PC, it switches back again!
 
Jealous of what? This Razer Core has an insane price of $500, and you still have to buy a dGPU (another $300 or more?). For $500 I can buy a whole desktop computer without a dGPU. Just buy a PlayStation :)

Insane price of $500? Is there a cheaper option out there?

First you write that the Iris 580 is slower than the Iris 6200 in real-life situations (!), then you write that the performance is OK for 80% of games. Nice ********s :D If an iGPU is OK for 80% of your games, that means you are not a gamer and probably play casually - FIFA or LoL. For gamers, an iGPU is a piece of garbage 90% of the time.

Yes, that's exactly what I am saying: the Iris Pro 6200 and the 580 are both perfectly fine for 80% of PC gaming.

And no, I am not a casual gamer - I will be buying a GTX 1080 ASAP. I regularly build water-cooled gaming systems with 2x and 3x SLI, using the latest desktop GPUs available, and put them through the most rigorous tests possible.
 
First you write that the Iris 580 is slower than the Iris 6200 in real-life situations (!), then you write that the performance is OK for 80% of games. Nice ********s :D If an iGPU is OK for 80% of your games, that means you are not a gamer and probably play casually - FIFA or LoL. For gamers, an iGPU is a piece of garbage 90% of the time.

You're wrong. Look at the most popular PC games of 2015:

http://www.statista.com/statistics/251222/most-played-pc-games/

An Iris Pro 580 could play nearly all of those just fine. Only the "hardcore" games require a dGPU, and the 580 seems to be performing close to last generation's dGPUs. I've come to terms with and almost hope Apple ditches the dGPU (cause of many problems for the MBP over the years) and supports an eGPU for the hardcore gamers out there. I'd buy one of those, too.
 
First you write that the Iris 580 is slower than the Iris 6200 in real-life situations (!), then you write that the performance is OK for 80% of games. Nice ********s :D If an iGPU is OK for 80% of your games, that means you are not a gamer and probably play casually - FIFA or LoL. For gamers, an iGPU is a piece of garbage 90% of the time.
What's the definition of a gamer? I think it's a person who plays a game - no matter what game - almost every day, right?
So there are a lot of people... millions or more... who play Diablo, CS, LoL, WoW, bla bla, etc. - not demanding games - so the Iris 580 will be more than OK for those games.
Being a gamer is not necessarily tied to demanding games.
 
In my definition, it's a person who buys new AAA titles often and wants the best possible graphics and performance. Yes, a LoL player can play it 15 hours a day and the Iris 580 will be very good for that game (also WoW and other online games with graphics from 2002).
You're wrong. Look at the most popular PC games of 2015:

http://www.statista.com/statistics/251222/most-played-pc-games/

An Iris Pro 580 could play nearly all of those just fine. Only the "hardcore" games require a dGPU, and the 580 seems to be performing close to last generation's dGPUs. I've come to terms with and almost hope Apple ditches the dGPU (cause of many problems for the MBP over the years) and supports an eGPU for the hardcore gamers out there. I'd buy one of those, too.
Yeah, I saw the site with recommended game settings for the Iris 580, and that was just sad.
Insane price of $500? Is there a cheaper option out there?
It's hard to call a $500 box an option. You can buy a whole PC without a dGPU for this price.
 
In my definition, it's a person who buys new AAA titles often and wants the best possible graphics and performance. Yes, a LoL player can play it 15 hours a day and the Iris 580 will be very good for that game (also WoW and other online games with graphics from 2002).
This is the worst definition of a gamer.
 