Discussion in 'Mac Pro' started by macpro00, Feb 9, 2016.
x5675 processor is comparable to which i5/i7 processor?
None. It's the previous generation's design; the i-series has an improved instruction set and seriously lower power requirements, and one is a server chip while the other is a consumer chip. For raw performance of one versus the other, do a "geekbench x5675" search versus, say, an i7-4790; if my memory serves me well, they should be close to the same in terms of work getting done.
I agree, none. The i5 doesn't have 6 cores. The i7 does, but has very different clock, memory speed, TDP, etc. So there is no equivalent.
Officially, back with Westmere, the consumer-grade equivalents of the Xeons were the i7 970/980X/990X, but there was no exact equivalent to the X5675 (3.06GHz/95W). I'm guessing that is not your question, though; nowadays CPUs do not really compare as stated above, and it depends on your usage (multi-threaded/single-threaded).
Over the last 4-5 generations, every i7 generation had about 5-10% better performance per GHz than the previous one. So, very roughly and with a big pinch of salt, you could say a 3.06GHz hexacore from 5 generations back compares to a current-generation i7 hexacore at about 2.0GHz (if one exists).
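To put rough numbers on that compounding, here is a quick Python sketch; the 5-10% per-generation figure and the 5-generation span are taken from the estimate above, everything else is just arithmetic:

```python
# Back-of-the-envelope: compound a 5-10% per-GHz (IPC) gain over
# 5 generations and see what clock a modern hexacore would need to
# match a 3.06 GHz Westmere hexacore.
base_clock_ghz = 3.06

for gain_per_gen in (0.05, 0.10):
    ipc_factor = (1 + gain_per_gen) ** 5        # cumulative IPC improvement
    equivalent_clock = base_clock_ghz / ipc_factor
    print(f"{gain_per_gen:.0%}/gen -> {ipc_factor:.2f}x per-GHz, "
          f"~{equivalent_clock:.2f} GHz needed")
```

At 10% per generation this works out to roughly 1.9 GHz, which is where the ~2.0 GHz guess above comes from.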
--- Post Merged, Feb 11, 2016 ---
95W = power consumption? Is it more efficient in terms of not eating electric bill vs a processor rated at 130W?
--- Post Merged, Feb 11, 2016 ---
That's the TDP; it says nothing directly about how much power the CPU consumes to finish a task.
e.g. A CPU has a TDP of 20W, so its max power consumption is roughly 20W (there is basically no overclocking in OS X, so we ignore that here), and you need a cooling system that can handle 20W of heat.
On the other hand, if another CPU has a 120W TDP, does that mean the 120W CPU is less power efficient? No.
The 20W CPU may idle at 20W, while another, much more advanced 120W CPU can idle at 10W. So the 120W CPU has better energy efficiency at idle.
And if the more advanced 120W CPU is 12x faster than the 20W CPU, a job that takes the 20W CPU 12 minutes takes the 120W CPU only 1 minute. That's 240 W·min of energy versus 120 W·min, so the 120W CPU ends up twice as energy efficient for this particular task (and this is just considering the CPU's power consumption; in real life the other components also draw power, so the faster CPU may have an even bigger benefit in overall energy efficiency).
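The energy comparison in that example is just power times time; here it is spelled out in Python, using the same deliberately extreme numbers as above (and the same simplification of treating TDP as actual draw):

```python
# Energy = power x time, using the 20 W vs 120 W example above.
# Treating TDP as the actual sustained draw is a simplification.
energy_slow_wmin = 20 * 12    # 20 W CPU running for 12 minutes
energy_fast_wmin = 120 * 1    # 120 W CPU running for 1 minute

print(energy_slow_wmin, "W-min vs", energy_fast_wmin, "W-min")
print("efficiency ratio:", energy_slow_wmin / energy_fast_wmin)
```

So the "less efficient looking" 120W chip finishes the job on half the energy.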
I mostly agree with your post regarding the difference in energy efficiency, except for "A CPU has TDP 20W, so it's max power consumption is 20W".
This can be explained by looking at the term TDP and what it stands for: Thermal Design Power, which in Intel's case means the maximum amount of heat the cooler should be able to dissipate to keep the CPU within Tjunction (the maximum temperature at which the CPU can still work at full speed). So TDP doesn't directly translate into the CPU's power consumption, although of course it is related.
Also, a 20W-TDP CPU shouldn't idle at 20W; that would already be very close to the amount of heat the cooler should be able to dissipate, and thus near 100% load instead of idle, considering that most, if not all, of the energy a CPU uses turns into heat.
To "dumb" this down a bit... Can TDP be related to efficiency? In other words, 6cylinder 3.2 vs 6cylinder 3.5 car?
No, TDP is a maximum-heat-produced-by-the-CPU type of thing. For your efficiency idea, the relevant number would be the VID, or voltage ID, of the chip. All chips have one set, which is the default voltage of the chip for the speed it is rated at. Since the dynamic power a chip consumes scales with the square of the voltage applied, lower voltage is most definitely better in terms of efficient use of power.
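That voltage-squared relationship (dynamic power scales roughly as P ∝ C·V²·f) can be illustrated with a quick calculation; the 1.2V and 1.0V figures here are just example values, not any particular chip's VID:

```python
# Illustrative only: dynamic CPU power scales roughly as P ~ C * V^2 * f.
# At the same clock and capacitance, dropping core voltage from a
# hypothetical 1.2 V to 1.0 V:
p_ratio = (1.0 / 1.2) ** 2
print(f"power falls to about {p_ratio:.0%} of the original")
```

A ~17% voltage drop yields roughly a 30% power reduction, which is why undervolting pays off so disproportionately.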
No, exactly the opposite. In no way is TDP a grade for efficiency on its own.
You could, for example, divide a Geekbench score by TDP to get some kind of weird score per thermal-design watt, which says perhaps a bit about efficiency but is most likely not very precise anyway.
TDP only really says something about how much heat a CPU can produce (and should get rid of).
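The "score per thermal-design watt" idea above is trivial to compute; a small sketch, with the caveat already given that it is not a precise efficiency metric (the benchmark numbers here are made up, not real Geekbench results):

```python
def score_per_tdp_watt(benchmark_score, tdp_watts):
    """Crude efficiency proxy: benchmark points per thermal-design watt."""
    return benchmark_score / tdp_watts

# Hypothetical scores, purely for illustration:
print(score_per_tdp_watt(9000, 95))   # older chip at 95 W TDP
print(score_per_tdp_watt(9000, 65))   # same score at 65 W TDP ranks higher
```

Two chips with the same score but different TDPs come out differently, which is all this metric can really show.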
I totally agree. I tried to simplify the case and made it extreme for easier understanding; I should have clearly stated the assumption, e.g. a modern CPU should not idle at its max TDP, but a very old, poorly designed CPU might.
Anyway, AFAIK, most of the energy turns into heat; that's why the actual max power consumption and the TDP are so closely related.
Thanks for "dumbing" this down for us!
Since my install of the X5675, my cMP is running a bit zippier -- much more noticeable when using Photoshop. The temperature appears to be stable, but I'm curious to see what others are experiencing with their idle temps.
My X5675 / 6 core is running @ 44 degrees C (average CPU).
I've got a single-processor 2009 Mac Pro (not the one in my signature) with X5680; it runs around 38° at idle.