Building a PC

skaertus

macrumors 68040
Original poster
Feb 23, 2009
3,222
374
Brazil
Yes, I am trying to build a desktop PC now, and going nuts with all the options. Do not get me wrong: I have a MacBook Pro, and I think it is great. I could buy an iMac, but I am not going to, for a few reasons. First, Apple has not upgraded the iMac in a while. Second, Apple is making all its devices more expensive, and this trend is one additional reason not to lock myself into an ecosystem. Third, macOS is great and fine, but the Mac version of Microsoft Office is sub-par, and I see no compelling reason for using macOS instead of Windows, especially on a desktop. Plus, Time Magazine seems to be right: if Apple thinks the iPad is the future of computers, then why buy a Mac (http://time.com/5439441/apple-macbook-air-great-buy-ipad/)?

However, one thing that attracts me to a Mac is the heat management. I have a MacBook Pro, and it runs cool. All my other Macs run cool, or at least they did not heat up as much as the PC laptops I had.

Now, I want a PC that generates as little heat as possible while still being powerful. I currently have a mini PC with a Core i5-7500T and integrated graphics. It is OK, but performance is lacking, especially for driving a 4K monitor. The thing is, this mini PC has no heating issues - it runs cool. I still want my computer to run cool, especially since I live in Brazil, which is a hot country. But I want gains in performance.

I do not have any desktop Macs, but I guess Apple manages to keep its desktops reasonably cool compared to a similar PC. The iMac has desktop-class processors, and a dedicated video card. The Mac Mini now has full-blown 65W TDP Intel processors.

I wonder how I could manage to build a desktop PC with a 65W processor and a dedicated video card that will run without raising the temperature of the room. I guess the cooler makes little or no difference for this. Perhaps a better case? I was first thinking of a Core i7-8700 with an Nvidia GTX 1070. However, I guess that may generate a lot of heat, and I could go as low as an AMD Ryzen 5 2400G with its integrated Vega 11 graphics, if that is what it takes to keep temperatures low.
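For a rough comparison, essentially all the power a PC draws ends up as heat in the room, so the combined TDPs of the two candidate builds bound their sustained heat output. A minimal sketch (the TDP figures are nominal vendor numbers, and real draw varies with load):

```python
# Rough room-heating comparison of the two candidate builds.
# TDPs are nominal vendor figures (assumptions); actual draw
# varies with load, but they bound sustained heat output.

def heat_btu_per_hour(watts):
    """All power a PC draws ends up as heat; 1 W = 3.412 BTU/h."""
    return watts * 3.412

builds = {
    "i7-8700 + GTX 1070": 65 + 150,   # CPU TDP + GPU TDP
    "Ryzen 5 2400G (Vega 11)": 65,    # single 65 W APU package
}

for name, watts in builds.items():
    print(f"{name}: ~{watts} W -> ~{heat_btu_per_hour(watts):.0f} BTU/h")
```

So under sustained load the discrete-GPU build would put roughly three times as much heat into the room as the APU build.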

Suggestions?
 

HomeLate

macrumors member
Apr 29, 2015
30
20
It depends on what you want to do with that PC...

Not exactly the same processor, but I have a home server based on an AMD Ryzen 2200G and it runs pretty cool during CPU/GPU-intensive tasks. I didn't use the AMD heatsink/fan combo but an Arctic Freezer 33 heatsink/fan combo. It reaches 40 degrees Celsius under full load.

My main PC is an AMD Ryzen 1800X with a Dark Rock Pro 4 cooler and it stays nice and cool during CPU-intensive tasks. But, and it's a big but, my Nvidia-based Titan can heat up to 90 degrees Celsius during games/rendering. It's based on Nvidia's reference design. You'll have to do some research to find the quietest and coolest graphics card.
 

skaertus

macrumors 68040
Original poster
Feb 23, 2009
3,222
374
Brazil
It depends on what you want to do with that PC...

Not exactly the same processor, but I have a home server based on an AMD Ryzen 2200G and it runs pretty cool during CPU/GPU-intensive tasks. I didn't use the AMD heatsink/fan combo but an Arctic Freezer 33 heatsink/fan combo. It reaches 40 degrees Celsius under full load.

My main PC is an AMD Ryzen 1800X with a Dark Rock Pro 4 cooler and it stays nice and cool during CPU-intensive tasks. But, and it's a big but, my Nvidia-based Titan can heat up to 90 degrees Celsius during games/rendering. It's based on Nvidia's reference design. You'll have to do some research to find the quietest and coolest graphics card.
Thanks, this is helpful.

If your Ryzen 2200G does not heat up, I guess a Ryzen 2400G would also stay cool. I suppose the cooler does not make any difference to the heating of the room, just to the heating of the internals, am I right?

As for your Ryzen 1800X, does it heat up the room, even though it is kept cool?

The graphics card, I suppose, generates more heat. I was under the impression that AMD video cards would run hotter, as they have a higher TDP. But then Apple uses AMD video cards in its iMacs. Is it because they run cooler than Nvidia's, perhaps?
 

Mikael H

macrumors 6502a
Sep 3, 2014
644
277
You will not get around physics: you're converting electric energy into computational work and heat. At some point (which depends a bit on what components you have in your computer), the inefficiency takes off: the higher the performance you want, the less additional computational power (and therefore the more heat) you'll get from each additional watt of power you put into the system.
With large heat exchangers and/or fans you can spread the generated heat more evenly across your room, but there’s no way around the fact that the heat will have to go somewhere.

If you want to do some homework, it should be possible to find tables of benchmarks vs. energy input for different CPUs and GPUs. To find the most performance per watt, graph these tables out, and find the "knee". That's your sweet spot. Or base a build around TDP values for an approximation of the maximum sustained heating your room will experience. And switch your light bulbs for LEDs to compensate. ;-)
Or build a heat duct to pull as much as possible of the computer's heat out of your room, à la an AC unit.
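The knee-finding idea above can be sketched in a few lines: tabulate (watts, benchmark score) pairs, compute the marginal score per extra watt between adjacent parts, and stop where the marginal gain falls off. The numbers below are made-up placeholders, not real benchmark figures; substitute published results.

```python
# Sketch of finding the performance-per-watt "knee" from a
# benchmark table. Scores and wattages are illustrative
# placeholders, not real benchmark results.

# (watts, benchmark score) pairs, sorted by power draw
samples = [(35, 4000), (65, 9000), (95, 11000), (125, 12000)]

# Marginal performance gained per additional watt between
# adjacent samples; the knee is where this drops off sharply.
marginals = [
    (s2 - s1) / (w2 - w1)
    for (w1, s1), (w2, s2) in zip(samples, samples[1:])
]

for (w1, _), (w2, _), m in zip(samples, samples[1:], marginals):
    print(f"{w1}->{w2} W: {m:.0f} points per extra watt")
```

With these placeholder numbers the marginal gain roughly halves after the 65 W part, so that would be the sweet spot before diminishing returns set in.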
 

skaertus

macrumors 68040
Original poster
Feb 23, 2009
3,222
374
Brazil
You will not get around physics: you're converting electric energy into computational work and heat. At some point (which depends a bit on what components you have in your computer), the inefficiency takes off: the higher the performance you want, the less additional computational power (and therefore the more heat) you'll get from each additional watt of power you put into the system.
With large heat exchangers and/or fans you can spread the generated heat more evenly across your room, but there's no way around the fact that the heat will have to go somewhere.

If you want to do some homework, it should be possible to find tables of benchmarks vs. energy input for different CPUs and GPUs. To find the most performance per watt, graph these tables out, and find the "knee". That's your sweet spot. Or base a build around TDP values for an approximation of the maximum sustained heating your room will experience. And switch your light bulbs for LEDs to compensate. ;-)
Or build a heat duct to pull as much as possible of the computer's heat out of your room, à la an AC unit.
Well, thanks. I suppose there is no other way, then.
 

velocityg4

macrumors 601
Dec 19, 2004
4,604
1,186
Georgia
Well, thanks. I suppose there is no other way, then.
The only way you'll avoid heating up a room much is by dumping the heat outside the room. If you are really determined, you could use liquid cooling: place waterblocks on your CPU and GPU, then run the cooling lines to a radiator outside your house. You'd dump most of the heat generated outside, not counting the minor heat generated by other components. There are some risks. Having long lines outside the case increases the risk of line damage, and if it's substantially colder outdoors than indoors, you run the risk of water condensing on the lines in your room and case, although you can insulate them.

Another option is undervolting and underclocking. Typically you can reduce the energy usage of a CPU and GPU by undervolting them. How much depends on your luck in the silicon lottery, as there are minor differences between each chip manufactured, which can lead to fairly substantially different VCore settings at a given clock rate. It's a bit tedious, but you can make decent deductions. The tedium comes from reducing the voltage and then stress testing, going as low as you can before it becomes unstable. You can supplement this by going for models with the best performance per watt; generally this would be an Intel CPU and an Nvidia GPU. You'd also want to disable turbo functions.
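To see why undervolting pays off, CMOS dynamic power scales roughly with voltage squared times frequency, so even a modest VCore drop at the same clock cuts power noticeably. A back-of-the-envelope sketch (the wattage and voltages are illustrative, not measured values for any specific chip):

```python
# Rough estimate of undervolting savings. Dynamic CMOS power
# scales approximately with V^2 * f; this ignores static
# leakage, so treat it as a ballpark, not a measurement.

def dynamic_power(p_stock, v_stock, v_new, f_ratio=1.0):
    """Scale stock power by (V_new / V_stock)^2 * (f_new / f_stock)."""
    return p_stock * (v_new / v_stock) ** 2 * f_ratio

# A 95 W chip at 1.25 V stock, undervolted to 1.10 V at the
# same clock speed (illustrative numbers):
print(f"{dynamic_power(95, 1.25, 1.10):.0f} W")  # prints 74 W
```

So a 0.15 V undervolt alone would shave roughly 20 W of sustained heat, before any underclocking.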
 

HomeLate

macrumors member
Apr 29, 2015
30
20
Thanks, this is helpful.

If your Ryzen 2200G does not heat up, I guess a Ryzen 2400G would also stay cool. I suppose the cooler does not make any difference to the heating of the room, just to the heating of the internals, am I right?

As for your Ryzen 1800X, does it heat up the room, even though it is kept cool?

The graphics card, I suppose, generates more heat. I was under the impression that AMD video cards would run hotter, as they have a higher TDP. But then Apple uses AMD video cards in its iMacs. Is it because they run cooler than Nvidia's, perhaps?
Both have the same TDP, so heat won't be that big of an issue. The 2200G reaches 55 degrees Celsius while running 3DMark, and it does not heat up the room at all, even when transcoding Plex movies.

The Ryzen 1800X is a different animal. It has double the TDP. Together with the Titan X, which can reach up to 90 degrees Celsius, it heats things up pretty easily. But like I wrote before, that is only under full load when gaming or batch-processing images in PhotoLab.

Like others already wrote, you could place it in another room with sufficient ventilation to dissipate the heat and use remote desktop to work on it from your Mac Mini. You could also stream games if Steam is installed on both your PC and your Mac Mini. I do this all the time in my bedroom on my SFP while the Ryzen 1800X is doing all the work.

AMD GPUs tend to run hotter than Nvidia's. You'll have to do some research on which GPU to get. Avoid overclocked ones, because they run hotter than GPUs running at stock speeds.