So if I understand this correctly: CPU usage in OS X will appear much higher than in Windows, but only at first glance. Correct me if I'm wrong, but from what Rbarris (a Valve dev) was saying, to go from the Windows scale to the OS X scale you take whatever Windows reports as the CPU usage and multiply it by the number of cores. So 5% usage on a dual core would show as 10% in OS X (5 x 2), and 20% on a quad core in Windows would be 20 x 4 = 80% in OS X.

OS X, it would seem, shows usage per core, allowing up to 100% per core. I.e., if you have 4 cores, the maximum total CPU usage you could reach is 400%. Windows only uses a 100% scale.

Here is what Rbarris said:

My i5 is an Arrandale with 2 physical cores, but it hyper-threads to 4 logical ones, so I have a 400% potential. So in this case, we see Activity Monitor taking up about 1% of my CPU, which means it is using 1% of one of my 4 logical cores. On Windows, that would show up as 1% / 4 = 0.25%.

So if you ever thought, "man, OS X needs a lot more CPU to do the same things"... well, there you go.
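To make the conversion concrete, here is a small sketch of the arithmetic described above. The function names (`windows_to_osx`, `osx_to_windows`) are my own invention, just to illustrate the two scales; they assume `logical_cores` counts hyper-threaded logical cores, as in the i5 example.

```python
def windows_to_osx(windows_pct: float, logical_cores: int) -> float:
    """Convert a Windows-style reading (0-100% of the whole machine)
    to the OS X per-core scale (0-100% per logical core)."""
    return windows_pct * logical_cores

def osx_to_windows(osx_pct: float, logical_cores: int) -> float:
    """Convert an OS X Activity Monitor reading back to the
    Windows whole-machine scale."""
    return osx_pct / logical_cores

# The examples from the post:
print(windows_to_osx(5, 2))    # 5% on a dual core -> 10.0
print(windows_to_osx(20, 4))   # 20% on a quad core -> 80.0
print(osx_to_windows(1, 4))    # 1% of one of 4 logical cores -> 0.25
```

Same numbers, just two different denominators: Windows divides by the whole machine, OS X divides by one core.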