...
2. If, on the other hand, one uses the VM's applications [even light use], then the temperature increase is large...for example, light use of Excel [small spreadsheet, not calculation intensive] and IE raised the temperature from 50 degrees to 73 degrees.
...

IE is almost certainly the important part of this observation. It's easy to assume that it doesn't take much power to display web pages. But the problem is that many web pages have ads, and many ads are in Adobe Flash. A poorly-programmed Flash ad might do very little but still take a huge amount of CPU power.

I'm guessing this is what you're seeing. What you could do is switch from IE to Chrome. Chrome has a nice Task Manager which will tell you which web pages and plugins are using the most CPU power and memory. You can use this to figure out what's taking up all your CPU power. If it does turn out to be Flash, there are plugins available for all browsers that will allow you to selectively run/block Flash.
 
...
Running a virtual machine does take a lot of resources - you're running two operating systems simultaneously. ...

And here's the crux of the matter. It's a commonly held misconception that running an operating system takes a lot of resources (CPU power, etc.). After all, the operating system is always running and it's the interface that your software is using constantly to access your hardware.

But in reality, operating systems take up very little CPU time. This is easily verified by simply opening up Activity Monitor (or Task Manager or whatever) and then checking how much CPU time is being used by the "System." Typically it's maybe only a couple percent. Multiply a couple percent by two and you still only have a few percent. So your explanation of why VMs (hypothetically) take a lot of CPU time doesn't hold water.
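The same sanity check can be done from the command line instead of Activity Monitor. This is a minimal sketch (not from the original posts) that sums the CPU usage reported by `ps` across all processes; on an idle machine the total is typically only a few percent, which supports the point that the OS itself is not the resource hog:

```shell
# Sum reported CPU usage across all running processes.
# `ps -A -o %cpu=` works on both macOS and Linux; on an idle system
# the total will usually be just a few percent.
ps -A -o %cpu= | awk '{total += $1} END {printf "%.1f%% total CPU across all processes\n", total}'
```

Run it while the VM is booted but idle, then again while software is active inside the guest, and you can see for yourself where the load is coming from.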

This is empirically shown by the OP's own observations that when his VM is booted but he's not using any software in the guest OS, the increase in CPU temperature is nominal. Thus, logically, any increase in CPU temperature is coming from using software in the VM. That is to say, your claim that running a VM requires a lot of CPU power, regardless of what you're doing with the VM, is wrong.

And since the software running in the VM is causing the temperature increase, that means it can be investigated and possibly reduced.

Instead of arguing with me about this, it might do you some good to try to investigate why your VMs are using so much of your CPU time. Because I have been using VMs for the past ~7 years and I can tell you that if my VMs are using a lot of CPU time it's because they're actually doing something and not simply because they are VMs.
 
I mentioned earlier in the thread that, despite motrek's claims to the contrary, even light tasks in a virtual machine on a MacBook Air will have a significant impact on your CPU temperatures.
It's not the VM itself that impacts workload and temps; it's the apps/processes running within the VM that do. If you have no processes running within it, a VM alone has very little impact on resources. It's the same with your native OS X installation. Very little is consumed by the OS itself. It's the apps and processes the user has running that put higher demands on resources.
There isn't a problem, and there's no need for concern - unless the temperatures are getting beyond 90C - which it looks like they aren't.
You can even go beyond 90C without problems, as the TJmax is 105C for the CPU and (if I recall correctly), 100C for the GPU.
 
A short note to thank everyone for their input and assistance on this, as I now believe -- as a result of everyone's input [and a little help from iStat Pro] -- I have a good understanding of what is taking place, with the benefit that everything is as it should be!
 