Actually you are more wrong than he is.
As others have already pointed out more eloquently, this is entirely wrong, and I wonder what your motivation is in posting it in the first place. It's complete nonsense.
Baloney! Take a look at those printed circuit boards: do you see an excess of chips, or a bunch of old technology? A designer can have a significant impact on system power usage simply by deciding what to include and what to exclude.
Apple is achieving the lower power design through underclocking the graphics chips and binning.
It appears that the only thing underclocked is the D700, the top-end GPU. We don't even know if this is for power usage or reliability. These top-end GPUs are pushed really hard and as a result have higher failure rates than Apple might find acceptable. We can't really say that the underclocking is only due to the power issue.
There is nothing fancier going on than that. The real challenge in any system build is effectively venting the energy that will be released as heat.
Sure, that is a challenge, but reducing heat production in the first place shrinks that challenge to something manageable.
What Apple *is* very good at is creating creative (note I say "creative," which is different from "effective," although they can and do often overlap) heat dissipation systems, and the Mac Pro is no exception.
I would agree with that; they seem to lead the industry in that respect. However, removing heat is not all there is to thermal management. I would say reducing heat production in the first place is a smart bit of engineering. You have to remember the CPU is only part of the heat equation on a motherboard. Simply addressing losses in the voltage-regulation circuitry that feeds the CPU can have a big impact.
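To put some rough numbers on that, here's a back-of-the-envelope sketch. The wattage and efficiency figures are my own assumptions for illustration, not Apple's actual specs:

```python
# Back-of-the-envelope VRM loss calculation. The CPU wattage and the
# efficiency figures are hypothetical, chosen only to show the shape
# of the math.

cpu_power_w = 130.0  # power delivered to the CPU (assumed)

def vrm_loss(output_w, efficiency):
    """Heat dissipated by the voltage regulators themselves."""
    input_w = output_w / efficiency
    return input_w - output_w

for eff in (0.85, 0.92):
    print(f"{eff:.0%} efficient VRM: {vrm_loss(cpu_power_w, eff):.1f} W lost as heat")

# Going from ~85% to ~92% efficiency cuts VRM heat roughly in half
# (about 22.9 W down to about 11.3 W) before the CPU itself changes at all.
```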
Also nonsense. No such complete system upgrade should be required.
I cannot agree with you at all on this one. With today's hardware it is absolutely silly to consider CPU or GPU upgrades viable.
You're also making the false assumption that everyone's needs are bound by system-wide limitations, distributed evenly. The bulk of the Mac Pro's computational power is tied up in its graphics cards through OpenCL; it is not a particularly relevant computer if your needs are CPU-intensive.
I can't agree with this either; many apps are computationally intensive and still bound to the CPU. That might be due to the code base not translating to OpenCL easily, or it might be laziness on the part of the developer, but the fact remains the Mac Pro is still a machine where the CPU is important.
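As a concrete illustration of why some code doesn't translate to OpenCL easily (hypothetical code, not from any real app): any loop where each step depends on the previous one has no straightforward data-parallel mapping, so it stays on the CPU.

```python
# Sketch of a loop-carried dependency: a simple one-pole IIR filter.
# Each output depends on the previous output, so the loop cannot simply
# be split across thousands of GPU work-items the way a per-pixel
# operation can.

def iir_filter(samples, a=0.95):
    out = []
    prev = 0.0
    for x in samples:
        prev = a * prev + (1.0 - a) * x  # serial dependency on prev
        out.append(prev)
    return out

print(iir_filter([1.0, 1.0, 1.0, 1.0]))
```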
More important, though, is that in a year or two Xeon will likely move to a different socket with a new RAM interface, which will require a new motherboard anyway. GPUs are just as likely to go through major overhauls. In the end you have to upgrade so much of a system to realize the benefit of new tech that it isn't worth it to consider GPU or CPU upgrades for the vast majority of users.
Think about it for a bit: the GPUs in these machines represent about two-thirds of the cost of the whole machine. It depends upon the configuration, of course, but you will have a hard time convincing me of the value in putting that much money into an upgrade of an existing machine. This is especially the case if it has been around for more than two years.
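Run the rough math yourself. All the prices below are hypothetical placeholders, not actual Mac Pro or GPU pricing; only the two-thirds share comes from the argument above:

```python
# Back-of-the-envelope upgrade economics with assumed prices.

machine_price = 4000.0   # what the machine cost new (assumed)
gpu_share = 2.0 / 3.0    # GPUs as a fraction of that cost

gpu_upgrade_cost = machine_price * gpu_share
print(f"GPU upgrade at the same share of cost: ${gpu_upgrade_cost:,.0f}")

# Compare with selling the old machine and buying new (assumed resale value).
resale_after_2yrs = 2000.0
net_cost_of_new = machine_price - resale_after_2yrs
print(f"Net cost of a whole new machine after resale: ${net_cost_of_new:,.0f}")

# With these assumptions the GPU-only upgrade (~$2,667) costs more than
# moving to an entirely new machine (~$2,000), which also refreshes the
# CPU, RAM, and I/O.
```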
The ability to upgrade the GPUs, then, is paramount to the useful lifespan of the machine, since GPU revisions and developments occur at a rapid pace.
Again baloney! That was certainly the case in the past, but we aren't seeing massive yearly bumps in GPU performance anymore. At best the GPU manufacturers come out with new architectures every two years, and even then the performance jumps are nowhere near as dramatic as in the past. So if it takes two-plus years to see a new GPU, what has happened to the rest of the technology in the machine? By the time three years have passed, I see no value at all in a GPU upgrade that doesn't also address the CPU, and at that point you might as well buy that new machine.
Even if all your apps are tightly GPU-bound and you know that for a fact, I just don't see huge value in a GPU upgrade these days. We simply aren't seeing massive growth in GPU performance anymore.