Originally posted by abdul
Does anyone know how much of a difference a 100MHz speed bump actually makes to the speed of the computer in real terms?
When it was upgraded from 700 to 800, Apple said it was a 30% increase in performance (I think). So does that mean another 30%?
When Apple upgraded the 700MHz iBook to the 800MHz iBook, it was not just a mere speed bump. If I remember correctly, when the 700MHz iBook was the top iBook, they all had a Mobility Radeon graphics card rather than the Radeon 7500. Moreover, I think they only had 16MB of VRAM, which I believe sucks with Quartz Extreme. So given all that, they could probably claim up to a 30% performance increase in selected tasks.
This time around, all that changes is the CPU speed. Given that it's still subject to a 100MHz bus, I very much doubt the performance of the 900MHz part is much greater than that of the 800MHz one. Maybe on CPU-intensive tasks, such as iTunes encoding, you'll get a 10% increase at most. Not sure it will even be that.
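For the fun of it, here's a quick back-of-the-envelope estimate in Python. It assumes a simple Amdahl-style split where only the CPU-bound part of a task scales with the clock and the bus/memory-bound part doesn't; the CPU-bound fractions are pure guesses on my part, not measurements of anything:

# Rough speedup estimate for the 800MHz -> 900MHz bump.
# Only the CPU-bound fraction of a task is assumed to scale
# with clock speed; the bus-bound remainder stays put.
# The fractions below are made-up illustrations, not benchmarks.

old_clock = 800.0  # MHz
new_clock = 900.0  # MHz
clock_ratio = new_clock / old_clock  # 1.125, i.e. 12.5% at best

for cpu_bound_fraction in (1.0, 0.8, 0.5):
    # New run time relative to an old run time of 1.0
    new_time = cpu_bound_fraction / clock_ratio + (1.0 - cpu_bound_fraction)
    speedup = (1.0 / new_time - 1.0) * 100.0
    print(f"{cpu_bound_fraction:.0%} CPU-bound -> ~{speedup:.1f}% faster")

# Output:
# 100% CPU-bound -> ~12.5% faster
# 80% CPU-bound -> ~9.8% faster
# 50% CPU-bound -> ~5.9% faster

So even a task that's 100% CPU-bound tops out at 12.5%, and anything that touches the 100MHz bus regularly will see less, which is why I'd guess 10% at most.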
This is just guesswork though; I have no benchmarks to support my claims. Anyone care to shed some light on this?
Cheers
Latino