I haven't seen any mention of cache. On chips like the P4, and I believe the 970 as well, cache easily takes up 1/3 to 1/2 of the die, and that's precisely why the external bandwidth can be lower.
What appears to have happened here is Sony said: hey, we want to fit more processing power in, so ditch the cache and add a massive amount of RAM bandwidth instead. The problem is, once they lost the cache, they became dependent on that bandwidth actually being there. If anyone in electronics is here, let us know what the proper signal trace length is for the speeds they are talking about. I believe it was 6 GB/s to RAM... that is insane; the memory chips will have to be within centimeters of the Cell chip. The question becomes whether Sony can make it work at the speeds they need.
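To put rough numbers on the "within centimeters" claim, here's a back-of-envelope sketch. The figures are my assumptions, not from the post: signals on FR-4 PCB traces propagate at roughly 0.6c, and the example per-pin data rates are illustrative (high-speed memory like XDR signals in the multi-GHz range per pin).

```python
# Back-of-envelope: how far a signal travels on a PCB trace during one
# bit time. Assumptions (mine): ~0.6c propagation speed on FR-4,
# illustrative per-pin data rates.

C = 299_792_458            # speed of light in a vacuum, m/s
TRACE_SPEED = 0.6 * C      # rough signal speed on an FR-4 trace, m/s

def distance_per_bit(data_rate_hz: float) -> float:
    """Distance (meters) a signal travels during one bit time."""
    bit_time = 1.0 / data_rate_hz
    return TRACE_SPEED * bit_time

for rate in (400e6, 800e6, 3.2e9):
    cm = distance_per_bit(rate) * 100
    print(f"{rate / 1e9:.1f} Gbit/s -> {cm:.1f} cm per bit time")
```

At multi-gigabit per-pin rates the signal only travels a few centimeters per bit time, so trace lengths (and skew between traces) have to stay well under that, which is why the memory ends up parked right next to the chip.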
Also, the ATI and NVidia chips are massively parallel, have a ton of floating-point power, and are programmable through shaders! Remember that if Sony doesn't use a dedicated graphics chip, they are pushing all of that work onto the Cell. The chip isn't dedicated to 3D, and an existing PowerMac G5 cannot do 3D as well as current-generation GPUs. So "10x the power of a 970" isn't very meaningful in terms of graphics performance; we don't yet know what this means for real-time graphics.

For the people claiming this will seal the coffin on the XBox: don't count on it. We are at least one generation away on ATI and NVidia graphics processors, and you have no idea what Microsoft will be using. As far as graphics power goes, the two could end up nearly identical when all is said and done. Sony may have done this because they can reprogram the chip to do things other than graphics, but in a graphics-intensive game you will rarely do that. The XBox2, on the other hand, will have a next-generation GPU, probably with 6 or 8 pipelines, plus a dual-core PPC-based CPU for everything else: AI, physics, sound, and so on. I wouldn't count them out.

I would bet Sony went with the Cell to reduce costs, assuming they could eliminate the GPU. The grave mistake is that they are now VERY dependent on Rambus for XDR memory and the two busses they licensed. I bet that in the end it will cost them more to produce, even though they saved themselves the GPU, and the power will be about equal between the two products.