So you're comparing a high-end, expensive CPU from 2006 to a console CPU from 2006? You're comparing an 8800GT to a 7800-class console GPU.
And they're better. People have been claiming they're worse, when they're not. Bringing up price points or whatever just changes the argument.
That's why your PC is faster: it's got a GeForce 8800, which was a far faster GPU when it came out than the 7800 was.
And the CPU's a lot faster too. But yeah, the PS3 uses basically a 7900-class GPU.
Most developers say both consoles are fairly equal overall, each with pros and cons: the 360 version could look a bit better, but the PS3 version could have better physics.
Well, now this is really getting off topic, but it's pretty clear, both from actual games and from developer comments, that the PS3 can look better but is harder to code for (and if care isn't taken, it can look worse).
First, I want to point out that there's a lot of misinformation in this thread.
Look at another game, UT3, built from the ground up for the PS3 and PC. Yet the PS3 version runs at less than half the speed, at a lower resolution, and at lower detail settings compared to even low-end PCs at the time of the game's release. My aluminum MacBook can run it at higher detail with more onscreen action and at double the frame rate, albeit at a slightly lower resolution than 720p.
This is really the same situation we've always seen: at best, consoles are roughly equivalent to PC hardware when they launch, and then quickly fall behind. I actually think the Xbox 1 was more advanced for its launch date than the Xbox 360 or PlayStation 3 were. It took several months after the Xbox 1 launched before a significantly more powerful GPU was available on PCs, and the 733MHz Celeron was pretty solid for its day. The 360/PS3 designs are more exotic by comparison, yet more powerful GPUs and CPUs were already available when they launched. (In the 360's case, the GeForce 7800/7900 series was already out; in the PS3's case, the 8800 series was already out.)
Quite honestly, the PS3's "Cell" is nothing short of a joke. The main core, the PPC core, is so painfully slow that it has no business being anywhere near modern games. And the "SPEs", the co-processors, are somehow even worse. They're good at very linear, predictable math, the type Photoshop would use... but they've proven to fail massively when it comes to demanding physics or AI. Just look at GTA4 as an example. Or even Half-Life 2.
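To be concrete about what "linear, predictable math" means here, a rough sketch in plain C (my own illustration, not actual SPE code or intrinsics) of the two kinds of workload:

    /* The kind of loop a streaming core loves: fixed trip count,
       no branches, contiguous arrays. A compiler can vectorize it
       and the hardware can prefetch it perfectly. */
    void scale_add(float *out, const float *a, const float *b,
                   float k, int n)
    {
        for (int i = 0; i < n; i++)
            out[i] = a[i] * k + b[i];
    }

    /* The kind of code it chokes on: data-dependent branches and
       pointer chasing, which is what a lot of game AI looks like. */
    struct agent { int state; struct agent *target; };

    void update_agent(struct agent *a)
    {
        if (a->state == 0 && a->target)   /* unpredictable branch    */
            a->state = a->target->state;  /* scattered memory access */
        else
            a->state ^= 1;
    }

The first loop streams through memory in a straight line; the second jumps around based on data, which is exactly where an in-order core with a small local store struggles.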
And it's the same situation with the Xbox 360: its CPU cores are really ancient designs, good only at a limited set of things.
The GPU in the PS3 is a massive failure as well. It's basically the equivalent of two GeForce FX 5200s taped together and forced to run in SLI mode. With the exception of Oblivion, which had an extra year in development and all-new textures, PS3 games run at lower resolutions and lower detail settings, with significantly lower-resolution textures, compared to the Xbox 360 versions of the same games.
Okay, this is where your argument falls apart. As that other guy mentioned, it's basically a 7900GTX, and while it's not a unified architecture, it's higher-end than the 360's GPU. And Oblivion is far from the only game that looks or runs better on the PS3; offhand I can think of BioShock, Mercenaries 2, Burnout Paradise, and Grand Theft Auto 4 (there's some debate about that one, but I'm going by IGN's review). Plus, if you look at PS3 exclusives, and even first-generation PS3 games versus first-generation 360 games, it's pretty clear it does have more potential, although practically speaking I don't think the difference is gigantic. And of course it is harder to code for, so you get lots of situations (especially at first) where the PS3 version of something looks worse. (I suspect a lot of developers initially were just dumping 360 code designed to run on three CPUs onto the PS3's single main CPU, which would be pretty much exactly in line with the performance difference we saw in some early multiplatform games; see the sketch below.)
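Here's roughly what that porting pattern looks like, as a hypothetical C/pthreads sketch (the three-way split and all the names are my assumption, not real 360 or PS3 code):

    #include <pthread.h>

    #define NUM_SLICES 3  /* 360: three general-purpose cores */

    /* Placeholder for one third of a frame's game logic. */
    static void *sim_slice(void *arg) { (void)arg; return NULL; }

    /* 360-style frame: the work fans out across three threads. */
    void run_frame_360(void)
    {
        pthread_t t[NUM_SLICES];
        for (long i = 0; i < NUM_SLICES; i++)
            pthread_create(&t[i], NULL, sim_slice, (void *)i);
        for (int i = 0; i < NUM_SLICES; i++)
            pthread_join(t[i], NULL);
    }

    /* Naive PS3 port: the same three slices run back-to-back on the
       single PPU while the SPEs sit idle -- roughly a 3x hit, which
       lines up with the gap seen in some early multiplatform games. */
    void run_frame_ps3_naive(void)
    {
        for (long i = 0; i < NUM_SLICES; i++)
            sim_slice((void *)i);
    }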
The designs of both CPUs are stupid, but the PS3/360 still have tons of great, and great-looking, games, just by virtue of how much serious development gets done on them and the fact that developers can target their hardware specifically.
The parts are dirt cheap these days. A 160GB notebook drive is what? $50 now?
Seagate's highest end notebook drive (a 7200RPM 320GB drive) is only $90 much of the time!

Yeah, I mean I'm so tired of Apple pushing looks over function :-/
Why not? The first Apple TV used a Pentium D (I think) clocked at only around 1GHz, and it can do 720p... so if I compare that with a 1.6GHz dual-core whatever, I believe it can do 1080p. Plus, with Nvidia 9400 graphics, the next Apple TV would have OpenCL enabled in its OS X.
I can't remember exactly what it uses, but it's not a Pentium D. I think it's basically a Pentium M or single-core Core 1 type thing that's been massively stripped down. I'm not really clear on how Atom stacks up against really stripped-down modern CPUs running at much lower clocks, though. I guess the rule of thumb is that an Atom at 1.6GHz is supposed to be roughly like a Pentium III at 1GHz.