No. First, Objective-C programs typically don't rely on the CPU's endianness as much as C programs do - and it's bad programming style to rely on endianness anyway.
I have some image manipulation code written using Quartz that disagrees with you. Try to manipulate RGBA values inside a 32-bit integer without conforming to the platform's endianness and come back and tell me the result.
That's just how it is. It's not bad programming style; it depends on the requirements. If you're required to go to a lower level, you most certainly have to pay attention to endianness. For example, I translate images to a series of pixel values using a CGBitmapContext so I can manipulate the pixel data (to partially superimpose images) before displaying them on screen.
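A minimal sketch of the kind of thing that bites you here (my own illustration, not code from this thread; the 0xRRGGBBAA layout and values are assumed for the example):

    #include <stdint.h>
    #include <stdio.h>

    int main(void) {
        /* One RGBA pixel stored as a 32-bit value with red in the most
           significant byte (0xRRGGBBAA), as you might pull out of a bitmap buffer. */
        uint32_t pixel = 0x11223344;

        /* Wrong on little-endian: treating the first byte in memory as red. */
        uint8_t *bytes = (uint8_t *)&pixel;
        printf("first byte in memory: 0x%02x\n", bytes[0]); /* 0x44 on Intel/iOS, 0x11 on big-endian */

        /* Portable: shift and mask the integer instead of poking at raw bytes. */
        uint8_t r = (pixel >> 24) & 0xFF;
        uint8_t g = (pixel >> 16) & 0xFF;
        uint8_t b = (pixel >>  8) & 0xFF;
        uint8_t a =  pixel        & 0xFF;
        printf("r=0x%02x g=0x%02x b=0x%02x a=0x%02x\n", r, g, b, a);
        return 0;
    }

Whichever way you do it, you have to know how the pixel is laid out in memory, which is exactly the endianness dependence being argued about.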
The beauty of ARM, though, is that it's bi-endian: it can run as either big- or little-endian, based on how the chip designer and the OS configure it. Apple runs its iOS devices little-endian, the same byte order Intel uses.
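If you ever need to check at runtime, a one-liner like this (a generic C sketch, not something from the thread) tells you which way the host stores bytes:

    #include <stdint.h>
    #include <stdio.h>

    int main(void) {
        uint32_t probe = 1;
        /* If the least significant byte comes first in memory, the host is little-endian. */
        if (*(const uint8_t *)&probe == 1)
            printf("little-endian (iOS devices, Intel Macs)\n");
        else
            printf("big-endian (older PowerPC Macs)\n");
        return 0;
    }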
You, for the LOLZ.
The A5 is dual-core and runs at 884 MHz.
Maximum core count for this ARM architecture (Cortex-A9) is 4.
"Normal" clock is 1 GHz.
So, a normally clocked four-core A5 would be MUCH closer to the C2D version.
Plus, ARM is so far ahead of Intel in efficiency that even a significant overclock would change nothing - it would still consume FAR less power.
Now, how about a dual-processor (2x4 cores) design and some overclock?
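(Back-of-the-envelope, my arithmetic: taking the figures above at face value and assuming perfect scaling, 4 cores at 1 GHz is 4.0 GHz of aggregate clock versus 2 x 0.884 = 1.77 GHz, roughly 2.26x the theoretical throughput. The catch, as the next reply points out, is that real workloads never scale perfectly.)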
You assume desktop and real-world tasks can be made parallel with 100% efficiency. That is unfortunately not the case. You'll never reproduce synthetic benchmark results with real-world software, because instructions are highly dependent on one another; while some form of out-of-order execution is possible, you cannot throw instructions at the cores in round-robin fashion and hope something good comes out.
And a quad-core to catch up to a dual-core? Please. The ARM stuff might be highly efficient, but it's nowhere near the Intel mobile CPUs. It's good for handheld and embedded devices; why force yourself to stuff it into something like a laptop when you can simply go for Intel's mobile parts, which are already pretty efficient and much, much better at raw instructions per clock?
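The scaling point above is essentially Amdahl's law. A rough illustration (mine, with a made-up 80% parallel fraction, not a claim about any particular workload):

    #include <stdio.h>

    /* Amdahl's law: overall speedup is limited by the serial fraction of the
       work, no matter how many cores you add. */
    static double amdahl_speedup(double parallel_fraction, int cores) {
        return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores);
    }

    int main(void) {
        double p = 0.80; /* assume 80% of the task parallelizes cleanly */
        for (int cores = 1; cores <= 8; cores *= 2)
            printf("%d core(s): %.2fx speedup\n", cores, amdahl_speedup(p, cores));
        /* Prints roughly 1.00x, 1.67x, 2.50x, 3.33x - nowhere near 8x on 8 cores. */
        return 0;
    }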
Those people who only do Web, e-mail, and Facebook.
How are they clueless? That's what they need out of a computer. Who are you to judge their needs? I know many people who need nothing beyond those things, especially now that the Web encompasses a rich multitude of web applications (like iWork.com, Google Docs, etc.).
The same guy you just called clueless for using a computer only for Web access probably thinks you're just as clueless for not needing a full set of Allen wrenches in all imperial and metric sizes. Not everyone has the same use for the tools we use (computers are tools), and you can't judge someone based on their needs in a particular field.