I thought I was getting great signal strength most of the time - 3 or 4 bars most anywhere I go. After reading several articles the past couple of days, I have a lot less respect for the network. Apparently, ANY time you have less than 5 bars you already are below a 5% signal.
Anyone who refers to the signal by percentage alone is proving that they have no clue what they are talking about. And it's not totally their fault that they have no clue... but that still doesn't mean what they're saying (or griping about) is actually true.
3G cell phone signals are not linear, and neither is the dB scale used to represent signal strength.
Decibels are logarithmic, a totally different beast from a plain linear percentage, so you can't meaningfully translate a dB signal-strength reading into a simple percentage either.
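To see why the scales don't line up, here's a minimal Python sketch. The -51 and -113 dBm figures are just illustrative endpoints for a "strong" and "barely usable" phone signal, not anything official:

```python
def dbm_to_mw(dbm: float) -> float:
    """Convert a dBm reading to absolute power in milliwatts.
    dBm is logarithmic: every +10 dBm is 10x the power."""
    return 10 ** (dbm / 10)

# Illustrative readings: -51 dBm as a very strong signal,
# -113 dBm as near the usable floor.
strong = dbm_to_mw(-51)
weak = dbm_to_mw(-113)

# The two readings differ by only 62 "points" on the dB scale,
# but in absolute power the strong signal is more than a
# MILLION times the weak one. A linear 0-100% scale simply
# can't represent that range honestly.
print(strong / weak)  # ~1.58 million
```

So a phone sitting at "50%" of the dB range between those endpoints is nowhere near 50% of the power; the percentage framing falls apart before you even get to the noise question.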
On top of that, signal strength by itself means nothing in a WCDMA signaling scheme. WCDMA isn't a plain old radio signal: it's hundreds of channels interleaved into a single wideband stream.
Think of WCDMA as a room that people have to go into if they want to have a conversation with each other. For the sake of simplicity, let's assume it's a small room with good acoustics, so no matter what happens you can hear well: the signal is strong. But that still doesn't determine how well you understand the person you're talking to. If you and your partner are the only two people in the room, then everything is great. If there are, say, five or six pairs of people in the room, then everyone can hear each other's conversations, and you might have to talk a little louder with the person you want to speak with so you can understand each other. And if there are LOTS of people in the room talking, then it's really noisy, and you'll have to strain to hear the person you're trying to pay attention to.
Bringing this back to your iPhone, a "5-bar" signal strength means the signal is quite "strong." But it can be chock full of useful information relating only to your call or the web page you're downloading, or it can have little bits of the information you want mingled in with tons of other phone calls and data sessions... information you don't want or need, and just appears to your phone as noise and interference.
But the reverse is also true (though maybe not as often): "1-bar" could mean a weak signal from a far away tower or a bridged antenna, but with very little noise. So even though your phone has to "shout" (increase power) to hear and be heard, the call could still hold.
That, in a nutshell, is 3G, and it's very different from the way 2G GSM, EDGE, and good old fashioned analog radio work. It sounds chaotic, and lots of times it is. But cell carriers can pack more calls and data into the same amount of bandwidth this way, even if it means the electronics are way more complex and all this chaos has to be dealt with.
The figure in 3G that actually carries more meaning is Ec/Io: the ratio of pilot energy (the useful part of the signal, the "conversation" you want to hear) to total received energy (useful energy plus useless noise, static and interference, i.e. everyone in that room talking all at once). In simplistic terms, you can think of it as a signal-to-noise ratio. And no, the bar graph on your phone doesn't reflect it.
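The ratio itself is simple arithmetic. Here's a small Python sketch; the "quiet cell" and "crowded cell" numbers are made up purely to show how the same total received power can hide very different signal quality:

```python
import math

def ec_io_db(pilot_mw: float, total_mw: float) -> float:
    """Ec/Io: pilot (useful) energy divided by total received
    energy, expressed in dB. Since the pilot is part of the
    total, this is always <= 0 dB."""
    return 10 * math.log10(pilot_mw / total_mw)

# Quiet cell: the pilot makes up half of everything received.
print(round(ec_io_db(0.5, 1.0), 1))   # about -3 dB: very clean

# Crowded cell: SAME total power at the antenna, but the pilot
# is buried under everyone else's calls and data sessions.
print(round(ec_io_db(0.05, 1.0), 1))  # about -13 dB: struggling
```

Note that a bar graph driven by raw received power would show both of those cells identically, which is exactly the point: the total is the same, but the share of it you actually want differs by a factor of ten.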
And this is why sometimes, people can have great sounding calls in a "1-bar" area, while others can frequently drop calls in a "5-bar" area.
Bottom line: when you're on a CDMA or 3G/UMTS/HSPA cell phone, the "bars" in your signal strength indicator are a crude and not entirely accurate way of gauging your chances of keeping or dropping a call.
The "bars" are a holdover from the analog/GSM days when cell phone signals and coverage could be more easily expressed by signal strengths and percentages. Modern cell signals aren't so simple: they're expressed better using ratios and mathematical equations. Most end users find that stuff cryptic, scary and not very useful, so the bars, inaccurate as they may be now, were kept.
Now, it's probable (in fact very likely) that Apple engineers know this, and chose for that reason to have a highly skewed bar graph. When the ec/io is good, the bar graph is probably programmed to be very optimistic about signal strength. When there's a lot more noise than useful signal, then it's probably programmed to be more pessimistic. And that could well mean it doesn't reflect true signal strength at all... but again, true signal strength alone is meaningless in 3G.
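To make that idea concrete, here's a purely speculative Python sketch of what a quality-skewed bar mapping could look like. Apple's real algorithm isn't public, and every function, threshold, and cutoff below is invented for illustration:

```python
def bars(rssi_dbm: float, ec_io_db: float) -> int:
    """Hypothetical skewed 1-5 bar mapping (pure speculation,
    NOT Apple's actual algorithm): a clean Ec/Io nudges the
    displayed strength up, a noisy one nudges it down."""
    if ec_io_db > -6:        # clean signal: be optimistic
        effective = rssi_dbm + 10
    elif ec_io_db < -12:     # noisy signal: be pessimistic
        effective = rssi_dbm - 10
    else:
        effective = rssi_dbm
    # Map the adjusted strength onto bars (cutoffs illustrative).
    thresholds = [-107, -101, -95, -89]  # dBm
    return 1 + sum(effective > t for t in thresholds)

print(bars(-99, -4))   # 4 bars: middling signal, clean cell
print(bars(-99, -15))  # 1 bar: SAME raw signal, noisy cell
```

Under a scheme like this, identical raw signal strength can legitimately show as four bars or one bar depending on interference, which matches the "great calls at 1 bar, dropped calls at 5 bars" experience described above.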
And for what it's worth, this is also why it's VERY possible to use software to fix an issue with the iPhone 4 antenna. If you can "teach" the 3G chipset to treat the antenna differently and use different methods to screen out noise and filter in a good pilot signal, then it's perfectly possible to fix the problems people are having with just a firmware update, AND the fix would be more than just a placebo. Just be warned: it MIGHT mean that those who hold their iPhones with the "death grip" will start to see reduced battery life if and when a fix comes out. It all depends on whether the embedded 3G processor has to be told to use more power to digitally process and "clean up" the signal when the antennas are bridged.