
B Gallagher, macrumors regular (original poster), joined Oct 19, 2008, New Zealand:
This just occurred to me while watching the video of this morning's event: why are signal bars not standardised between cellphone makers? First there's the seemingly big discrepancy in how many dBm a given number of bars actually equates to. Additionally, why does an iPhone show a maximum of 5 bars, while another phone shows 4, and another 3?

Any ideas??
 
Good question...

Not only that, but if Apple can just change the bar values on the iPhone in software, then all AT&T phones could behave differently: two phones could be receiving the same signal strength but show a different number of bars. Weird.
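
To make that concrete: the dBm-to-bars mapping is just a threshold table the manufacturer (or a software update) picks. Here's a minimal Swift sketch of the idea; every threshold below is invented for illustration, since the actual tables Apple and AT&T use were never published. The same RSSI reading yields a different bar count under the two hypothetical mappings:

```swift
/// Count how many thresholds the signal clears. Thresholds are ordered
/// from weakest to strongest signal, one per bar.
func bars(forRSSI rssi: Int, thresholds: [Int]) -> Int {
    thresholds.filter { rssi >= $0 }.count
}

// Two hypothetical mappings, e.g. pre- and post-update firmware.
let oldMapping = [-113, -107, -103, -101, -91]  // made-up dBm cutoffs
let newMapping = [-121, -103, -101, -97, -89]   // made-up dBm cutoffs

let rssi = -105  // the same physical signal strength at the antenna
print(bars(forRSSI: rssi, thresholds: oldMapping))  // 2 bars
print(bars(forRSSI: rssi, thresholds: newMapping))  // 1 bar
```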
 
My thoughts:

Different phones have different levels of sensitivity -- i.e. some can work with weaker signals than others. This is especially true over time; a newly produced phone can likely make clear calls on a signal that a five-year-old phone can't even detect. (I may be exaggerating.)

How do you standardize five bars as a measure of signal strength in that case? If, five years ago, you decided the difference between zero and one bar is where that phone loses the signal, is it fair to use the same scale for a modern phone that can work with a much weaker signal?

Better to adjust the scale so the bars span the range of signals the particular phone can actually handle. Seems like that's what the iOS 4.0.1 update is doing.
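
Something like this, as a rough Swift sketch: spread the bars linearly across the range each handset can actually use, from its sensitivity floor (the weakest signal it can still work with) up to a strong-signal ceiling. The floor and ceiling figures are made up for illustration:

```swift
/// Map an RSSI reading onto maxBars bars, scaled to this handset's
/// usable range rather than to a fixed industry-wide table.
func bars(forRSSI rssi: Int, floor: Int, ceiling: Int, maxBars: Int = 5) -> Int {
    guard rssi > floor else { return 0 }         // below what the radio can use
    guard rssi < ceiling else { return maxBars } // full strength
    let fraction = Double(rssi - floor) / Double(ceiling - floor)
    return max(1, Int((fraction * Double(maxBars)).rounded(.up)))
}

let rssi = -100  // same signal reaching both phones' antennas

// Older, less sensitive handset: loses the signal around -105 dBm.
print(bars(forRSSI: rssi, floor: -105, ceiling: -60))  // 1 bar

// Newer handset that still works down near -120 dBm.
print(bars(forRSSI: rssi, floor: -120, ceiling: -60))  // 2 bars
```

Two phones reading the same -100 dBm show different bar counts here precisely because each scale is stretched to cover a different usable range.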
 