Why are "bars" not standardised?

Discussion in 'iPhone' started by B Gallagher, Jul 16, 2010.

  1. B Gallagher macrumors regular

    Joined:
    Oct 19, 2008
    Location:
    New Zealand
    #1
    This just occurred to me while watching the video of this morning's event - why are bars not standardised between cellphone companies? First there's the seemingly big discrepancy in what signal level (in dBm) a given number of bars actually equates to. Additionally, why does one phone show a total of 5 bars, another 4, and another 3?

    Any ideas??
     
  2. Gav2k macrumors G3

    Joined:
    Jul 24, 2009
  3. B Gallagher thread starter macrumors regular

    Joined:
    Oct 19, 2008
    Location:
    New Zealand
    #3
    Not at all. I'm just curious. :)
     
  4. Khryz macrumors 6502a

    Joined:
    Jan 7, 2007
    #4
    Good question ..

    Not only that, but if Apple has the power to simply change how the bars map to signal strength on the iPhone, then all AT&T phones could behave differently. Two phones could be receiving the same signal strength but show a different number of bars. Weird.
     
  5. deeddawg macrumors 604

    Joined:
    Jun 14, 2010
    Location:
    US
    #5
    My thoughts:

    Different phones have different levels of sensitivity -- i.e. some can work with weaker signals than others. This is especially true over time; a newly produced phone can likely make clear calls on a signal that a five-year-old phone can't even detect. (I may be exaggerating.)

    How do you standardise five bars to measure signal strength in this situation? If you decided five years ago that the threshold between zero and one bar is where that phone loses signal, is it fair to apply the same scale to a modern phone that can work with a much weaker signal?

    Better to adjust the scale so it spreads the bars across the range of signal levels that the particular phone can handle. Seems like that's what the 4.0.1 update is adjusting for.
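
    The scaling described above could be sketched roughly like this. Everything here is made up for illustration -- the thresholds, the linear scaling, and the function name are assumptions, not Apple's or any carrier's actual formula (which isn't public):

    ```python
    # Hypothetical sketch: mapping a received signal strength reading (dBm)
    # to a bar count. The floor/ceiling values below are invented for this
    # example; each handset maker picks its own cut-offs, which is exactly
    # why bars aren't comparable between phones.

    def bars(rssi_dbm, floor_dbm=-113, ceiling_dbm=-51, num_bars=5):
        """Scale a dBm reading linearly across the range this phone can use.

        floor_dbm   -- weakest signal the handset can still detect (assumed)
        ceiling_dbm -- level treated as "full strength" (assumed)
        """
        if rssi_dbm <= floor_dbm:
            return 0
        if rssi_dbm >= ceiling_dbm:
            return num_bars
        fraction = (rssi_dbm - floor_dbm) / (ceiling_dbm - floor_dbm)
        return max(1, round(fraction * num_bars))

    # Two hypothetical handsets reading the same -100 dBm signal:
    old_phone = bars(-100, floor_dbm=-98)   # older, less sensitive radio: 0 bars
    new_phone = bars(-100, floor_dbm=-113)  # newer radio: still shows a bar
    ```

    The point of the sketch is the parameters: change the floor to match a more sensitive radio and the same -100 dBm signal jumps from "no service" to a bar, without the signal itself changing at all.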
     