The Genius Bar diagnosed my iPhone 4 as having dropped 11% of calls. They have software that shows it very clearly. I do drive through a trouble area, San Jose to Carmel.
 
1-800-MY-IPHONE

I encourage anyone who experiences reception drops, or who hasn't seen more than three bars since 4.0.1, to call the number above.

Their data stating 0.55% of users call into AppleCare may be accurate, but it is not precise.
 
Are you kidding? You're defending the Jobs presentation.

Please don't put words in my mouth. I didn't defend "the Jobs presentation." I didn't even say the data was correct. The OP, however, is using that data to create a distorted message. If anything, it's the OP who's using that data as a baseline, and distorting it to his own advantage. If you have an issue with the accuracy of the data, take it up with him, not me.
 
I didn't read every single post, but it seems like a lot of people are just arguing over how Jobs' fact was presented or stated. I found it to be pretty simple. If I make 100 calls on a 3GS, I may have some number x of dropped calls. x is arbitrary, since we're speaking in relative terms, not absolute. His iP4 statement basically said that if you made 100 calls on the iP4, you'd drop x + (<1) calls. I don't know why percentages were ever brought up, except to restate the "out of 100" statement. Am I wrong? This isn't a matter of odd wording to me at all. It appears pretty straightforward.
 
When Jobs said the iPhone 4 drops less than 1 more call per 100 than the 3GS, I assume this meant when the antenna isn't blocked (i.e. death grip), right? If so then I imagine the amount of calls dropped per 100 could increase.
 
I wonder if anyone compiling all this flawed data took into account the number of calls made deliberately to try to reproduce the dropped-call issue.
 
Please don't put words in my mouth. I didn't defend "the Jobs presentation." I didn't even say the data was correct. The OP, however, is using that data to create a distorted message. If anything, it's the OP who's using that data as a baseline, and distorting it to his own advantage. If you have an issue with the accuracy of the data, take it up with him, not me.
Okay, you may not be defending the presentation, but you've obviously seen it. That presentation was almost entirely about how all smartphones, including the iPhone 3GS, suffer attenuation. After the OP's latest post stating the same thing, you asked for a citation, despite the fact that this is now well-established common knowledge.

This argument will obviously never be settled between you and the OP, particularly now that it has gotten to this level (i.e. "cite every word including common knowledge, will not respond to actual argument"). For me, this thread is an interesting examination of the manipulation of statistics to use seemingly small numbers (1 more per hundred) versus potentially much bigger, more appropriate numbers (1.5x to 2x, or 50% more to 100% more).

Even those in this thread who agree with you say "sure, Steve could have said 100% more, but why would he?" Obviously he wouldn't, and that's what the OP and I are saying (I think). While "1 more per hundred," "2x," and "100% more" are all technically, mathematically correct ways of describing this phenomenon, the latter two (multipliers and percent-more) are more tangible, more commonly used ways of expressing differences between two sets of data. Jobs opted for the one that sounded best. So the point of this thread is to point out additional, synonymous statistics that put it in a different light.

The above paragraph is fact, except the italicized portion. The italicized portion is the only part we can actually argue about, and it appears the discussion has devolved far past the actual argument. I say, and I think the OP would agree, that saying a phone drops 2x the calls, or 100% more calls, than another phone has more meaning, and is a simpler explanation, than "1 more dropped call per 100 total calls." If you honestly think "1 more bad unit per 100 total units" is better and simpler than "2x the bad units" or "100% more bad units," then I guess it's your decision to think that way, and there's nothing left to argue, since it's all opinion.
 
You gotta take all these statistics with a grain of salt anyway...there are far too many externalities to even suggest statistical significance, such as (and I'm probably repeating some of the points but oh well):

- AT&T's figures are not just for the 3GS

- iPhone 4 has only been out for 3.5 weeks. The statistics are going to swing more wildly for the iPhone 4 compared to the 3GS.

- You are assuming that people are trying to call normally, but there's a chance that a statistically significant number of people are deliberately death-gripping the phone just to see if calls really get dropped. We will never know that.

- Coverage of the phones. You are basing some of your information on AT&T data alone; what about networks across the rest of the world? Oh wait, the iPhone 4 isn't widely available.

One additional point: we are talking about an absolute increase in dropped calls using percentages, NOT THE INCREASE IN DROP RATE. In other words, if the 3GS dropped 1 out of 100 calls, and the iPhone 4 dropped 1% more, then the iPhone 4 dropped 1.01 out of 100 calls.
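To make that distinction concrete, here's a quick sketch in Python (toy numbers only; the 1-in-100 baseline is an assumption, not AT&T's real figure):

```python
# Toy numbers: assume the 3GS drops 1 call per 100 (this baseline is a guess).
base_rate = 1 / 100

# Reading 1: "1% more" as a RELATIVE increase in the drop rate.
relative = base_rate * 1.01        # 1.01 dropped calls per 100

# Reading 2: "1 more per 100" as an ABSOLUTE increase (what Jobs described).
absolute = base_rate + 1 / 100     # 2 dropped calls per 100

print(round(relative * 100, 2))    # 1.01
print(round(absolute * 100, 2))    # 2.0
```

The two readings differ by almost a factor of two, which is exactly the ambiguity being argued over in this thread.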
 
Another interesting point is that Steve Jobs gave an excuse for why he couldn't just tell us how many calls the iPhone 4 drops. He said he could only give us the difference between the 2 phones, and not the absolute numbers, because AT&T doesn't allow them to disclose statistics on dropped calls.

Well, that turns out to be a blatant lie; as you can see, AT&T freely discloses the number of dropped calls on its network in its financial statements.

Not true. Steve said that AT&T did not want to reveal actual numbers of dropped calls, e.g. n drops over x period of time. AT&T is by no means freely disclosing actual numbers with the percentages in your screenshot. Not to mention, that data is for all calls made on their network, not for any specific brand or type of phone.
 
One additional point: we are talking about an absolute increase in dropped calls using percentages, NOT THE INCREASE IN DROP RATE. In other words, if the 3GS dropped 1 out of 100 calls, and the iPhone 4 dropped 1% more, then the iPhone 4 dropped 1.01 out of 100 calls.
This is exactly the ambiguity that Jobs was gunning for to obfuscate the truth of the matter. I'm more inclined to believe the OP's reading that it is <1/100 more calls (and far closer to 1 than to 0.5/100) than the 3GS.

[Image attachment: apple1dropcalliphone4.jpg]

If it were only 1% more calls than the 3GS you can bet your bottom dollar this is what he would have presented. The slide would read 1%.
 
This is exactly the ambiguity that Jobs was gunning for to obfuscate the truth of the matter. I'm more inclined to believe the OP's reading that it is <1/100 more calls (and far closer to 1 than to 0.5/100) than the 3GS.

[Image attachment: apple1dropcalliphone4.jpg]

If it were only 1% more calls than the 3GS you can bet your bottom dollar this is what he would have presented. The slide would read 1%.

Actually, my interpretation would've spelled really bad news for Apple if the drop rate were higher (e.g. 33 out of 100 calls). It seems the formula is X/100 for the 3GS vs. (X+Y)/100 for the iP4, where Y < 1 (who cares whether Y is closer to 1 or 0.01)... in the end the practical difference isn't tangible, even if the percentages have the potential to look very alarming, which is the same POV as the OP's.

But yeah, if you think of it that way, the higher the percentages, the better for us because the high figure means the absolute values have to be really low. :O
 
lol, this is the lamest thread I've ever seen. MacRumors is getting very lame lately, but I didn't expect it would fall to the saddest point like this.

First of all, I'm not a native English speaker. So, please excuse my poor English. But I hope you get my point.

The OP's point is that if the iPhone 3GS has 1% dropped calls and the iPhone 4 has 2% dropped calls, it's gonna be "dropping as many as 100% more calls", right?

So, let's assume that the iPhone 3Gs has 0.0001% dropped calls, which means that the iPhone 4 has 1.0001% dropped calls according to Jobs. Does that mean the iPhone 4 drops as many as (1.0001-0.0001) / 0.0001 * 100 = 1000000% more calls? lol.

And.... if the iPhone 3GS has 0% dropped calls and the iPhone 4 has just 0.000000001% dropped calls, does this mean the iPhone 4 drops as many as INFINITY more calls than the 3GS? Because, you know, (0.000000001 - 0) / 0 * 100 = infinity! OMGZ, infinity dropped calls! lol.

In fact, I can create any percentage I want using your trick. Let's say the iPhone 3GS drops x% of calls and the iPhone 4 drops (x+1)% of calls. So, it drops (x+1)/x * 100 = p% as many calls.

(x+1)/x*100 = p
(x+1)/x = p/100
1 + 1/x = p/100
1/x = p/100 - 1
x = 1/(p/100 - 1)

So, if I want p = 5,000,000 because I want to be more sensational, I just assume that the 3GS drops x = 1/(5000000/100 - 1) ≈ 0.00002% of calls, and there I get 1.00002/0.00002 * 100 ≈ 5,000,000% more dropped calls!!!
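For what it's worth, the algebra here checks out. A quick Python sketch of the trick this post describes (the fixed 1-percentage-point gap and the target p = 5,000,000 are the post's own assumptions):

```python
# Given a fixed absolute gap of 1 percentage point, find the baseline drop
# rate x (in %) that makes the ratio (x + 1) / x * 100 hit any target p.
def baseline_for_ratio(p):
    # Rearranged from (x + 1) / x * 100 = p  =>  x = 1 / (p / 100 - 1)
    return 1 / (p / 100 - 1)

x = baseline_for_ratio(5_000_000)   # a tiny baseline, ~0.00002% of calls
ratio = (x + 1) / x * 100           # recovers ~5,000,000, as claimed
print(x, round(ratio))
```

Since (x + 1)/x = 1 + 1/x, picking a small enough x makes the ratio arbitrarily large, which is exactly the point: the headline percentage is a function of the baseline you assume.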

(BTW, I hope you will not create another thread, "iPhone 4 may be dropping as many as 5,000,000% more calls than iPhone 3GS")

With that formula, I can get any percentage I want very easily. Can you see why your argument doesn't make sense?

It's because... 1% more dropped call is, you know, 1% more dropped call. There is no point to try to distort it into an exponential difference because phones don't drop more calls exponentially with each generation. It depends on the design of the phone.

If the iPhone 5 drops 0 calls out of 1,000,000,000 calls and the iPhone 6 drops 1 call out of 1,000,000,000 calls, are you gonna say the iPhone 6 drops infinity more calls? Or are you gonna say it drops 0.0000001% more call?

OK, that's it. lol.
 
Actually my interpretation would've spelled really bad news for Apple if the drop rate was higher (e.g. 33 out of 100 calls).
That's clearly not the reality.

It seems the formula is X / 100 for the 3GS vs. (X+Y) / 100, where Y < 1 (who cares whether Y is closer to 1 or 0.01)..
People wanting to cut through the Jobs BS. As the OP points out, if it were under 0.5, Jobs would have said so (unless he doesn't have the data).

at the end the practical difference isn't tangible, even if the percentages have the potential to be very alarming
Perhaps for you. But that is an entirely subjective conclusion. For some people who use their phones for important calls 1/100 more drops might be entirely unacceptable and a reason to stay with the 3GS or another phone altogether.
 
I'll post what I posted in another thread:

Let's say you dropped 1 call per 100 (1% dropped calls). If the number increased to 2 calls per 100 (2% dropped calls), then sure you could say that your dropped calls have increased 100%, but the numbers are so small that a percent increase of dropped calls is insignificant. No sane person is going to bicker about an extra dropped call for every 100 calls he/she makes, because they won't even notice. (And anyhow, it was stated to be LESS than 1 more per 100 dropped)

I used 1 call dropped per 100 to show the extreme case (because an increase of 1 would result in 100% more dropped calls). But using percentage increases on such small numbers is misleading, because they are very vulnerable to change.

And here's Jimmy Fallon's example of the iPhone 4's better performance in weak coverage areas:
http://www.engadget.com/2010/06/25/t...ng-less-calls/
 
I made this comparison earlier: if I have 1 penny in the bank, or 1 percent of one dollar, and then I add another penny to my savings, or TWO percent of one dollar, then I've increased my life savings by 100 percent. That sounds pretty significant, right?

But guess what: I'm still broke. The actual, real-world amount of money in the bank is too small to make a difference, and my overall net worth would still be minuscule.

The same is true of the dropped-calls situation. If we're measuring dropped calls, and the phone drops one call out of 100, then the ONLY possible increase is at least 100 percent. You can't drop half a call, or a quarter of a call, or one percent of a call.


This is a nice try at spreading FUD, though.

The above is basically the best explanation for anyone regarding this whole issue.

THE FACTS:

-AT&T is reporting ABOUT 1% of calls are dropped on their network, or in other words, 1 out of 100

-Jobs reported that the iP4 drops calls less than one additional time out of one hundred when compared to the 3GS on average

WHAT WE DON'T KNOW:

-the percentage of dropped calls on the 3GS on AT&T's network

I think with all of the above, including the quoted, it's pretty much an end to this thread except for personal issues over semantics between users.

If we assume the 3GS is on par with the rest of AT&T's reported dropped calls, IOW drops 1 call out of every 100 on average, then according to Jobs, the iP4 drops between 1.1 and 1.9 calls out of 100 on average. And as the quote says, you can't drop HALF a call, so in reality, since it drops more than one but not quite two calls on average, you can just say it drops 2 calls out of 100.

Sure, you can say it drops twice as many calls. Sure, you can say it drops 100% more calls. But considering it still only drops 2% of your calls, saying it the way Jobs did more fairly represents the reality of it all. That it's not something to worry about.

If I told you that your chances of winning the lottery had increased 100%, most people would jump for joy. But if your chances to begin with were 1 in 1,000,000, and now they're 2 in 1,000,000, or 1 in 500,000... well, that doesn't sound too exciting, now does it? But on the surface it sounds very exciting.
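The lottery arithmetic, spelled out (a throwaway Python check of the numbers above):

```python
# A "100% improvement" on tiny odds is still tiny odds.
before = 1 / 1_000_000               # 1 in a million
after = 2 / 1_000_000                # doubled, i.e. "100% more"

pct_increase = (after - before) / before * 100
print(pct_increase)                  # 100.0
print(round(1 / after))              # 500000 -> still only 1 in 500,000
```

Both headlines describe the same change; only the framing differs.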

Same idea can be applied here, except people would be dismissing the iP4 as a crappy phone that drops calls a lot rather than looking at it relatively.
 
lol, this is the lamest thread I've ever seen. MacRumors is getting very lame lately, but I didn't expect it would fall to the saddest point like this.

First of all, I'm not a native English speaker. So, please excuse my poor English. But I hope you get my point.

The OP's point is that if the iPhone 3GS has 1% dropped calls and the iPhone 4 has 2% dropped calls, it's gonna be "dropping as many as 100% more calls", right?

So, let's assume that the iPhone 3Gs has 0.0001% dropped calls, which means that the iPhone 4 has 1.0001% dropped calls according to Jobs. Does that mean the iPhone 4 drops as many as (1.0001-0.0001) / 0.0001 * 100 = 1000000% more calls? lol.

And.... if the iPhone 3GS has 0% dropped calls and the iPhone 4 has just 0.000000001% dropped calls, does this mean the iPhone 4 drops as many as INFINITY more calls than the 3GS? Because, you know, (0.000000001 - 0) / 0 * 100 = infinity! OMGZ, infinity dropped calls! lol.

In fact, I can create any percentage I want using your trick. Let's say the iPhone 3GS drops x% of calls and the iPhone 4 drops (x+1)% of calls. So, it drops (x+1)/x * 100 = p% as many calls.

(x+1)/x*100 = p
(x+1)/x = p/100
1 + 1/x = p/100
1/x = p/100 - 1
x = 1/(p/100 - 1)

So, if I want p = 5,000,000 because I want to be more sensational, I just assume that the 3GS drops x = 1/(5000000/100 - 1) ≈ 0.00002% of calls, and there I get 1.00002/0.00002 * 100 ≈ 5,000,000% more dropped calls!!!

(BTW, I hope you will not create another thread, "iPhone 4 may be dropping as many as 5,000,000% more calls than iPhone 3GS")

With that formula, I can get any percentage I want very easily. Can you see why your argument doesn't make sense?

It's because... 1% more dropped call is, you know, 1% more dropped call. There is no point to try to distort it into an exponential difference because phones don't drop more calls exponentially with each generation. It depends on the design of the phone.

If the iPhone 5 drops 0 calls out of 1,000,000,000 calls and the iPhone 6 drops 1 call out of 1,000,000,000 calls, are you gonna say the iPhone 6 drops infinity more calls? Or are you gonna say it drops 0.0000001% more call?

OK, that's it. lol.

You don't get it. From an engineering standpoint, if the previous generation of phones drop 0 calls out of 1,000,000,000, but your new generation drops 1 out of 1,000,000,000, then you have done something really really bad. Think about it for a moment: you've introduced something bad that was never even there before. These numbers are big for a reason: they truly are very, very bad.
 
When Jobs said the iPhone 4 drops less than 1 more call per 100 than the 3GS, I assume this meant when the antenna isn't blocked (i.e. death grip), right? If so then I imagine the amount of calls dropped per 100 could increase.

You completely miss the point. These statistics are for ALL iP4 calls. It's the statistic for ALL calls that shows it is not a widespread problem. If I sell you a car that doesn't get good AM reception in a tunnel, as with most cars, and I ask all car owners how often they lose their AM reception, I may get a small number or percentage. However, if I ask only car owners who drive through a tunnel on a daily basis, then I will get a very large number and percentage. You already know some people are affected; that's not disputed. We don't care that 100% of the affected people are affected. We know this! What we didn't know until Friday was what percentage, or how many total users OUT OF THE WHOLE, are affected by the issue. THAT is what the statistics show.
 
lol, this is the lamest thread I've ever seen. MacRumors is getting very lame lately, but I didn't expect it would fall to the saddest point like this.

First of all, I'm not a native English speaker. So, please excuse my poor English. But I hope you get my point.

The OP's point is that if the iPhone 3GS has 1% dropped calls and the iPhone 4 has 2% dropped calls, it's gonna be "dropping as many as 100% more calls", right?

So, let's assume that the iPhone 3Gs has 0.0001% dropped calls, which means that the iPhone 4 has 1.0001% dropped calls according to Jobs. Does that mean the iPhone 4 drops as many as (1.0001-0.0001) / 0.0001 * 100 = 1000000% more calls? lol.

And.... if the iPhone 3GS has 0% dropped calls and the iPhone 4 has just 0.000000001% dropped calls, does this mean the iPhone 4 drops as many as INFINITY more calls than the 3GS? Because, you know, (0.000000001 - 0) / 0 * 100 = infinity! OMGZ, infinity dropped calls! lol.

In fact, I can create any percentage I want using your trick. Let's say the iPhone 3GS drops x% of calls and the iPhone 4 drops (x+1)% of calls. So, it drops (x+1)/x * 100 = p% as many calls.

(x+1)/x*100 = p
(x+1)/x = p/100
1 + 1/x = p/100
1/x = p/100 - 1
x = 1/(p/100 - 1)

So, if I want p = 5,000,000 because I want to be more sensational, I just assume that the 3GS drops x = 1/(5000000/100 - 1) ≈ 0.00002% of calls, and there I get 1.00002/0.00002 * 100 ≈ 5,000,000% more dropped calls!!!

(BTW, I hope you will not create another thread, "iPhone 4 may be dropping as many as 5,000,000% more calls than iPhone 3GS")

With that formula, I can get any percentage I want very easily. Can you see why your argument doesn't make sense?

It's because... 1% more dropped call is, you know, 1% more dropped call. There is no point to try to distort it into an exponential difference because phones don't drop more calls exponentially with each generation. It depends on the design of the phone.

If the iPhone 5 drops 0 calls out of 1,000,000,000 calls and the iPhone 6 drops 1 call out of 1,000,000,000 calls, are you gonna say the iPhone 6 drops infinity more calls? Or are you gonna say it drops 0.0000001% more call?

OK, that's it. lol.


Read the thread. I already explained that the 4 can be up to infinitely worse than the 3GS. This does not mean it will drop infinite calls. It would mean it drops up to 1% of its calls.

On the other hand, the 4 could be almost equal to the 3GS and drop 100% of its calls.
 
You don't get it. From an engineering standpoint, if the previous generation of phones drop 0 calls out of 1,000,000,000, but your new generation drops 1 out of 1,000,000,000, then you have done something really really bad. Think about it for a moment: you've introduced something bad that was never even there before. These numbers are big for a reason: they truly are very, very bad.

Not really. Your statistics refer to a number that is one billionth. That is not an engineering failure; that is a rounding error. That is a butterfly beating its wings in Shanghai causing a hurricane in Panama. One billionth of anything is called "nano." From an engineering standpoint, you are probably not rounding your statistics to 9 decimal places.

Get a grip. :D
 
I'll post what I posted in another thread:

Let's say you dropped 1 call per 100 (1% dropped calls). If the number increased to 2 calls per 100 (2% dropped calls), then sure you could say that your dropped calls have increased 100%, but the numbers are so small that a percent increase of dropped calls is insignificant. No sane person is going to bicker about an extra dropped call for every 100 calls he/she makes, because they won't even notice. (And anyhow, it was stated to be LESS than 1 more per 100 dropped)

I used 1 call dropped per 100 to show the extreme case (because an increase of 1 would result in 100% more dropped calls). But using percentage increases on such small numbers is misleading, because they are very vulnerable to change.

And here's Jimmy Fallon's example of the iPhone 4's better performance in weak coverage areas:
http://www.engadget.com/2010/06/25/t...ng-less-calls/

Look, it's really simple statistics. The average dropped-call rate for all phones on AT&T is around 1%. Since the minimum is 0%, the standard deviation is pretty small even if the distribution is right-skewed. That means the iPhone 4, at an average 2% drop rate, is statistically significantly worse than the overall sample, which means something different is wrong with it.

That's all I'm trying to show. It doesn't matter whether you will notice the difference or not. All that matters is that from an engineering standpoint, the numbers are significant enough to prove that something else is going on with the iPhone 4, which throws Steve's claim that what's afflicting the iPhone 4 is the same thing that afflicts other smartphones out the window.

Now, there are confounding variables, such as the locations of early iPhone 4 adopters that someone in this thread mentioned, Jobs' pet theory about case usage for the iPhone 4, and the (ridiculous, in my opinion) theory that more dropped calls are caused by people on YouTube purposely making them happen. I think the difference is large enough that nothing short of a design flaw can explain it, but this is just my interpretation and other people may disagree. I can try to show why the third theory above is ridiculous. There have been 3 million iPhones sold. Let's say everyone who bought one has made 10 calls on it, a pretty conservative estimate; that's 30 million calls. To bump the number of dropped calls by an extra 1 in 100, that's 300,000 extra dropped calls. There's no way people on YouTube could make that many extra dropped calls happen on purpose.
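That back-of-the-envelope figure is easy to verify (all inputs here are the poster's guesses, not real sales or usage data):

```python
# Rough check on the "people are dropping calls on purpose" theory.
phones_sold = 3_000_000        # assumed iPhone 4 units sold
calls_per_phone = 10           # assumed (conservative) calls per owner

total_calls = phones_sold * calls_per_phone   # 30,000,000 calls
extra_dropped = total_calls // 100            # +1 dropped per 100 -> 300,000

print(total_calls, extra_dropped)
```

Even under these deliberately modest assumptions, the "deliberate death-grip" theory would need hundreds of thousands of intentional drops to move the statistic.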
 
You don't get it. From an engineering standpoint, if the previous generation of phones drop 0 calls out of 1,000,000,000, but your new generation drops 1 out of 1,000,000,000, then you have done something really really bad. Think about it for a moment: you've introduced something bad that was never even there before. These numbers are big for a reason: they truly are very, very bad.

Yeah, very bad indeed. So you agree that it drops infinity more calls then? lol
 
Not really. Your statistics refer to a number that is one billionth. That is not an engineering failure; that is a rounding error. That is a butterfly beating its wings in Shanghai causing a hurricane in Panama. One billionth of anything is called "nano." From an engineering standpoint, you are probably not rounding your statistics to 9 decimal places.

Get a grip. :D

That's not how math works. You can't say something is a rounding error just because it is small; you have to find out how many significant digits you actually have based on the precision of your measurements, etc. If you can say with confidence that your dropped calls were 0 in 1 billion previously, and with confidence that they are 1 in 1 billion now, then that increase is significant, and not a rounding error. In any case, that was just a hypothetical. The numbers we are actually working with are in the 0-2% range. With a sample size of millions if not billions of calls, these numbers are pretty significant, and cannot be attributed to rounding.
 