Are you kidding? You're defending the Jobs presentation? Okay, you may not be defending the presentation, but you've obviously seen it. That presentation was almost entirely about how all smartphones, including the iPhone 3GS, suffer attenuation. After the OP's latest post stating the same thing, you asked for a citation, despite the fact that this is now well-established common knowledge.

Please don't put words in my mouth. I didn't defend "the Jobs presentation." I didn't even say the data was correct. The OP, however, is using that data to create a distorted message. If anything, it's the OP who's using that data as a baseline and distorting it to his own advantage. If you have an issue with the accuracy of the data, take it up with him, not me.
Another interesting point is that Steve Jobs gave an excuse for why he couldn't just tell us how many calls the iPhone 4 drops. He said he could only give us the difference between the two phones, and not the absolute numbers, because AT&T doesn't allow Apple to disclose statistics on dropped calls.

Well, that turns out to be a blatant lie: as you can see, AT&T freely discloses the number of dropped calls on its network in its financial statements.
This is exactly the ambiguity that Jobs was gunning for to obfuscate the truth of the matter. I'm more inclined to believe the OP's reading that it is <1/100 calls more (and far closer to 1 than below 0.5/100) than the 3GS.

One additional point: we are talking about an absolute increase in dropped calls expressed in percentages, NOT THE INCREASE IN DROP RATE. In other words, if the 3GS dropped 1 out of 100 calls, and the iPhone 4 dropped 1% more, then the iPhone 4 dropped 1.01 out of 100 calls.
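To make the two readings concrete, here's a minimal sketch (the 1-in-100 baseline is just an assumed figure for illustration, not anything Apple published):

```python
# Two ways to read "the iPhone 4 drops less than 1 more call per 100 than the 3GS"
baseline_rate = 1 / 100                      # assumed 3GS drop rate: 1 call per 100 (hypothetical)

# Reading 1: one additional percentage POINT (absolute increase)
absolute_reading = baseline_rate + 1 / 100   # 0.02 -> 2 dropped calls per 100

# Reading 2: a 1% RELATIVE increase on the baseline rate
relative_reading = baseline_rate * 1.01      # 0.0101 -> 1.01 dropped calls per 100

print(absolute_reading, relative_reading)
```

Which of those two readings Jobs intended is exactly what's being argued here.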
This is exactly the ambiguity that Jobs was gunning for to obfuscate the truth of the matter. I'm more inclined to believe the OP's reading that it is <1/100 calls more (and far closer to 1 than below 0.5/100) than the 3GS.
If it were only 1% more calls than the 3GS you can bet your bottom dollar this is what he would have presented. The slide would read 1%.
That's clearly not the reality.

Actually, my interpretation would've spelled really bad news for Apple if the drop rate was higher (e.g. 33 out of 100 calls).
People wanting to cut through the Jobs BS. As the OP points out, if it were under 0.5, Jobs would have said this (unless he doesn't have the data).

It seems the formula is X / 100 for the 3GS vs. (X + Y) / 100, where Y < 1 (who cares whether Y is closer to 1 or 0.01).
At the end of the day the practical difference isn't tangible, even if the percentages have the potential to be very alarming.

Perhaps for you. But that is an entirely subjective conclusion. For some people who use their phones for important calls, 1/100 more drops might be entirely unacceptable and a reason to stay with the 3GS or another phone altogether.
I made this comparison earlier: if I have 1 penny in the bank, or 1 percent of one dollar, and then I add another penny to my savings, or TWO percent of one dollar, then I've increased my life savings by 100 percent. That sounds pretty significant, right?
But guess what: I'm still broke. The real-world amount of money actually in the bank is too small to make a difference, and my overall net worth would still be minuscule.
The same is true with the dropped calls situation. If we're measuring dropped calls and the phone drops one call out of 100, then the smallest possible increase is one whole call, which is a 100 percent jump. You can't drop half a call, a quarter of a call, or one percent of a call.
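A tiny arithmetic sketch of that point (the bank balance and the 1-in-100 drop rates are hypothetical numbers, not measured ones):

```python
# Penny analogy: a huge relative change can still be a tiny absolute change
savings_before = 0.01   # one penny
savings_after = 0.02    # two pennies
print((savings_after - savings_before) / savings_before * 100)   # ~100 (% increase)
print(savings_after - savings_before)                            # ~0.01 -> one cent gained

# Same arithmetic applied to hypothetical drop rates
drops_before = 1 / 100  # assumed: 1 dropped call per 100
drops_after = 2 / 100   # assumed: 2 dropped calls per 100
print((drops_after - drops_before) / drops_before * 100)  # ~100 (% increase)
print(drops_after - drops_before)                         # ~0.01 -> 1 extra call per 100
```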
This is a nice try at spreading FUD, though.
lol, this is the lamest thread I've ever seen. MacRumors has been getting very lame lately, but I didn't expect it to sink to a point this sad.
First of all, I'm not a native English speaker, so please excuse my poor English. But I hope you get my point.
The OP's point is that if the iPhone 3GS has 1% dropped calls and the iPhone 4 has 2% dropped calls, it's gonna be "dropping as many as 100% more calls", right?
So, let's assume that the iPhone 3GS has 0.0001% dropped calls, which means that the iPhone 4 has 1.0001% dropped calls according to Jobs. Does that mean the iPhone 4 drops as many as (1.0001 - 0.0001) / 0.0001 * 100 = 1,000,000% more calls? lol.
And... if the iPhone 3GS has 0% dropped calls and the iPhone 4 has just 0.0000001% dropped calls, does this mean the iPhone 4 drops INFINITELY more calls than the 3GS? Because, you know, (0.0000001 - 0) / 0 * 100 = infinity! OMGZ, infinity dropped calls! lol.
In fact, I can even create any percentage I want using your trick. Let's say the iPhone 3GS drops x% of calls and the iPhone 4 drops (x+1)% of calls. Then the iPhone 4 drops (x+1)/x * 100 = p% as many calls as the 3GS (which, for tiny x, is essentially "p% more").
(x+1)/x*100 = p
(x+1)/x = p/100
1 + 1/x = p/100
1/x = p/100 - 1
x = 1/(p/100 - 1)
So, if I want p = 5,000,000 because I want to be more sensational, I just assume that the 3GS drops x = 1/(5,000,000/100 - 1) ~ 0.00002% of calls, and then I get 1.00002/0.00002 * 100 ~ 5,000,000% more dropped calls!!!
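A quick numeric check of that trick (a throwaway sketch; p and the resulting drop rate x are made-up values, not real statistics):

```python
# Pick any sensational percentage p and back out the baseline rate x that produces it
p = 5_000_000                 # target percentage (made up)
x = 1 / (p / 100 - 1)         # baseline 3GS drop rate in %, from x = 1/(p/100 - 1)
print(x)                      # ~0.00002 (% of calls dropped)

# Check: the iPhone 4 at (x + 1)% then comes out at roughly p% of the 3GS rate
print((x + 1) / x * 100)      # ~5,000,000
```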
(BTW, I hope you will not create another thread, "iPhone 4 may be dropping as many as 5,000,000% more calls than iPhone 3GS")
With that formula, I can get any percentage I want very easily. Can you see why your argument doesn't make sense?
It's because... one percentage point more in dropped calls is, you know, one percentage point more in dropped calls. There is no point in trying to distort it into an exponential-looking difference, because phones don't drop more calls exponentially with each generation; it depends on the design of the phone.
If the iPhone 5 drops 0 calls out of 1,000,000,000 and the iPhone 6 drops 1 call out of 1,000,000,000, are you gonna say the iPhone 6 drops infinitely more calls? Or are you gonna say it drops just 0.0000001% more calls?
OK, that's it. lol.
When Jobs said the iPhone 4 drops less than 1 more call per 100 than the 3GS, I assume this meant when the antenna isn't blocked (i.e., no "death grip"), right? If so, then I imagine the number of calls dropped per 100 could increase when it is.
You don't get it. From an engineering standpoint, if the previous generation of phones drops 0 calls out of 1,000,000,000 but your new generation drops 1 out of 1,000,000,000, then you have done something really, really bad. Think about it for a moment: you've introduced something bad that was never even there before. These numbers are big for a reason: they truly are very, very bad.
I'll post what I posted in another thread:
Let's say you dropped 1 call per 100 (1% dropped calls). If that increased to 2 calls per 100 (2% dropped calls), then sure, you could say your dropped calls have increased 100%, but the numbers are so small that the percentage increase is insignificant. No sane person is going to bicker about one extra dropped call for every 100 calls they make, because they won't even notice. (And anyhow, it was stated to be LESS than 1 more per 100 dropped.)
I used 1 call dropped per 100 to show the extreme case (because an increase of 1 would result in 100% more dropped calls). But using percentage increases on such small numbers is misleading, because they are very sensitive to tiny changes.
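To put "less than 1 more dropped call per 100" in everyday terms, here's a rough sketch (the monthly call volume is an assumption, not a figure from the keynote):

```python
# Rough feel for "less than 1 more dropped call per 100"
extra_drop_rate = 1 / 100      # upper bound stated in the keynote: <1 extra drop per 100 calls
calls_per_month = 150          # assumed personal call volume (made up)

expected_extra_drops = extra_drop_rate * calls_per_month
print(expected_extra_drops)    # 1.5 -> at most one or two extra dropped calls a month
```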
And here's Jimmy Fallon's example of the iPhone 4's better performance in weak coverage areas:
http://www.engadget.com/2010/06/25/t...ng-less-calls/
You don't get it. From an engineering standpoint, if the previous generation of phones drops 0 calls out of 1,000,000,000 but your new generation drops 1 out of 1,000,000,000, then you have done something really, really bad. Think about it for a moment: you've introduced something bad that was never even there before. These numbers are big for a reason: they truly are very, very bad.
Not really. Your statistics refer to a number that is one one-billionth. That is not an engineering failure; that is a rounding error. That is a butterfly beating its wings in Shanghai causing a hurricane in Panama. One billionth of anything is called "nano." From an engineering standpoint, you are probably not rounding your statistics to 9 decimal places.
Get a grip.