I would say the iPhone synchronizes with one of Apple's servers. I know the Mac does. The Wi-Fi iPad and the iPod touch can't rely on a carrier signal, so they must sync with the same server. Thus it makes sense to say the iPhone synchronizes with that server, too.

The iPhone's clock, like that of pretty much every other phone on the market, gets updated from the carrier's signal.
According to astrophysicist Neil deGrasse Tyson, the iPhone's time is more accurate than Android's. But that was back in March, and it could have been fixed in the Android software since then.
"The iPhone has the correct time unlike the Android-based phones...The bulk of the Androids get their time from GPS satellites. The timekeeping system for GPS satellites was defined up to 1982. And since 1982, 15 leap-seconds have been added to civil time. And those leap-seconds are not included in the Android timekeeping because they're getting their time directly from GPS, whereas the iPhone compensates for this, puts those 15 seconds back in, and has therefore the correct time. As a result, most Android phones are exactly 15 seconds too fast."
The problem with his argument is that GPS time, while it does not account for leap seconds implicitly, DOES broadcast a field giving the current difference between GPS time and UTC. Any GPS-calibrated clock can then account for that difference and be correct to UTC time, and pretty much all of them do, including, I'm sure, Android phones.
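To make the correction concrete, here is a minimal sketch of what a receiver does with that broadcast field. The navigation message carries the accumulated GPS-UTC leap-second offset; the receiver simply subtracts it. The function name, the sample values, and hard-coding the offset at 15 (the value in effect when this thread was written) are my assumptions for illustration; a real receiver reads the offset from the signal.

```python
# Sketch: turning GPS time into UTC with the broadcast leap-second offset.
# The value 15 was the GPS-UTC offset when this thread was written; it
# changes every time a leap second is added, and a real receiver takes
# it from the navigation message rather than hard-coding it.

GPS_UTC_LEAP_SECONDS = 15  # assumption: offset in effect at the time

def gps_seconds_to_utc_seconds(gps_seconds: float,
                               leap_offset: int = GPS_UTC_LEAP_SECONDS) -> float:
    """GPS time runs ahead of UTC by the accumulated leap seconds,
    so UTC = GPS time - offset."""
    return gps_seconds - leap_offset

# A GPS-disciplined clock reading 1000000015.0 s corresponds to
# 1000000000.0 s UTC while the offset is 15 s.
print(gps_seconds_to_utc_seconds(1000000015.0))  # → 1000000000.0
```

This is exactly why "Androids get their time directly from GPS" does not force a 15-second error: the correction is one subtraction, using data the satellites themselves provide.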
Actually, most cell towers (including all CDMA towers) are GPS-synchronized, because that's cheaper, more reliable, and easier than maintaining a direct connection to NIST.

That's because most cell phones get their time from their cell tower, which in turn gets its time from the continental atomic clock.
Which in turn get their time from an atomic clock, or from their (the satellites') own internal atomic clocks.
Starting with iOS 4, Apple has included an NTP daemon to update the device's time via an Internet time server, likely time.apple.com or a local Apple time server.

So let me get this straight: the iPhone synchronizes with a Stratum server, which figures out the time by looking at the timestamp from a signal sent by a GPS satellite. It then has to account for the time it took for the signal to reach the computer.
Either that, or it calculates the time based on data provided by the US Naval Observatory, which makes its data easily accessible to the public.
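The "account for the time it took for the signal to reach the computer" step is exactly what NTP's standard offset/delay arithmetic does. Here is a sketch with made-up timestamps (in milliseconds); the assumption that network delay is roughly symmetric in both directions is NTP's own, but the function name and the sample numbers are mine.

```python
# Sketch of how NTP accounts for network travel time.
# t1: client send time, t2: server receive time,
# t3: server send time, t4: client receive time.
# All values here are in milliseconds for readability.

def ntp_offset_and_delay(t1, t2, t3, t4):
    offset = ((t2 - t1) + (t3 - t4)) / 2  # how far the client clock is off
    delay = (t4 - t1) - (t3 - t2)         # round trip, minus server processing
    return offset, delay

# Hypothetical exchange: the client clock is 500 ms slow and the
# one-way network delay is 100 ms in each direction.
t1 = 0     # client sends (client clock)
t2 = 600   # server receives (true time 100; client is 500 ms slow)
t3 = 600   # server replies immediately
t4 = 200   # client receives (client clock; true time 700)
offset, delay = ntp_offset_and_delay(t1, t2, t3, t4)
print(offset, delay)  # → 500.0 200
```

Note that the client never needs to know the one-way delay directly; averaging the two directions cancels it out of the offset, which is why NTP can discipline a clock to within milliseconds over ordinary Internet links.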
I'm sorry, but what you say is very inaccurate and confusing.

Atomic clocks are currently the most accurate clocks that can be cheaply produced. While still expensive, the next most accurate type of clock costs well over five times as much. GPS time can be off by up to 15-30 seconds due to latency between the satellites and the receiving device.
I'm talking about the big machines that have bits of caesium, rubidium, or some other atomic element at their core, not those radio clocks. When I say they can be produced cheaply, I'm referencing the cost of producing single-ion clocks, which are extremely expensive to build and maintain.
1. If you mean "radio-controlled clocks" instead of "atomic clocks" (as real atomic clocks are quite expensive), then yes, they can be cheaply produced and are fairly accurate, but they usually pick up the WWVB signal only once per night, if they are lucky enough to catch it. Otherwise they are just regular quartz watches, and they can lose or gain time after they pick up the signal. Some models claim to calibrate the quartz oscillator over time, based on receiving the time signal frequently. Cheap radio-controlled clocks and watches don't account for the distance to the antenna, so there is some loss of precision when you are far from the transmitter in Colorado.
2. A regular PC synchronizing via NTP can be fairly accurate, on the order of milliseconds.
3. GPS time is not off by 15 seconds or more. There is an offset between GPS time and UTC; signal delay from satellite to receiver is subsecond. That's probably what you mean. That offset is sent with the signal and is taken into account when calculating UTC or any other timezone's time. GPS is highly accurate, without doubt much better for high-precision timing than the WWVB signal, and the European Galileo system promises even more accurate timing, as far as I know. The reason GPS is so convenient is this: a receiver can calculate its position on Earth fairly accurately, and once the position is determined, one can very accurately calculate the delay of the signal from the satellite to the receiver, improving that calculation when multiple satellites are in view. Accuracy is easily in the microseconds. GPS is probably the cheapest, most efficient way of constructing a Stratum 1 NTP server.
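The scale of the delay in point 3 is easy to check with back-of-the-envelope arithmetic: once the receiver knows its position, the propagation delay is just the range to the satellite divided by the speed of light. The 20,200 km figure below is the approximate altitude of a GPS orbit (satellite directly overhead, an assumption for illustration); the true range depends on where the satellite sits in the sky.

```python
# Back-of-the-envelope check: GPS signal delay is (range to satellite) / c,
# which is tens of milliseconds, far below the "15 seconds" being claimed.

SPEED_OF_LIGHT_M_S = 299_792_458.0

def propagation_delay_s(range_m: float) -> float:
    """Seconds for a radio signal to travel range_m metres."""
    return range_m / SPEED_OF_LIGHT_M_S

# Assumption: satellite directly overhead at GPS orbital altitude.
overhead_range_m = 20_200e3
print(round(propagation_delay_s(overhead_range_m) * 1000, 1))  # → 67.4 (ms)
```

So even the raw, uncorrected travel time is under a tenth of a second, and receivers subtract it out once they know their position.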
Furthermore, the iPhone currently uses NTP to synchronize its time. However, I doubt it is configured to check frequently; perhaps once per day, or once at boot time. Not sure. There are some nifty apps that can give you NTP time, and some of them compare it with the internal clock to show you the offset. One of them is called Time (by Emerald Sequoia); another is called Atomic Clock and imitates a Gorgy clock. The offset is usually under 1 second. This is in any case much better than in the past, when the iPhone was synchronizing to the AT&T cellular network (sometimes with a 4 or 5 second offset). CDMA networks also have very accurate timing; however, I don't think the iPhone derives its time from CDMA on the Verizon network.
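Point 1 above mentioned that a radio-controlled clock is just a quartz watch between nightly syncs. The drift that accumulates in between is easy to estimate: a crystal off by some parts-per-million error gains or loses that fraction of every elapsed second. The 20 ppm figure below is a typical tolerance I'm assuming for a cheap uncompensated crystal, not a spec for any particular watch.

```python
# Rough arithmetic for quartz drift between nightly WWVB syncs.
# 20 ppm is an assumed tolerance for a cheap uncompensated crystal.

def worst_case_drift_seconds(ppm: float, elapsed_seconds: float) -> float:
    """A crystal off by `ppm` parts per million gains or loses
    ppm * 1e-6 seconds for every elapsed second."""
    return ppm * 1e-6 * elapsed_seconds

SECONDS_PER_DAY = 24 * 60 * 60  # 86400

# Worst-case drift over one day between syncs at 20 ppm:
print(round(worst_case_drift_seconds(20, SECONDS_PER_DAY), 3))  # → 1.728
```

Under two seconds per day is why these clocks look "atomic" in practice: one successful WWVB sync per night is enough to keep the accumulated error invisible on an analog face.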
Do you have any proof of what you are saying? Except for what you said about single-ion clocks (I don't think any are production-ready) being more expensive than a cesium or rubidium atomic clock, I don't agree with anything else you said. GPS satellites have atomic clocks inside (cesium and/or rubidium). How can they be off by up to 15 seconds? I've never heard of such a thing. You might mean 15 nanoseconds; that I might believe. GPS satellites get corrections from control centers, etc., and yes, there can be some minor discrepancies, but as I said, on the order of nanoseconds, not seconds.
Depending on atmospheric conditions, solar anomalies, and the accuracy of the satellites' internal clocks, GPS time can be off by 15 seconds or more even when taking the offset into consideration. When the source clock's time is too far off, the offset doesn't correct it enough. In some cases, a single GPS satellite's clock can fluctuate as much as 5 seconds from what it should be reading. That is why no single satellite is used for time; instead, the result is averaged out among the various times received. Accuracy once averaged and localized can be very tight, but it can still be off if the conditions are right.
The iPhone's NTP daemon is set to run every hour. The process is extremely lightweight, which is probably why Apple has it run that often.