Why? Hardly surprising! Explain? Backdoors would also exist in Android and Windows Phone. Any of the US-based smartphone OSs would have "US intelligence compatible" backdoors. Similarly, other countries would have their own versions. Don't be naive!

I guess I was naive. I took that whole 'Apple are at the forefront of privacy' thing hook, line, and sinker. I feel a bit stupid. It's just disappointing is all.
 
I don't think it's Apple's fault either. When the intelligence service comes knocking, there's not a lot of room to move. Further, all these companies need business from government agencies. All too messy.
 
Knew it. If anybody really thinks Apple actually cares about user privacy, they're a fool.
 
"OS vulnerable to snooping by forensic tools. "

How is this not good ?

This is basically saying that when a stolen phone is found or other criminal activity, the courts and experts have no way to get the data off.

That's like, saying hard drive recovery companies know they recover your hard drive, but they just choose not to.

The thing is not getting rid of it, its a matter of tightening it up, and Apple has always done that..

But they obviously can't convince the real experts here.

Obviously an article that clearly has no security knowledge at all. If the NSA wanted to they can by-pass anything, even the most robust tools can.....

It's not about turning to the public and calling it a "backdoor" because THEY can do it, its about making it secure so most people can't..

The article says "services clearly shouldn't be in IOS" however it failed to name any...
 
I am guessing a big part of these 'back doors' were put there to support the anything-anywhere-anytime nature of iCloud and the various iWork apps that support it... everything is viewable and editable anywhere now, and the Handoff feature is just another extension of that. It should be no surprise that the same features that offer the user great convenience also introduce the potential for abuse in the wrong hands.

I have no doubt all tech companies are complicit in this monitoring. Google doesn't even need your device to get deep into everything about you, and eventually Apple will catch up. Since a good percentage of Apple's revenue comes from hardware sales, controlling your information is not as important to them as it is to Google, but in the end, the more they know about you, the better they can provide smarter services that add convenience.

There really is no avoiding the "utopia" we are all collectively driving ourselves into. You lust after these amazingly smart devices that know and do everything for you; then you get what you were looking for, and it should be no surprise that the tap to your data is only a court order away.
 
Just a personal question out of curiosity. Whenever some malware, bug, or something-not-so-pretty (e.g. Schmidt & The Bilderbergers) is discovered about Android or Google, do you go onto Android-themed forums and post "<if article was about Apple/iOS> thread burns"?

No, because it's a known quantity in the Android world, and people try to solve problems. People who use Android (on forums) generally understand the pros and cons of using the OS. People on this forum (generally) look for any excuse to hammer Google (ads, privacy, etc.) while hiding behind the thought that Apple would never do what Google is being accused of.
 
Sometimes you have to examine the details of the statement. They specifically said they never worked with the NSA. They did not say they never worked with any security agency in the US or globally.
Well, in their new statement they did:

"As we have said before, Apple has never worked with any government agency from any country to create a backdoor in any of our products or services. "

I sure hope this doesn't turn out to be a bald-faced lie. The rest of the statement is, unfortunately, not very convincing:

"We have designed iOS so that its diagnostic functions do not compromise user privacy and security, but still provides needed information to enterprise IT departments, developers, and Apple for troubleshooting technical issues."

Why would "IT departments" or Apple need access to a packet sniffer on the device? It might be useful to developers for debugging purposes, but then the capture service should only be installed on the debug image, not on all devices. Similarly, much of the personal data accessible through the file_relay service is not useful to "IT departments" or Apple.
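To make the exposure concrete, here is a minimal sketch using the open-source libimobiledevice C library, which speaks the same lockdownd protocol; the file name and client label are made up for illustration. Any host holding a valid pairing record can request com.apple.pcapd this way, and no prompt appears on the phone:

/* pcapd_probe.c -- sketch: ask lockdownd to start the packet capture
 * service from an already-paired host. Build with:
 *   cc pcapd_probe.c $(pkg-config --cflags --libs libimobiledevice-1.0)
 */
#include <stdio.h>
#include <libimobiledevice/libimobiledevice.h>
#include <libimobiledevice/lockdown.h>

int main(void)
{
    idevice_t device = NULL;
    lockdownd_client_t client = NULL;
    lockdownd_service_descriptor_t service = NULL;

    /* Grab the first connected device (UDID NULL = any). */
    if (idevice_new(&device, NULL) != IDEVICE_E_SUCCESS) {
        fprintf(stderr, "no device found\n");
        return 1;
    }

    /* The handshake only succeeds if this host has a pairing record. */
    if (lockdownd_client_new_with_handshake(device, &client, "pcapd-probe")
            != LOCKDOWN_E_SUCCESS) {
        fprintf(stderr, "not paired with this device\n");
        idevice_free(device);
        return 1;
    }

    /* Ask lockdownd to start the packet sniffer; the user sees nothing. */
    if (lockdownd_start_service(client, "com.apple.pcapd", &service)
            == LOCKDOWN_E_SUCCESS) {
        printf("pcapd is up on device port %d\n", service->port);
        lockdownd_service_descriptor_free(service);
    } else {
        printf("pcapd request refused\n");
    }

    lockdownd_client_free(client);
    idevice_free(device);
    return 0;
}

From there, the raw traffic can be streamed from the returned port for as long as the connection lasts.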

"A user must have unlocked their device and agreed to trust another computer before that computer is able to access this limited diagnostic data."

That is technically true, but rather easy to bypass. There are some easy things Apple could do to tighten this up:

- Allow the user to manage and remove pairings from the device. It only takes a few seconds for an attacker to create a pairing, and there is currently no way for the user to even detect this (a sketch for auditing the host side follows below)
- Add an option to allow the user to block any further pairings
- Disable access to the usbmux service over Wi-Fi if the user has disabled Wi-Fi syncing in iTunes

And, of course, as suggested by Zdziarski, respect the user's encrypted backup setting in all services, and remove unnecessary services from the device.
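Until something like that ships, users can at least audit the host side of existing pairings. A minimal sketch, assuming the commonly documented OS X location /var/db/lockdown (Windows keeps the records elsewhere, and reading the directory may require root):

/* list_pairings.c -- sketch: enumerate lockdownd pairing records on a Mac.
 * Each <UDID>.plist holds the keys that make a computer "trusted".
 */
#include <stdio.h>
#include <string.h>
#include <dirent.h>

int main(void)
{
    const char *dir_path = "/var/db/lockdown";
    DIR *dir = opendir(dir_path);
    if (!dir) {
        perror("opendir");   /* likely needs root, or the path differs */
        return 1;
    }
    struct dirent *entry;
    while ((entry = readdir(dir)) != NULL) {
        /* Pairing records are named after the device UDID. */
        if (strstr(entry->d_name, ".plist"))
            printf("pairing record: %s/%s\n", dir_path, entry->d_name);
    }
    closedir(dir);
    return 0;
}

Deleting a record removes that machine's copy of the keys, though any copies an attacker has already made remain valid.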
 
  • Like
Reactions: powers74
But if you look at the full stack, they can't do it without your passcode/Touch ID.

Ah, the excruciating pain of breaking a 4-digit passcode... no way any agency could brute-force its way past that layer of security. /s

But let's be serious... are you suggesting that this backdoor (or any backdoor) is not a serious security risk for any user? :eek:
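To put numbers on the sarcasm: a 4-digit passcode has only 10^4 = 10,000 possibilities. A back-of-the-envelope sketch; the guess rate is purely illustrative, not a measured iPhone figure:

/* pin_math.c -- worst-case time to exhaust a 4-digit PIN keyspace. */
#include <stdio.h>

int main(void)
{
    int keyspace = 10 * 10 * 10 * 10;   /* 10^4 = 10,000 PINs */
    double guesses_per_second = 10.0;   /* hypothetical brute-force rate */
    double worst_case = keyspace / guesses_per_second;

    printf("keyspace: %d PINs\n", keyspace);
    printf("worst case at %.0f guesses/s: about %.0f minutes\n",
           guesses_per_second, worst_case / 60.0);
    return 0;
}

Even at a modest ten guesses per second, the whole keyspace falls in under twenty minutes.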
 
I don't understand why people get so worked up about this sort of thing.

Those backdoors are there for your protection. They are put there for the exclusive use of the governments we democratically elected, i.e. the good guys.

We should all stop being so suspicious, and learn to fully trust the NSA and GCHQ. These guys are serious, trained professionals - not spotty nerds who are out to steal credit card numbers or pictures of your girlfriend!

As long as these backdoors are secure (and surely they are!), then we have nothing to fear.

I don't NEED their protection.
 
There is a lot of misinformation about this one going around (not really a big surprise there). But here are the basics:

- there are a number of services that are part of a normal iOS installation that could be used for forensic purposes.

- they are either turned off by default or accessible via USB only.

If you read the paper, you'll see that Apple provides the tools to lock down iOS devices completely. You can set them up so they cannot pair with any other computers. This prevents pretty much everything that can be exploited with these back doors.

Nobody can just turn on these services remotely on your iPad or iPhone. They need physical access to your device. He mentioned a few possible methods of getting access to the device in the talk, such as a USB-like dongle that law enforcement connects to a target device during a traffic stop or some other interaction.

Once these services have been turned on, it's game over though. Which is why pretty much every security researcher will tell you that unless you have physical security, you don't have any security.

As for the packet sniffer in particular - pretty sure just about every Linux distro comes with one as a part of its repository. Maybe not installed by default, but certainly installable with a command or two. The thing is, packet sniffers are legitimate IT tools. And if someone is having issues on the corporate wifi, having the ability to turn on pcap on their phone for some testing would be very useful.
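For context, this is essentially all a packet sniffer is. A minimal sketch against the standard libpcap C API ("en0" is just a placeholder interface name, and opening it usually requires root):

/* sniff.c -- minimal libpcap capture loop. Build: cc sniff.c -lpcap */
#include <stdio.h>
#include <pcap/pcap.h>

static void on_packet(u_char *user, const struct pcap_pkthdr *hdr,
                      const u_char *bytes)
{
    (void)user; (void)bytes;
    printf("captured %u bytes\n", hdr->caplen);
}

int main(void)
{
    char errbuf[PCAP_ERRBUF_SIZE];
    /* 65535-byte snaplen, promiscuous mode off, 1000 ms read timeout. */
    pcap_t *handle = pcap_open_live("en0", 65535, 0, 1000, errbuf);
    if (!handle) {
        fprintf(stderr, "pcap_open_live: %s\n", errbuf);
        return 1;
    }
    pcap_loop(handle, 10, on_packet, NULL);   /* capture 10 packets */
    pcap_close(handle);
    return 0;
}

The difference with pcapd is only who gets to start it, and on whose device.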

Can bad guys use this stuff too? Sure they can. But they need physical access to your phone just like everyone else.

The whole idea that Apple has built a remotely exploitable back door into their system while telling the world that they have not seems awfully ridiculous. They are selling a product billed as highly secure and something like this would completely undermine everything they are working towards.

As someone earlier said, when someone can demonstrate a remote exploit of iOS, I'll start to worry. But if the first step involves getting physical access to the device, I'm just not going to be that concerned.
 
There is a lot of misinformation about this one going around (not really a big surprise there).
You seem too eager to add more.
But here are the basics:

- there are a number of services that are part of a normal iOS installation that could be used for forensic purposes.

- they are either turned off by default or accessible via USB only.
No. The usbmux daemon is also accessible over Wi-Fi (this was done to enable iTunes syncing over Wi-Fi). And the services can be started by the attacker simply by requesting them via commands sent to the daemon.
If you read the paper, you'll see that Apple provides the tools to lock down iOS devices completely. You can set them up so they cannot pair with any other computers. This prevents pretty much everything that can be exploited with these back doors.
This tool (Apple Configurator) is meant for enterprise administrators and is not exactly easy to use. It is also not available for Windows users. And disabling pairing means that you can no longer do iTunes syncing, app file transfers, photo access, etc. from any computer other than the management machine.
Nobody can just turn on these services remotely on your iPad or iPhone. They need physical access to your device.
Nope.
He mentioned a few possible methods of getting access to the device in the talk, such as a USB-like dongle that law enforcement connects to a target device during a traffic stop or some other interaction.
Yup. It takes just a few seconds, and can also be done by non-law-enforcement persons. Attackers could also steal the pairing keys from your computer using the same methods used to steal passwords (malware, social engineering, etc.). Or they could mount a man-in-the-middle attack on a public Wi-Fi network (perhaps with SSID "attwifi", so that nearby iPhones automatically log on). There are many ways.

The bottom line is that the pairing process is not very secure. So it should not expose sensitive information, at least not without additional protection (e.g. by requiring the user's backup encryption password).
As for the packet sniffer in particular - pretty sure just about every Linux distro comes with one as a part of its repository. Maybe not installed by default, but certainly installable with a command or two. The thing is, packet sniffers are legitimate IT tools. And if someone is having issues on the corporate wifi, having the ability to turn on pcap on their phone for some testing would be very useful.
This is ridiculous. Nobody debugs network issues using a phone.
As someone earlier said, when someone can demonstrate a remote exploit of iOS, I'll start to worry.
There is already a proof-of-concept out there.
 
You seem too eager to add more.
No. The usbmux daemon is also accessible over Wi-Fi (this was done to enable iTunes syncing over Wi-Fi). And the services can be started by the attacker simply by requesting them via commands sent to the daemon.

To use the usbmux daemon, the device still needs to be paired. That means either it was paired in the past, or your keys have been stolen -- which implies you have bigger issues to deal with.

This tool (Apple Configurator) is meant for enterprise administrators and is not exactly easy to use. It is also not available for Windows users. And disabling pairing means that you can no longer do iTunes syncing, app file transfers, photo access, etc. from any computer other than the management machine.

It's also freely downloadable by anyone via the Mac App Store -- I'm not sure what more Apple has to do to make this tool available.


They need either physical access to the device or they've already compromised your pairing keys from your PC/Mac.

Yup. It takes just a few seconds, and can also be done by non-law-enforcement persons. Attackers could also steal the pairing keys from your computer using the same methods used to steal passwords (malware, social engineering, etc.). Or they could mount a man-in-the-middle attack on a public Wi-Fi network (perhaps with SSID "attwifi", so that nearby iPhones automatically log on). There are many ways.

The bottom line is that the pairing process is not very secure. So it should not expose sensitive information, at least not without additional protection (e.g. by requiring the user's backup encryption password).

The pairing process, by its very nature, is about giving trust to another machine. If that machine is not secure, your iPhone is not going to be secure either. And if the user of the device will simply say yes to the prompt that says "Do you want to trust this computer?", they aren't going to be very secure either.

This is ridiculous. Nobody debugs network issues using a phone.

I wasn't trying to imply that someone would use their phone to debug all sorts of network issues. But having a tool like this on the device could come in handy when trying to debug an issue between the device and the network.

There is already a proof-of-concept out there.

Link, please. Again, if it requires physical access to the device or compromising another machine, then I'm not particularly worried. If you can somehow show that someone could gain access to my iPhone or iPad remotely without that, I'm interested in hearing more.
 
Statistically, very few of us live such colorful/interesting lives that monitoring them would be beneficial to an outside party. Yet, with that said, sure, the information provided in this blog is disturbing.

Most of the tin-foil-hat-wearing crowd (FYI, love the Weird Al "Foil" video) are like that. However, privacy matters no matter how mundane or exciting your life is.

I'm not that concerned about Big Brother, but more about Big Second Cousin. I'm referring to industrial espionage and amateur hacking. Government agencies have long and exhaustive protocols for wiretapping according to a set objective.

My biggest concern is the small-time privateer finding any security hole to exploit. IMO, a lot of "government harassment" in hacking and other compromises is typically traced to a small-time, local privateer with too much talent and too much time on their hands.
 
For example, there is a packet capture service (pcapd) that allows an attacker to remotely monitor the phone's complete network traffic, and a service that provides access to the same (and more) data accessible by the backup service while bypassing that service's encryption feature. Apple should provide an explanation and/or remove these services.

Almost any UNIX I know of has a pcap implementation in the form of tcpdump. Placed there, no doubt, by the NSA to spy on UNIX users.

And no worries, pcap and Wireshark (a GUI variant of tcpdump) work on Windows as well, and install for many Windows users without even asking for a password. Placed there by monkey boy Ballmer after the CIA threatened to publish that one photograph.

'Bypassing encryption' is a fallacy, a fake argument, a FUD-spreading lie. You have privileged access to the host, including the 'secret' keys used to encrypt/decrypt data. At that point there is no privacy anymore.

I've read the paper. Zdziarski points out that you need physical access, PIN, and passwords to access these data. Most of the press coverage seems to forget to mention this tiny detail. (With physical access, PIN, and password, I can disable backup encryption and get the same data from a backup, or I can just open the phone and *read* the messages, addresses, notes, emails, etc. *GASP*.)
 