
MacRumors

macrumors bot
Original poster



Israel-based software developer Cellebrite, known for breaking into mobile devices like the iPhone to obtain sensitive data, has announced that it can now unlock any iOS device running up to iOS 12.3, which was released only a month ago.

[Image: iPhone X and iPad Pro]

The firm revealed the capability in a tweet posted late Friday advertising UFED Premium, the latest version of its Universal Forensic Extraction Device.

On its UFED web page, Cellebrite describes the tool's ability to glean forensic data from any iOS device dating back to iOS 7, as well as from Android devices made by Samsung, Huawei, LG, and Xiaomi.

The Israeli firm describes UFED Premium as "the only on-premise solution for law enforcement agencies to unlock and extract crucial mobile phone evidence from all iOS and high-end Android devices."

If the claims are accurate, Cellebrite's tool will enable authorities to potentially crack the vast majority of smartphones currently available on the market. As Wired notes, no other law enforcement contractor has made such broad claims about a single product, at least not publicly.

Apple continually introduces improvements to the security of its operating systems in order to keep ahead of companies like Cellebrite that are always searching for flaws and vulnerabilities to exploit in order to access the data on locked iOS devices.

For example, in October 2018 Apple successfully thwarted the "GrayKey" iPhone passcode hack, sold by Atlanta-based company Grayshift, which had also been in use by U.S. law enforcement.

Cellebrite first garnered significant attention in 2016, when it was believed the company was enlisted to help the FBI break into the iPhone 5c of San Bernardino shooter Syed Farook after Apple refused to provide the FBI with tools to unlock the device.

The FBI did not use Cellebrite's services for that particular case, but several United States government agencies do regularly work with Cellebrite to unlock iOS devices.

According to Wired's sources, Grayshift has developed tools to unlock at least some versions of iOS 12. If true, the firm is still keeping its cards close to its chest, but probably not for much longer.

Even as Apple works to increase the security of its iOS devices, Cellebrite's brazen announcement suggests the cat-and-mouse game of exploiting vulnerabilities in mobile device software will only become more competitive, as rival companies attempt to grab a bigger share of the market.

Note: Due to the political nature of the discussion regarding this topic, the discussion thread is located in our Politics, Religion, Social Issues forum. All forum members and site visitors are welcome to read and follow the thread, but posting is limited to forum members with at least 100 posts.

Article Link: Data Extraction Company Cellebrite Touts New Software for Cracking iPhones and iPads Running up to iOS 12.3
 
Cellebrite first garnered significant attention in 2016, when it was believed the company was enlisted to help the FBI break into the iPhone 5c of San Bernardino shooter Syed Farook after Apple refused to provide the FBI with tools to unlock the device.
If I recall correctly, Apple did not "refuse to provide the FBI with tools to unlock the device"; they simply said they had no way to do what they were asked.
 
It’s funny because Apple Stores use Cellebrite machines to transfer data from device to device. I don’t think they use them much anymore, but stores definitely have them.
 
Yes, but neither 12.3.1 nor 12.3.2 fixed any security issues.

Yeah, and Cellebrite claims they can "Bypass or determine locks and perform a full file system extraction on any iOS device" (emphasis mine). Maybe 12.3.x is safe, maybe Cellebrite just does not bother stating the patch number, maybe they just had not tested it at the time they prepared the text... I would not feel too safe about this.
 
It only works if the user picks a weak password, like a short digit code; if they use a complex passcode, it's game over for the attacker. But how many users pick complex passcodes? They tend to stick with a 4-digit code, as it's good enough for most users. Criminals will use complex codes and add further encryption on top of the existing encryption, making any attempt by the FBI futile. Then again, criminals are not that smart, which is why there is still a market for this, at least until they smarten up.
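For a rough sense of why length and alphabet size matter far more than which digits you pick, here's a back-of-the-envelope Swift sketch. The guess rate is a made-up illustrative figure, not a measured benchmark for Cellebrite or any other tool:

```swift
import Foundation

// Time to exhaust a passcode space by brute force.
// attemptsPerSecond is purely illustrative, not a real benchmark.
func exhaustTime(symbols: Double, length: Double,
                 attemptsPerSecond: Double) -> Double {
    pow(symbols, length) / attemptsPerSecond
}

let rate = 100.0  // hypothetical guesses per second

// 4-digit PIN: 10^4 combinations -> minutes
print(exhaustTime(symbols: 10, length: 4, attemptsPerSecond: rate) / 60, "minutes")        // ~1.7

// 6-digit PIN: 10^6 combinations -> hours
print(exhaustTime(symbols: 10, length: 6, attemptsPerSecond: rate) / 3_600, "hours")       // ~2.8

// 8-character alphanumeric (36 symbols): 36^8 combinations -> centuries
print(exhaustTime(symbols: 36, length: 8, attemptsPerSecond: rate) / 31_536_000, "years")  // ~895
```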

Not long ago the police had to take away an iPad and iPhone to see if they held anything useful for finding a missing person. They claimed their experts could get any deleted material back if needed, but I had managed the iPad for him, keeping it organised and removing unwanted junk. I'm a developer, so when I delete anything it's encrypted, then trashed, and any entry in the file system is removed as well. They soon came back with the devices, as they could not retrieve anything: totally blank.
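If you're curious what the "encrypted, then trashed" step could look like, here is a minimal Swift/CryptoKit sketch of the idea, sometimes called crypto-shredding. The function and flow are my own illustration, and on a copy-on-write filesystem like APFS the rewrite may land on new blocks, so full-device encryption remains the real safeguard:

```swift
import Foundation
import CryptoKit

// "Crypto-shredding" sketch: replace a file's contents with ciphertext
// under a throwaway key, then delete the file and let the key vanish.
func shred(fileAt url: URL) throws {
    let plaintext = try Data(contentsOf: url)

    // Ephemeral key, never persisted anywhere.
    let key = SymmetricKey(size: .bits256)
    let sealed = try AES.GCM.seal(plaintext, using: key)

    // Overwrite the contents so any recovered blocks look like random noise...
    try sealed.combined!.write(to: url)

    // ...then remove the directory entry. The key goes out of scope here,
    // leaving the ciphertext permanently undecryptable.
    try FileManager.default.removeItem(at: url)
}
```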
 
It only works if the user picks a weak password, like a short digit code; if they use a complex passcode, it's game over for the attacker. But how many users pick complex passcodes? They tend to stick with a 4-digit code, as it's good enough for most users. Criminals will use complex codes and add further encryption on top of the existing encryption, making any attempt by the FBI futile. Then again, criminals are not that smart, which is why there is still a market for this, at least until they smarten up.

May not be limited to criminals. With my corporate sign-on, part of the initial setup was converting the existing 4-digit passcode to 8 digits, so it applies to all iPhone/iPad use, not just the corporate applications. That is just to access the phone, as the corporate alphanumeric password sign-in is later required for that content and corporate email. Also, a very easy security feature I hadn't thought of until it was mentioned by a 'genius' at the Apple Store: the numbers don't all have to be different. Obviously you don't want something like 8 of the same, but if you double or triple up a couple of times, it exponentially expands the pool of probabilities that any hacking must deal with.
 
Given that marketing is basically legal brainwashing, I can boldly say Cellebrite might simply want some media attention and the upper hand in marketing against its competitors. Whether they actually have an up-to-date tool could be another debate. I just assume a government database holds an up-to-date, real-time cache of data from all my devices connected to the internet, period.
 
As long as companies continue to find ways to break into devices, there is no need for legislation requiring manufacturers to provide law enforcement with back doors.
 
As long as companies continue to find ways to break into devices, there is no need for legislation requiring manufacturers to provide law enforcement with back doors.

I would argue that even if there weren't private-sector solutions, there would still be no need to require back doors in smartphones and other computer hardware, because (beyond user privacy) there is no such thing as a secure back door; think back to the exploits that leaked from the NSA, and consider how the access would be abused by governments. JMHO...
 
As long as companies continue to find ways to break into devices, there is no need for legislation requiring manufacturers to provide law enforcement with back doors.

As long as Apple continues to have bugs that allow people to break in, how can Apple claim the high ground with regard to privacy? At the end of the day, it does not matter whether it's a bug (in Apple's case) or a feature (in Google's case); it still puts our devices at risk.
 
As long as Apple continues to have bugs that allow people to break in, how can Apple claim the high ground with regard to privacy? At the end of the day, it does not matter whether it's a bug (in Apple's case) or a feature (in Google's case); it still puts our devices at risk.

C'mon. Even the "high ground" is not invulnerable. I think Apple can say that iOS is the most secure system. People can interpret that as they will. It does matter if it's a feature or a bug. And, of course Apple will change things as soon as they can find Cellebrite's method. Locking out the Lightning data lines was a great start. Google has less of an imperative.
 
As long as Apple continues to have bugs that allow people to break in, how can Apple claim the high ground with regard to privacy? At the end of the day, it does not matter whether it's a bug (in Apple's case) or a feature (in Google's case); it still puts our devices at risk.

Because they're security issues, nothing to do with their privacy policy.
 
If you don’t care about jailbreaking, the latest iOS 12.4 beta is looking pretty good right now.
 
C'mon. Even the "high ground" is not invulnerable. I think Apple can say that iOS is the most secure system. People can interpret that as they will. It does matter if it's a feature or a bug. And, of course Apple will change things as soon as they can find Cellebrite's method. Locking out the Lightning data lines was a great start. Google has less of an imperative.

The problem is that with a closed system like Apple's, we really don't know if it was a bug or a feature. The FBI really backed off Apple, and that is not like the government, unless they have negotiated something else in private.

This dance could go on forever: "Oops, here is another bug (we won't tell you exactly what it was, but we fixed it)." Then wash and repeat for the next Apple NSA feature that gets exposed.

Apple seems to have better security, but Apple is also the best marketing company and prefers marketing over real technology, so we really don't know, do we?
 
Because they're security issues, nothing to do with their privacy policy.

Really? If anyone (the NSA, the FBI, Russia, etc.) can break the security, then there is NO privacy, regardless of policy. Policy only has meaning if it is enforceable.

Let's say you live in an apartment, make a privacy policy that your living room is private, then try leaving your front door open and see how much privacy you actually get.
 
Now let’s please not see a bunch of silly posts about Apple not caring or being good at privacy/security.

Edit: Oops. Too late. Natch.

If I recall correctly, Apple did not "refuse to provide the FBI with tools to unlock the device"; they simply said they had no way to do what they were asked.
You’re only partially correct. They also framed it as a privacy and civil liberties issue. What they couldn’t do was access the San Bernardino iPhone, but what they wouldn’t do was build the FBI a custom iOS version with a backdoor.
 
Also, a very easy security feature I hadn't thought of until it was mentioned by a 'genius' at the Apple Store: the numbers don't all have to be different. Obviously you don't want something like 8 of the same, but if you double or triple up a couple of times, it exponentially expands the pool of probabilities that any hacking must deal with.

That's not quite true. An 8 digit numeric passcode has a fixed number of possibilities (100,000,000), whether or not numbers are duplicated.

Now, what is true behaviorally is that some passcodes are very unlikely to be chosen and others more likely; it's like the lottery, where fewer people pick 123456 because they think it will never come up, yet it's just as likely as any other 6-digit number.

It's all about what people think and how that influences behavior, especially the urge to make passcodes easy to remember. For example, a passcode made of a single repeated digit may be more likely than a random one because it is easy to remember, as may some combination of the 4 corners for a 4-digit code. In my own anecdotal experience, I worked for a company that made you change your password every 3 months and not reuse any of the last 6. They didn't, however, prevent you from resetting the password 7 times in a row; thus one never really needed a new password.
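To make the counting concrete, here's a quick Swift sketch: the 8-digit space is 10^8 no matter what, and codes containing repeated digits are a subset of that pool rather than an expansion of it:

```swift
// The space of 8-digit numeric codes is fixed, duplicates or not.
let totalCodes = 100_000_000                 // 10^8

// Codes whose 8 digits are all distinct: 10 * 9 * 8 * ... * 3
let allDistinct = (3...10).reduce(1, *)      // 1,814,400

// Codes with at least one repeated digit are simply the remainder of
// the same fixed pool, not an addition to it.
let withRepeats = totalCodes - allDistinct   // 98,185,600
print(totalCodes, allDistinct, withRepeats)
```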
As long as Apple continues to have bugs that allow people to break in, how can Apple claim the high ground with regard to privacy? At the end of the day, it does not matter whether it's a bug (in Apple's case) or a feature (in Google's case); it still puts our devices at risk.

The difference is in how companies respond. Apple appears to attempt to block exploits after they are discovered; other companies, not so much.

Edit: Corrected typo that said 100K, not 10^8 possibilities.
 
I didn't realize the iPhone was full of fake technology.

Not fake, but not what Apple claims either.

For example, they claim to be very privacy-minded: your data in iCloud is encrypted. But that encryption uses keys that only Apple holds and that you as the owner cannot control, so it is encryption for marketing purposes only. At the end of the day, Apple, the NSA, the FBI, criminals, and any other government agency willing to forge up a court order can look at anything they want.

The technology to give users full control of the encryption on iCloud exists, and has for years, but Apple has not implemented it. Why? Because it is an additional cost, and Apple's marketing works right now without it. Most people believe Apple's marketing and would never question the company.

The result is that we get fantastic claims and mediocre delivery. Now, I agree that Apple's delivery is better than the alternatives, but that does not mean that we, as users, should not demand more. We need to force Apple to live up to its marketing claims.
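For what it's worth, the building blocks for user-held keys ship in Apple's own CryptoKit today. Here's a rough sketch of the idea; this is my illustration, not Apple's actual iCloud design, and a real system would use a slow, salted password KDF rather than bare HKDF:

```swift
import Foundation
import CryptoKit

// Sketch of user-controlled cloud encryption: data is sealed with a key
// derived from a secret only the user knows, so the storage provider
// only ever holds ciphertext it cannot read.
func sealForUpload(_ data: Data, userSecret: Data, salt: Data) throws -> Data {
    let key = HKDF<SHA256>.deriveKey(
        inputKeyMaterial: SymmetricKey(data: userSecret),
        salt: salt,
        info: Data("cloud-backup-sketch".utf8),
        outputByteCount: 32)
    // combined = nonce + ciphertext + auth tag; safe to store remotely.
    return try AES.GCM.seal(data, using: key).combined!
}

func openAfterDownload(_ blob: Data, userSecret: Data, salt: Data) throws -> Data {
    let key = HKDF<SHA256>.deriveKey(
        inputKeyMaterial: SymmetricKey(data: userSecret),
        salt: salt,
        info: Data("cloud-backup-sketch".utf8),
        outputByteCount: 32)
    return try AES.GCM.open(AES.GCM.SealedBox(combined: blob), using: key)
}
```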
 
As long as Apple continues to have bugs that allow people to break in, how can Apple claim the high ground with regard to privacy? At the end of the day, it does not matter whether it's a bug (in Apple's case) or a feature (in Google's case); it still puts our devices at risk.

There's no such thing as bug-free software, and there's no such thing as 100% secure software. Security is, and always will be, a game of cat and mouse, and there's no way around it.
 