And in this analogy you can't just change the locks, since you have no idea what specifically is wrong with the lock. That's why this is not very good. No one with good intentions would go to this Zerodium and give them an exploit. If I find something, I report it to the company that made it, and I make my device and all devices more secure.

Anyone who deals with this "company" is an immoral ***** in it only for the money.

Immoral, yes, but we are talking about well over a million dollars here. Money talks. How many people can honestly claim that they wouldn't be tempted if they ever found themselves in the exact same scenario?

That said, what's stopping the seller from double dipping? E.g.: first selling the exploit to the security company (or even multiple companies), then selling that same exploit to Apple to have it patched once they have received the money from the former?
 
You know, if Apple played their cards right, they could contract someone to work on their behalf and get $1.5 million of Zerodium's money and directly benefit Apple. In fact, a truly conniving company could create a hidden 'vulnerability', sell it to Zerodium, fix the code right away, and sink the potentially sabotaging company.

Pure genius. Love it.
 
Immoral, yes, but we are talking about well over a million dollars here. Money talks. How many people can honestly claim that they wouldn't be tempted if they ever found themselves in the exact same scenario?

That said, what's stopping the seller from double dipping? E.g.: first selling the exploit to the security company (or even multiple companies), then selling that same exploit to Apple to have it patched once they have received the money from the former?

I wouldn't. I have standards.
 
security through obscurity... what obscurity? iOS has a BILLION devices in use.
This is one of the most ignorant comments I've seen recently.

"In security engineering, security through obscurity (or security by obscurity) is the reliance on the secrecy of the design or implementation as the main method of providing security for a system or component of a system." (Google search definition)

It has nothing to do with the number of devices in use.

Android (at least AOSP) cannot rely on security through obscurity. It's out in the open. Lots and lots of people look at the code and discover bugs, for free, which lowers any possible financial incentive. The iOS codebase is secret, and thus much more expensive to investigate.

History has shown time and time again that the most secure software systems are (1) open source, with many eyes looking at it, (2) embracing full disclosure, basically a policy of transparency and honesty, and (3) committed to timely fixes.

Apple's software is (1) closed source, (2) fighting disclosure, with a policy of obstructionism and deception and (3) very lax about fixing issues.

For this, Apple is nothing but a mere pretender in the security game. Most of their security efforts are spent on marketing, which is enough to fool their security-illiterate, naive users.
 
My thoughts exactly. This is a national security issue, not some business deal. We have the Patriot Act, but no legal requirement to report potential security vulnerabilities to the companies that make hardware and software?

Anybody who takes this 'bounty' should be held legally liable, along with Zerodium, for any damages caused by a customer exploiting a bug...

Ironically, I imagine one of Zerodium's loyal customers will be the NSA. So I doubt that will happen.
 
The iOS codebase is secret, and thus much more expensive to investigate.

...

Apple's software is (1) closed source, (2) fighting disclosure, with a policy of obstructionism and deception and (3) very lax about fixing issues.

That is not completely accurate. Most of the base OS is open source, as is the browser engine. The essential closed-source component is the UI framework.
 
This is one of the most ignorant comments I've seen recently.

"In security engineering, security through obscurity (or security by obscurity) is the reliance on the secrecy of the design or implementation as the main method of providing security for a system or component of a system." (Google search definition)

It has nothing to do with the number of devices in use.

Android (at least AOSP) cannot rely on security through obscurity. It's out in the open. Lots and lots of people look at the code and discover bugs, for free, which lowers any possible financial incentive. The iOS codebase is secret, and thus much more expensive to investigate.

History has shown time and time again that the most secure software systems are (1) open source, with many eyes looking at it, (2) embracing full disclosure, basically a policy of transparency and honesty, and (3) committed to timely fixes.

Apple's software is (1) closed source, (2) fighting disclosure, with a policy of obstructionism and deception and (3) very lax about fixing issues.

For this, Apple is nothing but a mere pretender in the security game. Most of their security efforts are spent on marketing, which is enough to fool their security-illiterate, naive users.

Whatever floats your boat, then, if even the very words of the guy behind that exploit company can't convince you.
 
If I sold a vulnerability in your home alarm system to someone, would you like that?

How about your internet connected smart meter for your electricity?

Most of the Internet of Things devices that helped take down Brian Krebs's security website last week were just internet-connected cameras. We're so setting ourselves up for a crisis.

That said, this market for these exploits exists (legal or not, with the U.S. govt being by far the biggest buyer of them, year over year), and it's probably better that it's somewhat in the open so we know about it, as opposed to just pushing it under the covers.
 
Why shouldn't this be legal?
Here is why:
"Rather than report the vulnerabilities to Apple, Zerodium said that it would sell the exploit to its customers, which include major technology, finance, and defense corporations, as well as government agencies."
In other words, you have someone out there paying huge $ to hackers to exploit systems and, in turn, selling those secrets to governments or other entities. This puts all of us at risk, as security exploits are no longer being reported to be patched; they're just being exploited. It's against the common good.
 
As someone who just went through an iPhone hack, I never want that to happen again. I have Zero tolerance for companies like Zerodium and anyone who works with them.

What Apple needs to do is up their "reward" for finding exploits well beyond Zerodium's bounty so their many customers are safe. They've got the money, and it would be well spent (unlike some hefty purchases made in the past couple of years). If Apple wants to prove they are big on privacy, this is one way they could do it.
 
My thoughts exactly. This is a national security issue, not some business deal. We have the Patriot Act, but no legal requirement to report potential security vulnerabilities to the companies that make hardware and software?
You seem to be overlooking who Zerodium's customers are: "...its customers, which include major technology, finance, and defense corporations, as well as government agencies." I hate to be the one to tell you this, but the government is doing the exact same thing Zerodium is doing. Also, Apple's potential issues do not rise to the level of national security.

Anybody who takes this 'bounty' should be held legally liable, along with Zerodium, for any damages caused by a customer exploiting a bug...
How are you going to know: 1. who collected the bounty, 2. who or how many entities Zerodium sold the vulnerability to, and 3. who to attribute an exploit to, since Zerodium is not the only player in that field?

You know, if Apple played their cards right, they could contract someone to work on their behalf and get $1.5 million of Zerodium's money and directly benefit Apple. In fact, a truly conniving company could create a hidden 'vulnerability', sell it to Zerodium, fix the code right away, and sink the potentially sabotaging company.
Somebody has been watching Mission Impossible reruns.:D:p Apple doesn't need to create any "hidden" vulnerabilities. They have real vulnerabilities readily available, and it would be exceedingly stupid to leave a hole in security. For your plan to work, Apple would have to know when Zerodium planned to test and sell the vulnerability and when the customer was going to use it. {back to MI: Bad Plan Protocol} Using info obtained while testing the "trap", Zerodium stumbles upon an access point that allows unfettered system access. Oops. Tom Cruise gets killed. Scientology goes apeshit and decides to board their secret spaceship and head to Mars. Once off planet, they release a super virus that kills all mankind.
Is this what you want, Chris? Is it? IS IT!!
 
Android is open source, so it's easier to look for security flaws.

In the meantime, Apple's source code includes things such as the goto fail bug, which speaks volumes about its security and value, and volumes about Apple's code review and sensitive code change practices.
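
For anyone who never saw it, the bug shape was roughly this. A simplified, self-contained C sketch (paraphrased from the published code; the function and variable names are illustrative, not Apple's actual SecureTransport source):

    #include <stdio.h>

    /* A simplified sketch of the "goto fail" bug shape. The duplicated
     * goto is always taken, so the final check never runs and err still
     * holds 0 from the last successful step. */
    static int verify_signature(int hash_ok, int signature_ok)
    {
        int err;

        if ((err = hash_ok ? 0 : -1) != 0)
            goto fail;
            goto fail;  /* BUG: duplicated line, unconditionally skips the check below */
        if ((err = signature_ok ? 0 : -1) != 0)
            goto fail;

    fail:
        return err;     /* 0 == "verified", even for a forged signature */
    }

    int main(void)
    {
        /* Good hash, forged signature: still reported as accepted. */
        printf("forged signature verdict: %d (0 means accepted)\n",
               verify_signature(1, 0));
        return 0;
    }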

So why are there unpatched vulnerabilities for YEARS in open source software (Heartbleed)? Bugs happen no matter how many eyes are on it. Heartbleed is just one example; there are many, many more.
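
And Heartbleed's shape was just as mundane, and it sat in code anyone could read for roughly two years. A simplified, self-contained C sketch (again paraphrased and illustrative, not OpenSSL's actual source):

    #include <stdio.h>
    #include <string.h>

    /* A simplified sketch of the Heartbleed bug shape. The sender claims
     * a payload length and the code trusts it, so the memcpy can read far
     * past the real payload and echo back adjacent heap memory. */
    static size_t build_heartbeat_response(const unsigned char *record,
                                           size_t record_len,
                                           unsigned char *response)
    {
        /* First two bytes of the record claim the payload length. */
        size_t claimed_len = ((size_t)record[0] << 8) | record[1];
        const unsigned char *payload = record + 2;

        /* BUG: missing bounds check, i.e. something like
         *   if (claimed_len + 2 > record_len) return 0;
         * so claimed_len bytes are copied back regardless of what was sent. */
        memcpy(response, payload, claimed_len);
        return claimed_len;
    }

    int main(void)
    {
        /* Claims a 65535-byte payload but actually sends 2 bytes. */
        unsigned char record[] = { 0xFF, 0xFF, 'h', 'i' };
        static unsigned char response[65536];

        size_t n = build_heartbeat_response(record, sizeof record, response);
        printf("echoed %zu bytes for a 2-byte payload\n", n);
        return 0;
    }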
 
Android exploits cost less because there are already plenty of attack vectors and far fewer people are running the latest and greatest Android version. This pales in comparison to the iOS 10 install base, which currently does not have an attack vector that we know of, and Apple closes those gaps in security very fast with its ability to roll out security updates.

Also, there is no reason you couldn't sell the exploit to both Apple and Zerodium. Apple's payout STARTS at $200,000 and goes up to, I think, $500,000.
 
My thoughts exactly. This is a national security issue, not some business deal. We have the Patriot Act, but no legal requirement to report potential security vulnerabilities to the companies that make hardware and software?

Anybody who takes this 'bounty' should be held legally liable, along Zerodium, for any damages caused by a customer exploiting a bug...

Especially when people in finance, health, and government use iOS.

We heavily used iOS at work in healthcare. If there were a leaked exploit, I'm pretty sure nothing would happen, given the state of our government, specifically the FBI.
Android exploits cost less because there are already plenty of attack vectors and far fewer people are running the latest and greatest Android version. This pales in comparison to the iOS 10 install base, which currently does not have an attack vector that we know of, and Apple closes those gaps in security very fast with its ability to roll out security updates.

Also, there is no reason you couldn't sell the exploit to both Apple and Zerodium. Apple's payout STARTS at $200,000 and goes up to, I think, $500,000.

Finding an attack vector on Android will probably reach 1% of its user base, given how stupidly fragmented it is.

Attacking iOS 9 or 10, you can easily get 35 to 45%, or more if you target both.
 
For all we know, iOS could have many more security flaws than Android has, but they're more expensive to find due to its closed-source nature.

Most of the important parts of iOS are open source. Not that it means anything when looking for vulnerabilities; do you think there are fewer exploits for IE than for WebKit-based browsers because MS keeps the source closed?
 
Why shouldn't this be legal?
There are many things that society/legislatures don't want to happen and make illegal. In the realm of software, cracking DRM has been made illegal (not very successfully, though). Many types of weapons are illegal, though which types varies from country to country (in the UK, even knives above a certain size are illegal). In other security-related areas, owning a TSA master key without permission is illegal, in many countries police scanners are sort-of illegal, and so are IMSI catchers.

Obviously, outlawing information (which is what computer code actually is) is much harder than disallowing hardware, and even that is often difficult. And it might in some instances run afoul of free speech rights (again very country-dependent, but with 'information', outlawing it in one country while it's legal in other countries is often rather pointless).
When did "not user friendly" equate to "illegal"?
Things get really out of whack when jokes are taken at face value: "This is so user unfriendly, it should be illegal".
 
This is one of the most ignorant comments I've seen recently.

"In security engineering, security through obscurity (or security by obscurity) is the reliance on the secrecy of the design or implementation as the main method of providing security for a system or component of a system." (Google search definition)

It has nothing to do with the number of devices in use.

Android (at least AOSP) cannot rely on security through obscurity. It's out in the open. Lots and lots of people look at the code and discover bugs, for free, which lowers any possible financial incentive. The iOS codebase is secret, and thus much more expensive to investigate.

History has shown time and time again that the most secure software systems are (1) open source, with many eyes looking at it, (2) embracing full disclosure, basically a policy of transparency and honesty, and (3) committed to timely fixes.

Apple's software is (1) closed source, (2) fighting disclosure, with a policy of obstructionism and deception and (3) very lax about fixing issues.

For this, Apple is nothing but a mere pretender in the security game. Most of their security efforts are spent on marketing, which is enough to fool their security-illiterate, naive users.

You've already been presented with this quote, but here it is again:

"That means that iOS 10 chain exploits are either 7.5 x harder than Android or the demand for iOS exploits is 7.5 x higher. The reality is a mix of both"

You're talking in theory, but this is reality. Regardless of what open source ideology *should* do for Android, it is obviously harder to find exploits in iOS than in Android, and the patches for exploits that are found in iOS make their way to far more active devices far more quickly than they do on Android. Google has taken steps to close the gap, but the people forking out millions of dollars for these exploits obviously consider iOS exploits at least somewhat harder to find than Android exploits.
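(If you're wondering where the 7.5x comes from, it presumably falls straight out of Zerodium's own price list at the time: a $1,500,000 iOS payout against a $200,000 Android one, and 1,500,000 / 200,000 = 7.5.)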
 
security through obscurity... what obscurity? iOS has a BILLION devices in use.

Zerodium's CEO's words couldn't be clearer; let's requote them: "That means that iOS 10 chain exploits are either 7.5 x harder than Android or the demand for iOS exploits is 7.5 x higher. The reality is a mix of both"

Here's the full description:
https://en.wikipedia.org/wiki/Security_through_obscurity

Microsoft used to rely heavily on "security through obscurity" so as to avoid fixing vulnerabilities. There are more installations of Windows than there are iOS devices.
 
Just going to leave this here for future reference.

iPads to Replace Paper Reference Manuals in All AA Cockpits



Who are Zerodium's customers? Are they vetted? How? What kinds of things would disqualify an extremely rich customer from purchasing an exploit?

This is why world leaders are increasingly talking about cyber security. These kinds of OS exploits and bugs can actually become weaponisable technologies. In some sense, a company like Zerodium is a kind of PMC. PMCs are largely unregulated mercenaries who will defend abhorrent dictators in the third world with modern US weaponry for enough cash. A cyber version is much scarier, because they can easily target civilian populations anywhere in the world at any time.
 
This is one of the most ignorant comments I've seen recently.

"In security engineering, security through obscurity (or security by obscurity) is the reliance on the secrecy of the design or implementation as the main method of providing security for a system or component of a system." (Google search definition)

It has nothing to do with the number of devices in use.
You know full well that one explanation for the lower vulnerability of the Mac, the presumption that it is due to the much smaller install base of Macs vs. PCs, has long been described as 'security through obscurity'. Dictionary definitions do not matter if they don't reflect the actual usage of a term.
 
As someone who just went through an iPhone hack, I never want that to happen again. I have Zero tolerance for companies like Zerodium and anyone who works with them.

What Apple needs to do is up their "reward" for finding exploits well beyond Zerodium's bounty so their many customers are safe. They've got the money, and it would be well spent (unlike some hefty purchases made in the past couple of years). If Apple wants to prove they are big on privacy, this is one way they could do it.

Alternatively, Apple could tell the public that if you manage to get $1.5 million from this company for an exploit, they will take you to court until your money is gone.
 