
MacRumors

Apple is today launching a new Apple Security Research Device Program designed to provide security researchers with special iPhones dedicated to security research, featuring unique code execution and containment policies.

[Image: Apple Security Research Device]

Apple said last year that it would provide security researchers with access to "special" iPhones that would make it easier to find security vulnerabilities and weaknesses, thereby making iOS devices more secure; that appears to be the program rolling out now.

The iPhones that Apple is providing to security researchers are less locked down than consumer devices and will make it easier to find serious security vulnerabilities.

Apple says the Security Research Device (SRD) offers shell access and can run any tools or entitlements, but other than that, it behaves similarly to a standard iPhone. SRDs are provided to security researchers on a 12-month renewable basis and remain Apple property. Bugs discovered with the SRD must be "promptly" reported to Apple and, if found in third-party code, to the relevant third party.
If you use the SRD to find, test, validate, verify, or confirm a vulnerability, you must promptly report it to Apple and, if the bug is in third-party code, to the appropriate third party. If you didn't use the SRD for any aspect of your work with a vulnerability, Apple strongly encourages (and rewards, through the Apple Security Bounty) that you report the vulnerability, but you are not required to do so.

If you report a vulnerability affecting Apple products, Apple will provide you with a publication date (usually the date on which Apple releases the update to resolve the issue). Apple will work in good faith to resolve each vulnerability as soon as practical. Until the publication date, you cannot discuss the vulnerability with others.
Apple is accepting applications for the Security Research Device Program. Requirements include membership in the Apple Developer Program and a track record of finding security issues on Apple platforms.

Those who participate in the program will have access to extensive documentation and a dedicated forum with Apple engineers; Apple told TechCrunch that it wants the program to be a collaboration.

The Security Research Device Program will run alongside the bug bounty program, and hackers can file bug reports with Apple and receive payouts of up to $1 million, with bonuses possible for the most serious vulnerabilities.

Article Link: Apple Launches Security Research Device Program to Give Bug Hunters Deeper OS Access to Find Vulnerabilities
 
We don't have time to QA because business wants new features...

...okay, kidding aside, that's good news... unless they want to open source all of iOS eventually, in which case security might get compromised.
 
How is this different from the crash logs we already have in iOS?
There's a huge difference. Right now there's no way to inspect the file system to see if there was a successful breach, and crash logs only contain a stack trace and a memory snapshot of the application. With this kit you have full access to parts of the device that would normally be protected, which lets you probe more sensitive areas such as the Secure Enclave.

It also lets you do more detailed API testing and fuzzing as root on the iPhone, similar to what Google Project Zero's Ian Beer does.
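The root-level fuzzing mentioned above can be sketched in miniature. This toy Python harness is purely illustrative (the `parse_header` target and all names are hypothetical, not Apple or Project Zero tooling): it mutates a seed input and records any inputs that make the target raise an exception, which is the basic loop a fuzzer runs against an OS service.

```python
import random


def mutate(data: bytes, rng: random.Random) -> bytes:
    """Flip a few random bytes to produce a mutated test case."""
    buf = bytearray(data)
    for _ in range(rng.randint(1, 4)):
        i = rng.randrange(len(buf))
        buf[i] = rng.randrange(256)
    return bytes(buf)


def fuzz(target, seed_input: bytes, iterations: int = 1000, seed: int = 0):
    """Feed mutated inputs to `target`, collecting inputs that raise."""
    rng = random.Random(seed)
    crashes = []
    for _ in range(iterations):
        case = mutate(seed_input, rng)
        try:
            target(case)
        except Exception as exc:  # a real fuzzer would watch for crashes/signals
            crashes.append((case, repr(exc)))
    return crashes


# Toy "parser" standing in for a privileged service endpoint.
def parse_header(data: bytes) -> int:
    if data[:2] != b"OK":
        raise ValueError("bad magic")
    return data[2]


found = fuzz(parse_header, b"OK\x01payload")
```

On a real SRD the target would be an actual system service reached with root privileges, and crashes would be triaged from the device rather than caught as Python exceptions; the loop structure is the same.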
 
Ah, now it makes sense. The real reason why they took issue with Corellium's tool.
 
The big issue is that Apple controls everything in this programme. Apple could decide not to fix an issue, and nobody would know, because only Apple decides when to release the information. That is, by the way, the reason Google's Project Zero won't join this programme: it conflicts with their 90-day publication policy.
 
The big issue is that Apple controls everything in this programme. Apple could decide not to fix an issue, and nobody would know, because only Apple decides when to release the information. That is, by the way, the reason Google's Project Zero won't join this programme: it conflicts with their 90-day publication policy.
It's their product, and it's their hardware, why wouldn't you give all the control and timeline to Apple?

If you have ever worked with government contractors, you know that's basically the deal everywhere in the industry.

If you work for a university or research institution, everything you discover and make is the IP of the institution or the grant giver, with some attribution to you. You don't get to decide when those findings are declassified and released to the public.
 
It's their product, and it's their hardware, why wouldn't you give all the control and timeline to Apple?

The thing is, why should I care? I care more about users and people who might depend on the security of those devices. Therefore I don't want Apple to control how security issues are handled or published.
 
This is good,

...that is, up until an Apple employee decides to take advantage of the situation.
 