Hackers have had an "easy way" to get certain malware past signature checks in third-party security tools since Apple's OS X Leopard operating system in 2007, according to a detailed new report today by Ars Technica. Researchers discovered that hackers could essentially trick the security tools -- designed to sniff out suspiciously signed software -- into reporting that malicious code was officially signed by Apple.

[Image: macOS code-signing bypass]

The researchers said that the signature bypassing method is so "easy" and "trivial" that pretty much any hacker who discovered it could pass off malicious code as an app that appeared to be signed by Apple. These digital signatures are core security functions that let users know the app in question was signed with the private key of a trusted party, like Apple does with its first-party apps.

Joshua Pitts, senior penetration testing engineer for security firm Okta, said he discovered the technique in February and informed Apple and the third-party developers about it soon after. Okta today also published information about the bypass, including a detailed disclosure timeline that began on February 22 with a report submitted to Apple and continues to today's public disclosure.

Ars Technica broke down how the method was used and which third-party tools are affected:
The technique worked using a binary format, alternatively known as a Fat or Universal file, that contained several files that were written for different CPUs used in Macs over the years, such as i386, x86_64, or PPC. Only the first so-called Mach-O file in the bundle had to be signed by Apple. At least eight third-party tools would show other non-signed executable code included in the same bundle as being signed by Apple, too.

Affected third-party tools included VirusTotal, Google Santa, Facebook OSQuery, the Little Snitch Firewall, Yelp OSXCollector, Carbon Black's Cb Response, and several tools from Objective-See. Many companies and individuals rely on some of the tools to help implement whitelisting processes that permit only approved applications to be installed on a computer, while forbidding all others.
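One way for a third-party tool to defend against this kind of Fat/Universal trick is to validate every architecture slice rather than trusting the file as a whole. The Swift sketch below is a minimal illustration of that idea using Apple's Security framework (the function name, the particular flags chosen, and the error handling are assumptions for illustration, not code from any of the affected tools):

[CODE]
import Foundation
import Security

// Illustrative helper: returns true only if every architecture slice in the
// file at `path` validates against the "anchor apple" requirement, i.e. is
// genuinely signed by Apple, rather than trusting whichever slice happens to
// match the host CPU.
func isSignedByAppleInAllSlices(path: String) -> Bool {
    var code: SecStaticCode?
    guard SecStaticCodeCreateWithPath(URL(fileURLWithPath: path) as CFURL, [], &code) == errSecSuccess,
          let staticCode = code else {
        return false
    }

    // "anchor apple" matches code signed by Apple itself, not merely code whose
    // certificate chains to an Apple root (such as a Developer ID).
    var requirement: SecRequirement?
    guard SecRequirementCreateWithString("anchor apple" as CFString, [], &requirement) == errSecSuccess,
          let appleRequirement = requirement else {
        return false
    }

    // kSecCSCheckAllArchitectures forces validation of every Mach-O slice in a
    // Fat/Universal binary; kSecCSStrictValidate tightens structural checks.
    // (Both constants come from SecStaticCode.h.)
    let flags = SecCSFlags(rawValue: UInt32(kSecCSCheckAllArchitectures) | UInt32(kSecCSStrictValidate))
    return SecStaticCodeCheckValidity(staticCode, flags, appleRequirement) == errSecSuccess
}
[/CODE]

A check that omits these flags can end up judging the whole Fat file by a single, legitimately signed slice, which is the sort of gap the crafted files exploited.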
Developer Patrick Wardle spoke on the topic, explaining that the bypass was due to ambiguous documentation and comments provided by Apple regarding the use of publicly available programming interfaces that make digital signature checks function: "To be clear, this is not a vulnerability or bug in Apple's code... basically just unclear/confusing documentation that led to people using their API incorrectly." It's also not an issue exclusive to Apple and macOS third-party security tools, as Wardle pointed out: "If a hacker wants to bypass your tool and targets it directly, they will win."

For its part, Apple was said to have stated on March 20 that it did not see the bypass as a security issue that needed to be directly addressed. On March 29, the company updated its documentation to be more clear on the matter, stating that "third-party developers will need to do additional work to verify that all of the identities in a universal binary are the same if they want to present a meaningful result."
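In practice, that "additional work" means inspecting each architecture in the universal binary separately and confirming the slices agree on a single signing identity. The Swift sketch below shows one illustrative way to do that (the architecture list and the choice to compare code-signing identifiers are assumptions, not a recipe prescribed by Apple):

[CODE]
import Foundation
import Security

// Illustrative check: create a static-code object for each architecture slice
// and confirm that every slice present reports the same code-signing identifier.
// The architecture list is an assumption; a real tool would enumerate the
// slices actually contained in the Fat header.
func slicesShareOneSigningIdentity(path: String,
                                   architectures: [String] = ["x86_64", "i386", "ppc"]) -> Bool {
    let url = URL(fileURLWithPath: path) as CFURL
    var identifiers = Set<String>()

    for arch in architectures {
        // Target one specific slice of the universal binary.
        let attributes = [kSecCodeAttributeArchitecture as String: arch] as CFDictionary
        var code: SecStaticCode?
        guard SecStaticCodeCreateWithPathAndAttributes(url, [], attributes, &code) == errSecSuccess,
              let slice = code else {
            continue // this architecture is not present in the file
        }

        // Read the slice's signing information and record its identifier.
        var info: CFDictionary?
        guard SecCodeCopySigningInformation(slice, [], &info) == errSecSuccess,
              let infoDict = info as? [String: Any],
              let identifier = infoDict[kSecCodeInfoIdentifier as String] as? String else {
            return false // unsigned or unreadable slice
        }
        identifiers.insert(identifier)
    }

    // All examined slices must agree on a single signing identity.
    return identifiers.count == 1
}
[/CODE]

Comparing team identifiers (kSecCodeInfoTeamIdentifier) or the leaf certificates of each slice would presumably follow the same pattern.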

Article Link: Third-Party macOS Security Tools Vulnerable to Malware Code-Signing Bypasses for Years
 
These companies are prioritizing speed over security. We can assume they'll now implement proper checks, but it will come at the cost of speed.

I'm sure most won't bother to read this article and blame Apple, but the real blame here is with developers including Little Snitch, xFence, and Facebook's OSquery. They're the ones that failed to properly check these signatures.
 
I'm sure most won't bother to read this article and blame Apple, but the real blame here is with developers including Little Snitch, xFence, and Facebook's OSquery. They're the ones that failed to properly check these signatures.

It's Apple's fault. When 8 separate developers use the API in the wrong way, there's an issue with the API and instructions.
 
So... I'm confused. Doesn't macOS itself check whether an app is properly signed or not before it's allowed to run? Is that feature working properly (and does it actually exist?) If so, then this really doesn't matter.

If not... then perhaps there's actually a need for virus checkers in macOS that I wasn't aware of?
 
It's Apple's fault. When 8 separate developers use the API in the wrong way, there's an issue with the API and instructions.

No, it's really not. It's the developer's responsibility to use the proper security procedures in their app. Is it the state's fault that people fail to follow speed limit signs?
 
Yeah, since the built-in signature checking isn't affected, this is much less of an issue.

Reading the original article on Okta, it does really hammer home how overly complex these systems are. And complexity is IMO the biggest enemy of security. It seems to me it's very much "easy to fail" rather than "easy to succeed".
 
So... I'm confused. Doesn't macOS itself check whether an app is properly signed or not before it's allowed to run? Is that feature working properly (and does it actually exist?) If so, then this really doesn't matter.

If not... then perhaps there's actually a need for virus checkers in macOS that I wasn't aware of?

That is the inherent problem with X.509 and Gatekeeper. It defaults to trusting anything in its trust store. With Gatekeeper you just pay for a developer account and get your signing key, or steal someone else's, and your code will run on any Mac by default. Determining code authenticity is a solved problem with TOFU tools like PGP/GPG. There is some really critical software out there that does not have GPG signatures, https://mostvulnerable.com.
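For what it's worth, the Security framework does let a tool state precisely how much it is willing to trust, via requirement strings: "anchor apple" only matches code signed by Apple itself, while "anchor apple generic" matches anything chaining to an Apple root, including any paid-for Developer ID certificate. A small illustrative Swift snippet (the function name and example paths are made up):

[CODE]
import Foundation
import Security

// Illustrative: evaluate a binary against an arbitrary code-signing requirement string.
func satisfies(requirement requirementString: String, path: String) -> Bool {
    var code: SecStaticCode?
    guard SecStaticCodeCreateWithPath(URL(fileURLWithPath: path) as CFURL, [], &code) == errSecSuccess,
          let staticCode = code else {
        return false
    }

    var requirement: SecRequirement?
    guard SecRequirementCreateWithString(requirementString as CFString, [], &requirement) == errSecSuccess else {
        return false
    }
    return SecStaticCodeCheckValidity(staticCode, [], requirement) == errSecSuccess
}

// Signed by Apple itself:
print("Signed by Apple itself:", satisfies(requirement: "anchor apple", path: "/Applications/Safari.app"))
// Merely chains to an Apple root (e.g. any Developer ID; path is hypothetical):
print("Chains to an Apple root:", satisfies(requirement: "anchor apple generic", path: "/Applications/SomeApp.app"))
[/CODE]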
 
I'm sure most won't bother to read this article and blame Apple, but the real blame here is with developers including Little Snitch, xFence, and Facebook's OSquery. They're the ones that failed to properly check these signatures.
I'm not sure I'd blame the devs here. The problem is the documentation. Once again reminding us that tech writers are an underappreciated bunch.
 
I'm not sure I'd blame the devs here. The problem is the documentation. Once again reminding us that tech writers are an underappreciated bunch.

The current Apple documentation insists on the need to vet all certificates. But that slows things down, which is why some developers have chosen not to do so.

Is it the state's fault if people don't follow speed limit signs?
 
Can someone explain the issue at hand, please? I don't really understand the problem here.
I have Little Snitch -- is Little Snitch easily hacked or what?
Bit of a confusing article. :confused:
 
Can someone explain the issue at hand, please? I don't really understand the problem here.
I have Little Snitch -- is Little Snitch easily hacked or what?
Bit of a confusing article. :confused:
Not easily hacked -- see this discussion:
https://forums.obdev.at/viewtopic.php?f=1&t=11372

It seems that LS 4.1 addressed the problem, but we will need to wait for the developers to respond to this speculation.
 
"stating that "third-party developers will need to do additional work to verify that all of the identities in a universal binary are the same if they want to present a meaningful result."
 
AFAIK, Apple's only error was lack of clarity in the documentation, which I gather they fixed. But when that happens and there's a known problem as a result affecting multiple apps, I hope they make a reasonable effort to actively inform registered developers.
 



Yes, I noticed just yesterday that the main obstacle to removing malware from a Mac was the Mac's own security ... preventing the removal as long as SIP is activated.
 