This is exactly the response I would give too if I got caught.
> a new preference for users to opt out of these security protections

Good. That is what we needed.
> > 1. There is no application hash.
>
> Of course there is. You can create a hash of any file you like. They are obviously sending some kind of hash to verify the app.

Of course you could create a hash, but in this case, Apple doesn't — they transmit the hash of the developer's cert signature.
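For anyone who wants to see this concretely, here is a rough sketch of what goes into an OCSP request, using Python's `cryptography` package purely as an illustration (the PEM file names are placeholders, not anything Apple ships). The request is built from identifiers of the certificate, not from the app binary:

```python
# Sketch: what an OCSP request actually references (certificate identifiers,
# not an application hash). Assumes you have the leaf (Developer ID) cert and
# its issuing CA cert as PEM files; the file names here are made up.
from cryptography import x509
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.x509 import ocsp

leaf = x509.load_pem_x509_certificate(open("developer_id.pem", "rb").read())
issuer = x509.load_pem_x509_certificate(open("issuing_ca.pem", "rb").read())

builder = ocsp.OCSPRequestBuilder().add_certificate(leaf, issuer, hashes.SHA1())
request = builder.build()

# The request identifies the certificate, nothing else:
print("issuer name hash:", request.issuer_name_hash.hex())
print("issuer key hash: ", request.issuer_key_hash.hex())
print("serial number:   ", hex(request.serial_number))

# The DER bytes that would go over the wire (OCSP travels over plain HTTP):
wire = request.public_bytes(serialization.Encoding.DER)
print(len(wire), "bytes on the wire, none of them derived from the app itself")
```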
> > 2. Going from an IP address to "Computer, ISP" etc is quite a stretch.
>
> Not at all. I ran a free online ip geolocation tool on my IP address. The specified coordinates are about 2 miles from my actual location. That's the city/state nailed.

I'm telling you, there are plenty of IP addresses where the distance is a lot more than 2 miles.
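If you want to sanity-check the "about 2 miles" claim for your own connection, it comes down to the great-circle distance between the coordinates a geolocation lookup reports and where you actually are. A tiny sketch (the coordinates below are made-up placeholders, not anyone's real location):

```python
# Sketch: haversine (great-circle) distance between where an IP geolocation
# service *thinks* you are and where you actually are.
from math import radians, sin, cos, asin, sqrt

def haversine_miles(lat1, lon1, lat2, lon2):
    """Distance between two lat/lon points in miles."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 3958.8 * asin(sqrt(a))  # Earth radius is roughly 3958.8 miles

geolocated = (37.3349, -122.0090)   # what the lookup reported (placeholder)
actual     = (37.3230, -122.0322)   # where you really are (placeholder)

print(f"{haversine_miles(*geolocated, *actual):.1f} miles off")
```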
> As far as I'm concerned, this is now blocked at network level here.

That'll block any certificate revocation checks, which is unsafe for web browsing.
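If anyone wants to confirm whether that kind of network-level block is actually in effect on a given machine, a quick reachability check against the OCSP host is enough. A rough sketch (ocsp.apple.com is the hostname involved here, reachable over plain HTTP; adjust if your setup differs):

```python
# Sketch: check whether ocsp.apple.com is resolvable and reachable from this
# machine (e.g. after a hosts-file or DNS-level block). Purely illustrative.
import socket

HOST, PORT = "ocsp.apple.com", 80  # OCSP traffic goes over plain HTTP

try:
    addr = socket.gethostbyname(HOST)
    print(f"{HOST} resolves to {addr}")
    with socket.create_connection((addr, PORT), timeout=3):
        print("port 80 is reachable, so it is not blocked from here")
except OSError as exc:
    print(f"lookup or connect failed ({exc}): looks blocked or blackholed")
```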
> The larger issue here in my opinion is that Apple is bypassing firewalls and vpn apps and exposing your public ip. If you go to the trouble of using a vpn to hide your traffic apple shouldn’t be bypassing those measures and broadcasting unencrypted packets.

Well, they kind of addressed this in a lateral way: they said they will be stripping any identifying information from the requests, which I assume includes your IP.
Although this particular traffic is relatively harmless, the very idea that they thought that was a good design decision is disturbing.
How would that even work? Co-ordinated legislation would have to be passed by governments around the planet, and even if those laws were somehow passed, those countries would then have to agree that the thresholds have been met in order to trigger enforcement.
Bad in the sense that if a user pays for an app and the developer doesn't follow the guidelines, the user is left with an unusable app.
> What they should do is more like old AntiVirus where Signatures ( or list ) are updated very frequently instead of having macOS phoning home on every App opening.

It doesn’t phone home on every app opening. That was incorrect information. Everyone needs to read this before posting to make sure you have correct information about what’s going on. https://blog.jacopo.io/en/post/apple-ocsp/
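If the reason it's not on every launch is response caching (which is how OCSP normally works, and what the linked post describes), the idea looks roughly like this as a toy sketch. The TTL and the function names are invented for illustration, not Apple's implementation:

```python
# Sketch of the caching idea: a revocation answer fetched once is reused
# until it expires, so a check does NOT go out on every single launch.
import time

CACHE_TTL = 12 * 60 * 60   # pretend answers stay valid for 12 hours (made-up number)
_cache = {}                # cert serial -> (fetched_at, status)

def query_ocsp_responder(serial):
    return "good"          # stand-in for the real network round trip

def check_revocation(serial):
    now = time.time()
    if serial in _cache and now - _cache[serial][0] < CACHE_TTL:
        return _cache[serial][1] + " (served from cache, no network)"
    status = query_ocsp_responder(serial)
    _cache[serial] = (now, status)
    return status + " (fetched over the network)"

for _ in range(3):
    print(check_revocation("de:ad:be:ef"))
```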
> Very smart of Apple to issue a formal statement. Hopefully, this will allay the over distraught minds of some users here, who made this out to be more than it really is.

Apple had to respond. It wasn't smart, it was defensive. I am neither distraught (hyperbole, no?) nor an Apple apologist. I'm concerned, which is a reasonable reaction.
The term "conspiracy theorist" is used to demean and belittle someone's point of view. It seems to me there are reasonable concerns. And, being concerned with privacy and hacking are not mutually exclusive.Whoa. Lots of conspiracy theorist today. Instead of being concerned about a company protecting your privacy be concerned about the people trying to hack your computer....
> At the end of the day, I trust absolutely no one. But using the Internet is already a compromise of one's privacy so you sort of just have to accept that. I think it's best if people have a balanced perspective and recognize that data is being collected by these companies and if you value privacy you need to do a lot of reading to understand what settings to turn on or off to minimize what data is transmitted.

My biggest gripe in these sorts of affairs is that neither MS nor Apple, for all their PR bluster (especially Apple) about privacy, provides the tools to push the "trust" model farther down to the user. Granted, MOST users wouldn't know what to do, how to run their own services that respond to these requests. However, I could. And most competent I.T. departments have the talent. I'm all for the measures that Apple has implemented… but ONLY in the sense that they should have also open sourced the code and engineered it such that the "user" (or org, or I.T. dept) has the power to also supersede the signing. That way, once I had my system "locked down", I could say "OK, macOS, now I want you to use my private key to resign the binaries; stop talking to Apple's servers and now talk to mine." If the system was properly engineered, this wouldn't be a problem, because the entire PKI infrastructure was fundamentally built around the concept of "web of trust". It is Apple that has taken a decentralized system and centralized it; bad engineering, contrary to the fundamental basis of the design intent.
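To make the "resign the binaries, talk to my server instead" idea a bit more concrete, here is a toy sketch of user-rooted signing and verification. Python's `cryptography` package is an arbitrary choice for illustration; this is a thought experiment, not how macOS code signing actually works:

```python
# Toy sketch of the "push trust down to the user" idea: sign an app's bytes
# with a key *you* control and verify locally, instead of asking a central server.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

org_key = Ed25519PrivateKey.generate()     # held by the user / org, never by the vendor
org_pub = org_key.public_key()

app_bytes = open("/bin/ls", "rb").read()   # stand-in for "an app binary"
signature = org_key.sign(app_bytes)        # "re-sign" under the org's own root of trust

try:
    org_pub.verify(signature, app_bytes)   # purely local check, nothing phones home
    print("trusted under the org key")
except InvalidSignature:
    print("binary changed or was not signed by us")
```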
> Whoa. Lots of conspiracy theorist today. Instead of being concerned about a company protecting your privacy be concerned about the people trying to hack your computer....

We can be concerned about both. Shouldn't a company focused on privacy (as a marketing tactic) try to protect your computer from hackers in the first place?
> The government?

There may not be ONE, but it isn't like there are a lot. Between Five-Eyes, Russia, and China, you've got a whole lotta folks covered… India, Pakistan, Iran, and various other totalitarian states (NorK) sweep up a whole lot more. Sure, not "one" government, but…
Anyone who thinks there’s a single planetary government (at this point in history) is delusional.
> All u guys that are still unsure about Apples privacy stance, one thing is for sure, the OTHER tech companies collect much more WAY more, if u are worried about this, then the amount of data Google, Microsoft and facebook collect is beyond ur belief.

I trust Huawei a heck of a lot more than a company that just remains silent.
> So because there are IPs which are geolocated more than 2 miles, I'm not allowed privacy? The fact is, it's possible to geolocate ALOT of people fairly closely with IP addresses alone.

Of course you are.
> No it won't. Apple doesn't issue certs used on the web, those issuers will have their own revocation lists. Absolute nonsense.

If you block trustd from updating cert lists, that'll affect both Developer ID certs and website certs.
Another one they are being sued for is collecting data in Incognito Mode. So no, the source of the problems with Google isn't the settings ... it's Google.
> "Guys it's cool, we're not actually doing that" Apple says, while they're secretly giving U.S. Military the traffic.

Documentation, please.
All I'm saying is that it's quite a stretch to go from "Apple opens an HTTP connection to transmit the hash of a developer cert" to "Apple tracks which apps people launch, and soon the NSA does, too."
Which is what the original post heavily suggests.
Just like with DNS, if I (or any org) don't want to "leak" personal data that can be used for fingerprinting, like app signatures, I could take control of that service. Same with being able to encrypt data stores rooted with my own external private key.
Then again, I'm pretty "radical" when it comes to this… I think Apple should be releasing their entire iCloud stack as an open source container, or at least be forced to license it economically. There is ZERO reason why Apple or Microsoft need to be sole collectors of all of this user data, merely because they wrote the operating systems. I simply cannot shake the "conspiracy theory" that this continues to be the "normal" case because this paradigm also very conveniently matches the capabilities of the various nation-states' intelligence systems. Which, at some level, raises the question: is Apple's "privacy" PR really nothing but a facade of misdirection? It all just naggingly reminds me a bit too much of the lessons of the original 'Tron' movie's 'MCP', and given Apple's PR history that should be a connection they fight against not march toward.
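On the "data stores rooted with my own external private key" point a couple of posts up, the idea is simply that the encryption key never leaves the user's hands. A minimal sketch, again just using Python's `cryptography` package for illustration rather than any vendor's actual API:

```python
# Toy sketch of "a data store rooted in my own key": encrypt locally with a
# key that only the user or org holds.
from cryptography.fernet import Fernet

user_key = Fernet.generate_key()   # generated and kept by the user, not the OS vendor
store = Fernet(user_key)

token = store.encrypt(b"contacts, notes, whatever the OS would like to sync")
print(store.decrypt(token))        # only the key holder can get the plaintext back
```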
> No, it won't. Browsers (Safari included) should be checking the CRL of the issuing provider. Apple is not the issuing provider. Ergo it should not affect website certs.

Right. I'm saying if you block the process rather than the host, it will.
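For anyone who wants to check the "Apple is not the issuing provider" point directly: a website certificate carries its own issuer's revocation endpoints (OCSP and CRL URLs) inside the certificate itself. A rough sketch that pulls a site's cert and prints them; example.com is just a stand-in host:

```python
# Sketch: fetch a site's certificate and print where *its issuing CA* says
# revocation should be checked (AIA and CRL distribution point extensions).
import socket
import ssl

from cryptography import x509
from cryptography.x509.oid import AuthorityInformationAccessOID

HOST = "example.com"   # any public HTTPS site

ctx = ssl.create_default_context()
with socket.create_connection((HOST, 443), timeout=5) as sock:
    with ctx.wrap_socket(sock, server_hostname=HOST) as tls:
        der = tls.getpeercert(binary_form=True)

cert = x509.load_der_x509_certificate(der)

aia = cert.extensions.get_extension_for_class(x509.AuthorityInformationAccess)
for desc in aia.value:
    kind = "OCSP" if desc.access_method == AuthorityInformationAccessOID.OCSP else "CA issuers"
    print(kind, desc.access_location.value)

try:
    crl = cert.extensions.get_extension_for_class(x509.CRLDistributionPoints)
    for point in crl.value:
        for name in point.full_name or []:
            print("CRL:", name.value)
except x509.ExtensionNotFound:
    print("no CRL distribution point listed")
```

Whatever those print, they are the issuing CA's endpoints, which is the distinction being argued over here.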