An Apple founder (Steve Jobs) once said:
“I believe people are smart. Some people want to share more than other people do. Ask them.”

Apple didn't! They must now adhere to the above. It should be the user's choice whether to install that part of the software, in order to win back Apple's standing as a bastion of safeguarding privacy and avoiding surveillance. Best not to include it at all; if people want to download that facility, that is their choice, rather than embedding it in the OS. Personally I think they should backtrack immediately.

Again, this has nothing to do with CSAM; that argument uses children as a pretext to allow SURVEILLANCE and remove PRIVACY, something Apple has built a platform on, even in court battles with other companies, let alone out-of-court statements about Facebook and others.

A real own goal scored by Apple, who need to reel it back in immediately.
 
Good lord! Tim Cook is "nailing it" on this privacy thing? OMG, he has no clue what it means, all the while he prances around about it.
 
I believe many people posting on this site are avoiding the central issue, as it's been deliberately couched in the pa@do category, when in fact the concern is not about that (even though I despise all who prey on children). I have 2 grandchildren I'd do anything to protect, but that also includes protecting their freedom.

This is not about CSAM; this is just the start of the slippery slope. This is about Apple engaging in surveillance, however you try to spin it.

This is how autocratic governments and dictatorships sell their actions to the masses: by first using an emotive subject, where I would imagine most people despise those who prey on children. But it's a ruse by these governments, as it's really about opening the door to a much more malign SURVEILLANCE, and that is what this action from Apple amounts to, however they like to try and sell it.

Surveillance is inextricably linked to privacy, something Apple has previously SOLD as a flagship policy.

You have to wonder whether Apple have been leaned on by a government or governments. Of course, the first move is to use an emotive cause where the subject matter would be hard to disagree with, i.e. child safety, but where in fact it's just to open a door. Whatever words we use, however we seek to justify these actions by Apple, it is still SURVEILLANCE, inextricably linked to PRIVACY.

Privacy, once taken, never returns. What is so serious is the hypocrisy of Apple, who have made a massive platform out of protecting us from the surveillance society and protecting privacy, and have now holed that below the waterline; their credibility is shot to pieces.

I've been Apple through and through, bought Apple devices from the Apple II right the way down the line, and spoken to many of the founders of Apple, so no one could accuse me of Apple bashing.

But this is a major mistake. I suspect it was Mr Cook's idea, and if not, it must have passed his oversight; he should really stick to marketing if that is the case. It was probably born out of genuine concern, as he was giving talks on protecting children way back.

However, the road to hell is paved with good intentions, and this 'good intention' is a green light for more of the surveillance society that Apple has refused to comply with in the past.

If that credibility goes, so do many of the customers. It's not about CSAM; it's about opening the gate and destroying Apple's credibility on privacy and surveillance.
Well written, sir. A much-needed piece. This should have been an article on this site, not a comment. You are spot on.
 
An Android ROM means an unlocked bootloader, which makes it very easy for malware to infect the device. It's the same with jailbreaking and Android root, though.

And like I said, an Android device is never kept updated as long as an iPhone.
There are Android ROMs that focus on security and do not keep the boot loader unlocked. GrapheneOS and CalyxOS look interesting.
Updates on old Androids are a shame. I wish there were some regulation enforcing updates for smartphones for at least 6 years, in order to reduce waste.
 
I tried to be a good customer: didn't jailbreak my phone, bought from the official App Store, bought media from iTunes, trusted Apple's closed-source software. But if this is how you want to play, fine by me. You want to force your authority on me? That's very, very fine by me.

In the meantime, guys, keep your options open and do not create your own monster. Donate to and start using open-source software for the benefit of all. CalyxOS is a nice option for smartphones; here is a YouTube review

If you are considering an OS switch, Linux is not as bad as you think; the ecosystem is thriving a lot more now. Here are some distros:

ElementaryOS * ZorinOS * Manjaro * Mint

If you are new to Linux and confused about what a distro is: it's basically a pre-configured Linux with a specific set of looks and apps. Pick the one you like and go with it; don't get overwhelmed by the huge selection.
 
Because history shows us that what appears to be a privacy-friendly approach rarely stays that way.
Likewise, even Apple's own blurb constantly refers to 'designed not to' etc., which means nothing. How many things 'designed not to' end up doing just what they were designed not to do?

Apple, you really have shot yourselves in both feet with this idea, however great the goal of safeguarding kids is.

Can you give an example of an Apple privacy feature that didn't stay private?
 
That’s a term Wikipedia redirects to its Private Set Intersection page. Apple claims to use PSI to securely transfer hash information between Apple and the user’s iPhone without either Apple or the user knowing the content of a hashed image unless the matching threshold is reached.
OK, so that's to ensure Apple only gains knowledge once the threshold is reached. That would indeed require some serious cryptography.
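That "serious cryptography" is a threshold scheme: per Apple's published description, the server can only decrypt match vouchers once enough of them exist. The threshold property itself can be illustrated with a toy Shamir secret-sharing scheme. This is purely an illustrative sketch, not Apple's implementation; the prime, the parameters t=5 and n=30, and the secret below are arbitrary.

```python
# Toy Shamir secret sharing: a secret is split into n shares such that
# any t shares reconstruct it, while fewer than t reveal nothing.
# Illustration of the threshold idea only, not Apple's implementation.
import random

P = 2**61 - 1  # a Mersenne prime; all arithmetic is modulo P

def make_shares(secret, t, n):
    """Split `secret` into n points on a random degree-(t-1) polynomial."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    def f(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the secret from t shares."""
    secret = 0
    for xi, yi in shares:
        num, den = 1, 1
        for xj, _ in shares:
            if xj != xi:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, P - 2, P)) % P  # den^-1 mod P
    return secret

shares = make_shares(secret=123456789, t=5, n=30)
assert reconstruct(shares[:5]) == 123456789    # any 5 shares suffice
assert reconstruct(shares[10:15]) == 123456789
```

In Apple's description, each positive match uploads something like one "share" of the account's decryption material inside a safety voucher; below the threshold, the server mathematically cannot decrypt anything.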
 
It’s happening on your phone, and it stays on your phone.
Here's where you are wrong. It's a snitch on your phone. It scans (hashes) your images and then decides what to do.
If it decides you have incriminating images in your possession, it "informs" Apple. Nothing stays on your phone then.

They access images, private images. Only thereafter may they apply super-sophisticated algorithms, but in the end what they are doing is accessing private data.
They do not even tell you when they believe they have found something.

It's a snitch on a phone, designed for mass surveillance, period. There is no question about that whatsoever, because the system HAS to be, otherwise it would not fulfill its alleged purpose.
What makes matters worse is that it can (the EFF and Snowden agree), and sure as hell will, be used for other "checks" rather sooner than later.

Apple even knows this, as they address it in their FAQ, albeit giving no satisfactory answers. That alone has to ring bells.
 
Marketing stunt at the cost of destroying user’s privacy.

Shame on you, Apple, greedy dishonest company.
I lost all my trust and interest in Apple.

They used "privacy" and publicity cases, such as refusing to unlock the phone for the FBI, as marketing stunts, and now they are using child safety as the new marketing-stunt model.
 
Here's the thing: do you believe it's just iCloud? If they have this scanning technology, they'll end up baking it into the core OS to scan everything on the phone. It doesn't matter if it's Photos, iMessage, or WhatsApp; if it's on the filesystem, the core OS will run the check on that file, and if the file is flagged, it'll get reported, iCloud or not.

If you believe it's just iCloud, you're a fool.

If Apple were doing this, or planned to do this, they wouldn't even have mentioned it in the first place. So no, I don't think they will do this at all.

If they did, then why announce all this, instead of just taking data without coming out and saying so, like other companies do?
 
Apple *is not* scanning your photos.
That’s not how it works.
There’s a one-in-a-trillion chance that anyone from Apple will ever see any of your images.
They will 100% and repeatedly scan your photos ON YOUR DEVICE by software.

Even if you disable iCloud Photos, the search still carries on every day on your device.

They claim that matching cases will only be reported if they are uploaded to the server, and then human review takes place.

But that's not the concern; the fact that your photos are being scanned and policed on YOUR DEVICE, all the time, without your permission, is.

This is a 100% breach of privacy.

Think: this is the equivalent of being strip-searched every day by the police.
 
Are most commenters trolls or are they just reading headlines and jumping to conclusions!? Read this first.

Apple is going to start checking photos stored in iCloud on Apple servers for CSAM. Ok, fine. You don't have to use iCloud Photos if you feel that's some great slippery slope.

"The system does not work for users who have iCloud Photos disabled. This feature does not work on your private iPhone photo library on the device."

iOS 15, if you enable it, will be able to recognize explicit content in Messages and automatically blur it, forcing a user to actively accept before seeing that material. It can also notify parents if children 12 and under accept viewing the material. Nothing gets reported to Apple.

"This doesn’t change the privacy assurances of Messages, and Apple never gains access to communications as a result of this feature. For accounts of children age 12 and under, parents can set up parental notifications which will be sent if the child confirms and sends or views an image that has been determined to be sexually explicit. None of the communications, image evaluation, interventions, or notifications are available to Apple."
 
They will 100% and repeatedly scan your photos ON YOUR DEVICE by software.
...
Think: this is the equivalent of being strip-searched every day by the police.
What?

For 10 years, Spotlight has been constantly indexing your Mac's file system, looking in files and through metadata so you can find files quickly. The current Photos app on your phone scans photos to recognize faces and objects to help you find photos of your dog faster. None of this is new, and no one had a problem with it before. Now, if you let it, iOS 15 will also recognize CSAM (not using AI, but an actual match with known CSAM), and Apple will be informed if you upload it to iCloud.
 
This news makes me so happy to live in Europe, where Apple will not be able to implement this fiasco for the time being. Apple itself has announced it will roll out this feature in the US only.

This is a "lick where you once spat" move from Apple. After more than a year of bashing third-party apps for their lack of privacy, Apple is pulling this rabbit out of the hat with its own privacy-violating move. And it is a violation of privacy, no matter the jargon they hide behind or however honorable the end goal is.
Privacy should be categorical: you either have it or you don't. If you can't sustain a "no" across the board, no matter what (to third-party vendors, to the FBI, to governments), then you only get half-baked privacy.

This move means sacrificing the privacy of 99.99% of the population to potentially identify the 0.01%. Even so, I truly wonder who would be stupid enough to keep such material on iCloud instead of, say, an external drive, particularly after being alerted so far in advance by these Apple headlines. So that 0.01% plummets even further.
All in all, the costs are really stacked hard against the benefits.
 
I guess that's my question :)
If you're going to hit them where it hurts, by not buying the next iPhone for instance, then what should you get instead that has better privacy?

I'm not arguing that there's a "better" alternative. You either get privacy, or you get a new shiny while you pay for them to take your privacy. What are the other flagship options today? Samsung? Are they even allowing custom OS installs on their flagships? They track just as much, if not more.

Before I sold out to Apple, I compiled my own version of LineageOS and didn't install Google Play packages on a few phones. Then I ran a firewall to monitor traffic. This didn't last long, as the open-source apps didn't exactly inspire confidence in getting critical alerts for work. The tediousness of monitoring a cell phone like a server is very annoying. I never went through all the code to find every possible way the system was talking to sources I did not allow and remove that code, though I always wanted to try. Do I want to go back to doing this? NO. There's a reason I switched to Apple. It just sucks that they're making me reconsider it.
 
Of course it is scanning. How do you think the “digital fingerprint” of a photo is computed? Answer: by scanning the content of the photo, running it through an algorithm, and representing the ”essence” of the photo as a fairly large number (a hash).

There are many different ways of computing a hash code for an array of bits. For example, I operate a system containing several hundred million photos, and each picture has a simple 64-bit hash code so that I can figure out if any given one is unique. The algorithm I use is a simple one and does not evaluate the ”essence” of a photo, so it would be no good at matching manipulated photos, but it suits my purposes. Apple’s algorithm will be substantially more involved.

The human brain is quite good at evaluating the essence of a photo. Take a picture of Albert Einstein, remove all colour, chop a bit off the top, make it smaller and rotate it a bit. You can still see that it’s a picture of Einstein. In a sense, you have a hash representation of Albert Einstein already in your brain, and the brain is good at applying pattern matching to any photo it sees. This is the kind of hashing that Apple will be doing.
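That kind of "essence"-preserving fingerprint is called a perceptual hash. Apple's NeuralHash is far more sophisticated, but the flavour can be shown with a classic average hash: reduce the image to a tiny grayscale grid, then set one bit per cell depending on whether it is brighter than the mean. A toy sketch on a synthetic 8x8 grid follows (no image library involved, and emphatically not Apple's algorithm):

```python
# Toy "average hash": one bit per 8x8 grid cell, set when the cell is
# brighter than the image mean. Similar images yield hashes that
# differ in only a few bits, unlike a cryptographic hash.
def average_hash(grid):
    """grid: 8x8 list of grayscale values (0-255) -> 64-bit int."""
    pixels = [p for row in grid for p in row]
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a, b):
    """Number of differing bits; a small distance means 'similar images'."""
    return bin(a ^ b).count("1")

# A synthetic gradient "image" and a slightly brightened copy:
original = [[(r * 8 + c) * 4 for c in range(8)] for r in range(8)]
brighter = [[min(255, p + 10) for p in row] for row in original]
print(hamming(average_hash(original), average_hash(brighter)))  # near zero
```

The brightened copy lands at (almost) the same hash, which is exactly the "essence" matching described above; a real perceptual hash adds downscaling, frequency transforms, or a neural network in front of the thresholding step.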

Leaving aside the rights and wrongs of doing that, they’ve slipped up on the implementation by storing known “bad” hashes on the device. It’s not sustainable because that list will get larger with every iOS update. Ultimately, the hash list will be larger than iOS itself. I’m at a loss to explain why they don’t just pass the computed hash to a web service to get the yay/nay from Apple.
I hadn’t thought of the *process* of obtaining the hash, only of the *outcome*. So you’re right, that part is a “scan”. But it is still not Apple doing it, it is the phone, and if the phone passes along only that hash to Apple, then the hash itself is useless.

All of this should be implicitly prefaced by “if it works as Apple describes”. If one believes, however, that Apple is hiding something or lying about how it works, then none of this matters anyway, because they’ve already got the actual photos on their server. They wouldn’t need to do this if their intent all along were to examine your photos.

I wondered about what happens as the list grows as well. I don’t worry about it overwhelming the phone’s memory: it’s text. Perhaps I’m vastly underestimating the size of the text string.
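A quick back-of-envelope check suggests the on-device list stays small relative to iOS itself. Both figures below are illustrative assumptions, not Apple's numbers:

```python
# Back-of-envelope size of an on-device hash list. Both constants are
# hypothetical assumptions for illustration, not figures from Apple.
HASH_BYTES = 32          # e.g. one 256-bit digest per known image
NUM_HASHES = 5_000_000   # hypothetical database size
size_mb = HASH_BYTES * NUM_HASHES / 1_000_000
print(f"{size_mb:.0f} MB")
```

Under those assumptions the list is on the order of 160 MB, a small fraction of a multi-gigabyte iOS image, so even substantial growth would not make it "larger than iOS itself".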

But the list has to be able to grow, right? So there has to be some mechanism to get those updates to the phone. What is that mechanism? How abstract is it? I would expect that Apple doesn’t have the phone query the CSAM database directly without Apple as a go-between, so it can be a gatekeeper. I would like some more specifics about that process.

If changing that list requires an actual iOS update, then it’s static in between, and I expect (but don’t know) that some watchdog group or individual COULD extract the list from the phone and compare it to the CSAM database, since the phone-embedded list would be a reasonably stable target.

EDIT:

It occurs to me that the phone is already scanning the photos as you describe so you can do a search on, say, “dog” or “sunset” and it can generate results. That scanning is also done on-device.

It used to be that face identifications were not synced between the iPhone and your Mac (via iCloud, implied). That used to be an inconvenience, but I understood it: iCloud isn’t end-to-end encrypted (I believe?), so having ID data in the cloud was a weak point. I think they have solved this somehow.
 
This has already been tried at least once and failed.

Then we also have https://ubuntu-touch.io (which is more or less open, I guess), but it has not gained any traction either.
You're right that there have been a few attempts -- the PinePhone being one more recent effort. They haven't gained much traction yet, but as Google and Apple continue to skirt privacy, and the more they become -- like many companies in Silicon Valley these days -- aligned with our intelligence organizations, this technology becomes more appealing.

Remember, there were many attempts made at smartphones before the iPhone came along. I still have my Palm Treo in a drawer somewhere...
 
It's right from the horse's mouth at https://www.apple.com/child-safety/
I don't know what else to think. They say exactly what they're doing, and what they're doing is rotten. You can keep faith, whatever that means (I guess keeping your Apple devices or maybe stock), but I'm done regarding them as the privacy-focused alternative unless they want to undo this and apologize.
I disagree that what they’re doing is rotten, if your source for making that determination is what they’ve described. There is the potential for rottenness, if things turn out different from what was described. But if all I’m going on is what they’ve said publicly, then I’m satisfied they’ve done this in as privacy-preserving a way as possible.
 
Thing is, though, for anybody who leaves for Android the risk is still there, as it is on any other device that uses Google services within the OS.
Android is open source; there are many different versions you can flash, and Android is modified by each manufacturer. I also don't see Google being able to put this kind of software into the Android operating system, due to it being open source. Besides, Android already performs this type of scan server-side on the files you store in their cloud, which is less invasive than what Apple is doing.
 
They will 100% and repeatedly scan your photos ON YOUR DEVICE by software.

Even if you disable iCloud Photos, the search still carries on every day on your device.

They claim that matching cases will only be reported if they are uploaded to the server, and then human review takes place.

But that's not the concern; the fact that your photos are being scanned and policed on YOUR DEVICE, all the time, without your permission, is.

This is a 100% breach of privacy.

Think: this is the equivalent of being strip-searched every day by the police.

The photos are scanned as a prerequisite to being uploaded to iCloud. So everything you just typed is an outright falsehood.

The closest analogy I can think of is being asked to show your ID before you enter a nightclub. Hardly an unreasonable request.
 
The photos are scanned as a prerequisite to being uploaded to iCloud. So everything you just typed is an outright falsehood.

The closest analogy I can think of is being asked to show your ID before you enter a nightclub. Hardly an unreasonable request.
I assume that even with iCloud Photos off, the pictures are hashed on your local device and just not compared to the list until you upload to iCloud. But that could easily change with a simple software update, since the software is on your device, not on the iCloud server.
 