You don't have to trust anyone: log out of iCloud and block all outgoing connections to Apple (on your router if possible) and you will be fine.
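If you go that route, you can sanity-check that the block is actually holding from the Mac itself. A minimal Python sketch (the hostnames are just examples, not a complete list of what a Mac phones home to):

```python
import socket

# Example hostnames only -- check your router logs or Little Snitch for the
# full set of Apple hosts your machine actually contacts.
APPLE_HOSTS = ["gateway.icloud.com", "ocsp.apple.com", "api.apple-cloudkit.com"]

def is_reachable(host: str, port: int = 443, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

for host in APPLE_HOSTS:
    verdict = "still reachable" if is_reachable(host) else "blocked or unreachable"
    print(f"{host}: {verdict}")
```

If every host comes back "blocked or unreachable" after you set up the router rules, the block is doing its job.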
 
Well, I don't trust Apple. But I don't want to use Windows, so I stopped using iCloud and decided to maintain some kind of trust that my Mac is not spying on every file I have on it. I really try not to be too paranoid, but you never know what's going on in the background. I also started using open-source software more because it's more transparent. Once the trust is broken, it'll never be the same.
 
I thought this was just iOS devices… I’m guessing if I don’t update to macOS 12 and iOS 15, then my Mac and iDevices will be fine?
Well, that's the thing. The NeuralHash code has already been found in iOS 14.3, supposedly just not activated by Apple, but that might change in iOS 14.7.2, and there's no way for the end user to know.
 
You don't have to trust anyone: log out of iCloud and block all outgoing connections to Apple (on your router if possible) and you will be fine.

Regarding iCloud: we have to trust Apple's word that NeuralHash is only activated when logged into iCloud, but since it happens on-device, it may be running at all times. Even if it's not, since the system is already in place, it could probably be activated by a simple software update.
 
Film Camera. Develop and print your own.
Problem fixed...

The person I feel sorry for is the designer of dolls, you know, the plastic figures that go into the molds before they are dressed.
And the teacher who has a collection of life-like baby robots that are used to teach teenagers what it is really like to look after babies.
And the parents and grandparents who put scans of 'baby in bath' photos from the '50s and '60s up on Facebook.
And the paediatrician writing a book on baby diseases, including nappy rash and all its causes.
And the lawyer writing a book on FGM* with photos.

AI cannot, and never will, be 'Just Right'. It will either be too sensitive and bring up false positives, or not sensitive enough and miss illegal stuff.

It's the false positives that the AI will flag that worry me. Because once you are flagged, that's it.


* Look it up...
 
Regarding iCloud: we have to trust Apple's word that NeuralHash is only activated when logged into iCloud, but since it happens on-device, it may be running at all times. Even if it's not, since the system is already in place, it could probably be activated by a simple software update.

On macOS it should be much easier to identify whether the process is running or not. I also wonder if disabling SIP would allow people to forcefully remove it, and what that would affect.
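For what it's worth, polling for a suspect daemon by name is straightforward. A rough Python sketch (the process name here is purely my guess for illustration; Apple hasn't documented what, if anything, would run NeuralHash on macOS):

```python
import subprocess

# Illustrative guess at a daemon name -- not confirmed to be the NeuralHash process.
PROCESS_NAME = "mediaanalysisd"

# pgrep -l prints "PID name" for each match; exit status 1 means no match found.
result = subprocess.run(["pgrep", "-l", PROCESS_NAME],
                        capture_output=True, text=True)

if result.returncode == 0:
    print(f"Matching processes:\n{result.stdout}")
else:
    print(f"No process matching '{PROCESS_NAME}' is running.")
```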
 
Well, I don't trust Apple. But I don't want to use Windows, so I stopped using iCloud and decided to maintain some kind of trust that my Mac is not spying on every file I have on it. I really try not to be too paranoid, but you never know what's going on in the background. I also started using open-source software more because it's more transparent. Once the trust is broken, it'll never be the same.

That's why I decided back in the '00s to use Little Snitch on most Macs I have bought since!
 
Well, that's the thing. The NeuralHash code has already been found in iOS 14.3, supposedly just not activated by Apple, but that might change in iOS 14.7.2, and there's no way for the end user to know.
This is mighty scary. I did not know this at all… Hmm. I have kids myself, and anything sexual to do with kids is abhorrent; that goes without saying.

I’m not into anything dodgy at all, but one has to ask: if they can scan for certain types of pictures, obviously as a first step (and this is a very emotive subject), then they can scan for anything. This ventures into extremely dodgy territory… Coming from a supposedly privacy-centric company, this is really low. I’m vehemently opposed to this.

Film Camera. Develop and print your own.
Problem fixed...

The person I feel sorry for is the designer of dolls, you know, the plastic figures that go into the molds before they are dressed.
And the teacher who has a collection of life-like baby robots that are used to teach teenagers what it is really like to look after babies.
And the parents and grandparents who put scans of 'baby in bath' photos from the '50s and '60s up on Facebook.
And the paediatrician writing a book on baby diseases, including nappy rash and all its causes.
And the lawyer writing a book on FGM* with photos.

AI cannot, and never will, be 'Just Right'. It will either be too sensitive and bring up false positives, or not sensitive enough and miss illegal stuff.

It's the false positives that the AI will flag that worry me. Because once you are flagged, that's it.


* Look it up...
These are all very good points. When you say, “once you are flagged, that’s it”, what do you mean? Is it a visit from the Law or something? Please excuse my ignorance. What a terrible journey Apple are about to embark on…

No. I won't be installing Monterey again or buying a new M-whatever Mac again for now.
No, nor will I. I have an MBP from 2012 that got its last update with Catalina, and my Mac mini is compatible, but I certainly won’t be upgrading.

That's why I decided back in the '00s to use Little Snitch on most Macs I have bought since!
Agreed, I’ve been using this for years on my Macs, since about 2008-ish I reckon. However, even this has been hobbled recently, I believe. Something to do with it no longer having access to certain parts of the OS? Sorry, I used to be more technical with the OSes, but my days are mostly spent in the Cisco world these days.
 
Film Camera. Develop and print your own.
My worries are not with CSAM. It goes without saying that 99.99% of people and I can agree that the spreading of CSAM is wrong and must be fought within the rules of the law, while still protecting the privacy of the innocent!
The person I feel sorry for is the designer of dolls, you know, the plastic figures that go into the molds before they are dressed.
No, I do actually believe Apple's algorithm is smart enough to distinguish between those sorts of things and actual CSAM. The human verification step they've added furthermore eradicates any room for this sort of error. I do firmly believe that.

My worry is the backdoor this is creating. With the NeuralHash technology, I really believe Apple has created a monster. Now it's CSAM, next year it's 'Potential for Terrorism'.

If I look at my own country, the Netherlands: shortly after completely abolishing the referendum, our government implemented a law called the 'Sleepwet'. This makes it completely legal for our government to track our location and keep track of all our online activity. Of course they say this is for predicting terrorism early, but how does an AI or a government even distinguish between potential for terrorism and just thinking differently?

Not too long ago, more than 50% of people were against the COVID measures. Today there are still groups of people against vaccination. In a liberal society those ways of thinking should be able to coexist next to each other, but today this is actually no longer happening. Different ways of thinking are already being blocked from social media.
Why is this? Are governments trying to prevent the spread of misinformation, or do they assume thinking differently is a precursor to terrorism?

With what Apple has created in NeuralHash, where does it stop? We are already under online surveillance all the time, but now the surveillance can potentially extend to on-device! Meaning we now have ZERO privacy, and de facto ZERO ownership of our devices and data!

I think that's very scary! Not because of CSAM scanning, but because of NeuralHash and its potential!
 
Trust, not really, but I have no choice since I need it for work and the alternatives aren't any better. Windows is probably worse, and Linux is a server OS.

The development is worrisome to say the least; the tech is there, so it will be abused, 100%.
To be fair, systems haven't been safe for a very long time, with plenty of backdoors to get at all of your data. However, the average end user wouldn't be much of a target, I'd imagine.

If you make a successful operating system or computer chip, sooner or later the USA comes knocking and demands a backdoor. I believe Japan tried to make an OS back in the eighties and it was shut down by the USA. It's extremely unhealthy that the whole world relies on operating systems from Apple, Microsoft and Google, which are all US companies. I'm surprised the rest of the world, especially companies with trade secrets, is OK with this. However, that's another discussion.

And if you ask me, this isn't about terrorism or child abuse; this is about making sure that the people who control the world stay in control. And no revolution will ever emerge unless it's beneficial to those in control. /tinfoil
 
With Apple's CSAM software and Apple bringing on-device scanning to the Mac with macOS Monterey, will you still trust your Mac?
Can't "still trust" something you don't have, but we have permanently canceled plans to acquire Mac desktops for my wife and I. Instead I'll build us a pair of Linux desktops. I'd hoped to avoid having to do that as I simply don't care to do it any more. But it is what it is.
 
I have zero reason to mistrust Apple so far. They have been fairly transparent about how they do things, and I consider their image as an "evil company" to be a product of baseless paranoia. I don’t remember them being involved in anything I’d consider dubious practices, unlike companies such as Microsoft, Facebook and Nvidia.

That said, I strongly disagree with their plans to introduce CSAM technology, and I think they would be making a big mistake if they went forward with it. Put it this way: I trust Apple, but I distrust this specific technology. In fact, I wrote a lengthy email on this very topic to Tim Cook the same day Apple announced their plans, and I was one of the people who signed various petitions.

At any rate, I am happy that Apple has listened to the concerns and delayed their plans to roll out the technology. I hope they will find a better solution for these issues.
 
Personally I’ve divided up the things I “trust” Apple with.

Anything I don’t want leaked I’ll back up manually and keep separate from my Apple devices.

Other stuff that I have, yeah I think it’s fine to keep on my devices. They can scan my photos for cars all they want.

And now that I’ve really thought about it, for any potential “illegal” activity that I may have committed, going through my phone is the worst attack vector. I’m not dumb enough to take photos of illegal **** and I say dumb stuff in public all the time. The only potential thing of interest that I have on iCloud is a bunch of dumb memes and gunsmithing instructions.

Honestly, there are easier ways to spy on me than scanning my photos. And anything potentially embarrassing I’ll airgap from now on.
 
Film Camera. Develop and print your own.
Problem fixed...

The person I feel sorry for is the designer of dolls, you know, the plastic figures that go into the molds before they are dressed.
And the teacher who has a collection of life-like baby robots that are used to teach teenagers what it is really like to look after babies.
And the parents and grandparents who put scans of 'baby in bath' photos from the '50s and '60s up on Facebook.
And the paediatrician writing a book on baby diseases, including nappy rash and all its causes.
And the lawyer writing a book on FGM* with photos.

AI cannot, and never will, be 'Just Right'. It will either be too sensitive and bring up false positives, or not sensitive enough and miss illegal stuff.

It's the false positives that the AI will flag that worry me. Because once you are flagged, that's it.


* Look it up...
This is somewhat of a false scenario, as they have to be matches of known child-abuse photographs in an existing database.
Your kid in the bath isn’t a child-abuse photo, nor is a plastic doll, and the computer isn’t even looking at the photo, but at the hash.
The false positives you hear mentioned aren’t due to a photo that could be construed as an abuse photo, but rather to a random hash that happens to match.
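To put that in code: a deliberately oversimplified sketch in Python, using SHA-256 as a stand-in for NeuralHash (the real thing is a perceptual hash designed to survive resizing and re-encoding), but the matching logic is the point:

```python
import hashlib

# Placeholder database -- in reality these are blinded hashes of known CSAM,
# and the hash function is perceptual (NeuralHash), not cryptographic.
KNOWN_HASHES = {
    hashlib.sha256(b"placeholder bytes standing in for a known image").hexdigest(),
}

def image_hash(path: str) -> str:
    # The file's bytes are hashed; the content is never "looked at" as a picture.
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

def is_match(path: str) -> bool:
    # Only a hash already present in the known set can ever match.
    return image_hash(path) in KNOWN_HASHES
```

Your kid-in-the-bath photo hashes to a value that simply isn’t in the database, so it can never match; only collisions with entries from the known set can trigger a flag.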
 
Film Camera. Develop and print your own.
Problem fixed...

The person I feel sorry for is the designer of dolls, you know, the plastic figures that go into the molds before they are dressed.
And the teacher who has a collection of life-like baby robots that are used to teach teenagers what it is really like to look after babies.
And the parents and grandparents who put scans of 'baby in bath' photos from the '50s and '60s up on Facebook.
And the paediatrician writing a book on baby diseases, including nappy rash and all its causes.
And the lawyer writing a book on FGM* with photos.

AI cannot, and never will, be 'Just Right'. It will either be too sensitive and bring up false positives, or not sensitive enough and miss illegal stuff.

It's the false positives that the AI will flag that worry me. Because once you are flagged, that's it.


* Look it up...
You do not understand the CSAM hash database. This is not AI-based.
 
I'll say it: I have no CSAM material, or even any "adult" material at all, yet I find on-device scanning EXTREMELY troubling -- to the point of not using any device that has it incorporated. No, they won't catch me at anything with it as it currently is, but next version, we'll see; governments can be quite demanding if they know a company can do something.
 
Now that Apple has decided to "reevaluate" their CSAM approach, I think the best course of action is to wait and see how they address the criticism. However, as a Mac user, I've recently been looking into Linux distros and potentially building a PC, in case I decide Apple's reach into my computer goes too far. I'd prefer not to leave the platform, as I've been pleased with it since I switched from Windows in 2005, but I don't approve of any on-device scanning that isn't specifically related to the function of the machine.

I don't use iCloud Photos, I don't even keep personal photos in general, but that's not my concern. I don't want anyone snooping around inside my computer, particularly on behalf of the government. I understand that anything that goes out into the cloud is beyond my control, but that's not acceptable for my personal physical property. There will likely be ways to disable such scanning on the Mac, but it won't be sanctioned by Apple, and that's still unacceptable.

Also, I generally dislike it when the government gets involved in regulating the tech industry. Even though the recent legislation that's slowly making its way through the U.S. Congress probably doesn't impact the Mac, which is the only Apple product I care about, I had considered writing to my Representative and Senators to oppose that legislation. After this debacle over privacy, if world governments go after them, then Apple deserves it.
 
I still trust my Macs...but all are disconnected from the Internet, and--even if that weren't the case--I don't see Apple releasing any updates for System 7.5 or OS 9.1. LOL
 
I currently use Linux. At times, I've considered going back to a Mac. Recently, those M1 Macs have been tempting. But I'll be sticking with Linux. The only way I'd get a new Mac now is if I had some need that Linux couldn't meet--but my primary personal computer would remain on Linux.
 
Anyway, I happened upon this guy earlier. Very interesting. Those devices not connected to the internet may still need to worry due to the mesh networks being created.

 
Also, I generally dislike it when the government gets involved in regulating the tech industry.

Frankly, the government has to regulate the tech industry, but this must come in the form of legislation that protects customer privacy and personal data, not violates them. Otherwise you get the Facebook and Cambridge Analytica phenomenon, where personal data is used as a weapon of manipulation. Unfortunately, politicians worldwide are utterly incompetent when it comes to tech.
 