What if they don't own a Mac/PC for backups? 🤦‍♂️😂
Love your reply. Smile.

I own both Windows and Apple notebooks, and both Android and Apple smartphones. All are connected to my network, and to the outside world via a router.

I have a choice. I've just made a policy shift.

My next step is to isolate all Apple gear in a separate network behind one router, only allowed HTTPS to the outside world. Access to all known outside Apple IPs will be blocked inside this "Apple isolation ward". Internally, Apple gear will only be able to use my NAS for DHCP and DNS, plus access ONLY via RDP to the Windows desktop that is becoming my new main machine. After a transition period, this will no longer be necessary.

I have an extra, safe router that is coming online one of these days, and that router will enforce the global rules for all systems, but Apple gear is no longer allowed to access my network directly. The router/firewall responsible for controlling the Apple environment now uses a shared SSD, and this will in essence be the data-exchange platform for files to be used on my coming main system and the Apple "walled garden". Most content will be read-only, and one shared directory will be "write only" from the Apple side. After a transition period, this SSD will lose its purpose and be moved to other uses.

If Apple plans on opening up the walled garden, that's fine with me. Apple stays within MY "Apple isolation ward" and is not allowed outside (like the insane, Covid-19 deniers and other carriers of pestilence ;-)

No need to believe that Tim Apple has the muscle - or even the will (Apple-income-wise) - to stand up to any tin-pot dictator or ruffian in power in the majority of the more than two hundred countries of this world. And if he can't, nobody is safe in the long run. CSAM scanning is the IT equivalent of Covid-19 before it spread and before any vaccine. Believing that CSAM scanning or Covid will not "expand" and not "mutate" into a more dangerous monster is worse than naive.

I just assume that he won't. He certainly can't, when push comes to shove. What's the difference for me?

I'll just have to spend a few hours over the next couple of days or weeks getting things done my way, but I'm not forced to invest any extra money. I might even learn something new while fortifying my systems against new dangers.

When Apple gear gradually gets decommissioned - due to old age, non-support or one too many new Apple bugs ("Apple Alzheimer's" syndrome is rife on M1 gear with Big Sur - in my native language "Sur", spelled exactly the same, literally means "grumpy" or worse ;-) - the "Darwin Rewards" are simply removed from the "Apple isolation ward" and sent to the eternal bit-wrangler pastures in a casket.

Properly decorated after disinfection by blow-torch ;-)

By the way: Thanks to Tim Apple for the "heads up"!
 
The interesting thing to wonder is whether it is only detecting known CSAM pictures (already seen and catalogued in said database) or whether it is scanning the content of pictures, determining whether nudity/infantile attributes exist, and raising alarm bells…

One would be a privacy-protecting hash-comparison system that detects photos already known to the authorities. The other would be the screening of individual photos and detection of features deemed to be CSAM (including original pictures), which would be a less effective but more grave violation of the privacy values espoused by Apple.

I still haven’t seen a clear answer on this, only ever that pictures currently known and catalogued are detected, but nothing about original content that also contains CSAM.
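
For what it's worth, Apple's published material describes the first kind: perceptual hashes of uploaded photos are matched against a database of hashes of already-known images. A rough sketch of the distinction, with hypothetical names (the classifier variant is illustrative only, not what Apple describes):

```python
# Sketch of the two approaches being distinguished above. Names are
# hypothetical; Apple's published design is the first kind (matching a
# perceptual hash against a database of already-known images).
from typing import Set

def is_known_image(image_hash: bytes, known_hashes: Set[bytes]) -> bool:
    """Approach 1: flag only images whose hash already appears in the database."""
    return image_hash in known_hashes

def looks_like_csam(image_pixels) -> bool:
    """Approach 2 (hypothetical): classify the pixel content itself, which
    could also flag never-before-seen originals."""
    raise NotImplementedError("content classifier - not what Apple describes")
```
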
Well, if you ask me, Apple would not spend time and resources on activities that don't gain them any benefit. IMHO, this on-device CSAM hashing is to pave the way for them to go forward with E2EE for iCloud Photos. Keep in mind that all photos uploaded to iCloud Photos are already being scanned for CSAM material today. So the on-device hash is another implementation of the same thing they are doing today.

It doesn't provide Apple any benefit that I can see to scan user device contents, as they are not in the business of selling user data. They may do it to improve the user experience, but that currently stays within devices.

With this CSAM implementation I don't see this changing.
 
The whole point of this isn't to find and prosecute child abusers; it is to get a system in place for monitoring everything we own digitally. By saying they are going after paedophiles they have chosen a subject that is very hard to argue against - witness the posts earlier saying you must be a paedo if you are against it. So the public will think it is all a good idea, until their door gets kicked in over a meme about transgender people or whatever their government takes offence at.
You never know quite how slippery the slope is until you are on it and it is too late, and if you give power to anyone they will ALWAYS abuse it.
Sorry, but this is tin-foil-hat-level paranoia and why I can't take you naysayers seriously. You've turned Apple into Big Brother in your imagination and are trying to pass it off as truth. You're doing nothing to help your case and everything to make yourself seem like a lunatic.
 
As per Apple's technical whitepaper:
But semantic similarity means a similarity of content, e.g. whether both images show similar people doing similar things.
See my post.


Subtle manipulations are sufficient to trick a neural network. For example, a classifier can label a stop sign, with a high degree of certainty, as a 120 km/h speed-limit sign because barely perceptible noise has been added to the sign.

Why should this not happen with Apple's solution?
It could thus happen that private and sensitive information ends up at Apple because of a false detection - a business secret, for example.
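
To make the worry concrete in a toy setting: even a simple perceptual "average hash" can have bits flipped by a targeted, barely visible change. This is not Apple's NeuralHash - real attacks on neural hashes craft the perturbation with gradient-based optimisation - but the principle is the same. The sketch assumes Pillow and NumPy, and "photo.jpg" is just a placeholder path:

```python
# Toy illustration - NOT Apple's NeuralHash. A targeted, barely visible change
# to one region of a photo flips bits of a simple perceptual "average hash".
from PIL import Image
import numpy as np

SIZE = 8  # the hash has SIZE * SIZE bits

def average_hash(img: Image.Image) -> np.ndarray:
    """One bit per downscaled pixel: set when the pixel is brighter than the mean."""
    small = np.asarray(img.convert("L").resize((SIZE, SIZE)), dtype=np.float32)
    return (small > small.mean()).flatten()

original = Image.open("photo.jpg").convert("RGB")
h1 = average_hash(original)

# Pick the downscaled cell closest to the mean brightness, then nudge the
# corresponding block of the full-resolution image by a few grey levels so the
# cell crosses the threshold - invisible to a human, but hash bits flip.
small = np.asarray(original.convert("L").resize((SIZE, SIZE)), dtype=np.float32)
row, col = divmod(int(np.abs(small - small.mean()).argmin()), SIZE)

pixels = np.array(original, dtype=np.int16)  # writeable copy
block_h, block_w = pixels.shape[0] // SIZE, pixels.shape[1] // SIZE
nudge = 4 if small[row, col] <= small.mean() else -4
pixels[row * block_h:(row + 1) * block_h, col * block_w:(col + 1) * block_w] += nudge
perturbed = Image.fromarray(np.clip(pixels, 0, 255).astype(np.uint8))

print("hash bits flipped by the change:", int((h1 != average_hash(perturbed)).sum()))
```

The reverse direction - crafting an innocuous image whose hash collides with a blacklisted one - is exactly the false-detection scenario described above.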
 
Constitution of United States of America 1789 (rev. 1992)

Amendment IV

The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no Warrants shall issue, but upon probable cause, supported by Oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized.
 
Yep, how long until there is a UK court order to add copyrighted content to the database?
Actually, if there is already a law in the UK requiring cloud providers to scan for such content, it is already being done. If there's no law requiring this, no UK court can demand it.
 
Or how about just not storing stuff on iCloud? Every month I just transfer pictures from my phone to my computer.

I imagine the Mac will be subject to scanning too. If mission creep has any meaning, Apple will sooner rather than later be pressed to scan photos that aren't loaded to iCloud.

I will note from personal experience that in past years' iOS updates iCloud Photos was mysteriously activated. Even with PB5, I noticed that long-deselected iCloud alias email addresses had been switched on on each of my devices.

I have nothing that an Apple scan will find, but what concerns me greatly is how this tech will be abused and expanded.

Apple crossed its own red line.

No assurances they give anymore can have any credibility.
 
The door already opened in 2015 when the technological capability came about with the launch of iCloud Photo Library; just because Apple haven't actively used that technology up to now doesn't have any bearing on the potential for a government to have compelled them to use it at any time. If a government says to Apple "we know you have access to people's photos and the technology to scan them for pictures of XYZ", Apple can't just turn round and say "we could, but we don't yet offer that feature, sorry".
You clearly missed what my entire point was. When this is implemented - as in, IT IS ACTUALLY IN USE - then it cannot be undone. Stop trying to simplify it. Try to at least understand what someone is saying before you reply with a retort.
 
I think the feature definitely has slippery-slope problems. But it shouldn't be looked at in a societal vacuum. Laws already seem to be coming in the US, UK, and EU that take a sledgehammer to encryption. Apple may think this feature will work around said laws, but that remains to be seen.

One other note. Ben Thompson said the feature shifts the balance from capability to policy. I disagree, since it's always been a policy decision. If the government wants to scan everything, they can pass laws forcing Apple to do exactly that. See iCloud in China.
 
I think we all can agree that child porn is unspeakably evil. At least all sane people.

But I also wonder if this doesn't open up some unintended consequences which we don't foresee right now. There have been many instances in the past where something created to do good turned out not to be used for its intended purpose.

And I hope their AI is better than Facebook's. I once took a picture of a street sign in my town called "Hemp Rd." Facebook flagged it as "a post that encouraged drug use." Seriously? I took a picture of a real sign, and that's what their automated flagging system recognized it as and removed it from my feed. Never mind that hemp isn't a drug.

I hope Apple thinks long and hard about this and future consequences.
 
Apple says that their CSAM solution uses an AI perceptual hash called a NeuralHash. They include a technical paper and some technical reviews that claim that the software works as advertised. However, I have some serious concerns here:
  1. The reviewers include cryptography experts (I have no concerns about the cryptography) and a little bit of image analysis. However, none of the reviewers have backgrounds in privacy. Also, although they made statements about the legality, they are not legal experts (and they missed some glaring legal issues; see my next section).
  2. Apple's technical whitepaper is overly technical -- and yet doesn't give enough information for someone to confirm the implementation. (I cover this type of paper in my blog entry, "Oh Baby, Talk Technical To Me" under "Over-Talk".) In effect, it is a proof by cumbersome notation. This plays to a common fallacy: if it looks really technical, then it must be really good. Similarly, one of Apple's reviewers wrote an entire paper full of mathematical symbols and complex variables. (But the paper looks impressive. Remember kids: a mathematical proof is not the same as a code review.)
  3. Apple claims that there is a "one in one trillion chance per year of incorrectly flagging a given account". I'm calling ******** on this.
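
Whether that figure is plausible depends entirely on the per-image false-match rate and on how many matches are required before an account is flagged - neither of which Apple has published. A quick back-of-envelope (all numbers below are my own assumptions, purely for illustration) shows how strongly the headline figure depends on those unknowns:

```python
# Back-of-envelope check of a "one in one trillion per account per year" style
# claim. The per-image false-match rates and the threshold are MY assumptions,
# chosen only to show how sensitive the per-account figure is to them.
from math import exp

def poisson_tail(lam: float, threshold: int, extra_terms: int = 200) -> float:
    """P(X >= threshold) for X ~ Poisson(lam), a standard approximation to the
    Binomial(n_photos, p) count of false matches when p is small."""
    term = exp(-lam)  # P(X = 0)
    total = 0.0
    for k in range(1, threshold + extra_terms):
        term *= lam / k  # P(X = k) from P(X = k - 1)
        if k >= threshold:
            total += term
    return total

N_PHOTOS = 20_000  # assumed uploads per account per year

for p, threshold in [(1e-6, 1), (1e-6, 30), (1e-3, 30)]:
    prob = poisson_tail(N_PHOTOS * p, threshold)
    print(f"per-image rate {p:g}, threshold {threshold}: P(account flagged) ~ {prob:.2e}")
```

With no threshold, even a one-in-a-million per-image rate blows far past one-in-a-trillion per account, while a threshold of a few dozen matches drives the number down dramatically - so the claim simply cannot be audited without the underlying numbers.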
 
So from what I understand, Apple takes the image hash and compares it to an image-hash database of known ‘blacklisted’ images. Now, I haven’t looked into the CSAM feature too deeply, but there are many ways of changing an image hash without ‘changing’ the image itself. For example, you could use steganography to make one simple change to the image, and this would change the hash.
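
Worth separating two things here, though: a single hidden-bit change does completely alter a cryptographic hash of the file, but a perceptual hash of the kind Apple describes is specifically designed to survive small edits, so a trivial steganographic tweak may not be enough to dodge it. A minimal sketch of the cryptographic side (the image bytes are a placeholder):

```python
# One changed byte gives a completely different cryptographic digest - this is
# the intuition behind "change one thing and the hash changes". Perceptual
# hashes are built to resist exactly this.
import hashlib

original = b"...image bytes..."   # placeholder for the real file contents
modified = original + b"\x00"     # e.g. one byte altered or appended via steganography

print(hashlib.sha256(original).hexdigest())
print(hashlib.sha256(modified).hexdigest())  # entirely different digest
```
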
If you wanted to watch and own CSAM content, would you take the risk of modifying each pic and then uploading it to iCloud anyway? No, so it will definitely work as a deterrent. Unfortunately, they will simply move elsewhere.
 
Slow boiling the frog.

They're starting with something that's universally reviled, then they'll add other stuff gradually.

Transphobia in the UK, insufficient reverence for black people in the US, homosexuality in Saudi Arabia etc.
And at every step of the way, people like you will say:

"Well as long as you're not a pedo / a racist / a homosexual / infringe on copyright / have impure thought / criticize the government you have nothing to fear!"

People like me? But the funny thing is, I've had people telling me we're on this slippery slope for more years than I care to remember, and these evil things that "they" will do next just don't seem to happen. I guess that frog must be on a really slow boil.

I can't argue against what you think might happen next with CSAM as that's your personal perception. I just don't share that same view. Dismiss me as a naïve fool if that helps.
 
I really hope Apple cancels their plans for this. I really want to buy the iPhone 13 mini and upgrade to iOS 15, iPadOS 15, and macOS Monterey, but I won’t with this backdoor installed and ready to be abused.

Honestly, they have likely been doing this in the background for years, same as other companies. Privacy is no longer something that's possible on smartphones.
 
It's remarkable that many people in this sub don't understand standing on principle. They don't understand how Apple is sneaking surveillance onto your device using the thin end of the wedge. If they're not stopped now, it's the end of privacy on Apple devices. And no, it's nothing to do with pedophiles; it's everything to do with privacy.
 
There is simply no excuse for this to be implemented. As soon as this is well publicised (everybody I know who uses Apple products knows about it), the scum who do this kind of thing will just move their pics offline or to Google Drive etc. The few do wrong, and we all lose our privacy - privacy that is a "fundamental human right" according to Apple.
I've already started the ball rolling, moving from Apple products to Google services/Windows PC and Samsung phones. I've just ordered a Z Fold 3, ordered parts for a computer, sold my 2020 iMac 5K, sold my 2020 iPad Pro, sold my pair of HomePods, sold my AirPods Pro, and I'm selling my iPhone 12 Pro Max when the Fold arrives on Aug 27th. I will not be spied on by Apple or anybody. From now on I will keep all my private stuff (pics of my kids, documents etc.) offline and just drag whatever I CHOOSE to share to the desktop from my offline drive and share it that way.
Stupid decision by Apple, as they have just lost a lifelong fan (one that has spent multiple tens of thousands over the last 20 years).
 
I doubt Apple would do E2E encryption on iCloud. If they planned to do it, they would've made some noise at WWDC, because that's their marketing strategy. E2E encryption on iCloud would be a huge marketing opportunity.

The current Apple has no problem announcing things for future releases, even if they turn out not to happen (AirPower), for the marketing hype.
Selling a wireless charging pad for iPhones is marketing. Selling a product to help millions of people hide evidence from law enforcement is politics.
 
If you don't see a problem with this technology, Apple is definitely playing you. Pretty sure you don't want Apple looking at/scanning/analyzing/identifying your wife's pictures. This CSAM stuff needs to be shut down. Apple needs to respect our privacy, period. Apple needs to figure out another way if they are really interested in catching pedophiles... God knows for what reason.

That's not how it works. Known child pornography photos are hashed. Photos uploaded to iCloud are hashed. If too many photos from a single account have a hash matching known child pornography, then action is taken. Being uninformed causes ignorance.
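
In rough pseudocode terms the flow described above looks something like the sketch below; the function names and the threshold value are my own assumptions (Apple has said a threshold exists but, as far as I know, has not published the exact number):

```python
# Minimal sketch of threshold-based matching against a known-CSAM hash list.
# Names and the threshold value are assumptions for illustration only.
from typing import Iterable, Set

FLAG_THRESHOLD = 30  # assumed number of matches before an account is reviewed

def count_matches(upload_hashes: Iterable[bytes], known_csam_hashes: Set[bytes]) -> int:
    """Count uploaded photos whose perceptual hash matches a known-CSAM hash."""
    return sum(1 for h in upload_hashes if h in known_csam_hashes)

def should_review(upload_hashes: Iterable[bytes], known_csam_hashes: Set[bytes]) -> bool:
    """Only accounts whose match count reaches the threshold get flagged for review."""
    return count_matches(upload_hashes, known_csam_hashes) >= FLAG_THRESHOLD
```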
 