
irjo

macrumors newbie
Jun 12, 2018
8
14
Ciudad Olivo, Johto
I will still use an iPhone because, privacy-wise, it's better than Android (unless you use CyxOS, which is great), but I'M NOT using iCloud for Photos and Messages anymore.

So basically the best solution (for now) is iOS + third-party apps, instead of Apple's first-party apps.
 

robjulo

Suspended
Jul 16, 2010
1,623
3,159
"if an account is flagged with a collection of illegal CSAM material, an Apple team will review that to make sure that it is a correct match of illegal CSAM material prior to making any referral to any external entity."

How exactly are they going to do this part? Are they just going to manually compare hash numbers? How is manually comparing hash numbers going to "make sure it is a correct match"?

Surely they will not be looking at actual photos of CSAM material.
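If it works the way perceptual hashing usually works, the comparison itself is automated rather than manual: a photo's hash either is, or is very close to, a hash already in the known database, and a human would only ever look at accounts the automated matching has flagged. A rough sketch of what that kind of matching generally looks like (generic Python with made-up hash values, not Apple's actual NeuralHash or its private set intersection protocol):

# Hypothetical illustration: a "match" means a tiny Hamming distance
# between a photo's perceptual hash and a hash in the known database.
# This is NOT Apple's system, just the general idea.

KNOWN_HASHES = {0x9F3A6C01D24B88EE, 0x17C0FFEE12345678}  # made-up 64-bit values
MATCH_THRESHOLD = 2  # max differing bits to still count as the same image

def hamming_distance(a: int, b: int) -> int:
    return bin(a ^ b).count("1")

def is_match(photo_hash: int) -> bool:
    return any(hamming_distance(photo_hash, known) <= MATCH_THRESHOLD
               for known in KNOWN_HASHES)

print(is_match(0x9F3A6C01D24B88EF))  # True: one bit away from a known hash
print(is_match(0x0000000000000000))  # False: nothing like any known hash

Only accounts with a collection of such matches would then get the human review step described in the quote.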
 

neuropsychguy

macrumors 68020
Sep 29, 2008
2,379
5,653
Pretty sure they're not reporting to law enforcement. Don't think we have seen any reports out. At this point, it's a false accusation.
You say that like it's a good thing. There are ways to scan and combat without breaking encryption.

Besides, some reporting is occurring.




The issue is that most of the material is hosted in the EU (so Apple's actions won't affect that); however, it's a start: https://www.europarl.europa.eu/RegData/etudes/BRIE/2020/659360/EPRS_BRI(2020)659360_EN.pdf
 

sashavegas

macrumors regular
Jul 11, 2018
112
77
As usual, nothing is going to happen. People will talk about it for maybe a week and forget it. Everybody will update their iPhone to iOS 15, the new iPhone will ship with 15 by default, and everybody will buy it as usual. Until the next time, when Apple adds something more; it will be criticized again and create a buzz for a couple of days, but nothing will change. As of today there is no alternative to the duopoly, and there won't be in the foreseeable future. Apple will introduce more walled-garden stuff on the iPhone and Mac, and it will go on until the day somebody creates malware so huge that it sneaks behind all the walls Apple built and even Apple doesn't recognize it. Then, when the damage costs enough $$$, maybe Apple will change tactics, but not now.
 

Ethosik

Contributor
Oct 21, 2009
7,797
6,714
Bunch of ******** to hide the fact they will scan your photos and messages. You have to be stupid to believe it will only be for children between 0 and 12 years old.

Yep. This is how you announce something but sneak in something else. I'm not the conspiracy type, but let's just say it would not surprise me if Apple is doing something else and is under national security guidelines not to disclose it, so it's simply framed as helping the children. But, as I said in another thread, this doesn't really prevent the abuse or catch the creator of the image. It just catches someone with utterly disgusting and sick issues who saves a pre-existing image. They didn't abuse the child or even create the picture. They are sick and disgusting and definitely need help. But I want to see the CREATORS get caught here. That will protect the kids.
 

mymacrumorsname

Cancelled
Dec 20, 2019
77
188
Maybe someone could explain to me what kind of photos you are afraid to show to Apple? I know, I know, it will heavily rain those red thumbs-down scores. But maybe you can share your concerns with REAL examples.

In my world: I have never taken naked pictures of myself, I know no one among my friends, relatives, and parents who ever has, nor do I wish to have contact with those who do.

Also: what if Apple's approach were to cut child porn worldwide by about 50%? Wouldn't it be worth it?
 

boss.king

macrumors 603
Apr 8, 2009
6,140
6,887
I look forward to the millions of teenagers being arrested on child pornography charges. Finish turning the high schools into prisons.
Read up on how the scan works. It's looking for known, circulated child pornography images, not scanning for every nipple and dick it can find.

EDIT: Apparently I was wrong. According to the Decoder podcast, the hash checking against known child pornography is only for iCloud content (although the actual checking will happen on your device). Your phone will also scan your iMessage images (and possibly camera roll, although that was confusing) for ANY child abuse material, although the metric by which they categorize that is completely unclear.

That's a little more concerning.
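To make the distinction concrete, here's a rough sketch of the two separate code paths as I understand them from the podcast (every name below is invented for illustration; this is obviously not Apple's code):

# Path 1: known-image hash matching, only for photos queued for iCloud upload.
# Path 2: an on-device nudity classifier for child accounts; no hash database involved.

KNOWN_CSAM_HASHES = {"a3f9", "0b77"}  # stand-in for the database shipped with the OS

def icloud_photo_matches(photo_hash: str) -> bool:
    # Runs on the device, but only for photos on their way to iCloud Photos.
    return photo_hash in KNOWN_CSAM_HASHES

def imessage_image_should_warn(is_child_account: bool, classifier_says_explicit: bool) -> bool:
    # A separate ML classifier with unclear criteria, applied to child accounts.
    return is_child_account and classifier_says_explicit

print(icloud_photo_matches("ffff"))            # False: not in the known set
print(imessage_image_should_warn(True, True))  # True: blur the image and warn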
 
Last edited:
  • Like
Reactions: mhnd
Maybe someone could explain to me what kind of photos you are afraid to show to Apple? I know, I know, it will heavily rain those red thumbs-down scores. But maybe you can share your concerns with REAL examples.

In my world: I have never taken naked pictures of myself, I know no one among my friends, relatives, and parents who ever has, nor do I wish to have contact with those who do.
Are you telling me you are OK with Apple snooping inside your iPhone and looking at your wife's pictures?
 

ian87w

macrumors G3
Feb 22, 2020
8,704
12,636
Indonesia
No. They scan an iPhone if it's set up for a child aged 0-12 in a family. Or when you use iCloud storage for the Photos app AND are in the USA.

Turn off iCloud and they don't run the CSAM scan. But if you upload to Google or Dropbox, guess what they do?
The guy just said the hashes are coded into iOS 15, and it is being pushed to all iPhones worldwide. The scanning is still done locally, but the phone won't transmit any matches to Apple unless you have US iCloud turned on, for now. So IMO it is still a scandal for Apple to force a US-centric policy onto everyone in the world, as the hashes and scanning are coded into iOS.

And that is basically a proof of concept to other governments that this mass scanning system exists on iOS. I think we can guess which other governments are already planning to "have a discussion with Apple."
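In other words, the gate is policy, not capability. Something like this hypothetical sketch of the gating logic (names invented; the hash database and matcher ship with iOS 15 everywhere regardless):

# Hypothetical: the scanning capability exists on every updated device;
# only the reporting is switched on for the initial rollout.

ROLLOUT_REGIONS = {"US"}  # a policy knob that could be widened later

def should_report_matches(region: str, icloud_photos_enabled: bool) -> bool:
    return region in ROLLOUT_REGIONS and icloud_photos_enabled

print(should_report_matches("US", True))  # True: matches leave the device
print(should_report_matches("ID", True))  # False: scanning still runs locally, nothing is sent

Widening that one set, or being ordered to, is exactly the "discussion" other governments will want to have.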
 

4jasontv

Suspended
Jul 31, 2011
6,272
7,548
She was a huge celebrity, uploading her nude pictures to the cloud, and not having 2 factor authentication on.
That was a huge blunder for Apple, but it was easily avoidable.
So it was ok for people to access, sell, and distribute them because Apple messed up security, people really wanted to see her naked, or she didn't declare them private enough by having a password?

What part of that should make us feel better?
 

silver25u

macrumors regular
Sep 20, 2007
132
138
Maybe someone could explain to me what kind of photos you are afraid to show to Apple? I know, I know, it will heavily rain those red thumbs-down scores. But maybe you can share your concerns with REAL examples.

In my world: I have never taken naked pictures of myself, I know no one among my friends, relatives, and parents who ever has, nor do I wish to have contact with those who do.
Apparently you don't understand the concept of principles, in this case the principle of privacy. Just because you don't have anything to hide doesn't mean you should be ok with others seeing/knowing your thoughts/files/photos/locations.

You also seem to be oblivious to governments or corporate entities abusing user data.
 

boss.king

macrumors 603
Apr 8, 2009
6,140
6,887
Maybe someone could explain to me what kind of photos you are afraid to show to Apple? I know, I know, it will heavily rain those red thumbs-down scores. But maybe you can share your concerns with REAL examples.

In my world: I have never taken naked pictures of myself, I know no one among my friends, relatives, and parents who ever has, nor do I wish to have contact with those who do.
This is a dumb argument. If you have nothing to hide, then why not let the authorities do weekly inspections of your house, or frisk you on the street, or search your car? Why not get audited every day? Innocence and privacy are not mutually exclusive. While I feel that people are blowing this thing out of proportion (slightly; it does open the door to bad stuff in the future), yours is far and away the worst argument.
 

Mac4Mat

Suspended
May 12, 2021
168
466
If Apple had led with this interview, I think a lot of people's concerns would have been laid to rest from the very start.
I disagree. The fact Apple has had to go on the back foot shows how foolish it was to proceed at all. They have blotted their copybook, and whatever they say, whatever is suggested, this still comes down to SURVEILLANCE and PRIVACY, a mainstay that has served Apple well in protecting against invasion of privacy and surveillance. Now it has shot itself in the foot so badly that even current court cases may be affected, as they have virtually handed their opposition a bat to hit them with, and I can see companies and some individuals lining up to make the most of it, including Epic, Facebook, Elon etc. etc.

For me this is a red line they should never have crossed.

In fact, I do not even believe their explanation, as it makes no sense. They DO NOT have access to police files, and they certainly do not have authorisation in many countries outside the USA to hold that data or interfere in such a way.

It is not about safeguarding children; as with dictators and some governments, such measures are always initiated in the name of a good cause, but the fact is this is about SURVEILLANCE AND PRIVACY.

They have been forced to firefight in a fight they should never be in! It's not their remit to be a global police force, or to act as a servant of whatever state their equipment is in.

When it comes to their own privacy on potential leaks, Apple jumps on organisations and people from a great height!

Likewise, the explanation that it's designed NOT to do something has never stopped such things from happening, and it's irrelevant to the argument, as this is SURVEILLANCE in anyone's book, and it will affect PRIVACY, something Apple previously and rightly used to differentiate itself from the others it took fire at over their lack of privacy.

It's rather akin to a Minority Report situation. I can only reiterate my first post about this when the news came out.

First they came for the 'suspect' Children's Pictures
And I did not speak out
Because I was not a Child Abuser

Then they came for the 'suspect' Adult pictures
And I did not speak out
Because my pictures were not those

Then they came for 'suspect' Animal Abuse pictures
And I did not speak out
Because I was not an Animal Abuser

Then they came for 'suspect' Law Breakers
And I did not speak out
Because I was not a Law Breaker

Then they came to control Everyone's Data
And there was no one left who could speak out
Not even Me!
 

The Phazer

macrumors 68030
Oct 31, 2007
2,997
930
London, UK
Doesn't really address any of the important points.

Nothing on whether NCMEC is trustworthy to partner with, given their history of attacking consenting adult sex workers, the lack of independent verification of their accuracy, their unaccountability outside of the US, and their previous lies to Congress, not to mention the appalling attitudes shown in the emails.

No clear question on "will Apple comply with a court order to use the iMessage scanning function with other content, yes or no." (Not launching the feature outside of the US doesn't protect from this in the slightest. They've still built it. It still exists).

No clear question on why Apple hasn't published technical information on the nudity detection in iMessage. Nothing on its failure rate.

No clear question on how the iMessage reporting function won't be abused by people falsely setting up child accounts in the case of domestic violence.
 

ThunderSkunk

macrumors 68040
Dec 31, 2007
3,814
4,036
Milwaukee Area
Read up on how the scan works. It's looking for known, circulated child pornography images, not scanning for every nipple and dick it can find.
Already did. But look at it from the other end: as the images circulate and proliferate across the web, they get added to the database. The biggest distributors of child porn are kids themselves. The content they entertain each other with is adults' problem to live with, as we try to "protect" them from their own sexuality. Ridiculous. Hopefully this backfires in an appropriately extreme way.
 
Last edited:
  • Like
Reactions: Philip_S

CobraPA

macrumors 6502a
Mar 12, 2011
733
175
Lansdale, PA, USA
I have not broken a single law. Matter of fact, I work with law enforcement. (Police Department)

I just don't want Apple scanning my iPhone and rummaging through my private data. This is not the Apple we know. Apple is up to something.
Well, it's simpler than you think, I expect. Apple is liable if they store collections of illegal content in the Apple iCloud system, I suspect. Sure, they have a user agreement that says they are not responsible for what users store there, and users agree not to store illegal content there, blah blah. They still get served with dozens of warrants to get data out of iCloud. They do not have the keys to decrypt on-phone data (Apple says, and it's probably true), but we've seen that is not the case with iCloud backups. So I'd guess this move is to try and cut off all the child-imagery warrants. If they can show a chain of blocking or detecting illegal child imagery on users' devices before it goes to iCloud (or at least flagging those accounts), then poof, hopefully for them a whole class of warrant requests drops off.
It's more efficient for their data centers too, as image hashing happens before encryption, I'd bet. So they can check the hash at any time and leave the images encrypted further up the chain. Hash it when you run the initial AI pipeline on the images, done.
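A rough sketch of the ordering being described, purely hypothetical (none of these names are Apple's, and the "encryption" here is a toy): hash while the plaintext image is still in the on-device pipeline, then encrypt, so the server only ever stores ciphertext plus a checkable flag:

# Hypothetical "hash before encrypt" upload pipeline, not Apple's iCloud code.
import hashlib

KNOWN_HASHES = {"deadbeef"}  # stand-in for the known-image database

def prepare_for_upload(photo: bytes, key: bytes) -> dict:
    digest = hashlib.sha256(photo).hexdigest()[:8]   # stand-in for a perceptual hash
    flagged = digest in KNOWN_HASHES                  # checked while plaintext is available
    ciphertext = bytes(b ^ key[i % len(key)] for i, b in enumerate(photo))  # toy XOR "encryption"
    # The server gets only the ciphertext and the flag; it never needs to
    # decrypt the photo to decide whether the account should be reviewed.
    return {"ciphertext": ciphertext, "flagged": flagged}

print(prepare_for_upload(b"holiday photo", b"secret")["flagged"])  # False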
 
Last edited: