And what's stopping another country from building a database of photos they find troubling and telling Apple, "you must detect matches of these, you have to comply"?
I only care about what happens in the US. How other countries interact with their citizens/subjects and with companies doing business there is solely their concern, not mine. They have their own cultures, forms of government and social expectations; they can take care of themselves, and they don't need busybody Americans telling them how to run their society based on what Americans think is the "right way" to do things. That sort of thing is cultural imperialism at its finest. If they want to have Apple look for stuff they don't like, that is between them and Apple, and Apple will conform to the law of any country they operate in and do what they are legally told to do.

What is happening in the US is concerning enough. There are lots of things deemed "problematic" that many people believe should be removed from society. Child porn is just one of them, and the easiest sell right now. Others will follow, and Apple will gladly lead the way if we don't push back now.
 
I am hoping there are far more details and explanations of what Apple is doing on device and in the cloud for this feature before it is activated or officially offered to consumers. I get what they are trying to do, but for some there is a huge creep factor attached to this type of service/feature.
There is. You just didn’t read it. Apple has documented both these systems in great detail, even including a technical paper describing the exact algorithms for the CSAM detection (which we are not talking about here).

You can search for "Sunsets" and "Beaches" in Photos, maybe Apple is also keeping track of other types of images at the same time.
It’s on device algorithms. If you don’t trust Apple, buy another device.

Weird prudish Apple clearly have no idea about cultures outside the USA.
I’m pretty sure proliferation of nudes of children isn’t a USA problem. Children don’t always think things through, and there have been many cases of nudes leaking.

And just like Apple must have worked on CSAM for a long time without telling anybody,
You know, apart from the very extensive documentation they released for it.

They can search it all; they own all the data in iCloud and want to own all the data on your device.
Fearmongering.

Vulnerable people will die as a result of this,
Silly hyperbole. How do you propose that will happen?

Apple won't issue the wording. The wording will be "we have a warrant to search your premises" or "you are under arrest" issued by the cop who comes to the door of the person who uploads it.
So unlike when these features were announced followed by detailed documentation?

I understand wanting to filter material for children, but that really should be the parents’ responsibility.
Of course, but they will have no way of controlling that. Besides, for older kids, this is nothing but a warning to the child.

Nahh, all of the hatred came from the CSAM matching hashes of your personal photo library.
Well, the personal photos that you choose to store in iCloud Photo Library.
 


Update: We've learned from Apple that Communication Safety is not a feature that is going to be released in the iOS 15.2 update, despite the presence of the code that we found. Apple also does not plan to release the feature as it is described in the beta, and there is no word on when Communication Safety will be introduced.



Apple this summer announced new Child Safety Features that are designed to keep children safer online. One of those features, Communication Safety, appears to be included in the iOS 15.2 beta that was released today. This feature is distinct from the controversial CSAM initiative, which has been delayed.


Based on code found in the iOS 15.2 beta by MacRumors contributor Steve Moser, Communication Safety is being introduced in the update. The code is there, but we have not been able to confirm that the feature is active because it requires sensitive photos to be sent to or from a device set up for a child.

As Apple explained earlier this year, Communication Safety is built into the Messages app on iPhone, iPad, and Mac. It will warn children and their parents when sexually explicit photos are received or sent from a child's device, with Apple using on-device machine learning to analyze image attachments.

If a sexually explicit photo is flagged, it is automatically blurred and the child is warned against viewing it. For kids under 13, if the child taps the photo and views it anyway, the child's parents will be alerted.
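Apple has not published the underlying code, but the flow described above maps onto a fairly simple decision rule. Here is a minimal Swift sketch of that logic based only on the article's description; every name in it (ChildAccount, isSensitive, handleIncomingPhoto) is a hypothetical stand-in rather than Apple's actual API, and the on-device classifier is stubbed out.

```swift
import Foundation

// Hypothetical sketch of the Communication Safety flow described above.
// None of these names are Apple's real APIs; the on-device ML check is a stub.

struct ChildAccount {
    let age: Int
    let parentalNotificationsEnabled: Bool  // opted in by parents via Family Sharing
}

enum IncomingPhotoAction {
    case showNormally
    case blurAndWarn(notifyParentsIfViewed: Bool)
}

/// Stand-in for the on-device machine learning classifier; the real model
/// and its thresholds are not public.
func isSensitive(_ imageData: Data) -> Bool {
    false  // stub
}

func handleIncomingPhoto(_ imageData: Data, for child: ChildAccount) -> IncomingPhotoAction {
    guard isSensitive(imageData) else { return .showNormally }
    // The photo is blurred and the child is warned in every case; parents are
    // alerted only if the child is under 13, opted in, and views it anyway.
    let alertParents = child.age < 13 && child.parentalNotificationsEnabled
    return .blurAndWarn(notifyParentsIfViewed: alertParents)
}
```

The key point the article makes is that all of this runs locally; nothing in this path sends the photo, or the verdict, to Apple.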

Code in iOS 15.2 features some of the wording that children will see.
  • You are not alone and can always get help from a grownup you trust or with trained professionals. You can also block this person.
  • You are not alone and can always get help from a grownup you trust or with trained professionals. You can also leave this conversation or block contacts.
  • Talk to someone you trust if you feel uncomfortable or need help.
  • This photo will not be shared with Apple, and your feedback is helpful if it was incorrectly marked as sensitive.
  • Message a Grownup You Trust.
  • Hey, I would like to talk with you about a conversation that is bothering me.
  • Sensitive photos and videos show the private body parts that you cover with bathing suits.
  • It's not your fault, but sensitive photos can be used to hurt you.
  • The person in this may not have given consent to share it. How would they feel knowing other people saw it?
  • The person in this might not want it seen-it could have been shared without them knowing. It can also be against the law to share.
  • Sharing nudes to anyone under 18 years old can lead to legal consequences.
  • If you decide to view this, your parents will get a notification to make sure you're OK.
  • Don't share anything you don't want to. Talk to someone you trust if you feel pressured.
  • Do you feel OK? You're not alone and can always talk to someone who's trained to help here.
There are specific phrases for both children under 13 and children over 13, as the feature has different behaviors for each age group. As mentioned above, if a child over 13 views a nude photo, their parents will not be notified, but if a child under 13 does so, parents will be alerted. All of these Communication Safety features must be enabled by parents and are available for Family Sharing groups.
  • Nude photos and videos can be used to hurt people. Once something's shared, it can't be taken back.
  • It's not your fault, but sensitive photos and videos can be used to hurt you.
  • Even if you trust who you send this to now, they can share it forever without your consent.
  • Whoever gets this can share it with anyone-it may never go away. It can also be against the law to share.
Apple in August said that these Communication Safety features would be added in updates to iOS 15, iPadOS 15, and macOS Monterey later this year, and iMessage conversations remain end-to-end encrypted and are not readable by Apple.

Communication Safety was also announced alongside a new CSAM initiative that will see Apple scanning photos for Child Sexual Abuse Material. This has been highly controversial and heavily criticized, leading Apple to choose to "take additional time over the coming months" to make improvements before introducing the new functionality.

At the current time, there is no sign of CSAM wording in the iOS 15.2 beta, so Apple may first introduce Communication Safety before implementing the full suite of Child Safety Features.

Article Link: Code for Communication Safety Found in iOS 15.2 Beta, But Won't Be
Go ahead, Apple; bow to China and Russia, the Philippines and the Arabic Middle East. That’s where this can ultimately go, with those and other countries determining which of your technologies they can use to inhibit and prohibit their citizens. By all means, use United States citizens as your lab rats for insidious spying and control of their data and lives.
I survived the first 45 years of my life without a smart phone. I guess I can go the next however-many without one, if this genesis of Dark Ages 2.0 gains traction. AARP keeps telling me I need a Jitterbug. Maybe I do. Maybe we all do.
 
I’m pretty sure proliferation of nudes of children isn’t a USA problem. Children don’t always think things through, and there have been many cases of nudes leaking.
It is mostly a US problem as the US has a major cultural aversion to nudity and thinks it harmful to kids. Depictions of violence, on the other hand, are perfectly fine for kids to be exposed to. Other cultures don't get upset with nudity and don't consider nude images harmful to kids, but do object to depictions of violence.

Go ahead, Apple; bow to China and Russia, the Philippines and the Arabic Middle East. That’s where this can ultimately go, with those and other countries determining which of your technologies they can use to inhibit and prohibit their citizens. By all means, use United States citizens as your lab rats for insidious spying and control of their data and lives.
I survived the first 45 years of my life without a smart phone. I guess I can go the next however-many without one, if this genesis of Dark Ages 2.0 gains traction. AARP keeps telling me I need a Jitterbug. Maybe I do. Maybe we all do.
Apple will bow to the legal demands of any country they do business with. I worry about the US. I expect that there will be no escape from the surveillance, and anybody who blatantly attempts to do so will be identified and strongly encouraged to get with the program.
 
Apple clearly never went to a primary school during recess and just listened to kids’ public conversations.
It can be shocking!
Their conversation is often just ************************************************
They aren’t all as innocent as Apple thinks they are.

In turn, that’s a pretty good indicator of how few sets of user data Apple has actually analyzed for this :D
 
You’re all afraid of this so-called slippery slope, as though Apple doesn’t already have the power to spy on you if they wanted to! How about pausing for just a moment, using your own head (instead of just parroting what everyone else says), and taking a look at what this technology actually does… because from where I’m standing, it’s helping to address a huge problem and in a pretty sensitive way. Young people have ended their lives over online bullying and the sharing of nude images. But don’t worry yourselves about that… go on, grab your popcorn, or rant and rave about how unfair this is for you, you poor, poor entitled iPhone owners.

(For the record, I also support the far more controversial CSAM technology, but I’m not talking about that here. I’ve already posted at length about that elsewhere.)

Takes us back to the old weapons-of-mass-destruction nonsense, when one is either with us or with the terrorists.
It’s all cyclical.
 
There is. You just didn’t read it. Apple has documented both these systems in great detail, even including a technical paper describing the exact algorithms for the CSAM detection (which we are not talking about here).


It’s on device algorithms. If you don’t trust Apple, buy another device.


I’m pretty sure proliferation of nudes of children isn’t a USA problem. Children don’t always think things through, and there have been many cases of nudes leaking.


You know, apart from the very extensive documentation they released for it.


Fearmongering.


Silly hyperbole. How do you propose that will happen?


So unlike when these features were announced followed by detailed documentation?


Of course, but they will have no way of controlling that. Besides, for older kids, this is nothing but a warning to the child.


Well, the personal photos that you choose to store in iCloud Photo Library.
You must be right, and 9 others wrong!?
 
I’m for a license (like a driver’s license) to allow giving birth; that would save many children from incompetent parents.

It’s quite the challenge to test for the parental skills that you may or may not have by the time your teenage kid runs into these situations.
 
Lol. Those who thought they dodged a bullet by skipping iOS 15 should disable auto update.
It won't auto update anyway. The way it's set up, iOS 15 will show up under "Other Updates" in Settings > General > Software Update.

You have to select it now.
 
This should serve as a reminder to all that Apple not only has the ability to find child pornography, but any sort of pornography.

Perhaps they already do...

You can search for "Sunsets" and "Beaches" in Photos, maybe Apple is also keeping track of other types of images at the same time.

The CSAM Detection System doesn't have such a general ability to find photos of any category.

The algorithms in the Photos app have had this capability for several years, if not a decade. And yet, people are more worried about the CSAM Detection System.
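To make that distinction concrete: what Photos does is open-ended scene classification ("what is in this picture?"), while CSAM detection is a closed lookup ("does this picture's hash appear in a fixed list of known images?"). Here is a rough Swift sketch of the two shapes, using Apple's public Vision framework for the classification side; the hash-matching function is a simplified stand-in and not Apple's actual NeuralHash pipeline.

```swift
import Foundation
import Vision

/// Open-ended, on-device categorisation -- the kind of capability that lets
/// Photos surface "Sunsets" or "Beaches" searches. VNClassifyImageRequest is
/// a public Vision API (iOS 13+ / macOS 10.15+).
func classify(imageURL: URL) throws -> [String] {
    let handler = VNImageRequestHandler(url: imageURL)
    let request = VNClassifyImageRequest()
    try handler.perform([request])
    return (request.results ?? [])
        .filter { $0.confidence > 0.5 }   // crude cutoff for this sketch
        .map { $0.identifier }            // e.g. "beach", "sunset"
}

/// Closed matching against a pre-supplied database of hashes -- closer in
/// spirit to CSAM detection, which can only say whether a photo matches a
/// known image, not what an arbitrary photo depicts. Simplified stand-in,
/// not Apple's NeuralHash / private-set-intersection pipeline.
func matchesKnownDatabase(photoHash: Data, knownHashes: Set<Data>) -> Bool {
    knownHashes.contains(photoHash)
}
```

The design difference is that classify() can be pointed at any category its model knows, while matchesKnownDatabase() is inert without an externally supplied hash list, which is exactly the point of contention in this thread.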
 
“Sensitive photos and videos show the private body parts that you cover with bathing suits.”

Weird prudish Apple clearly have no idea about cultures outside the USA.

This makes sense in the context of a young kid reading it... if they are young enough, they probably won't understand what's so "sensitive" about certain pictures. Apple obviously understands world cultures, but they also have to aim for something general here... in most English-speaking nations this is a pretty good compromise; beachwear is the minimum people consider for coverage. In some nations, "sensitive" includes many other body parts beyond the bathing suit.

It also looks like Apple is responding to, or expecting, legislation to control the spread of child porn through electronic devices. This looks like it's geared towards keeping kids from sharing images, and it gives parents notifications if kids are doing things like that on their phone. The reasoning behind this seems like a good idea, especially if the law starts demanding it... and it's a good bulwark against legal maneuvers to put backdoors into Apple products if Apple can show an effective way of deterring these activities. Implementation is key, of course; we'll have to wait and see on that one.
 
By all means, it is the CHILD’s decision as to whether or not they should allow themselves to send nude selfies. Yes, of course, parents are no longer part of the equation. SMH. This isn’t really the solution, Apple, but then again you are not truly interested in solving this anyway. Your own productions on Apple TV push the idea that children should be chasing ill behaviors anyway (Foundation Ep. 1). If you put stuff out there that children can be influenced by, then don’t pretend on the other hand that you are now the arbiter of morality. Fools.
 
In addition, this would only be available in the Messages app, so all parents need to do is make sure their kids are not using any other messaging app. Easy!
 
Of course, anyone with half a brain cell understands what they meant, but it's just a signal that says "this person is below average intelligence, don't even bother arguing with them".
Half a brain cell would be non-functioning.
Right. Look, the US government is very flawed, but I’m not cynical about them abusing something like this. This feature probably won’t affect any US citizens except actual child sex offenders.
Maybe you should be. Angry parents show up at school board meetings, and now the government wants them investigated as domestic terrorists? I rest my case.
But I’m looking at the bigger picture. I can see the governments of Russia, Iran, Saudi Arabia, etc. abusing this technology.
No government is above this.
Nice! I'm all for this feature. Protecting children is important.
Yes it is, but I caution you... one day you'll disagree with your government, and maybe they'll come and arrest you and your wife in the dead of night and send your kids to a mental rehabilitation clinic.

No proof you say? Who needs proof when you have that much power? You just create the proof!
I understand wanting to filter material for children, but that really should be the parents’ responsibility.
And this is the key point here.
All of the wording seems pretty helpful for kids. Only pedos will disagree.
There it is: the tyranny of only two choices. Pick one OR the other; no discussion needed and no disagreement will be recognized.
Whoever opposes hides something?
Not me. I oppose because I prefer freedom and liberty.
Once they can detect it, what's to prevent them from planting it?

Before long, people who are of one political party, or one particular religion, for example, all suddenly get arrested because they all got the same image on their devices.

Why fight for your opinion when all you really have to do to win is eliminate your opposition by having them jailed? Or put before the firing squad or the hangman's noose?

Nazi Germany did that. Humans have not evolved THAT much in the decades since then.
 
The "backdoor spyware" that sends information nowhere except an autogenerated notification to a parent. Right. Even for this conversation, you are a shining star of having no idea what you're talking about.
It is a backdoor, but keep believing the lie Apple's selling.
 