Moving forward, kids won't use Messages; they'll migrate to WhatsApp or whatever else gets them out from under Apple's control.

With that group, Apple will have lost one of the key features that makes iOS "sticky": Messages!

I'm NOT saying they will migrate away from Apple products, but Apple will have lost some control over them.

Some analysts think Messages is the #1 reason Apple has a sticky ecosystem.

I don't agree; I think it's the lack of viable and credible competition.

This is the type of misinformation that people read and then love to spread, because it's written by "trusted, knowledgeable, informed" members of the tech community. :rolleyes:
 
So the extent of this is that parents can enable a feature on their under-13's device that will scan pictures sent to them in a text message. If a picture contains nudity, the child's device will display a warning. The child can then choose to view it or not. And that's it?
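In code terms, the flow being described reduces to roughly the following - a minimal sketch with made-up names, not Apple's actual implementation, just the logic spelled out:

```swift
// Minimal sketch of the described flow; all names are hypothetical.
import Foundation

enum ChildChoice { case view, skip }

/// Stand-in for whatever on-device classifier Apple ships; assumed here.
func looksLikeNudity(_ image: Data) -> Bool {
    false // a real model would run on-device ML and return its verdict
}

/// Returns true if the image should be displayed.
func handleIncomingImage(_ image: Data,
                         parentEnabledFeature: Bool,
                         askChild: () -> ChildChoice) -> Bool {
    // Opt-in: if the parent hasn't enabled the feature, nothing is scanned.
    guard parentEnabledFeature, looksLikeNudity(image) else { return true }
    // Warn the child; the decision stays on the device and nobody is notified.
    return askChild() == .view
}
```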
 
Your wireless carrier already knows exactly where you are at any given moment, and will give this information to your government at a moment's notice. Any website you visit will happily turn over this information as well when required to by law. And your VPN is probably lying to you.

Which OS your phone runs makes absolutely no difference in this regard.
Huh? Location privacy is a whole other battle, and one that has been largely lost, as you say. This feature implements on-device image scanning, just like the previous one about CSAM. I'm not sure why, but it seems that sentiment about on-device scanning around here has done a 180 since then. I'm still here feeling like personal devices should be, you know, personal.
 
It seems this scans for nudity and tells the recipient that a picture might contain it. No informing of anyone else takes place.


This isn't doing anything that existing infrastructure in iOS hasn't already been capable of doing for over a decade. I guess you just don't know what software is and how it works.
Sorry man, you did not read/understand the article, and I guess you've never developed an app.
"by messaging a trusted grown-up in their life." - So the basic functionality is there. And I only said that Apple is moving in this direction, and it is.

It doesn't need to change the infrastructure. I think Apple's next move will be to change the App Store guidelines: every messenger/social app/you name it will have to implement it, or it becomes an 18+ app. Simple as that.

Anything else is completely useless. Children are using Snapchat, WhatsApp, Telegram, Instagram - oh man, do you know the amount of nudity on Insta? So this feature is useless until every app implements it, and Apple knows this, too.
 
The children are not under Apple's control, they are under their PARENTS' control, rightfully so. Read the article!
This is assuming the option is only enabled on child devices by the parent and not by default by Apple.

Good child protection always starts and ends with Apple having the tools available but the parents having 100% of the say over whether their child's devices have the features enabled or not. It should also only be on child devices (via parental consent), not be added spyware that every Apple device is infected with.

CSAM detection fails this because the parents do not have a say, and everyone else suffers as well. CSAM detection also literally searches your information without a police warrant. Another reason it should NEVER exist in its current form on any Apple device.
 
Well then, explain it to me...

I always thought that the framework scans for nudity and informs a third party. So you're saying that's not correct?

This framework is the foundation. At the moment it may only contact a family member, but a third party is a third party. The framework could very easily be extended to contact the authorities as well - it is just a tiny step away.



I guess you just don't know what software is and how it works.


Can you even read past a 3rd grade comprehension level?
 
The parents are, more often than not, the offenders.

Apple are in the business of communication. Privacy is a big part of communication, as is nefarious activity. Whether it's spam or something else nefarious, communication companies have a responsibility.

Anyone else here let their ISP or email provider kick emails into their spam or junk folder?
Taken in the context of the article, "parents are more often than not the offenders" would mean that they send inappropriate pictures to their kids. I don't think that's what you meant.

On the major point, though, your statement is too general. These are two different methods of communication.

This is specifically about messages, which are private communications. Just like mail. Just like phone calls. They should not be snooped on by anyone without a warrant or explicit permission. E-mail is the same, and the spam filters for iOS and macOS run locally as well. You give Gmail and Hotmail permission to scan your mail when you use their services. And what do they do with the data they get from it? It isn't for your benefit.

I agree with you on public, posted communications like Twitter, FB, etc. Those networks are different: rather than direct communication to another entity, these messages are posted for all to see. Owners of these services do have a responsibility to monitor and report on them, and possibly more. For this type of communication, companies do need to scan messages (and they are also often the owner of those messages, so it's easier for them to make the case to do so).
 
Taken in the context of the article, "parents are more often than not the offenders" would mean that they send inappropriate pictures to their kids. I don't think that's what you meant.

On the major point, though, your statement is too general. These are two different methods of communication.

This is specifically about messages, which are private communications. Just like mail. Just like phone calls. They should not be snooped on by anyone without a warrant or explicit permission. E-mail is the same, and the spam filters for iOS and macOS run locally as well. You give Gmail and Hotmail permission to scan your mail when you use their services. And what do they do with the data they get from it? It isn't for your benefit.

I agree with you on public, posted communications like Twitter, FB, etc. Those networks are different: rather than direct communication to another entity, these messages are posted for all to see. Owners of these services do have a responsibility to monitor and report on them, and possibly more. For this type of communication, companies do need to scan messages (and they are also often the owner of those messages, so it's easier for them to make the case to do so).
I mean, all this is doing is using the same technology that the People tab in Photos is using.
It's not sending this information to anyone, it's not alerting any authorities; it's just using AI to put up a warning if you have this feature enabled, which most won't.
Also, as a side note, most phone calls can easily be tracked, along with SMS text messages and, depending on the provider, emails as well. Not exactly the most secure forms of communication.
I would trust iMessage with encryption enabled before I would trust Gmail or T-Mobile.
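To make the "same technology" point concrete: Vision, the framework behind things like the People tab, already lets any app classify an image entirely on device. A minimal sketch (Apple's actual nudity model isn't public; this only shows that classification like this runs locally and sends nothing anywhere):

```swift
import Foundation
import Vision

/// Classifies an image entirely on device using Vision's built-in taxonomy.
/// No network access is involved; results stay in this process.
func classifyLocally(imageURL: URL) throws -> [(label: String, confidence: Float)] {
    let request = VNClassifyImageRequest()
    let handler = VNImageRequestHandler(url: imageURL, options: [:])
    try handler.perform([request]) // the model runs on-device
    let observations = request.results as? [VNClassificationObservation] ?? []
    return observations.map { ($0.identifier, $0.confidence) }
}
```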
 
Then it seems to have no substance.
It’s like broth with no meat or vegetables.
I don’t know, I can see where it could be useful.
There have been several stories of people getting extremely explicit photos from random phone numbers out of the blue, so I can completely understand why a warning for children could be useful.
Obviously it’s not a 100% perfect solution, but that’s really not possible without actually invading privacy.
 
I mean, all this is doing is using the same technology that the People tab in Photos is using.
It's not sending this information to anyone, it's not alerting any authorities; it's just using AI to put up a warning if you have this feature enabled, which most won't.
Also, as a side note, most phone calls can easily be tracked, along with SMS text messages and, depending on the provider, emails as well. Not exactly the most secure forms of communication.
I would trust iMessage with encryption enabled before I would trust Gmail or T-Mobile.
With you 100%. And while you're right that they can be, it is technically illegal, for all that's worth. I was making a legal/moral comparison, less so a technological one. And even that's funny: while one can't read the contents of a message, the sender and receiver are fair game, which is often enough for trouble.
 
I don’t know, I can see where it could be useful.
There have been several stories of people getting extremely explicit photos from random phone numbers out of the blue, so I can completely understand why a warning for children could be useful.
Obviously it’s not a 100% perfect solution, but that’s really not possible without actually invading privacy.

Thank you. I agree with you, and your taking the time to help me understand is much appreciated.
 
Sorry man, you did not read/understand the article, and I guess you've never developed an app.
"by messaging a trusted grown-up in their life." - So the basic functionality is there. And I only said that Apple is moving in this direction, and it is.
You see, when one reads a sentence, words within the sentence matter. For the sake of reading comprehension, you do not simply take atomistic segments out of complete sentences in order to make them fit your view. Take the complete sentence that you broke apart, for example: "Apple also presents children with ways to get help by messaging a trusted grown-up in their life." This does not say that Apple messages a trusted grown-up, but that it presents a child with ways to message a trusted grown-up. Furthermore, reading comprehension requires context not just at the sentence level, but at the paragraph level and, for multi-paragraph trains of thought, at the full-article level. If you continue reading a bit further, you'll come across this:
When Apple first described Communication Safety in August, there was a feature designed to notify parents if children opted to view a nude photo after being warned against it. This has been removed.

If a child is warned about a nude photo and views it anyway, parents will not be notified, and full autonomy is in the hands of the child. Apple removed the feature after criticism from advocacy groups that worried it could be a problem in situations of parental abuse.

Hopefully that helps you get an "A" on the next reading exam!
 
Huh? Location privacy is a whole other battle, and one that has been largely lost, as you say. This feature implements on-device image scanning, just like the previous one about CSAM. I'm not sure why, but it seems that sentiment about on-device scanning around here has done a 180 since then. I'm still here feeling like personal devices should be, you know, personal.

The Photos app has done on-device scanning for over a decade.

Anti-virus software has done on-device scanning since the '90s.
 
This is assuming the option is only enabled on child devices by the parent and not by default by Apple.

Good child protection always starts and ends with Apple having the tools available but the parents having 100% of the say over whether their child's devices have the features enabled or not. It should also only be on child devices (via parental consent), not be added spyware that every Apple device is infected with.

CSAM detection fails this because the parents do not have a say, and everyone else suffers as well. CSAM detection also literally searches your information without a police warrant. Another reason it should NEVER exist in its current form on any Apple device.

The CSAM detection system would have searched your device with your approval, given when installing the OS or enabling iCloud Photo Library.
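For context on what "searched" means there: the proposal was to compute a perceptual hash of each photo and test it against a database of hashes of known images, not to "look at" the pictures themselves. A toy sketch of that matching step (Apple's NeuralHash and its private set intersection protocol are far more involved; the hash function here is a made-up stand-in):

```swift
import Foundation

// Toy illustration of hash-set matching; not Apple's NeuralHash or PSI protocol.
typealias PerceptualHash = UInt64

/// Made-up stand-in: a real perceptual hash is designed to survive
/// resizing, recompression and similar edits.
func perceptualHash(of image: Data) -> PerceptualHash {
    PerceptualHash(truncatingIfNeeded: image.hashValue)
}

/// Tests membership in the known-hash set. The image content itself is
/// never interpreted or transmitted in this step.
func matchesKnownSet(_ image: Data, knownHashes: Set<PerceptualHash>) -> Bool {
    knownHashes.contains(perceptualHash(of: image))
}
```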
 
Well then, explain it to me...

I always thought that the framework scans for nudity and informs a third party. So you're saying that's not correct?

This framework is the foundation. At the moment it may only contact a family member, but a third party is a third party. The framework could very easily be extended to contact the authorities as well - it is just a tiny step away.

Apple removed the functionality where it would inform the parent in the Family Sharing setup. It's in the article!

Apple could also very easily insert code that sent all iMessages to random people if they wanted to! Apple is in almost full control of their own software running on the device. That's why you have to trust Apple to an extreme degree if you are using their software and hardware.
 
The Photos app has done on-device scanning for over a decade.

Anti-virus software has done on-device scanning since the '90s.
Look, let's not quibble about words here; all kinds of software "scans" files. This is different because of the intent: to monitor and limit the behavior of the user. The same slippery slope that existed for CSAM detection exists here too.
 
So they've developed the technology and had planned to enable it by default, etc. Following a backlash they've pared back the deployment and made it non-trivial to access and scrutinise, so it'll be on everyone's phone (it likely already is), but only a vastly smaller subset will be able to enable it, and it'll go largely unnoticed by the majority.

The notification function is the bit that alerts users that the scanning feature is there, but in all likelihood the feature could be enabled and scanning in the background with just the reporting disabled by the options. We would be none the wiser, but Apple would get that telemetry data to help them refine the process for a later date, when they can tweak it and try to market it as a feature again.
It'll of course be used as a stepping stone to mandatory scanning of all your photos for CSAM.

We all want to stop CSAM, but I don't want to be spied on like this. We pay taxes for police services to stop anti-human behaviour like CSAM; we don't pay Apple to do that. I'm happy for Apple to provide tools to police services to help in their efforts to combat these things, but those tools should be focussed and targeted at the criminals, not at everyone, including innocent, law-abiding citizens.
 