Will the entire Apple ecosystem be made like this? Permanent inspections happening inside all end-user devices?
Yeah. When you "buy" an Apple product, you're no longer the owner of it. You're basically just paying rent for the privilege of using these arrogant bastards' boutique products.

As much as I love my current mbp, I can't in good conscience buy another product from this company.

Also, while this laptop still does its job, I will never upgrade past Catalina. It's still on Mojave right now.
 
That is correct, but a physical Apple employee (that is, a human being) will be reviewing the pictures once they get flagged. That does not sound like an AI to me.

And think about this. An Apple employee will surely see every picture of our little children. And yes, many parents take pictures of their babies having a bath. How is their privacy protected this way?

I think I'm just going low-tech, back to the early 2000s. They won't stop with this.

Imagine this situation: China, which has enough leverage over Apple to make them its puppet, says: these are the picture hashes you have to look for, they are child porn, but in fact they are pictures taken at protests by the opposition. Apple has just located its customer and sentenced them to death...
 
Apple's already been caught secretly recording and screening our private conversations, so why is everyone up in arms over THIS??

Siri is such a piece of **** I’m not sure why anyone would leave it on in the first place😀

I owe a special thank you to Tim Cook and Apple for making me pay attention to my privacy. Guess I won't need that extra iCloud storage anymore, or a new iPhone, iPad, or Mac, because I'm not upgrading to any new iOS or macOS versions, and neither are my wife and kids. That's 12 devices off Apple's future revenue. Hope many other people will do the same!
 
This is a post found today in a German forum (text in German):

[Attached screenshot of the German forum post]

It starts by citing specific items of German law, and expecting problems with EU laws.

The gist of it is that companies using iPhones as company phones are already raising eyebrows, and at least one "billion-Euro-class company" (my comment: presumably multinational) has already started amending company policies regarding company phones, according to the post.

The entry also comments on the continued usability of iPads for medical emergency personnel, where extreme privacy is required regarding photos of damaged bodies or patient wounds, since the images are used for medical documentation purposes.

In short (my comment):

It seems that even if the EU (and Germany) is not to be immediately included in the scanning of any or all content on any Apple product, it is unacceptable. Any suspicion that "a flip of a bit/software switch" - intended or not, or involving a bug in firmware or an error in distribution - could switch it on will have repercussions for Apple sales. Starting "now".

Apple's problem is that CSAM scanning is official policy, supported by official and public statements by Apple leadership.

IF this becomes the "standard approach" in Europe, Apple will feel the result, in some form or amount, on their balance sheet.

Personally, I do not think it matters one bit now should Apple decide to cancel the introduction of CSAM scanning (highly improbable); the damage is done, and trust is lost. Nothing management does now can remedy the loss of trust, unless the complete upper echelons of Apple management are forced to resign by stockholders. Maybe not even then.

Regards
 
Will the entire Apple ecosystem be made like this? Permanent inspections happening inside all end-user devices?
I believe it will be. I think it will expand to scanning of iMessage text as well, for those who have Messages in iCloud turned on. A lot of pedos send sick text messages to kids, not just pictures. I think that will be the next step for Apple and the NCMEC.
 
I don't think that was my argument. I was admittedly trying to steer the discussion away from race, as I don't really see this being a racial issue. Of course, this is easy for me to say as a white guy, so I do appreciate the perspective. My mileage certainly will vary, as I lack the experience to fully relate. And I'm not saying that I am right either 🤠


I don't get why people have to say that their opinion is lesser. It really makes it look like they are saying, "I'm white, therefore I'm a lesser person." People should be able to state an opinion without a qualifying statement.

You don't have to roll over and show your belly out of insecurity.
 
I don't get why people have to say that their opinion is lesser. It really makes it look like they are saying, "I'm white, therefore I'm a lesser person." People should be able to state an opinion without a qualifying statement.

You don't have to roll over and show your belly out of insecurity.
Exactly. It's all lies in the first place, but regardless, it's just not in my blood to submit.
 
If Tim Cook cares so much about stopping CSAM (he doesn't), he should start with members of Congress, rather than random MacBook owners.
 
Then that hash can be changed by image manipulation.
Do pay attention. The hash function employed is specifically designed to be resistant to basic image manipulation. If you put a kiddie abuse image into Photoshop and convert it into a photo of the Eiffel Tower, that would obviously yield a different hash. But then it wouldn't be a kiddie abuse image, which is the whole point of having an identifying hash.
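
To make the "resistant hash" idea concrete, here is a minimal sketch of the simplest perceptual hash, an average hash (aHash), in Python. This is an illustration only, not Apple's NeuralHash (which uses a neural network rather than pixel averaging), but the matching idea is the same: light edits such as recompression or resizing flip only a few bits, while a genuinely different image yields a very different hash, so matching compares bit distance against a threshold rather than testing exact equality.

```python
# Illustrative average hash (aHash) - NOT Apple's NeuralHash, just the
# simplest perceptual hash, to show why lightly editing an image
# usually leaves its hash nearly unchanged.
from PIL import Image  # pip install Pillow

def average_hash(path: str, size: int = 8) -> int:
    """Shrink to size x size grayscale, then emit one bit per pixel:
    1 if the pixel is brighter than the image mean, else 0."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Count differing bits; a small distance means 'visually similar'."""
    return bin(a ^ b).count("1")

# Hypothetical usage: a recompressed copy of the same photo typically
# differs by only a few of the 64 bits.
# print(hamming(average_hash("photo.jpg"), average_hash("photo_copy.jpg")))
```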
 
Do pay attention. The hash function employed is specifically designed to be resistant to basic image manipulation. If you put a kiddie abuse image into Photoshop and convert it into a photo of the Eiffel Tower, that would obviously yield a different hash. But then it wouldn't be a kiddie abuse image, which is the whole point of having an identifying hash.
It is software scanning an image.

It is software, deciding what resembles (!!) something bad.

Apple - of course - always releases perfect software without any bugs. Ermmm…. In the exceptional cases, ermmm… where bugs crop up, they are immediately removed. Ermmm…. Sometimes. Or not. Whatever the reason.

iOS is currently at version 14.7.1.

You can view this as a quick reaction on Apple's part, but you could also ask the question: why so many updates, some only days apart, in less than a year? This is software that has been through several public beta tests, and hopefully internal Apple quality control and code vetting. Or is it a sign of sloppy or bad programming, where fixing one bug introduces one or many new bugs, to be fixed later?

What to look for is based on what someone you don't know has asked (forced?) Apple to look for. And Apple legally cannot verify it; in many countries, the thought of Apple being allowed to verify anything not heavily "redacted" would be regarded as hilarious.

In other words, YOU also have to blindly trust that everything released to Apple has been produced conscientiously and thoroughly, with the best of intentions - without errors or willful malice.

Do you believe that all… ermmm… distinguished members of the US Congress would even qualify for such a feat in matters far less critical for the well-being of countless innocent people around the world?

Will Apple even want or be able or even be allowed to keep a public tally of unwanted “side effects”? Would you want to be the target of a ‘side effect’?

Criteria you do not know, code you have no insight into - you or competent programmers cannot, ermmm, are not allowed to, verify the quality, the reach, and the basic workings of the code as it stands now.

This code will be made available for use by third parties: presumably vetted (or not) by Apple. Where governments are involved - some, if not all - Apple's vetting procedure may turn out to be just "Yessir. Will do!". Apple can - maybe - protest, but in most cases this will not necessarily have any effect. Public protest on Apple's part about technical matters unknown to anyone is a no-no. Unless Apple leaves that market, Apple does what the authorities demand. Law varies a LOT between countries, and especially, ermmm…, "interpretation" leaves governments a LOT of "wiggle room".

If you get flagged in the end, how will you be treated? That depends on who you are: color, religion, and - especially - wealth and influence. The latter two are the only universally mitigating circumstances.

Will treatment even be equal in all states or counties of the US?

What about treatment in the roughly 200 countries of this world? Some have dramatically different views on human rights and how suspects can be treated.

Innocent or not.

Note: I have - until now - not even mentioned the issue of “privacy”, which you may choose to volunteer to give up. Others may not.

Regards
 
And think about this. An Apple employee will surely see every picture of our little children. And yes, many parents take pictures of their babies having a bath. How is their privacy protected this way?

I think I'm just going low-tech, back to the early 2000s. They won't stop with this.

Imagine this situation: China, which has enough leverage over Apple to make them its puppet, says: these are the picture hashes you have to look for, they are child porn, but in fact they are pictures taken at protests by the opposition. Apple has just located its customer and sentenced them to death...
This is 100% false. You didn't read anything about this before posting, did you? If you had, you would know how this works and that no one will be looking at pictures of your children.
 
I don't get why people have to say that their opinion is lesser. It really makes it look like they are saying, "I'm white, therefore I'm a lesser person." People should be able to state an opinion without a qualifying statement.

You don't have to roll over and show your belly out of insecurity.


Very odd concern of yours over how I hold myself. Thanks, Dad, but I'm not the insecure one.🤠
 
I wonder if they're gonna brag about the lowest adoption rate of a new version of iOS...
It would be nice if this persisted. But the consoooomerists' attention span, and their capacity to even have principles, much less stick to them, are almost nonexistent. Most of the people complaining about this today will be posting "Apple, just take my money!!!1" by the time a new marketing announcement comes out.

That's why nothing ever changes with corporations like Apple. They can just lol and move on to the next "innovation."
 
It would be nice if this persisted. But the consoooomerists' attention span, and their capacity to even have principles, much less stick to them, are almost nonexistent. Most of the people complaining about this today will be posting "Apple, just take my money!!!1" by the time a new marketing announcement comes out.

That's why nothing ever changes with corporations like Apple. They can just lol and move on to the next "innovation."
User name checks out. (I kid, I kid...) That might be, but it won't be me doing it. I am done.

Also, let's be clear. Apple has made this easy by not really being innovative in a software sense for a decade; everyone has caught up.
 
Let me ask this. Since this is currently only legal in the United States, if that…

and this is not turned on for phones from Europe or other countries,

what happens if I come from Italy into the United States with an image that may not be tagged in a European database but is tagged in an American one?

Or vice versa: when the eventual rollout in different countries happens, I assume they will have different standards, hashes, images, etc., unless this is one giant worldwide database everyone pulls from.

Honestly curious: how will that work? If it is illegal to have a homosexual image in country "a", and a visitor comes from country "b" where it is legal, will they get flagged? Arrested? Will it be reported to their own jurisdiction or to local law enforcement?
 
Let me ask this. Since this is currently only legal in the United States, if that…

and this is not turned on for phones from Europe or other countries,

what happens if I come from Italy into the United States with an image that may not be tagged in a European database but is tagged in an American one?

Or vice versa: when the eventual rollout in different countries happens, I assume they will have different standards, hashes, images, etc., unless this is one giant worldwide database everyone pulls from.

Honestly curious: how will that work? If it is illegal to have a homosexual image in country "a", and a visitor comes from country "b" where it is legal, will they get flagged? Arrested? Will it be reported to their own jurisdiction or to local law enforcement?
My guess would be that they will check based on the locale of your Apple ID account. If you are using a US Apple ID, then any uploads to iCloud Photos will be hash-checked, regardless of where you are in the world. If you are using a non-US Apple ID, your iCloud Photos uploads will not be hash-checked even if you are physically in the USA.
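
As a sketch only, here is that guess expressed as logic in Python. The names and the region check are hypothetical, since Apple has not published how (or whether) such region gating would work:

```python
# Hypothetical sketch of the "gate on Apple ID region" guess above.
# Apple has not documented this; names and logic are illustrative only.

def should_hash_check(account_region: str, icloud_photos_on: bool) -> bool:
    """Per the guess: only US-region accounts with iCloud Photos enabled
    get their uploads hash-checked, wherever the device physically is."""
    return account_region == "US" and icloud_photos_on

# An Italian Apple ID visiting the USA would thus not be checked:
print(should_hash_check("IT", True))   # False
print(should_hash_check("US", True))   # True
```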
 


Apple employees are now joining the chorus of individuals raising concerns over Apple's plans to scan iPhone users' photo libraries for CSAM, or child sexual abuse material, reportedly speaking out internally about how the technology could be used to scan users' photos for other types of content, according to a report from Reuters.


According to Reuters, an unspecified number of Apple employees have taken to internal Slack channels to raise concerns over CSAM detection. Specifically, employees are concerned that governments could force Apple to use the technology for censorship by finding content other than CSAM. Some employees are worried that Apple is damaging its industry-leading privacy reputation.
Apple employees in roles pertaining to user security are not thought to have been part of the internal protest, according to the report.

Ever since its announcement last week, Apple has been bombarded with criticism over its CSAM detection plans, which are still expected to roll out with iOS 15 and iPadOS 15 this fall. Concerns mainly revolve around how the technology could present a slippery slope for future implementations by oppressive governments and regimes.

Apple has firmly pushed back against the idea that the on-device technology used for detecting CSAM material could be used for any other purpose. In a published FAQ document, the company says it will vehemently refuse any such demand by governments.
An open letter criticizing Apple and calling upon the company to immediately halt its plan to deploy CSAM detection has gained more than 7,000 signatures at the time of writing. The head of WhatsApp has also weighed in on the debate.

Article Link: Apple Employees Internally Raising Concerns Over CSAM Detection Plans
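
An aside, to make the employees' repurposing concern concrete: Apple's public description is that hashes of photos being uploaded to iCloud are matched against a database of known CSAM hashes, with human review only after a threshold number of matches (Apple has cited a figure of roughly 30). The sketch below is hypothetical and heavily simplified, with invented names; what it illustrates is that nothing in the matching logic itself is specific to CSAM - swap in a different hash database and the same code flags whatever content the database supplier chooses.

```python
# Hypothetical, heavily simplified sketch of threshold-based hash matching
# as publicly described. Not Apple's implementation; illustrative only.
MATCH_THRESHOLD = 30  # Apple has cited roughly 30 matches before review

def count_matches(photo_hashes: list[int], database: set[int]) -> int:
    """How many of an account's uploaded photo hashes appear in the database."""
    return sum(1 for h in photo_hashes if h in database)

def escalate_to_human_review(photo_hashes: list[int], database: set[int]) -> bool:
    """Flag an account for review only once matches reach the threshold.
    Note: this logic is database-agnostic - the 'slippery slope' worry."""
    return count_matches(photo_hashes, database) >= MATCH_THRESHOLD
```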
Many people have stated that Apple is required to do this, as a service provider.

Someone also linked to the actual law: 18 USC 2258A.

Here's an interesting part of this. 2258A, section (f)

(f) Protection of Privacy.-Nothing in this section shall be construed to require a provider to-

(1) monitor any user, subscriber, or customer of that provider;

(2) monitor the content of any communication of any person described in paragraph (1); or

(3) affirmatively search, screen, or scan for facts or circumstances described in sections (a) and (b).

Now... read that again carefully. NOTHING in this section shall be construed to *require* a provider to...
... MONITOR ANY USER, SUBSCRIBER, OR CUSTOMER
... MONITOR THE CONTENT OF ANY COMMUNICATION...
... AFFIRMATIVELY SEARCH, SCREEN OR SCAN FOR FACTS OR CIRCUMSTANCES.


That being said, this is a CHOICE by Apple... and NOT A REQUIREMENT. In fact, the law specifically says that they are NOT REQUIRED to scan, monitor, or search for CSAM - only to report it if it is discovered.
 
My takeaway from this as a father of a son -- can the algorithms tell the difference between 13-, 16-, and 18-year-old bodies? This is going to affect teenage boys more than anyone else. C'mon, they are hormone factories. I'd pay thousands for a photo of moments with my first g/f in high school (and we were both jailbait, but she was 6 months older), and that was 39 years ago. That's not being gross; we're just humans.

If I had a photo from 39 years ago of my lovely first g/f's nubile body -- am I now a pedo? So many shades of grey in this.....
Looks like a lot of us are pedos now. https://www.cnn.com/2021/08/25/ente...irvana-nevermind-lawsuit-scli-intl/index.html
 
Constitution of United States of America 1789 (rev. 1992)

Amendment IV

The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no Warrants shall issue, but upon probable cause, supported by Oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized.

The Constitution only limits what the government can do. It has no power over, and does not concern, businesses.
 
Well, they are using both, actually. They're using CSAM hashes to look for those pictures, AND they're going to be implementing AI photo recognition for people under 18 sending nudes. This article has only been about the CSAM portion, BUT there's also the "Family plan" stuff, where if a person under 18 sends what Apple considers a nude, it asks them if they REALLY want to do it, and if they say yes, it sends a copy to their parents.

Only if the parents turn that function on.
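
A hypothetical sketch of the flow these two posts describe, with the correction applied: the parent notification fires only when the parents have turned the feature on. All names here are invented, and Apple's announced design differs in details (for example, it limits parent notifications to younger children), so treat this purely as an illustration of the thread's description:

```python
# Hypothetical sketch of the messaging flow described above - the posters'
# description, not Apple's published design. Names are invented.
from dataclasses import dataclass

@dataclass
class Account:
    age: int
    parental_notifications_on: bool  # opt-in by the parents, per the reply above

def handle_outgoing_flagged_image(sender: Account, confirmed_anyway: bool) -> str:
    """Return the system's action, per the description in this thread."""
    if sender.age >= 18:
        return "send normally"             # feature targets minors only
    if not confirmed_anyway:
        return "warn and hold message"     # "do you REALLY want to do this?"
    if sender.parental_notifications_on:
        return "send, and notify parents"  # only if parents enabled it
    return "send after warning"

# Example: a 15-year-old whose parents enabled notifications, who confirms:
print(handle_outgoing_flagged_image(Account(15, True), True))
```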
 
Many people have stated that Apple is required to do this, as a service provider.

Someone also linked to the actual law: 18 USC 2258A.

Here's an interesting part of this. 2258A, section (f)

(f) Protection of Privacy.-Nothing in this section shall be construed to require a provider to-

(1) monitor any user, subscriber, or customer of that provider;

(2) monitor the content of any communication of any person described in paragraph (1); or

(3) affirmatively search, screen, or scan for facts or circumstances described in sections (a) and (b).

Now... read that again carefully. NOTHING in this section shall be construed to *require* a provider to...
... MONITOR ANY USER, SUBSCRIBER, OR CUSTOMER
... MONITOR THE CONTENT OF ANY COMMUNICATION...
... AFFIRMATIVELY SEARCH, SCREEN OR SCAN FOR FACTS OR CIRCUMSTANCES.


That being said, this is a CHOICE by Apple... and NOT A REQUIREMENT. In fact, the law specifically says that they are NOT REQUIRED to scan, monitor, or search for CSAM - only to report it if it is discovered.
It's funny how gung-ho Apple is about immediate implementation of this, as if it's perfect and presumed bug-free. Meanwhile, their own advertised features like Universal Control are still held back. Clearly there's a different intention behind it.
 