HELP! Big question I even hesitate to ask: what is the problem with the M1? My family and I are long-time Apple users (over 25 devices) and feel totally betrayed by this. I've been speaking out against it everywhere. Even today, on the weekend, I am in the process of de-Appling our family and business as much as possible, signing out of iCloud and resetting devices to sell. It will take a few weeks to complete, though (we were all in). However, our business uses Macs, including a 2020 M1 MBA. I was planning on sticking with it on Big Sur for a few months until we can move to Windows (which also has the software we need). Is the M1 chip part of the problem here, even signed out of iCloud? Thanks in advance.
Nope, M1 Macs aren't any more subject to this new feature than Intel-based ones once either is updated to Monterey. Staying on Big Sur for the time being is indeed the temporary solution you're looking for.
 
Oh please. Stop that crazy conspiracy nonsense.

As much as I loathe Apple's spyware plans, let's keep our feet on the ground here.
My feet are on the ground. There are multiple angles to this story, and the fact that Big Tech is integrated into the political scene, setting the tone and suppressing information and freedom of speech, is well documented.
Not some "conspiracy nonsense". There are ongoing legislative moves against Big Tech in the US and the EU.
But feel free to think what you like.


 
Federighi and Apple have essentially undone decades of trust with this move. What they're admitting to is they can, and might, violate our right to privacy without our permission or knowledge. Once the software is on our phones we are vulnerable to whatever Apple, and quite possibly, the government decide to do with this new capability. Will Federighi's or Tim Cook's successor keep their promise of no snooping? What if the next Trump-thumping autocratic administration decides it wants to see what people are doing and saying on their phones? There is a nightmare of possibilities just waiting to happen here. You're doing it wrong, Apple. Stop!
If you're so worried about the latest development, why are you not questioning whether Apple is already doing this (i.e. surveillance) with iOS 14 and earlier? What is Apple's motivation for doing this?

If they really wanted to snoop on iOS/macOS users, why would they openly announce their CSAM work to the world? They could have just pushed it as part of iOS 15 and kept quiet about it, and nobody would be any wiser. Do you really think Apple couldn't foresee this pushback?

Apple has had the capability to identify all sorts of material on the devices it designs for at least 10 years, and that capability has improved year over year.

I'm of the view that Apple is planning to implement E2EE for iCloud Photos, since this development makes server-side checks redundant. If the checks are done at the server end, there's no way E2EE can be done.
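A toy sketch of why that last point holds, with made-up names and a stand-in XOR "cipher" (nothing like Apple's actual crypto): once the photo is encrypted on the device with a key the server never sees, the server can't compute or match any fingerprint of it, so any matching has to happen on-device before upload.

```python
import hashlib
import os

def toy_encrypt(data: bytes, key: bytes) -> bytes:
    # Illustrative XOR stream "cipher" standing in for real E2EE; not secure.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

photo = b"...raw photo bytes..."
device_key = os.urandom(32)  # under E2EE this key stays on the user's devices

# On-device: plaintext is available, so a fingerprint CAN be computed here.
on_device_fingerprint = hashlib.sha256(photo).hexdigest()

# What an E2EE server receives: ciphertext only.
uploaded = toy_encrypt(photo, device_key)

# Server-side: hashing the ciphertext says nothing about the original photo,
# so a server-side check against known fingerprints is impossible.
server_fingerprint = hashlib.sha256(uploaded).hexdigest()

print(on_device_fingerprint == server_fingerprint)  # False
```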
 

This interview is emotionally charged, but factually inaccurate. Apple is not scanning for any “potential child abuse material”, as the presenter says, but for images already identified as such and entered into the CSAM database. For iMessage, this is an opt-in parental feature for children below 12 years of age. It might be worth rewatching Craig’s interview with the WSJ:


Apple’s own paper might be helpful too:


As you will see and hear, Apple’s announced system is very clever. The concern is that once it is in place it could potentially be used for a much wider range of purposes, even though Apple currently assures us it would resist any such abuse.
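To make the "already identified" point concrete, here is a rough sketch of that kind of database matching. I'm using an ordinary SHA-256 hash as a stand-in for Apple's NeuralHash (a perceptual hash designed to survive resizing and recompression), and the database entry is made up; the point is simply that only images already fingerprinted in the database can ever match.

```python
import hashlib

# Stand-in for the database of fingerprints of already-identified images.
# The real system ships blinded hashes, not a plain set like this.
known_csam_hashes = {
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def fingerprint(image_bytes: bytes) -> str:
    # SHA-256 here; the real system uses a perceptual hash so resized or
    # recompressed copies of a known image still produce the same fingerprint.
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_database(image_bytes: bytes) -> bool:
    # Only images whose fingerprint is already in the database can match;
    # a brand-new photo never matches, regardless of what it depicts.
    return fingerprint(image_bytes) in known_csam_hashes

print(matches_known_database(b"foo"))          # True: fingerprint is listed
print(matches_known_database(b"a new photo"))  # False: never seen before
```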
 
If they really wanted to snoop on iOS/macOS users, why would they openly announce their CSAM work to the world? They could have just pushed it as part of iOS 15 and kept quiet about it, and nobody would be any wiser. Do you really think Apple couldn't foresee this pushback?
If they knew this pushback would come, why did they announce it in such a shambolic fashion, and why were their communication and recovery plans so bad? My grandmother could have done better, and she has been dead for 25 years.
 
If they knew this pushback would come, why did they announce it in such a shambolic fashion, and why were their communication and recovery plans so bad? My grandmother could have done better, and she has been dead for 25 years.
And of course everyone in the "spin business" knows news you hope will disappear is always dropped Friday afternoon... in the hopes it doesn't survive to Monday.
 
If you're so worried about the latest development, why are you not questioning whether Apple is already doing this (i.e. surveillance) with iOS 14 and earlier? What is Apple's motivation for doing this?

If they really wanted to snoop on iOS/macOS users, why would they openly announce their CSAM work to the world? They could have just pushed it as part of iOS 15 and kept quiet about it, and nobody would be any wiser. Do you really think Apple couldn't foresee this pushback?

Apple has had the capability to identify all sorts of material on the devices it designs for at least 10 years, and that capability has improved year over year.

I'm of the view that Apple is planning to implement E2EE for iCloud Photos, since this development makes server-side checks redundant. If the checks are done at the server end, there's no way E2EE can be done.
I have not been able to get an answer to this, but I'm pretty sure this is already being done by every cloud system and every social media site? The only change is that Apple creates the hash on your phone and not on their servers.
 
If Apple is smart, Craig’s interview will be the last you hear about this from them.
If the public are smart, they will keep bringing up this spyware issue on every platform (digital and otherwise) they are on. Fighting against wrongdoing does not stop just because one side decides to go silent on the issue.

Also, I really hope someone sues Apple over this. Apple can't stay silent when cross-examined under oath.
 
aaand here we go with the slippery slope crap again. I guess that's all you guys have when you can't produce any evidence that Apple is doing anything wrong.

👋
The evidence is here for everyone to see. Apple quietly added spyware to iOS a few years ago, and only now is the cat out of the bag, with Apple trying to justify its PR and security nightmare. Apple knows what it is doing is not in users' best interests and does not care.
 
When the cops break down my door and seize my computer to examine its contents, my declaration that I didn't know what was there will be laughingly ignored and I will be successfully prosecuted. If I find it first and delete it, I don't have a problem. If I know it is there and do nothing, like Apple is doing when told their users are uploading child porn, I am breaking the law.
The difference is the cops hand you a warrant first.
With their spyware, Apple is purposely avoiding the accountability, checks, and balances that warrants provide.
 
My mind percolates slowly... and likely with many fallacies and misconceptions.

However, if I understand Craig's so-called validation correctly, Apple is content with a user having up to 30 examples of CSAM on their device. Does this mean that Apple has determined that 30 examples are only a mistake, or a passing fad? It would seem to me that if Apple were as serious as it claims, then a single example of such a terrible vice would be reported immediately... Or does this mean the technology is NOT as perfect as Apple claims it to be?

And for the iMessage "feature", it would seem to me the messages are no longer end-to-end encrypted. If software can scan a message for "information", in this case sexually explicit content, that means "someone" has the ability to peek into the message...

Or am I missing something?


I think the 30-match threshold is about false positives.
 
Thank you for your response.

So the system effectively makes 30 mistakes, meaning it thinks there is CSAM, decides it is wrong, and then notifies Apple?

No, if it happens at least 30 times, then they feel it's probably correct. So they check.

The fact that they allow up to 30 flags before checking is, to me, an admission that the system WILL make false positive flags, or that they are not certain. This whole thing is a mistake and pathetic.
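For what it's worth, the arithmetic behind a threshold is easy to sketch. The numbers below are made up (they are not Apple's published figures); they just show how requiring 30 independent matches pushes the chance of an account being falsely surfaced from "happens sometimes" down to "effectively never", even if individual images can occasionally false-match.

```python
from math import exp, factorial

def p_account_flagged(num_photos: int, p_false_match: float, threshold: int) -> float:
    """Chance that at least `threshold` photos out of `num_photos` each produce
    an independent false match, using a Poisson approximation of the binomial."""
    lam = num_photos * p_false_match   # expected number of false matches
    term = exp(-lam) * lam**threshold / factorial(threshold)
    total = 0.0
    for k in range(threshold, threshold + 100):
        total += term
        term *= lam / (k + 1)
    return total

photos = 20_000   # hypothetical library size
p = 1e-6          # hypothetical per-image false match rate (illustrative only)

print(p_account_flagged(photos, p, 1))    # ~0.02: a one-match trigger misfires often
print(p_account_flagged(photos, p, 30))   # ~4e-84: essentially never at 30 matches
```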
 
As you will see and hear, Apple’s announced system is very clever. The concern is that once it is in place it could potentially be used for a much wider range of purposes, even though Apple currently assures us it would resist any such abuse.
Apple cannot promise to defy national law. What is legal and just for us may be subject to prosecution elsewhere. It only takes a simple law requiring existing systems to assess all illegal activity, which may very well include political views, sexual orientation, and more.
 
Where are we supposed to store our photos, then?
Until they make it so that deleting a pic off my phone doesn't delete the cloud backup, I'm uploading all my porn to Google Photos. I couldn't give a **** about them scanning the 1's and 0's to match pedophile filth.

If they really wanted to create great technology, they would make iPhones explode when they identify pedophiles. The Note 7 just set fire to innocent folk... Apple is never first, but always better. Come on, Craig. Blow up some pedos.
 
They know that the "screeching voices of the minority" will not affect iPhone sales. This has been their model for several product cycles. They are "thinning the herd", removing rationally thinking minorities from the user base. The lack of media coverage proves that this move is well synced with some deep political agenda or future legislation that will benefit Apple. Next week nobody will talk about this.
Life goes on.
It's all the freaking news has talked about. It's in every one of my newsfeeds... on CNN, WSJ, MSNBC... I can't say if it's on Fox or Newsmax... assuming not, going by your tin foil hat manifesto.

 
No, if it happens at least 30 times, then they feel it's probably correct. So they check.

The fact that they allow up to 30 flags before checking is, to me, an admission that the system WILL make false positive flags, or that they are not certain. This whole thing is a mistake and pathetic.
Thank you. That was exactly what I was implying and waiting for others to write: if it needs 30 matches, then it seems to me the system is NOT as foolproof as Apple claims...
 
I have not been able to get an answer to this, but I'm pretty sure this is already being done by every cloud system and every social media site? The only change is that Apple creates the hash on your phone and not on their servers.
Everyone does do this already, but they scan ALL OF YOUR PHOTOS. Apple is matching an incomplete/partial set of 1's and 0's (the beginning strings of all the kiddie porn in the database) on your phone when the device starts to upload to iCloud... when it hands off to iCloud, that cloud server matches the other half, and if both halves match in that literal second, then, and only then, is your identifiable info flagged. And if that happens... you're a freaking pedophile. If you're not a freaking pedophile, it will never happen. They also don't report you unless you have a rather large number of kiddie porn matches (30... which is kind of a lot of kiddie porn).
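Very loosely, the flow looks like the sketch below. This skips all of the real cryptography (Apple's design uses private set intersection and threshold secret sharing, so the server learns nothing about an account until the threshold is crossed); the names and numbers here are just illustrative.

```python
import hashlib
from collections import defaultdict

THRESHOLD = 30  # matches required before anything about an account is reviewed

# Stand-in for the known-image fingerprint database held server-side.
known_csam_hashes = {hashlib.sha256(b"a known bad image").hexdigest()}

# --- on the device ----------------------------------------------------------
def make_safety_voucher(image_bytes: bytes) -> dict:
    # The device fingerprints the photo as it is queued for iCloud upload.
    # In the real design this is sealed so the server can't read it directly.
    return {"fingerprint": hashlib.sha256(image_bytes).hexdigest()}

# --- on the server ----------------------------------------------------------
match_counts = defaultdict(int)

def receive_upload(account: str, voucher: dict) -> None:
    if voucher["fingerprint"] in known_csam_hashes:
        match_counts[account] += 1
    # Below the threshold, nothing is surfaced; only at 30+ matches does a
    # human review step (and possibly a report) happen.
    if match_counts[account] >= THRESHOLD:
        print(f"{account}: threshold reached, escalate for manual review")

# Example: a library of ordinary photos never triggers anything.
for i in range(1000):
    receive_upload("alice", make_safety_voucher(f"vacation photo {i}".encode()))
```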

Facebook, Google, Microsoft, and every other photo site look at all your pics daily and report you for kiddie porn. Apple's solution means they never look at your photos at all unless you're a pedo.

I don't get why people, especially tech people, are so upset over this, unless they didn't read, listen to, or watch the explanations of how it works.

And the texting feature... how is it bad to make sure your 11-year-old isn't sexting the universe, or that some pervert isn't sexting his junk to your little kid?
 
Thank you. That was exactly what I was implying and waiting for others to write: if it needs 30 matches, then it seems to me the system is NOT as foolproof as Apple claims...
Well, maybe the reason is that Apple is giving their users the benefit of the doubt, and also it is not their job to nab pedos? They are legally required to ensure that their servers are not used for storing illegal content, and they have to show that they are taking steps to ensure that happens. Otherwise they will land in hot soup for not doing what the law requires of them.
 
Well, maybe the reason is that Apple is giving their users the benefit of the doubt, and also it is not their job to nab pedos? They are legally required to ensure that their servers are not used for storing illegal content, and they have to show that they are taking steps to ensure that happens. Otherwise they will land in hot soup for not doing what the law requires of them.
That is a valid point, I guess. However, as you noted, the material should not be on their servers, and there is already language in the EULA for this purpose. Apple should simply scan the servers...

This is a back door, combined with technology that reads chats on the phone... Remember, even a friendly country such as Australia, albeit a member of the "5 Eyes", has a law that clearly empowers ministers to compel firms to retrain an existing surveillance system on different images, vastly expanding the scope of Apple's proposed snooping.

What I write next will upset many: I completely agree that child pornography is terrible. However, is it demonstrably worse, and does it impact more people, than, say, traffic accidents, with their roughly 35,000 - 40,000 fatalities and many more injuries annually? Should the government not be demanding that tech, GPS, and other companies modify existing systems to report speeding and erratic driving? And should we, the public, not accept it when our iPhones report excessive speed to the relevant authorities?

If this were to happen, I believe many more people would be upset, as it impacts a larger percentage of the population. There would be a greater outcry about invasion of privacy. Yet, as noble as the fight against CSAM is, most people are untouched by this horror. And this is why I believe many see this as the "thin end of the wedge" for additional tools "to ensure YOUR safety" in the not too distant future.
 