
GhostOS

macrumors regular
Mar 25, 2022
110
385
I work in email marketing and the unnecessary Apple Mail privacy changes are causing havoc. So unwarranted and a detriment to the customer - all Apple want to do is make life difficult for other tech firms.

I find this advert misleading - it implies we as a business sell on info about a customer's email engagement. Unhelpful fear-mongering.

We don’t.
Nor does any other company.
Why?
It’s worthless data.
When you have millions of customers in your CRM program, you aggregate the stats (see the sketch below). It helps us understand how we're doing as a channel.

I look forward to Apple putting this ad up in traditional media so I can file a complaint with the ASA.
Good. I'm incredibly happy to see Apple make things more private. You and your company have no business with my email, and no business collecting intrusive data, no matter how "worthless" it might be.
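
A minimal sketch, with hypothetical types and field names, of the channel-level aggregation the quoted poster describes: individual opens and clicks only carry meaning in bulk.

```swift
// Hypothetical record of one email send; real CRMs track far more.
struct Send {
    let opened: Bool
    let clicked: Bool
}

// Aggregate channel-level rates: no single customer's behaviour matters,
// only the totals, which is the poster's point about "worthless" data.
func channelStats(_ sends: [Send]) -> (openRate: Double, clickRate: Double) {
    guard !sends.isEmpty else { return (0, 0) }
    let n = Double(sends.count)
    let opens = Double(sends.filter(\.opened).count)
    let clicks = Double(sends.filter(\.clicked).count)
    return (opens / n, clicks / n)
}
```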
 

hcherry

macrumors regular
Mar 27, 2012
125
390
I’d love to find out what personal data of mine is floating around out there.

I would totally pay money to have someone scour the web, buy from data brokers, etc. to see what of me is out there.
 

gaximus

macrumors 68020
Oct 11, 2011
2,264
4,464
But your phone was programmed by Apple to do that, so why is this not on Apple anyway? Another point is that a company like Apple shouldn't even have suggested that "feature" in the first place.
It's not on Apple because the photos aren't being seen by Apple employees. Saying "Apple scans your photos" sounds like they are looking at your photos. They're not. What's happening is your photos get converted to a hash (there's no way to reverse the hash and get the photo back). Then the hash is compared to other hashes that have been flagged as child porn. This all happens on the phone (maybe in iCloud, as someone mentioned). But the privacy is there: Apple isn't scanning the pictures, no one is seeing your pictures, and no one is selling your data (see the sketch below).

As for iCloud scanning the photos, I'm not sure what legal trouble Apple could be in for hosting child porn, but I assume this is to cover their asses.
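
A minimal sketch of that matching idea, using SHA-256 purely for illustration (Apple's actual proposal used a perceptual hash, NeuralHash, so near-identical copies still match); the flagged-hash table here is hypothetical.

```swift
import Foundation
import CryptoKit

// Hypothetical table of digests of already-known flagged images.
let flaggedHashes: Set<String> = ["a3f5…", "9bc2…"]  // placeholders

// One-way digest: the photo cannot be reconstructed from it.
func isFlagged(_ photoData: Data) -> Bool {
    let digest = SHA256.hash(data: photoData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    // A photo is flagged only on an exact digest match; ordinary family
    // photos produce no match because they are not in the table.
    return flaggedHashes.contains(hex)
}
```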
 
  • Like
Reactions: CarlJ

ipedro

macrumors 603
Nov 30, 2004
6,261
8,566
Toronto, ON
I work in email marketing and the unnecessary Apple Mail privacy changes are causing havoc. So unwarranted and a detriment to the customer - all Apple want to do is make life difficult for other tech firms.

I work in e-commerce, and our company, like any other in our business, relied on this data to make our ad spend and our email marketing more efficient, with higher conversion. And I say "relied" because we understood where Apple was going with this, agreed that customer privacy is important and that their data is their data, and we reinvented how we do business. We built a model that respects their privacy, gives them the decision on whether we're providing value to them, and gives them control over what they're pitched. It has worked wonderfully. Other companies we know of doubled down on the old strategy by trying to get around the restrictions, and some of them no longer exist. Doing the right thing works out in the end, although it may have looked really scary for a while.

I get why you're upset. It sucks if you're losing money and your golden goose has stopped laying golden eggs. But companies and people in e-commerce complaining about privacy controls are simply upset about their own losses; this is about the customer. If you want to appeal to the customer, give them real value, make them want to subscribe, make them want to click on the ad, make them want to tell you what they want - "make them tell you, and tell you again, until they're sick of telling you".


I find this advert misleading - it implies we as a business sell on info about a customer's email engagement. Unhelpful fear-mongering.

We don’t.
Nor does any other company.

That's not what they're saying. They're not talking about you or your business selling your customers' email engagement. Gmail, on the other hand, does exactly what this ad is metaphorically demonstrating: your Gmail activity helps build a profile of your interests, and that data is in fact sold to Google's customers (advertisers) in the form of access to you based on what the data says your interests are. But you go ahead and file that complaint.
 

dk001

macrumors demi-god
Oct 3, 2014
10,723
15,067
Sage, Lightning, and Mountains
That's just hyperbolic. To be clear, I am not in favor of the program and there are legitimate criticisms of it. But it is not treating all customers like criminals. What does exaggerating like that get us?

If Apple is scanning my device for illegality and, if something is suspected, reporting it into a government law enforcement stream without the government having to procure a warrant, that is treating me as a possible criminal until they show otherwise. Neither hyperbole nor exaggeration.

btw - one of the original concerns was exactly that, and this type of "surveillance" has never been challenged in court. If the program were ever initiated, it would be.

That's beside all the other crap that was wrong with it.
 

tazinlwfl

macrumors 6502
Jul 14, 2008
321
491
Florida
I work in e-commerce, and our company, like any other in our business, relied on this data to make our ad spend and our email marketing more efficient, with higher conversion. And I say "relied" because we understood where Apple was going with this, agreed that customer privacy is important and that their data is their data, and we reinvented how we do business. We built a model that respects their privacy, gives them the decision on whether we're providing value to them, and gives them control over what they're pitched. It has worked wonderfully. Other companies we know of doubled down on the old strategy by trying to get around the restrictions, and some of them no longer exist. Doing the right thing works out in the end, although it may have looked really scary for a while.

I get why you're upset. It sucks if you're losing money and your golden goose has stopped laying golden eggs. But companies and people in e-commerce complaining about privacy controls are simply upset about their own losses; this is about the customer. If you want to appeal to the customer, give them real value, make them want to subscribe, make them want to click on the ad, make them want to tell you what they want - "make them tell you, and tell you again, until they're sick of telling you".

Let's not forget, these email marketing services are not free. If you are getting the data from a free service, then the service is monetizing it another way. Handling thousands of contacts was my job too, and the amount of money that company would spend on Constant Contact, Mailchimp, etc., or a full-blown CRM that can pull up salary data of donors... bleh, I'm glad I'm out of that. And I'm even happier that Mail privacy and Ask App Not to Track are causing havoc.
 
  • Like
Reactions: CarlJ

cupcakes2000

macrumors 68040
Apr 13, 2010
3,892
5,308
What I’m afraid of is that I have tons and tons of family pictures from vacations over the years - my kids in pools and at beaches, obviously - and I’m scared that CSAM or whatever the hell it’s called would easily mistake them for child P…n.
I think you need to look into how it works, because you have the wrong idea about it.
Apple quickly and openly showed their ugly face when China simply said "We want all the data." Apple gave them the data and the golden key faster than the speed of light. No fight for privacy at all. Smoke and mirrors.
Apple isn’t about to break the laws of any country it operates in. Imagine the uproar if it did that in whatever country you’re from.
 
  • Like
Reactions: CarlJ

cupcakes2000

macrumors 68040
Apr 13, 2010
3,892
5,308
Then you should educate yourself about how the mechanism works: it has a table of hashes of specific CSAM pictures that are already known to be circulating among pedophiles. If you have one or more of those pictures on your phone, it would throw up a red flag. So unless you've been taking pics of your kids in the pool and uploading those pics to pedophile forums, you should have absolutely zero problems with this system mistakenly red-flagging your pics. It's not looking at the content of the pics; it's looking for exact matches with pics already known to be bad.

(By the way, "CSAM" is the name used for the bad content itself, "Child Sexual Abuse Material", not for any of the various mechanisms that have been designed to detect it.)

They also have an entirely different mechanism, which can optionally be turned on, that looks at the content of pics sent to kids using machine learning algorithms. If, say, your 8yo daughter's iPad detects a dick pic arriving in the Messages app, it'll pop up a message on her device only, saying something along the lines of "this picture appears to show a sensitive part of the body, you may want to check with mom or dad before viewing it". It doesn't report anything to the parents or to Apple, and it doesn't keep the kid from viewing the picture; it just gives the kid an age-appropriate dialog-box equivalent of the "NSFW" tag that adults might find on a pic or forum post.

Apple made a rather big PR mistake by initially talking about these two separate mechanisms on the same day, and people started conflating the two.

People also got upset that the CSAM detection mechanism was "scanning all their pictures". Putting aside that "computing a hash" is entirely different from what people think of when you say "scanning"... guess what: code in iOS is already scanning all your pictures (and has been for years) in order to locate and tag human faces, pets, animals, objects, etc. If you go into Photos and search for "dog" or "car", it'll show you pictures that contain dogs or cars. It can do that at a reasonable speed because it has already built an index as the pictures came in (see the sketch after this quote).

And, again, the CSAM detection isn't "scanning" your pictures in the sense of looking at an image and trying to figure out what it is (this is the part that people worry is going to incorrectly flag pics of their kids in the pool). The CSAM detection only computes a hash (a checksum) for the picture and compares that hash against a table of hashes of already-known-to-be-circulating CSAM pics; your random pool pics are not going to be listed in that table (unless you've been uploading them to pedophile forums).

People also complained "but it's doing the CSAM detection on my phone!" Well, yes, yes it is; Apple decided doing everything on your phone was better at preserving your privacy. The alternative would be to scan your pics once they're uploaded to iCloud (which would mean that your pics on iCloud could not be encrypted), and this is what many other services already do.

And, for those saying "well, but Apple shouldn't be doing any of this in the first place"... yeah, well, the government is working on making some sort of CSAM detection mandatory everywhere, so Apple devised the most privacy-protecting mechanism they could to deal with that.
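
A toy sketch of the classify-on-ingest indexing described in the quote, so that a later search needs no rescanning; classify(_:) is a hypothetical stand-in for the Core ML/Vision models iOS actually uses.

```swift
import Foundation

// tag -> IDs of photos containing that tag (e.g. "dog", "car").
var tagIndex: [String: [UUID]] = [:]

// Placeholder for the on-device ML classifier; hypothetical.
func classify(_ photoData: Data) -> [String] { [] }

// Index each photo once, as it arrives.
func ingest(photoID: UUID, photoData: Data) {
    for tag in classify(photoData) {
        tagIndex[tag, default: []].append(photoID)
    }
}

// Search is then a cheap dictionary lookup, not a rescan of the library.
func search(_ tag: String) -> [UUID] {
    tagIndex[tag] ?? []
}
```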
Be careful. The truth is often avoided here in favour of hyperbolic mistruths; it's more headline-grabbing. Stating the truth seems to be frowned upon by those proclaiming to search for what they're led to believe is the 'real' truth. It's akin to telling their doctor they know better because they 'researched' it on YouTube.
 
  • Like
Reactions: CarlJ

CE3

macrumors 68000
Nov 26, 2014
1,808
3,146
I remember reading a story on this very website about apps like Snap and Facebook still tracking users who had asked them not to. Of course I'll still select "Ask App Not to Track", but despite what this ad implies, it is merely a request to the developer and nothing more.
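
For context, this is roughly what that request looks like from the developer's side. The OS reports the user's choice and zeroes out the advertising identifier on refusal, but honouring the spirit of the request (no fingerprinting or other workarounds) is down to the app, which is the point above. A minimal sketch:

```swift
import AppTrackingTransparency

// Triggers the system "Ask App Not to Track" prompt (iOS 14+).
ATTrackingManager.requestTrackingAuthorization { status in
    switch status {
    case .authorized:
        // The advertising identifier (IDFA) becomes available to the app.
        break
    case .denied, .restricted, .notDetermined:
        // The IDFA reads as all zeros; the OS can't block every other
        // tracking technique - Apple policy forbids them, the API doesn't.
        break
    @unknown default:
        break
    }
}
```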
 

Mr Fide

macrumors member
Aug 29, 2007
75
45
The disintegration of human beings is very disturbing. Sure, they are avatars for evil corporations, but it hurts to see people vaporised like that. That's what war is like, or genocide. Casual killing with no regard to human life. All those memories, ideas and hopes for the future, gone in an instant. That ad should have a viewer discretion warning.
 

Mick-Mac

macrumors 6502a
Oct 24, 2011
511
1,151
Just because Apple say they don't invade your privacy doesn't mean your data is private. The vast majority of my emails go to non-Apple accounts (gmail etc.) so right there those emails are NOT private. Heck, you can't even trust the ISPs along the email routing path to the recipient. Likewise, my texts mostly go outside the iMessage universe, so they're not safe either. I do appreciate what Apple are trying to do, but from a boolean perspective, we need an AND function instead of an OR function...
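
Rendered literally, with hypothetical flags: a message is only private if every endpoint and hop keeps it private (AND); one private endpoint alone (OR) guarantees nothing.

```swift
// Hypothetical per-hop privacy flags for one email's journey.
let senderIsPrivate = true        // e.g. sent from Apple Mail
let recipientIsPrivate = false    // e.g. delivered to a Gmail inbox
let transitIsPrivate = false      // e.g. an ISP relay along the path

// AND: private only if every link in the chain is private.
let actuallyPrivate = senderIsPrivate && recipientIsPrivate && transitIsPrivate

// OR: what marketing copy can imply - one private link somewhere.
let sortOfPrivate = senderIsPrivate || recipientIsPrivate || transitIsPrivate
// actuallyPrivate == false; sortOfPrivate == true
```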
 

philosopherdog

macrumors 6502a
Dec 29, 2008
740
518
It's a good fight, since their biggest rivals are in the data-harvesting business. I think Apple should continue to push back against this deceptive practice and call their rivals out on it.
 
  • Like
Reactions: ipedro and CarlJ

CapitalIdea

macrumors 6502
Feb 25, 2022
372
1,593
I work in email marketing and the unnecessary Apple Mail privacy changes are causing havoc. So unwarranted and a detriment to the customer - all Apple want to do is make life difficult for other tech firms.

I find this advert misleading - it implies we as a business sell on info about a customer's email engagement. Unhelpful fear-mongering.

We don’t.
Nor does any other company.
Why?
It’s worthless data.
When you have millions of customers in your CRM program, you aggregate the stats. It helps us understand how we're doing as a channel.

I look forward to Apple putting this ad up in traditional media so I can file a complaint with the ASA.

Until you ask if we want to hear from you and we say yes, stop being a locust.
 

CarlJ

macrumors 604
Feb 23, 2004
6,976
12,140
San Diego, CA, USA
Be careful. The truth is often avoided here in favour of hyperbolic mistruths; it's more headline-grabbing. Stating the truth seems to be frowned upon by those proclaiming to search for what they're led to believe is the 'real' truth. It's akin to telling their doctor they know better because they 'researched' it on YouTube.
But everybody knows that 30 minutes on Facebook is just as good as six years in med school ;)
 

steve09090

macrumors 68020
Aug 12, 2008
2,192
4,198
But to treat all customers like criminals in an attempt to be proactive?
Seriously, no.
It's hardly treating people like criminals, and it's no different to scanning your items and your body as you go through an airport.

I find it distressing that people care more about an innocuous hash scan that doesn't do anything unless there is a hash match (which is bad for those people) than they do about protecting children's right not to be molested.

Having access to things and places often means we have to give up some privacy, whether the place is under video surveillance (and potentially facial recognition), or we're signing in with a QR code or a swipe card, using a credit card, or a camera is scanning a car rego as it travels through a city. It does not mean you're being treated as a criminal.
 
  • Like
  • Disagree
Reactions: I7guy and dk001

martinX

macrumors 6502a
Aug 11, 2009
928
162
Australia
Pomeranian looks suss.
[Attached screenshot: Screen Shot 2022-05-19 at 8.54.04 am.jpg]
 

steve09090

macrumors 68020
Aug 12, 2008
2,192
4,198
If Apple is scanning my device for illegality and, if something is suspected, reporting it into a government law enforcement stream without the government having to procure a warrant, that is treating me as a possible criminal until they show otherwise. Neither hyperbole nor exaggeration.

btw - one of the original concerns was exactly that, and this type of "surveillance" has never been challenged in court. If the program were ever initiated, it would be.

That's beside all the other crap that was wrong with it.
There are many laws that allow surveillance or 'a search' without a warrant. You’ve been watching too many TV shows. It doesn’t mean you’re a criminal.

Have you never been in an airport? Between the scan, the drug swipe, the bomb swipe, the pat-down search, or having a bag opened or forced open - when has anyone produced a warrant?
 
  • Like
  • Haha
Reactions: I7guy and dk001

martinX

macrumors 6502a
Aug 11, 2009
928
162
Australia
Look up how CSAM detection works. It compares your pictures (in hashed form) to known images in a database. The match has to be identical, so your personal pics would never be flagged. Plenty of people have pics of their kids in the bath, etc. They wouldn't be flagged because they aren't in the hash database.
What if they are?

Hypothetical here: someone has a load of innocent shots of their own kids. Could even be fully clothed. Uploads some to the socials. Pics get downloaded by a person or persons who take a liking to something about those photos - nothing illegal about it, but something like loving catalogues of kids in swimsuits. They might distribute them, too.

Moving along, the weirdo gets picked up by the cops for prosecutable crimes. They find a trove of images (it's always a 'trove'), some clearly illegal, some clearly innocent (what would you say about someone who collects swimsuit images of other people's kids?), but indicative of a pattern. Weirdo goes to jail. Cops dutifully upload the whole trove to the CSAM database. (Can you see where this is going?)

The original owner, the family member, gets flagged for pics of their kids turning up in the weirdo's collection. Questions get asked: is the original owner producing and distributing CSAM of their own kids? Can they prove they aren't? Assuming innocence can eventually be established, the process is devastating. Remember, everyone else in the justice system is paid to prosecute people.
 

dk001

macrumors demi-god
Oct 3, 2014
10,723
15,067
Sage, Lightning, and Mountains
It's hardly treating people like criminals, and it's no different to scanning your items and your body as you go through an airport.

I find it distressing that people care more about an innocuous hash scan that doesn't do anything unless there is a hash match (which is bad for those people) than they do about protecting children's right not to be molested.

Having access to things and places often means we have to give up some privacy, whether the place is under video surveillance (and potentially facial recognition), or we're signing in with a QR code or a swipe card, using a credit card, or a camera is scanning a car rego as it travels through a city. It does not mean you're being treated as a criminal.

You might want to rethink that legally, under US law.
This is more like having someone at home looking over your shoulder to verify that all your "art" is not illegal. If they see some that might be, you get reported.

Then again, this has all been endlessly discussed in other threads and in letters to Apple.
I'm not going to go over it all again in this thread. Just look up threads that mention CSAM.
 

steve09090

macrumors 68020
Aug 12, 2008
2,192
4,198
What if they are?

Moving along, the weirdo gets picked up by the cops for prosecutable crimes. They find a trove of images (it's always a 'trove'), some clearly illegal, some clearly innocent (what would you say about someone who collects swimsuit images of other people's kids?), but indicative of a pattern. Weirdo goes to jail. Cops dutifully upload the whole trove to the CSAM database. (Can you see where this is going?)
This is where it breaks down. My understanding is that the images are selected. Notwithstanding, the hashed images are catalogued, and if an image is flagged, it would be checked. If an innocuous image of a kid playing under a sprinkler is found, it would be discarded, as it's not a child porn image.

You might want to rethink that legally, under US law.
This is more like having someone at home looking over your shoulder to verify that all your "art" is not illegal. If they see some that might be, you get reported.

Then again, this has all been endlessly discussed in other threads and in letters to Apple.
I'm not going to go over it all again in this thread. Just look up threads that mention CSAM.
You don’t have airports, car parks and shops in the US? I really do need to visit! What is the difference between having drugs or ammunition in a bag compared to child porn on a phone?

Anyways. The point is that private is private and should not be sold on. It doesn’t mean that proper authorities with proper reasons shouldn’t have some level of knowledge. What Apple is trying to do is protect your privacy when it is able to be protected and the laws allow that protection. Giving away or selling your private information is not on unless you allow it.
 
  • Like
Reactions: CarlJ