Anyone not switching because of CSAM is living in a dream world if they think Android isn't going to do the same, and probably much worse.
I doubt they will; it really is that bad, and all the security people are against it. But even if they did, you can always de-Google an Android phone, unlike Apple's iPhone.
 
I'm curious to know how Android phone users are avoiding CSAM scanning. If they're using Google Photos, for example, then those photos are already being routinely scanned against the CSAM databases. (source: https://9to5mac.com/2021/08/13/csam-scanning-controversy-was-predictable/)
They’re not avoiding it. It’s bad on that side too.

People tend to switch to Apple because of things like privacy. So if the privacy is going to be as bad as on Android, why switch? Android does almost everything else better.
 
  • Like
Reactions: Euronimus Sanchez
They’re not avoiding it. It’s bad on that side too.

People tend to switch to Apple because of things like privacy. So if the privacy is going to be as bad as on Android, why switch? Android does almost everything else better.
Well, Android does some things better, based on one's use case.
 
I have been an Android user for about 20 years, but I plan to buy my first iPhone when the 13 is here. They didn't include me in the survey 😁 Seriously though, I would just use the PIN to unlock my phone when I have my face mask on. Seems like a first-world problem to me. Yes, I use the fingerprint all the time, but I don't think going back to a PIN during mask time is going to be a deal breaker.
 
Isn't CSAM detection just generating a hash of a photo and comparing it against a pre-defined list of flagged photos?

Why is this a problem? They aren't looking at your photos, and they can't tell what your photos are. They can ONLY identify whether a photo matches one pre-determined to be child porn. How is this a problem?
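That is roughly the mechanism being described: hash the photo and check that hash against a list of known flagged hashes. Here is a minimal sketch of the idea in Swift, assuming a plain SHA-256 exact-match lookup; Apple's announced system actually uses a perceptual "NeuralHash", private set intersection, and a match threshold, so the hash function, the blocklist, and the names below are illustrative assumptions only.

```swift
import Foundation
import CryptoKit

// Hypothetical blocklist of known flagged image hashes (hex-encoded).
// The real system uses perceptual hashes in an encrypted database, not raw SHA-256 strings.
let flaggedHashes: Set<String> = [
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08"
]

// Returns true if the photo's hash appears in the flagged list.
// The comparison only ever sees the hash, never the image content.
func matchesFlaggedList(photoData: Data) -> Bool {
    let digest = SHA256.hash(data: photoData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return flaggedHashes.contains(hex)
}
```

The point the question is getting at is visible in the sketch: nothing in the comparison interprets the photo itself; it only checks whether the hash is on the list.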
 
I just did a quick Google search and found the number to be as high as 26% of respondents.
This number likely comes from an article originally posted by Android Central with a title worthy of MacRumors: “Why Apple doesn’t care that a quarter of all iPhone users eventually switch to Android.” It's kind of hard to ignore the hyperbole and inaccuracy of that headline based on the facts within the report that follows, but hey, we are all familiar with click-bait. It is worth reading beyond the headline and first paragraph, as there are some very interesting points being made.

JonGarrett was accurate in stating “as high as,” but where do these numbers come from? They are from court documents in the Apple vs. Epic case, filed by Apple to demonstrate that iPhone users are NOT locked into the ecosystem. The high numbers of 26% come from Q1 and Q2 of 2020; those quarters are higher than average, and they are just one piece of data taken out of context in order to score a point for Apple in court.

When considering this number, keep in mind we have not seen a corresponding drop in iPhone revenues. iPhone revenues did drop in 2020, but only by a couple of percent, and despite what you might read, Apple has not raised the price of the iPhone to make up the difference. Keep in mind, too, that there were other things happening in 2020 that could be responsible for this dip. I wrote “dip” because Apple's iPhone revenues have increased significantly this year.

So how is it that iPhone revenues continue to grow while Apple is apparently losing a significant number of users to Android, without raising the cost of the phone during the same period? New users? Android switchers? People returning to the iPhone after dipping their toes in on the other side? I don't have an answer, but it is clear to me that the number Apple reported at trial, and that Android Central jumped on, is not the whole story.
 
  • Like
Reactions: mdriftmeyer
I doubt that many outside of Tech Blog readers have ever heard about Apple's CSAM code.
Interesting point. Only 10% of the respondents said they wouldn't make the switch due to on-device CSAM-scanning, but how many of the respondents knew about it and knew precisely what it meant?

Even if some did manage to hear about it, most would only have heard that Apple is going after people making or downloading CSAM images, and they would think that's a good thing.
I'm not so sure. I informed my wife (a non-techie iThings user) and a handful of other people (one of them one of my best friends, a techie and Android user). All of them, save one, reacted with various degrees of incredulity or shock at the concept. "They're going to do what?!?!" would be a fairly accurate summation of their responses.

So, you may be right, but I wouldn't place too big a bet on it ;)

One of the big polling firms, one that's known for producing better-than-average results, once did a test. They polled a representative pool about certain objectives on the part of certain activist groups and got results similar to what had been reported in the dominant "news" media. Then they informed their pool of existing realities relating to the issue, carefully sticking to neutral wording and facts only, and re-polled them. They got entirely different results.

Because the dominant "news" media no longer really informs, and because of the way many of these polls are conducted, the results are not unlike the old "Have you stopped beating your wife?" question.
 
The iPhone 13 is in third place in design, smarts, features, and freedom, behind the Galaxy S21 Ultra and the upcoming Pixel 6 Pro.
 
  • Like
Reactions: eltoslightfoot
Isn't CSAM detection just generating a hash of a photo and comparing it against a pre-defined list of flagged photos?

Why is this a problem? They aren't looking at your photos, and they can't tell what your photos are. They can ONLY identify whether a photo matches one pre-determined to be child porn. How is this a problem?

The problem most people have isn't the scanning ("identifying") itself. It's that they're doing it on device rather than server-side, even though they're only scanning what will end up server-side anyway.

People accept server-side scanning, since your stuff is on their servers. But using one's own device to do it: that's the issue.
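To make the distinction concrete, here is a minimal sketch of that gating in Swift, under the assumption (matching what Apple described) that the on-device check only runs for photos queued for iCloud Photos upload; the type and function names are made up for illustration.

```swift
import Foundation

// Hypothetical upload candidate: a photo plus whether it is queued for iCloud Photos.
struct UploadCandidate {
    let photoData: Data
    let willUploadToICloudPhotos: Bool
}

// The on-device match runs only when iCloud Photos is enabled and the photo
// is actually going to be uploaded; with iCloud Photos off, nothing is checked.
func shouldRunOnDeviceMatch(_ candidate: UploadCandidate, iCloudPhotosEnabled: Bool) -> Bool {
    return iCloudPhotosEnabled && candidate.willUploadToICloudPhotos
}
```

Functionally that covers the same set of photos a server-side scan would see; the objection in the posts above is about where the check executes, not what it covers.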
 
Well... there it is in black and white... Face ID is not superior to Touch ID, as many always claimed, and that was also the reason I sold my iPad Pro back then: Face ID on a tablet is just plain stupid when the tablet is lying flat on a table. All the engineering at Apple, and no one noticed that usability failure...
Now we are in the middle of a pandemic, and wearing masks is a common thing (and has been in Asia for a long time). One would have expected Apple to realize at least the latter by now... but no...
They care about a 15% share on the App Store, yet they leave "a lot of money on the table" through their own arrogance and ignorance when it comes to customer demand.
 
  • Like
  • Love
Reactions: Apple$ and nvmls
I get it. At least Apple has a way to turn that off: turning off iCloud Photos. No on-device scanning will take place on iPhones if iCloud Photos is off. If they move on to videos, that will be even easier to turn off. It's inconvenient, definitely, and a sad day for Apple. But I believe Apple was forced to do this. And if Apple was forced, so will others be, and they may not notify you of what they are doing on your device.
 
My friend's $150 Android phone from two years ago lasts for three days without needing a charge. It doesn't lag and doesn't even talk about CSAM. While I don't trust Google, Apple kinda screwed the pooch with the forced upgrade: my iPhone 11 Pro lags, gets 5 hours of battery, and now comes with CSAM scanning. Eh. I'm looking at a flip phone and buying a DSLR.
I use both platforms on flagship devices extensively, and the lag stuff is real. Download, say, a third-party Android music player app like Musicolet or any other of your choice, and compare it with Apple's Music app on basic functionality like tapping play, scrubbing the song's timeline, going to the next song, etc. Android is much more responsive; Apple's always takes a second to respond. iOS is lipstick on a pig. AirDrop works maybe half of the time, even on their latest hardware (M1).

Apple blows hardcore smoke with their 5-year "support" and their silicon marketing, yet they bloat the device within a year or two of updates, which makes both "features" moot.
 
  • Like
Reactions: Osxguy
Isn't CSAM detection just generating a hash of a photo and comparing it against a pre-defined list of flagged photos?

Why is this a problem? They aren't looking at your photos, and they can't tell what your photos are. They can ONLY identify whether a photo matches one pre-determined to be child porn. How is this a problem?
The problem is that Apple touts itself as a privacy- and security-focused platform, yet this opens the door to scanning for other material in the name of safety and security. It's literally a back door in a mostly secure platform.
 
  • Like
Reactions: Osxguy and k27
I doubt that many outside of Tech Blog readers have ever heard about Apple's CSAM code.

Even if some did manage to hear about it, most would only have heard that Apple is going after people making or downloading CSAM images, and they would think that's a good thing. It's unlikely that the part about searching people's Apple devices without a warrant is ever mentioned in whatever source they used to learn about this issue.
Well, considering it’s been brought up on multiple news outlets, many people know…
 
  • Like
Reactions: Osxguy
That's unfortunate, since Apple could've easily put a scanner on the lock button like the iPad Air. I'm fortunate not to have to remove my mask to unlock since I have an Apple Watch, but telling someone who doesn't have a watch that they have to get an expensive watch just to unlock their phone is simply not reasonable.
In all fairness, Apple hasn’t had a chance to gouge yet during this pandemic. Give them the chance to gouge!!!
 
And we want Android users because…??
Right. I mean, without them, and without Android hardware makers, Google, and the OEMs that fork Android innovating new features, how would Apple ever copy and paste new features into the iPhone if everyone already had an iPhone? Would the iPhone even have copy and paste?

I say that both with seriousness and sarcasm.

If it weren't for the ease with which Apple devices work together, who would want an iPhone? Without that, the iPhone is a dumbed-down phone that is always years behind in features and display technology, with limited choices and options, locked behind a severely walled garden that restricts its owners, but that has better security and biometric authentication.
 
Isn't CSAM detection just generating a hash of a photo and comparing it against a pre-defined list of flagged photos?

Why is this a problem? They aren't looking at your photos, and they can't tell what your photos are. They can ONLY identify whether a photo matches one pre-determined to be child porn. How is this a problem?
Like a third of the people here are not even Apple product users at all, and a good number of them are so suspicious of and jaded toward Apple that they see nefarious behavior in every decision the firm makes. :D
 
If it weren't for the ease with which Apple devices work together, who would want an iPhone? Without that, the iPhone is a dumbed-down phone that is always years behind in features and display technology, with limited choices and options, locked behind a severely walled garden that restricts its owners, but that has better security and biometric authentication.

That's really a great point. It's the integration, both with other Apple products and with other Apple users, that keeps people locked in. If it weren't for that (and I'm not dismissing its importance), it would be just another smartphone and would have a harder time competing.
 