Based on....?

They don't have to be bright in the slightest to avoid these measures, just marginally less stupid than a sack of carrots. As it happens, most of them are probably not stupid in the slightest. They are devious.
Apple has said that once 30 matches are made on your iPhone, an example will be sent to an employee at Apple to review. Once that Apple employee sees something unpleasant, that person will notify the authorities. So yes, some sad set of Apple employees will have to review these 24 hours a day (probably getting paid minimum wage - hopefully psychiatric care is included). Actual people have to look at these, and by reporting them immediately they aren’t breaking the law.

You have just told everyone you don't understand the system.
Apple doesn't have examples of such material.
Apple doesn't use it for training the CSAM Detection System.
Apple doesn't have to develop algorithms to recognise child pornography, because the system doesn't do that.
Apple isn't breaking the law since they aren't in possession of examples of child pornography.
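To make the distinction concrete: hash matching only checks whether an image's fingerprint appears in a database of hashes supplied by child-safety organizations, and nothing is surfaced for human review until a match threshold is crossed. A toy sketch of that logic (everything here is hypothetical: SHA-256 stands in for the real perceptual NeuralHash, and a plain local set stands in for the blinded, encrypted hash database):

```python
# Toy sketch of threshold-based matching against *known* hashes.
# Hypothetical throughout: SHA-256 stands in for the perceptual hash,
# and KNOWN_HASHES stands in for the database of known fingerprints;
# no image content is ever classified or "recognized".
import hashlib

KNOWN_HASHES = {hashlib.sha256(f"known-{i}".encode()).hexdigest()
                for i in range(100)}
MATCH_THRESHOLD = 30  # below this count, nothing is surfaced for review

def image_hash(image_bytes: bytes) -> str:
    """Stand-in fingerprint: just SHA-256 of the raw bytes."""
    return hashlib.sha256(image_bytes).hexdigest()

def flagged_for_review(library: list[bytes]) -> bool:
    """True only once the library contains >= MATCH_THRESHOLD known images."""
    matches = sum(1 for img in library if image_hash(img) in KNOWN_HASHES)
    return matches >= MATCH_THRESHOLD

library = [f"known-{i}".encode() for i in range(29)] + [b"family-photo"]
print(flagged_for_review(library))                   # False: 29 matches
print(flagged_for_review(library + [b"known-99"]))   # True: 30 matches
```

Under the announced design the counting is done cryptographically rather than with a plain counter, but the reviewability property is the same: the human-review step only exists past the threshold, and only for database matches.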
Turning off iCloud photos does not remove the code used to create and compare hashes from my device. Claiming "we won't use it unless you do X" is not good enough.
No, it's not, not even close. I own the phone; they own iCloud, and that's a VERY big difference. Think of it like your own property ratting you out versus someone else's server that you may or may not use. They can do all they want on their property; they cannot do what they want on my property.

Reporting CSAM when scanned on iCloud is semantically identical to reporting CSAM when verified on iCloud.
Now you're not making any sense, but I assume you mean the CSAM scanner. Yes, they would eventually get caught if they expanded things without telling us, and it would be very bad for them. Of course, since the scanner is already there, governments could tell them to expand it anyway. And since a lot of people like yourself would accept it, government surveillance wins and we all lose (including Apple, because enough people would just throw away their iStuff).

...you're making no sense. You're arguing that Apple would be caught if they were using your data now for nefarious purposes, but they wouldn't be caught if they used CSAM for this?
Well, most of Apple's ecosystem is built on open source, and several of the frameworks and toolkits they've invented (or had their fingers in) are open source as well.
FreeBSD, Darwin, Swift, WebKit, LLVM, ...
The list goes on and on --- "Apple" and "open source" go quite well in the same sentence.
Would be fun to train a word2vec model and see what vector math we could do to see just how far off Apple(s products) are from "open source."
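That vector math is real, for what it's worth: gensim's `most_similar(positive=..., negative=...)` does exactly this over trained embeddings. A self-contained toy with invented 3-dimensional vectors shows the mechanics (the vectors and vocabulary are made up for illustration; a real word2vec model learns hundreds of dimensions from a corpus):

```python
# Toy word-vector arithmetic: pick the vocabulary word whose vector is
# closest (by cosine similarity) to a positive-minus-negative combination.
# The 3-d vectors below are invented; nothing here is trained.
import math

vectors = {
    "apple":       [0.9, 0.1, 0.2],
    "open_source": [0.1, 0.9, 0.8],
    "proprietary": [0.8, 0.1, 0.1],
    "linux":       [0.2, 0.9, 0.9],
    "windows":     [0.7, 0.2, 0.1],
}

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

def analogy(positive, negative):
    """Nearest word to sum(positive) - sum(negative), excluding the inputs."""
    query = [0.0, 0.0, 0.0]
    for word in positive:
        query = [q + v for q, v in zip(query, vectors[word])]
    for word in negative:
        query = [q - v for q, v in zip(query, vectors[word])]
    candidates = set(vectors) - set(positive) - set(negative)
    return max(candidates, key=lambda w: cosine(vectors[w], query))

print(analogy(positive=["apple", "open_source"], negative=["proprietary"]))  # linux
```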
Apple has said that once 30 matches are made on your iPhone, an example will be sent to an employee at Apple to review. Once that Apple employee sees something unpleasant, that person will notify the authorities. So yes, some sad set of Apple employees will have to review these 24 hours a day (probably getting paid minimum wage - hopefully psychiatric care is included). Actual people have to look at these, and by reporting them immediately they aren’t breaking the law.

They're already doing it on the server side, like everyone else.
Why not? Do you have inside info into Apple’s hardware & software? Even if your suspicions are true, with iCloud photo sharing disabled, NOTHING is ever sent to Apple. No person at Apple can view the images on your devices, so no person at Apple can send the authorities after you. Apple told its customers how to disable this “feature,” probably pissing off people in very high places.

Turning off iCloud photos does not remove the code used to create and compare hashes from my device. Claiming "we won't use it unless you do X" is not good enough.
Okay 😉

Do you recall when “the gays” were after your kids? Perhaps you are too young for that. Every repressive government uses ”the children” as a rallying cry to erode freedoms that people still fight and die for every day. If you think this is about saving children, do you also think the earth is flat?
Really? Are you sure? Correct me if I'm wrong though, but with the actual facts, not what you think.

Apple has said that once 30 matches are made on your iPhone, an example will be sent to an employee at Apple to review.
Still is. The things that Apple adds to those open source projects are often not open source. However, with Apple using all these different open source projects, they get more attention & exposure; some improvements come from Apple, and more come from others (not affiliated with Apple) because the projects are being used by Apple. Apple also creates free tools for writing code.

That was then, this is now. It once may have been.
This is not about CSAM or protecting the children at all. This is all about deploying the software framework, the mechanism, on every Apple device for future use cases. Apple has caved in to the US Gov to install this framework on Apple devices and is using CSAM as the excuse to get it accepted by the public. Privacy is out of the window with this software. Don't only look at today; look at tomorrow.
Apple must not allow this software on Apple devices. If they don't change, I hope sales are significantly affected, and they deserve to be. It has made me very reluctant to buy another Apple device. Apple has lost all trust.
I’m really getting tired of the “if it saves 1” argument. Stomp all over the rights of millions of people to save 1. Don’t get me wrong, I don’t want any kids to have to go through that experience, but Apple’s method is only going to catch old, known material while setting a very dangerous precedent by circumventing any possibility or effectiveness of E2E encryption. Think about this: if Apple goes after all the old material, wouldn’t that create more demand for new material? Wouldn’t that, in effect, actually harm more children?

Is this a safer world? Should everyone have unlimited and uncontrolled freedom? Isn’t a little sacrifice of “freedom” worth it for a big reward? Is just one child saved not worth it? Do we value our individual “freedoms” more than the safety of children? There are cameras on the streets in many cities for our safety, but at the same time they “spy” on you. Paranoids and molesters make big noise with a “warning about misuse,” and parents are not heard. American companies are a kind of joke.
The rest of us will disable iCloud photo sharing, and suffer the consequences of that. I sadly suspect you’re right that it won’t affect sales. But I hope it will. Apple will never be able to advertise its privacy when scanning goes live.

Sales will not drop. If anything, the masses will end up buying more iPhones (not necessarily because of the whole CSAM thing, but just because).
Mankind has shown that it's just not smart enough to rid itself of these sorts of things. People are hooked on their smart devices. They are hooked on the internet. They cannot live without these things. So what do you do when you're an addict, and the thing you're addicted to is bad for you? Why, you convince yourself that it's either not bad, or you ignore it altogether. And that's exactly what is going to happen here.
The rest of us will just further unplug.
Again, correct me if I'm wrong (but only with facts if possible), but aren't the images they are scanning for known CSAM images? It's not scanning for just any photos; it's matching photos against known photos. A bit of a distinction there, I think.

Yes, but he was talking about the development of the CSAM Detection System:
"anyone disturbed just a tad that there is a whole group at Apple that is studying how to identify child porn and how to program some computer to recognize it? That means they have to have examples of it....that means they have to study it, that means they have to develop requirements for this SW, that means they have to develop algorithms to figure out that this picture is child porn vs. a kid taking a bath or in a swimming pool....
Then someone has to review these results to make sure they are correct and meet the requirements of the SW product."
the “Apple is protecting the user” shtick is wearing thin. I would rather have more app stores than some theoretical protection from malware. I don’t like malware, obviously, but the App Store matters for exactly the reason you cite: it gives us users far more choice and the power that goes with it.

Once more this affair shows sideloading is necessary; I want apps created independently of Apple's restrictions. Backup is now Apple-only, with that monopolized. Freedom of choice is necessary in every part of life, including your iDevice.
Imagine Apple not being able to remove an app like hk.live from the App Store.
Because Apple monopolizes every aspect of its iDevices, its vulnerability is just that: it can be pressured to change policy by the public or other entities. If there were a different option for backup besides Apple, I could simply switch to another service.
This wouldn’t have helped children. The consumers of child porn that were caught by this would be many steps removed from the abusers making the content, because the images have to be prolific enough to already be in all of the source databases.

I wonder how many additional children will be victimized from now until then? Apple, the greatest company in history with the greatest humanitarian intentions, forced to deal with grandstanding ignorant politicians and self-centered, selfish advocacy groups. It’s unbelievable!
not to mention the fact that Apple apparently was going ahead with a plan that allowed pedophiles to upload 29 child porn images without any problem … so, do you do 29 on Tuesday and then 29 more on Wednesday?

Uh, NONE. People who enjoy looking at CSAM will just turn off iCloud photo sharing. It’s everyone else that’s in danger.
That's your right. So you should do it. Change systems. Just do it and don't look back. Show Apple with your wallet how you feel. I will applaud you for your convictions if you do so. Otherwise it's all just p*ssing and moaning on a rando message board on the internet that will amount to nothing except a bit of venting of your frustrations for a moment.

For me, the damage has already been done... I thought Apple was a champion for The User, and because of that I was happy to overlook the walled garden and, in some cases, the lagging behind other companies' products in terms of innovation or feature set. For any given tech I'm interested in, for the last few years I have either only gotten the Apple version or waited for Apple to launch something, and haven't even really looked at reviews of anything else.
Now Apple has destroyed the illusion that they are working to put me in control and let me safely use my own devices. I guess I should have already known, but before this it was pretty easy to forget.
Funny/difficult to now tell myself that I don't actually need the iPhone 13... I don't need the new Apple Watch... There are other AR/VR things I can get into now instead of waiting... I can look for HomeKit alternatives instead of sinking more dollars into that ecosystem.
Sigh.
every single person on this forum and every Apple forum … in the world … could do everything you suggest and it would not even create a ripple in the Mac user universe, not a ripple

Little late to the party on this one today, but I’ll say it again: I’m fine with scanning iCloud, and I'm most definitely not against fighting CSAM, but I will not stand for scanning MY ACTUAL DEVICES.
Like many of you, I’ve looked into hardware/software Apple alternatives. Much easier said than done. That said, you can still make a difference even staying in the Apple ecosystem.

If you feel the same way about the slippery slope that is on-device scanning, say so with your wallet and do the following…
At the end of the day, Apple will not pay attention until it affects their bottom line and their shareholders.
- Cancel your subscriptions to iCloud and use other cloud services. I personally like Sync.com. It has E2E encryption and there are other options out there.
- Cancel your subscriptions to Apple TV+, iTunes, etc. Plenty of options for media for music and video.
- Do not use Apple Pay and close your Apple Card account if you have one.
- Sell your Apple stock if you have any.
- Do not upgrade your OS on any Apple device.
- Do not purchase anything through the Apple app stores.
- Do not buy any new Apple hardware.
Obviously canceling all Apple services affects the money they make and they pay for a lot of infrastructure and development. They also spend a lot on marketing and subsidies. So if for example, your cell phone carrier offers you a free subscription to Apple TV+ or Apple Arcade, do not redeem it.
Apple makes money from Apple Pay when you use the cash option, so obviously don’t do that. That said, Apple spends a lot of money on development and upkeep of this service, so if you don’t use it, it also affects their bottom line. Getting rid of your Apple Card means less money for them, as well as straining their relationship with Goldman Sachs, which is the backbone for Apple Card.
Selling Apple stock has a direct effect on the stock price. The more people sell, the lower the price will go. Sure, it’s a little more complicated than that, but it sends a clear message.
Not upgrading to a new OS on any Apple device has a similar effect. Apple spends a boatload on development, so if you do not upgrade, that money essentially goes to waste. Furthermore, they use new OS adoption as a marketing tool.
Not buying anything through the app stores directly affects their bottom line. As you probably know, Apple makes a boatload of money in commissions (30%) from these sales. If at all possible buy directly from the developer.
Finally and most important, not buying any new Apple hardware has a ginormous impact! I know some of you need newer hardware, but buy something used instead of new. Apple’s biggest expense is the combination of R&D and production. So it’s pretty obvious: if their new product sales drop by even 20%, Apple HAS to pay attention.
it certainly looks like you have no idea of the value of "freedom" and how hard freedoms are to get back after you lose them. How many freedoms are you willing to lose for good causes? Are you okay with everyone having to use a breathalyzer every time they start their car, because we need to make sure we get all those drunk drivers? Are you okay with your car calling the police on you when you go over the speed limit, because speed kills, you know? Are you okay with your new TV having a camera looking for drugs in your room, because we need to get rid of all those drug addicts? Are you okay a year down the road when Apple expands the CSAM system to scan for other illegal activities our government doesn't like, because those will be good causes too? Just wondering what your line in the sand is.

It certainly looks like "cancel this for freedom" is the new "I watch it for the plot".
I will not stand for scanning MY ACTUAL DEVICES.

Like many of you, I’ve looked into hardware/software Apple alternatives. Much easier said than done. That said, you can still make a difference even staying in the Apple ecosystem.

If you feel the same way about the slippery slope that is on-device scanning, say so with your wallet and do the following…

- Cancel your subscriptions to iCloud and use other cloud services. I personally like Sync.com. It has E2E encryption and there are other options out there.
- Cancel your subscriptions to Apple TV+, iTunes, etc. Plenty of options for media for music and video.
- Do not use Apple Pay and close your Apple Card account if you have one.
- Sell your Apple stock if you have any.
- Do not upgrade your OS on any Apple device.
- Do not purchase anything through the Apple app stores.
- Do not buy any new Apple hardware.

1. Fair enough. Decent enough solution, although quite inelegant and troublesome.
not to mention the fact that Apple apparently was going ahead with a plan that allowed pedophiles to upload 29 child porn images without any problem … so, do you do 29 on Tuesday and then 29 more on Wednesday?

You know it doesn't work that way. Once it hits 30 overall, it is checked. But I agree: why is there a 30 limit? Quite arbitrary.
like people who move money in amounts of $9950.00, just under the 10K reporting limit
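On the 29-vs-30 point, Apple's published technical summary describes the threshold as cryptographic rather than a policy number: each positive match uploads one share of a decryption secret under threshold secret sharing, so below the threshold the server mathematically cannot decrypt any match data, not even the count. A stripped-down sketch of the underlying idea, Shamir secret sharing (tiny field, threshold of 3 instead of 30, purely illustrative and not Apple's actual construction):

```python
# Minimal Shamir secret sharing over a prime field: any t shares
# reconstruct the secret exactly; fewer than t reveal nothing about it.
# Illustrative only: t=3 and n=5 here, versus the announced threshold of 30.
import random

P = 2_147_483_647  # prime field modulus (2^31 - 1)

def make_shares(secret: int, t: int, n: int):
    """Embed the secret as f(0) of a random degree-(t-1) polynomial."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    poly = lambda x: sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, poly(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers f(0), i.e. the secret."""
    total = 0
    for xi, yi in shares:
        num, den = 1, 1
        for xj, _ in shares:
            if xj != xi:
                num = num * -xj % P
                den = den * (xi - xj) % P
        total = (total + yi * num * pow(den, P - 2, P)) % P
    return total

random.seed(1234)  # fixed seed so the demo is repeatable
secret = 123456789
shares = make_shares(secret, t=3, n=5)
print(reconstruct(shares[:3]) == secret)  # True: any 3 shares suffice
print(reconstruct(shares[:2]) == secret)  # almost surely False: 2 shares are useless
```

So under that design, 29 matching images aren't "29 free uploads" so much as 29 shares the server cannot do anything with; only the 30th makes the vouchers decryptable at all.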
every single person on this forum and every Apple forum … in the world … could do everything you suggest and it would not even create a ripple in the Mac user universe, not a ripple

Exactly. So I implore everybody who sees this as crossing a line, and is absolute about it, to jump ship for your own beliefs and nothing more. Do it for yourself and what you believe in.