I took photos of my 3-month-old daughter taking a bath for her first-year album. Does that mean this algorithm will consider them abuse, a crime, I don't know what, and remove those photos from my iCloud and iPhone? What have we done with this world?
> Jesus. How can there be 4 pages of posts and almost every single one of them false, emotionally charged, incorrect nonsense? Like… y'all aren't even trying to sound like you understand this technology while you adamantly argue against it. In fact, many are spouting misunderstandings that Craig literally clears up in the interview 😂

Why don't you enlighten us?
Excellent. Advice to Apple: Don't double down on disaster. Stop digging, eat humble pie, and get out of this!

> "Apple also said that the on-device database of known CSAM images contains only entries that were independently submitted by two or more child safety organizations operating in separate sovereign jurisdictions and not under the control of the same government."
What they are actually confirming is that their excuse about safeguarding children is a ruse: without government files they are reliant on a single database, and not even a crime-fighting database, nor one from an organization whose specific task is to seek out such criminals. Meanwhile, this idea from Apple will make it harder for those authorities to do their job.
Frankly, I don't want to see or hear some trumped-up crap about how secure their intrusion into your privacy is, nor about how secure THEIR surveillance of your computer is, because again it's obfuscation to suggest it's about pictures on iCloud when the process starts with software on YOUR hardware.
It is clear Apple is struggling to justify this about-face on privacy, but in my experience, if an idea is crap, it is better to scrap it than to keep trying to justify the unjustifiable.
Perhaps Apple should have a proper user panel to help guide them as to what users want, as opposed to this awful idea. People might wonder if I'm now anti-Apple. The answer is an unequivocal NO. In fact, my loyalty to Apple is what pushes me to get them to change this dangerous and ridiculous move of putting snooping software on your equipment. There is no point in assuring anyone that it's secure or that it won't be extended, because Apple previously got on its own pedestal to explain how crucial it was to protect the very privacy it now seeks to destroy.
It won't even help in the fight against child abuse; it will make the task harder for the investigating authorities who DO HAVE A REMIT to catch these awful people, because offenders will obviously take steps that include VPNs, Tor, the dark web, or encryption, and avoid iCloud like the plague. But the problem IS NOT just iCloud: the problem starts on your hardware, hardware you have paid for, hardware where you should decide how its processing power is used, and hardware you pay the electricity bill for!
SURVEILLANCE IS SURVEILLANCE, and with these attempts to mitigate their actions I'm starting to wonder whether it was indeed pressure, as it certainly doesn't seem like safeguarding kids when it will encourage paedophiles to take preventative action to hide their evil deeds.
> If you want to do all those mental gymnastics to say you're "paying" for iOS 15, then I can't stop you 🤣 In any case, it's a moot point. If you're no longer happy with the entire "bundle" you purchased, then no one is stopping you from selling your iPhone and moving on to greener pastures. Put your money where your mouth is. Apple has every right to do what they're doing. YOU are the one who agreed to that when you (most likely) failed to read any of the legal stuff you tapped "agree" to. If you're truly this paranoid about things, you should have read every word of those documents before agreeing to them.

It's not very hard to follow my logic, is it? I wouldn't call ownership of property a form of gymnastics. I own everything included in the items I pay for.
> Considering the number of people who got upset about chargers and earbuds being removed from new devices, I think we can safely assume that people include more than just the handset when considering what comes with their purchase.

I completely agree that physical hardware being deprecated in newer iterations has had some reasonable pushback. I've never seen software services as a part of the device, but I'm a CS guy and might be in the minority.
To be absolutely frank, the EU, like many governments, has not got a clue about technology, hence so many of them spend billions on systems that don't work; and I draw your attention to one EU name who openly boasted about lying to people to get what the EU wanted. It's not just the EU: the UK hasn't got a clue about system security or technology either, and spends billions keeping antiquated computers running. Relying on these people to safeguard you... phew!

Thanks for the report!
> I completely agree that physical hardware being deprecated in newer iterations has had some reasonable pushback. I've never seen software services as a part of the device, but I'm a CS guy and might be in the minority.

Do you consider the firmware as part of the device?
> I took photos of my 3-month-old daughter taking a bath for her first-year album. Does that mean this algorithm will consider them abuse, a crime, I don't know what, and remove those photos from my iCloud and iPhone? What have we done with this world?

Facepalm.
> It's in the quarterly reports.

I can't seem to find it. Could you provide a link and page/paragraph #?
> I feel a lot of folks in these kinds of threads are wearing more tinfoil hats than anything I have seen. Even when Google scans your private emails, it's not this conspiracy-driven.

Do you even have the slightest idea how hashes are created? Do you know what the process is for creating a hash? I think the answer is no. If you did, you would understand just how disingenuous the second point is. Apple is absolutely scanning the images on your phone; they have to in order to create the hash. You can't create a hash without some base data as the input, and the base data in this case is the 1's and 0's of your image. Is a human looking at the image while it's on your phone? No, but if you have enough images flagged and you do an upload to iCloud, some human at Apple will review those images to see if the content is truly CSAM.
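To make that concrete, here is a minimal sketch of what creating a hash involves, using ordinary SHA-256 from Python's standard library. (This is only an illustration of hashing in general; Apple's system uses its own perceptual NeuralHash, not SHA-256.) The point is that a digest is a function of the image's raw bytes, so computing it requires reading the image data on the device:

```python
import hashlib

def file_hash(data: bytes) -> str:
    # The digest is computed from every byte of the input: you cannot
    # produce it without processing the underlying image data.
    return hashlib.sha256(data).hexdigest()

# Stand-ins for two different image files.
img_a = b"\x89PNG fake image bytes A"
img_b = b"\x89PNG fake image bytes B"

print(file_hash(img_a) == file_hash(img_a))  # True: same bytes, same hash
print(file_hash(img_a) == file_hash(img_b))  # False: different bytes
```

One caveat: a cryptographic hash like SHA-256 changes completely if a single byte changes, whereas a perceptual hash like NeuralHash is designed so that visually similar images produce matching hashes. Either way, the hash is derived from the image data itself.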
So let's see:
Is Apple creating a backdoor for governments?
This answer entirely depends on whether you view it from a practical angle or a conspiracy angle. E.g., I trust Apple, so I don't believe they are creating a backdoor. Technology-wise it's not a backdoor either, because a backdoor would mean a government having keys to the Secure Enclave even with iCloud off. Logically it cannot be a backdoor if the hashes scanned locally are the same as on iCloud, and even if Apple wanted to, the scope cannot be expanded due to the Secure Enclave.
Is Apple scanning images on your phone before they start uploading to iCloud?
No, Apple is scanning HASHES of images, not the images themselves, when the transfer starts FROM your phone to iCloud. For example, the scan of the image hash will not occur if your internet is off, because the upload has not begun, as per the documentation.
Can Apple expand the categories for types of hashes to scan for iCloud Photos?
Again, this answer entirely depends on whether you view it from a conspiracy angle or a practical angle. Apple has stated that they will not, and I trust their word. Otherwise I would not be buying their products.
Can Apple scan your phone for non-cloud data?
No.
Why not scan hashes on iCloud servers alone, like other tech companies do?
Again, this answer entirely depends on whether you view it from a conspiracy angle or a practical angle. Apple has stated that it does not want to scan iCloud libraries because that is more intrusive than scanning hashes locally one by one, as opposed to all the files in iCloud. Personally, I trust their word on this from a technical perspective more than most security experts who are looking at it from a functional perspective.
I'm shocked that 99% of them don't actually bother getting to know how it works; they even talk about backdoors without knowing how that would even be possible, lol. But sure, hop on the trend and say you don't like this feature,
> and people will refuse to read anything about it and continue to lash out about privacy. Maybe actually understand what is really going on, how realistically it would affect you in a negative way, and what Apple's end game with this tech really is. And no, "total invasion of your privacy" is not a real answer.

Comments like these indicate you don't understand what privacy is. Apple is installing a tool on *every* Apple device that has the potential to scan any/all of your personal content on-device. This is a direct backdoor to your content, as it circumvents all encryption, and effectively means none of your phone data is private.
Apple's own account shows where the software is and on what platform the checks take place: it's on YOUR hardware, not in iCloud. The software is not in iCloud; it will be on your device, using hardware that you own, that you pay the electricity bill for, that you pay your internet supplier for, etc.
A few years back, courts took the view that emails on a server were not necessarily private, whereas emails on a device at home were; on occasion this meant no court order was required to obtain information stored on a server, hence many people immediately delete email from servers.
Technically and legally it would have been much easier if Apple had done the checks on child photos in iCloud, as those are their servers (or Google's), and left the operating systems alone; there is literally no need to touch the OS unless it is FOR SURVEILLANCE. So why didn't they just check in iCloud rather than via our hardware, using our processing power, our electricity, and equipment we have paid for outright?
On their own servers they could legally do that, as they own them and their terms and conditions would apply; then it would be the user's choice whether to use iCloud or not. But I don't see this latest situation helping them sell iCloud+, and it is absolutely the wrong move to make this part of the operating system on devices they have sold to consumers, especially given their privacy platform.
However, they have chosen to enact SURVEILLANCE via software embedded in our hardware: hardware we have paid for, pay the electricity for, and bought on the basis of performance not curtailed by surveillance. And the surveillance will not help the fight against child abuse anyway, as they've telegraphed how to avoid being caught, making the whole system a complete farce.
> Facepalm.

And how are hashes generated? Do you even understand? Or do you just look at Apple's explanation, see the buzzword "hash", and think it's all good?
Man, people really need to read what's in this. This is why so many are confused.
No, it will not flag your photo of your daughter taking a bath, because that image's HASH is NOT in the database. It DOES NOT SCAN FOR NAKED PEOPLE, FOLKS. Understand this: it scans the HASHES, the back end of the file, not the front end.
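A toy sketch of that matching step (hypothetical: a plain SHA-256 set lookup standing in for Apple's NeuralHash and blinded-database protocol). The device compares a fingerprint of each photo against fingerprints of already-known images, so a new photo you took yourself has no entry to match:

```python
import hashlib

# Hypothetical stand-in for the known-CSAM hash database: digests of
# images that already exist in child-safety organizations' records.
KNOWN_BAD_HASHES = {hashlib.sha256(b"known catalogued image").hexdigest()}

def matches_database(photo_bytes: bytes) -> bool:
    # Membership test against known fingerprints; the photo's content is
    # never classified, only compared against previously catalogued images.
    return hashlib.sha256(photo_bytes).hexdigest() in KNOWN_BAD_HASHES

print(matches_database(b"known catalogued image"))  # True: it is in the DB
print(matches_database(b"brand-new family photo"))  # False: never seen before
```

The real NeuralHash is perceptual, so re-encoded or lightly edited copies of a known image still match; but an original photo that was never in the database has nothing to collide with, barring a rare false positive.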
> Excellent. Advice to Apple: Don't double down on disaster. Stop digging, eat humble pie, and get out of this!

Apple should not get out of CSAM scanning.
> Comments like these indicate you don't understand what privacy is. Apple is installing a tool on *every* Apple device that has the potential to scan any/all of your personal content on-device. This is a direct "backdoor" to your content, as it circumvents all encryption, and effectively means none of your phone data is private.

Let's call it what it is: spyware. The supposedly most secure device available is going to have spyware built into it.
We should be asking ourselves: why is Apple policing anything? Why now? How could this capability be used in the future (like scanning different content, or cross-referencing a different database)? Can Apple use it differently without users knowing a thing? One only has to remember the Apple CPU throttling fiasco, where Apple was doing something rather innocuous (at least compared to this), yet only informed its users after being exposed.
I'm really curious to know how Apple thinks this feature is going to improve their market share?? To be a fly on the wall in those meetings!
This doesn't work because no reasonable person would assume that iCloud is off their device. They assume iCloud is an extension of their device since the service is included in the purchase of the device.
> I took photos of my 3-month-old daughter taking a bath for her first-year album. Does that mean this algorithm will consider them abuse, a crime, I don't know what, and remove those photos from my iCloud and iPhone? What have we done with this world?

For everyone shouting "damage control": this seems to be the real reason why Apple is continuing to talk about this so much. There is legitimate misunderstanding of what they are doing.
> I can't seem to find it. Could you provide a link and page/paragraph #?

I already did the work of explaining it to you and pointing you in the direction you need to go. Those of us who have been reading them for decades already know this to be true. If nothing else, go back and re-read them from the beginning. It's good history.
> It's not very hard to follow my logic, is it? I wouldn't call ownership of property a form of gymnastics. I own everything included in the items I pay for.
I just want what I paid for.
> However, they have chosen to enact SURVEILLANCE via software embedded in our hardware: hardware we have paid for, pay the electricity for, and bought on the basis of performance not curtailed by surveillance. And the surveillance will not help the fight against child abuse anyway, as they've telegraphed how to avoid being caught, making the whole system a complete farce.

There are only two reasons for doing this. First, general marketing-level virtue signaling that they are the good guys fighting the good fight, so people should like them more and buy their stuff. Second, legally they can't host this crap on their servers, and it is cheaper for them to scan for it on the uploader's side.
> If you're uploading anything to iCloud, let alone CSAM, you are waiving any rights to 100% privacy. Read the iCloud legal documents on Apple's website.

The hashing and scanning happens on device. It only gets reported if you upload to iCloud. Apple is installing software on your device, a device you paid for, that is essentially spyware. Spyware will now be an intrinsic feature of iOS.
> Do you consider the firmware as part of the device?

No. As someone who has experience writing firmware, I would say that it is not part of the device in any way whatsoever.
> If you want to do all those mental gymnastics to say you're "paying" for iOS 15, then I can't stop you 🤣 In any case, it's a moot point. If you're no longer happy with the entire "bundle" you purchased, then no one is stopping you from selling your iPhone and moving on to greener pastures. Put your money where your mouth is. Apple has every right to do what they're doing. YOU are the one who agreed to that when you (most likely) failed to read any of the legal stuff you tapped "agree" to. If you're truly this paranoid about things, you should have read every word of those documents before agreeing to them.

"If you are no longer happy with the entire 'bundle' you purchased, then no one is stopping you from selling..." rather misses the point that the hardware may have been sold on the basis of privacy; on that basis, the party changing the terms should pay a refund, rather than the user having to sell.
> This is one place where I am dumbfounded. They could easily get the 1-in-1-trillion false positive rate with only 4 matches if NeuralHash's false positive rate were 1 in 10,000. My thoughts on this are one of:
> -- "Craig is lying, and they are flagging with much less."
> -- "NeuralHash must be much worse than what I could put together in a few weeks' time."
> -- "They are being insanely cautious to ensure not a single false positive will ever happen to anyone."

If something can go wrong, it will go wrong. Too many variables. And a human review? That's so comforting.
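For what it's worth, the arithmetic behind that dilemma is easy to check. A rough model, assuming a hypothetical per-image false-positive rate of 1 in 10,000 with independent errors and a hypothetical threshold of 4 matches:

```python
from math import comb

p = 1e-4        # assumed per-image false-positive rate (1 in 10,000)
threshold = 4   # assumed number of matches required before human review

# Chance that 4 specific innocent photos ALL falsely match:
print(p ** threshold)   # ~1e-16, far beyond 1 in 1 trillion

# But an account holds many photos. With N photos there are comb(N, 4)
# ways to accumulate 4 false hits, so the per-account odds are much worse:
N = 10_000
print(comb(N, threshold) * p ** threshold)   # ~0.04, i.e. roughly 4%
```

Which suggests that hitting 1-in-a-trillion per account with a realistically sized photo library requires either a far lower per-image false-positive rate than 1 in 10,000 or a higher threshold, consistent with the dilemma above.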
> They have a legal requirement not to host CSAM on their servers, which they own and you are renting. They get in trouble if they let you store it.

They get in trouble if they are told about it and don't remove it. If they don't know about it and someone flags it, they are covered by Section 230.