What amazes me is the consumer uproar against this change when people have so readily adopted smart assistants.
> Do you use Facebook/Google services? Nothing new under the sun; simply sad for it to come from Apple.

You are intentionally being dishonest here. There is a difference: Facebook/Google scan files on their servers that you send to them, while Apple does this kind of scanning on the device itself. Apple claims that turning off iCloud Photos disables this, but you have no way to verify that. In addition, it is only a policy; who is to say Apple cannot change that policy down the road so that both your local and cloud photos are being scanned?
But they are all in PRISM anyways
> "Scan" is a broad term used by many people on these threads. It is basically "scanning" because it is looking at my pictures, producing a hash and checking it against the database: going through my pictures one by one and comparing them. Basically scanning in that regard.

It's computing a function from your picture, which some people may see as scanning. I'll go ahead and give that to you and others. The issue, it seems, is that you don't want Apple having any say whatsoever over what you store on iCloud Photos. Apple, on the other hand, has a moral and outspoken social responsibility to protect the privacy of everyone, and in this case especially those who have been subjected to known child abuse. You know, since the beginning of iPhone, all of your photos have been "scanned" in one way or another (for example, to apply Apple's own saturation/brightness effects). This is just a new "scan" taking place, in which a number is checked against a child-porn database. If you're that worried about it, you shouldn't be using any modern technology, as it all "scans" in various ways.
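The "producing a hash and checking it against the database" step both posts describe can be sketched in a few lines. This is a hypothetical simplification: SHA-256 stands in for the perceptual NeuralHash Apple actually uses (a perceptual hash is designed so that minor edits to an image still match), and the database contents here are made up.

```python
import hashlib

# Hypothetical stand-in for the database of known CSAM hashes.
KNOWN_HASHES: set = {"not-a-real-hash"}

def photo_hash(photo_bytes: bytes) -> str:
    # Compute a fixed-size digest of a photo. The digest by itself
    # describes nothing about what the photo depicts.
    return hashlib.sha256(photo_bytes).hexdigest()

def matches_database(photo_bytes: bytes) -> bool:
    # The only output of the check is match / no match.
    return photo_hash(photo_bytes) in KNOWN_HASHES
```

An ordinary photo produces a digest that simply is not in the set, so `matches_database` returns `False` and nothing further happens.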
> What makes you think the same people upset about it also have non-Apple smart assistants?

I sensed the discursive fallacy as I posted it; it was meant for entertainment purposes only, then.
(I don't - I only use Siri's dumb half abilities)
> What amazes me is the consumer uproar against this change when people have so readily adopted smart assistants.

Only thing I use Siri for is "Set a timer for X minutes."
> It's computing a function from your picture, which some people may see as scanning. […]

Once it gets on THEIR PROPERTY, you know, ON iCloud itself, scan away. But do NOT scan on MY PRIVATE DEVICE.
> You are intentionally being dishonest here. There is a difference: Facebook/Google scan files on their servers that you send to them, while Apple does this kind of scanning on the device itself. […]

In a purely computational way, what does it change?
> Once it gets on THEIR PROPERTY, you know, ON iCloud itself, scan away. But do NOT scan on MY PRIVATE DEVICE.

In that case, you shouldn't be taking photos, or doing anything else on your private device. Every file you create on your device is "scanned" in similar ways. Functions are computed on the data, and outputs are recorded. This is true of every single thing you do on your phone, no matter what brand is on it.
And it will be too late when we have a future president who decides to force Apple to scan for images that may show someone was at a protest and may have held a sign that said unflattering things about said person. And how long before the iMessage scanning isn't just for kids under parental controls?
I can't believe people are blind to the bad precedents and paths we are going down here.
> Don't know what Snowden is upset about. Apple is within their rights to keep child abuse off their servers. This feature acts as a deterrent so that nonces don't even try it. It protects the PRIVACY of children.

You don't know, because you obviously didn't read what he said. He was very clear.
> There's such a huge difference between all of that and scanning all your photos to compare them against third-party databases "looking for things."

I don't agree. First, it's compared against an internal database managed within iOS directly. Second, it's not scanning all your photos; it's scanning photos you are trying to upload to their servers.
The distinction is crystal clear
> In a purely computational way, what does it change?

It becomes a security and privacy issue because Apple now has full access to my entire device and can scan for whatever they want without me knowing about it. That is why I prefer this type of software to stay in the cloud.
And for the sake of argument: doesn't the fact that the algorithm "scanning" your pictures runs on your own device give you more control over any possible modification of it than if it were done in the arcane depths of the cloud?
I do not understand this argument.
> Those who are complaining obviously did not read how the technology works.

And you obviously don't know how technology and governments work. There's an idiom about the camel's nose…
You have a higher chance of winning the lottery than Apple erroneously looking through your photos.
> Don't know what Snowden is upset about. Apple is within their rights to keep child abuse off their servers. […]

You ever use a web browser? How is child porn kept off the iPhone if one were to use a web browser?
> You don't know, because you obviously didn't read what he said. He was very clear.

He was very buzzwordy and even posted memes for good measure.
Yeah, it’s clear you don’t get what’s going on with this tech. Imagine if someone had the superpower to just glance at an individual and know if they were abusing a child. You would not tell that individual to wear a blindfold because their mere glance was an invasion of your privacy, you would utilize them to bring abusers to justice.
This tech is like that superpower, but using ML instead of fiction. It doesn't scan and catalog your photos; it looks for something specific and spits out a pass/fail. If the bot flags too many photos, it spits out a fail and a human checks to make sure it's right. But if you pass, nothing happens and you get to continue living your life in anonymity and privacy. I.e., a soulless robot does the checking, one that has no memory, no prejudice, not even the slightest bit of interest in or understanding of what is in your photos, except whether they match a predetermined data set of child abuse material.
If a human came to check your home regularly that would be one thing, but having an autonomous system simply verify that you are not a child abuser is another thing entirely and you would do well to note the difference.
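The pass/fail flow described above (flag individual matches, involve a human only past some threshold) can be sketched as follows. The threshold value and the plain counter are simplifications assumed for illustration; the real system uses cryptographic safety vouchers rather than a visible count.

```python
REVIEW_THRESHOLD = 30  # hypothetical value, for illustration only

def human_review_needed(match_flags: list) -> bool:
    # match_flags holds one True/False per checked photo.
    # Below the threshold, nothing is surfaced to anyone: the
    # account "passes" and no human ever looks at the photos.
    return sum(match_flags) >= REVIEW_THRESHOLD
```

With no matching photos at all, `human_review_needed` is `False` no matter how large the library is.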
> I'm happy to see that you (and many others here) "get it".

Again, not wishing to be that guy, but with the explosion of digital CCTV, this has been technically possible for 15 years.
This has to be thwarted before it even gets going --- or it's too late.
> I'm happy to see that you (and many others here) "get it".

The problem with the statement is that there is a fundamental misunderstanding of what is happening. This is not a neural network trained to look for things in photos; it's a neural network trained to generate hashes. There are plenty of neural-net architectures that do image analysis, but this is not one of them, and it could not be applied in the way stated here.
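The distinction drawn here, a network that generates hashes versus one that analyzes image content, can be made concrete with two toy functions. Both are stand-ins invented for illustration; neither is Apple's actual model.

```python
import hashlib

def hash_style_output(image_bytes: bytes) -> str:
    # A hashing model emits an opaque, fixed-size digest. It carries
    # no description of what the image depicts, so it cannot be read
    # back to learn what is in a photo.
    return hashlib.sha256(image_bytes).hexdigest()

def analysis_style_output(image_bytes: bytes) -> dict:
    # An image-analysis network would instead emit semantic content
    # (toy stub with made-up labels), which is exactly what a
    # hash-generating network does not produce.
    return {"labels": ["beach", "dog"], "confidence": [0.9, 0.7]}
```

Whatever the input, the hash-style output is the same length and contains no labels; repurposing it to "look for things" would mean swapping in a different kind of network entirely.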
> I don't agree. First, it's compared against an internal database managed within iOS directly. […]

And once it gets on THEIR servers, then they can scan. They don't need to scan it while it's on my phone. People need to look toward the future, because it will be TOO LATE by the time something bad happens. This will NOT just be limited to CSAM and ONLY to when you have iCloud Photos on. Think about things a year or two from now. Do that NOW, because it will be too late when it actually DOES happen.
Between sync.com and pcloud.com, which one do you think is better?
I was googling Cryptomator just now and read a lot about Boxcryptor too. Are you experienced with either piece of software?
I don't mind paying for software if it works well, so I don't necessarily need a free service.
Thanks a lot!