I want to put in a plug for privacy. Without intentionally building these things in from the start, it is VERY VERY difficult to go back and add them once a technology matures.*
So while I don't know if Apple's technical approach is the right one, I want to give them a lot of credit for building privacy into their AI right from the start -- and for however they "struggle" with that as they build up their AI tools.
Privacy was not a great concern when the internet was first built, so a lot of infrastructure was NOT designed with privacy in mind. The result is that most of us are still using unsecured email protocols! And it's clunky and often not user-friendly when you want to make privacy and confidentiality a key part of your email process. Do you remember back in the day when public PGP keys appeared in some email signatures? Well, as much of a geek as I am, and even though I kinda understood the process, it was clunky enough -- and nobody in my network of friends and coworkers used PGP or even thought about privacy -- that I never really adopted it either. That's just one example of how clunky it is to bolt privacy onto email when the protocol itself wasn't designed from day one with privacy in mind.
So again, kudos to Apple.
Sure, using ChatGPT directly may provide better immediate results, etc. What "you" (or others who don't think much about these things) are missing are the known and unknown downstream consequences.
Even today, when digital communications and internet technology are so common, "we" (the general "we") are all enamored by the glitz of immediate results and responses without understanding the full consequences of what we are giving away and how we are selling ourselves.
Nope. We as the general public HAVE TO push back and demand that privacy and confidentiality be built into new technology -- technology cannot and should not treat privacy and confidentiality as afterthoughts.
[*] This is a footnote to say that what I mean by "very very difficult to go back" and build privacy and confidentiality back in is that it is hard to build them in such a way that people actually adopt them, that isn't clunky, and that doesn't have glaring holes. Holes will always exist, but fewer of them will exist when something is intentionally built from the ground up with a particular idea in mind.
Edit: And thank you for reading my sort-of-rambling thoughts. I'm sure if I took the time I could edit down that mess up there.