
MacRumors

macrumors bot
Original poster
Apr 12, 2001


Apple's User Privacy Engineering Manager Katie Skinner and Privacy Product Marketing Lead Sandy Parakilas recently sat down with YouTuber Andru Edwards for a wide-ranging discussion on Apple's privacy policies.


Topics covered include Apple's approach to privacy, the ways Apple contends with privacy laws in different countries, and how Apple deals with government requests, plus there's a good deal of information on the new features in iOS 18.

Some of what's discussed covers privacy information that Apple has reiterated over and over again, but there are some interesting tidbits on Apple's adoption of ChatGPT, Maps privacy, the Passwords app, and accessory pairing in iOS 18.

It's a long discussion at almost 45 minutes, but worth it for those who want a bit more insight into Apple's philosophy on privacy.

Article Link: Apple's Privacy Team Does Deep Dive Into iOS 18 Privacy Features
 
I'll never understand this. Installing almost any ad blocker will show you just how many known tracking servers your apps are contacting. Maybe Safari blocks them, but let's face it, that's much less than half the battle now.

I mean, heck, almost every app's "privacy label" shows that so much of your data is collected and linked to you. So how exactly is Apple/iPhone promoting your privacy?
 
Before people hate on this, maybe spend 45 minutes and watch the video. We pay big bucks to be Apple users, and we expect privacy to be at the forefront.

If you bother to watch the video, you will quickly see that the Apple employees don't actually answer a lot (if not most) of the questions asked.
They talk strategy, policies, and mission. High-level directions. Things like 'checking out the code for privacy and security' are pretty specific. When they are given very broad questions, they can only answer broadly.
 
I mean, heck, almost every app's "privacy label" shows that so much of your data is collected and linked to you. So how exactly is Apple/iPhone promoting your privacy?

The way I look at the Privacy Labels is that at least you are being told, and on the rare occasion you find an app that doesn't collect info, you can give that app a little extra weight in your decision.

Consumers in the EU will start to lose those with the advent of alt-stores. I am very sure Epic will not force devs to declare the data they hoover up, as I am sure Epic will be hoovering with the best of them.
 
All PR spin to counter their move to use AI. Everybody knows Apple collects your data (and uses it for their own purposes) but they also allow apps to collect data. Of course Apple aren't the bad guys though because they allow you to "turn it off" through privacy settings. So wholesome. Thank you for protecting us, Tim!
 
All PR spin to counter their move to use AI. Everybody knows Apple collects your data (and uses it for their own purposes) but they also allow apps to collect data. Of course Apple aren't the bad guys though because they allow you to "turn it off" through privacy settings. So wholesome. Thank you for protecting us, Tim!
So you haven’t watched it then?
 
Before people hate on this, maybe spend 45 minutes and watch the video. We pay big bucks to be Apple users, and we expect privacy to be at the forefront.


They talk strategy, policies, and mission. High-level directions. Things like 'checking out the code for privacy and security' are pretty specific. When they are given very broad questions, they can only answer broadly.
I am not hating on the video or the people therein. I think those interviewed did a very poor job with the overly broad (quick yes answer) replies, instead of spending a few more moments to answer in a more thorough manner without divulging company secrets, etc. The video (as presented) strikes me as a PR stunt for the YouTuber to gain clicks.
 
I am not hating on the video or the people therein. I think those interviewed did a very poor job with the overly broad (quick yes answer) replies, instead of spending a few more moments to answer in a more thorough manner without divulging company secrets, etc. The video (as presented) strikes me as a PR stunt for the YouTuber to gain clicks.
I’m not suggesting you did. I think they are providing extensive answers. TBH I am 3/4 of the way through, and unless they finish with a lot of yes/no, they are giving a lot of informative answers. Unfortunately they haven't got transcripts turned on in their YouTube clip, so I can't give specific examples, but I am hearing a lot of explanation. As I’m writing this, the link was only put up 30 minutes ago and the video is 45 minutes long… so….
 
maybe spend 45 minutes and watch the video.
Or use https://notegpt.io/youtube-video-summarizer for a readable transcript, much faster.

00:00:00 Are apps listening to me at all times? Why would anyone who isn't a criminal or who isn't cheating on their spouse need privacy features at all? What privacy features are coming in iPhone 16? Is there any personal data that Apple collects, whether it's to make the product better or for marketing purposes? - Yes. (dramatic music) - Apple Intelligence was announced back at WWDC and you guys had a bunch of questions as it pertained to privacy. I'm here at Apple Park to get answers about those and more,

00:00:29 so let's go inside. - All right, Katie Skinner, Sandy Parakilas. I appreciate you guys joining me. Before we jump in, I wanna talk about your titles. Katie Skinner, you're Apple's User Privacy Engineering Manager and Sandy, you're Apple's Privacy Product Marketing Lead. For each of you, what does that actually mean? - Yeah, so I lead privacy engineering and so what me and my team do is we work with engineers all across Apple to ensure we build privacy into all of our products across the company, from our OSs to our services

00:01:01 like Apple Music, things like Siri, even when you go buy a computer through the Apple online store. And then we also partner with all the teams across Apple to build privacy-preserving features. These are things like app tracking transparency, Apple nutrition labels, and things like that. - Yeah, so one of the key things that I do in privacy product marketing is I work very closely with Katie and her team and the many teams at Apple that work on privacy to help figure out what privacy features and products

00:01:27 we wanna build. And then another part of my job is the outbound marketing about privacy. So this is everything from the slides that you see and the words that executives are saying at WWDC to our privacy marketing campaigns, our privacy ads, and that is some of the funnest, getting to sort of figure out how we communicate and add humor and context to our features is one of the really fun parts of my job. - Okay, so that leads me to some table setting here before we jump into some specific features.

00:01:57 What is privacy? - The way we think about privacy at Apple is we think it is a fundamental human right. And we think that, and this goes back to some words that Steve Jobs said many years ago that still inform us today, that you should ask, as a company, you should ask your users every time, ask them how you're gonna use their data and be really clear with them about what you're doing. And so that's really how we think about privacy. We think about it as a right, and we use that thinking to inform everything we build

00:02:29 and everything we do. - So why is that important to Apple? A lot of things you said there, obviously I work in the tech field, that does not seem to be the common shared belief amongst most tech companies. Why is it important to you? - We're all about putting the user first, putting our customers first. And we think privacy is a really key part of that. We think people expect it, and they expect us to be thoughtful and to think around corners and get ahead of things so that they don't have to worry about it.

00:03:04 - To make it clear, 'cause I think a lot of people kind of treat these the same, what is the difference between security and privacy? - Yeah, so we think that these are two different concepts that support each other. And security is more about protecting you from attack, from malicious actors. And privacy is more about choices, it's about control, giving you transparency about how your data is being used and what you're sharing with whom and how. But we think these two things really support each other

00:03:38 because if you don't have a basic level of security protections from those attacks, then you can't have any privacy. - Are you seeing users becoming more aware of how their data is being used and privacy settings and making choices with the tools that you've given them? - Yeah, I think we really are. I think we've really seen more and more awareness about this over the years and particularly the last few years. Look, I mean, this is something that we've cared about for a very long time, really since the beginning.

00:04:09 And what we've noticed is that users are becoming more and more aware of this issue. And so we're able to sort of meet them where they are and say, "Hey, we've been thinking about this "for a long time. "Look at all the stuff we've been building "and we're gonna continue to build for you." - Where does the privacy team fit in terms of when features are being built, proposed, or new hardware is being thought up? When someone at Apple comes up with an idea, this would be a cool feature for the iPhone

00:04:40 or a spatial computing headset might be something we should build. Does it get built and then someone says, "Hey, we've got to sprinkle in some privacy," or do they have to run it by the privacy team? The way I've seen companies say, "Oh, we've got to get legal involved. "We don't know what the, we've got to call them." Or is your team involved from the beginning? How does this work? - At Apple, we really believe that you have to build from privacy from the start. What you really have is when someone's thinking up an idea,

00:05:09 they're like, "I want to build a spatial computer "like you talked about." I think we're in, there's a member of my team that works with every different engineering team across the company, and they're in those same rooms when that spark of an idea of a spatial computer grows into more of a real plan, an idea. And then we continue on throughout the future development process, from working, partnering on how do we build a privacy-preserving architecture, identifying what are privacy risks, what are the things we're protecting with that feature,

00:05:40 and also developing what we call privacy assurances. So these are like the privacy promises of a feature. You might hear Craig talk about them or other people at WWDC, and that's kind of the guiding light that we use throughout the feature of is the implementation holding up to these? And then as we get closer to the end, actually looking at the code that's written and checking out the features to make sure that the technical design is implemented correctly. And I think one last key part of think what makes it work

00:06:12 is we also partner closely with our human interface and design team here. And so really thinking about what are the right controls to build into that feature, and how can we build users' expectations and transparency of how their data will be treated into the feature directly. And so that's, I think, both us and so many people across the company caring about privacy is really what we find to be really successful here. - Okay, so is there, or has there been conflict between we wanna do this really cool thing,

00:06:44 but there's privacy restraints? Or is the goal to find ways to make a privacy feature, like one that I'm thinking off the top of my head is Face ID, a privacy feature that is also cool at the same time? - Well, I think our big kind of mission is great features and great privacy. That's what we talk about all the time is we're looking to have both in one. And we also think about, Face ID is a great example because it both makes it super easy to use your phone faster than typing in a passcode, but also makes it so more people have more complex passcodes.

00:07:21 And so it means that the security and privacy of people's devices increase. So it's a great example of really where we're doing both at once. - It often seems the more privacy, the less convenient, or the more convenient, the less privacy you have. How does Apple think about that balance between privacy and convenience? - I think that's really where I was talking about great features and great privacy as our goal. And so we really think that we take a look and we need to innovate to be able to have both.

00:07:48 And that's really where we sometimes will push a little harder, look for new techniques from academia or other, or invent some to make sure that we have both in our features. - And kind of on that same subject, when it comes to privacy, it sounds like compromise often means you get less privacy. Privacy will lose in the face of compromise. So take various governments out there who ask Apple to create a backdoor for law enforcement to break through to help investigate crimes, for example. Why hasn't Apple compromised to provide a backdoor

00:08:20 for quote unquote, the good guys? - I think it really goes back to our belief in privacy as a fundamental human right. We really think about it at that level. You know, we approach all of the issues around privacy that way. We have this sort of very real focus on the customer and building features that, you know, give them the protections that they're expecting. Of course, you know, there are instances where we do need to work with law enforcement. - For the governments that have laws that seemingly go against the notion

00:08:59 that privacy is a fundamental human right, like that's Apple's belief. I think that's probably most people's personal belief. But if you live in a country where the country doesn't see that, how do you reconcile that privacy is a fundamental human right with governments that say, well, we need to have this data open and available to us? - We talk to them, right? We talk to governments, we talk to our customers, we talk to NGOs all the time. And, you know, what we communicate is the importance of privacy protections.

00:09:31 And, you know, we want to engage in a dialogue with all the stakeholders and be really clear about, you know, the importance of these things and the fact that, you know, really truly is a human right. - In some of these cases, it's not Apple's decisions, right? Like, I want to get over the misconception that if a government has a law and Apple wants to sell products and services there, then you need to follow those laws. Even though you might have a dialogue about maybe trying to change them or show them probably better lights,

00:10:03 ultimately you have to play by the rules of the country that you're in. - Yeah, I mean, I think it really goes back to, you know, we want to have this dialogue, we want to be clear with everyone about, you know, what customers expect and how the technology works and engage in this dialogue so that hopefully we can get to a good place. - Before we get to some misconceptions, I'm excited about that section. What do you think about feedback that I hear sometimes, I'm assuming you guys see this all the time too,

00:10:34 about how Apple's privacy features are too over the top, that no one needs them unless they're doing something wrong or have something to hide. So you see it all over social media during the keynotes. Apple is just making it easier for people to commit crimes. Apple is just helping people who cheat on their significant others. Why would anyone who isn't a criminal or who isn't cheating on their spouse need privacy features at all? - We think that privacy is really about what you want to share with other people.

00:11:01 And it doesn't need to, you know, what you're doing with your life and your behavior, that can also change over time. And so something that you may think is not sensitive initially, may over time, if you identify that it's connected to a condition or something like that, can change sensitivity. And so it's really about giving people the ability to make choices for themselves of who, whether it's people, applications, who they want to share their data and their information with. - So Apple, to be clear,

00:11:33 is not on this side of the pyramid. - No. - Just asking, just asking. Okay, let's get to some misconceptions. Are apps listening to me at all times? Is my iPhone just transmitting the audio of all my conversations to different companies all day long? - You know, this is something that we hear a lot and it's easy to jump to the easiest conclusion when you see ads that may show you something about the couch that you're talking to a friend about. But I think I can start first by talking about

00:12:05 our protections and what we know. And then some of the other things that may be responsible for that. So first off, we're not aware of any application that is breaking our OS security and privacy protections for microphone or camera. And so what that really means is our layers of protection start by no application has access to the camera or mic when you install it. Then an application can choose to request access to either the camera or mic and the user's in control to decide whether they want to authorize it or not.

00:12:39 If they choose to do so, then while they're using the app, if the camera or mic is used, they can always go and see the indicator. And so the camera or mic indicator, you can just swipe down from control center and see exactly what is using those sensors. And I think that's one thing that users can always feel in control with that. We've all had that case where we're having a conversation about a particular product or a topic, and then it seems to show up in an ad, seems to show up in a recommendation.

00:13:10 And there are a lot of other techniques that are used for that that are a little bit more advanced. So if I go over to your house and I join your wifi and you've been looking at cool blue couches online, that IP address of me joining your wifi can be connected in the backend. And that's a link that can be used to then serve me ads later. Or maybe if we're connected on different types of social media, there's a lot of pieces in the backend that these powerful models can use to link these things.

00:13:41 So there's other techniques out there that are probably responsible for that. - One thing I heard, some of these ad companies, they don't even need the, they don't need it. They have so many other things they can look at that it would be almost useless to have it. You're saying these models are so powerful. - Well, and I think that's why we've built in things like app tracking transparency into our device to put you back in control of your interactions with a particular application, how you want them to be used and for which purposes.

00:14:10 - Okay, next misconception, let's talk about this. There's a distinction between what your device knows and what Apple knows. And a lot of people kind of see them as one and the same. For example, when Face ID was first announced, I was saying there's no way, there's no way, I'm not giving Apple my face, I'm not giving my face again. What are they gonna do with that? So the question really is what is Apple doing with all the thumbprint data and facial data that you've collected and have been storing

00:14:34 over all these years? - So when you enroll in face ID or touch ID, that information is staying on your device and never leaves. So we're doing nothing, we don't have it. - So in other words, my device, anything that's happening on my device, my device is not Apple. - Correct. - Just because I take a photo and it uploads to iCloud doesn't mean everything goes there. How do you explain that in a way that a customer, I guess for lack of a better term, would believe it? - Well, I think there's two parts of it.

00:15:04 I think first, we really believe in the privacy pillar of data minimization. And that means only collecting the data that we need to do the particular task. And so before we even get to explaining it to people, I think that's how we build our products from the start. And so that enables us to just explain to people when their data is actually leaving the device. And so I think that's one big thing that there's so much engineering and work that's done across so many teams at Apple to really make it

00:15:37 so things like Face ID are protected on your device and never leave. - Is there any personal data that Apple collects and uses in any product, whether it's to make the product better or for marketing purposes? - Yes, and I think this is a great place to stop and sort of walk you through our privacy principles. Katie mentioned one of them, but there's actually four. And so these are four things that we think about whenever we start designing a product. From the very beginning, we think about these things.

00:16:07 The first one that you mentioned is data minimization, right, that's about trying not to collect data, minimize the amount of data that Apple or other companies collect. The second one is on-device intelligence. So this is one way to do data minimization. And this is when we actually build the intelligence for a feature into your device. And that's really cool because that means that the data can stay on the device and doesn't leave, which is wonderful for privacy, wonderful for security. Then, to your point and your question,

00:16:39 there are some times that we might need to collect some data or someone else might need to collect some data. We all use the internet, right? Like, that's about communicating and all of those wonderful things. And so what we do in that case is pillar three, which is transparency and control. And this is about when data is collected, being really transparent with users about what's being collected and giving them control over that. Do you want that to be collected? Here's what's gonna be collected.

00:17:06 Do you wanna turn that off? And then the last one is one that we talked about earlier. It's security, right? We have to have this foundation of security underpinning all of these things to protect you throughout that whole process. - For Apple in particular, though, do you guys take and store data, user data, for marketing purposes or to make products better? Like, do you have a bank of our data somewhere? - Well, one program we can talk about is improve iPhone analytics. So when you set up an iPhone, we give you a choice

00:17:39 whether you'd like to share data that's not linked to you to help improve our software. And so that data is used by engineering teams to make decisions, to find bugs, track down issues and fix them, and also to understand how our features are being used so then we can improve them and invest in the places that people love. - So for both that and earlier, we talked about the, if apps are listening to you in the background, these are things that users have chosen to do. It's not something that's just happening in the background

00:18:13 that you have to later find out about and then figure out how to turn off and opt out. Or in the case of the microphone, there's no way for an app to record you without the indicator letting you know that, right? - Correct. - And going back to the sort of the Steve Jobs quote at the beginning, ask them, ask them every time. That is the watchword, right? Just ask people, let them know what you're doing, let them make a choice. - And privacy is one of those things where people can have different opinions about it.

00:18:41 And so we really think the best approach is putting users in control so they can make the best choices for them. - What happened a couple months ago where people started seeing old photos that they believe they deleted a while ago that started reappearing in their camera roll? Does Apple actually hang on to user photos and data after it's been deleted? - Yeah, I'm really glad you brought that up. So what you're referring to is a rare issue that impacted some users and some photos where due to database corruption,

00:19:13 those photos were reappearing in people's photo libraries. Now, I think it's really important to understand here that in many cases, those photos were actually photos that those users wanted to keep, but that had disappeared for them. And so how we're addressing this in iOS and iPadOS 18 and macOS Sequoia is we've actually created a new album in photos called the Recovered Album. And this is gonna be a locked album. What's gonna happen is your device is gonna scan for files that are impacted by database corruption.

00:19:45 Again, a rare issue, but if it happens to you, those photos will be moved into this locked Recovered Album and you can go in, you have to authenticate with Face ID or Touch ID, and then you can decide, ah, I see this photo was affected by this database corruption. Do I want to recover it to my photo library or do I wanna delete it? - If I, as a user, have any data on my phone or anywhere in iCloud and I delete it, I can feel confident that even though I've deleted from my devices, Apple doesn't still maintain

00:20:19 a copy somewhere else. - Correct. - I don't know if this is a misconception, but it's just something I thought was cool that I heard about years ago, and I forget exactly how it worked as it pertains to privacy. So can you remind me how Maps Navigation works? Because my understanding is when you get directions from a location to another location, by the time your trip is done, Apple has no clue where you started and where you ended. - One thing that I think is different than some other mapping applications is this is,

00:20:47 that trip is not linked to you. We know that map data is super sensitive. All the different places that you go from the gym to a restaurant, to a bar, to a doctor's office is really sensitive. And so we use rotating identifiers to enable navigation. And then what we also do is we cut off the beginning and the end of it and apply those with separate identifiers so your whole trip cannot be linked as we continue to improve the map service. - So even Apple has no idea where you started, where you ended?

00:21:19 - Yeah. - We don't keep a history of the places that you've been through the Maps app. We've intentionally designed the system to not do that. - All right, let's move on to some of the new stuff you guys announced recently at WWDC, starting with Apple Intelligence. Probably the coolest thing that was announced, I guess in my opinion, but also maybe the most eyebrow raising as it pertains to privacy, probably mostly because these things are so new and so different. So let's start with the semantic index.

00:21:46 It's the way your devices can access your data and give you answers based on what is on your device. How would you describe the semantic index to just a common person and how is it secured? - Yeah, so the way we think about this and the way we thought about Apple Intelligence from the start of the process was we have to provide powerful privacy to go hand in hand with the powerful intelligence that we're building. And so the first and sort of foundational way that we're doing that is by building the core

00:22:18 of Apple Intelligence into your device. And part of that is the semantic index, right? We wanna give you incredible functionality using your personal context. And so we have this semantic index that allows the system on device to understand your personal context and provide you with really helpful results. And of course, it's incredibly important to secure that. So the semantic index itself is encrypted on your device and it's protected by Data Vault, which is a kernel-level security mechanism we use to make sure that only the right processes

00:22:56 have access to certain information. - Will this only include information from Apple first party apps? - It can also include information that other apps on your system also provide. - So going off of that, I mentioned earlier like the balance between convenience and privacy. If I'm on my iPhone and then I switch over to my Mac or my iPad, will all those devices have their own semantic index or is there one semantic index that kind of syncs between them all? - They'll have their own, but keep in mind,

00:23:28 we have really great syncing between your devices using end-to-end encryption to keep your devices up to date. So in many cases, your devices already have the same or almost all the same information on them to begin with. - Okay, but if my iPhone has like extra apps that I may not have on my iPad, for example, then I may get a different answer or it might not find the same information because it's not synced. It's not like an iCloud-synced index, it's on-device. - Yeah, in that case, you're not gonna get a result

00:23:57 from the app you only have on your iPhone on your app. - Yeah, that context won't be available on the other device. Does the semantic index include markers for how I personally communicate? So in other words, if I ask Apple Intelligence to make an email sound more professional, will it make it sound like generally more professional or will it make it sound more professional based on how I write? - The model is taking in what you have provided as that text and so it's kind of the combination of the input

00:24:29 that is the original as well as what we've trained to make it seem more professional. - Then you guys have private cloud compute, which almost sounds like an extension of my device. How do you explain what private cloud compute is? - The way we think about this is, as I mentioned earlier, we're building the core of Apple Intelligence onto your devices and we're doing a lot of the processing on your devices because that is a great way of doing things. But of course there are times when you have a really complicated request

00:25:03 and a larger model than can fit on your device today is necessary to give you a fast and good response. And so for those cases, we've built private cloud compute. And I think you sort of suggested it, but the whole approach here is just to take the security and privacy promises of your iPhone and extend it to the cloud. And so based on the way we've built it, with private cloud compute, just like with your device, Apple never has access to the data that's being used. - You've built private cloud compute in a way

00:25:38 that it's third party verifiable, that it's as private as you say it is. How does that work? How is it that a third party can verify this? In other words, how do people know that Apple isn't just saying this is how it works, but we're really doing something else? - Well, I think that's one of the core promises that we have today from a security and privacy perspective with an iPhone: because the software is in your hands, security and privacy researchers can technically verify what it's doing.

00:26:09 And so I think that's the thing that we really wanted to bring to private cloud compute. Because traditionally, if the code is running on someone's servers, you don't know what's happening. Someone could be storing the data, they could be using it for other kinds of purposes than they say. And so that's one of the things that we think is really important about private cloud compute is giving access to that code to researchers so they can take a look and they can also verify that the promises that we're saying here

00:26:39 are actually happening. - And so when you do something that requires private cloud compute, is there a visual indicator or is it just you kind of just don't have to worry about it because of the fact that it's private? - I think we really believe that we've built it, as Sandy said, where the powerful intelligence and powerful privacy go hand in hand. And these layered privacy protections that we've talked about of using on-device intelligence, then only sending the data up to private cloud compute

00:27:09 that is needed, never storing the data, are really strong, kind of setting the bar for what intelligence can be. - Okay, so I do something that requires Private Cloud Compute, it's sent to the cloud, it gets processed there, I get a response, and how long before... like, is there no storage at all? The response comes and it's just gone? - Correct, and neither the query nor the response is used to train our models. - One of the other big things that I think raised some eyebrows a little bit

00:27:42 was the ChatGPT integration in Apple Intelligence. So before I dive deeper into that, just tell me how that works. - Yeah, so this is another case where we're really putting users in control. We're excited to have the ChatGPT abilities be an extension, but before any data is sent to OpenAI or ChatGPT, the user can choose whether they'd like to share that data or not. And we think there are really strong privacy promises that are part of this integration. And so the data that is sent to ChatGPT

00:28:18 is not stored by OpenAI, it's not used to train their models, and it doesn't require anyone to sign in. And so I think those are all key, plus we also obscure IP addresses. And so that prevents multiple different ChatGPT sessions from being linked together. - There's one other, I think, really important point in addition to all the really strong privacy protections that Katie outlined. All the Apple Intelligence features that we've built, those use Apple models. Those are models that we've built,

00:28:49 and the ChatGPT integration is separate from that. - So until you send something to ChatGPT, which you have to confirm on your own to do so, anything else you're doing in Apple Intelligence is not using anyone else's models but Apple's. - Correct. - Those rules that you gave, those privacy assurances as you called them, around sending to OpenAI, will that apply across the board? Like Apple has already said, we're looking to see what other models we might add in the future. Are there gonna be different privacy assurances

00:29:16 based on which model you use, or is it gonna be, if you use a third party, here's what you can expect? - So those promises are for the ChatGPT integration, and that's all that we've announced at this point. Certainly, as we've been saying, we care deeply about privacy, and we're gonna continue to apply our privacy principles broadly to everything we build and every integration that we do. - And then on the flip side, if I love ChatGPT, have an account, use it all the time, can I send anything to ChatGPT,

00:29:47 or is it only things that Apple Intelligence recognizes as something that I might want to send there? - Well, you can link your accounts, your ChatGPT account, and in that case, you are logged into your ChatGPT account and you're sort of under the auspices of their privacy policies. - So how about, let's look at search for a moment. We know you're keeping privacy paramount with the ChatGPT integration. Can you talk about how that works with search? We know Google is the default search engine in iOS,

00:30:18 and a lot of people are leery about Google's privacy practices and standards. What kind of privacy guardrails, if any, are in place for search? - Yeah, so to start, when you use Safari, there are a bunch of privacy protections built in around search. So you can choose whatever your preference is as your default search engine, and you can even choose a different default search engine when you're in private browsing than when you're in regular browsing. And when you're using Safari and you're typing

00:30:50 into the smart search field, the bar along the top, we have a set of privacy protections there. So for example, we don't send your device location data when you're searching that way, and we don't send cookies through those searches. - Let's move on to the new Passwords app. What are the key features of the new Passwords app, and what can it do that the previous, I don't even know what you would call it, password management tool in Settings couldn't do? - Yeah, so we're super excited about the new Passwords app.

00:31:19 And I think the best way to describe this is, we've been working on passwords and password management for 25 years; you may remember Keychain. Over this time, we've been iterating on and improving the way that we help users with their passwords. And what we've done this year is bring together all these great features into one place, an easy-to-use app, the Passwords app. And so you can do everything from creating passwords to managing your accounts and passwords, and you can get two-factor authentication codes

00:31:54 through the Passwords app. You can even get warnings if a password has been compromised in a known data breach, or, for example, if your password is "password", that might not be a great password. So we'll have a warning that says this might be a weak password. And then you can start changing it right from there in the Passwords app. - Speaking of one of the things you mentioned there, the compromised passwords, the known databases, how does that work? If that happens and I'm in my Passwords app

00:32:20 and I see, oh, here are some passwords that have appeared in a breach, how is Apple finding that information, and what is the recommended next step? - There are databases that we work with that provide these known breaches. And so through that, in a privacy-preserving process, we can notify you that your particular account was impacted. And then the way it works in the Passwords app is you'll see this alert, and right inline you can start the process of going to the website or the app and changing that password.
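Apple doesn't spell out its exact mechanism here, but one widely used privacy-preserving way to check a password against a breach corpus is a k-anonymity range query, popularized by the Have I Been Pwned Pwned Passwords API: the client hashes the password locally, reveals only the first five hex characters of the hash, and the server returns the suffixes of every breached hash sharing that prefix, so the comparison happens on the client. A minimal sketch with a simulated server side (the corpus and function names are illustrative):

```python
import hashlib

def sha1_hex(password: str) -> str:
    """Hash a password the way the range-query protocol expects."""
    return hashlib.sha1(password.encode("utf-8")).hexdigest().upper()

# Simulated server side: a corpus of passwords seen in known breaches.
BREACH_CORPUS = {"password", "123456", "letmein"}

def range_query(prefix: str) -> set[str]:
    """Return the suffix of every breached hash sharing the 5-char prefix.
    The server never sees the client's full hash, so it cannot tell
    which password (if any) the client is checking."""
    return {h[5:] for p in BREACH_CORPUS
            if (h := sha1_hex(p)).startswith(prefix)}

# Client side: reveal only 5 hex characters, compare suffixes locally.
def is_breached(password: str) -> bool:
    digest = sha1_hex(password)
    return digest[5:] in range_query(digest[:5])

assert is_breached("password")  # known-breached password is flagged
assert not is_breached("a long unique passphrase nobody has leaked")
```

The privacy comes from the prefix matching many unrelated hashes, so the query reveals almost nothing about the specific password being checked.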

00:32:59 - Let's move on to another new feature, app locking and app hiding. What prompted Apple to introduce the ability to lock and hide apps? - We've all been in the situation where we've handed our phone to someone, maybe to show a photo, or maybe if you're driving and you need help with directions. And there's just a growing amount of sensitive data on our devices. And so we wanted to give people the ability to hand over their device to show a photo, but do so with confidence while protecting their sensitive data.

00:33:29 - So walk me through the process of how this works. You have a bunch of apps on your phone with data you don't want just anyone seeing, let's say your banking app, right? How do you lock it? - So you long press on the app, and then in the menu there you can choose to lock an app. And then what happens is if someone wants to open that app, you tap on it, it will authenticate with either Face ID, Touch ID, or passcode, and then it will open. So if someone can't authenticate, then they can't see it.
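The locking behavior described here can be modeled as a simple authentication gate: nothing the app holds is shown, including notification previews, unless authentication succeeds. This is a toy illustration of the semantics, not iOS code; the class and values are invented.

```python
from dataclasses import dataclass

@dataclass
class LockedApp:
    """Toy model of a locked app. A boolean stands in for a successful
    Face ID, Touch ID, or passcode check."""
    name: str
    content: str

    def open(self, authenticated: bool) -> str:
        # Without authentication, the app's content is never returned.
        return self.content if authenticated else "Locked"

    def notification_preview(self, text: str, authenticated: bool) -> str:
        # Previews are suppressed too, not just the app screen itself.
        return text if authenticated else f"{self.name}: notification hidden"

bank = LockedApp("Banking", "Balance: $1,234.56")
assert bank.open(authenticated=False) == "Locked"
assert bank.open(authenticated=True) == "Balance: $1,234.56"
assert bank.notification_preview("Deposit received", authenticated=False) \
    == "Banking: notification hidden"
```

The point of the model is that the lock is enforced at every surface where the app's data could leak, not only at launch.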

00:33:57 And I think another key part of that is it's not just opening the app. If there's a search, or a notification comes in, that information will also not be shown. - For locked apps? - Correct. - Oh, interesting. Okay, I thought that was only for the hidden apps. So even if an app is locked, you'll get those assurances too, okay. - Right. - And you did say it will have a passcode backup. So if you have someone in your life who you are fine with going into your locked apps, they don't have your face,

00:34:24 but they can just type in your passcode and get in there. - Passcode will work. - So maybe you have a health-related app, or a dating app, or an app with sensitive work information you're working on. Again, maybe a spatial computer that no one's supposed to know about. You have messages in there. That's where hidden apps come into play, which hides the app, as the name implies. Tell me more about how that works. - So you can also choose to long press and hide an app. And this will put it in a locked hidden apps folder.

00:34:50 So you'll navigate over to that in the App Library to access the app. - If I have a hidden apps folder, does that then make it weird that I'm hiding apps in the first place? Or does everybody just have a hidden apps folder? - Everybody has the hidden apps folder, yeah. - So even if I don't hide an app. - Right, and you can't see inside it, right? Until you use Face ID or Touch ID to open it, you can't see inside the hidden apps folder. - So it doesn't reveal anything about it. - All right, next let's move on to contact sharing.

00:35:17 This is a big one. This is actually a topic I've wanted to talk to you guys about for years. So I'm super glad I'm talking to the right people. So before we talk about the new feature, I wanna talk more broadly. Apple believes privacy is a fundamental right. You already said that and your data should be your data. So here's my argument. If I give you my name, address, work, email, which tells you where I work, personal email and phone number, I'd argue that what you have on your phone is my data,

00:35:43 not your data. But if I'm someone who deleted my Facebook account because I don't trust how Meta uses personal information, there's nothing I can do to stop you from sharing my information with Meta when it asks you, as you install Facebook, do you wanna share your contacts? How does Apple square those two beliefs, privacy and your data being yours, when this one piece of data, more than any other on your iPhone, the data in your contacts, is full of data that belongs to others? - The way we think about this is we really wanna

00:36:14 put people in control and give them choices. And I think what you've seen from us over the years is that we have incrementally added more and more granular controls. And you've seen this in things like location, you've seen this in photos, and this year you're seeing it in contacts. And so what we're saying is, look, maybe there's a messenger app of some kind that you only use to communicate with your family. Maybe you don't wanna share all of your contacts with that app. So in that case, you could just individually share your family members,

00:36:49 or you could even create a smart list and share the list with the app. And then they're only getting the specific people who you know you want to communicate with in the app. - Do you think if the iPhone was being announced today, or any of the platforms being announced today, that you would include the ability to ask for your entire contact list? - You know, it's hard to speak to a hypothetical. I think we've learned a lot over the years about how people think about privacy and what's important.

00:37:20 And the approach that you've seen from us is one that we will continue, which is that we're gonna continue to give you more and more control and we're gonna continue to build more and more privacy protections and features so users stay fully in control and make the choices that they wanna make. - And it really, I think, comes to the pillar about data minimization of, there are different kinds of applications that you wanna do different things with, and it's appropriate to share different amounts of data

00:37:49 to accomplish that. And so to what Sandy says, we wanna put you in control to help, you know, be able to fully experience the vast app ecosystem and have there be flexibility and exploration, but at all points let you be the one kind of choosing what to share. - Now that Apple has made person-to-person contact sharing so much easier than it was in the past, like in the past you had to either just type it out or copy a business card, now it can automatically recognize, you know, information from an email,

00:38:20 or you can just tap and share your contacts. Do you see more room for improvement still on allowing the person sharing their information with others to maybe have some sort of say in how their information can be shared? - I think like Sandy said, where we always continue to look for ways we can improve and put users more in control. And you've kind of seen us year after year continue to make improvements and take steps that put users in the driver's seat. - And for the new selective contact sharing,

00:38:51 what situations do you kind of imagine it being used for? I know you said like one example might be a messaging app where you're just, it's more like personal friends and family, but what other things can kind of get the ball rolling in people's minds for how they can use this? - Yeah, so banking apps where you're wiring money to a handful of people. I think one of the ways to think about this is, you know, think about how you could use lists. Like maybe there's a gaming app where you have a set of contacts

00:39:21 or people you play games with, and that's who you want to share with a particular app. So we think that the possibilities here are kind of endless. - Let's move on to accessory pairing. We've got a new way to pair accessories coming. This is interesting because I think most people, they buy a smart home product and they have to download an app and they're setting it up and then you'll get hit with a prompt. Can we look at all the devices on your network? And I almost feel like it's like just tapping yes

00:39:44 on a big EULA screen, licensing screen. No one really reads it. I bought this thing, it's $300. I just want it to work, I'm just gonna say yes. How does this new method aim to change that behavior? - Well, I think we really want to make it more seamless to set these things up. So a lot of these accessories we find require Bluetooth, Wi-Fi, multiple types of communication to get set up. And that app really only wants to connect to that one accessory. So we wanted to build a way that's faster to set up,

00:40:13 a better experience for people, and really limits access, granting it only to that one accessory, so the application doesn't need to see the other devices in your home or in the local area. - Is this something that just works once you install iOS 18 and the related operating systems? Or do app vendors need to update their apps for this? - It does require developer adoption to get that seamless pairing. - How about some quick lightning-round questions? What feature, for each of you, are you most proud of

00:40:49 as it relates to privacy that you've shipped? - "Shipped ever" is a good question. I think this year, I do think Apple Intelligence and Private Cloud Compute is a big one. Just the whole approach of how we thought about it, having the on-device portion and reaching out to the cloud on the technical base of Apple Silicon. And in both cases, it's these years of investment that really enable us to build this new kind of baseline for privacy in AI. - You know, a few years ago, we shipped privacy nutrition labels

00:41:29 and app tracking transparency in the same year. And to me, that was just a great example of us giving users transparency and control. Nutrition labels, you can see how an app is gonna use your data, you can make a choice, and then app tracking transparency, we're giving you a choice about whether apps are allowed to track you or not. - And then kind of in that same vein, what privacy features have you heard from customers that they've appreciated the most? - You know, I think those two are two big ones for sure,

00:42:02 ATT and privacy nutrition labels. You know, just the ability to make choices, the prompts that are available throughout the system that let people decide what data to share and what not. Like, you know, the iterations that we've made on those over the years, I think people really appreciate. - App tracking transparency is my favorite privacy feature. I love it so much that I specifically don't turn on the feature that lets me just have it say no for me in the background because I like-- - You want the--

00:42:32 - I wanna be able to press it. I feel good when I download an app. No, you cannot. - Satisfied, huh? - Yes. - What's something that surprised you about working in this field? - I think the continued evolution. That's what keeps it interesting, too: there continue to be new threats emerging, but I also love to see the growing community of researchers coming up with new technologies to help us mitigate them. - You know, I feel surprised often by how complicated this landscape is and how much there is going on.

00:43:09 And like, you know, I never have a, I don't know about you, but I never have a boring day here. There's always some new challenge to deal with. - Something a little more fun. What devices do each of you use on a daily basis? What phones do you carry? What Mac or iPad or what do you use? - Okay, I think I probably use a bunch. I have like a MacBook Pro that I use most of the time, commute on the bus with. I also have a desktop at work. Sometimes I use my iPad too. So multi-device, I love my watch for running.

00:43:42 And this year I'm using the big phone. - So I'm in the midst of a transition that hasn't quite finished yet from the Pro to the Plus. But I actually have, I have two phones. I have a Pro that I use for work and an SE that I use in my personal life. It's very light. - That's interesting. - You know, it's just, it's great. I love it. And then similarly, you know, I've got a MacBook Pro and iPad Pro that I use for personal creativity and things like that. And an Apple Watch. - What privacy features are coming in iPhone 16?

00:44:16 No, I'm kidding. (laughing) Katie, Sandy, thank you so much for joining me today. I really appreciate it. - Thanks for taking the time to share with everyone all the cool features this year. - Yeah, thank you so much.
 
i'll never understand this. installing almost any ad blocker will show you just how many known tracking servers your apps are contacting. maybe safari blocks them but let's face it, that's much less than half the battle now.

i mean heck, almost every app's "privacy label" shows that so much of your data is collected and linked to you. so how exactly is apple/iphone promoting your privacy?
My ad blocker has blocked 18+ million items since I got my M1. And that's just on Chrome. I use Safari and Firefox for other things. So I won't bother to check but the total might be more than double that.
 
I’m not suggesting you did. I think they are providing extensive answers. TBH I am 3/4 the way through and unless they are finishing with a lot of yes/no, they are doing a lot of informative answers. Unfortunately they have not got transcript turned on in their YouTube clip, so I'm not giving specific examples, but I am hearing a lot of explanation. As I’m writing this, the link was only put up 30 minutes ago and is 45 minutes long… so….
The video was posted 6 hours ago even though the MR article was recently posted.
 
The video was posted 6 hours ago even though the MR article was recently posted.
That’s so cool. Thanks. 🫡

I'm guessing my search was private? 😂. That summary is amazing, especially because there is so much content in it.

The video was posted 6 hours ago even though the MR article was recently posted.
Fair enough. For what it’s worth, I haven’t heard a single Yes/No answer.
 
true privacy from tracking can only be obtained with a systemwide ad blocker.

Also Apple touts its on-device intelligence, but there's no way to completely turn off Spotlight learning, Photos face recognition/memories, and all the analytics for the "For You" sections of their services. Charge limits used to require them to learn your habits until iPhone 15 when a hard toggle was made available.
 
But Apple has been defaulting Safari's search engine to Google?
Okay.

So what?

That is the very first thing I change on any new device. I hate Google with every fiber of my being but if I owned Apple and Google continued to stroke me a check for b-b-b-billions every year I would make the same choice. I would also take some of those b-b-b-billions and spend it on anti-Google ads telling people how and why to choose a different search ;).

If the average Joe would spend 5 minutes searching for easy ways to better protect their data I bet many more would "opt-out".
 
So what?

That is the very first thing I change on any new device. I hate Google with every fiber of my being but if I owned Apple and Google continued to stroke me a check for b-b-b-billions every year I would make the same choice. I would also take some of those b-b-b-billions and spend it on anti-Google ads telling people how and why to choose a different search ;).

If the average Joe would spend 5 minutes searching for easy ways to better protect their data I bet many more would "opt-out".
So, Apple can get off their high horse about "privacy" when they default their search engine to this evil company.
Grow a spine, Apple.
 