
southerndoc

Contributor
Original poster
iOS 26 enables Apple Intelligence to learn from apps by default.

My health system uses several apps for communication and medical records, so I disabled "Learn from this App" and anything else I could find that would allow Apple Intelligence to obtain data (prioritization, etc.).

It makes me wonder how many people do this -- not just for work-related healthcare apps, but also for personal apps that contain sensitive medical information/PII.

Does anyone know if Apple has certified Apple Intelligence for HITECH/HIPAA compliance?
 

Key thing to note: data is only PHI and therefore covered by HIPAA if all these criteria are met:
1) Generated or passed through a covered entity (e.g. physician, hospital, payer, etc)
2) Identifies an individual (or includes data elements for which there is a reasonable basis to believe it can be used to identify an individual)
3) Relates to the individual's past, present or future physical or mental health or condition, the provision of health care to the individual, or the past, present, or future payment for the provision of health care to the individual

In particular HIPAA does not cover someone's use of a device or software from a non-covered entity. For example, OpenAI does not have to protect any medical information uploaded to ChatGPT (unless OpenAI agreed to do so for example under an enterprise agreement).

Providers do have to ensure compliance with HIPAA, including making sure any system, software, or device they load PHI into meets HIPAA requirements and maintaining BAAs with software vendors, consultants, etc. touching their PHI.

Presumably any apps your health system supports/allows on phones (or otherwise allows to access data from their EHR, etc.) were provisioned with a BAA, and the vendor supplying that software keeps all PHI received, transmitted, or stored encrypted (encryption at rest and end-to-end encryption in transit) in such a way that Apple Intelligence, Siri, etc. wouldn't be able to see the PHI even if you hadn't disabled them from learning from the app. Note that it is the responsibility of the covered entity, business associate, etc. to ensure there's no leakage.
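For what it's worth, Apple's platforms do give developers explicit switches to keep app content out of system indexing and Siri/Suggestions learning, which is presumably part of how such vendors keep PHI invisible to Apple Intelligence. A minimal sketch (the activity type string and the screen it represents are invented for illustration; a real app would set these flags on every activity that surfaces PHI):

```swift
import Foundation

// Hypothetical user activity for a screen that displays PHI.
// "com.example.phi-view" is an invented identifier.
let activity = NSUserActivity(activityType: "com.example.phi-view")
activity.title = nil                          // no searchable title
activity.isEligibleForSearch = false          // keep out of Spotlight
activity.isEligibleForHandoff = false         // don't offer on other devices
activity.isEligibleForPublicIndexing = false  // never contribute to public index
#if os(iOS)
activity.isEligibleForPrediction = false      // keep out of Siri Suggestions
#endif
```

This only covers the system-indexing side, of course; it says nothing about what the vendor's own servers do with the data.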

On the flip side, safest to assume any app, etc not covered under HIPAA, etc or a mutually signed contractual agreement with penalties will sell your information sooner or later. That has not stopped people from uploading the most personal information into these apps. I would be less concerned about Siri, etc indexing the data in these apps than what the developer of these apps does with the data.

Long story short, be careful with any health information of yours that you put into apps, websites, etc for your own sake. Be very careful with any health information of your patients that you put into apps, websites, etc for your patients' sake as well as your legal risk and liability.

P.S. The above assumes the US regulatory environment and ignores any additional state-specific rules. Similarly, other countries have their own rules (or lack thereof), which may be completely different. Also, not a lawyer...
 
I know when it applies, and trust me, for a physician communicating patient information on behalf of a health system, HIPAA applies.

PerfectServe (the app we use) encrypts it, but it was my impression that Apple Intelligence would have access to that. Does it not?
 
I know when it applies, and trust me, for a physician communicating patient information on behalf of a health system, HIPAA applies.

Yes, it definitely applies to you and your health system. What I was trying to clarify is that it doesn't apply to any apps (not made available by a covered entity) that a person loads with their information. Similarly, Apple is not generally a business associate to covered entities (there may be a few exceptions where they took on that risk, for example to collaborate on a specific project with an academic health system), and their software doesn't generally meet HIPAA requirements. As HouseLannister highlighted, they explicitly prohibit loading PHI into iCloud, Private Cloud Compute, etc.

PerfectServe (the app we use) encrypts it, but it was my impression that Apple Intelligence would have access to that. Does it not?

As software that serves the health care sector specifically to handle PHI, I am sure they thought through all the requirements to meet HIPAA, and I don't see how any software handling PHI could let something like Siri, Apple Intelligence, etc. touch the PHI it is holding.

My guess is that Apple's services never see unencrypted PHI through this software, because it encrypts the data in flight and at rest and doesn't cache it locally (on the phone).
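As a rough illustration of the at-rest side (this is not PerfectServe's actual implementation, which I haven't seen; the key and payload are throwaway values, and real key management would use the keychain or Secure Enclave):

```swift
import CryptoKit
import Foundation

// Throwaway key for illustration; a real app would keep this in the
// keychain/Secure Enclave, never generate-and-forget like this.
let key = SymmetricKey(size: .bits256)
let phi = Data("MRN 000000: example note".utf8)  // invented payload

// Only the sealed ciphertext ever touches disk, so anything reading
// the file (including an on-device indexer) sees opaque bytes, not PHI.
let sealed = try! AES.GCM.seal(phi, using: key)
let stored = sealed.combined!

// The app decrypts only inside its own view path.
let reopened = try! AES.GCM.open(try! AES.GCM.SealedBox(combined: stored), using: key)
assert(reopened == phi)
```

The design point is that decryption happens only in the app's own code path, so there is no plaintext sitting in a file or cache for the system to learn from.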

Just to be clear, I never audited this software, nor am I connected to anyone who did, nor am I an expert in this area. It is a pretty safe bet, though, that your health system's IT, Compliance, Legal, etc. did a full assessment before contracting with them.

The tricky part is that the risk you identified is real and likely mitigated by PerfectServe by design. However, someone not asking the questions you are asking might assume some other software that seems to do similar things as PerfectServe would be just as safe, and that would be bad.
 
I was just curious because I doubt 95% of the physicians turned off the "allow AI to learn from this app" and other settings.

Yes, agree, and it is an important point. Unfortunately, my guess is that outside of large health systems, these issues aren't being managed and millions of patients' PHI isn't protected to HIPAA standards. It's up to the providers to ensure their compliance, and the level of understanding and care required is beyond most medium-sized and smaller organizations. However, I wouldn't expect more from Apple on this issue.

In any case, it's good that you thought about this and limited Siri/AI/etc. as you did, even if it's probably not strictly required. I do the same as you for Siri/AI on each app, and I also limit which apps can run in the background, access cellular, etc. I am not protecting PHI on my phone, but I have no interest in helping companies use my phone to compile information about me. Once you see what even an app like GasBuddy can do, you realize these things are a sieve of very personal information, and AI is going to make even more holes.
 
I think you would be responsible for the data once it is on your screen. HIPAA doesn't cover the use of sensitive health information by the owner of the data, especially after it is displayed on the screen. So the owner is on the hook for the safe handling.
 