The answers to why Apple are involved are in the very opening post of this topic. Apple have worked with universities to develop or assist in the development of new technologies before; this isn't any different.

Why is Apple getting involved though? Medical researchers have been perfectly capable of developing their own methods and codes for decades without the direct involvement of the particular multinational company that happens to manufacture a device they use. Of course Apple might be doing this for purely altruistic reasons. Though that does go against the “Apple is a business” argument rolled out every time Apple puts profit first. On the other hand there are numerous less favourable reasons Apple could be doing this, ranging from tax breaks to advertising to surveillance.
"Why is Apple getting involved though?" They're involved so far as they made the API, ResearchKit, to do exactly as you've noted—to develop their own methods utilizing the tools at hand. No bogeyman here. http://researchkit.org/
 
If my iPhone finds out about my kids autism before their pediatrician, I should lose my parenting license. Also find a new pediatrician.
 
This MR title and post really need to be updated. The article on the WSJ has been substantially rewritten (hopefully in part due to my emailing the journalist about an inaccuracy [to which he responded and then updated the article]). The autism research is in any case almost an aside to the main focus.

The only news is that Apple is continuing to collaborate with university and other scientists on health-related work. This is fascinating and greatly needed research.
 
The son of one of my nephews has Asperger's.

His parents wasted precious years bouncing from one false diagnosis to another before finding a doctor who got it right.

Oh, wow. I am sorry to hear that. It took us quite a while, too, to get my son's diagnosis. That's frustrating for sure. I hope he's doing well. My son is in high school now and is doing okay for the most part.
 
I don't know what to say...
What's next? Scanning the way you look at pictures on socials to determine your emotions based on what you're looking at? If I remember correctly, something like that was being tested (not by Apple specifically)
 
Oh, wow. I am sorry to hear that. It took us quite a while, too, to get my son's diagnosis. That's frustrating for sure. I hope he's doing well. My son is in high school now and is doing okay for the most part.
He's fine, thanks. He doesn't like being hugged but gets along fine with all his cousins as far as I can see when the family gets together. He's doing very well at school too. Like many in the spectrum he's gifted in many ways.
 
I don't know what to say...
best to say nothing then...
What's next?
Breakfast for me...
Scanning the way you look at pictures on socials to determine your emotions based on what you're looking at? If I remember correctly, something like that was being tested (not by Apple specifically)

Have you even bothered to read the article before jumping on the bandwagon?

"the Journal says that these features may never become a user-end feature"
 
The son of one of my nephews has Asperger's.

His parents wasted precious years bouncing from one false diagnosis to another before finding a doctor who got it right.

Yes, but any diagnosis made by a mobile phone is going to be referred to physicians for confirmation, and you're back to square one.

EDIT: got caught out by the grammar police. :)
 
I’m not even surprised this turned into a privacy debate full of misinformation, especially given what happened last month. This is not the kind of thing Apple would even make a part of iOS itself, it’ll be an app that you can download or not (assuming it actually comes out, which there is a chance it never will because after all, it is just a research study - some of them go absolutely nowhere). So anyone concerned about privacy shouldn’t be because believe it or not, you don’t have to use it.

I think this will be huge if it ever comes out. So many autistic children grow up undiagnosed, and if this can be used to help them get an actual diagnosis quicker (which is exactly what this was made to do), that will allow them to get the help and support they need much sooner. Undoubtedly a win.
Apple's motives, as with the proposed CSAM scanning, are understandable, but you have to think about what a given technology can be used for. The internet was designed as a military communications network that would degrade gracefully under attack. Now we do our shopping on it, along with a lot of less laudable things (sharing illegal CSAM material, for one). The point is that we should be looking not just at what could be accomplished now, but at how the technology can be applied and misapplied in the future. To me the scary thing is not that new machine learning algorithms will occasionally make errors. I am more frightened that we will reach a stage where they won't, doing exactly what humans told them to do.
 
Yes but any diagnosis made by a mobile phone is going to be referred to physicians for confirmation, and your back to square on.
Yes, but my back has nothing to do with it 🤔

 
Apple Definitely Not Researching Ways To Not Get Kids Hooked On Screens, As Almost Universally Advised by Pediatricians And Child Psychologists

-Babylon Bee MacRumors
 
This is an article I just ran across in Apple News. It has nothing to do with the study Apple is participating in, I know that, so before anyone jumps on me about including this here, just give me a moment and I’ll explain why I’m doing so.

For those who don’t have the time to read the article, it highlights how a technology designed with the best of intentions (getting some measure of control over our country’s massive opioid crisis) is hurting real people who are not abusing opioids.

It’s a sobering tale about how data is collected on ALL of us in the US healthcare system with almost no oversight and fed into small streams that aggregate into larger and larger streams until they have the power to control our lives in ways that didn’t exist just a few years ago.

And nobody thought to put any checks and balances in place, because the system grew into being before anyone stopped to think about the potential negative outcomes. Apparently even HIPAA does not protect our data in this system.

And if they had thought to question it, would people have said they were standing in the way of progress on the war on drugs? That they were obstructionists? That they were paranoid tinfoil hatters? After all, this is America, land of the free, home of the brave. The surveillance state can’t happen here in any meaningful way. We’re not the “communists”. Right?

It is a cautionary tale of what happens when we trust technology too much to do everything for us and take humans out of very human concerns. When we substitute the cold scrutiny of AI for human discernment.

I include it here because people have been accused of being luddites and paranoid for their hesitancy or outright refusal to embrace any form of what they perceive as “surveillance technology,” even when it is deployed with the best of intentions and for good causes.

I know in some respects I’m comparing Apples and Oranges. But I think it’s so important to have the discussions. For people to raise their fears, even if they’re coming from a place of ignorance. Let more knowledgeable people discuss and enlighten. But never shut down inquiry with derogatory labels.

I’ve had my own struggles being locked out of medical care due to notations filed in my medical history because I tried to bring a complaint against a doctor for sexual harassment. As bad as that experience was, I can’t imagine going up against a system where it’s my word against an algorithm. It’s an impartial machine, so it must be right. The end.
 