I was simply sharing my personal opinion after having spent a few years doing grad level research in the field. My research wasn't funded by Apple money so I could be wrong about what the ultimate outcome will be.
The research connecting mental health to how someone uses their device is being encouraged by Biogen, which released the controversial Alzheimer's drug Aduhelm, the one whose approval made 3 FDA committee members resign.
I don't like the jump Apple wants to make from signals that are meaningful in a clinical setting, when reviewed by a human to reach a diagnosis, to an automated system that makes a guess based on a black-box model and is deployed to the general public. It's the same reason a doctor wouldn't encourage every single person to go out tomorrow, get a full-body MRI scan, and then follow it up with every blood test available just to see if something is wrong.
I hope I am wrong and their research produces something meaningful. If it's actually robust, with no false positives, and opt-in, it sounds like a great idea. But that's not how reality works when you're testing for something as subjective as depression or autism. There's no clear relationship between depression and the kinds of signals they're using, unlike the relationship between heart rate and AFib.
I would be the first one to join any sort of study that will use blood tests to find markers of depression, to quantify it. But the reality is that it's not currently quantifiable. Using proxies like screen time and how focused you are on the screen isn't good enough. And going as far as:
"data that may be used includes analysis of participants’ facial expressions, how they speak, the pace and frequency of their walks, sleep patterns, and heart and respiration rates. They may also measure the speed of their typing, frequency of their typos and content of what they type, among other data points"
is literally throwing everything and the kitchen sink at the wall to see what sticks.
That's not science, that's p-hacking. With that many signals, some might look promising, but it's like a study where you check whether vitamin D affects the outcomes of 40 different lifetime illnesses and then report only the ones that appear to work.
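To make that multiple-comparisons point concrete, here's a toy simulation of my own (the numbers and the "vitamin D" framing are made up for illustration, not taken from any real study): correlate one meaningless random signal against 40 unrelated random outcomes and count how many clear p < 0.05 by chance alone.

    import numpy as np
    from scipy import stats

    # One random "signal" (the vitamin D stand-in) with no real effect on anything.
    rng = np.random.default_rng(0)
    n_participants = 200
    signal = rng.normal(size=n_participants)

    # Correlate it against 40 independent, random "illness outcomes".
    spurious_hits = 0
    for _ in range(40):
        outcome = rng.normal(size=n_participants)  # unrelated to the signal by construction
        r, p = stats.pearsonr(signal, outcome)
        if p < 0.05:
            spurious_hits += 1

    print(f"{spurious_hits} of 40 unrelated outcomes look 'significant' at p < 0.05")
    # At a 5% false-positive rate you expect ~2 hits from pure noise;
    # report only those and noise starts to look like a finding.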
For some people, getting an alert that their HR has spiked might help, but I don't see how the person wouldn't have noticed it themselves already.
I personally don't believe an automated phone/watch analysis is going to give me a better understanding of myself using some black box model. If I feel depressed, the watch isn't going to call a doctor for me, and even if it did, I don't know how it could give me motivation to physically seek help in that state.
I have a smartwatch that does 24/7 per-minute heart rate monitoring, and it's really interesting to see the data. I love using my Polar H10 chest strap when I work out because it gives me a live EKG graph.
I am all for using data to help inform, but this doesn't seem like a scientific way to go about it.
Especially since they've jumped to saying they want to detect Autism, which by definition is broad and very specific to each individual, and Depression and Anxiety, which are insanely environment-specific.
I have wished many times that mental health issues could be quantified in some robust way, because that would make treatment easier and remove a lot of the stigma surrounding diagnosis and treatment. That's why I spent energy on research there myself. I don't see this move by Apple doing that. And I understand that's not Apple's goal, but anything less is going to have too many confounding variables, which will create a lot of noisy data. Apple wants to go for the more basic signals, but that's not what you use your phone for. Maybe it'll help people who don't have a strong social support system and live alone.
The other main issue is that they want to create a generalized model that doesn't need human analysis to make these predictions. Depression causes a cluster of symptoms, but not everyone has the same ones, and it doesn't hold everyone back in the exact same way. Anxiety, Autism, and other mental health conditions are the same in that regard.
The research that might actually produce a meaningful positive outcome is the testing for Alzheimer's with cognitive tests on devices, but the 2019 feasibility study's claim that "31 adults with cognitive impairment exhibited different behavior on their Apple devices than healthy older adults" leaves out the total cohort size and any details you could use to evaluate the validity of the research. They also go on to mention that they're testing against traditional cognitive tests and brain scans that show plaque buildup. The plaque theory has been shown to have a lot less evidence supporting it than originally thought [0]. This research is in collaboration with Biogen, the company whose Alzheimer's drug was approved earlier this year, the same drug the 3 FDA committee members quit over.
I have reservations about privacy and the invasive sensors this article seems to describe, but that doesn't change the fact that the underlying approach of finding significant signals has a lot of issues.
I hope it's clearer now where I'm coming from. This is an issue I am invested in and am passionate about. I would be just as happy as you if the research is positive and they make something meaningful. I just wanted to point out the red flags I see.
I can't edit anymore, but I thought I should mention some of the sensors and data we explored for various projects. I have experience trying to find usable signal in data from an EEG cap. For a different project I had to use neuron-firing data recorded during a surgery. I have also used an accelerometer glove, an eye-tracking harness, pedometers, and a Myo armband.
And my favorite is heart rate data. With an ANT+ sensor and a Polar H10 strap you get R-R intervals and HRV if you want, and can even keep a live EKG graph going. As a side project for the MedHacks hackathon, I wanted to see if, with enough data, you could find a correlation between the tempo of music and your heart rate. The idea was that the model would first learn which songs paired with which heart rates. Then, if your heart rate spiked, which I naively and intentionally assumed was related to anxiety, a song would be recommended to help lower it. It was just a fun idea for a weekend hackathon, and I couldn't get it to work as well as I wanted to. A rough sketch of what the core loop boiled down to is below.
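This is a simplified, from-memory sketch rather than the actual hackathon code; the song library, the tempos, and the 20% spike threshold are made-up placeholders, and the "HR spike means anxiety" assumption is exactly the naive part I mentioned.

    import numpy as np

    # Hypothetical song library: name -> tempo in beats per minute.
    SONG_LIBRARY = {
        "weightless": 60,
        "clair_de_lune": 66,
        "lofi_study": 75,
        "uptempo_pop": 128,
    }

    def hr_from_rr(rr_ms):
        """Instantaneous heart rate (bpm) from R-R intervals in milliseconds."""
        return 60_000.0 / np.asarray(rr_ms, dtype=float)

    def rmssd(rr_ms):
        """RMSSD, a common time-domain HRV measure, from R-R intervals."""
        diffs = np.diff(np.asarray(rr_ms, dtype=float))
        return float(np.sqrt(np.mean(diffs ** 2)))

    def recommend_song(recent_rr_ms, baseline_hr, spike_factor=1.2):
        """If current HR is more than 20% above baseline, suggest the slowest-tempo song."""
        current_hr = float(np.mean(hr_from_rr(recent_rr_ms)))
        if current_hr > spike_factor * baseline_hr:
            return min(SONG_LIBRARY, key=SONG_LIBRARY.get)
        return None

    # Synthetic chest-strap data: resting ~60 bpm, then a spike to ~100 bpm.
    resting_rr = [1000, 990, 1010, 1005]   # ms
    spiking_rr = [600, 590, 610, 605]      # ms
    baseline = float(np.mean(hr_from_rr(resting_rr)))
    print("baseline HRV (RMSSD):", round(rmssd(resting_rr), 1), "ms")
    print("recommendation:", recommend_song(spiking_rr, baseline))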
Sensors and data analysis are the future of all medicine. But for the data analysis to be useful, the methodology has to be rigorous, with very little room for a subjective outcome.
If someone ends up taking that idea, or would like to talk more, I would love to hear about it, my email is knaik1994 at gmail dot com.
0. https://www.nature.com/articles/d41586-018-05719-4