Scientists are trying to use machine learning to help detect when a person may be at a high risk of suicide. According to a New York Times article, the process would work by “a Fitbit programmed to track sleep and physical activity. On [the person’s] smartphone, an app collects data about [their] moods, movement, and social interactions.” The data would then be sent to a team of researchers, who would assess whether the patient is at risk.
Matthew K. Nock, a Harvard psychologist, described how this process could work: “The sensor reports that a patient’s sleep is disturbed, she reports a low mood on questionnaires and GPS shows she is not leaving the house. But an accelerometer on her phone shows that she is moving around a lot, suggesting agitation. The algorithm flags the patient. A ping sounds on a dashboard. And, at just the right time, a clinician reaches out with a phone call or a message.”
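To make the combined-evidence idea in that quote concrete, here is a minimal sketch of what such flagging logic might look like. Every field name and threshold below is invented for illustration; the actual research systems are far more sophisticated and are not described in this detail in the article.

```python
# Hypothetical sketch of rule-based flagging logic, loosely mirroring the
# scenario Dr. Nock describes. Field names and thresholds are assumptions
# made up for this example, not taken from any real monitoring system.

def flag_patient(sensor_data: dict) -> bool:
    """Return True when several risk signals co-occur."""
    disturbed_sleep = sensor_data["sleep_hours"] < 5           # wearable: poor sleep
    low_mood = sensor_data["mood_score"] <= 3                  # questionnaire: low mood
    homebound = sensor_data["distance_from_home_km"] < 0.1     # GPS: not leaving the house
    agitated = sensor_data["accelerometer_activity"] > 0.8     # phone: restless movement

    # Flag only when multiple signals line up, so that no single noisy
    # reading triggers an alert on its own.
    signals = [disturbed_sleep, low_mood, homebound, agitated]
    return sum(signals) >= 3

# A day matching the scenario in the quote: bad sleep, low mood,
# staying home, and high phone-measured agitation.
example_day = {
    "sleep_hours": 4.0,
    "mood_score": 2,
    "distance_from_home_km": 0.0,
    "accelerometer_activity": 0.9,
}
print(flag_patient(example_day))  # prints True
```

Even this toy version shows where the trouble starts: the thresholds are arbitrary, and moving any of them changes who gets flagged and who gets missed.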
There are a lot of issues that may come with this. Algorithms are never 100% accurate, so there could be false positives or, more dangerously, false negatives. False positives could lead to patients being pinged when they don’t need help, while false negatives could leave at-risk patients without the help they need.
When Dr. Nock was asked about these potential hazards, he said, “With all due respect to people who’ve been doing this work for decades, for a century, we haven’t learned a great deal about how to identify people at risk and how to intervene.” He added, “The suicide rate now is the same as it was literally 100 years ago. So just if we’re being honest, we’re not getting better.”
Others, like Dr. Karen L. Swartz, a professor of psychiatry at Johns Hopkins University, are worried about a “gray zone” that may also come into play. She had a patient who had admitted to suicidal thoughts but refused to be hospitalized. Dr. Swartz worried that she might be fired if she forced the patient into a hospital, so she decided to send her home and monitor her instead.
When reflecting on this experience, she said, “It was one of those things where I just genuinely hoped I was right. With experience, it only becomes clearer that suicidal thoughts can come and go without warning.” Swartz also stated, “We are asked to predict something that is highly unpredictable.”
Although machine learning is a growing trend, people may have to consider the risks that come with it, especially when applying it to something as sensitive as suicide.