Monday, April 01, 2019

App-based suicide prevention

From STAT:
Digital health apps, which let patients chat with doctors or health coaches or even receive likely medical diagnoses from a bot, are transforming modern health care. They are also — in practice — being used as suicide crisis hotlines.

Patients are confessing suicidal thoughts using apps designed to help them manage their diabetes or figure out why they might have a headache, according to industry executives. As a result, many digital health startups are scrambling to figure out how best to respond and when to call the police — questions that even suicide prevention experts don’t have good answers to.

“To be honest, when we started this, I didn’t think it was as big an issue as it obviously is,” said Daniel Nathrath, CEO of Ada Health.

The European company built a chatbot to provide smartphone users with possible explanations for their medical complaints. Since the app launched in late 2016, people around the world have used it to complete more than 10 million health assessments. In about 130,000 of those cases, users have told Ada that they’re struggling with suicidal thoughts or behaviors, the company said.
That's a lot of confessed suicidal thoughts! Why do people feel so free to tell an app this?
The phenomenon is, in some respects, no surprise: There’s a large body of research showing that people are more willing to confess potentially taboo thoughts to a computer than to a fellow human a few feet away.
But as the article goes on to explain, there is no good research on how best to intervene when a patient tells an app that they are feeling suicidal right now.

Perhaps a premium app service could one day send in a drone with a nice cup of tea and a biscuit for starters. Then one of those faked-up video faces (of a psychiatrist in a white coat?), so convincing it's hard to tell whether it's real, could offer some kind words?
