Apps don’t provide reliable help for suicide prevention


A handful of depression management and suicide prevention apps — downloaded millions of times — included incorrect or nonfunctional contact information for suicide crisis help lines, according to a new analysis. While apps can offer people with suicidal thoughts or behaviors an important lifeline, experts are worried that many of the apps available on the Apple App Store or Google Play may not be following best practices, or connecting people with appropriate resources.

Depression management and suicide prevention apps can fill an important role: many people feel more comfortable looking for information or seeking help online, and report that it’s easier to ask questions and share problems online rather than speaking with a person face to face. But that makes it even more important that the digital tools people turn to are up to the highest standards for prevention.

“Not only could there be nothing useful about them, in fact, they could be harmful,” says Igor Galynker, director of the Mount Sinai Beth Israel Suicide Research and Prevention Laboratory in New York. Other experts were also worried about people relying on unregulated apps to deal with suicide prevention. “Incorrect or even harmful information could result in a very problematic outcome,” Nadine Kaslow, chief psychologist at Grady Health System, wrote in an email to The Verge.

In the new analysis, published in the journal BMC Medicine, the authors evaluated 69 apps that targeted people with depression, assessed suicide risk, or provided suicide prevention advice. They didn’t include applications aimed at health care providers, or any that connected users directly to physicians or counselors.

The assessments looked at how many of six broad, evidence-based suicide prevention strategies each app employed. The authors specifically drew on strategies developed by expert groups in the United States and United Kingdom and by the World Health Organization. Generally, they checked whether each app tracked suicidal thoughts, let users with suicidal thoughts build a safety plan, or recommended activities to deter those thoughts. They also checked whether the apps provided educational materials about risk factors for suicide, could store the contact information of a user’s support network, and provided a way to access emergency counseling.

Only 7 percent of the apps included all six strategies; among them was Stay Alive, developed by the United Kingdom-based Grassroots Suicide Prevention group. Most apps included only one, two, or three of the strategies. The depression management app 7 Cups, for example, assesses mood and connects users to a crisis help line, but doesn’t provide the other four recommended tools. At the time of the analysis, 46 of the apps included access to a crisis help line within the app, but six of those had phone numbers that did not work. According to the paper, the authors informed the app developers of the problem, and two of the six have since corrected the errors.

The findings weren’t surprising, because apps for other health conditions often fail to follow evidence-based guidelines, study authors Josip Car, director of the Global eHealth Unit at Imperial College London, and Laura Martinengo, a doctoral student at Nanyang Technological University in Singapore, wrote in an email to The Verge. However, they said that it’s particularly disappointing to see in this area of health. “Suicide is [literally] a life-and-death question and one would hope that those who wanted to help would follow best practice standards,” they said.

Their analysis showed that there’s little oversight of the specific information contained within these apps. “Our findings show information may not be corroborated and clearly demonstrate the lack of self-regulation and self-monitoring of the industry,” they wrote in the paper.

Developer guidelines on Apple’s App Store say that apps that could provide inaccurate health information may be reviewed more closely by Apple, and Google Play says that apps with misleading medical content violate its policies. The Verge has twice emailed Google and Apple for comment, and will update this story with their responses.

Connecting users to a suicide hotline that doesn’t work is clearly terrible, but actually figuring out how effective the other apps are remains tricky. Health providers currently have very few ways to study the potential positive or negative consequences these apps may have. Compounding the problem, researchers also aren’t entirely sure whether the strategies developed in clinical settings for preventing suicide stay effective when they’re built into digital platforms.

“Most apps do not have much scientific evidence to support their efficacy,” Kaslow said. At this point, apps may be able to supplement care from a doctor or professional, but they shouldn’t be used as the sole treatment or management tool for depression or suicidal ideation, she said.

It’d be difficult for people searching the app store to figure out on their own which apps have evidence-based functions, Car and Martinengo said — so users should be cautious, and ideally talk with their doctor before starting to use one. But the burden of vetting apps shouldn’t be on vulnerable people struggling with their mental health, and not everyone is willing or able to go to a doctor. “We thus urge both developers of such, we believe, vitally important apps and Apple and Google app stores to urgently raise the quality and safety of these apps and follow best practice standards,” they said.

https://www.theverge.com/2019/12/30/21035816/suicide-prevention-apps-reliable-help-crisis-hotline-support