
Mental health apps are reading your texts—some of them are selling your data, raising privacy concerns

October 10, 2019

An app for monitoring people with bipolar disorder and schizophrenia is so precise it can track when a patient steps outside for a cigarette break or starts a romantic relationship — and where that new partner lives. Another app, meant to screen for suicidality, analyzes not only text message metadata, but also the content of the conversations.

Many of these apps have a common problem, say experts on health technology: They put patients’ privacy at risk while providing marginal or unknown benefits. And in some cases, without customers’ knowledge, makers of mental health apps or services are using the data collected to create products that have nothing to do with health care.


An eye-opening study published in JAMA Network Open in April revealed that 81% of the 36 top-rated mental health apps sent data to Google and Facebook for analytics or advertising purposes. Only 59% of those apps disclosed this in their privacy policy; three explicitly stated that they would not share information with a third party but did so anyway, and nine others had no privacy policy at all.

Read full, original post: Mental health apps are scooping up your most sensitive data. Will you benefit?

The GLP aggregated and excerpted this article to reflect the diversity of news, opinion, and analysis. Click the link above to read the full, original article.
