Thursday, August 22, 2013

Social Media Project Monitors Keywords to Prevent Suicide

A Boston-based project is using predictive analysis technology to comb through social media posts in hopes of preventing suicide.

The opt-in Durkheim Project combines search technology with predictive analysis to estimate the suicide risk of an individual based on what he or she is posting on Facebook, Twitter or LinkedIn. Already in collaboration with Facebook, the Durkheim Project is currently gathering social media data from participating active duty military members and veterans.

The technology will analyze participants' engagement on social networks: daily status updates on Facebook, chatter on Twitter, and structure changes and keywords on LinkedIn. Participants with Android phones can even opt in their text messages for analysis. The system will gather that data and look for individual keywords that may indicate a high risk for suicide, a psychiatric illness, or that someone is fine. But since people are not always explicit about their intentions, the system will also look at other factors that might point to a person's level of agitation or use of painkillers.
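To make the general idea concrete, here is a minimal sketch in Python of how keyword-based risk scoring over a participant's posts might work. The keyword list, weights, and threshold are invented for illustration; they are assumptions, not the Durkheim Project's actual model, which also weighs broader signals like social isolation and chronic pain.

```python
# Illustrative sketch only: hypothetical keywords, weights, and threshold.
RISK_KEYWORDS = {
    "hopeless": 3.0,
    "can't go on": 3.0,
    "painkillers": 2.0,
    "alone": 1.5,
    "insomnia": 1.0,
}

def score_post(text: str) -> float:
    """Sum the weights of any risk-related keywords found in a post."""
    lowered = text.lower()
    return sum(weight for phrase, weight in RISK_KEYWORDS.items() if phrase in lowered)

def classify(posts: list[str], threshold: float = 3.0) -> str:
    """Aggregate keyword scores across a participant's recent posts."""
    total = sum(score_post(p) for p in posts)
    if total >= threshold:
        return "elevated"   # in a real system, this would be reviewed by a clinician
    return "no flag"

if __name__ == "__main__":
    sample = ["Another night alone, out of painkillers again."]
    print(classify(sample))   # -> "elevated"
```

In practice a simple keyword sum like this would be only a starting point; the project's stated approach relies on predictive analysis trained against known outcomes rather than a fixed word list.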

"It turns out that what is predictive is much more related to social isolation, chronic pain and things like that — which are actually strongly associated in the literature with suicidal risk," said Chris Poulin, the project's director and principal investigator.

The goal of the project is to detect suicide risk in real time and eventually send an alert for intervention.

But what happens when someone innocently posts hyperbole that could be misinterpreted as serious risk? ("I'm going to kill myself if this guy doesn't stop talking.") Poulin told Mashable the system has a protective factor, so phrases that turn out not to be correlated with suicide risk are pushed down.
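A rough sketch of how such a protective factor might operate: if a phrase keeps flagging people who turn out to be fine, its weight is reduced so common hyperbole stops driving alerts. The phrase, starting weight, and update rule below are illustrative assumptions, not the project's published method.

```python
# Hypothetical "protective factor": down-weight phrases that mostly flag benign posts.
phrase_weights = {"kill myself": 3.0}

def apply_protective_factor(phrase: str, false_positive_rate: float) -> None:
    """Reduce a phrase's weight in proportion to how often it flags people who are fine."""
    phrase_weights[phrase] *= (1.0 - false_positive_rate)

# If, say, 90% of posts containing "kill myself" were benign hyperbole,
# the phrase's influence on the overall risk score shrinks accordingly.
apply_protective_factor("kill myself", false_positive_rate=0.9)
print(phrase_weights["kill myself"])   # 3.0 -> roughly 0.3
```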

The graphic below shows a data sample from the Durkheim Project: red terms are associated with suicide risk, yellow terms with psychiatric but non-suicidal risks, and green terms with low risk.

[Graphic: Durkheim Project data sample]

Potential for Intervention

The Durkheim Project is beginning with active-duty military and veteran participants, a group at elevated risk of suicide. In fact, the project is funded by the U.S. Defense Advanced Research Projects Agency (DARPA). But suicide is also prevalent across the general population; it remains one of the leading causes of death in the United States.

The current goal of the project is to get 100,000 individuals to opt in during this data collection period. Over time, the system is expected to become more accurate.

"Do I think that we can predict suicide risk right now with high precision? No. But I do think that if someone in our system was suicidal today, I think we would have a chance of picking them up," Poulin said.

Currently, in the data collection phase, the process consists of passive observation. The project has to wait for clinical approval in order to actually intervene and signal that someone might need help — adding the "human in the loop," Poulin said.

In the project's next phase, he envisions automated intervention: "If they have a clinician, then the clinician will get the risk alerts for that individual." For those who don't have a clinician, the system would, in theory, alert a trusted family member or designated buddy.

"The buddy standing next to them is also the one that's looking out for them digitally," said Poulin, in a nod to the project's military participants.

As a third option, if a person is alone, there may be a safety plan to remind him or her of factors to consider.
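Taken together, the envisioned intervention amounts to a simple routing rule: alert the clinician if there is one, otherwise a designated buddy, otherwise fall back to the participant's own safety plan. The sketch below is a hypothetical illustration of that flow; the contact fields and function names are assumptions, and the actual system has not yet been built.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Participant:
    name: str
    clinician: Optional[str] = None    # treating clinician, if any
    buddy: Optional[str] = None        # designated family member or buddy
    safety_plan: Optional[str] = None  # self-help reminders if the person is alone

def route_risk_alert(p: Participant) -> str:
    """Route a risk alert to the most appropriate contact, per the flow described above."""
    if p.clinician:
        return f"Alert sent to clinician {p.clinician}"
    if p.buddy:
        return f"Alert sent to trusted buddy {p.buddy}"
    return f"Safety plan shown to {p.name}: {p.safety_plan or 'reminders of factors to consider'}"

print(route_risk_alert(Participant(name="A. Veteran", buddy="Sgt. Example")))
```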

In addition to providing real-time analysis of who might be at risk, Poulin said he hopes the project can better inform the clinical community about the risks and language of suicide.

Since the information gathered from participants is highly sensitive, the project is ensuring HIPAA compliance and has partnered with Dartmouth College's Geisel School of Medicine for data storage. Poulin stresses that the project lays out a consent form in clear language for participants, in order to emphasize privacy and ensure people know what they're opting into.

"Hopefully the younger generation embraces things like this and realizes that as long as your privacy control is emphatic and you know what you're getting into, that it's not unsafe," he said. "It's unsafe when you don't know who's looking at your data."

Editor's note: If you are having thoughts of harming yourself, please call 911, the Veterans Crisis Line at 1-800-273-8255 or the National Suicide Prevention Lifeline, also at 1-800-273-8255.

Lead image: D Sharon Pruitt. Screenshot/graphic: Durkheim Project

[h/t The Boston Globe]
