
NEWS | 2 July 2013

Social media analysed to explore US veterans’ suicide risk

Data analytics | North America

US — A research project has been launched to detect suicide risks among US military veterans.

Led by predictive analytics firm Patterns and Predictions and the Veterans Education and Research Association of Northern New England – with assistance from Facebook – the project is designed to help mental health professionals detect and monitor communications and behavioural patterns that could predict suicide risk.

The Durkheim Project collects social media and mobile phone data from participants who voluntarily opt in, automatically uploading relevant content to an integrated database.

The data is continuously updated and analysed by AI systems; Durkheim’s predictive analytics applications then provide real-time monitoring of text content and behavioural patterns statistically correlated with tendencies towards harmful behaviours such as suicide.
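Purely as an illustration: the article does not describe the Durkheim Project's actual model, so the toy sketch below is a hypothetical "keyword correlation" flagger, not the project's method. It scores a post by summing log-ratio weights of terms over-represented in a made-up "at-risk" training set versus a control set.

```python
# Hypothetical sketch only: the Durkheim Project's real methods are not
# described in the article. This toy flagger scores posts by terms that
# are statistically over-represented in an (invented) at-risk corpus.
import math
from collections import Counter

def term_weights(risk_posts, control_posts):
    """Log-ratio weight per term: positive = more common in risk posts."""
    risk = Counter(w for p in risk_posts for w in p.lower().split())
    ctrl = Counter(w for p in control_posts for w in p.lower().split())
    n_risk = sum(risk.values()) or 1
    n_ctrl = sum(ctrl.values()) or 1
    # Add-one smoothing so terms unseen in one corpus don't blow up the ratio.
    return {w: math.log(((risk[w] + 1) / n_risk) / ((ctrl[w] + 1) / n_ctrl))
            for w in set(risk) | set(ctrl)}

def score(post, weights):
    """Sum the weights of a post's terms; higher means more risk-like."""
    return sum(weights.get(w, 0.0) for w in post.lower().split())

# Tiny made-up corpora, for illustration only.
weights = term_weights(
    risk_posts=["feeling hopeless again", "no way out"],
    control_posts=["great game last night", "barbecue this weekend"],
)
print(score("hopeless and no way forward", weights) >
      score("game this weekend", weights))  # → True
```

A real system would of course need far richer features, clinical validation, and calibrated thresholds; the point here is only the shape of "text statistically correlated with risk".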

Director of the Durkheim Project Chris Poulin said: “The study we’ve begun with our research partners will build a rich knowledge base that eventually could enable timely interventions by mental health professionals.”


4 Comments

6 years ago

This type of work is very important yet EXTREMELY risky. What happens when you find vulnerable people during the test phase? You can't just put them aside as test cases. When do you intervene? Who intervenes? Have you got permission to intervene? Must you report the cases to health care professionals? Have you got enough doctors on staff to handle all the potential risk cases that are discovered? What are your legal and ethical obligations? What a tough, tough situation.


6 years ago

Hi Annie, your comment was sent to my attention, and you raise a number of strong points about our new program. Let me try to address each.

“Yet EXTREMELY risky.” Which, frankly, is why it hasn’t been done before. Many medical and legal complexities.

“What happens when you find vulnerable people during the test phase?” The clinical protocols are clear (‘non-interventional’): we aren’t allowed to intervene at this time. We will obviously flag these people and, if authorized to do so, will intervene.

“You can’t just put them aside as test cases.” Correct.

“When do you intervene? Who intervenes?” We are building an automated intervention system right now.

“Have you got permission to intervene?” Not at this time. Ours is the first time suicide risk has been reliably identified; as such, no large-scale intervention protocols exist in this area.

“Must you report the cases to health care professionals?” Our protocol protects the privacy of the participant, so we can only suggest ‘self-help’ numbers (such as suicide hotlines). Not optimal, but those are the rules.

“Have you got enough doctors on staff to handle all the potential risk cases that are discovered?” No. In my opinion, the nation has a shortage of psychiatric professionals to deal with this level of crisis. Without an automated system, we do not feel the problem is solvable; hence our current research focus.

“What are your legal and ethical obligations?” We may publish our medical protocol at some point for a detailed answer, but in short, the same as for any research trial. Let me give you a scenario: if a new cancer drug comes out and the results are extremely promising, what is the ethical obligation then? Ultimately there is going to be a clinical review, regardless of how promising the treatment might be. (And for what it’s worth, I share your obvious frustration with the length of these clinical processes.)

Please keep in mind that we care, and are trying our best. Finally, thank you for your concerned feedback; it was helpful to the effort.

Best regards,
Chris Poulin, Principal Investigator
www.durkheimproject.org


6 years ago

Thanks for the lovely response. It still irks me in numerous ways, but both you and I understand the inherent tensions between what we WANT to do, what we are ALLOWED to do, and what it is ETHICAL to do.


6 years ago

Congratulations to them for undertaking something so ambitious. The predictive power of social media has generally been overrated... but that doesn't mean more efforts shouldn't be made. For anyone reading, there's lots more detail at http://www.durkheimproject.org/news/ and it's a shame this article isn't a little more detailed. Two questions: first, I'd be interested to know how many opt-ins they have on the database; second, how do they measure the "success" of the model? I know little about suicide/mental health rates in veterans vs the general public, but assuming they are higher, it would be interesting to know whether there is a control group for comparison. Annie raises very important points, similar to the challenges posed by adverse event reporting in healthcare social media research. I'm impressed with the answers from Chris Poulin.
