Artificial Intelligence and Suicide Prevention

ChatGPT and other large language models, the software popularly known as artificial intelligence, are in the news every day, demonstrating increasing capabilities to analyze, summarize and spot patterns in large sets of data. The latest version of ChatGPT recently passed a medical licensing exam, leading Dr. Isaac Kohane, a Harvard computer scientist who is also a physician, to say that the software was “better than many doctors I've observed” at diagnosis.


The field of suicide prevention is also benefitting from the rapidly advancing abilities of AI. 

AI identifies life events reported in the clinical notes of Veterans who died by suicide

We know that Veterans are a high-risk group for suicide, but knowing exactly which factors might lead someone to attempt to take their own life is a key research question for suicide prevention. Researchers have long analyzed structured checklist items such as diagnoses and other medical codes. Even more useful is an AI engine that can review the “unstructured data” of written clinical notes to identify issues in a person’s life described to a physician or mental health clinician, such as social isolation, personal or family problems, job or financial challenges, housing or food instability, domestic violence, or even difficulties with transportation or access to care. These factors are also known as social determinants of health (SDOH).
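
For readers curious what screening notes for these factors can look like in practice, here is a minimal, hypothetical sketch in Python. It uses simple keyword matching rather than the sophisticated natural language processing system the researchers actually built, and the categories, phrases, and example note are illustrative assumptions only.

```python
import re

# Illustrative SDOH categories and trigger phrases; these are NOT the study's lexicon.
SDOH_PATTERNS = {
    "social_isolation": [r"lives alone", r"no close friends", r"socially isolated"],
    "housing_instability": [r"homeless", r"evicted", r"couch surfing"],
    "financial_strain": [r"lost (his|her|their) job", r"unable to pay", r"behind on rent"],
    "legal_problems": [r"arrested", r"pending charges", r"probation"],
    "violence": [r"domestic violence", r"assaulted", r"abusive relationship"],
}

def flag_sdoh(note_text):
    """Return the set of SDOH categories mentioned in a clinical note."""
    text = note_text.lower()
    return {
        category
        for category, patterns in SDOH_PATTERNS.items()
        if any(re.search(p, text) for p in patterns)
    }

# Hypothetical example note (not real patient data).
note = "Veteran reports he lost his job, was recently evicted, and has felt socially isolated."
print(flag_sdoh(note))  # flags financial_strain, housing_instability, social_isolation
```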


An article in the medical news site Medscape, “A New Way to Gauge Suicide Risk?”, recently reported on a study published in the Journal of the American Medical Association. The researchers compared the records of Veterans who died by suicide with those of a control group, drawing on both structured data and clinical notes, and asked which SDOH were most prevalent among those who took their own lives. The study included 8,821 Veterans who died by suicide and 35,284 Veterans with similar demographic characteristics who did not.


Three determinants stood out as significant for Veterans who died by suicide: legal problems, violence, and social isolation. The senior investigator on the study, Hong Yu of the University of Massachusetts Lowell, said that the study was the first to implement a natural language processing system that can analyze more of the information captured during health care encounters and thus can provide “a better system for risk assessment and suicide prevention.”



Data guides more intensive treatment to improve Veterans’ behavioral health

The Veterans Health Administration has been using a suicide risk prediction algorithm called REACH VET (Recovery Engagement and Coordination for Health – Veterans Enhanced Treatment) for several years, and it recently reported positive outcomes for the Veterans the algorithm identifies as being at highest risk.


This algorithm, based on structured data only, has been in use since 2017 and has resulted in improved outcomes for individuals in the top 0.1% of suicide risk. Those outcomes included process measures such as improved attendance at medical appointments, more mental health visits, and development of individual safety plans, as well as clinical outcomes such as fewer mental health admissions, emergency department visits, non-fatal suicide attempts and suicides.
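
To illustrate the mechanics of flagging a “top 0.1%” group, a simple version might look like the sketch below. This is not REACH VET’s actual model, variables, or data; the features and outcomes are synthetic stand-ins, and the model is a deliberately simple one.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Synthetic stand-ins for structured features (e.g., prior attempts, recent
# inpatient stays, diagnosis flags) and outcomes in a historical cohort.
rng = np.random.default_rng(0)
X_history = rng.normal(size=(5000, 3))
y_history = (X_history @ np.array([1.5, 0.8, 0.3]) + rng.normal(size=5000) > 2.5).astype(int)

# Fit a simple risk model on the historical data.
model = LogisticRegression().fit(X_history, y_history)

# Score the current population and flag the top 0.1% of predicted risk for outreach.
X_current = rng.normal(size=(20000, 3))
risk = model.predict_proba(X_current)[:, 1]
cutoff = np.quantile(risk, 0.999)
flagged = np.where(risk >= cutoff)[0]
print(f"{flagged.size} patients flagged for clinical review")
```

In the real program, a flagged list like this is what coordinators act on through the dashboard and clinician outreach described in the VA report quoted below.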


“Using an interactive dashboard, program coordinators receive information and communicate with clinicians who re-evaluate treatment strategies and conduct outreach to collaboratively initiate care enhancements. Reevaluation includes assessment of service needs, and clinicians ensure that patients can access services,” according to a VA report. “Treatment enhancements include suicide prevention safety planning, enhanced communication, increased monitoring of stressful events, and interventions to support coping.”


Analyzing gun transactions to identify people at risk of suicide or self-injury

Guns are the most lethal means of suicide: more than 9 in 10 suicide attempts with a firearm are fatal, according to a 2019 study.


New research published in JAMA Network Open indicates that AI analysis of handgun transaction data can accurately forecast firearm suicide risk, resulting in insights that could inform targeted interventions for suicide prevention.


The authors of the study hypothesized that firearm purchasing records might offer a large-scale and objective data source for developing tools to predict firearm suicide risk.


To test their hypothesis, the researchers used California’s Dealer’s Record of Sale (DROS) database, which contained 4.9 million handgun transaction records from 1.9 million individuals between Jan. 1, 1996, and Oct. 6, 2015, and linked those records to vital statistics data on cause of death within one year of purchase.


Transactions involving people who died by firearm suicide differed in notable ways: the purchasers tended to be older, were more likely to be female and white, more often bought revolvers, and had fewer or no prior firearm purchases.
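
A rough sketch of that kind of analysis appears below. The fields echo factors the study highlights (age, sex, revolver purchase, prior purchases), but the data are synthetic and the choice of model is an illustrative assumption, not the authors’ actual method.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Synthetic transaction records: [purchaser_age, is_female, is_revolver, prior_purchases].
# A real analysis would derive such fields from DROS records linked to death records.
rng = np.random.default_rng(1)
n = 10000
X = np.column_stack([
    rng.integers(21, 90, n),   # purchaser age at transaction
    rng.integers(0, 2, n),     # 1 if the purchaser is female
    rng.integers(0, 2, n),     # 1 if the handgun is a revolver
    rng.poisson(1.0, n),       # number of prior purchases on record
])
y = rng.random(n) < 0.002      # synthetic outcome: firearm suicide within one year (rare)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=1)
clf = RandomForestClassifier(n_estimators=200, class_weight="balanced", random_state=1)
clf.fit(X_train, y_train)

# Rank held-out transactions by predicted risk, as a screening tool might.
risk_scores = clf.predict_proba(X_test)[:, 1]
print("Five highest-risk transactions:", np.argsort(risk_scores)[::-1][:5])
```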


 The authors concluded: “This study contributes to the growing evidence that computational methods can aid in the identification of high-risk groups and the development of targeted interventions.”


For more information on suicide prevention and programs to help Veterans, visit Suicide Prevention Resources — NH Coalition for Suicide Prevention (zerosuicidesnh.org).