Using AI to decipher data from the deceased to prevent soldier suicide

  • Date

    September 20, 2022

  • Read time

    5 min


Artificial Intelligence (AI) is already used in many ways in healthcare, such as to improve diagnosis accuracy and to accelerate drug development. But what if AI could be used to prevent suicide? The Cledar Radar takes a closer look.

We at Cledar are intrigued by the work being done by Stop Soldier Suicide, Inc., a veteran-backed non-profit based in North Carolina. The organization recently applied for patent protection for its method of preventing suicide among soldiers and veterans. According to the organization’s website, veteran suicide deaths will be 23 times higher than the number of post-9/11 combat deaths by 2030. In addition to the tragic loss of life and devastating impact on the families of the deceased, such (relatively) high suicide rates could cost the U.S. $221 billion.

Suicide rates among soldiers and veterans

Suicide rates tend to be higher among soldiers and veterans than among the general population. A New York Times article from 2019 revealed that suicide rates among veterans had risen by 30 percent since 2000. According to Stop Soldier Suicide, “veterans are at 50% higher risk of suicide than their peers who have not served.”

Suicide is considered preventable and, given the adverse situations soldiers might encounter and the well-documented challenges of transitioning back into ‘regular’ civilian life after service, it’s not surprising that several administrations have focused on addressing this epidemic, with the Trump administration rolling out a 10-point strategy as recently as June 2020.

Conventional approaches to gauging suicide risk focused on reviewing previous diagnoses and considered factors such as substance abuse. They relied heavily on practitioner instinct and experience, which are subjective and unscientific. In much the same way that AI is being used to support – not replace – radiologists and doctors in diagnosing disease, the use of AI in preventing soldier suicide could be a great example of how technology can be used to augment services typically provided by humans.

The idea of supplementing the human approach to suicide prevention with technology is not entirely new – indeed, the Department of Veterans Affairs (VA) adopted artificial intelligence to reduce instances of suicide among veterans in its care back in 2017 and claims to have had significant success.

How can AI help prevent soldier suicide? The tech behind the tool

The system developed by Stop Soldier Suicide uses data gathered from decedents who died by suicide, including data from third-party aggregators and from digital devices used by the deceased. Examples of such data disclosed in the patent application include social media posts, direct messages, connections with friends, web browsing history, blog posts, text messages, photos, gaming profiles and more. This data is then processed into a large dataset, and AI/machine learning algorithms are applied to generate predictive models of behavior correlated with veteran suicide.
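To make the pipeline concrete, here is a minimal sketch of the first step – collapsing heterogeneous records (posts, messages, browsing events) from many sources into per-person feature counts that a model could later consume. This is purely illustrative, not the patented method; every field name and keyword below is hypothetical.

```python
from collections import Counter

def extract_features(records):
    """Collapse heterogeneous records into a simple per-person feature count.
    Field names ("source", "keywords") are illustrative assumptions."""
    features = Counter()
    for record in records:
        # Count how often each data source appears for this person
        features[record["source"]] += 1
        # Count flagged terms found in the content of each record
        for keyword in record.get("keywords", []):
            features[f"kw:{keyword}"] += 1
    return dict(features)

# Toy records standing in for aggregated device and third-party data
records = [
    {"source": "social_post", "keywords": ["isolation"]},
    {"source": "dm", "keywords": []},
    {"source": "web_visit", "keywords": ["isolation", "insomnia"]},
]
print(extract_features(records))
```

A real system would of course need far richer feature engineering (text embeddings, temporal patterns) before any model training, but the shape of the task – many noisy sources reduced to one feature vector per person – is the same.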

The value is in merging private data with publicly available data on the same individuals. Such augmentation is not a trivial task. Combining the publicly visible data footprint with private data means multiple factors are used in parallel, and this enables a higher level of sensitivity and accuracy.
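In its simplest form, this kind of augmentation is a join of public and private records keyed by person. The sketch below assumes clean, shared identifiers – a loud simplification, since real record linkage involves fuzzy matching, deduplication, and consent tracking for each source:

```python
def merge_profiles(public, private):
    """Join the public footprint and private device data for each person.
    Both inputs are dicts keyed by a shared person identifier (an assumption)."""
    merged = {}
    for person_id in public.keys() | private.keys():
        # Later sources overwrite earlier ones on key collisions
        merged[person_id] = {**public.get(person_id, {}),
                             **private.get(person_id, {})}
    return merged

# Hypothetical per-person factors from each side of the merge
public = {"p1": {"blog_posts": 12, "connections": 240}}
private = {"p1": {"messages": 1830, "browse_events": 9400}}
print(merge_profiles(public, private)["p1"])
```

The merged record is what lets multiple factors be evaluated in parallel: a model sees both the public footprint and the private signals for the same person at once.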

The legal perspective on using AI and data to predict human behavior

The benefits of using AI and algorithms to identify possible suicide risks and thus being able to provide support and help save lives are undoubtedly huge. However, there are obviously a few key factors that need to be considered.

Data belonging to the deceased is, in many instances, afforded less privacy protection. But gaining access to data from messaging, web browsing history, and other private activities or interactions would require extensive management of consents for each data source, assuming those who enter service retain the same level of privacy protection as civilians. Perhaps legislation will be needed to address such broad consent rights, balancing privacy rights with the public’s interest in suicide prevention.

If there is an opt-in mechanism, what restrictions, if any, should there be on the use of the data beyond suicide prevention, and how should such limitations be communicated to the servicemember to encourage participation? Should the data be admissible for all types of legal proceedings? Should tech companies be afforded limited liability in exchange for their participation, bearing in mind the overarching goal?

Or should we collectively take the view that it’s better to know that somebody’s looking out for you, even if they might get it wrong from time to time? After all, most of us consent to the use of our data by tech giants like Google and Meta in exchange for ‘free’ access to tools and services. Is it, therefore, reasonable – especially in a climate where it’s become more common to talk about mental health – to consent to sharing data from our private lives if it can contribute to the important goal of saving lives?

How else can AI and data be used to improve healthcare and wellness?

  • Improving the efficacy of seasonal vaccinations (such as the flu jab) by analyzing historical data and using predictive data analytics to anticipate and identify virus mutations and adjust vaccination formulas accordingly. [Time]
  • Enhancing the accuracy of diagnosis, reducing radiologist backlogs, and potentially detecting conditions much earlier in their development than human-only approaches typically do. [World Economic Forum]
  • Forecasting air quality and pollution levels based on historical data and climatological conditions (temperature, wind, humidity) to inform better decisions about when to spend time outdoors and what type of activities can be done. [Open Access Government]
  • Helping customers to find the vitamins and supplements they need quickly and conveniently by aggregating data and products from a variety of sources in one place. Cledar helped Vitaminology build the world’s first search engine dedicated to vitamins and supplements to help customers find products that match their needs at the best price. [Vitaminology case study]

Interested in harnessing the power of data and AI to improve healthcare outcomes?

If you have an interesting use case involving AI and data aimed at improving healthcare outcomes and wellbeing, get in touch with us and we’ll be happy to discuss.

Congratulations to Glenn Devitt and Chris Ford at Stop Soldier Suicide, Inc. The patent can be found on the website of the US Patent & Trademark Office.

Curious about the latest tech insights? Follow us on our social media.



Interested in enhancing your AI capabilities?

Leave your details and any questions or ideas you might have, and we’ll get back to you to schedule a call.