Jerry Evans Jr. fills out a suicide report every time he picks up the phone. “Everyone who calls the crisis line gets assessed. If they call trying to refill a prescription, because they just dialed the wrong number, they still get assessed,” he explained.

He’s a responder at the Veterans Crisis Line. It’s a hotline administered by the Department of Veterans Affairs that takes calls from veterans, active-duty soldiers and civilians who are seeking help for suicidal thoughts and behavior. Evans wants to know whether the person at risk has a plan and the necessary means to hurt himself — if he shows serious intent. “We find out if they have reasons for wanting to die. Then we try to find reasons for them to keep living,” Evans said.

In these tense conversations, responders like Evans try to comprehend the swirl of factors behind the caller’s state and to accurately gauge suicide risk. But Evans is still a responder — ideally, high-risk individuals would be identified before they feel the need to call the crisis line.

Doing that isn’t easy. There’s a long list of risk factors for suicide — marriage and financial problems, depression, alcohol and drug abuse, chronic pain, post-traumatic stress disorder (PTSD), and so on — and each person’s mix is different. It’s the kind of task better suited to computers than to humans. (Research has repeatedly shown that doctors are poor at predicting who is at risk of suicide.)

With the help of people, the computers are getting better at it. Researchers have created — and are still honing — a model to predict who might commit suicide. That model relies on sophisticated algorithms and a massive amount of data, and it’s blossoming at a surprising institution: the Army.

The Army is constructing a high-tech weapon to fight suicide because it’s losing the battle against it.

In 2012, more soldiers committed suicide than died while fighting in Afghanistan: 349 suicides compared to 295 combat deaths. That’s a symptom of a military suicide rate that has been on the rise since 2005, far outpacing the general population’s rate.

Every category of soldier has seen its suicide rate rise since 2003: those currently deployed, previously deployed and even the never-deployed. That last group’s change suggests that what’s behind the phenomenon isn’t merely more Army soldiers in combat, or the nature of that combat. Not even ending the wars in Iraq and Afghanistan could bring the numbers down.

In spring 2008, the then-secretary of the Army, Pete Geren, noticed the upward trend in soldier suicides and convened some experts to try to find an answer. But there was no easy solution — no single, identifiable fix. “I don’t think of suicide as a problem that has an answer,” said Michael Schoenbaum, who was part of the initial group and is a senior adviser for mental health services, epidemiology and economics at the National Institute of Mental Health (NIMH). Rather, it has many.

And so, in 2009, to help identify suicide’s myriad causes, the Army started the largest suicide study ever: the Study to Assess Risk and Resilience in Servicemembers (STARRS). Combining the resources of the Army with the research abilities of academics at several universities and the NIMH, the STARRS program is to run through June but will release its findings over many years after that. Its mission is to understand the shared characteristics of Army soldiers who commit suicide. From 2004 to 2009, the study tracked more than 1.6 million soldiers.

This endeavor is possible because the Army is so integrated into soldiers’ lives. As Ron Kessler, one of the principal researchers in the STARRS program, described the Army, “It’s the employer, the doctor, the judge and jury. No place else can you get all this data on one person but the military.”

Equipped with all this soldier-level data, the next challenge was how to make sense of it all.

That’s where the model comes in. The U.S. military has more than a million people scattered across the world. The vast majority have negligible suicide risk, so an intervention program can’t be efficient unless the military can winnow down the population to the high-risk individuals. The STARRS program needed to “build smaller haystacks with a higher concentration of needles,” Schoenbaum said.

STARRS researchers developed an algorithm to predict suicides in soldiers, as reported in an article published last month in JAMA Psychiatry. Specifically, the JAMA study tracked soldiers who were hospitalized for some sort of psychiatric disorder and then released. From 2004 to 2009, 53,769 hospital stays were examined, totaling more than 40,000 soldiers who represented nearly 1 percent of all soldiers in any given year.

Using sophisticated machine learning methods, the algorithm distilled more than 400 personal characteristics into a smaller set of factors that were consistently predictive of suicidal behavior. Among them: being male; having undergone psychiatric inpatient treatment; suffering from major depression; and attempting suicide before.

But the model also identified other Army-specific characteristics correlated with high suicide risk, including enlisting in the Army at age 27 or older, having nonviolent weapons offenses and scoring above the 50th percentile in the Armed Forces Qualification Test.
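The JAMA study’s actual methods were far more sophisticated than this, but the basic distillation step, starting from many candidate characteristics and keeping the few that actually predict the outcome, can be sketched with a simple screen on invented data. Everything below (the feature count, effect sizes and screening rule) is hypothetical, chosen only to illustrate the idea:

```python
import random

random.seed(42)

# Toy data: each "soldier" has 40 binary characteristics (all hypothetical);
# only characteristics 0, 1 and 2 actually raise the chance of the outcome.
N_FEATURES, N_PEOPLE = 40, 5000
SIGNAL = {0: 0.20, 1: 0.15, 2: 0.10}  # extra outcome probability per flag
BASE_RATE = 0.02

people = []
for _ in range(N_PEOPLE):
    x = [random.randint(0, 1) for _ in range(N_FEATURES)]
    p = BASE_RATE + sum(bump for i, bump in SIGNAL.items() if x[i])
    y = 1 if random.random() < p else 0
    people.append((x, y))

def event_rate(rows):
    return sum(y for _, y in rows) / max(len(rows), 1)

# Screen: for each characteristic, compare the outcome rate when the flag
# is set against when it is unset, and keep the largest gaps.
gaps = []
for i in range(N_FEATURES):
    flagged = [(x, y) for x, y in people if x[i] == 1]
    unflagged = [(x, y) for x, y in people if x[i] == 0]
    gaps.append((event_rate(flagged) - event_rate(unflagged), i))

top = sorted(i for _, i in sorted(gaps, reverse=True)[:3])
print("most predictive characteristics:", top)
```

On this synthetic data the screen recovers the three planted characteristics; with real soldiers, the challenge the researchers describe is that hundreds of such variables are correlated with one another, which is why the study leaned on more careful machine learning methods than a one-variable-at-a-time comparison.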

But researchers were cautious not to over-interpret any one variable. Many are correlated, and no single one is enough to raise a red flag. The statistical model needs to be fed dozens of data points about an individual; otherwise, it doesn’t provide much predictive power.

Take “hearing loss,” for example, a variable associated with higher suicide risk. On its own, hearing loss hardly seems like a driver of suicide; but it is also associated with traumatic brain injury, which suggests it works as a proxy for deeper trauma rather than as a cause in itself.

The model’s predictive abilities were impressive. Soldiers rated in the top 5 percent of risk accounted for 52 percent of all suicides in the period covered — they were the needles, and the Army was starting to find them. The table below shows the suicide rate for the U.S. as a whole, the military (all branches) and the Army, along with the rate for the highest-risk groups as predicted by the model. That top 5 percent committed suicide at an extraordinary rate of about 3,824 per 100,000 person-years.
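Once a model has scored everyone, the concentration statistic is simple arithmetic: rank people by predicted risk, take the top 5 percent, and ask what fraction of all events they account for and what their event rate is per 100,000 person-years. The sketch below uses invented numbers, not STARRS data; only the arithmetic mirrors the study’s reporting:

```python
import random

random.seed(1)

# Invented population: each person gets a model risk score, and outcomes
# are made far more likely in the top tail of scores.
people = []
for _ in range(100_000):
    score = random.random()                   # predicted risk, 0..1
    p = 0.0005 * (61 if score > 0.95 else 1)  # concentrate events in the tail
    people.append((score, random.random() < p))

people.sort(key=lambda pair: pair[0], reverse=True)
top = people[: len(people) // 20]             # top 5 percent by score

top_events = sum(event for _, event in top)
total_events = sum(event for _, event in people)
share = top_events / total_events

follow_up_years = 1.0                         # assume one year of follow-up each
rate = top_events / (len(top) * follow_up_years) * 100_000

print(f"top 5% share of events: {share:.0%}")
print(f"top 5% rate per 100,000 person-years: {rate:.0f}")
```

The rarer the outcome and the more skewed the scores, the more the top slice dominates; that is the “smaller haystack with a higher concentration of needles” in numerical form.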

Matthew Nock, a co-author of the JAMA article and a professor at Harvard University, was careful not to oversell the model, framing it as an incremental improvement. “I’m optimistic,” he said. “We’re getting denser and denser haystacks. But we still have further to go. We’re not going to wave the ‘Mission Accomplished’ flag and say we’ve done it.”

They’ll know they’ve done it when the model is being used to identify those at risk as their symptoms manifest. That’s when it moves from an academic study to a tool that helps authorities and advocates take action.

The specific actions can vary. Often, it’s just keeping patients connected to the care system. As Schoenbaum said, “Don’t just send them home. You schedule follow-up visits. If they have family members, you coordinate with them.”

Army clinicians say too many soldiers hospitalized for a psychiatric disorder will drift away after discharge and not return for care. “The visits often don’t happen. People just don’t follow directions,” Schoenbaum said.

An algorithm can help, but it can’t do everything. That’s where clinicians come in; they can tailor treatment regimens for people with suicidal tendencies. Right now, though, these conversations are happening without a predictive algorithm, if they’re happening at all.

“In a way we’re stating the obvious; that mental health problems are associated with suicide risk. That in and of itself is not news, but we’ve turned it into a quantitative algorithm,” Schoenbaum said.

Evans, the crisis line responder, is open to using data. But in the moment, he said, “we have to treat every call with what they’re saying right then, high risk or not.” Perhaps the mountains of data and fancy algorithms will not be capable of capturing the deeper, emotionally complex factors behind suicide.

When I asked Evans about the callers’ most common symptoms, he enumerated the risk factors cited in the literature — drug abuse, PTSD, depression. But then he paused and thought about it.

“Feelings of hopelessness and helplessness are the biggest things,” he said. “We get a lot of calls about loneliness.”