Yet there is one frontier in suicide prevention that seems especially promising, even if it may be a bit removed from the problem's human element: big data prediction and intervention targeting.
We know that some populations are more likely than others to commit suicide. Men in the United States account for 79 percent of all suicides. People in their 20s are at higher risk than other age groups. And whites and Native Americans tend to have higher suicide rates than other ethnic groups. Yet we lack the ability to grasp trends and niche factors well enough to build actionable, targetable profiles of the communities where we should focus our efforts. We're stuck trying to expand a suicide prevention dragnet rather than getting at-risk individuals the precise information they need (even when they show no obvious warning signs to friends and family).
That’s a big part of why, last year, groups like the National Action Alliance for Suicide Prevention’s Research Prioritization Task Force listed better surveillance, data collection, and research on existing data as priorities for the field over the next decade. It’s also why multiple organizations are now developing algorithms to sort through diverse datasets, trying to identify behaviors, social media posting trends, language, lifestyle changes, or any other proxy that can help predict suicidal tendencies. By doing this, the theory goes, we can target and deliver exactly the right information.
One of the greatest proponents of this data-heavy approach to suicide prevention is the United States Army, which suffers from a suicide rate many times higher than that of the general population. In 2012, the Army lost more soldiers to suicide than to combat in Afghanistan. Yet with soldiers stationed around the globe and limited suicide prevention resources, it’s been difficult to simply expand the dragnet. Instead, last December the Army announced that it had developed an algorithm that distills a soldier’s personal information into a set of 400 characteristics that, taken together, indicate whether an individual is likely in need of intervention. The analysis isn’t perfect yet, but it has identified a cluster of characteristics marking the 5 percent of military personnel who accounted for 52 percent of suicides, a sign that the Army is on the right track to better targeting and allocation of prevention resources.
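The Army hasn’t published the details of its model, but the underlying idea is easy to illustrate. Here’s a minimal, hypothetical sketch (synthetic data, stand-in features, and an off-the-shelf logistic regression, none of which come from the Army’s actual system): train a risk model on historical records, rank a held-out population by predicted risk, and check what share of actual cases falls in the top 5 percent.

```python
# Hypothetical sketch: does a risk score concentrate cases in its top tier?
# Synthetic data only -- the Army's real features and model are not public.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n, n_features = 50_000, 400            # stand-in for ~400 characteristics
X = rng.normal(size=(n, n_features))
true_w = rng.normal(size=n_features) * (rng.random(n_features) < 0.05)
risk = 1 / (1 + np.exp(-(X @ true_w - 6)))   # rare outcome by construction
y = rng.random(n) < risk

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Rank the held-out population by predicted risk and take the top 5%.
scores = model.predict_proba(X_test)[:, 1]
cutoff = np.quantile(scores, 0.95)
top_tier = scores >= cutoff
share = y_test[top_tier].sum() / max(y_test.sum(), 1)
print(f"Top 5% of predicted risk holds {share:.0%} of actual cases")
```

The point of a check like this isn’t a perfect prediction for any one person; it’s confirming that a small, identifiable slice of the population carries a disproportionate share of the risk, which is exactly what makes targeted allocation of scarce prevention resources possible.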
Yet perhaps the greatest distillation of this data-driven approach (combined with the expansive, barrier-reducing impulse of mainstream efforts) is the Crisis Text Line. Created in 2013 by organizers from DoSomething.org, the text line allows those too scared, embarrassed, or uncomfortable to voice their problems to friends or over a hotline to simply type out their problems in a text message to the short code 741741. As of 2015, machine learning lets the Crisis Text Line search for keywords, drawing on over 8 million previous texts and data gathered from hundreds of suicide prevention workers, to identify who’s at serious risk and assign counselors to respond. More than that, the timing and vocabulary of a text can trigger signals that match counselors with expertise in certain areas to specific texters, or surface precisely tailored resources. For example, the system knows that self-harm peaks at 4 a.m. and that people typing “Mormon” are usually dealing with issues related to LGBTQ identity, discrimination, and isolation. Low-cost and low-friction, with high potential for delivering the best information possible to those in need, it’s one of the cleverest young programs pushing forward the suicide prevention gains made over the last century.
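Crisis Text Line hasn’t published its internals, but as a rough illustration of keyword-driven triage, a system along these lines might score an incoming message against weighted severity terms and topic lists, then route the texter accordingly. Everything below (the weights, the keyword lists, the `triage` function) is hypothetical, invented for illustration:

```python
# Rough illustration of keyword-driven triage -- hypothetical weights and
# topic lists, not Crisis Text Line's actual (unpublished) system.
from dataclasses import dataclass

# Higher-weight terms bump a message up the response queue.
SEVERITY_WEIGHTS = {"pills": 3, "bridge": 3, "goodbye": 2, "hopeless": 1}

# Topic keywords route the texter to a counselor with relevant expertise.
TOPIC_KEYWORDS = {
    "lgbtq": {"mormon", "coming out", "closeted"},
    "self_harm": {"cutting", "burns"},
}

@dataclass
class Triage:
    severity: int
    topics: list[str]

def triage(message: str) -> Triage:
    text = message.lower()
    severity = sum(w for term, w in SEVERITY_WEIGHTS.items() if term in text)
    topics = [t for t, kws in TOPIC_KEYWORDS.items()
              if any(kw in text for kw in kws)]
    return Triage(severity=severity, topics=topics)

print(triage("I grew up Mormon and I feel hopeless"))
# Triage(severity=1, topics=['lgbtq'])
```

A real system learns these associations from millions of past conversations rather than hand-coding them, but the shape of the output is the same: a priority for the queue and a hint about which counselor should pick up the conversation.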
It’ll be a few years before we can gauge the impact of data analysis and targeting on suicide prevention efforts, especially relative to general attempts to expand existing programs. And given how modest the returns have been on a half-century of serious gains in understanding and resource provision, we’d be wise not to get our hopes up too much. But it’s not unreasonable to suspect that a combination of diversifying means of access, lowering barriers to communication, and better identifying those at risk could help us bring programs to populations that have not yet received them (or that we could not reach quickly enough before). At the very least, crunching existing data may help us discover why suicide rates have risen in recent years and understand the mechanisms of this widespread social issue.

We have solid, logical reasons to support the development of programs like the Army’s algorithms and the Crisis Text Line, and to push for further initiatives in the same vein. But really we have reason to support any kind of suicide prevention innovation, even if it feels less robust or promising than the recent data-driven efforts. If you've ever witnessed the pain of those moving toward suicide, or the wide-reaching fallout after someone takes his or her life, you'll understand the visceral, human need to let a thousand flowers bloom, desperately hoping that one of them takes root. If data mining and targeting work well, hopefully that will only inspire further innovation, slowly putting a greater and greater dent in the phenomenon of suicide.