Viewpoint

Why the Nails Should Boss the Hammers

Published Online: https://doi.org/10.1176/appi.ps.201900218

As the digital revolution finally reaches the mental health clinic, we can point to actual examples of modern information technology ready to improve mental health care. The Veterans Health Administration is implementing population-based outreach to prevent suicide, driven by machine learning–derived risk prediction models (1). The Food and Drug Administration recently approved the first “prescription digital therapeutic,” a mobile phone app officially endorsed as safe and effective treatment for substance use disorders. Computerized voice processing may soon be capable of providing real-time feedback to psychotherapists regarding quality of treatment and therapeutic alliance (2).

Excitement about the potential of new information technologies, however, has sometimes focused more on the high-tech tools than on the problems we hope they can solve. Focusing on the technology, we might orient around finding new mental health applications for eHealth or machine learning rather than around unmet treatment needs or gaps in mental health care. In our excitement about new technology, we can become too much like the young child with a shiny new hammer who’s looking to pound anything that might be a nail.

If we hope to find the maximum benefit from exciting new tools, we should first identify the jobs that need doing. As readers of this journal are well aware, we need not look far to identify important areas of unmet need or “pain points” in our delivery of services for mental or substance use disorders.

Recent work on suicide risk prediction illustrates the relationship between jobs and tools. The unmet need is clear; rates of suicide continue to increase, and traditional clinical assessment is hardly better than chance for identifying people at highest risk. This could be a job for machine learning or artificial intelligence tools. If we hope to inform a population-based outreach program, then we would start with a case-control design comparing all people who attempt or die by suicide with a control group drawn from the same at-risk population (3). We would use the resulting prediction models to identify the highest-risk individuals in the population. If we hope to deliver accurate individual risk predictions to clinicians at the point of care, then we would start with a cohort design including all types of visits for which we hope to deliver predictions (4). We would use the resulting prediction models to estimate risk at future visits. In either case, we might use any of several machine learning tools, such as penalized or regularized parametric models, decision tree–based models, or deep learning models. Available evidence suggests that those model development tools have generally similar performance for this specific job. No amount of artificial intelligence, however, can replace good clinical epidemiology, which entails clearly identifying an aim or question and matching that aim or question to the appropriate research design. In neither of the above scenarios would we use machine learning or artificial intelligence tools to simply confirm long-established risk factors for suicidal behavior. We don’t need a more complicated, more expensive, and less transparent tool for that relatively simple job.
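To make the cohort-design scenario concrete, the sketch below shows what one interchangeable "hammer" might look like: an L1-penalized logistic regression fit to visit-level data and assessed on held-out visits. It is a minimal illustration only; the data are synthetic, and the feature count, penalty strength, and sample size are arbitrary assumptions rather than values from the cited studies.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic visit-level cohort: each row is one outpatient visit with 20
# illustrative predictors (stand-ins for items such as prior diagnoses or
# questionnaire scores; none of this is real patient data).
rng = np.random.default_rng(0)
n_visits = 10_000
X = rng.normal(size=(n_visits, 20))
# Rare binary outcome driven mostly by the first predictor.
y = rng.binomial(1, 1 / (1 + np.exp(-(1.5 * X[:, 0] - 3.0))))

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y
)

# One of several interchangeable tools: a penalized (L1) parametric model
# that shrinks uninformative coefficients toward zero.
model = LogisticRegression(penalty="l1", solver="liblinear", C=0.1)
model.fit(X_train, y_train)

# In a cohort design, these predicted probabilities would estimate risk at
# future visits; here we simply check discrimination on held-out visits.
risk = model.predict_proba(X_test)[:, 1]
print(f"Held-out AUC: {roc_auc_score(y_test, risk):.2f}")

Swapping in a gradient-boosted tree or a deep learning model would change only the model-fitting lines; the prior questions about who belongs in the cohort and which visits count are matters of clinical epidemiology, not of the tool.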

A philosophy of user-centered design (5) begins the development of any product or service with understanding the needs, priorities, preferences, and constraints of end users. This approach is most often applied to the development of interventions, especially technology-enabled interventions, but it is equally applicable to the development of prediction models or computerized decision support systems. Development of a prediction model would begin with questions such as: Which decision maker do I aim to help? What decision(s) does this person need to make, and when and where must each decision be made? What information could be available to inform those decisions? What is the most helpful way to deliver any recommendation or prediction?

Unfortunately, user engagement often comes late in the process of developing new mental health (or general health) technologies. Rather than ask what consumers or patients might want or need, late-stage user engagement focuses on how to entice consumers or patients to use a tool we’ve already developed. User engagement regarding the look and feel of a product or tool is fundamentally different from user engagement regarding what product is actually needed. To continue the hammers-and-nails analogy, late-stage user engagement is akin to fine-tuning the shape of the hammer rather than asking what (if anything) needs to be built.

We should also remember that technology trends change nearly as fast as the seasons and that human nature evolves quite slowly. If we had been designing “digital therapeutics” as recently as 2015, we would have been focused on the exciting potential of Google Glass. At the height of the Bitcoin craze, anything involving Blockchain would have seemed unstoppable—no matter how irrelevant Blockchain technology was to the problem at hand. Adding “Blockchain” to the name of the Long Island Iced Tea Corporation boosted the stock price by 200%, followed soon by a crash and a federal investigation. Those are instructive cautionary tales about technology fads. In contrast, our human problems with fear, hopelessness, impulsivity, apathy, or distraction are likely to be just as important generations from now.

Although I am old enough to remember when computers filled whole rooms and were controlled by punch cards, I am definitely not a Luddite. I spend much of my time developing and improving machine learning tools to identify people at risk for suicide. And I have helped develop and test eHealth interventions for dialectical behavior therapy and effective self-management of bipolar disorder. I am genuinely excited about the potential for artificial intelligence to support better human decision making and the potential for eHealth and mHealth interventions to deliver empirically supported psychosocial treatments when and where they are actually needed. But I hope to stay focused on the unmet needs themselves: inconsistent clinical decision making and the limited current reach of empirically supported psychosocial treatments. I should have no allegiance to any specific tools—only to finding the right tools to solve those problems.

We will focus our attention in the right places if we remember who we work for. We call our work psychiatric services (or mental health services) because we aim to serve people who live with mental health conditions—and their families and caregivers.

I’m sure the manager of my local home improvement store would prefer I start my weekend shopping for new tools, rather than first identifying which jobs I need to do. If I were serving the home improvement store, I’d end my weekend with a basement full of shiny new tools and many jobs left undone. Let’s make sure that the jobs rule over the tools and not the other way around.

Kaiser Permanente Washington Health Research Institute, Seattle.
Send correspondence to Dr. Simon.
References

1 Reger GM, McClure ML, Ruskin D, et al.: Integrating predictive modeling into mental health care: an example in suicide prevention. Psychiatr Serv 2019; 70:71–74

2 Imel ZE, Caperton DD, Tanana M, et al.: Technology-enhanced human interaction in psychotherapy. J Couns Psychol 2017; 64:385–393

3 Kessler RC, Hwang I, Hoffmire CA, et al.: Developing a practical suicide risk prediction model for targeting high-risk patients in the Veterans Health Administration. Int J Methods Psychiatr Res 2017; 26(3)

4 Simon GE, Johnson E, Lawrence JM, et al.: Predicting suicide attempts and suicide deaths following outpatient visits using electronic health records. Am J Psychiatry 2018; 175:951–960

5 Lyon AR, Bruns EJ: User-centered redesign of evidence-based psychosocial interventions to enhance implementation—hospitable soil or better seeds? JAMA Psychiatry (Epub ahead of print, Nov 14, 2018)