Published Online: https://doi.org/10.1176/appi.ps.201800430

Abstract

Objective:

The authors documented rates of sustained use of an evidence-based practice following training sponsored by New York State (NYS), and they identified clinician characteristics related to sustained use.

Methods:

Clinicians (N=89) who were employed in licensed NYS Office of Mental Health agencies serving children and adolescents and who were trained to proficiency in Managing and Adapting Practice (MAP) in 2016 were contacted between 9 and 18 months later and asked whether they were still using (users) or had stopped using (nonusers) MAP and their reason for doing so.

Results:

Responses were received from 57% of trainees; of those, 80% reported continued use of MAP. Score on the appeal subscale of the Evidence-Based Practices Attitude Scale (EBPAS) was the only measure on which users and nonusers differed significantly.

Conclusions:

Most clinicians reported sustained use of MAP. The EBPAS appeal subscale can be used to identify clinicians who are likely to discontinue use.

HIGHLIGHTS

  • Responders to a follow-up survey reported high levels of sustained use of an evidence-based practice for 9 to 18 months after initial training.

  • The four-item appeal subscale of the Evidence-Based Practices Attitude Scale identified clinicians who were likely to discontinue use.

In recent years, many states have taken on the challenge of educating their existing mental health care workforce in evidence-based practices (EBPs), both to improve the quality of care provided to children and adolescents and to address a shift toward reimbursing services for quality rather than quantity (1, 2). Because of the complexity of EBPs, training clinicians to fidelity has proven challenging. There are scant data evaluating these state training programs and even fewer on whether the use of EBPs is sustained after training (3, 4). Reviews suggest that sustainment of EBPs has been examined less frequently than implementation of EBPs (4), and the literature on sustainability is “fragmented and underdeveloped” (5).

New York State (NYS) has been a leader in the efforts to educate its existing mental health workforce in EBPs. After September 11, 2001, a large percentage of the population of NYS was in need of mental health services but the workforce was poorly prepared to deliver these services. Through a partnership with Columbia University, the NYS Office of Mental Health launched a training program in cognitive-behavioral therapy for childhood trauma (6). The success of this early effort led to the establishment of the Evidence-Based Treatment Dissemination Center (EBTDC) in 2006. EBTDC was designed as a quality improvement initiative for the clinical workforce in agencies licensed by the NYS Office of Mental Health to serve children and adolescents. In its first 3 years, EBTDC trained 916 clinicians and 275 supervisors (7).

In 2013, EBTDC began training in Managing and Adapting Practice (MAP), an evidence-informed approach to guide clinicians in the selection of treatments that match a child’s characteristics (8). MAP was selected because it has broad coverage of child populations and clinician-friendly decision support tools with measurable outcomes, and it has been successfully implemented across the United States (1, 9). Although the initial training was successful in training clinicians to fidelity, dropout from training was high (51.2%). Older clinicians were more likely to drop out, as were clinicians from downstate urban areas (1). To improve the dropout rate, multiple modifications were made to the NYS MAP training. After these modifications, the dropout rate decreased significantly to 12.3%, and the only predictor of dropout was a low score on the appeal subscale of the Evidence-Based Practices Attitude Scale (EBPAS) (2, 10).

However, no data were collected on sustained use of MAP. Therefore, the objectives of this longitudinal cohort study were to document the rate of sustained use of MAP after completion of the training sponsored by NYS and identify the characteristics related to sustained use.

Methods

The study population consisted of 89 clinicians who were employed in licensed NYS Office of Mental Health agencies serving children and adolescents and who were trained to proficiency in MAP from January 1 through December 31, 2016 (8). Nine to 18 months posttraining, clinicians were contacted via e-mail up to seven times and asked whether they were still using MAP and their reason for use or nonuse. Fifty-one (57%) clinicians completed the survey. The data collection part of the EBTDC program is considered to be a quality improvement activity and did not require institutional review board review.

Sociodemographic and professional practice characteristics—such as age, sex, race-ethnicity, education level, hours worked, service setting, and licensure—and reasons for enrolling in MAP training were all assessed prior to the training. The Texas Christian University (TCU) Survey of Organizational Functioning Efficacy and Director Leadership scales (11) were also administered prior to training. This scale and its components have excellent reliability (12). Clinicians’ familiarity and experience with Excel were also assessed prior to training.

The EBPAS was administered after the didactic training and again after the postconsultation Webinars (10). Three of the four subscales—appeal, openness, and divergence (12 questions in all)—were administered. The EBPAS has moderate to good internal consistency and reliability (10, 13). The language was slightly edited to ascertain clinicians’ attitudes toward MAP specifically rather than toward EBPs in general. Responses to individual items (each scored from 1 to 5) were averaged within each subscale to create a mean score.
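The subscale scoring described above amounts to a simple item average. A minimal sketch of that step is shown below; the function name and the response values are hypothetical, introduced only to illustrate the arithmetic:

```python
# Sketch of the EBPAS subscale scoring described above: item
# responses (each on a 1-5 scale) are averaged into a mean subscale
# score. Function name and data are illustrative, not from the study.
def subscale_mean(responses):
    """Return the mean of one clinician's item responses (1-5 each)."""
    if not responses:
        raise ValueError("no item responses given")
    if any(not 1 <= r <= 5 for r in responses):
        raise ValueError("item responses must lie on the 1-5 scale")
    return sum(responses) / len(responses)

# Four-item appeal subscale for one hypothetical clinician:
appeal_score = subscale_mean([3, 2, 3, 2])  # 2.5
```

A score near the bottom of the 1-5 range, as in this example, is the pattern the Results section associates with discontinued use.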

Categorical data were summarized with counts and percentages, whereas continuous data were summarized with means and standard deviations. Differences between the full sample of trained clinicians and the follow-up participants, between follow-up participants and nonparticipants, and between clinicians who continued to use (users) and did not continue to use (nonusers) MAP were evaluated with unpaired t tests for continuous data and chi-square tests for categorical data; logistic regression was used to estimate the association between clinician characteristics and sustained use. All analyses used SPSS, version 23.0 (14).
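The two bivariate tests named above can be sketched in a few lines. The study itself ran these in SPSS; the pure-Python versions below (Welch's form of the unpaired t statistic and the Pearson chi-square for a 2x2 table) are illustrative only:

```python
import math
from statistics import mean, variance

def welch_t(a, b):
    """Unpaired (Welch) t statistic comparing two continuous samples,
    e.g. age of responders vs. nonresponders."""
    se = math.sqrt(variance(a) / len(a) + variance(b) / len(b))
    return (mean(a) - mean(b)) / se

def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic for a 2x2 table of counts
    [[a, b], [c, d]], e.g. group membership x a binary characteristic."""
    n = a + b + c + d
    observed = [a, b, c, d]
    expected = [(a + b) * (a + c) / n, (a + b) * (b + d) / n,
                (c + d) * (a + c) / n, (c + d) * (b + d) / n]
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))
```

Converting these statistics to p values requires the t and chi-square reference distributions, which statistical packages such as SPSS supply.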

Results

Table 1 displays the characteristics of the 89 clinicians trained in 2016 and the 51 (57%) clinicians who responded to the follow-up survey. There were no statistically significant differences between the trained clinicians and the subgroup who completed the follow-up survey in sociodemographic or practice characteristics or in reasons for MAP training. Additionally, the two groups did not differ in their evaluation of organizational leadership and efficacy as measured by the TCU scales, in technology skills, or in EBPAS scores. Compared with responders (N=51), nonresponders (N=38) were less likely to be female (84% versus 100%, p=.003), more likely to report that MAP training was required by their agencies or supervisors (55% versus 33%, p=.05), and less likely to have a personal interest in using MAP (34% versus 63%, p=.01).

TABLE 1. Sociodemographic and practice characteristics of 89 clinicians who received training in Managing and Adapting Practice (MAP), by response to follow-up survey and use of MAP between 9 and 18 months after training

| Characteristic | Total (N=89) | Responders (N=51) | Nonresponders (N=38) | MAP users (N=41) | MAP nonusers (N=10) |
|---|---|---|---|---|---|
| Years with agency (M±SD) | 6.32±6.54 | 6.42±7.24 | 6.19±5.52 | 6.13±7.31 | 7.61±7.16 |
| Full-time employment, N (%) | 83 (93) | 47 (92) | 36 (95) | 37 (90) | 10 (100) |
| Field of highest degree: social work, N (%) | 68 (76) | 39 (76) | 29 (76) | 32 (78) | 7 (70) |
| Highest academic degree, N (%) | | | | | |
| — Bachelor’s | 4 (4) | 2 (4) | 2 (5) | 2 (5) | 0 |
| — Master’s | 80 (90) | 45 (88) | 34 (89) | 37 (90) | 9 (90) |
| — Doctorate | 4 (4) | 2 (4) | 2 (5) | 1 (2) | 1 (10) |
| Licensed, N (%) | 77 (87) | 45 (88) | 32 (84) | 35 (85) | 10 (100) |
| Age (M±SD) | 38.66±10.96 | 39.00±10.34 | 38.21±11.86 | 38.02±10.39 | 43.00±9.60 |
| Female gender^a, N (%) | 83 (93) | 51 (100) | 32 (84) | 41 (100) | 10 (100) |
| Race-ethnicity, N (%) | | | | | |
| — Non-Hispanic white | 63 (71) | 33 (65) | 30 (79) | 25 (61) | 8 (80) |
| — Black or African American | 12 (13) | 9 (18) | 3 (8) | 8 (20) | 1 (10) |
| — Other | 14 (16) | 9 (18) | 5 (13) | 8 (20) | 1 (10) |
| Experience with cognitive-behavioral therapy, N (%) | | | | | |
| — None/little | 10 (11) | 5 (10) | 5 (13) | 5 (12) | 0 |
| — Some/a lot | 76 (85) | 45 (88) | 31 (82) | 35 (85) | 10 (100) |
| — Expert/certified | 3 (3) | 1 (2) | 2 (5) | 1 (2) | 0 |
| Clinical setting, N (%) | | | | | |
| — Outpatient | 61 (69) | 35 (69) | 26 (68) | 28 (68) | 7 (70) |
| — School | 11 (12) | 6 (12) | 5 (13) | 5 (12) | 1 (10) |
| — Group home/residential/other | 17 (19) | 10 (20) | 7 (18) | 8 (20) | 2 (20) |
| Direct clinical service hours per week (M±SD) | 21.55±9.54 | 20.96±10.28 | 22.37±8.46 | 20.78±11.00 | 21.70±6.94 |
| TCU SOFEDL score (M±SD)^b | | | | | |
| — Efficacy subscale | 37.96±6.79 | 38.43±6.96 | 37.32±6.60 | 39.02±6.70 | 36.00±7.83 |
| — Director leadership subscale | 30.77±7.42 | 30.95±7.76 | 30.52±7.05 | 31.20±8.18 | 29.88±5.83 |
| — Total | 33.34±5.76 | 33.60±5.85 | 33.00±5.70 | 33.99±5.86 | 31.90±5.81 |
| Technology skills (M±SD)^c | 2.42±.78 | 2.45±.84 | 2.38±.69 | 2.46±.86 | 2.40±.78 |
| Reason for MAP training, N (%)^d | | | | | |
| — Required by supervisor/agency^a | 38 (43) | 17 (33) | 21 (55) | 15 (37) | 2 (20) |
| — Personal interest^a | 45 (51) | 32 (63) | 13 (34) | 26 (63) | 6 (60) |
| — Recommended by colleague/peer | 39 (44) | 19 (37) | 20 (53) | 13 (32) | 6 (60) |
| — Other | 2 (2) | 2 (4) | 0 | 2 (5) | 0 |
| EBPAS score (M±SD)^e | | | | | |
| — Total, directly after training | 2.79±.45 | 2.83±.45 | 2.74±.45 | 2.84±.46 | 2.82±.42 |
| — Total, 5 months after training^f | 2.70±.41 | 2.72±.46 | 2.67±.34 | 2.80±.44 | 2.38±.44 |
| — Appeal subscale, 5 months after training^f,g | 2.54±.53 | 2.60±.55 | 2.47±.50 | 2.69±.52 | 2.19±.51 |

^a Significant difference between respondents and nonrespondents (p<.05).

^b Texas Christian University Survey of Organizational Functioning Efficacy and Director Leadership. Possible scores for efficacy range from 5 to 25, with higher scores indicating more reported efficacy. Possible scores for director leadership range from 9 to 45, with higher scores indicating better leadership skills.

^c Possible scores range from 1 to 4, with higher scores indicating better technical skills.

^d Respondents gave more than one answer.

^e EBPAS, Evidence-Based Practice Attitude Scale. Possible scores range from 1 to 5, with higher scores indicating more positive attitude toward use of MAP.

^f Significant difference between MAP users and nonusers (p<.015).

^g Possible scores range from 1 to 5, with higher scores indicating higher appeal of MAP.


Sociodemographic and practice characteristics did not differ between MAP users (N=41, 80%) and nonusers (N=10, 20%). Similarly, there were no differences between MAP users and nonusers in the reason for seeking MAP training, the results of the TCU efficacy and director leadership subscales, the technology questions, or score on the EBPAS total scale at the end of the didactic training. The most common reason given for discontinued use was lack of time. Differences between MAP users and nonusers were observed in scores on the EBPAS administered after the consultation Webinars, both for total score (2.80±.44 versus 2.38±.44) and for the appeal subscale (2.69±.52 versus 2.19±.51). Logistic regression results (data not shown) suggested that clinicians who scored high on the appeal subscale were more likely to have continued using MAP compared with those who scored low on this scale (odds ratio=5.98, 95% confidence interval=1.31, 27.36).
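To make the reported effect size concrete: an odds ratio of 5.98 corresponds to a logistic regression coefficient of ln(5.98) ≈ 1.79, and exponentiating a symmetric 95% interval around that coefficient recovers the reported confidence bounds. In the sketch below, the standard error is back-derived from the published interval purely for illustration; the wide interval likely reflects the small number of nonusers (N=10):

```python
import math

# Reported effect: OR = 5.98, 95% CI = (1.31, 27.36).
beta = math.log(5.98)  # logistic regression coefficient, ~1.79

# Back-derive the coefficient's standard error from the CI width
# (illustration only; the study did not report the SE directly):
se = (math.log(27.36) - math.log(1.31)) / (2 * 1.96)  # ~0.78

# Exponentiating beta +/- 1.96*SE recovers roughly the reported CI:
ci_low = math.exp(beta - 1.96 * se)   # ~1.31
ci_high = math.exp(beta + 1.96 * se)  # ~27.3
```

Because the interval excludes 1, the association between appeal-subscale score and sustained use is statistically significant, consistent with footnote f of Table 1.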

Discussion

Data from this follow-up study of a cohort of clinicians trained in an evidence-informed practice, MAP, suggest that 80% of clinicians who responded to the follow-up survey continued to use the EBP for 9 to 18 months after training. Given the investment that states are making in training agencies and providers, this is an encouraging finding. Brookman-Frazee and colleagues (15) examined the sustainability of using any of six EBPs over a 57-month period among clinicians practicing in the Los Angeles Department of Mental Health. They found that 88.9% of clinicians made claims for at least one of the six EBPs and submitted claims for one or more EBPs for an average of 21.71 months. Notably, MAP was one of two EBPs that showed a lower risk of discontinuation compared with the other EBPs examined. Thus MAP is well suited for use in community mental health settings that serve children. Southam-Gerow et al. (9), also examining MAP implementation in Los Angeles County, concluded that 75% of clinicians were trained to proficiency and that MAP could be implemented on a large scale.

Given the importance of sustained use of EBPs to improve the quality of mental health care for children in community settings, the ability to identify trainees who are unlikely to continue use of an EBP is critical. Our data suggest that the four-item appeal subscale of Aarons’ EBPAS (10) can differentiate between those who will and will not sustain use. This short set of questions, which in earlier work identified those who dropped out of training (2), can also be used to identify those who might benefit from additional training to improve sustained use of MAP. Interestingly, only the total subscale score—rather than any individual items—predicted sustained use. Clinicians who discontinued use were more likely to report a lack of time to use the MAP system. Future research is needed to document whether providing additional training for clinicians who score low on the EBPAS appeal subscale improves rates of sustainability.

This study had certain limitations. The study sample consisted of clinicians who practice in NYS licensed mental health agencies and who volunteered for MAP training; thus they are a self-selected sample. Although the cohort of 89 clinicians trained in 2016 was contacted by e-mail up to seven times, the response rate (57%) was suboptimal. Although there were no statistically significant differences between the trained cohort and the follow-up sample in sociodemographic or practice characteristics, reasons for seeking MAP training, or attitudes toward organizational efficacy and leadership, undetected differences between the samples could have biased the results. Further, although considerable staff turnover in these community agencies is likely responsible for some of the loss to follow-up, other possible reasons for nonresponse include lack of perceived relevance of MAP, lack of agency support for participation, and clinical demands. Comparisons between responders and nonresponders suggest that interest in using MAP and feeling required by agencies or supervisors to participate in training may have driven nonresponse. Finally, MAP use was self-reported, and no data were collected on fidelity to MAP or on whether its use was related to improved client outcomes.

Conclusions

Data from NYS show that MAP was successfully adapted for use in a state system. Clinicians were trained to fidelity (2), and 80% reported having sustained use of the EBP for 9 to 18 months after training. Importantly, these data also show that the four-item appeal subscale of Aarons’ EBPAS (10) can be used to identify clinicians who are likely to discontinue use and who should be targeted for additional training.

Department of Child and Adolescent Psychiatry, New York University School of Medicine, New York (Horwitz, Lewis, Gleacher, Wang, Hoagwood); Division of Integrated Community Services for Children and Families, New York State Office of Mental Health (NYS OMH), Albany (Bradbury, Ray-LaBatt).
Send correspondence to Dr. Horwitz.

Data included in this article were presented at the National Institute of Mental Health Mental Health Services Research Conference, Bethesda, Maryland, August 1, 2018.

This study was supported by grant NYS OMH C008725 from the NYS OMH (Dr. Hoagwood, principal investigator).

The authors report no financial relationships with commercial interests.

References

1 Olin SS, Nadeem E, Gleacher A, et al.: What predicts clinician dropout from state-sponsored managing and adapting practice training? Adm Policy Ment Health 2016; 43:945–956

2 Vardanian MM, Horwitz SM, Storfer-Isser A, et al.: A second look at dropout rates from state-sponsored MAP trainings: can targeted adaptations improve retention in evidence-based practice trainings? Behav Therapist 2017; 40:273–282

3 McHugh RK, Barlow DH: The dissemination and implementation of evidence-based psychological treatments: a review of current efforts. Am Psychol 2010; 65:73–84

4 Novins DK, Green AE, Legha RK, et al.: Dissemination and implementation of evidence-based practices for child and adolescent mental health: a systematic review. J Am Acad Child Adolesc Psychiatry 2013; 52:1009–1025.e18

5 Wiltsey Stirman S, Kimberly J, Cook N, et al.: The sustainability of new programs and innovations: a review of the empirical literature and recommendations for future research. Implement Sci 2012; 7:17

6 Hoagwood KE, Vogel JM, Levitt JM, et al.: Implementing an evidence-based trauma treatment in a state system after September 11: the CATS project. J Am Acad Child Adolesc Psychiatry 2007; 46:773–779

7 Gleacher AA, Nadeem E, Moy AJ, et al.: Statewide CBT training for clinicians and supervisors treating youth: the New York State evidence-based treatment dissemination center. J Emot Behav Disord 2011; 19:182–192

8 Chorpita BF, Daleiden EL, Collins KS: Managing and adapting practice: a system for applying evidence in clinical care with youth and families. Clin Soc Work J 2014; 42:134–142

9 Southam-Gerow MA, Daleiden EL, Chorpita BF, et al.: MAPping Los Angeles County: taking an evidence-informed model of mental health care to scale. J Clin Child Adolesc Psychol 2014; 43:190–200

10 Aarons GA: Mental health provider attitudes toward adoption of evidence-based practice: the Evidence-Based Practice Attitude Scale (EBPAS). Ment Health Serv Res 2004; 6:61–74

11 Lehman WE, Greener JM, Simpson DD: Assessing organizational readiness for change. J Subst Abuse Treat 2002; 22:197–209

12 TCU Survey of Organizational Functioning (TCU SOF). Fort Worth, Texas Christian University, Institute of Behavioral Research, 2005. https://ibr.tcu.edu/forms/organizational-staff-assessments

13 Aarons GA, Glisson C, Hoagwood K, et al.: Psychometric properties and US national norms of the Evidence-Based Practice Attitude Scale (EBPAS). Psychol Assess 2010; 22:356–365

14 IBM SPSS Statistics for Windows, Version 23.0. Armonk, NY, IBM, 2015

15 Brookman-Frazee L, Zhan C, Stadnick N, et al.: Using survival analysis to understand patterns of sustainment within a system-driven implementation of multiple evidence-based practices for children’s mental health services. Front Public Health 2018; 6:54