The Influence of Team Functioning and Workload on Sustainability of Trauma-Focused Evidence-Based Psychotherapies
Abstract
Objective:
It has been over a decade since the U.S. Department of Veterans Affairs (VA) began formal dissemination and implementation of two trauma-focused evidence-based psychotherapies (TF-EBPs). The objective of this study was to examine the sustainability of the TF-EBPs and determine whether team functioning and workload were associated with TF-EBP sustainability.
Methods:
This observational study used VA administrative data for 6,251 patients with posttraumatic stress disorder (PTSD) and surveys from 78 providers from 10 purposefully selected PTSD clinical teams located in nine VA medical centers. The outcome was sustainability of TF-EBPs, which was based on British National Health Service Sustainability Index scores (possible scores range from 0 to 100.90). Primary predictors included team functioning, workload, and TF-EBP reach to patients with PTSD. Multiple linear regression models were used to examine the influence of team functioning and workload on TF-EBP sustainability after adjustment for covariates that were significantly associated with sustainability.
Results:
Sustainability Index scores ranged from 53.15 to 100.90 across the 10 teams. Regression models showed that after adjustment for patient and facility characteristics, team functioning was positively associated (B=9.16, p<.001) and workload was negatively associated (B=–.28, p<.05) with TF-EBP sustainability.
Conclusions:
There was considerable variation across teams in TF-EBP sustainability. The contribution of team functioning and workload to the sustainability of evidence-based mental health care warrants further study.
Posttraumatic stress disorder (PTSD) is a potentially disabling psychiatric disorder that disproportionately affects veterans (1,2). Fortunately, there are effective treatments (3–10). Treatments with the highest level of endorsement in clinical practice guidelines are trauma-focused evidence-based psychotherapies (TF-EBPs) (11,12). To implement two TF-EBPs, the U.S. Department of Veterans Affairs (VA) began nationwide competency-based training for cognitive processing therapy (CPT) (13) in 2006 and for prolonged exposure (PE) (14) in 2007 (15). VA policy requires that all VA patients with PTSD have access to one of these TF-EBPs (16).
Over the past decade, research has evaluated VA’s implementation of TF-EBPs in terms of provider and patient perceptions of TF-EBP acceptability, barriers and facilitators of provider uptake (17), and reach to patients with PTSD (18,19). Reach is a measure of the population impact of an intervention or program. It has been defined as the percentage and representativeness of individuals within a defined population who receive an intervention or program (20). Sayer et al. (18) found that outpatient PTSD teams with high levels of reach for TF-EBPs were organized differently than those with low reach. High-reach teams had a defined sense of purpose, or mission, to deliver TF-EBPs, internal and external leadership support for this mission, team members engaged in this mission, TF-EBP operational processes, and access to services outside the team for patients who do not want a TF-EBP or who have other treatment needs.
Another metric of implementation success is sustainability (21), which is defined as the extent to which a newly implemented treatment is maintained or institutionalized within a service (21,22). Although over a decade has passed since implementation of the TF-EBP training initiatives, very few studies have examined the sustainability of these psychotherapies (17). This omission is consistent with the larger dissemination and implementation literature, in which sustainability is understudied relative to initial adoption (23,24). In fact, sustainability is discussed in conceptual papers more often than it has been empirically investigated (21). Yet policy makers and program managers need to plan for sustainability so that the resources used in implementation are not wasted (25). Researchers have recommended that studies of program outcomes also identify factors that account for sustainability, given that such information can be used to guide effective implementation strategies and policy (26). In addition, data on the associations among implementation outcomes are needed to determine whether improvements in various outcome domains are positively related to each other (for example, whether greater reach affects the extent to which a practice becomes a cultural norm, which promotes sustainability). Data can also show whether various outcome domains are negatively related (for example, whether a short-term increase in reach burdens the system and undercuts its sustainability) (21).
This study is guided by the British National Health Service’s sustainability model, which uses a quantitative measure to estimate the likelihood of sustainability of a practice change. This model consists of 10 factors grouped into three larger domains: process (for example, credibility and adaptability of the change), staff (for example, leadership and staff support for the change), and organization (for example, infrastructure for sustaining the change and fit with the organization’s aims and culture) (22,27). The sustainability model was designed to help teams identify local contextual challenges to sustainability (27).
It is also important to determine whether additional factors help ensure that a new practice or program will continue. Although the sustainability model assesses staff support for a practice change (22,27), it does not assess team functioning. Team functioning refers to how well team members work together and includes interpersonal cohesion and activities related to the work of the team (28). Prior research indicates that dimensions of team functioning affect program implementation and sustainability within health care settings (29–34). In the VA, TF-EBPs are primarily delivered by providers working in specialized outpatient PTSD teams, known as PTSD clinical teams (15,19). Providers on these teams may have varying levels of success in working together to maintain the infrastructure and care processes needed for TF-EBP delivery. Thus high levels of team functioning may translate into greater TF-EBP sustainability. Mental health teams with greater cohesion may be more willing or able to support team members who are implementing TF-EBPs. A positive association between team functioning and sustainability could represent an important managerial consideration.
Another potentially important factor missing from the sustainability model is workload. Workload has been found to affect the provision of evidence-based care in both general medical and mental health settings (35–37). CPT and PE are time-intensive (eight to 12 weekly sessions), manualized treatments (13,14). Adding weekly sessions of a TF-EBP, which are 60 or 90 minutes in length, may be challenging if providers already have a high caseload. In fact, providers reported that lack of time is a barrier to use of these treatments (32,38,39). Thus it would be important to determine whether greater workload diminishes TF-EBP sustainability.
This study examined TF-EBP sustainability in VA outpatient PTSD clinics 10 years after initial implementation. We hypothesized that there would be a positive association between team functioning and TF-EBP sustainability and a negative association between workload and TF-EBP sustainability. To inform future research, we conducted exploratory analysis to examine the association between TF-EBP reach and sustainability.
Methods
The VA Central Institutional Review Board approved this research. An alteration of informed consent (to allow for online consent) and waiver of documentation of informed consent were obtained for the anonymous staff survey. A waiver of informed consent and a HIPAA waiver were obtained for use of the administrative data.
Data Source
This study is based on VA administrative data and the PTSD team staff surveys collected as part of a mixed-methods study on reach of TF-EBPs. For the parent study, we used purposive sampling to select 10 PTSD teams from nine diverse facilities, with oversampling of teams that had high reach of TF-EBPs. Results from site visit interviews are presented elsewhere (18).
Procedures
To select sites for the parent study, we extracted administrative and chart note data for fiscal year (FY) 2015 (October 2014–September 2015) for all patients with PTSD who received psychotherapy from outpatient PTSD teams at the nine facilities. We identified staff assigned to the selected teams through a point of contact on each team. We used e-mail to recruit clinical staff and peer support specialists assigned to each team to complete an anonymous staff survey. Survey recruitment methods involved an introductory e-mail followed one week later by an e-mail with a link to the online consent and survey and then by three weekly reminder emails with the same link (40). At the time of the final reminder, we also mailed printed copies of the consent form and the survey and self-addressed stamped envelopes to our point of contact for distribution to the PTSD team clinicians. Surveys were completed between August 2014 and October 2015.
Dependent and Independent Variables
EBP sustainability.
Our outcome measure was the British National Health Service Sustainability Index. This 10-item measure assesses staff perceptions of factors that influence sustainability (22,27) and has been used to evaluate sustainability as an outcome (41–43). For each item, responders choose the one of four weighted statements that they believe best describes that domain. Possible scores range from 0 to 100.9. Preliminary evidence suggests that a total score of 55 or higher indicates that an intervention or program is likely to be sustained, whereas a score of 45 or lower indicates that significant effort is required to address barriers to sustainability (26). The Sustainability Index also includes subscales reflecting the conceptual domains of process (N=4 items), staff (N=4), and organization (N=2). In the staff survey, we assessed sustainability of CPT and PE separately (CPT, α=.84; PE, α=.73). However, because the Sustainability Index scores for CPT and PE were highly correlated (r=.95), we combined them into one score (α=.78).
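The scoring scheme can be sketched in a few lines. This is a minimal illustration, not the published instrument: each item offers four weighted statements, the respondent picks one, and the total is the sum of the chosen weights. The per-item weights below are hypothetical placeholders, chosen only so that the maxima sum to the instrument's 100.9 ceiling.

```python
def total_score(chosen_levels, item_weights):
    """chosen_levels: index 0-3 of the statement picked for each item.
    item_weights: per-item list of four weights, in ascending order."""
    return sum(weights[level] for level, weights in zip(chosen_levels, item_weights))

# 10 items with HYPOTHETICAL weights; only the 100.9 maximum matches the instrument.
hypothetical_weights = [[0.0, w / 3, 2 * w / 3, w] for w in
                        [11.4, 9.3, 8.1, 9.6, 11.0, 15.3, 9.5, 8.4, 10.1, 8.2]]

best = total_score([3] * 10, hypothetical_weights)  # every top statement chosen
print(round(best, 1))  # prints 100.9 with these placeholder weights
```

Under this sketch, the combined CPT-and-PE score the authors report would simply be the mean of the two separately computed totals.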
Team functioning.
The staff survey included 53 items constituting 10 measures of team functioning domains. Twenty-one items drawn from the Survey of Organizational Attributes of Primary Care Practice assessed team communication, decision making, chaos and stress, and history of change (44,45). We used measures from the Learning Organization Survey to assess the extent to which the learning environment was supportive and leadership reinforced learning (46); Edmondson’s conceptualization of team behaviors to assess team learning behavior (47); organizational slack measures to assess organizational resources for staff, space, and time (48); and the Team Helping Measure to assess organizational citizenship behaviors (49). Response options ranged from 1 (disagree strongly) to 5 (agree strongly). Scores were computed as the means across completed items, and reliability was acceptable to high for all scales (Cronbach’s α≥.79) (50).
Given the large number of items and small sample size, we used principal-components analysis to determine whether these 10 measures could be combined to create a composite measure. The first component explained 90.1% of total variance (eigenvalue=5.73), demonstrating unidimensionality. Nine of the 10 team functioning measures collectively contributed to the total score, whereas one measure, history of change, did not (eigenvalue=.73, 12% of variance explained). Thus we calculated a composite team functioning measure on the basis of the first principal component by using nine scales, with higher scores indicating more positive team functioning (α=.94). History of change served as a separate independent variable. Kaiser’s overall measure of sampling adequacy was .90.
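The composite-construction step can be sketched with simulated data. The scale scores below are random stand-ins, not study data: nine correlated scales are z-scored, and each responder's composite is their score on the first principal component of the scale correlation structure.

```python
import numpy as np

rng = np.random.default_rng(0)
base = rng.normal(size=(78, 1))                  # shared "team functioning" signal
scales = base + 0.4 * rng.normal(size=(78, 9))   # nine correlated scale scores

# Standardize each scale, then project onto the first principal component.
z = (scales - scales.mean(axis=0)) / scales.std(axis=0)
eigvals, eigvecs = np.linalg.eigh(np.cov(z, rowvar=False))
order = np.argsort(eigvals)[::-1]                # eigh returns ascending eigenvalues
first_pc = eigvecs[:, order[0]]
composite = z @ first_pc                         # one composite score per responder

explained = eigvals[order[0]] / eigvals.sum()
print(f"first component explains {explained:.0%} of total variance")
```

With strongly intercorrelated scales, as reported here, the first component dominates and the composite is an eigenvector-weighted average of the standardized scale scores.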
Team workload.
We determined the number of patients with PTSD seen by the outpatient clinical teams by using the stop codes that designate team workload. Workload was calculated by dividing the number of PTSD patients seen by team members for any treatment in fiscal year 2015 by the number of PTSD providers on the team.
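The workload calculation is simple enough to check against the counts reported in Table 3; dividing each team's FY 2015 PTSD patient count by its provider count reproduces the reported workload means for, as examples, teams 5, 7, and 2.

```python
# (patients with PTSD seen in FY 2015, PTSD providers on the team), from Table 3
teams = {"5": (231, 3), "7": (1083, 22), "2": (389, 10)}

for team, (patients, providers) in teams.items():
    workload = patients / providers  # patients per provider, the paper's definition
    print(team, round(workload))     # prints 77, 49, and 39, matching Table 3
```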
TF-EBP reach.
In FY 2015, VA released chart note templates that allowed for identification of EBP type. Because the templates were new, we applied natural-language processing to identify additional TF-EBP cases from psychotherapy notes that were not derived from templates. Our natural-language processing system was based on the work of Shiner et al. (51) but was built by using hand-coded rules in the SAS Enterprise Guide, version 7.1. Performance for therapy note classification as measured by recall, precision, and F measure was strong (recall=.91 and .85, precision=.96 and .95, and F measure=.93 and .90, for CPT and PE, respectively). A patient who received at least one session of CPT, PE, or both in FY 2015 was designated as a TF-EBP recipient. TF-EBP reach was defined as the number of patients with PTSD who received a TF-EBP divided by the number of therapy patients with PTSD who received any form of psychotherapy on each team. Reach on each team ranged from 14% to 59% (mean±SD=36%±17%). Because we oversampled high-reach teams, the rates did not represent TF-EBP reach across the VA.
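Two quantities from this section can be made concrete in code: reach is the share of a team's psychotherapy patients who received at least one CPT or PE session, and precision, recall, and the F measure summarize the note classifier. The counts below are illustrative placeholders, not the study's confusion-matrix data.

```python
def reach(tf_ebp_patients, psychotherapy_patients):
    """Fraction of a team's psychotherapy patients who received a TF-EBP."""
    return tf_ebp_patients / psychotherapy_patients

def precision_recall_f(tp, fp, fn):
    """Standard classifier metrics from true/false positives and false negatives."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f_measure = 2 * precision * recall / (precision + recall)
    return precision, recall, f_measure

print(f"reach: {reach(140, 389):.0%}")           # illustrative counts
p, r, f = precision_recall_f(tp=91, fp=4, fn=9)  # illustrative confusion counts
print(round(p, 2), round(r, 2), round(f, 2))     # prints 0.96 0.91 0.93
```

Note that with these made-up counts the metrics happen to land near the CPT values reported in the text (precision=.96, recall=.91, F=.93), which shows how the reported figures relate to the underlying counts.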
Covariates.
We modeled three classes of covariates, each representing patient, team, or facility characteristics. We extracted patient age, sex, race, and ICD-9 codes indicative of major psychiatric disorders (alcohol and drug abuse, anxiety, depression, and psychosis). We constructed team-level patient characteristics by calculating the proportion of patients who were male, the proportion who were Caucasian, the patients’ mean number of psychiatric comorbidities, and the patients’ mean age. The staff survey assessed team members’ gender, age, race-ethnicity, occupation, and VA tenure. For each participating facility, we extracted location, complexity level, and designation as urban or rural. VA facilities are classified into five complexity levels (1a, 1b, 1c, 2, and 3) on the basis of patient population, clinical services, educational and research mission, and administrative complexity. Rankings range from most (1a) to least complex (3).
Analyses
Twenty-two of 78 providers did not complete all Sustainability Index items. We used regression modeling to induce a monotone missingness pattern and the propensity method (52,53) to multiply impute (five times) the missing items prior to calculating Sustainability Index scores. All the analyses were conducted within each imputed data set separately; the reported numbers are the combined results. We used responses to the other survey measures (for example, team functioning, provider characteristics, and completed Sustainability Index items) to construct the propensity and regression imputation models.
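The paper analyzes each of the five imputed data sets separately and reports combined results. It does not name its combining procedure (it cites its own references 52,53), so as an assumption, the sketch below uses Rubin's rules, the standard approach: pool the per-imputation estimates by their mean, and combine the average within-imputation variance with the between-imputation variance. All numbers are made up for illustration.

```python
import numpy as np

def rubins_rules(estimates, variances):
    """Pool m per-imputation estimates and their squared SEs (Rubin's rules)."""
    m = len(estimates)
    q_bar = np.mean(estimates)            # pooled point estimate
    w_bar = np.mean(variances)            # average within-imputation variance
    b = np.var(estimates, ddof=1)         # between-imputation variance
    total_var = w_bar + (1 + 1 / m) * b   # total variance of the pooled estimate
    return q_bar, np.sqrt(total_var)      # pooled estimate and pooled SE

# Hypothetical regression coefficient from five imputed data sets
est, se = rubins_rules([9.0, 9.3, 9.1, 9.2, 9.2], [4.0, 4.1, 3.9, 4.0, 4.0])
print(round(est, 2), round(se, 2))  # prints 9.16 2.0
```

The between-imputation term inflates the pooled standard error to reflect uncertainty about the missing Sustainability Index items.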
Initially, we used Pearson correlations to examine the associations between independent variables and TF-EBP Sustainability Index total and subscale scores. To test our hypotheses, we then used multiple linear models with Sustainability Index total scores regressed on the independent variables (team functioning, workload, history of change, and reach) separately, each time accounting for the covariates. Because of the small number of teams, team-level predictors were used only for the means modeling. In a final multiple linear regression model, we included all explanatory variables and the covariates simultaneously.
To evaluate the robustness of the findings obtained from the combined analyses of multiple imputed data sets, a complete case reanalysis of our model was also performed. The results were virtually identical. Model assumptions were examined by using scatter diagrams and residual plots. The normality assumption was not rejected. Gaussian multiple regression analyses were conducted by using SAS, version 9.4.
Results
Table 1 describes the participating sites, patients, and survey responders. A total of 140 PTSD team members received surveys, and 78 (56%) completed the team functioning survey. The response rate per team ranged from 33% to 75%. Nonresponders were as likely as responders to be psychologists or social workers. All but one participant spent at least 20% of his or her time providing direct patient care, and 78% (N=61) identified as clinical staff without team leadership responsibilities. Measures used to assess team functioning and the scores of responders are presented in Table 2.
Table 1. Characteristics of participating sites, patients, and survey responders

| Characteristic | N | % |
| --- | --- | --- |
| Facilities (N=9) | | |
| U.S. Census region | | |
| Midwest | 3 | 33 |
| Northeast | 2 | 22 |
| South | 2 | 22 |
| West | 2 | 22 |
| Complexity level | | |
| 1a (most complex) | 4 | 44 |
| 1b | 3 | 33 |
| 1c | 1 | 11 |
| 2 | 1 | 11 |
| 3 (least complex) | 0 | — |
| Urban setting | 8 | 89 |
| Patients with PTSD (N=6,251) | | |
| Male | 5,096 | 82 |
| Race | | |
| American Indian | 107 | 2 |
| Asian | 22 | .4 |
| Black | 2,132 | 34 |
| Hawaiian | 73 | 1 |
| Multiracial | 92 | 2 |
| Unknown | 390 | 6 |
| White | 3,435 | 55 |
| Ethnicity | | |
| Hispanic | 838 | 13 |
| Not Hispanic | 5,261 | 84 |
| Unknown | 152 | 2 |
| Age (M±SD) | 50.4±14.9 | |
| Staff (N=78) | | |
| Age | | |
| <40 | 33 | 42 |
| 40–49 | 18 | 23 |
| 50–59 | 15 | 19 |
| ≥60 | 9 | 12 |
| Declined to answer | 3 | 4 |
| Gender | | |
| Female | 51 | 65 |
| Male | 25 | 32 |
| Declined to answer | 2 | 3 |
| Race-ethnicity | | |
| White | 63 | 81 |
| African American | 5 | 6 |
| Asian | 1 | 1 |
| Native American | 0 | 0 |
| Multiracial | 2 | 3 |
| Hispanic | 8 | 10 |
| Declined to answer | 3 | 4 |
| Professional discipline | | |
| Psychologist | 45 | 58 |
| Social worker | 13 | 17 |
| Psychiatrist | 11 | 14 |
| Clinical nurse | 5 | 6 |
| Other | 2 | 3 |
| Declined to answer | 2 | 3 |
| Professional role on PTSD team | | |
| Program director, assistant director, or team leader | 16 | 21 |
| Staff member | 61 | 78 |
| Declined to answer | 1 | 1 |
| Very familiar with CPT, PE, or both | 47 | 60 |
| Use of CPT or PE | | |
| Use CPT more than PE | 33 | 42 |
| Use CPT and PE | 30 | 39 |
| Use PE more than CPT | 13 | 17 |
| Rarely or never use either CPT or PE | 2 | 3 |
| Tenure at the VA (years) | | |
| <6 | 26 | 32 |
| 6–10 | 34 | 44 |
| 11–15 | 6 | 8 |
| ≥16 | 10 | 13 |
| Declined to answer | 2 | 3 |
Table 2. Team functioning measures and responder scores

| Measure | Items | M | SD | α |
| --- | --- | --- | --- | --- |
| Communication | 4 | 4.13 | 1.03 | .92 |
| Decision making | 8 | 3.89 | .99 | .93 |
| Chaos and stress | 6 | 2.88 | .97 | .90 |
| History of change | 3 | 2.97 | .99 | .81 |
| Supportive learning environment | 7 | 3.72 | 1.01 | .94 |
| Leadership that reinforces learning | 4 | 3.90 | 1.07 | .94 |
| Team learning behavior | 7 | 3.26 | .72 | .79 |
| Organizational resources: staff and space | 3 | 3.35 | 1.22 | .88 |
| Organizational resources: time | 4 | 3.34 | .93 | .88 |
| Organizational citizenship behaviors | 6 | 3.96 | .76 | .86 |
Table 3 presents each team’s values for team functioning, workload, and reach, ranked by Sustainability Index score. Only one team (5a) had average Sustainability Index scores that fell below the preliminary cutoff of 55. Table 4 presents correlations between independent variables and Sustainability Index subscale and total scores. Higher scores on each Sustainability Index subscale were positively associated with better team functioning and reach and negatively associated with greater workload.
Table 3. Team characteristics, team functioning, workload, reach, and sustainability, ranked by Sustainability Index score

| Team | Census region | Patients with PTSD (N) | Providers (N) | Survey responders (N) | Team functioning M | Team functioning SD | Workload M | Workload SD | Reach (%) M | Reach (%) SD | Sustainability M | Sustainability SD |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| 5 | West | 231 | 3 | 2 | .48 | .17 | 77 | 44.7 | 56.5 | 3.4 | 100.90 | .00 |
| 7 | Midwest | 1,083 | 22 | 16 | .92 | .49 | 49 | 10.6 | 58.9 | 1.9 | 98.58 | 5.52 |
| 2 | Midwest | 389 | 10 | 6 | .39 | .26 | 39 | 12.5 | 55.9 | 2.6 | 94.93 | 6.49 |
| 3 | West | 512 | 8 | 6 | .37 | .45 | 64 | 22.8 | 31.7 | 2.3 | 92.84 | 5.71 |
| 9 | Midwest | 373 | 4 | 2 | 1.11 | .04 | 93 | 46.9 | 28.8 | 3.3 | 92.75 | 1.53 |
| 4 | South | 2,293 | 35 | 19 | –.20 | .89 | 66 | 11.2 | 42.0 | 1.3 | 87.90 | 13.31 |
| 8 | South | 1,570 | 18 | 6 | –.29 | 1.09 | 87 | 20.7 | 14.0 | 1.2 | 77.68 | 14.45 |
| 6 | Northeast | 1,470 | 19 | 8 | –.26 | .80 | 77 | 17.9 | 17.7 | 1.5 | 68.40 | 18.52 |
| 1 | Northeast | 1,264 | 16 | 10 | –1.17 | .96 | 79 | 19.9 | 15.3 | 1.4 | 62.50 | 22.06 |
| 5a | West | 729 | 5 | 3 | –1.09 | .99 | 146 | 65.4 | 38.6 | 1.8 | 55.89 | 9.57 |
Table 4. Correlations between independent variables and Sustainability Index subscale and total scores

| Characteristic | Process subscale | Staff subscale | Organization subscale | Total |
| --- | --- | --- | --- | --- |
| History of change | −.04 | −.02 | .02 | −.02 |
| Team functioning | .45** | .46** | .56* | .53** |
| Workload | −.43** | −.47** | −.57** | −.54** |
| Reach | .57** | .48** | .49** | .59** |
Multiple linear regression analysis in which we adjusted for covariates confirmed our hypotheses that a higher total score on the Sustainability Index was positively associated with better team functioning (B=9.18, SE=2.01, p<.001) and negatively associated with greater workload (B=–.26, SE=.12, p=.03). Exploratory analysis showed that after adjustment for the covariates, the association between reach and sustainability was not significant. These effects persisted when we included all three explanatory variables in one multiple linear regression analysis. That is, team functioning remained positively associated (B=9.16, SE=1.97, p<.001) and workload remained negatively associated with scores on the Sustainability Index (B=–.28, SE=.12, p=.02), with no association found for reach.
Discussion
Although a decade has passed since the VA began implementation of TF-EBPs, this is one of the few studies to examine the sustainability of these psychotherapies (17). We found that mean TF-EBP Sustainability Index scores exceeded the threshold recommended by the British health care system, perhaps reflecting institutionalization of TF-EBPs through VA policy and training (15,16). Nonetheless, there was variability in sustainability of TF-EBPs, with some teams demonstrating room for improvement. We know from site visit interviews that the team with the lowest sustainability score was experiencing a leadership change and was worried about losing infrastructure and support for TF-EBPs. Thus the team’s relatively low sustainability scores reflected its concerns about the program’s viability during the site visits.
Even after adjustment for covariates, workload and team functioning were associated with TF-EBP sustainability. The finding that teams with better functioning had higher sustainability scores expands prior research suggesting that higher levels of team functioning are an important precursor for high-quality care (54). Higher functioning teams may have more frequent discussions about monitoring and meeting team goals, better allocation of job duties across work roles, better workload sharing practices, greater collaboration in solving problems and coping with challenges, and more effective team leaders (31), which may have facilitated sustainability.
The finding that higher workload was associated with lower sustainability converges with prior research suggesting that a high workload may interfere with delivery of evidence-based mental health care (35–37), including TF-EBP delivery (32,38,39). It appears that more research is warranted to determine whether there is a change point after which greater workload threatens the sustainability of EBPs more substantially than would be predicted by a linear relationship. In classical statistical approaches, the change point is examined by approximating the function between two variables with a polynomial and constructing a confidence interval around the maximizer (55). A larger sample than the one available here would be needed to fully examine this issue.
Research examining the relationships among implementation outcomes is needed to inform theory of implementation and the design of implementation trials (21). Although TF-EBP reach was correlated with sustainability, it did not predict sustainability over and above the patient- and facility-level covariates. This finding suggests that interventions to promote TF-EBP sustainability may differ from interventions to improve reach. In qualitative interviews related to this study, we found that some sites with low reach of TF-EBPs expressed strong support for continuing to use these treatments, but with only a small portion of their patients. We examined TF-EBP reach and sustainability over the same 12-month period; a longitudinal study with a greater number of clinical teams and patients and multiple measurement points would be needed to evaluate the possibility of temporal associations between reach and sustainability. Evaluating temporal associations would lead to a better understanding of whether changes in one measure precede changes in the other or whether a synergistic effect exists.
Methodologically sound research on sustainability of evidence-based treatments is rare (23,24). A strength of this investigation is our quantification of TF-EBP reach by using chart note data rather than clinician self-report. The response rate to the survey (56%) was better than what is often achieved in surveys of providers (32,43), but our sample was relatively small because of the limited number of clinical staff assigned to each team. This limited our ability to examine more complex relationships among independent variables. Because our survey was anonymous, we had limited information on nonresponders. We did not measure team mission or purpose, a factor that was associated with TF-EBP reach in prior research (18) and that may also be associated with sustainability. Last, we did not assess fidelity, another important implementation outcome (21), and thus we cannot address questions about the quality of treatment delivery or the relationship between TF-EBP fidelity and sustainability.
Conclusions
Much of VA’s TF-EBP implementation effort has focused on training individual clinicians to ensure that they have the skills to deliver CPT and PE (15). Although training is crucial, attention must also be paid to the context in which clinicians work (18,19,32). We are aware of implementation strategies that have included leadership training (56,57), but less attention has been paid to how to develop high-functioning teams (58). Similarly, there is a dearth of guidance regarding workload levels that optimize TF-EBP implementation and sustainability. The contribution of team functioning and workload to the sustainability of evidence-based mental health care warrants further study.
References

1. Prevalence of a positive screen for PTSD among OEF/OIF and OEF/OIF-era veterans in a large population-based cohort. Journal of Traumatic Stress 27:542–549, 2014
2. Screening, diagnosis, and treatment of post-traumatic stress disorder. Military Medicine 177(suppl):7–13, 2012
3. The efficacy of recommended treatments for veterans with PTSD: a metaregression analysis. Clinical Psychology Review 40:184–194, 2015
4. Cognitive processing therapy for sexual assault victims. Journal of Consulting and Clinical Psychology 60:748–756, 1992
5. Cognitive behavioral therapy for posttraumatic stress disorder in women: a randomized controlled trial. JAMA 297:820–830, 2007
6. Cognitive processing therapy for veterans with military-related posttraumatic stress disorder. Journal of Consulting and Clinical Psychology 74:898–907, 2006
7. Outcomes of prolonged exposure therapy for veterans with posttraumatic stress disorder. Journal of Traumatic Stress 26:419–425, 2013
8. The comparative effectiveness of cognitive processing therapy for male veterans treated in a VHA posttraumatic stress disorder residential rehabilitation program. Journal of Consulting and Clinical Psychology 79:590–599, 2011
9. Effectiveness of national implementation of prolonged exposure therapy in Veterans Affairs care. JAMA Psychiatry 70:949–955, 2013
10. Effectiveness of cognitive processing therapy for male and female US veterans with and without military sexual trauma. Journal of Traumatic Stress 28:174–182, 2015
11. VA/DOD Clinical Practice Guideline for the Management of Posttraumatic Stress Disorder and Acute Stress Disorder, Version 3.0. Washington, DC, US Department of Veterans Affairs and Department of Defense, 2017. https://www.healthquality.va.gov/guidelines/MH/ptsd/
12. Clinical Practice Guideline for the Treatment of PTSD in Adults. Washington, DC, American Psychological Association, 2017. http://www.apa.org/ptsd-guideline/ptsd.pdf
13. Cognitive Processing Therapy: Veteran/Military Version. Therapist and Patient Materials Manual. Washington, DC, US Department of Veterans Affairs, 2014. www.div12.org/wp-content/uploads/2015/07/CPT-Materials-Manual.pdf
14. Prolonged Exposure Therapy for PTSD: Emotional Processing of Traumatic Experiences—Therapist Guide. Oxford, United Kingdom, Oxford University Press, 2007
15. Dissemination of evidence-based psychological treatments for posttraumatic stress disorder in the Veterans Health Administration. Journal of Traumatic Stress 23:663–673, 2010
16. Uniform Mental Health Services in VA Medical Centers and Clinics. VHA Handbook 1160.01. Washington, DC, US Department of Veterans Affairs, 2008
17. A review of studies on the system-wide implementation of evidence-based psychotherapies for posttraumatic stress disorder in the Veterans Health Administration. Administration and Policy in Mental Health and Mental Health Services Research 43:957–977, 2016
18. Sayer NA, Rosen CS, Bernardy NC, et al: Context matters: team and organizational factors associated with reach of evidence-based psychotherapies for PTSD in the Veterans Health Administration. Administration and Policy in Mental Health 44:904–918, 2017
19. Maintenance and reach of exposure psychotherapy for posttraumatic stress disorder 18 months after training. Journal of Traumatic Stress 30:63–70, 2017
20. Evaluating the public health impact of health promotion interventions: the RE-AIM framework. American Journal of Public Health 89:1322–1327, 1999
21. Outcomes for implementation research: conceptual distinctions, measurement challenges, and research agenda. Administration and Policy in Mental Health and Mental Health Services Research 38:65–76, 2011
22. Making change last: applying the NHS Institute for Innovation and Improvement sustainability model to healthcare improvement. Implementation Science 8:127, 2013
23. The sustainability of new programs and innovations: a review of the empirical literature and recommendations for future research. Implementation Science 7:17, 2012
24. Sustainability of evidence-based healthcare: research agenda, methodological advances, and infrastructure support. Implementation Science 10:88, 2015
25. Public health program capacity for sustainability: a new framework. Implementation Science 8:15, 2013
26. Developing a comprehensive definition of sustainability. Implementation Science 12:110, 2017
27. NHS Sustainability Model and Guide. London, NHS Institute for Innovation and Improvement, 2010. http://www.qihub.scot.nhs.uk/media/162236/sustainability_model.pdf
28. The influence of hospital culture on rehabilitation team functioning in VA hospitals. Journal of Rehabilitation Research and Development 39:115–125, 2002
29. Advancing a conceptual model of evidence-based practice implementation in public service sectors. Administration and Policy in Mental Health and Mental Health Services Research 38:4–23, 2011
30. Program fidelity and beyond: multiple strategies and criteria for ensuring quality of assertive community treatment. Psychiatric Services 63:743–750, 2012
31. Measuring team factors thought to influence the success of quality improvement in primary care: a systematic review of instruments. Implementation Science 8:20, 2013
32. Utilization of evidence-based psychotherapies in Veterans Affairs posttraumatic stress disorder outpatient clinics. Psychological Services 12:73–82, 2015
33. Team factors that predict to sustainability indicators for community-based prevention teams. Evaluation and Program Planning 34:283–291, 2011
34. Factors associated with sustainability of 2 quality improvement programs after achieving early implementation success: a qualitative case study. Journal of Evaluation in Clinical Practice 23:1135–1143, 2017
35. Volume, quality of care, and outcome in pneumonia. Annals of Internal Medicine 144:262–269, 2006
36. What drives change? Barriers to and incentives for achieving evidence-based practice. Medical Journal of Australia 180(suppl):S57–S60, 2004
37. Implementation of evidence-based practice in community behavioral health: agency director perspectives. Administration and Policy in Mental Health and Mental Health Services Research 34:479–488, 2007
38. Dissemination and experience with cognitive processing therapy. Journal of Rehabilitation Research and Development 49:667–678, 2012
39. Evaluation of an implementation model: a national investigation of VA residential programs. Administration and Policy in Mental Health and Mental Health Services Research 42:147–156, 2015
40 : Mail and Internet Surveys: The Tailored Design Method, 2nd ed. Hoboken, NJ, Wiley, 2017Google Scholar
41 : Measuring sustainability within the Veterans Administration mental health system redesign initiative. Quality Management in Health Care 20:263–279, 2011Crossref, Medline, Google Scholar
42 : Sustainability in primary care and mental health integration projects in Veterans Health Administration. Quality Management in Health Care 21:240–251, 2012Crossref, Medline, Google Scholar
43 : Family care map: sustaining family-centered care in polytrauma rehabilitation centers. Journal of Rehabilitation Research and Development 51:1311–1324, 2014Crossref, Medline, Google Scholar
44 : Measuring organizational attributes of primary care practices: development of a new instrument. Health Services Research 42:1257–1273, 2007Crossref, Medline, Google Scholar
45 : Measuring organizational attributes in primary care: a validation study in Germany. Journal of Evaluation in Clinical Practice 16:1289–1294, 2010Crossref, Medline, Google Scholar
46 : Is yours a learning organization? Harvard Business Review 86:109–116, 134, 2008Medline, Google Scholar
47 : Psychological safety and learning behavior in work teams. Administrative Science Quarterly 44:350–383, 1999Crossref, Google Scholar
48 : Staff, space, and time as dimensions of organizational slack: a psychometric assessment. Health Care Management Review 36:252–264, 2011Crossref, Medline, Google Scholar
49 : Organizational citizenship behavior and the quantity and quality of work group performance. Journal of Applied Psychology 82:262–270, 1997Crossref, Medline, Google Scholar
50 : Making sense of Cronbach’s alpha. International Journal of Medical Education 2:53–55, 2011Crossref, Medline, Google Scholar
51 : Measuring use of evidence-based psychotherapy for posttraumatic stress disorder. Administration and Policy in Mental Health and Mental Health Services Research 40:311–318, 2013Crossref, Medline, Google Scholar
52 : Inference and missing data. Biometrika 63:581–592, 1976Crossref, Google Scholar
53 : Multiple Imputation for Nonresponse in Surveys. New York, Wiley, 1987Crossref, Google Scholar
54 : What do we know about health care team effectiveness? A review of the literature. Medical Care Research and Review 63:263–300, 2006Crossref, Medline, Google Scholar
55 Carlstein E, Muller H, Siegmund D (eds): Change-Point Problems. Lecture Notes–Monograph Series, vol 23. Hayward, CA, Institute of Mathematical Statistics, 1996Google Scholar
56 : Aligning leadership across systems and organizations to develop a strategic climate for evidence-based practice implementation. Annual Review of Public Health 35:255–274, 2014Crossref, Medline, Google Scholar
57 : Leadership and organizational change for implementation (LOCI): a randomized mixed method pilot study of a leadership and organization development intervention for evidence-based practice implementation. Implementation Science 10:11, 2015Crossref, Medline, Google Scholar
58 : A team quality improvement sequence for complex problems. BMJ Quality and Safety 8:239–246, 1999Crossref, Google Scholar