Published online: https://doi.org/10.1176/appi.ps.201700432

Abstract

Objective:

It has been over a decade since the U.S. Department of Veterans Affairs (VA) began formal dissemination and implementation of two trauma-focused evidence-based psychotherapies (TF-EBPs). The objective of this study was to examine the sustainability of the TF-EBPs and determine whether team functioning and workload were associated with TF-EBP sustainability.

Methods:

This observational study used VA administrative data for 6,251 patients with posttraumatic stress disorder (PTSD) and surveys from 78 providers from 10 purposefully selected PTSD clinical teams located in nine VA medical centers. The outcome was sustainability of TF-EBPs, which was based on British National Health Service Sustainability Index scores (possible scores range from 0 to 100.90). Primary predictors included team functioning, workload, and TF-EBP reach to patients with PTSD. Multiple linear regression models were used to examine the influence of team functioning and workload on TF-EBP sustainability after adjustment for covariates that were significantly associated with sustainability.

Results:

Sustainability Index scores ranged from 53.15 to 100.90 across the 10 teams. Regression models showed that after adjustment for patient and facility characteristics, team functioning was positively associated (B=9.16, p<.001) and workload was negatively associated (B=–.28, p<.05) with TF-EBP sustainability.

Conclusions:

There was considerable variation across teams in TF-EBP sustainability. The contribution of team functioning and workload to the sustainability of evidence-based mental health care warrants further study.

Posttraumatic stress disorder (PTSD) is a potentially disabling psychiatric disorder that disproportionately affects veterans (1,2). Fortunately, there are effective treatments (3–10). Treatments with the highest level of endorsement in clinical practice guidelines are trauma-focused evidence-based psychotherapies (TF-EBPs) (11,12). To implement two TF-EBPs, the U.S. Department of Veterans Affairs (VA) began nationwide competency-based training for cognitive processing therapy (CPT) (13) in 2006 and for prolonged exposure (PE) (14) in 2007 (15). VA policy requires that all VA patients with PTSD have access to one of these TF-EBPs (16).

Over the past decade, research has evaluated VA’s implementation of TF-EBPs in terms of provider and patient perceptions of TF-EBP acceptability, barriers and facilitators of provider uptake (17), and reach to patients with PTSD (18,19). Reach is a measure of the population impact of an intervention or program. It has been defined as the percentage and representativeness of individuals within a defined population who receive an intervention or program (20). Sayer et al. (18) found that outpatient PTSD teams with high levels of reach for TF-EBPs were organized differently than those with low reach. High-reach teams had a defined sense of purpose, or mission, to deliver TF-EBPs, internal and external leadership support for this mission, team members engaged in this mission, TF-EBP operational processes, and access to services outside the team for patients who do not want a TF-EBP or who have other treatment needs.

Another metric of implementation success is sustainability (21), which is defined as the extent to which a newly implemented treatment is maintained or institutionalized within a service (21,22). Although over a decade has passed since implementation of the TF-EBP training initiatives, very few studies have examined the sustainability of these psychotherapies (17). This omission is consistent with the larger dissemination and implementation literature, in which sustainability is understudied relative to initial adoption (23,24). In fact, sustainability is discussed in conceptual papers more often than it has been empirically investigated (21). Yet policy makers and program managers need to plan for sustainability so that the resources used in implementation are not wasted (25). Researchers have recommended that studies of program outcomes also identify factors that account for sustainability, given that such information can be used to guide effective implementation strategies and policy (26). In addition, data on the associations among implementation outcomes are needed to determine whether improvements in various outcome domains are positively related to each other (for example, whether greater reach affects the extent to which a practice becomes a cultural norm, which promotes sustainability). Data can also show whether various outcome domains are negatively related (for example, whether a short-term increase in reach burdens the system and undercuts its sustainability) (21).

This study is guided by the British National Health Service's sustainability model, which uses a quantitative measure to estimate the likelihood of sustainability of a practice change. This model consists of 10 factors grouped into three larger domains: process (for example, credibility and adaptability of the change), staff (for example, leadership and staff support for the change), and organization (for example, infrastructure for sustaining the change and fit with the organization’s aims and culture) (22,27). The sustainability model was designed to help teams identify local contextual challenges to sustainability (27).

It is also important to determine whether there are additional factors that help ensure that a new practice or program will continue. Although the sustainability model assesses staff support for a practice change (22,27), it does not assess team functioning. Team functioning reflects how well team members work together and includes interpersonal cohesion and activities related to the work of the team (28). Prior research indicates that dimensions of team functioning affect program implementation and sustainability within health care settings (29–34). In the VA, TF-EBPs are primarily delivered by providers working in specialized outpatient PTSD teams, known as PTSD clinical teams (15,19). Providers on these teams may have varying levels of success in working together to maintain the infrastructure and care processes needed for TF-EBP delivery. Thus high levels of team functioning may translate into greater TF-EBP sustainability. Mental health teams with greater cohesion may be more willing or able to support team members who are implementing TF-EBPs. A positive association between team functioning and sustainability could represent an important managerial consideration.

Another potentially important factor missing from the sustainability model is workload. Workload has been found to affect the provision of evidence-based care in both general medical and mental health settings (35–37). CPT and PE are time-intensive (eight to 12 weekly sessions), manualized treatments (13,14). Adding weekly sessions of a TF-EBP, which are 60 or 90 minutes in length, may be challenging if providers already have a high caseload. In fact, providers reported that lack of time is a barrier to use of these treatments (32,38,39). Thus it would be important to determine whether greater workload diminishes TF-EBP sustainability.

This study examined TF-EBP sustainability in VA outpatient PTSD clinics 10 years after initial implementation. We hypothesized that there would be a positive association between team functioning and TF-EBP sustainability and a negative association between workload and TF-EBP sustainability. To inform future research, we conducted exploratory analysis to examine the association between TF-EBP reach and sustainability.

Methods

The VA Central Institutional Review Board approved this research. An alteration of informed consent (to allow for online consent) and waiver of documentation of informed consent were obtained for the anonymous staff survey. A waiver of informed consent and a HIPAA waiver were obtained for use of the administrative data.

Data Source

This study is based on VA administrative data and the PTSD team staff surveys collected as part of a mixed-methods study on reach of TF-EBPs. For the parent study, we used purposive sampling to select 10 PTSD teams from nine diverse facilities, with oversampling of teams that had high reach of TF-EBPs. Results from site visit interviews are presented elsewhere (18).

Procedures

To select sites for the parent study, we extracted administrative and chart note data for fiscal year (FY) 2015 (October 2014–September 2015) for all patients with PTSD who received psychotherapy from outpatient PTSD teams at the nine facilities. We identified staff assigned to the selected teams through a point of contact on each team. We used e-mail to recruit clinical staff and peer support specialists assigned to each team to complete an anonymous staff survey. Survey recruitment methods involved an introductory e-mail followed one week later by an e-mail with a link to the online consent and survey and then by three weekly reminder emails with the same link (40). At the time of the final reminder, we also mailed printed copies of the consent form and the survey and self-addressed stamped envelopes to our point of contact for distribution to the PTSD team clinicians. Surveys were completed between August 2014 and October 2015.

Dependent and Independent Variables

EBP sustainability.

Our outcome measure was the British National Health Service Sustainability Index. This 10-item measure assesses staff perceptions of factors that influence sustainability (22,27) and has been used to evaluate sustainability as an outcome (41–43). For each item, responders choose one of four weighted statements that they believe best describes each domain. Possible scores range from 0 to 100.90. Preliminary evidence suggests that a total score of 55 or higher indicates that an intervention or program is likely to be sustained, whereas a score of 45 or lower indicates that significant effort is required to address barriers to sustainability (26). The Sustainability Index also includes subscales reflecting the conceptual domains of process (N=4), staff (N=4), and organization (N=2). In the staff survey, we assessed sustainability of CPT and PE separately (CPT, α=.84; PE, α=.73). However, because the Sustainability Index scores for CPT and PE were highly correlated (r=.95), we combined them into one score (α=.78).
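The scoring logic described above can be sketched as follows. The item weights here are invented for illustration (the instrument publishes its own weights for each statement); only the total-score range and the preliminary cutoffs come from the text.

```python
# Sketch of Sustainability Index scoring. The weights below are hypothetical
# placeholders, NOT the instrument's published values; each responder picks
# one weighted statement per factor, and the total is the sum of the weights.

def sustainability_score(chosen_weights):
    """Total score is the sum of the weights of the chosen statements."""
    return round(sum(chosen_weights), 2)

def interpret(total):
    """Preliminary cutoffs reported in the article (26)."""
    if total >= 55:
        return "likely to be sustained"
    if total <= 45:
        return "significant effort required to address barriers"
    return "indeterminate"

# Hypothetical responses for one provider (one chosen weight per factor):
weights = [9.1, 8.4, 7.0, 6.2, 5.5, 11.4, 9.0, 4.3, 7.2, 6.0]
total = sustainability_score(weights)
print(total, interpret(total))
```

A team-level score would then be the mean of provider totals, which is how the team values in Table 3 are presented.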

Team functioning.

The staff survey included 53 items with 10 measures to assess team functioning domains. Twenty-one items drawn from the Survey of Organizational Attributes of Primary Care Practice assessed team communication, decision making, chaos and stress, and history of change (44,45). We used measures from the Learning Organization Survey to assess the extent to which the learning environment was supportive and leadership reinforced learning (46); Edmondson’s conceptualization of team behaviors to assess team learning behavior (47); organizational slack measures to assess organizational resources for staff, space, and time (48); and the Team Helping Measure to assess organizational citizenship behaviors (49). The response options range from 1 (disagree strongly) to 5 (agree strongly). Scores were computed as the means across completed items, and reliability was moderate to high for all scales (Cronbach’s α>.79) (50).

Given the large number of items and small sample size, we used principal-components analysis to determine whether these 10 measures could be combined to create a composite measure. The first component explained 90.1% of total variance (eigenvalue=5.73), demonstrating unidimensionality. Nine of the 10 team functioning measures collectively contributed to the total score, whereas one measure, history of change, did not (eigenvalue=.73, 12% of variance explained). Thus we calculated a composite team functioning measure on the basis of the first principal component by using nine scales, with higher scores indicating more positive team functioning (α=.94). History of change served as a separate independent variable. Kaiser’s overall measure of sampling adequacy was .90.
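The composite-construction step above can be sketched as follows: standardize the scale scores, extract the first principal component, and use its loadings to weight a provider-level composite. The data here are simulated (not the study's), with the sample size and scale count mirroring the 78 responders and nine retained scales.

```python
# Sketch of building a first-principal-component composite from scale scores.
# Simulated data only: 78 providers, 9 scales driven by one latent factor so
# that the first component dominates, loosely mimicking the reported result.
import numpy as np

rng = np.random.default_rng(0)
latent = rng.normal(size=(78, 1))
scores = 3 + 0.8 * latent + 0.3 * rng.normal(size=(78, 9))

z = (scores - scores.mean(axis=0)) / scores.std(axis=0)  # standardize scales
cov = np.cov(z, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)                   # ascending order
first = eigvecs[:, -1]                                   # first principal component
composite = z @ first                                    # provider-level composite

explained = eigvals[-1] / eigvals.sum()
print(f"first component explains {explained:.0%} of total variance")
```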

Team workload.

We determined the number of patients with PTSD seen by the outpatient clinical teams by using the stop codes that designate team workload. Workload was calculated by dividing the number of PTSD patients seen by team members for any treatment in fiscal year 2015 by the number of PTSD providers on the team.

TF-EBP reach.

In FY 2015, VA released chart note templates that allowed for identification of EBP type. Because the templates were new, we applied natural-language processing to identify additional TF-EBP cases from psychotherapy notes that were not derived from templates. Our natural-language processing system was based on the work of Shiner et al. (51) but was built by using hand-coded rules in the SAS Enterprise Guide, version 7.1. Performance for therapy note classification as measured by recall, precision, and F measure was strong (recall=.91 and .85, precision=.96 and .95, and F measure=.93 and .90, for CPT and PE, respectively). A patient who received at least one session of CPT, PE, or both in FY 2015 was designated as a TF-EBP recipient. TF-EBP reach was defined as the number of patients with PTSD who received a TF-EBP divided by the number of therapy patients with PTSD who received any form of psychotherapy on each team. Reach on each team ranged from 14% to 59% (mean±SD=36%±17%). Because we oversampled high-reach teams, the rates did not represent TF-EBP reach across the VA.
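The note-classification metrics reported above follow the standard definitions of recall, precision, and F measure. A quick sketch (in Python for illustration; the study's system was built in SAS) computes them from true-positive, false-positive, and false-negative counts. The counts below are illustrative, chosen to roughly reproduce the reported CPT metrics; the article reports metrics, not raw counts.

```python
# Standard recall/precision/F-measure computation from confusion counts.

def classification_metrics(tp, fp, fn):
    recall = tp / (tp + fn)        # share of true TF-EBP notes that were found
    precision = tp / (tp + fp)     # share of flagged notes that were correct
    f_measure = 2 * precision * recall / (precision + recall)
    return recall, precision, f_measure

# Illustrative counts only:
r, p, f = classification_metrics(tp=91, fp=4, fn=9)
print(f"recall={r:.2f}, precision={p:.2f}, F={f:.2f}")
```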

Covariates.

We modeled three classes of covariates, each representing patient, team, or facility characteristics. We extracted patient age, sex, race, and ICD-9 codes indicative of major psychiatric disorders (alcohol and drug abuse, anxiety, depression, and psychosis). We constructed team-level patient characteristics by calculating the proportion of patients who were male, the proportion who were Caucasian, the patients’ mean number of psychiatric comorbidities, and the patients’ mean age. The staff survey assessed team members’ gender, age, race-ethnicity, occupation, and VA tenure. For each participating facility, we extracted location, complexity level, and designation as urban or rural. VA facilities are classified into five complexity levels (1a, 1b, 1c, 2, and 3) on the basis of patient population, clinical services, educational and research mission, and administrative complexity. Rankings range from most (1a) to least complex (3).
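The aggregation from patient-level records to team-level covariates can be sketched with a groupby, assuming a simple records layout; the column names and values below are hypothetical, not the study's data.

```python
# Sketch of constructing team-level patient covariates from patient-level
# records. Column names are hypothetical placeholders.
import pandas as pd

patients = pd.DataFrame({
    "team": ["A", "A", "A", "B", "B"],
    "male": [1, 1, 0, 1, 0],
    "white": [1, 0, 1, 1, 1],
    "n_psych_comorbidities": [2, 1, 3, 0, 2],
    "age": [54, 61, 38, 47, 50],
})

team_covariates = patients.groupby("team").agg(
    prop_male=("male", "mean"),
    prop_white=("white", "mean"),
    mean_comorbidities=("n_psych_comorbidities", "mean"),
    mean_age=("age", "mean"),
)
print(team_covariates)
```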

Analyses

Twenty-two of 78 providers did not complete all Sustainability Index items. We used regression modeling to induce a monotone missingness pattern and the propensity method (52,53) to multiply impute (five times) the missing items prior to calculating Sustainability Index scores. All the analyses were conducted within each imputed data set separately; the reported numbers are the combined results. We used responses to the other survey measures (for example, team functioning, provider characteristics, and completed Sustainability Index items) to construct the propensity and regression imputation models.
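Combining results across multiply imputed data sets, as described above, is conventionally done with Rubin's rules: average the point estimates and add the between-imputation variance to the within-imputation variance. A minimal sketch, with illustrative numbers only:

```python
# Rubin's rules for pooling estimates from m imputed data sets.
# The five estimates and variances below are illustrative, not study output.
import math

def pool(estimates, variances):
    """Pool point estimates; combine within- and between-imputation variance."""
    m = len(estimates)
    q_bar = sum(estimates) / m                               # pooled estimate
    u_bar = sum(variances) / m                               # within-imputation variance
    b = sum((q - q_bar) ** 2 for q in estimates) / (m - 1)   # between-imputation variance
    total_var = u_bar + (1 + 1 / m) * b
    return q_bar, math.sqrt(total_var)                       # estimate, pooled SE

est, se = pool([9.0, 9.3, 9.1, 9.2, 9.2], [3.8, 4.0, 3.9, 4.1, 3.9])
print(f"pooled B={est:.2f}, SE={se:.2f}")
```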

Initially, we used Pearson correlations to examine the associations between independent variables and TF-EBP Sustainability Index total and subscale scores. To test our hypotheses, we then used multiple linear models with Sustainability Index total scores regressed on the independent variables (team functioning, workload, history of change, and reach) separately, each time accounting for the covariates. Because of the small number of teams, team-level predictors were used only for the means modeling. In a final multiple linear regression model, we included all explanatory variables and the covariates simultaneously.

To evaluate the robustness of the findings obtained from the combined analyses of multiple imputed data sets, a complete case reanalysis of our model was also performed. The results were virtually identical. Model assumptions were examined by using scatter diagrams and residual plots. The normality assumption was not rejected. Gaussian multiple regression analyses were conducted by using SAS, version 9.4.

Results

Table 1 describes the participating sites, patients, and survey responders. A total of 140 PTSD team members received surveys, and 78 (56%) completed the team functioning survey. The response rate per team ranged from 33% to 75%. Compared with responders, nonresponders were just as likely to be psychologists or social workers. All but one participant spent at least 20% of his or her time providing direct patient care, and 78% (N=61) identified as clinical staff without team leadership responsibilities. Measures to assess team functioning and scores of responders are presented in Table 2.

TABLE 1. Characteristics of nine VA medical centers, PTSD patients, and PTSD clinical team staffa

U.S. Census region
 Midwest: 3 (33%)
 Northeast: 2 (22%)
 South: 2 (22%)
 West: 2 (22%)
Complexity level
 1a (most): 4 (44%)
 1b: 3 (33%)
 1c: 1 (11%)
 2: 1 (11%)
 3 (least): 0
Urban setting: 8 (89%)
Patients with PTSD (N=6,251)
 Male: 5,096 (82%)
 Race
  American Indian: 107 (2%)
  Asian: 22 (.4%)
  Black: 2,132 (34%)
  Hawaiian: 73 (1%)
  Multiracial: 92 (2%)
  Unknown: 390 (6%)
  White: 3,435 (55%)
 Ethnicity
  Hispanic: 838 (13%)
  Not Hispanic: 5,261 (84%)
  Unknown: 152 (2%)
 Age (M±SD): 50.4±14.9
Staff (N=78)b
 Age
  <40: 33 (42%)
  40–49: 18 (23%)
  50–59: 15 (19%)
  ≥60: 9 (12%)
  Declined to answer: 3 (4%)
 Gender
  Female: 51 (65%)
  Male: 25 (32%)
  Declined to answer: 2 (3%)
 Race-ethnicity
  White: 63 (81%)
  African American: 5 (6%)
  Asian: 1 (1%)
  Native American: 0 (0%)
  Multiracial: 2 (3%)
  Hispanic: 8 (10%)
  Declined to answer: 3 (4%)
 Professional discipline
  Psychologist: 45 (58%)
  Social worker: 13 (17%)
  Psychiatrist: 11 (14%)
  Clinical nurse: 5 (6%)
  Other: 2 (3%)
  Declined to answer: 2 (3%)
 Professional role on PTSD team
  Program director, assistant director, or team leader: 16 (21%)
  Staff member: 61 (78%)
  Declined to answer: 1 (1%)
 Very familiar with CPT, PE, or both: 47 (60%)
 Use of CPT or PE
  Use CPT more than PE: 33 (42%)
  Use CPT and PE: 30 (39%)
  Use PE more than CPT: 13 (17%)
  Rarely or never use either CPT or PE: 2 (3%)
 Tenure at the VA (years)
  <6: 26 (32%)
  6–10: 34 (44%)
  11–15: 6 (8%)
  ≥16: 10 (13%)
  Declined to answer: 2 (3%)

aPTSD, posttraumatic stress disorder; CPT, cognitive processing therapy; PE, prolonged exposure

bResponded to survey


TABLE 2. Response scores among 78 staff members on measures to assess team functioninga

Communication (4 items): M=4.13, SD=1.03, α=.92
Decision making (8 items): M=3.89, SD=.99, α=.93
Chaos and stress (6 items): M=2.88, SD=.97, α=.90
History of change (3 items): M=2.97, SD=.99, α=.81
Supportive learning environment (7 items): M=3.72, SD=1.01, α=.94
Leadership that reinforces learning (4 items): M=3.90, SD=1.07, α=.94
Team learning behavior (7 items): M=3.26, SD=.72, α=.79
Organizational resources
 Staff and space (3 items): M=3.35, SD=1.22, α=.88
 Time (4 items): M=3.34, SD=.93, α=.88
Organizational citizenship behaviors (6 items): M=3.96, SD=.76, α=.86

aPossible scores range from 1 to 5, with higher scores indicating more positive team functioning.


Table 3 presents each team’s values for team functioning, workload, and reach, ranked by Sustainability Index score. Only one team (5a) had average Sustainability Index scores that fell below the preliminary cutoff of 55. Table 4 presents correlations between independent variables and Sustainability Index subscale and total scores. Higher scores on each Sustainability Index subscale were positively associated with better team functioning and reach and negatively associated with greater workload.

TABLE 3. Results of study measures among 10 PTSD clinical teams, ranked by Sustainability Index Scorea

Values for team functioning,b workload,c reach,d and sustainabilitye are M±SD.

Team 5 (West): 231 patients with PTSD, 3 providers, 2 survey responders; team functioning=.48±.17; workload=77±44.7; reach=56.5±3.4; sustainability=100.90±.00
Team 7 (Midwest): 1,083 patients with PTSD, 22 providers, 16 survey responders; team functioning=.92±.49; workload=49±10.6; reach=58.9±1.9; sustainability=98.58±5.52
Team 2 (Midwest): 389 patients with PTSD, 10 providers, 6 survey responders; team functioning=.39±.26; workload=39±12.5; reach=55.9±2.6; sustainability=94.93±6.49
Team 3 (West): 512 patients with PTSD, 8 providers, 6 survey responders; team functioning=.37±.45; workload=64±22.8; reach=31.7±2.3; sustainability=92.84±5.71
Team 9 (Midwest): 373 patients with PTSD, 4 providers, 2 survey responders; team functioning=1.11±1.04; workload=93±46.9; reach=28.8±3.3; sustainability=92.75±1.53
Team 4 (South): 2,293 patients with PTSD, 35 providers, 19 survey responders; team functioning=–.20±.89; workload=66±11.2; reach=42.0±1.3; sustainability=87.90±13.31
Team 8 (South): 1,570 patients with PTSD, 18 providers, 6 survey responders; team functioning=–.29±1.09; workload=87±20.7; reach=14.0±1.2; sustainability=77.68±14.45
Team 6 (Northeast): 1,470 patients with PTSD, 19 providers, 8 survey responders; team functioning=–.26±.80; workload=77±17.9; reach=17.7±1.5; sustainability=68.40±18.52
Team 1 (Northeast): 1,264 patients with PTSD, 16 providers, 10 survey responders; team functioning=–1.17±.96; workload=79±19.9; reach=15.3±1.4; sustainability=62.50±22.06
Team 5a (West): 729 patients with PTSD, 5 providers, 3 survey responders; team functioning=–1.09±.99; workload=146±65.4; reach=38.6±1.8; sustainability=55.89±9.57

aPTSD, posttraumatic stress disorder. Teams 5 and 5a are located at the same VA medical center.

bComposite scores among responders ranged from –2.42 to 1.43, with higher scores indicating more positive team functioning. The underlying scales have possible scores ranging from 1 to 5.

cOperationalized as number of PTSD patients seen by each PTSD clinical team divided by number of providers on the team. Workload is a ratio of two correlated Poisson random variables with unknown covariance at the site level. Therefore, the covariance was assumed to be zero, which yielded the most conservative estimates of SD.

dNumber of patients with PTSD who received a trauma-focused evidence-based psychotherapy divided by the number of patients with PTSD who received any form of psychotherapy from therapists on the team

ePossible scores range from 0 to 100.90, with scores above 55 indicating that the program is likely to be sustained.


TABLE 4. Pearson correlations between independent variables and Sustainability Index subscale and total scoresa

Correlations are shown for the process, staff, and organization subscales and the total score.

History of change: process=−.04, staff=−.02, organization=.02, total=−.02
Team functioning: process=.45**, staff=.46**, organization=.56*, total=.53**
Workload: process=−.43**, staff=−.47**, organization=−.57**, total=−.54**
Reachb: process=.57**, staff=.48**, organization=.49**, total=.59**

aPTSD, posttraumatic stress disorder; TF-EBP, trauma-focused evidence-based psychotherapy

bDefined as the number of patients with PTSD who received a TF-EBP divided by the number of patients with PTSD who received any form of psychotherapy from therapists on the team

*p<.05, **p<.001


Multiple linear regression analysis in which we adjusted for covariates confirmed our hypotheses that a higher total score on the Sustainability Index was positively associated with better team functioning (B=9.18, SE=2.01, p<.001) and negatively associated with greater workload (B=–.26, SE=.12, p=.03). Exploratory analysis showed that after adjustment for the covariates, the association between reach and sustainability was not significant. These effects persisted when we included all three explanatory variables in one multiple linear regression analysis. That is, team functioning remained positively associated (B=9.16, SE=1.97, p<.001) and workload remained negatively associated with scores on the Sustainability Index (B=–.28, SE=.12, p=.02), with no association found for reach.

Discussion

Although a decade has passed since the VA began implementation of TF-EBPs, this is one of the few studies to examine the sustainability of these psychotherapies (17). We found that mean TF-EBP Sustainability Index scores exceeded the threshold recommended by the British health care system, perhaps reflecting institutionalization of TF-EBPs through VA policy and training (15,16). Nonetheless, there was variability in sustainability of TF-EBPs, with some teams demonstrating room for improvement. We know from site visit interviews that the team with the lowest sustainability score was experiencing a leadership change and was worried about losing infrastructure and support for TF-EBPs. Thus the team’s relatively low sustainability scores reflected its concerns about the program’s viability at the time of the site visits.

Even after adjustment for covariates, workload and team functioning were associated with TF-EBP sustainability. The finding that teams with better functioning had higher sustainability scores expands prior research suggesting that higher levels of team functioning are an important precursor for high-quality care (54). Higher functioning teams may have more frequent discussions about monitoring and meeting team goals, better allocation of job duties across work roles, better workload sharing practices, greater collaboration in solving problems and coping with challenges, and more effective team leaders (31), which may have facilitated sustainability.

The finding that higher workload was associated with lower sustainability converges with prior research suggesting that a high workload may interfere with delivery of evidence-based mental health care (35–37), including TF-EBP delivery (32,38,39). It appears that more research is warranted to determine whether there is a change point after which greater workload threatens the sustainability of EBPs more substantially than would be predicted by a linear relationship. In classical statistical approaches, the change point is examined by approximating the function between two variables with a polynomial and constructing a confidence interval around the maximizer (55). A larger sample than the one available here would be needed to fully examine this issue.
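The change-point approach described above can be sketched as follows: approximate the workload-sustainability relationship with a quadratic and locate its turning point analytically. The data are simulated for illustration, with a peak planted at a workload of about 70.

```python
# Sketch of the polynomial change-point idea: fit y = a*x^2 + b*x + c and
# take the extremum of the parabola at x = -b / (2a). Simulated data only.
import numpy as np

rng = np.random.default_rng(1)
workload = rng.uniform(30, 150, size=200)
# Simulated sustainability that peaks around a workload of 70:
sustainability = 80 - 0.01 * (workload - 70) ** 2 + rng.normal(0, 2, 200)

a, b, c = np.polyfit(workload, sustainability, deg=2)  # highest degree first
turning_point = -b / (2 * a)
print(f"estimated change point at workload ~ {turning_point:.0f}")
```

A confidence interval around the maximizer, as in the classical approach cited, would additionally require the sampling variance of the fitted coefficients.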

Research examining the relationships among implementation outcomes is needed to inform theory of implementation and the design of implementation trials (21). Although TF-EBP reach was correlated with sustainability, it did not predict sustainability over and above the patient- and facility-level covariates. This finding suggests that interventions to promote TF-EBP sustainability may differ from interventions to improve reach. In qualitative interviews related to this study, we found that some sites with low reach of TF-EBPs expressed strong support for continuing to use these treatments, but with only a small portion of their patients. We examined TF-EBP reach and sustainability over the same 12-month period; a longitudinal study with a greater number of clinical teams and patients and multiple measurement points would be needed to evaluate the possibility of temporal associations between reach and sustainability. Evaluating temporal associations would lead to a better understanding of whether changes in one measure precede changes in the other or whether a synergistic effect exists.

Methodologically sound research on sustainability of evidence-based treatments is rare (23,24). A strength of this investigation is our quantification of TF-EBP reach by using chart note data rather than clinician self-report. The response rate to the survey (56%) was better than what is often achieved in surveys of providers (32,43), but our sample was relatively small because of the limited number of clinical staff assigned to each team. This limited our ability to examine more complex relationships among independent variables. Because our survey was anonymous, we had limited information on nonresponders. We did not measure team mission or purpose, a factor that was associated with TF-EBP reach in prior research (18) and that may also be associated with sustainability. Last, we did not assess fidelity, another important implementation outcome (21), and thus we cannot address questions about the quality of treatment delivery or the relationship between TF-EBP fidelity and sustainability.

Conclusions

Much of the VA’s TF-EBP implementation effort has focused on training individual clinicians to ensure that they have the skills to deliver CPT and PE (15). Although training is crucial, attention must also be paid to the context in which clinicians work (18,19,32). We are aware of implementation strategies that have included leadership training (56,57), but less attention has been paid to how to develop high-functioning teams (58). Similarly, there is a dearth of guidance regarding workload levels that optimize TF-EBP implementation and sustainability. The contribution of team functioning and workload to the sustainability of evidence-based mental health care warrants further study.

Dr. Mohr is with the Center for Healthcare Organization and Implementation Research and Dr. Kehle-Forbes is with the National Center for PTSD Women’s Health Sciences Division, U.S. Department of Veterans Affairs (VA) Boston Healthcare System, Boston. Dr. Mohr is also with the School of Public Health, Boston University, Boston. Dr. Kehle-Forbes is also with the Center for Chronic Disease Outcomes Research, Minneapolis VA Health Care System, Minneapolis, where Mr. Orazem, Dr. Noorbaloochi, Ms. Clothier, and Dr. Sayer are affiliated. Dr. Noorbaloochi and Dr. Sayer are also with the Department of Medicine, University of Minnesota, Minneapolis. Dr. Rosen, Dr. Eftekhari, Dr. Crowley, and Dr. Ruzek are with the Dissemination and Training Division, National Center for PTSD, VA Palo Alto Health Care System, Menlo Park, California. Dr. Rosen and Dr. Ruzek are also with the Department of Psychiatry and Behavioral Sciences, Stanford University School of Medicine, Palo Alto, California. Dr. Schnurr and Dr. Bernardy are with the Executive Division, National Center for PTSD, White River Junction VA Medical Center, White River Junction, Vermont, and also with the Department of Psychiatry, Geisel School of Medicine at Dartmouth, Hanover, New Hampshire. Dr. Chard is with the PTSD Division, Cincinnati VA Medical Center, Cincinnati. Dr. Cook is with the Evaluation Division, National Center for PTSD, and with the Yale University School of Medicine, both in New Haven, Connecticut.
Send correspondence to Dr. Mohr (e-mail: ).

This research was supported by VA Health Services Research and Development (HSR&D) grant CRE 12-021 (Dr. Sayer, principal investigator). Dr. Kehle-Forbes was supported by a VA HSR&D Career Development Award (CDA 09-020).

HSR&D was not involved in any aspect of the study’s design and conduct; data collection, management, analysis, or interpretation of data; or preparation, review, or approval of the manuscript. The findings and conclusions presented are those of the authors and do not necessarily represent the views of the VA or HSR&D.

The authors report no financial relationships with commercial interests.

References

1 Dursa EK, Reinhard MJ, Barth SK, et al.: Prevalence of a positive screen for PTSD among OEF/OIF and OEF/OIF-era veterans in a large population-based cohort. Journal of Traumatic Stress 27:542–549, 2014

2 Wisco BE, Marx BP, Keane TM: Screening, diagnosis, and treatment of post-traumatic stress disorder. Military Medicine 177(suppl):7–13, 2012

3 Haagen JF, Smid GE, Knipscheer JW, et al.: The efficacy of recommended treatments for veterans with PTSD: a metaregression analysis. Clinical Psychology Review 40:184–194, 2015

4 Resick PA, Schnicke MK: Cognitive processing therapy for sexual assault victims. Journal of Consulting and Clinical Psychology 60:748–756, 1992

5 Schnurr PP, Friedman MJ, Engel CC, et al.: Cognitive behavioral therapy for posttraumatic stress disorder in women: a randomized controlled trial. JAMA 297:820–830, 2007

6 Monson CM, Schnurr PP, Resick PA, et al.: Cognitive processing therapy for veterans with military-related posttraumatic stress disorder. Journal of Consulting and Clinical Psychology 74:898–907, 2006

7 Goodson JT, Lefkowitz CM, Helstrom AW, et al.: Outcomes of prolonged exposure therapy for veterans with posttraumatic stress disorder. Journal of Traumatic Stress 26:419–425, 2013

8 Alvarez J, McLean C, Harris AH, et al.: The comparative effectiveness of cognitive processing therapy for male veterans treated in a VHA posttraumatic stress disorder residential rehabilitation program. Journal of Consulting and Clinical Psychology 79:590–599, 2011

9 Eftekhari A, Ruzek JI, Crowley JJ, et al.: Effectiveness of national implementation of prolonged exposure therapy in Veterans Affairs care. JAMA Psychiatry 70:949–955, 2013

10 Voelkel E, Pukay-Martin ND, Walter KH, et al.: Effectiveness of cognitive processing therapy for male and female US veterans with and without military sexual trauma. Journal of Traumatic Stress 28:174–182, 2015

11 VA/DOD Clinical Practice Guideline for the Management of Posttraumatic Stress Disorder and Acute Stress Disorder. Version 3.0. Washington, DC, US Department of Veterans Affairs and Department of Defense, 2017. https://www.healthquality.va.gov/guidelines/MH/ptsd/

12 Clinical Practice Guideline for the Treatment of PTSD in Adults. Washington, DC, American Psychological Association, 2017. http://www.apa.org/ptsd-guideline/ptsd.pdf

13 Resick PA, Monson CM, Chard KM: Cognitive Processing Therapy: Veteran/Military Version. Therapist and Patient Materials Manual. Washington, DC, US Department of Veterans Affairs, 2014. www.div12.org/wp-content/uploads/2015/07/CPT-Materials-Manual.pdf

14 Foa EB, Hembree EA, Rothbaum BO: Prolonged Exposure Therapy for PTSD: Emotional Processing of Traumatic Experiences—Therapist Guide. Oxford, United Kingdom, Oxford University Press, 2007

15 Karlin BE, Ruzek JI, Chard KM, et al.: Dissemination of evidence-based psychological treatments for posttraumatic stress disorder in the Veterans Health Administration. Journal of Traumatic Stress 23:663–673, 2010

16 Uniform Mental Health Services in VA Medical Centers and Clinics. VHA Handbook 1160.01. Washington, DC, US Department of Veterans Affairs, 2008

17 Rosen CS, Matthieu MM, Wiltsey Stirman S, et al.: A review of studies on the system-wide implementation of evidence-based psychotherapies for posttraumatic stress disorder in the Veterans Health Administration. Administration and Policy in Mental Health and Mental Health Services Research 43:957–977, 2016

18 Sayer NA, Rosen CS, Bernardy NC, et al.: Context matters: team and organizational factors associated with reach of evidence-based psychotherapies for PTSD in the Veterans Health Administration. Administration and Policy in Mental Health and Mental Health Services Research 44:904–918, 2017

19 Rosen CS, Eftekhari A, Crowley JJ, et al.: Maintenance and reach of exposure psychotherapy for posttraumatic stress disorder 18 months after training. Journal of Traumatic Stress 30:63–70, 2017

20 Glasgow RE, Vogt TM, Boles SM: Evaluating the public health impact of health promotion interventions: the RE-AIM framework. American Journal of Public Health 89:1322–1327, 1999

21 Proctor E, Silmere H, Raghavan R, et al.: Outcomes for implementation research: conceptual distinctions, measurement challenges, and research agenda. Administration and Policy in Mental Health and Mental Health Services Research 38:65–76, 2011

22 Doyle C, Howe C, Woodcock T, et al.: Making change last: applying the NHS institute for innovation and improvement sustainability model to healthcare improvement. Implementation Science 8:127, 2013

23 Wiltsey Stirman S, Kimberly J, Cook N, et al.: The sustainability of new programs and innovations: a review of the empirical literature and recommendations for future research. Implementation Science 7:17, 2012

24 Proctor E, Luke D, Calhoun A, et al.: Sustainability of evidence-based healthcare: research agenda, methodological advances, and infrastructure support. Implementation Science 10:88, 2015

25 Schell SF, Luke DA, Schooley MW, et al.: Public health program capacity for sustainability: a new framework. Implementation Science 8:15, 2013

26 Moore JE, Mascarenhas A, Bain J, et al.: Developing a comprehensive definition of sustainability. Implementation Science 12:110, 2017

27 Maher L, Gustafson D, Evans A: NHS Sustainability Model and Guide. London, NHS Institute for Innovation and Improvement, 2010. http://www.qihub.scot.nhs.uk/media/162236/sustainability_model.pdf

28 Strasser DC, Smits SJ, Falconer JA, et al.: The influence of hospital culture on rehabilitation team functioning in VA hospitals. Journal of Rehabilitation Research and Development 39:115–125, 2002

29 Aarons GA, Hurlburt M, Horwitz SM: Advancing a conceptual model of evidence-based practice implementation in public service sectors. Administration and Policy in Mental Health and Mental Health Services Research 38:4–23, 2011

30 Monroe-DeVita M, Morse G, Bond GR: Program fidelity and beyond: multiple strategies and criteria for ensuring quality of assertive community treatment. Psychiatric Services 63:743–750, 2012

31 Brennan SE, Bosch M, Buchan H, et al.: Measuring team factors thought to influence the success of quality improvement in primary care: a systematic review of instruments. Implementation Science 8:20, 2013

32 Finley EP, Garcia HA, Ketchum NS, et al.: Utilization of evidence-based psychotherapies in Veterans Affairs posttraumatic stress disorder outpatient clinics. Psychological Services 12:73–82, 2015

33 Perkins DF, Feinberg ME, Greenberg MT, et al.: Team factors that predict to sustainability indicators for community-based prevention teams. Evaluation and Program Planning 34:283–291, 2011

34 Ament SMC, Gillissen F, Moser A, et al.: Factors associated with sustainability of 2 quality improvement programs after achieving early implementation success: a qualitative case study. Journal of Evaluation in Clinical Practice 23:1135–1143, 2017

35 Lindenauer PK, Behal R, Murray CK, et al.: Volume, quality of care, and outcome in pneumonia. Annals of Internal Medicine 144:262–269, 2006

36 Grol R, Wensing M: What drives change? Barriers to and incentives for achieving evidence-based practice. Medical Journal of Australia 180(suppl):S57–S60, 2004

37 Proctor EK, Knudsen KJ, Fedoravicius N, et al.: Implementation of evidence-based practice in community behavioral health: agency director perspectives. Administration and Policy in Mental Health and Mental Health Services Research 34:479–488, 2007

38 Chard KM, Ricksecker EG, Healy ET, et al.: Dissemination and experience with cognitive processing therapy. Journal of Rehabilitation Research and Development 49:667–678, 2012

39 Cook JM, Dinnen S, Coyne JC, et al.: Evaluation of an implementation model: a national investigation of VA residential programs. Administration and Policy in Mental Health and Mental Health Services Research 42:147–156, 2015

40 Dillman DA: Mail and Internet Surveys: The Tailored Design Method, 2nd ed. Hoboken, NJ, Wiley, 2017

41 Ford JH II, Krahn D, Wise M, et al.: Measuring sustainability within the Veterans Administration mental health system redesign initiative. Quality Management in Health Care 20:263–279, 2011

42 Ford JH II, Krahn D, Oliver KA, et al.: Sustainability in primary care and mental health integration projects in Veterans Health Administration. Quality Management in Health Care 21:240–251, 2012

43 Ford JH II, Wise M, Krahn D, et al.: Family care map: sustaining family-centered care in polytrauma rehabilitation centers. Journal of Rehabilitation Research and Development 51:1311–1324, 2014

44 Ohman-Strickland PA, John Orzano A, Nutting PA, et al.: Measuring organizational attributes of primary care practices: development of a new instrument. Health Services Research 42:1257–1273, 2007

45 Ose D, Freund T, Kunz CU, et al.: Measuring organizational attributes in primary care: a validation study in Germany. Journal of Evaluation in Clinical Practice 16:1289–1294, 2010

46 Garvin DA, Edmondson AC, Gino F: Is yours a learning organization? Harvard Business Review 86:109–116, 134, 2008

47 Edmondson A: Psychological safety and learning behavior in work teams. Administrative Science Quarterly 44:350–383, 1999

48 Mallidou AA, Cummings GG, Ginsburg LR, et al.: Staff, space, and time as dimensions of organizational slack: a psychometric assessment. Health Care Management Review 36:252–264, 2011

49 Podsakoff PM, Ahearne M, MacKenzie SB: Organizational citizenship behavior and the quantity and quality of work group performance. Journal of Applied Psychology 82:262–270, 1997

50 Tavakol M, Dennick R: Making sense of Cronbach’s alpha. International Journal of Medical Education 2:53–55, 2011

51 Shiner B, D’Avolio LW, Nguyen TM, et al.: Measuring use of evidence-based psychotherapy for posttraumatic stress disorder. Administration and Policy in Mental Health and Mental Health Services Research 40:311–318, 2013

52 Rubin DB: Inference and missing data. Biometrika 63:581–592, 1976

53 Rubin DB: Multiple Imputation for Nonresponse in Surveys. New York, Wiley, 1987

54 Lemieux-Charles L, McGuire WL: What do we know about health care team effectiveness? A review of the literature. Medical Care Research and Review 63:263–300, 2006

55 Carlstein E, Muller H, Siegmund D (eds): Change-Point Problems. Lecture Notes–Monograph Series, vol 23. Hayward, CA, Institute of Mathematical Statistics, 1996

56 Aarons GA, Ehrhart MG, Farahnak LR, et al.: Aligning leadership across systems and organizations to develop a strategic climate for evidence-based practice implementation. Annual Review of Public Health 35:255–274, 2014

57 Aarons GA, Ehrhart MG, Farahnak LR, et al.: Leadership and organizational change for implementation (LOCI): a randomized mixed method pilot study of a leadership and organization development intervention for evidence-based practice implementation. Implementation Science 10:11, 2015

58 Ovretveit J: A team quality improvement sequence for complex problems. BMJ Quality and Safety 8:239–246, 1999