Participation in Training for Depression Care Quality Improvement: A Randomized Trial of Community Engagement or Technical Support
Abstract
Objective:
Community engagement and planning (CEP) could improve dissemination of depression care quality improvement in underresourced communities, but whether its effects on provider training participation differ from those of standard technical assistance, or resources for services (RS), is unknown. This study compared program- and staff-level participation in depression care quality improvement training among programs enrolled in CEP, which trained networks of health care and social-community agencies jointly, and RS, which provided technical support to individual programs.
Methods:
Matched programs from health care and social-community service sectors in two communities were randomly assigned to RS or CEP. Data were from 1,622 eligible staff members from 95 enrolled programs. Primary outcomes were any staff trained (for programs) and total hours of training (for staff). Secondary staff-level outcomes were hours of training in specific depression collaborative care components.
Results:
CEP programs were more likely than RS programs to participate in any training (p=.006). Within health care sectors, CEP programs were more likely than RS programs to participate in training (p=.016), but within social-community sectors, there was no difference in training by intervention. Among staff who participated in training, mean training hours were greater among CEP programs versus RS programs for any type of training (p<.001) and for training related to each component of depression care (p<.001) except medication management.
Conclusions:
CEP may be an effective strategy to promote staff participation in depression care improvement efforts in underresourced communities.
Depressive disorders—compounded by racial disparities in access to care and in the quality and outcomes of care in underresourced communities—are a leading cause of disability in the United States (1–7). Depression care quality improvement programs that improve depression treatment through team-based management of chronic diseases in primary care can improve quality and care outcomes for depressed adults, including members of racial and ethnic minority groups (8–17). Under health care reform, Medicaid funds can be used to establish behavioral health homes that provide incentives for partnerships among general medical, mental health, and social and community agencies, such as parks and senior centers. These services “must include prevention and health promotion, health care, mental health and substance use, and long-term care services, as well as linkages to community supports and resources” (18). However, few guidelines exist to help organize diverse agencies into systems that support chronic disease management. Also, no studies have compared alternative approaches for providing training in depression care quality improvement for providers from a diverse assortment of health care and social-community programs.
This study analyzed data from Community Partners in Care (CPIC), a group-level, randomized, comparative-effectiveness study of two approaches for implementation of evidence-based, depression care quality improvement tool kits adapted for diverse health care and social-community settings. One implementation approach, resources for services (RS), relied on providing more traditional technical assistance to individual programs. The other approach, community engagement and planning (CEP), used community-partnered, participatory research (CPPR) principles to support collaborative planning by a network of agencies seeking to implement the same depression care tool kits (19–25).
Health care and social-community programs were assigned randomly to each approach (20,21). Six-month follow-up comparing outcomes of clients with depression in CEP and RS revealed that clients in CEP had improved mental health–related quality of life, increased physical activity, reduced homelessness risk factors, reduced behavioral health hospitalizations and specialty care medication visits, and increased use of depression services in primary care or public health, faith-based, and park or community center programs (20). At 12 months, the effects of CEP on mental health–related quality of life continued (25).
This study focused on CPIC’s main intervention effects for program and staff participation in evidence-based, depression care quality improvement training. Training participation (program level) and total training hours (staff level) were the primary outcomes. We hypothesized that CEP would lead to a broader range of staff training options compared with RS. To determine the types of organizations that would participate in training, we compared interventions’ effects by program type (health care versus social-community). On the basis of prior work, we hypothesized that compared with RS, CEP would increase mean hours of training participation, especially among social-community programs, where such training is novel (26–28). To inform future depression care quality improvement dissemination efforts in safety-net communities, we conducted exploratory analyses of the interventions’ effects on staff training participation for each depression care quality improvement component and by service sector.
Methods
CPIC was conducted by using CPPR, a manualized form of community-based, participatory research, with community and academic partners coleading all aspects of research under equal authority (19–25,29,30). The study was designed and implemented by the CPIC Council, which comprises three academic organizations and 22 community agencies. The study design is described elsewhere (19–21,25).
Sampling and Randomization
Two Los Angeles communities with high poverty and low insurance rates (31), South Los Angeles and Hollywood-Metro, were selected by convenience on the basis of established partnerships between Healthy African American Families II, QueensCare Health and Faith Partnership, Behavioral Health Services, the University of California, Los Angeles, and RAND (19,32–36).
Programs.
County lists of agencies supplemented by community nominations were used to identify agencies (32). After assessing eligibility, we offered consent to 60 potentially eligible agencies with 194 programs. To be eligible, programs were required to treat at least 15 clients per week, have at least one staff member, and have a focus other than psychotic disorders or home services; 133 of the 194 programs were potentially eligible and were assigned at random to RS (N=65) or CEP (N=68). [A CONSORT diagram of the recruitment and enrollment of agencies, programs, and staff is available as an online supplement to this article.]
Agencies were paired into units or clusters of smaller programs on the basis of location and program characteristics and then randomized to CEP or RS. Site visits were conducted postrandomization to finalize enrollment; 20 programs were ineligible, 18 programs refused to participate, and 95 programs from 50 consenting agencies were enrolled. Program administrators were informed of intervention status by letter. Participating and nonparticipating programs were located in comparable neighborhoods according to U.S. Census data on age, sex, race, population density, and income by zip code level (37).
Staff.
All staff (paid, volunteer, licensed, and nonlicensed) with direct client contact were eligible for training. The number of eligible staff was indicated on surveys completed by program administrators at baseline. [Survey items related to number of staff with direct client contact are available in the online supplement.]
For missing responses or outliers (low or high values), we made phone calls to programs to obtain, confirm, or correct information. The 95 enrolled programs had 1,622 eligible staff. One eligible administrative staff member who was responsible for oversight of a program assigned to RS and a program assigned to CEP was excluded from analysis, resulting in a final analytic sample of 1,621 staff.
The institutional review boards of RAND and participating agencies approved study procedures. Administrators provided written consent before completing the surveys, and oral consent for use of data related to attendance at training events was obtained from staff.
Interventions.
The interventions were designed to support implementation of depression care quality improvement components relevant to each program’s scope. Both interventions used the same evidence-based tool kits for support of care management (screening, coordination, and patient education), medication management, and cognitive-behavioral therapy (CBT) (16,19,21,25,38–40). Materials were made available to eligible programs via print manuals, a Web site, and flash drives (34). Tool kits were introduced at one-day kickoff conferences in each community before randomization (19,21,25). After randomization and enrollment, staff who had attended prior study meetings were invited by phone, e-mail, and postcard to attend training sessions for the intervention in which their program had been enrolled and were encouraged to circulate the invitations to all eligible staff. Eligible staff could choose to participate in any, all, or no training sessions, which were offered at no charge. The only incentives for participation were continuing education credits, access to training, and the food served during training.
The content, structure, and training intensity of RS were developed by a research team and the authors, rather than by participating RS agencies, to reflect a more traditional approach to depression care quality improvement implementation. Similar to Partners in Care (19,20,21,25,39), the training provided technical assistance to individual programs by using a “train the trainer” model. RS training, consisting of Webinars and primary care site visits, was conducted between December 2009 and July 2010 by an interdisciplinary team of three psychiatrists, who discussed medication management; a nurse care manager; a licensed psychologist, who discussed CBT; an experienced community administrator; and research assistants. Tool kits were modified to fit programs.
Programs assigned to CEP were invited to identify one or more staff to join South Los Angeles and Hollywood-Metro CEP councils. Each council met biweekly for two hours over five months to tailor depression care tool kits and implementation plans that would maximize each community’s strengths. The councils were given a written manual and online materials, including community engagement strategies. In South Los Angeles, the planning meetings occurred from December 2009 to April 2010, and in Hollywood-Metro, meetings were held from March to July 2010. Each council met through January 2011 to oversee implementation. In South Los Angeles, council participants included 12 academic and 13 community participants, and in Hollywood-Metro, council participants included 19 academic and 11 community participants. During planning, each council modified the tool kits as well as the goals, intensity, duration, and format of training sessions to fit community and program needs (29,30). CEP trainings were not prespecified. Each council could have chosen any plan, including replicating RS or conducting no training at all.
Data Sources
Data about service sector and community for the 95 enrolled programs were obtained from administrators during recruitment. The number of eligible staff with direct client contact was obtained from a baseline survey of administrators and follow-up phone calls to administrators. Training event data, such as date, hours, depression care component, and program affiliation of attendees, were obtained from registration forms for training events, training logs, and sign-in sheets. A data set was created listing staff members coded by program sector, intervention status, and community.
Outcomes
At the program level, the primary outcome was program participation in depression care quality improvement training, defined as the percentage of programs with any staff participation in training. At the staff level, the primary outcome was total hours of staff participation across all programs and stratified by service sector. Health care sectors included primary care, mental health care, and substance abuse services; social-community sectors included homelessness services and other social and community-based services. Secondary outcomes at the staff level included percentage of staff who participated in any training and hours of participation in each depression care component (medication management, CBT, care management, or other).
The main independent variable was program random assignment (CEP or RS). Covariates included program service sector (health care or social-community) and community (South Los Angeles or Hollywood-Metro).
Statistical Methods
At baseline, we compared program and staff characteristics among programs assigned to each intervention by using chi square tests. For main program-level analyses, we examined the interventions’ effects on outcomes; the analyses were controlled for service sector and community, and results are reported as chi square statistics. For staff-level analyses, we compared the interventions’ effects on total hours of training by using two-part models because of skewed distributions (41). The first part used logistic regression to estimate the probability of receipt of any training hours. The second part used ordinary least squares to estimate the log of hours, a measure of the total training hours received by staff who received any training; the analyses controlled for community and service sector (42).
We used smearing estimates for retransformation, applying separate factors for each intervention group to ensure consistent estimates (43,44). We adjusted models for clustering by programs by using SAS macros developed by Bell and McCaffrey (45), which used a bias reduction method for standard error estimation. We also conducted exploratory stratified analyses within each service sector by using logistic regression models for dichotomous measures and log-linear models for counts, with intervention condition as the independent variable adjusted for service sector and community because the cell sizes for each sector were not sufficient for two-part models (46). To assess robustness of the adjusted models, we supplemented the adjusted models by performing the same calculations with unadjusted raw data. [The results of the unadjusted models are available in the online supplement.] Analyses were conducted by using SUDAAN, version 11.0, and accounted for clustering of staff within programs (47).
Results
Of 95 enrolled and randomized programs, 46 participated in RS and 49 participated in CEP. Randomized programs showed no statistically significant differences by baseline characteristics (community, service sector, and total staff) or participation in study activities before randomization (attended a kickoff conference). [A comparison of program characteristics by intervention condition is available in the online supplement]. About half of the programs in each intervention were from each community, and programs were well distributed across sectors—primary care (N=17, 18%), mental health care (N=18, 19%), substance abuse services (N=20, 21%), homelessness services (N=10, 11%), and community-based services (N=30, 32%).
Of 1,621 eligible staff, 723 participated in RS programs and 898 participated in CEP programs; 493 (30%) worked in programs in the primary care or public health sector, 290 (18%) in mental health services, 264 (16%) in substance abuse services, 168 (10%) in homelessness services, and 406 (25%) in community-based services. There were no significant differences in staff characteristics by intervention status [see online supplement].
After program randomization, the training experiences developed by CEP councils in Hollywood-Metro and South Los Angeles were more intensive, broader, and more flexible than the types of training available at the time of the kickoff conference and the training experiences offered by RS. Examples of more-intensive training consisted of CBT consultation support for staff for one or two patients over 12 to 16 weeks and a ten-week Webinar providing CBT consultation to groups of staff. Training in self-care for providers and active listening are examples of broader training. The use of various methods, such as Webinars, conference calls, and multiple one-day conferences, to offer the same content was an example of more flexible training. Table 1 summarizes training modifications and innovations introduced by CEP. Across both communities, CEP provided 144 training interventions totaling 220.5 hours, including 135.0 hours for CBT, 60.0 hours for care management, 6.0 hours for medication management, and 19.5 hours for other skills.
Table 1. Training features of the resources for services (RS) and community engagement and planning (CEP) interventions

| Feature | RS | CEP |
| --- | --- | --- |
| Initial model | Depression care quality improvement (QI) tool kit (slides, workbooks, and patient education materials) via print, flash drives, and Web site; training via 12 Webinars and conference calls and site visits to primary care settings; up to 5 outreach calls by a community engagement specialist to encourage participation and fit tool kits to programs; study paid for training and materials ($16,333 per community) | Depression care QI tool kit (slides, workbooks, and patient education materials) via print, flash drives, and Web site; 5 months of 2-hour, twice-monthly planning meetings of CEP councils to tailor materials and develop and implement a written training plan for each community, guided by a manual and community engagement model; coleadership by Community Partners in Care council promulgating community engagement and social justice principles to encourage collaboration and network building; study paid for consultations and training modifications ($15,000 per community) |
| Implemented model | | |
| Overall | 21 Webinars and 1 primary care site visit | Multiple 1-day conferences, with follow-up training at sites; Webinar and telephone-based supervision |
| Cognitive-behavioral therapy (CBT) and clinical assessment | Materials and 4 Webinars for licensed physicians, psychologists, social workers, nurses, and marriage and family therapists | Tiers of training: intensive CBT support, including feedback on audiotaped therapy sessions with 1 or 2 patients with depression for 12–16 weeks (tier 1); 10-week group consultations by Webinar for licensed providers and substance abuse counselors (tier 2); and orientation workshops for any staff trainee about concepts and approaches (tier 3) |
| Care management | 4 Webinars and resources for depression screening, assessment of comorbid conditions, client education and referral, tracking visits to providers, assessment of medication adherence and depressive symptom outcomes, and introduction to problem-solving therapy and behavioral activation for nurses, caseworkers, health educators, spiritual advisors, promotoras, and lay counselors | In-person conferences, individual agency site visits, and telephone supervision for nurses, caseworkers, health educators, spiritual advisors, promotoras, and lay counselors; modifications included a focus on self-care for providers, simplification of materials (such as fact sheets), and shorter measures to track patient outcomes (similar range of providers and staff as RS); training in active listening in 1 community and training of volunteers to expand capacity in 1 community; development of an alternative “resiliency class” approach to support wellness for Village Clinic |
| Medication and clinical assessment | Training for physicians, nurses, nurse practitioners, and physician assistants in medication management and diagnostic assessment provided through Webinars and in-person site visits to primary care settings | 2-tiered approach: training for medication management and clinical assessment coupled with information on complementary or alternative therapies and prayer for depression through training slides (tier 1) and orientation about concepts for lay providers (tier 2) |
| Administration or other | Webinar on overview of intervention plan approaches to team building or management group and team-building resources | Conference break-outs for administrators on team management and team building and team-building resources, as well as support for grant writing for programs; administrative problem solving to support Village Clinic, including option of delegation of outreach to clients from RAND survey group, identification of programs to support case management, resiliency classes, and CBT for depression |
| Training events (hours) | | |
| Total | 22 (21 Webinars and 1 site visit) | 220.5 (144 training events) |
| CBT | 8 | 135 |
| Care management | 8 | 60 |
| Medication | 1 | 6 |
| Implementation support for administrators | 5 | |
| Other skills | | 19.5 |
After randomization, a greater percentage of CEP programs than RS programs participated in training (86% versus 61%, p=.006). Stratified analyses by service sector showed that the percentage of health care programs that participated in training was greater for CEP than for RS (p=.016) (Table 2). A similar trend, although not significant, was found within social-community sectors.
Table 2. Programs participating in any depression care quality improvement training, by intervention condition

| Program | Total: N | Total: % | RS: N | RS: % | CEP: N | CEP: % | χ² | df | p |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| All | 70 | 74 | 28 | 61 | 42 | 86 | 7.6 | 1 | .006 |
| Health care sector | 29 | 47 | 19 | 66 | 24 | 92 | 5.8 | 1 | .016 |
| Social-community sector | 23 | 58 | 9 | 53 | 18 | 78 | 2.9 | 1 | ns |
The two-part models showed that staff from programs assigned to CEP were more likely than staff from programs assigned to RS to participate in any training overall (p<.001). In social-community sectors, staff from CEP programs were more likely than staff from RS programs to participate in training (p<.001), but there were no intervention differences in training participation among staff from health care sectors (Table 3). Estimated hours of training among staff who participated in training were greater among CEP staff compared with RS staff in all programs (p<.001), in programs in health care sectors (p=.004), and in programs in social-community sectors (p=.003). Similarly, mean hours of training among staff who participated in training were greater among CEP staff compared with RS staff for all depression care quality improvement components except medication management, which did not differ significantly by intervention.
Table 3. Estimated hours of depression care quality improvement training among staff, by intervention condition

| Variable | RS: estimated hours | RS: 95% CI | CEP: estimated hours | CEP: 95% CI | All staff: t | df | p | Staff with any training: t | df | p |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Primary outcome | | | | | | | | | | |
| All programs | .19 | .03 to .36 | 2.35 | 1.09 to 3.62 | 3.90 | 94 | <.001 | 4.67 | 94 | <.001 |
| Health care programs | .35 | .03 to .66 | 1.88 | .31 to 3.46 | 1.39 | 54 | ns | 3.08 | 54 | .004 |
| Social-community programs | .10 | –.03 to .22 | 2.91 | 1.53 to 4.29 | 4.51 | 39 | <.001 | 3.29 | 39 | .003 |
| Secondary outcomes | | | | | | | | | | |
| Cognitive-behavioral therapy | .12 | .01 to .22 | .92 | .24 to 1.59 | 2.39 | 94 | .019 | 3.90 | 94 | <.001 |
| Care management | .04 | .00 to .07 | .79 | .39 to 1.19 | 4.88 | 94 | <.001 | 4.02 | 94 | <.001 |
| Medication management and education | .01 | –.01 to .03 | .08 | .03 to .12 | 2.47 | 94 | .015 | .32 | 94 | ns |
| Other | .05 | .01 to .09 | .58 | .22 to .94 | 2.81 | 94 | .006 | 7.06 | 94 | <.001 |
In exploratory stratified analysis by service sector, there were no intervention differences in percentage of staff in the primary care or mental health specialty sector who attended any training. However, participation in training was greater among CEP staff compared with RS staff in sectors related to substance abuse services (p=.005), homelessness services (p<.001), and community-based services (p<.001) (Table 4). In addition, training hours were significantly greater among programs that participated in CEP versus RS for all sectors except primary care.
Table 4. Staff participation in depression care quality improvement training, by service sector and intervention condition

| Sector | RS: estimate | RS: 95% CI | CEP: estimate | CEP: 95% CI | t | df | p |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Any training (% of staff) | | | | | | | |
| Primary care | 7 | –1 to 14 | 11 | 1 to 21 | .7 | 94 | ns |
| Mental health services | 9 | –2 to 20 | 27 | –1 to 56 | 1.5 | 94 | ns |
| Substance abuse treatment | 8 | –.2 to 16 | 40 | 19 to 61 | 2.9 | 94 | .005 |
| Homelessness services | 6 | –1 to 12 | 61 | 33 to 89 | 3.7 | 94 | <.001 |
| Community-based programs | 4 | –1 to 9 | 12 | 5.3 to 18.5 | 4.2 | 94 | <.001 |
| Total hours in training (mean) | | | | | | | |
| Primary care | .18 | –.04 to .39 | .65 | –.10 to 1.40 | 1.5 | 94 | ns |
| Mental health services | .36 | –.13 to .85 | 4.31 | –1.33 to 9.95 | 2.8 | 94 | .005 |
| Substance abuse treatment | .27 | –.06 to .60 | 3.93 | 1.81 to 6.06 | 3.8 | 94 | <.001 |
| Homelessness services | .17 | –.08 to .42 | 2.29 | .45 to 4.12 | 3.1 | 94 | .003 |
| Community-based programs | .07 | –.02 to .16 | 3.09 | 1.66 to 4.53 | 5.8 | 94 | <.001 |
Discussion
Our main finding was that a CEP approach to implementing depression collaborative care developed a broader and more flexible range of training experiences and provided more hours of training across diverse health care and social-community sectors than the technical assistance approach (RS). Subsequently, staff of programs assigned to CEP had higher rates of training participation compared with staff of RS programs, both for all forms of training and for each component of depression quality improvement. This is an important finding that may offer insight into previously reported positive effects of CEP on clients’ health-related quality-of-life outcomes at six and 12 months (20,25). Before randomization, there were no significant differences in the percentage of RS and CEP programs that participated in kickoff events. However, after randomization, 86% of CEP programs participated in any training, compared with 61% of RS programs; in health care sectors, participation in training was significantly greater among CEP programs (92%) than among RS programs (66%). The use of a group-randomized trial increases confidence that the observed differences in training are attributable to the interventions themselves. In other words, CEP’s increased training intensity and greater focus on creating a network of training opportunities for programs and providers are consistent with its community-driven plan.
For the primary outcome at the staff level, the study showed that staff assigned to CEP programs were more likely than staff assigned to RS programs to participate in training. CEP was also associated with a greater likelihood of participation in training among programs in a health care sector but not among programs in a social-community sector. However, for staff with any training participation, CEP was associated with greater hours of training among programs overall as well as among programs in both the social-community and the health care sectors. Further exploratory analyses suggest that at the program level, CEP’s effects may be greater for health care programs than for social-community programs. However, for staff who attended any training, mean training hours for both health care and social-community programs were higher among staff associated with CEP rather than RS.
Few reports in the mental health services literature describe how penetration of training among staff and programs is affected by implementation of interventions for evidence-based programs. One study found that increased participation in training in an evidence-based child curriculum was associated with increased intervention delivery to patients (48). Another study found that the use of financial incentives for providers promoted depression collaborative care implementation in health care systems (49). In contrast, programs enrolled in CPIC were told that their staff could participate in any, all, or no trainings, with continuing education credits, access to trainings and materials, and the food provided during training as the only incentives. This suggests that community engagement can encourage agencies and providers, particularly from social-community sectors, to participate in quality improvement efforts to enhance quality of and access to depression care.
CEP may have increased staff participation compared with RS through several mechanisms. Partnering with local programs and staff to adapt training content may have made the materials more consistent with the programs’ existing capacities or interests, particularly for social-community settings. Although training was offered to programs and staff, it was not mandatory. CEP may have increased participation particularly among staff in programs with engaged leadership. In addition, CEP councils offered more training opportunities in response to community partners’ feedback (29,30). The inclusion of agency staff as cotrainers may have increased ownership and trust in training among CEP programs, just as including local opinion leaders in the development of practice guidelines appears to benefit implementation (50–52). The multiagency training plan developed by the community councils may have been appealing to both programs in health care sectors and programs in social-community sectors. The CEP group’s development of a more intensive training plan with greater training options may have been more consistent with staff’s sense of the support needed to implement depression care. More generally, the community engagement principles and activities associated with CEP may have instilled a greater sense of ownership and commitment, especially among programs in social-community sectors, which traditionally are not included in depression care training.
For both interventions, training exposure estimates may be conservative, given that staff who attended a training session may have shared what they learned with staff who were not in attendance. The CEP Councils’ efforts to develop a tailored plan for implementing depression care, consisting of biweekly meetings for five months followed by monthly implementation meetings, were substantial but feasible, given the large population (up to two million people) of the participating communities. Conducting the planning required coleadership by community and academic partners with experience in applying CPPR principles to depression care. RS also had a preparation period during which expert leaders conducted outreach to the participating programs by calling or visiting, in some cases up to five times. Future research should clarify which features of CEP promoted more provider engagement relative to RS; identify potential strategies, such as financial incentives, to enhance participation in training; and determine whether training participation mediated the intervention’s impact on patient outcomes.
The study had several limitations. Estimates of eligible staff were based on administrator survey items and follow-up calls, whereas staff training participation was based on registration, logs, and attendance sheets. Given that administrator estimates of eligible staff were largely obtained prior to randomization, it is unlikely that there was any differential bias in estimation by intervention condition. Future work may benefit from validating administrator reports with human resources records. Generalizability of our findings may be limited. In addition, the study design and data did not permit separating the effects of increased community engagement from those of the changes in training—such as increased hours, intensity, flexibility, and breadth—associated with CEP. If replicated, the results suggest that CEP groups may offer a different set of training options with different participation effects. The study was not designed to assess whether increased training participation led to improved quality of care or whether improved quality of care led to improved client outcomes.
Conclusions
As health care reform expands access to care for millions of Americans, including many low-income Latinos and African Americans, building capacity in underresourced communities to implement evidence-based depression care quality improvement programs (18,53–56), for example, through Medicaid behavioral health homes and accountable care organizations, will remain a priority. Our findings suggest that a CEP approach to developing and delivering training across a network of providers may increase program and staff engagement, particularly among programs in health care sectors. It may also help build staff capacity in other sectors, such as homelessness and social services, which are often located in racial-ethnic minority communities with historical distrust of services and research (22–24,57–63). Future work is needed to compare the cost-effectiveness of CEP and other staff training interventions, replicate this study’s findings in larger samples, clarify which CEP components improve providers’ depression care competencies, and determine whether training participation mediates intervention effects on client outcomes.
References
1: Disability-adjusted life years (DALYs) for 291 diseases and injuries in 21 regions, 1990–2010: a systematic analysis for the Global Burden of Disease Study 2010. Lancet 380:2197–2223, 2012
2: Years lived with disability (YLDs) for 1,160 sequelae of 289 diseases and injuries 1990–2010: a systematic analysis for the Global Burden of Disease Study 2010. Lancet 380:2163–2196, 2012
3: Global and regional burden of disease and risk factors, 2001: systematic analysis of population health data. Lancet 367:1747–1757, 2006
4: Twelve-month use of mental health services in the United States: results from the National Comorbidity Survey Replication. Archives of General Psychiatry 62:629–640, 2005
5: Failure and delay in initial treatment contact after first onset of mental disorders in the National Comorbidity Survey Replication. Archives of General Psychiatry 62:603–613, 2005
6: Prevalence and distribution of major depressive disorder in African Americans, Caribbean blacks, and non-Hispanic whites: results from the National Survey of American Life. Archives of General Psychiatry 64:305–315, 2007
7: Depression care in the United States: too little for too few. Archives of General Psychiatry 67:37–46, 2010
8: Collaborative care for depression: a cumulative meta-analysis and review of longer-term outcomes. Archives of Internal Medicine 166:2314–2321, 2006
9: Educational and organizational interventions to improve the management of depression in primary care: a systematic review. JAMA 289:3145–3151, 2003
10: Collaborative care for patients with depression and chronic illnesses. New England Journal of Medicine 363:2611–2620, 2010
11: Chronic disease management: what will it take to improve care for chronic illness? Effective Clinical Practice 1:2–4, 1998
12: Collaborative management of chronic illness. Annals of Internal Medicine 127:1097–1102, 1997
13: Organizing care for patients with chronic illness. Milbank Quarterly 74:511–544, 1996
14: Collaborative care management of late-life depression in the primary care setting: a randomized controlled trial. JAMA 288:2836–2845, 2002
15: Five-year impact of quality improvement for depression: results of a group-level randomized controlled trial. Archives of General Psychiatry 61:378–386, 2004
16: Impact of disseminating quality improvement programs for depression in managed primary care: a randomized controlled trial. JAMA 283:212–220, 2000
17: Cost-effectiveness of a multicondition collaborative care intervention: a randomized controlled trial. Archives of General Psychiatry 69:506–514, 2012
18: Behavioral Health Homes for People With Mental Health and Substance Use Conditions: Core Clinical Features. Rockville, Md, Substance Abuse and Mental Health Services Administration, Center for Integrated Health Solutions, 2012. Available at www.integration.samhsa.gov/integrated-care-models/health-homes
19: Using a community partnered participatory research approach to implement a randomized controlled trial: planning Community Partners in Care. Journal of Health Care for the Poor and Underserved 21:780–795, 2010
20: Community-partnered cluster-randomized comparative effectiveness trial of community engagement and planning or resources for services to address depression disparities. Journal of General Internal Medicine 28:1268–1278, 2013
21: Community-partnered evaluation of depression services for clients of community-based agencies in under-resourced communities in Los Angeles. Journal of General Internal Medicine 28:1279–1287, 2013
22: “Research” in community-partnered, participatory research. JAMA 302:320–321, 2009
23: Begin your partnership: the process of engagement. Ethnicity and Disease 19(suppl 6):S6–S8, S16, 2009
24: Strategies for academic and clinician engagement in community-participatory partnered research. JAMA 297:407–410, 2007
25: 12-month outcomes of community engagement versus technical assistance for depression quality improvement: a partnered, cluster, randomized, comparative effectiveness trial. Annals of Internal Medicine 161:S23–S34, 2014
26: Building community resilience through mental health infrastructure and training in post-Katrina New Orleans. Ethnicity and Disease 21(3 suppl 1):20–29, 2011
27: Community-based participatory development of a community health worker mental health outreach role to extend collaborative care in post-Katrina New Orleans. Ethnicity and Disease 21(3 suppl 1):45–51, 2011
28: Opportunities and challenges of implementing collaborative mental health care in post-Katrina New Orleans. Ethnicity and Disease 21(3 suppl 1):30–37, 2011
29: Community Partners in Care: leveraging community diversity to improve depression care for underserved populations. International Journal of Diversity in Organizations, Communities and Nations 9:167–182, 2009
30: An implementation evaluation of the community engagement and planning intervention in the CPIC depression care improvement trial. Community Mental Health Journal 50:312–324, 2014
31: Los Angeles County Multicultural Health Fact Sheet. Los Angeles, California Pan-Ethnic Health Network, 2012. Available at www.cpehn.org/resources.php. Accessed April 13, 2014
32: Sampling and recruiting community-based programs for a cluster-randomized, comparative effectiveness trial using community-partnered participation research: challenges, strategies and lessons learned from Community Partners in Care. Health Promotion Practice, in press
33: Partnered evaluation of a community engagement intervention: use of a kickoff conference in a randomized trial for depression care improvement in underserved communities. Ethnicity and Disease 21(3 suppl 1):78–88, 2011
34: Witness for Wellness: preliminary findings from a community-academic participatory research mental health initiative. Ethnicity and Disease 16(suppl 1):S18–S34, 2006
35: Building an academic-community partnered network for clinical services research: the Community Health Improvement Collaborative (CHIC). Ethnicity and Disease 16(suppl 1):S3–S17, 2006
36: Using community arts events to enhance collective efficacy and community engagement to address depression in an African American community. American Journal of Public Health 99:237–244, 2009
37: Los Angeles County, California; in State and County QuickFacts. Washington, DC, United States Census Bureau. Available at quickfacts.census.gov/qfd/states/06/06037.html. Accessed June 23, 2012
38: Treating depression in predominantly low-income young minority women: a randomized controlled trial. JAMA 290:57–65, 2003
39: Evidence-based care for depression in managed primary care practices. Health Affairs 18(5):89–105, 1999
40: Depression Care Tools. Los Angeles, Community Partners in Care. Available at www.communitypartnersincare.org/depression-care-resources. Accessed April 1, 2013
41: Methods for improving regression analysis for skewed continuous or counted responses. Annual Review of Public Health 28:95–111, 2007
42: The logged dependent variable, heteroscedasticity, and the retransformation problem. Journal of Health Economics 17:283–295, 1998
43: A comparison of alternative models for the demand of medical care. Journal of Business and Economic Statistics 1:115–126, 1983
44: Smearing estimate: a nonparametric retransformation method. Journal of the American Statistical Association 78:605–610, 1983
45: Bias reduction in standard errors for linear regression with multi-stage samples. Survey Methodology 28:169–182, 2002
46: Improved hypothesis testing for coefficients in generalized estimating equations with small samples of clusters. Statistics in Medicine 25:4081–4098, 2006
47: On the variances of asymptotically normal estimators from complex surveys. International Statistical Review 51:279–292, 1983
48: The role of training variables in effective dissemination of evidence-based parenting interventions. International Journal of Mental Health Promotion 8:20–28, 2006
49: Quality improvement with pay-for-performance incentives in integrated behavioral health care. American Journal of Public Health 102:e41–e45, 2012
50: Implementation research in mental health services: an emerging science with conceptual, methodological, and training challenges. Administration and Policy in Mental Health Services Research 36:24–34, 2009
51: Effectiveness, transportability, and dissemination of interventions: what matters when? Psychiatric Services 52:1190–1197, 2001
52: Diffusion of Innovations, 5th ed. New York, Free Press, 2003
53: The Mental Health Parity and Addiction Equity Act. Baltimore, Centers for Medicare and Medicaid Services, Center for Consumer Information and Insurance Oversight. Available at www.cms.gov/CCIIO/Programs-and-Initiatives/Other-Insurance-Protections/mhpaea_factsheet.html. Accessed April 7, 2015
54: Read the Law. Washington, DC, US Department of Health and Human Services. Available at www.healthcare.gov/law/full/index.html. Accessed Oct 13, 2014
55: Primary Care and Public Health: Exploring the Integration to Improve Population Health. Washington, DC, National Academies Press, 2012
56: Integrating public health and primary care systems: potential strategies from an IOM report. JAMA 308:461–462, 2012
57: Under the shadow of Tuskegee: African Americans and health care. American Journal of Public Health 87:1773–1778, 1997
58: Bad Blood: The Tuskegee Syphilis Experiment. New York, Free Press, 1993
59: Unequal Treatment: Confronting Racial and Ethnic Disparities in Health Care. Washington, DC, National Academies Press, 2003
60: Israel BA, Eng E, Schulz AJ (eds): Methods in Community-Based Participatory Research for Health. San Francisco, Jossey-Bass, 2005
61: Review of community-based research: assessing partnership approaches to improve public health. Annual Review of Public Health 19:173–202, 1998
62: Community-based participatory research contributions to intervention research: the intersection of science and practice to improve health equity. American Journal of Public Health 100(suppl 1):S40–S46, 2010
63: Minkler M, Wallerstein N (eds): Community-Based Participatory Research for Health: From Process to Outcomes. San Francisco, Jossey-Bass, 2003