To the Editor: Consumer satisfaction has become an important outcome domain in the assessment of mental health services, as noted by Holcomb and associates (1) in the July 1998 issue. For the most part, satisfaction is portrayed as a proxy for service effectiveness and as an excellent tool for gaining consumer input. Unfortunately, there is strong reason to believe it is neither.
In their article, the authors present research purporting to demonstrate, among other things, a relationship between satisfaction and improvement using data obtained from a sample of individuals hospitalized at a Veterans Affairs medical center. The study perpetuates a major problem in research of this type: failure to account for shared method variance. Shared method variance refers to an association between two or more constructs that is due solely to the measurement method used rather than to any true relationship between the constructs themselves.
In the Holcomb paper, patient self-report was used to assess both satisfaction and clinical status. In this case we would expect to find some degree of association between satisfaction and clinical status due to the use of the same data collection method. For example, a respondent who is in a good mood or is appreciative of the services that were delivered (the "thank-you-very-much" factor) might rate his or her current clinical status positively and at the same time indicate great satisfaction with all aspects of the services that were delivered. The correlation between satisfaction and clinical status might reflect the respondent's mood or thankfulness rather than any true correlation between the constructs.
My colleagues and I found that shared method variance is very important when evaluating the relationship between these two constructs (2). When accounting for shared method variance, using data obtained from children, parents, and outside raters, we found no substantial relationship between satisfaction and clinical outcome in children's mental health services.
A growing number of studies cited in our work, even ones that do not control for method variance, fail to demonstrate relationships between satisfaction and effectiveness. The Holcomb study appears to be one of the few showing such a relationship. However, it is unusual in that the mean satisfaction ratings fell near the middle of the scale. Most studies of satisfaction find a ceiling effect in which mean scores are nearly at the top of the scale. The lower satisfaction ratings in the Holcomb study may reflect the fact that the respondents were relatively older veterans in inpatient treatment who might respond less favorably to services. Nonetheless, at best the results are not generalizable.
One reason for not making overenthusiastic proclamations about the putative relationship between satisfaction and clinical outcome is that satisfaction data appear to influence decisions by administrators and consumers alike and may lead to a false sense of security about the adequacy of services. Effective services may be undermined by low satisfaction ratings, and, perhaps even more distressing, ineffective services may remain funded. If the goal is to base mental health policies and decision making on data, it is probably wisest not to base them on consumer satisfaction data.
Finally, the process of obtaining consumer satisfaction data may undermine other efforts to elicit consumer input and involve consumers in policy making. Consumer empowerment is a buzzword in behavioral health care, and the collection and presentation of consumer satisfaction measures is supposedly sufficient to achieve it. However, such measures may diminish other approaches to enhancing consumer empowerment, such as quarterly town hall meetings, participation on boards and committees, employment as consultants or as service providers, and development of services fully run by consumers. The bottom line is that more conceptual, methodological, and analytical rigor is needed before the concept of consumer satisfaction is accepted at face value.
Dr. Salzer is research assistant professor at the Center for Mental Health Policy and Services Research at the University of Pennsylvania in Philadelphia.