
What Would Benjamin Disraeli Say About Data Analysis in the Social Services?


Question:

What would Benjamin Disraeli say about data analysis in the social services?

In a 1969 article in the American Psychologist, Donald Campbell (Campbell, 1969) argued that all social service programs should be submitted to empirical validation. He reasoned that social service programs should be treated with the same scientific rigor as any experiment, because most social service programs at that time were essentially experimental. Support for Campbell’s position came from many quarters, notably from Senator Robert Kennedy, who delayed an education funding bill until program evaluation was included in it. Eventually, program evaluation became mandated for all federal social service grants. Not only was program evaluation mandated, it was funded: many programs that used federal funding during the 1970s and early 1980s were required to spend one percent of their budget on program evaluation activities.

The availability of funding resulted in multi-disciplinary research collaboration and, ultimately, the development of program evaluation as a professional discipline. In our own area of Southeastern Pennsylvania, Temple University led efforts to evaluate services and supports provided for persons with what we now call intellectual disability. Dr. Glenn Snelbecker, Dr. Jim Conroy, and Dr. Bob Isett formed a collaborative research group that linked Temple faculty from the psychology, special education, business administration, and sociology departments, disability researchers from Temple’s Developmental Disabilities Center, and applied researchers from Temple’s Woodhaven Center. This loosely structured collaborative group focused largely on the general question of “what works?” According to a survey conducted at that time, it was the fourth most productive group of its type in the nation.

It should be noted that the purpose of program evaluation was not so much to produce scientific articles in arcane professional journals to be read by eight people, but rather to identify strengths and weaknesses in social service programs and to identify potential causes (and corrections) of social conditions. In all cases, the intent was to use program evaluation to help better the human condition. Hence, you see Dr. Jim Conroy repeatedly stating that the key question for a program evaluator must be, “Are they (the service recipients) better off?” Perhaps the most significant intellectual disability-related research to derive from this collaboration was the Pennhurst Study (Conroy & Bradley, 1987), which demonstrated the many benefits that accrued from placing individuals with intellectual disability in community-based homes.

When President Reagan introduced funding cuts to a variety of social service programs, the expectations for accountability and for using program evaluation tools to quantify and justify service outcomes did not disappear; only the funding for such activity disappeared. Under these conditions, it is hardly surprising that most formal Evaluation & Research departments in social service programs have disappeared. Data analysis has largely been reduced to counts of incident reports, and doctorally trained specialists in program evaluation have become a rarity. Given the chronically weak economy of our country, combined with the increasing support needs of the baby boomer generation, it is hardly surprising that budgets are tight and little funding is available for legitimate program evaluation activities. Note the recent analysis (Spreat, 2016) indicating that about one-third of intellectual disability providers have expenses that exceed revenue each year.

Several months ago, this journal, the Social Innovations Journal, held a half-day presentation on the use of data. Ted Dallas, Secretary of the Pennsylvania Department of Human Services, described a number of smaller systems he used to correct problems within his department. Michael Walsh described a city-wide data collection system. Ann Koellhoffer described the use of data to promote college retention. The seminar highlighted a wide range of positive uses of data for planning, program evaluation, and enhancing various aspects of the human condition. For the individual provider listening to these presentations, the ideas may have appeared wonderful, but largely out of reach. Most providers lack sufficient funding to collect, analyze, interpret, and use data, and most lack employees with sufficient skills in data analysis and interpretation. Making sense of data requires significant training both in data analysis and in the particular field the data describe. But the need remains, and is arguably of even greater import, during these challenging economic times.

Providers of supports and services to individuals who have intellectual disability and/or autism function in a complex world of excessive regulation and systematized underfunding. Excessive regulation in the form of staffing qualifications has led to unforgivable vacancy rates, in excess of 45 percent in residential treatment facilities that treat children. Many intellectual disability programs do not receive sufficient money to cover the costs of the services they provide. Oss (2015) suggested that the funding of intellectual disability programs approximates a Ponzi scheme, because providers must rely on a continuing source of donated funds to supplement inadequate funding from governmental agencies. Walker (personal communication) has pointed out that funding for intellectual disability services has lagged behind the general Pennsylvania budget by almost 70 percent. These challenges result in diminished program quality and large amounts of unproductive time, and they clearly detract from a positive answer to Jim Conroy’s “Are they better off?” question.

Efforts to secure sufficient funding for social service programs typically take the form of efforts to persuade legislators and executive branch entities to allocate additional funding. It is well understood that legislators can be moved by tales of need and heartbreak, but it must also be understood that the plural of anecdote is not evidence. Stories may touch the heart, but the objective analysis of empirical data leads to the most irrefutable arguments. Reliable data can be an effective tool with which to minimize the challenging conditions under which providers toil. Data can become a weapon used to support individuals who have disabilities.

Let us consider staffing. Staffing is the lifeblood of supports and services for people with intellectual disability and/or autism, and it typically consumes about 80 percent of a provider agency’s budget. Over the past 30 years, occasional research studies have examined direct support professional staffing, with turnover rates reported in excess of 50 percent per year. These studies certainly had face validity, but they were distant and dated by the time of publication. Studies of turnover in Minnesota don’t carry much weight in Pennsylvania or New Jersey, and their publication did little to affect funding in Pennsylvania.

Former Speaker of the House Tip O’Neill was fond of pointing out that “all politics is local.” Research done by Minnesota researchers in a Vermont setting will have less impact in Pennsylvania than local data collected and analyzed by local researchers. The challenge is how to get local data collected and analyzed. Absent a local researcher with some sort of federal grant to study the issue of intellectual disability funding, the options are few. Fortunately, a Pennsylvania provider association recognized that collaboration was the key to gathering data that could be used to support a request for additional funding. No one provider had the resources to undertake such a study, but by pooling resources and sharing the costs, the coalition was able to commission a study that yielded useful information at a reasonable cost. Time was donated by several providers and by the provider association, and modest funding for a part-time research assistant was obtained from a Pennsylvania foundation. With strong participation from the provider community, data were collected representing approximately 17,322 Direct Support Professionals. The study found a mean hourly wage of $11.26, turnover of approximately 25 percent, and vacancies of around 10 percent (Spreat, Brown-McHale, & Walker, 2016). These data formed the foundation of a Pennsylvania movement to increase compensation for Direct Support Professionals. The movement combined data presentation with more traditional forms of persuasion, such as videos, face-to-face lobbying, and an opportunity for legislators to work as a Direct Support Professional for a day. The outcome (hopefully, I’m not jinxing this because the budget is not yet finalized) was increased funding for intellectual disability programs that will enable most providers to offer modest wage increases to direct support professionals.
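The headline figures in a pooled study of this kind are straightforward aggregates of the records each provider submits. The sketch below shows one way such pooling might be done; the field names, sample numbers, and structure are hypothetical illustrations and are not drawn from the actual PAR survey instrument.

```python
from dataclasses import dataclass

@dataclass
class ProviderReport:
    """One provider's annual submission (hypothetical field names)."""
    dsp_headcount: int       # filled direct support positions
    mean_hourly_wage: float  # average DSP wage at this provider
    separations: int         # DSPs who left during the year
    vacant_positions: int    # unfilled DSP positions

def pooled_statistics(reports: list[ProviderReport]) -> dict[str, float]:
    """Aggregate provider submissions into statewide summary figures."""
    total_dsps = sum(r.dsp_headcount for r in reports)
    total_positions = total_dsps + sum(r.vacant_positions for r in reports)
    # Weight each provider's mean wage by its headcount.
    weighted_wage = sum(r.mean_hourly_wage * r.dsp_headcount for r in reports) / total_dsps
    turnover_rate = sum(r.separations for r in reports) / total_dsps
    vacancy_rate = sum(r.vacant_positions for r in reports) / total_positions
    return {
        "dsp_count": total_dsps,
        "mean_hourly_wage": round(weighted_wage, 2),
        "turnover_rate": round(turnover_rate, 3),
        "vacancy_rate": round(vacancy_rate, 3),
    }

# Two made-up providers for illustration; the 2016 PAR study pooled data
# representing roughly 17,322 DSPs across many participating providers.
sample = [
    ProviderReport(dsp_headcount=420, mean_hourly_wage=11.10, separations=105, vacant_positions=45),
    ProviderReport(dsp_headcount=180, mean_hourly_wage=11.60, separations=40, vacant_positions=20),
]
print(pooled_statistics(sample))
```

The value of the collaborative design is visible even in this toy version: no single provider’s numbers are meaningful on their own, but the pooled, headcount-weighted figures can stand behind a statewide budget request.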

The story doesn’t end with the initial compensation study, because wages and benefits alone don’t tell the full story. At the request of a small consortium of three provider associations, a second study was undertaken to describe the impact of low wages, access to public support programs, and the effect of overtime earnings on access to those programs. Researchers from Social Innovations Partners, Impact Germantown, and Woods Services concluded that the low wages typically paid to direct support professionals would generally qualify them for a wide range of public benefits (such as housing allowances, insurance, and help with fuel costs), but that the amount of overtime worked by the typical employee pushed earnings to a level that disqualified them from those benefits. It can fairly be suggested that there is something wrong with a legitimate form of employment that forces employees to rely on welfare programs to support themselves and their families. The impact of excessive overtime on the family unit was discussed, and models were developed that projected the strengths and weaknesses of hourly wages of $15, $18, and $21. The provider associations are still evaluating how these latter data might be used.
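The wage-scenario models essentially compare annual earnings, with and without overtime, against eligibility ceilings for public benefits. A stylized sketch of that comparison follows; the eligibility ceiling and the overtime assumption are hypothetical placeholders, not figures from the study, since real thresholds vary by program, household size, and year.

```python
def annual_earnings(hourly_wage: float, overtime_hours_per_week: float = 0.0,
                    weeks: int = 52, base_hours: float = 40.0) -> float:
    """Gross annual pay, with overtime paid at time-and-a-half."""
    regular = hourly_wage * base_hours * weeks
    overtime = hourly_wage * 1.5 * overtime_hours_per_week * weeks
    return regular + overtime

# Hypothetical combined eligibility ceiling for a bundle of public benefits.
ELIGIBILITY_CEILING = 32_000.00

for wage in (11.26, 15.00, 18.00, 21.00):
    base_only = annual_earnings(wage)
    with_ot = annual_earnings(wage, overtime_hours_per_week=10)  # assumed 10 OT hours/week
    print(f"${wage:>5.2f}/hr: base ${base_only:>9,.2f} "
          f"({'eligible' if base_only <= ELIGIBILITY_CEILING else 'ineligible'}), "
          f"with overtime ${with_ot:>9,.2f} "
          f"({'eligible' if with_ot <= ELIGIBILITY_CEILING else 'ineligible'})")
```

Even with made-up thresholds, the pattern the researchers described emerges: a base wage near $11.26 sits under the ceiling, while routine overtime lifts total earnings past it and strips away the benefits the base wage would have allowed.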

The innovative part of this story isn’t that data were used to justify budget requests; the innovation derived from the collaboration of provider agencies and researchers. By sharing the costs of the researchers, the provider associations obtained information that they were able to use in budget requests. Sharing the costs made the project affordable to all of the associations involved, and perhaps serves as a model for other types of program evaluation efforts. A noteworthy candidate at this time is the issue of community participation. As policy efforts are directed toward reducing reliance on sheltered employment and increasing various forms of community participation, it would seem reasonable for provider associations to again join together to evaluate the impact of these proposed changes.

To that end, a consortium of intellectual disability researchers met in late April to identify strategies with which to study community participation. A subgroup has continued to meet and is developing a funding proposal for research that will enable us to ascertain the impact of the new community participation rules. Participating in the project at this time are the Center for Outcome Analysis, Woods Services, and Temple University Institute for Disabilities.

We must remain cognizant that data are weapons, and any group that is unable to analyze data is at a marked disadvantage. As illustrated above, data can be used to secure better supports for individuals who have disabilities. It must also be recognized, however, that data can sometimes be used improperly. Consider the efforts of the Pennsylvania Department of Human Services to establish provider report cards using subjective and non-contextualized data. Despite the legitimate intent of giving the general public access to valid information, the provision of sloppy data without contextual information would only have served to discredit providers. Dr. Russell Rice led the effort to make the report cards legitimate by persuading all involved in the project to limit the reported data to measures that were objective and not in need of contextual explanation. Instituting data quality control, a key component of any program evaluation effort, did result in the presentation of less data, but it also guaranteed that all reported data met basic standards.

The above example illustrates an attempt to weaponize data, and we must be candid that data are weapons. Much like the towns of the Old West that hired gunslingers like Wyatt Earp and Bat Masterson for their protection, contemporary providers need to find a way to hire (or at least obtain the services of) highly trained professionals who understand how data can be used. These data gunslingers can help ensure that data are used to assist in management decision making, and that the data being used meet standards of quality.

Answer: There are lies, damned lies, and statistics.

Works Cited

Campbell, D. (1969). Reforms as experiments. American Psychologist, 24(4), 409-429.

Conroy, J., & Bradley, V. (1987). Pennhurst Longitudinal Study. Philadelphia: Temple University; Cambridge, MA: Human Services Research Institute.

Oss, M. (2015). Just Say No. Open Minds, January 22, 2015.

Spreat, S. (2016). Is the patient dead yet? Woods Services Evaluation & Research Technical Report 16-3.

Spreat, S., Brown-McHale, K., & Walker, S. (2016). PAR 2015 Direct Support Professional Wage Study. Woods Services Evaluation & Research Technical Report 16-5.


Issue 39 | Disruptive Innovations