The education sector faces an intimidating paradox.
On the one hand, school districts/local education agencies (LEAs) and state Departments of Education (DOEs) are operating in an increasingly complex and demanding environment. In a globalized economy, society’s expectation is that schools educate all students to considerably higher levels than required in the past.
On the other hand, educational entities are limited by significantly reduced revenue and seemingly unsustainable debt. To make matters worse, public faith in LEAs/DOEs is eroding, meaning that educational leaders have to do more with less, while simultaneously convincing constituents of the value of their investment.
Over the past several years, my organization, ImpactED,1 based out of the University of Pennsylvania, has worked with LEAs and DOEs across the country wrestling with these very real challenges. Below are three lessons we’ve learned about how the public sector can better use data to maximize impact.
Lesson #1: Strategy matters.
There’s a lot of talk about data collection methods and analytic procedures: What is the best mode for administering surveys to get a strong response rate? Should we be conducting longitudinal analyses? How do we isolate the impact of a policy? While these questions are important, until organizations have developed a clearly articulated strategy and culture around data use, the answers won’t have much of an impact.
Public sector leaders should start any data collection effort by creating a program/policy logic model and ensuring that the data they’re collecting aligns to the indicators that matter most. Program logic models depict the theory and assumptions underlying a program, policy, or strategy by linking outcomes (both short- and long-term) with activities and processes.2 But being strategic isn’t enough. To invest stakeholders in the use of data, public sector leaders must make any data they collect on public programs both accessible and transparent to the broader public.
⊕ Highlight: Delaware Department of Education
The Delaware Department of Education's Teacher & Leader Effectiveness Unit (TLEU) has built in-house capacity for data collection and use through a partnership with the Harvard Strategic Data Project (http://sdp.cepr.harvard.edu/about), which places data strategists at local, state, and national education systems. The TLEU staff publish regular data briefs called "The Set" (http://www.doe.k12.de.us/Domain/354) to inform the public on progress towards the goal of ensuring all of Delaware's students have access to excellent teachers and leaders.
Lesson #2: Evaluation shouldn’t just be external.
Public sector entities often contract with external evaluators to evaluate policy or program implementation and impact. This level of independence is important for maintaining objectivity and bringing a rigorous third-party perspective to assess the impact of public investment. Indeed, evaluators can present results and raise questions that challenge public entities in ways that can’t be done from the inside. However, while valuable, evaluation shouldn’t just be external.
Innovative LEAs/DOEs should provide opportunities for stakeholders to share their perspectives at multiple phases of the process.
- Design. During the design phase, public sector leaders should seek input from various stakeholders on the approach to data collection. For example, LEAs could host focus groups with teachers to inform the questions asked on surveys, or they could invite teachers to pilot survey instruments and provide feedback on them. Not only would this invest stakeholders in the data collection process, but it would also improve the validity of the instruments themselves. The stakeholders closest to the work (like teachers) often understand best how to ask questions that elicit the desired information.
- Implementation. During the implementation phase, public sector leaders should gather real-time formative feedback from stakeholders. For example, LEAs could conduct formative surveys of policy implementation over the course of the year and schedule quarterly meetings to review results and discuss implications for ongoing professional development or communication efforts. This would send the message that stakeholder input matters while simultaneously increasing the effectiveness of policy implementation.
- Evaluation. At the end of the year or program/policy cycle, public sector leaders should involve stakeholders in interpreting the results of evaluation. For example, LEAs could identify teacher representatives and invite them to be a part of a policy advisory group. This group could engage in strategic planning efforts, where they reflect on evaluation results and develop initiatives to address issues for the upcoming year. To maximize the benefits of evaluation, it is critical that stakeholders have a voice in how data is used to inform improvements in policy implementation.
⊕ Highlight: Aldine Independent School District
The Aldine Independent School District (AISD) has made a unique commitment to incorporating stakeholder voice into policy implementation. Teachers from each school are nominated to serve on the Vertical Educational Advisory Committee (VEAC), which meets regularly with district leadership to share teachers' perspectives on a variety of issues. For the teacher evaluation initiative, INVEST, Aldine ISD has INVEST specialists, who serve as liaisons between district leadership and teachers, gathering input on policy implementation at the school level.
Lesson #3: Data can be a powerful communication tool.
Evaluation reports contain a wealth of information about the effectiveness of policy and program implementation. However, they don’t generally express complex ideas with clarity or precision. In recent years, there’s been a growing focus on data visualization, which highlights the significance of data by placing it in a visual context. Data visualization software3 can help display trends and relationships in ways that go unnoticed in heavy text-based descriptions. While visualization can be compelling, it needs to be accompanied by powerful storytelling. Indeed, research has shown that 63% of people can remember stories, but only 5% can remember a single statistic.4
Public sector leaders should not look at evaluation as separate from their communication efforts. Instead, they should think about how to use research and evaluation to effectively tell their story of change. Combining evaluation and communication efforts will help create a powerful data story, which can be used to drive improvement.
⊕ Highlight: School District of Philadelphia
Several years ago, the School District of Philadelphia launched the School Redesign Initiative, which invited educators and stakeholders to submit a "school redesign" proposal drawing on innovative, research-based practices. The results of the Year 1 implementation evaluation were presented in text-based form, but this more traditional report was also supplemented with a video telling the story of change: http://www.schoolredesignphiladelphia.org/about.html
In sum, public sector leaders should build a culture of strategic data use. As they face intensified demands and decreased budgets, it will become increasingly important to develop a strategy for data collection aligned to key outcomes, involve key stakeholders in informing progress, and share the story of impact with the broader community.
Claire Robertson-Kraft, Ph.D., is the Director of ImpactED and a professor at the University of Pennsylvania. She has over ten years of experience working as a teacher, evaluator, and nonprofit leader in the Philadelphia region. She has extensive experience teaching research methods and conducting evaluations in the areas of education, public policy, positive psychology, social impact, and community engagement.
1. For more information on the examples featured in this article, visit www.impactedphl.com.
2. For more information on how to create a logic model, view this resource: http://www.smartgivers.org/uploads/logicmodelguidepdf.pdf.
3. Nishith Sharma, "The 14 best data visualization tools," The Next Web (April 21, 2015), accessed November 12, 2016, http://thenextweb.com/dd/2015/04/21/the-14-best-data-visualization-tools/.
4. Learn more about Chip Heath’s research: https://mannerofspeaking.org/2009/10/13/making-it-stick-tell-stories/.