Thu, Apr 18

Using Data to Inform an OST Continuous Quality Improvement Process

What Works & What Doesn't

Across sectors and fields, the use of data to inform decision-making is a business best practice. An Out-of-School Time (OST) system is no different. Data are invaluable in determining programmatic quality, youth outcome achievement, and overall system-level success. The OST system funded by the City of Philadelphia Department of Human Services (DHS) operates on a large scale, consisting of more than 70 distinct provider agencies operating over 200 programs and serving nearly 20,000 youth ages five through twenty-one each year. In fiscal year 2014, the OST system implemented a Continuous Quality Improvement (CQI) process. Other OST systems across the nation, such as those in New York City, Chicago, and Atlanta, operate CQI processes in varying formats (Yohalem, Devaney, Smith, & Wilson-Ahlstrom, 2012). However, the Philadelphia process is unique in its scope, scale, and framework.

The CQI process strives to ensure that youth have access to high-quality experiences by assessing individual programs, setting program-specific goals, and offering professional development and support.  The process initiated a system-wide culture shift that incorporated more uniform standards, while continuing to recognize and appreciate the uniqueness of individual program content and delivery. The CQI process also included greater emphasis on critical reflection, assessment, and the use of data to drive intentional program improvement efforts to enhance program quality.

The CQI process includes programmatic self-assessments, goal development, external evaluator site visits, and staff-administered assessment tools. Each step of the CQI process uses data collection and analysis to guide progress and program improvement for OST providers and the system as a whole. The majority of data collection occurs in a system-wide database that all OST providers access. This database enables the system to gather a variety of programmatic data related to attendance, staffing, project content, performance, youth demographics, and financial information.

To establish a baseline at the beginning of the CQI process, OST providers completed the Pennsylvania Statewide Afterschool Youth Development Network (PSAYDN) self-assessment. Providers scored their program performance related to structure and management, positive relationships, safety and health, activities, and homework assistance. Data from the completed surveys were aggregated and analyzed to identify trends, areas for improvement, and areas of success. These data findings were used to inform professional development offerings and individualized support.
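To illustrate the kind of aggregation described above, the sketch below rolls individual self-assessment scores up into system-wide domain averages. This is a hypothetical example only: the provider names, domain labels, and 1-4 scores are invented for illustration and are not drawn from the actual PSAYDN instrument or Philadelphia data.

```python
# Hypothetical sketch: aggregating provider self-assessment scores by domain
# to surface system-wide trends. All names and scores are invented examples,
# not actual PSAYDN data.
from statistics import mean

# Each provider rates its program on several domains (1 = low, 4 = high).
self_assessments = [
    {"provider": "A", "structure": 3, "relationships": 4, "safety": 4, "activities": 2},
    {"provider": "B", "structure": 2, "relationships": 3, "safety": 4, "activities": 2},
    {"provider": "C", "structure": 4, "relationships": 4, "safety": 3, "activities": 3},
]

domains = ["structure", "relationships", "safety", "activities"]

# System-wide average per domain: low averages flag candidate topics for
# professional development; high averages flag areas of success.
averages = {d: mean(p[d] for p in self_assessments) for d in domains}
weakest = min(averages, key=averages.get)

print(averages)  # per-domain system averages
print(weakest)   # domain most in need of support
```

In practice an analysis like this would run against the system-wide database rather than a hand-built list, but the logic, averaging by domain and ranking the results, is the same.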

Providers identified areas of programmatic challenge in the self-assessment and worked to identify realistic, meaningful goals that were achievable by the end of the fiscal year. The National Institute on Out-of-School Time (NIOST) identifies slow, gradual change as one of five key principles that serve as the foundation for quality improvement (After School Program Assessment Training, NIOST, 2013). Goal data were collected system-wide and analyzed to determine trends, as well as links to DHS youth outcomes. There are many examples in the system of providers who used data from the self-assessment to identify areas for improvement, set goals related to programmatic challenges, and make strides to address these issues.

In one example, a provider identified family engagement as a particular challenge during the PSAYDN self-assessment and established it as a primary goal during the goal-setting process. The program operates in a primarily Spanish-speaking community, but most staff members do not speak Spanish. Because of the family engagement difficulties identified through the CQI process, the program now offers Spanish instruction for staff. The program believes that having staff learn Spanish will help them better communicate with parents, develop relationships, improve parent engagement, and thereby enhance program quality. This program is a meaningful example of how data collected as part of the CQI process can pinpoint areas for targeted improvement and, ultimately, enhance program quality.

Another component of data collection related to program quality is the delivery of structured activities that incorporate service learning, experiential learning, or project-based learning methodologies. To enhance skill development, these inquiry-based projects generally last from three to ten weeks, depending on the age group. Projects cover a wide range of topics and seek to provide experiences that strengthen social-emotional and 21st century skills, reflecting the DHS focus on child and youth well-being. The 21st century rubric, a staff-administered assessment tool that defines levels of work quality and gradients of mastery, is one way to assess youth skill development and impact on system-wide youth outcomes. The rubric data are analyzed and used to determine whether youth participation in the structured activity positively impacted the desired youth outcomes. In previous years, data from the completed rubrics remained at the program level; this year marks the first time the rubrics have been analyzed with a system-level lens. Moving forward, providers will continue to enter rubric data for longitudinal analysis of youth skill development.
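The longitudinal rubric analysis described above might look something like the following sketch. Everything here is hypothetical: the youth identifiers, years, and 1-4 mastery scores are invented, and the actual rubric and scale used by the OST system may differ.

```python
# Hypothetical sketch: a system-level, longitudinal view of staff-administered
# rubric data. Youth IDs, years, and scores are invented for illustration.
from collections import defaultdict

# Each record: (youth_id, program_year, rubric score for a 21st century skill).
rubric_entries = [
    ("y1", 2014, 2), ("y1", 2015, 3),
    ("y2", 2014, 1), ("y2", 2015, 2),
    ("y3", 2014, 3), ("y3", 2015, 3),
]

# Group scores by year to look at skill development across the whole system
# rather than within a single program.
by_year = defaultdict(list)
for youth, year, score in rubric_entries:
    by_year[year].append(score)

yearly_avg = {year: sum(scores) / len(scores) for year, scores in sorted(by_year.items())}
print(yearly_avg)  # rising averages would suggest skill growth over time
```

As providers continue entering rubric data year over year, the same grouping extends naturally to longer time spans and to breakdowns by program or age group.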

Another longitudinal look at program quality comes from external evaluator site visits. Implementation of program activities, staff-youth relationships, and program operations are all assessed as contributors to program quality. External evaluator site visits are not new to the OST system; however, drawing connections among the data collected from the PSAYDN self-assessment, action plan documents, staff-administered rubrics, and site visits is a new approach. Many scored components of the site visit evaluation correspond to PSAYDN self-assessment questions. As the end of the fiscal year nears, the hope is to compare self-assessment scores to site visit scores and determine how accurately providers' self-assessments represent observed program quality.

In addition to the data collected through the CQI process, program performance information, such as youth attendance and retention, provides the OST system with additional measures to identify programs facing challenges and address associated needs. For example, analysis of youth attendance data may suggest that specific programs have difficulty with youth retention and recruitment, informing the focus of technical assistance support. Each step in the CQI process encourages OST providers to be intentional, thoughtful, and deliberate in their efforts to achieve program improvement goals and strive for OST system-defined youth outcomes. These data are used to make decisions about how best to support programs and encourage a process of growth and development.
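The attendance analysis above, flagging programs whose retention falls below some threshold so technical assistance can be targeted, could be sketched as follows. The program names, enrollment figures, and the 70% cutoff are all invented for illustration; the OST system's actual criteria are not specified in this article.

```python
# Hypothetical sketch: flagging programs with low retention for technical
# assistance. All figures and the 70% threshold are invented examples.
programs = [
    {"name": "Program 1", "enrolled": 100, "retained": 85},
    {"name": "Program 2", "enrolled": 80,  "retained": 48},
    {"name": "Program 3", "enrolled": 120, "retained": 90},
]

RETENTION_THRESHOLD = 0.70  # assumed cutoff, for illustration only

# Programs below the threshold become candidates for targeted
# recruitment and retention support.
flagged = [
    p["name"]
    for p in programs
    if p["retained"] / p["enrolled"] < RETENTION_THRESHOLD
]

print(flagged)
```

A rule this simple is only a starting point; in a real system the threshold would be set deliberately and combined with context such as program size, age group, and neighborhood before driving any support decisions.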

The CQI process is new for the system and continually evolving.  Over time, the hope is that each aspect of the process clearly links to all other components, allows the system to make even clearer determinations of quality, and helps improve program performance.  Additionally, the process will be more inclusive of all stakeholders and provide increased communication of data analysis findings.  The system strives to shift the overall culture to one that readily accepts and actively encourages program-level data collection to aid in system-level decision-making.  The ultimate goal of the process and culture shift is to improve individual programs, thereby strengthening system-wide quality programming that positively impacts youth and families in Philadelphia.  

References
Yohalem, N., Devaney, E., Smith, C., & Wilson-Ahlstrom, A. (2012, October). Building citywide systems for quality: A guide and case studies for afterschool leaders. The Forum for Youth Investment. Retrieved from http://www.wallacefoundation.org/knowledge-center/after-school/coordinating-after-school-resources/Documents/Building-Citywide-Systems-for-Quality-A-Guide-and-Case-Studies-for-Afterschool-Leaders.pdf