Many children do not have access to high-quality early childhood education (ECE) despite wide consensus among researchers that high-quality ECE strengthens academic, social, and health outcomes for children and families. In 2014, the U.S. Departments of Education and Health and Human Services awarded a $48M Preschool Development Grant (PDG) to Connecticut to expand access to high-quality preschool programs for four-year-olds from low-income families in 13 Connecticut communities, including Bridgeport, one of PEER’s partner communities. The four-year grant funded PDG classrooms from the 2015-2016 through the 2018-2019 school years; the first cohort of children that enrolled in PDG classrooms is now in fourth grade.
The Connecticut Office of Early Childhood (OEC) has remained committed to understanding how the PDG program impacted Connecticut children. At the start of the grant, the OEC engaged the University of Connecticut (UCONN) Neag School of Education to lead an evaluation of the PDG program that was focused on classroom quality and children’s academic and behavioral outcomes during the four-year-old preschool year. In collaboration with the State Department of Education (SDE), the OEC engaged PEER in 2018 to lead a complementary study focused on the association between PDG participation and kindergarten outcomes.
Both evaluation studies faced a challenge routinely encountered in research on program effectiveness: how to compare participants’ outcomes to what those outcomes would have been had they not been in the program. Since it’s impossible to observe the same children’s outcomes both with and without program participation, researchers typically look for ways to compare the outcomes of participating children to those of similar children who did not participate.
Randomly assigning individuals to an intervention group or a comparison group is considered the gold standard for creating two equivalent groups to compare. But since children were not randomly assigned to PDG classrooms, PEER needed an alternative approach.
PEER decided to use a method called propensity score matching, in which program participants were matched to non-participants based on scores representing each individual’s likelihood, or propensity, to enroll in a PDG program. In this case, PEER used individual demographic and community characteristics to calculate propensity scores, and then used these scores to build a data set that paired each PDG child with a similar non-PDG child. The resulting data set allowed PEER to compare outcomes for PDG children and similar non-PDG children.
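For readers curious about the mechanics, here is a minimal sketch in Python of one common way to implement 1:1 propensity score matching: a logistic regression estimates each child's propensity score, and each participant is then paired with the non-participant whose score is closest. The column names (a binary treatment flag and a list of covariates) are illustrative assumptions, not PEER's actual data model or pipeline.

```python
# A minimal sketch of 1:1 nearest-neighbor propensity score matching.
# Assumes a pandas DataFrame with a binary treatment column and
# numeric covariates; names here are hypothetical, not PEER's schema.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

def match_on_propensity(df: pd.DataFrame, treatment: str, covariates: list) -> pd.DataFrame:
    """Pair each treated child with the untreated child whose
    propensity score is closest (1:1 nearest-neighbor matching)."""
    # Step 1: model the probability of participation given covariates.
    model = LogisticRegression(max_iter=1000)
    model.fit(df[covariates], df[treatment])
    df = df.assign(pscore=model.predict_proba(df[covariates])[:, 1])

    treated = df[df[treatment] == 1]
    control = df[df[treatment] == 0]

    # Step 2: for each treated child, find the closest control score.
    nn = NearestNeighbors(n_neighbors=1)
    nn.fit(control[["pscore"]])
    _, idx = nn.kneighbors(treated[["pscore"]])
    matched_controls = control.iloc[idx.ravel()]

    # Step 3: stack treated children with their matched controls.
    return pd.concat([treated, matched_controls])
```

In practice, analysts typically also check covariate balance between the matched groups and may match with calipers or without replacement; this sketch shows only the core idea.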
In an ideal scenario for this method, PEER would have had access to data on a broad group of children, including children who were similar to those in PDG programs but were not enrolled in a similar preschool program. Here is where this evaluation faced a limitation that arises in many states: early childhood education data are available only for publicly-funded programs.
While data are available beginning in kindergarten for almost all children, preschool enrollment data are available only for children whose participation is supported by public funding. No data are available for other children, who may have attended tuition-based preschool programs or may not have attended any preschool program at all. Therefore, although the goal of the PDG program was to expand access to high-quality early childhood education for children who otherwise had limited access, this study could only compare PDG children to children in other publicly-funded programs that meet many of the same quality standards. In addition, missing data on PDG children made it challenging to compare even these groups.
These unfortunate realities of data availability substantially hamper the ability to understand how PDG participation affected children who would not otherwise have attended high-quality preschool programs. So it was not entirely surprising that when PEER compared PDG children to similar children who attended other publicly-funded programs, there were no statistically significant differences in Kindergarten Entrance Inventory scores, rates of absenteeism in kindergarten, or on-time promotion to grade 1. In other words, outcomes were comparable for PDG children and similar children who attended other publicly-funded programs. Importantly, the data limitations described above mean that this study cannot tell us anything about the outcomes of PDG children compared to children with no preschool experience.
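To make the outcome comparison concrete, here is a hedged sketch of how a binary kindergarten outcome, such as on-time promotion, might be compared between matched groups using a two-proportion z-test. The column names are hypothetical, and PEER's actual analysis may have used different tests or outcome definitions.

```python
# Sketch: compare a binary outcome (e.g., on-time promotion) between
# matched PDG and non-PDG children with a two-proportion z-test.
# Column names ("pdg", "promoted") are illustrative assumptions.
from statsmodels.stats.proportion import proportions_ztest

def compare_binary_outcome(matched, treatment="pdg", outcome="promoted"):
    treated = matched[matched[treatment] == 1][outcome]
    control = matched[matched[treatment] == 0][outcome]
    # Test whether the promotion rates differ between the two groups.
    stat, pvalue = proportions_ztest(
        count=[treated.sum(), control.sum()],
        nobs=[len(treated), len(control)],
    )
    return stat, pvalue
```

A large p-value here would correspond to the study's finding of no statistically significant difference between the matched groups, though, as noted above, that says nothing about children with no preschool experience.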
So what did PEER’s PDG evaluation study ultimately teach us? The project illuminated key lessons on how to strengthen ECE data systems, ensure data quality, and improve future evaluations, for example, by collecting data on program implementation. The project also highlighted the importance of expanding research and evaluation capacity among state agencies, including improving the ability to link data across agencies.
The good news is that the state is already engaged in several initiatives aligned with these recommendations. In 2018, the Connecticut General Assembly enacted a biennial process for creating a State Data Plan focused on the appropriate and efficient management, sharing, and use of data across state agencies and by research partners, community partners, and the general public; the OEC has been an active partner in this process. Earlier this year, the state received a Statewide Longitudinal Data Systems (SLDS) grant from the U.S. Department of Education’s Institute of Education Sciences. This grant focuses on expanding the Preschool through 20 and Workforce Information Network (P20 WIN), which defines and manages the state’s process for linking de-identified data from different agencies for the evaluation of education, workforce, and supportive services while protecting individual privacy.

As a member of P20 WIN, the OEC is actively leading research in partnership with SDE, thanks to the support of the SLDS grant. In addition, the OEC has allocated resources from PDG Birth through 5 grants to enhance its data systems in support of improved data quality and data integration. These efforts show promise for future evaluations of program effectiveness that rely on linking high-quality data across different agencies. PEER looks forward to continued partnership with the OEC and SDE as these initiatives strengthen state data systems and build capacity for collaborative research.