
How Data Helped Head Start Centers Tackle a ‘No Show’ Problem

By Christina A. Samuels — June 19, 2018 | Corrected: June 20, 2018
Lead teacher Melanie McLaughlin gets a hug from her student, Daleyza Gaona, 4, as Caidyn Smith, 4, works with “slime” in their classroom at Early Childhood Development Center Reed, a Head Start program in Tulsa, Okla. The center used statistical modeling to reduce the number of “no show” students from 38 in 2016 to 11 in 2017.

Corrected: An earlier version of this article misstated the percentage of 4-year-olds who did not show up for the 2016-2017 school year at the CAP Tulsa Head Start program, despite being enrolled. The correct proportion is 20 percent.

What do you do when you build a preschool class—but many of the children never show up?

That’s what happened at the Head Start program overseen by the Community Action Project of Tulsa in Oklahoma, or CAP Tulsa for short. In September 2016, 135 preschoolers—fully 20 percent of the program’s Head Start population—never appeared at the start of the school year, even though their parents had enrolled them.

CAP Tulsa, as it has often done in the past, turned to data to diagnose the problem and devise a solution. In doing so, it provided an example of how all of Head Start’s 1,600 grantees are now expected to infuse data into their decisionmaking and continuous-improvement processes.

CAP Tulsa offers care and educational services for newborns through preschoolers. But for 4-year-olds, there’s competition. Parents have the option of staying with Head Start or enrolling in preschools offered by the Tulsa school district or local charter schools.

Using Data to Spot Trends

To better predict the program’s enrollment, Cindy Decker, CAP Tulsa’s director of research and innovation, and her team built a statistical model. The model found some common elements among no-shows: they had an older sibling in elementary school, suggesting parents may have wanted their younger child in a preschool at the same building for convenience; they were new to the program that year; or they were not receiving behavioral or disability supports. Children with those needs tended to stick with CAP Tulsa, Decker said.
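CAP Tulsa has not published the details of its model, but a minimal sketch of this kind of no-show prediction, using a logistic regression and hypothetical feature names standing in for the factors Decker’s team identified, might look like this:

```python
# A minimal sketch of a no-show prediction model. The feature names are
# hypothetical stand-ins for the factors CAP Tulsa's team identified;
# the program's actual model and data schema are not public.
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Hypothetical enrollment records from a prior year: one row per enrolled child.
records = pd.DataFrame({
    "has_older_sibling_in_elementary": [1, 0, 1, 0, 1, 0, 0, 1],
    "is_new_to_program":               [1, 0, 1, 1, 0, 0, 1, 1],
    "receives_support_services":       [0, 1, 0, 0, 1, 1, 0, 0],
    "no_show":                         [1, 0, 1, 0, 0, 0, 1, 1],  # 1 = never showed up
})

X = records.drop(columns="no_show")
y = records["no_show"]

# Fit a simple logistic regression on past enrollment outcomes.
model = LogisticRegression().fit(X, y)

# Rank families by predicted no-show risk so staff know whom to contact
# first over the summer.
records["no_show_risk"] = model.predict_proba(X)[:, 1]
print(records.sort_values("no_show_risk", ascending=False))
```

In practice, a model like this would be trained on prior years’ enrollment records, and the predicted probabilities would simply rank which families outreach staff should call first.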

Armed with that information, staff members started asking parents over the summer about their plans, Decker said, paying particular attention to families with the risk factors that made them more likely to be no-shows. CAP Tulsa also connected with the district and with local charters to find out whether the same children were popping up on their rolls.

A year later, the number of no-shows dropped from 135 to 99—still a lot, Decker said, but the decrease meant less churn in the first weeks of the school year.

“And we also heard that this helped with some challenging behaviors,” she added, because teachers were able to focus on instilling classroom routines rather than adjusting to new children enrolling well into October.

This is just one of many ways CAP Tulsa uses data to drive its program, Decker said. “Data helps us identify the problems that need to be fixed, and the successes we should celebrate,” she said.

Areli Garcia, 3, works at a center inside his classroom at ECDC Reed, a Head Start program in Tulsa, Okla.

Head Start programs have traditionally collected reams of information on themselves and their participants. But that information has often been collected to monitor compliance, not to drive program improvement or better child outcomes.

Grantees in the field wanted to improve their use of data, said Yasmina Vinci, the executive director of the National Head Start Association, an advocacy group representing the nation’s 1,600 Head Start grantees.

The association was among the groups that commissioned a 2016 report called “Moneyball for Head Start.” The paper drew its name from the analytical approach popularized by Billy Beane, the Oakland Athletics’ former general manager and current executive vice president, who used statistical analysis to put together competitive baseball teams rather than relying solely on the intuition of baseball scouts. Head Start programs should embrace data in the same way, and should be supported by the federal government in doing so, the paper stated.

From Compliance to Performance

Later that year, Head Start released a new set of performance standards, the first comprehensive revision since the standards were originally published in 1975. Woven throughout the document are requirements for programs to use data in making decisions on issues such as budgeting, teacher coaching, and improving instruction.

“We’re really excited about it,” Vinci said. “The fact that quite a little bit of the energy and movement in this has come from the field, really makes it a powerful opportunity.”

The performance standards require a shift in mindset, and Head Start is providing technical support in a variety of ways, federal officials said. For example, they have focused technical assistance at the national, regional, and local levels on “practice-based coaching,” or using data to support teacher professional development.

In addition, Head Start has offered a “data boot camp” to more than 400 Head Start staff members and technical assistance providers, aimed at boosting their abilities to use data to plan and measure program impact.

Four-year-old Sara Cifuentes Robbins reads a book inside an inflatable pool at ECDC Reed.

Federal oversight has also zeroed in on how programs use the information they capture on students, families, and their own operations. For example, when monitoring review teams visit a grantee, they ask for a “data tour,” in which local officials show how they collect, analyze, use, and share information.

Many programs have already demonstrated that they’re effective at this work, federal officials said. Others still need more support, a process that federal officials said is “delicate and ongoing.”

The Riverside County, Calif., board of education is another example of a program that has embraced these requirements. The board provides Head Start services directly to children and oversees several subcontractors, known in Head Start as “delegate agencies.” In total, Riverside County serves about 3,500 children in Early Head Start and Head Start.

“When data first came on the scene with Head Start, it was something that everyone shied away from or was a little afraid of—what did they mean by this?” said Esmirna Valencia, the executive director of Riverside County’s early-childhood programs. “We knew at the time that we needed to introduce data in a way that made sense to the staff.”

Program managers started talking about how they already used data in their everyday work, without necessarily using the term “data-driven decisionmaking.”

Program leaders also hired staffers who were able to look under the hood of the data-management systems already in use, to see if they could tweak them for Riverside’s own purposes.

ChildPlus, a data system used by many Head Start programs, captures dozens of data points on children and families, said Fernando Enriquez, a coordinator with the Riverside County Head Start program. ChildPlus allows users to generate basic reports, but its creators gave Riverside access to the guts of the database so the county could produce its own reports.

Riverside linked the database to a visualization program called Tableau. “Now, it’s only limited by your ability to make analytics,” he said. For example, Riverside now maintains a “dynamic dashboard” of enrollment information. Managers can see at a glance which programs are full, which ones need more children to fill open spots, and how many potential students still need to have their eligibility confirmed.
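Riverside has not described its pipeline in technical detail, but the kind of per-site summary such a dashboard draws on can be sketched in a few lines of pandas; the column names, sites, and funded-slot counts below are hypothetical stand-ins for the actual ChildPlus export:

```python
# A rough sketch of the enrollment summary a dashboard like Riverside's might
# draw on. Column names, sites, and funded-slot counts are hypothetical; the
# actual ChildPlus export is not public.
import pandas as pd

# Hypothetical export: one row per application or enrollment.
enrollment = pd.DataFrame({
    "site":   ["Site A", "Site A", "Site B", "Site B", "Site B", "Site C"],
    "status": ["enrolled", "enrolled", "enrolled",
               "pending_eligibility", "pending_eligibility", "enrolled"],
})

# Funded capacity per site (hypothetical).
funded_slots = pd.Series({"Site A": 2, "Site B": 4, "Site C": 3}, name="funded_slots")

# Count children by site and status, then compare against funded capacity.
summary = pd.crosstab(enrollment["site"], enrollment["status"]).join(funded_slots)
summary["open_slots"] = summary["funded_slots"] - summary["enrolled"]

# One row per site: which sites are full, which need more children, and how
# many applicants are still waiting on eligibility confirmation.
print(summary)
```

A visualization tool like Tableau would then read a summary table like this one, refreshed on a schedule, rather than querying the raw database directly.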

Targeting Teacher Improvement

Another Head Start grantee, Guilford Child Development Center in North Carolina, uses data to drive teacher improvement. Guilford serves around 1,200 infants, toddlers, and preschoolers.

Federal officials use a tool called CLASS—the Classroom Assessment Scoring System—as an important part of their evaluation of Head Start programs. Programs that fall below a certain level on CLASS data and other metrics are required to recompete for federal funding.

Guilford has its own trained CLASS assessors on staff, who observe classrooms on a regular schedule. Federal officials do not require programs to conduct their own CLASS assessments, but seeing how Guilford compares with other programs in the state and nationally is essential for focusing professional development on the most important areas, said Robin Sink, an educational coach specialist for the program.

But Sink noted that, for a coach, ease with analyzing numbers cannot replace developing a connection with the teachers she works with.

“I need to meet them and establish a base of trust,” Sink said. “The building of a relationship is more complicated than sharing the data.”

The use of data for continuous improvement is not limited to Head Start managers. Teachers are also using assessments of their students to make day-to-day decisions about how to best support children.

In Riverside, for example, Head Start teachers have access to up-to-date data on their children through a program called Learning Genie. Teachers plug in observations and assessments, and the program creates interactive reports for educators and for parents.

Boris Sanchez, a Riverside Head Start teacher, said she checks the program daily to monitor her pupils’ progress. It guides which children she might work with individually, which ones she puts together for small-group activities, and how she will focus her lesson plans.

For example, if her charges are interested in learning about butterflies but are also showing they need support learning their letters, “I’m going to merge the letters with the lesson,” she said. “We merge the technical stuff with the fun stuff.”

Sanchez said the data-focused push for continuous improvement fits with the work she was already accustomed to doing.

“We all had our checklists. I’m not afraid of data, because we were always doing it,” she said.

Coverage of continuous-improvement strategies in education is supported in part by a grant from the Bill & Melinda Gates Foundation. Education Week retains sole editorial control over the content of this coverage.
A version of this article appeared in the June 20, 2018 edition of Education Week as Head Start Programs Turn to Data for Problem-Solving
