Education Week - June 20, 2018 - 10
Head Start Programs Turn to Data for Problem-Solving
By Christina A. Samuels
What do you do when you build a preschool class, but many of the children never show up?
That's what happened at the
Head Start program overseen by
the Community Action Project of
Tulsa in Oklahoma, or CAP Tulsa
for short. In September 2016, 135
preschoolers, fully 10 percent of the program's Head Start population, never appeared at the start of the school year, even though their parents had enrolled them.
CAP Tulsa, as it has often done in the past, turned to data both to figure out the problem and devise a solution. And in doing so, it provided an example of how all of Head Start's 1,600 grantees are now expected to infuse data into their decisionmaking.
CAP Tulsa offers care and educational services for newborns through
preschoolers. But for 4-year-olds,
there's competition. Parents have
the option of staying with Head
Start or enrolling in preschools
offered by the Tulsa school district
or local charter schools.
Photos by Brandi Simons for Education Week
ABOVE: Lead teacher Melanie
McLaughlin gets a hug from her
student, Daleyza Gaona, 4, as
Caidyn Smith, 4, works with
"slime" in their classroom
at Early Childhood Development
Center Reed, a Head Start
program in Tulsa, Okla. The
center used statistical modeling
to reduce the number of "no-show"
students from 38 in 2016
to 11 in 2017.
Using Data to Spot Trends
To better predict the program's enrollment, Cindy Decker, CAP Tulsa's
director of research and innovation,
built a statistical model. The model found some common elements among no-shows: They had an older sibling in elementary school, suggesting parents may want their younger child in a preschool at the same building for convenience; they were new to the program that year; or they were not receiving behavioral or disability supports (children with those needs tended to stick with CAP Tulsa, Decker said).
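The article does not say what kind of model Decker built, but the pattern it describes, scoring each enrolled family on a few binary risk factors, can be sketched as a simple logistic scoring function. The weights, intercept, and field names below are invented for illustration; CAP Tulsa's actual model and coefficients are not public.

```python
import math

# Illustrative weights for the three factors the model surfaced.
WEIGHTS = {
    "older_sibling_in_elementary": 0.9,  # convenience pull toward district schools
    "new_to_program": 0.6,               # weaker attachment to CAP Tulsa
    "no_behavioral_or_disability_supports": 0.7,  # families with supports tend to stay
}
INTERCEPT = -2.2  # roughly the program's 10 percent base no-show rate

def no_show_risk(family: dict) -> float:
    """Estimated probability that an enrolled child never shows up."""
    score = INTERCEPT + sum(w for key, w in WEIGHTS.items() if family.get(key))
    return 1 / (1 + math.exp(-score))  # logistic link keeps the score in (0, 1)

# Staff could sort enrolled families by risk and check in with the
# highest-risk ones over the summer, as CAP Tulsa's staff did.
enrolled = [
    {"child": "A", "older_sibling_in_elementary": True, "new_to_program": True,
     "no_behavioral_or_disability_supports": True},
    {"child": "B"},  # none of the flagged factors
]
call_list = sorted(enrolled, key=no_show_risk, reverse=True)
```

With these made-up numbers, a family with all three factors scores a 50 percent no-show risk versus about 10 percent at baseline; the practical value of such a model is the ranking it produces, not the exact probabilities.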
Armed with that information, staff
members started asking parents over
the summer about their plans, Decker
said, paying particular attention to
families who had factors more likely
to make them no-shows. CAP Tulsa also connected with the district and with local charters to find out if the same children were popping up on their enrollment lists.
A year later, the number of no-shows dropped from 135 to 99, still a lot, Decker said, but the decrease meant less churn in the first weeks of the school year.
"And we also heard that this helped with some challenging behaviors," she added, because teachers were able to focus on instilling classroom routines rather than adjusting to new children enrolling well into October.
This is just one of many ways CAP Tulsa uses data to drive its program, Decker said. "Data helps us identify the problems that need to be fixed, and the successes we should celebrate."
Head Start programs have traditionally collected reams of information
on themselves and their participants.
But that information has often been collected to monitor compliance, not to drive program improvement.
Grantees in the field wanted to improve their use of data, said Yasmina Vinci, the executive director of the National Head Start Association, an advocacy group representing the nation's 1,600 Head Start grantees.

Four-year-old Sara Cifuentes Robbins reads a book inside an inflatable pool at EDC Reed.
The association was among the
groups that commissioned a 2016
report called "Moneyball for Head
Start." The paper drew its name from the analytical approach popularized by Billy Beane, the Oakland Athletics' then-general manager and now a team vice president. Beane used statistical analysis to put together competitive baseball teams, rather than relying solely on the intuition of baseball scouts. Head Start programs should embrace data in the same way, and should be supported by the federal government in doing so, the paper stated.
From Compliance to Performance
Later that year, Head Start released a new set of performance standards, which had last been revised in
1975. Woven throughout the document are requirements for programs
to use data in making decisions on
issues such as budgeting, teacher
coaching, and improving instruction.
"We're really excited about it," Vinci said, noting that much of the energy and movement behind the shift has come from the field itself.
The performance standards require
a shift in mindset, and Head Start is
providing technical support in a variety of ways, federal officials said. For
example, they have focused technical
assistance at the national, regional,
and local level on "practice-based
coaching," or using data to support
teacher professional development.
In addition, Head Start has offered a "data boot camp" to more than 400 Head Start staff members and technical assistance providers, aimed at boosting their abilities to use data to plan and measure progress.

Federal oversight has also zeroed in on looking at how programs use the information they capture on students, families, and program efforts. For example, when monitoring review teams visit a grantee, they ask for a "data tour," where local officials show how they collect, analyze, use, and share data.

Many programs have already demonstrated that they're effective at this work, federal officials said. Others still need more support, a process that federal officials said is "delicate and ongoing."

ABOUT THIS SERIES: How can districts move from the constant churn of new school reform initiatives to … for students in very different contexts? In this periodic series, we look at the pros, cons, and … of "continuous improvement."

The Riverside County, Calif., board of education is another example of a program that has embraced these requirements. The board provides Head Start services directly to children, as well as oversees several subcontractors, known in Head Start as "delegate agencies." In total, Riverside County serves about 3,500 children in Early Head Start and Head Start.

"When data first came on the scene with Head Start, it was something that everyone shied away from or was a little afraid of: What did they mean by this?" said Esmirna Valencia, the executive director of Riverside County's early-childhood programs. "We knew at the time that we needed to introduce data in a way that made sense to the staff."

Program managers started talking about how they already used data in their everyday work, without necessarily using the term "data-driven decisionmaking." Program leaders also hired staffers who were able to look under the hood of the data-management systems already in use, to see if they could tweak them for Riverside's own purposes.

ChildPlus, a data system used by many Head Start programs, captures dozens of data points on children and families, said Fernando Enriquez, a coordinator with the Riverside County Head Start program. ChildPlus also allows users to generate basic reports, but the creators allowed Riverside access to the guts of the database, so it could produce its own reports. Riverside linked the database to a visualization program called Tableau. "Now, it's only limited by your ability to make analytics," he said. For example, Riverside now maintains a "dynamic dashboard" of enrollment information. Managers can see at a glance which programs are full, which ones need more children to fill open spots, and how many potential students still need to have their eligibility confirmed.
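The rollup such a dashboard shows, full programs, open slots, pending eligibility checks, amounts to a small aggregation over enrollment records. This is a minimal sketch assuming an invented record layout; ChildPlus's actual schema and Riverside's Tableau setup are not public.

```python
# Hypothetical per-site capacities and enrollment records.
CAPACITY = {"Reed": 2, "Lacy": 3}

children = [
    {"site": "Reed", "status": "enrolled", "eligibility_confirmed": True},
    {"site": "Reed", "status": "enrolled", "eligibility_confirmed": True},
    {"site": "Lacy", "status": "enrolled", "eligibility_confirmed": True},
    {"site": "Lacy", "status": "applicant", "eligibility_confirmed": False},
]

def enrollment_summary(children, capacity):
    """Per-site rollup a manager would scan: how many slots are filled,
    how many remain open, and how many applicants still need their
    eligibility confirmed."""
    summary = {site: {"enrolled": 0, "open_slots": cap, "pending_eligibility": 0}
               for site, cap in capacity.items()}
    for child in children:
        site = summary[child["site"]]
        if child["status"] == "enrolled":
            site["enrolled"] += 1
            site["open_slots"] -= 1
        elif not child["eligibility_confirmed"]:
            site["pending_eligibility"] += 1
    return summary
```

With this toy data, Reed shows as full while Lacy has two open slots and one applicant awaiting an eligibility check; a visualization tool like Tableau then simply renders that rollup at a glance.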
Targeting Teacher Improvement

Another Head Start grantee, Guilford Child Development Center in North Carolina, uses data to drive teacher improvement. Guilford serves around 1,200 infants, toddlers, and preschoolers.
Federal officials use a tool called CLASS, the Classroom Assessment Scoring System, as an important part of their evaluation of Head Start programs. Programs that fall below a certain level on CLASS data and other metrics are required to recompete for their federal funding.
Guilford has its own trained CLASS
assessors on staff, who observe classrooms on a regular schedule. Federal
officials do not require their own
CLASS assessments, but seeing how
Guilford compares to other programs
in the state and nationally is essential
for focusing professional development
on the most important areas, said
Robin Sink, an educational coach specialist for the program.
But Sink noted that as a coach,
ease with analyzing numbers cannot
replace developing a connection with
the teachers she works with.
"I need to meet them and establish a
base of trust," Sink said. "The building
of a relationship is more complicated
than sharing the data."
The use of data for continuous improvement is not limited to Head
Start managers. Teachers are also
using assessments of their students to
make day-to-day decisions about how
to best support children.
In Riverside, for example, Head
Start teachers have been provided
up-to-date access to data on their
children, through a program called
Learning Genie. Teachers plug in observations and assessments, and the
program creates interactive reports
for educators and for parents.
Boris Sanchez, a Riverside Head
Start teacher, said she checks the
program daily to monitor her pupils'
progress. It guides which children she might work with individually, which ones she puts together for small-group activities, and how she will focus her lessons.
For example, if her charges are interested in learning about butterflies but are also showing they need support learning their letters, "I'm going to merge the letters with the lesson," she said.
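As a rough illustration of the grouping decision Sanchez describes, the sketch below pulls together the children whose latest assessments flag the same skill. The data format and field names are invented, not Learning Genie's actual export; the children's names come from the article's captions.

```python
# Invented assessment snapshots a teacher might review each morning.
assessments = [
    {"child": "Daleyza", "needs": {"letters"}, "interests": {"butterflies"}},
    {"child": "Caidyn", "needs": {"letters", "counting"}, "interests": {"slime"}},
    {"child": "Sara", "needs": {"counting"}, "interests": {"books"}},
]

def small_group(assessments, skill):
    """Children whose latest assessment flags the given skill, so a
    teacher can pull them into one small-group activity."""
    return [a["child"] for a in assessments if skill in a["needs"]]

letters_group = small_group(assessments, "letters")  # Daleyza and Caidyn
```

A teacher could then pair the flagged skill with the group's shared interests, merging letter practice into a butterfly lesson, much as Sanchez describes.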
Sanchez said the data efforts at continuous improvement fit with the work she was already doing.
"We all had our checklists. I'm not
afraid of data, because we were always doing it," she said.
Coverage of continuous-improvement
strategies in education is supported
in part by a grant from the Bill &
Melinda Gates Foundation at www.
gatesfoundation.org. Education Week
retains sole editorial control over the
content of this coverage.