Education Week - June 20, 2018 - 6
Federal i3 Grants Yield Successes, Illuminate Challenges
Just nine clear winners
emerge from R&D effort
By Sarah D. Sparks
The federal Investing in Innovation program
produced only a handful of true breakout
educational interventions, its final evaluation shows, and the track record of the projects funded under the program highlights the
rocky path ahead for school districts nationwide working to use evidence under the Every
Student Succeeds Act.
The Institute of Education Sciences last
week released its final evaluation report on the
$1.4 billion i3 program, the only Obama-era
competitive grant to be codified into the Every
Student Succeeds Act. Of the 67 grants with
evaluations completed by last May, nine, or
13 percent, had both tight implementation and
strong positive effects.
I3's tiered evidence model, which awarded more
money to programs with a larger evidence
base, has since formed the backbone of ESSA's tiered model for evidence-based
school improvement interventions. The approach seemed
to be effective: Half of the highest-tier "scale-up" grants, which offered up to $50 million but
required the most initial evidence, produced evidence that the programs were well-implemented and had benefits for students. By
contrast, only 8 percent of the lowest-tier "development" grants, which awarded $2 million to
$5 million to interventions that seemed promising but had the least initial evidence, showed
significant positive effects.
"You look at the number and don't know if
you should be really excited or really disappointed," Goodson said. "A lot of the advances
were not particularly sexy, not earth-shattering
headlines, but the real innovation in these experiments may be less about the findings of the
programs [than about] improving the infrastructure of the field."
For example, Goodson said program developers and districts in i3 learned how to continuously evaluate and improve their interventions.
Researchers also submitted their evaluation
plans in advance, leading to less cherry-picking
of good results after the fact.
One of the successful development grants was
Building Assets, Reducing Risks, a student support program that has since built up enough
evidence to earn scale-up funding under the
Education Innovation and Research grant program, i3's successor in ESSA.
Angela Jerabek, BARR's executive director,
said the program owed its evolution and growth
to the technical assistance and ongoing evaluation in i3.
The "big lesson" for districts trying to develop
their own school improvement programs under
ESSA, Jerabek said, is that "individuals and
districts need to have the patience to try an
intervention and test it before trying to scale
it up. That's a big Achilles' heel, because ... if
your very first idea becomes the final program,
it's rare that it will have a consistent benefit for students."
I3 also highlighted problems in current, popular education research approaches. For example,
both i3 and ESSA encourage districts to use
administrative data (the day-to-day attendance,
participation, unit tests, and other information
that districts collect on their own) to study
interventions and make changes quickly. But
Goodson said administrative data didn't help
much for many programs, because some states
did not test annually in subjects such as science, and it was difficult to gauge the effects of
interventions when states or districts changed
their testing systems.
The evaluation called for better technical assistance for those trying to develop interventions.
"I do think if you really care about generating evidence, it takes a consistent message, a
clear framework ... and some kind of resources
for districts to go to when they hit problems,"
Goodson said. "Nobody's going to pay for Abt to
give individual assistance to every district" trying to develop an intervention.
For example, BARR's initial model, developed around a year of teacher training on using
data to support student transitions, evolved
over time into a three-year professional development program that today includes ongoing
professional learning networks in more than 56
school districts nationwide.
"One of the challenges with the i3 program is
that the message goes out that you are just trying
to identify what works-find more of what works
and less of what doesn't," said Vivian Tseng, the
senior vice president of programs for the William
T. Grant Foundation, which supports efforts to
improve research use in education. "But we need
to be able to glean lessons learned, so that people
who are trying to do these Tier 4 [interventions
under ESSA] can build them to yield useful lessons not just about did it work or not work, but
how to improve the next time around."
The final evaluation of the i3 program
showed the greatest success rates were
at the "scale-up" level, which sets the
highest initial bar for evidence. Half
of the programs in that category were
later found to have positive effects for
students. These charts include effects
but not implementation.
TIERS OF EVIDENCE
In a final evaluation of the
Investing in Innovation grant
program, the research firm Abt
Associates found that, of the
67 projects that had completed
an evaluation by last May, only
nine could demonstrate that
they had both implemented their
programs faithfully and produced
academic benefits for students.
The breakthrough programs were
found at all three tiers of the
federal grant program, from the
largest scale-up grants, to the
mid-sized validation grants, to
the small development grants
geared to interventions that had
never been rigorously tested.
The successful programs are:
* Knowledge Is Power Program charter
* Reading Recovery
* Children's Literacy Initiative
* College-Ready Writers Program
* Enhancing Missouri's Instructional
Networked Teaching Strategies
* StartSmart K-3 Plus Program
"This report shows how difficult it is to do
good work," said Patrick Lester, the director of
the Social Innovation Research Center, who has
conducted separate studies of the i3 program.
"There's a certain issue of competence here,"
Lester said, noting that the evaluations found
successful i3 programs, including the Knowledge Is Power Program charter schools and the
Reading Recovery tutoring program, had strong
plans for developing evidence, monitoring
implementation across schools, and gathering
data that would be needed for evaluating the
program later. "Those who are successful are
often well-funded, have top-notch people,
generally have their stuff together across the
board," he said.
ESSA does not prescribe specific interventions for schools in need of improvement, but it
does call for districts to use interventions that
have evidence of being effective with the kinds
of students they want to help. If there are no
available research-based interventions, districts
are allowed to develop, test, and evaluate their
own. But in contrast to programs developing
and testing interventions in the i3 grant, Lester said, districts working to develop their own
school improvement interventions often have
little technical support either in developing programs or collecting the data needed to evaluate them.
"I'm very worried about ESSA; ... I do not expect most schools, regardless of how much they
say they're focused on evidence, to build interventions that have any effect at all," he said.
The i3 program's 13 percent success rate
seems small, but was not surprising for programs of this kind, according to Barbara Goodson, a co-author of the study and a principal
scientist at the research firm Abt Associates.
One report by the Coalition for Evidence-Based
Policy found about 12 percent of rigorously
evaluated education interventions show positive results. And during i3's creation, the White
House cited a success rate of about 10 percent
for business research and development projects.
* Building Assets, Reducing Risks
* Expository Reading and Writing Course
* Spheres of Proud Achievement in
Reading for Kids (SPARK)
SOURCE: i3 Summary of 67 Evaluations
The grant also produced tools and
checklists to guide educators or
researchers trying to develop and test
new interventions, available at
SOURCE: Education Week