School & District Management Opinion

Why Evidence-Backed Programs Might Fall Short in Your School (And What To Do About It)

How closely a program's implementation matches its plan is important, though perhaps not as important as you think
By Heather C. Hill | May 25, 2021 | 5 min read

Editor's Note: This is part of a continuing series on the practical takeaways from research.

As funds from the American Rescue Plan start to arrive in schools and districts, many educators will find themselves tasked with quickly implementing new programs to meet the needs of students with pandemic-related unfinished learning. These programs include high-dosage tutoring, summer learning experiences, and school year curricula with linked professional development.

Many schools will choose such programs based on their evidence of success, perhaps using academic reviews or websites, like the one maintained by the late Robert Slavin, that report specific programs' effects on student achievement. Yet most school leaders also know that implementing a program does not automatically yield the promised bump in student achievement.

It's a conundrum: Why do research syntheses find large effects on student learning while schools so often fail to realize those gains?

One reason is that the effect sizes promised by meta-analyses (studies that statistically combine the results of multiple evaluations) can be inflated: Small pilot programs are likely to make up the bulk of studies in meta-analyses, and they tend to yield larger effect sizes, which are then not replicated under real-world conditions. Another reason is that implementation, the degree to which a program or practice is carried out with fidelity, can sharply affect program outcomes.
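To make the arithmetic behind that first point concrete, here is a minimal, hypothetical sketch in Python. The studies, effect sizes, and sample sizes are invented for illustration; they are not drawn from this article or from any real meta-analysis. The sketch simply shows how averaging effect sizes as if every study counted equally lets a handful of small pilots dominate, while weighting by sample size gives a figure closer to what large-scale, real-world evaluations tend to find.

# Hypothetical illustration: why a simple average of effect sizes can
# overstate a program's likely impact when small pilot studies dominate.
# All numbers below are invented for this example.

# Each tuple is (effect size in standard deviations, number of students).
studies = [
    (0.40, 120),   # small, developer-run pilot
    (0.35, 150),   # small pilot
    (0.45, 100),   # small pilot
    (0.08, 4000),  # large-scale, real-world evaluation
    (0.05, 5000),  # large-scale, real-world evaluation
]

# Unweighted average: every study counts equally, so the pilots dominate.
unweighted = sum(es for es, _ in studies) / len(studies)

# Sample-size-weighted average: bigger, more realistic studies count more.
total_students = sum(n for _, n in studies)
weighted = sum(es * n for es, n in studies) / total_students

print(f"Unweighted mean effect size: {unweighted:.2f}")  # about 0.27
print(f"Weighted mean effect size:   {weighted:.2f}")    # about 0.08

The specific numbers do not matter; the pattern does. When small pilots make up the bulk of a meta-analysis, the headline effect size can be far larger than what a school or district should expect at scale.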

With Anna Erickson, a researcher at the University of Michigan, I reviewed 72 studies of math, science, English/language arts, and social-emotional-learning programs, comparing the extent of program implementation to the likelihood of positive student outcomes. As expected, when schools did not adopt or maintain program elements, evaluations saw few positive impacts on students. This echoes findings from studies of individual programs, where positive effects often occur only in schools and classrooms with higher levels of fidelity.

However, in good news for schools, our analysis found that both moderate- and high-fidelity implementation tended to produce positive effects of similar size. Perfect need not be the enemy of the good when it comes to the implementation of new programs.

But low fidelity doesn't cut it, so many district and school administrators will still want to work toward better implementation. But how? Qualitative studies of implementation failure suggest addressing four root causes:

  • Will: literally, whether teachers decide to embrace new practices or materials or to shelve them. It's worth noting that in some cases, teachers' decisions to forgo new materials make sense in light of their students' needs.
  • Skill: whether teachers have the knowledge and expertise to implement a given program. For instance, case studies suggest STEM teachers who lack strong content knowledge may provide disorganized and error-filled instruction when using science, technology, engineering, or math curriculum materials that focus on concepts and build in student inquiry.
  • Organizational capacity: whether an organization has in place tools, routines, and relationships that enable implementation. Capacity encompasses a range of factors, from the quality of connections between central-office staff and principals to the ability to overcome seemingly simple logistics challenges, like distributing curriculum materials to students.
  • Contexts and coherence: whether the program will work given its particulars and the local setting's needs, strengths, and weaknesses. This includes whether the program aligns with existing instructional guidance as represented in school priorities, pacing guides, and tested content, and also whether the program reflects students' interests, strengths, and culture.

While most such studies point to one of these four categories as the culprit, research on how to proactively address them is less well developed. Still, several studies suggest ways school leaders can plan for a successful implementation.

School leadership is key to increasing teachers' willingness to take up new programs and practices. When principals and instructional leaders signal commitment to a program and follow up with set-asides of resources and time to learn about the program, teachers are more likely to follow their lead.

Allowing teachers to help shape program adoption can also increase their commitment to the program and potentially help avoid a bad fit between the program and local context. Program "touch points" with teachers after initial program training (for instance, a brief workshop a few weeks after implementation begins) can also boost teacher will, program fidelity, and student performance.

Teacher skill in using a new program can be enhanced by concrete lesson materials: curricula, routines, or assessments. Professional development and coaching on practices specific to the new program also seem to build teacher skill and interest.

Enhancing organizational capacity to support implementation may be a heavier lift. Ideally, districts thinking about adopting a program or practice would assess their "readiness": leader and teacher commitment to the program, earmarked resources for implementation, and the extent of conflict with existing programs or other instructional guidance. District leaders would either resolve these issues or find alternative programs. But little empirical evidence exists for how to do so.

Addressing context and coherence can also be tricky. Adapting programs to better fit local contexts is a promising practice but with a caveat: Teachers must implement with fidelity before adapting the program. James Kim of the Harvard Graduate School of Education and colleagues found such a result in a randomized study. They assigned a group of teachers first to a year-long, fidelity-oriented implementation of a reading program, then to a year of adaptation. During the adaptation year, teachers did things like better matching books to students' reading level and adding a parent-outreach component. Student outcomes for this adaptation group outstripped those of both teachers who continued to implement the program with fidelity and those who adapted the program without first implementing with fidelity.

Finally, districts must take steps to protect the program following initial implementation, including continued professional development and mentoring in the program for new teachers. With average teacher turnover around 20 percent per year, getting new teachers up to speed on a particular program can both boost overall implementation and keep veteran teachers engaged.

Protecting the program also means preventing it from being "crowded out" by other initiatives. Too often, teachers report their work to learn a particular math or ELA program comes undone when their school or district adopts a conflicting classroom-observation rubric, pacing guide, or even a competing curriculum.

A program vetted by evidence is important for results, but so is implementation. With carefully crafted implementation plans, schools should see a greater return on their American Rescue Plan investments.
