
Special Report
School & District Management

How to Find Evidence-Based Fixes for 69传媒 That Fall Behind

By Sarah D. Sparks – September 27, 2016 | Corrected: September 28, 2016

Corrected: A previous version of this article incorrectly spelled the name of Vivian Tseng, the vice president of the William T. Grant Foundation. In addition, the article has been changed to clarify that the Chiefs for Change group has created an ESSA working group of 15 experimentally minded state education leaders.
An earlier version of this article misidentified the organization that created the “State Guide to Evidence Use.” That resource was developed by the Florida Center for 69传媒 Research at Florida State University.

The Every Student Succeeds Act gives states and districts significant flexibility in how they turn around struggling schools, as long as the local approaches are backed by evidence. But without support, that flexibility runs the risk of putting smaller or more rural districts at a disadvantage.

“This is a sea change from the highly prescriptive approach to school improvement [under the No Child Left Behind Act] to what can seem like a bit of a Wild West structure under ESSA,” said Mike Magee, the chief executive officer of Chiefs for Change, which has created an ESSA working group of 15 experimentally minded state education leaders. “We have potentially unprecedented flexibility in how states address school improvement, but that’s just another factor in how high the stakes are.”

As states work out how to apply ESSA’s new standards of evidence, their efforts highlight the need for more research on interventions in a wider array of school contexts. The pool of high-quality research on education programs remains relatively small, sporadic, and focused on shorter-term gains for students.

Tying Things Together

While the federal What Works Clearinghouse has reviewed more than 10,000 studies on various interventions, a forthcoming meta-analysis based on the clearinghouse’s reviews found that only 29 different interventions showed significant effects, and the average effect was small, particularly when the interventions were tested in messy real-school contexts rather than in highly controlled laboratory settings.

“If you look from 10,000 feet at education interventions, you can almost count on your hand the number of interventions that have truly scaled and established” themselves, said Jerome D’Agostino, a professor of educational studies at Ohio State University, who led the study presented at the American Educational Research Association’s annual meeting in Washington this April.

“It’s not just the sheer volume of programs; there hasn’t been a wider effort to tie these [intervention evaluations] together in any way,” D’Agostino said. “It’s not like some grand designer said, ‘Do we have enough interventions in reading, in math, in different grade levels?’ It’s field-generated. ... People have been focusing on their parts of the elephant, and I’m not sure there would be a whole elephant if you brought them all together.”

ESSA lays out three levels of evidence that states can choose to apply to prove an intervention works:

• “Strong evidence” includes at least one well-designed and well-implemented experimental study, meaning a randomized controlled trial.

• “Moderate evidence” includes at least one well-designed and well-implemented quasi-experimental study. For example, a program evaluation could use a regression-discontinuity analysis, in which researchers compare outcomes for students who scored just above and just below the entrance cutoff score for a particular program or intervention (a minimal sketch of this kind of analysis follows the list).

• “Promising evidence” includes at least one well-designed and well-implemented correlational study that controls for selection bias, the potential differences between the types of students who choose to participate in a particular program and those who don’t.
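To make the regression-discontinuity example concrete, here is a minimal sketch in Python using simulated data. The cutoff, the variable names, and the assumed five-point program effect are hypothetical illustrations, not figures from any study cited in this article.

```python
# Minimal regression-discontinuity sketch (hypothetical data and names).
# 69传媒 scoring below a cutoff are assigned to a reading intervention;
# comparing students just above and just below the cutoff approximates a
# quasi-experimental estimate of the program's effect.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 2000
entrance_score = rng.uniform(0, 100, n)          # screening score
cutoff = 50                                      # hypothetical eligibility cutoff
treated = (entrance_score < cutoff).astype(int)  # below cutoff -> gets the program

# Simulated end-of-year outcome: depends on the entrance score plus a
# true +5-point program effect for treated students, plus noise.
outcome = 40 + 0.5 * entrance_score + 5 * treated + rng.normal(0, 8, n)

df = pd.DataFrame({
    "outcome": outcome,
    "treated": treated,
    "centered": entrance_score - cutoff,         # score centered at the cutoff
})

# Keep only students near the cutoff (a "bandwidth"), then fit a local
# linear model with separate slopes on each side of the cutoff. The
# coefficient on `treated` is the estimated jump at the cutoff.
near = df[df["centered"].abs() <= 10]
model = smf.ols("outcome ~ treated + centered + treated:centered", data=near).fit()
print(model.params["treated"])                   # estimated program effect
```

The same skeleton, with the treatment indicator replaced by observed participation and additional student covariates added to the formula, is roughly what a correlational study that controls for selection bias looks like in practice.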

In separate guidance, the Education Department explained that districts and states should use the most rigorous evidence available, favoring intervention studies that are not only methodologically strong but that also involve students and school types similar to those where the intervention would be used.

“The logic model before was, pick a good intervention, implement it, and you’ll get results,” said Vivian Tseng, the vice president of the William T. Grant Foundation, which has been studying research use in education. “Now, whatever you implement, there’s this idea of ongoing evaluation to see where it worked, where it did not work, and for whom. I would say the ongoing cycle of learning is needed for programs at any of these [evidence] tiers.”

Proof Requirements

Resources

As states and districts grapple with how to develop and use evidence for school improvement, several new resources are rolling out to help them:

Enhanced Find What Works

Who made it? U.S. Department of Education

When is it available? Now

What is it? The What Works Clearinghouse has evaluated more than 10,000 studies on different educational programs and interventions, but in the past, some educators have found the database difficult to use. The revamped database includes a search tool to allow educators to search for studies not just based on the topic and grade level, but also on the demographic characteristics of the students who used the intervention and whether the schools studied were urban or rural, among other things.

“One of the things we’ve heard from people is they really want information on the population and context where things were tested,” said Ruth C. Neild, the acting director of the Institute of Education Sciences, which runs the clearinghouse. IES has always collected contextual data from its study reviews, she said, but the new tool “frees a lot of the data we had but didn’t really have a way of displaying without overwhelming people.”

National Study on Research Use Among School and District Leaders

Who made it? National Center for Research in Policy and Practice

When is it available? Now

What is it? The center is producing a series of reports based on a nationally representative survey of 733 school and district leaders from 45 states and 485 districts. The group is reporting on how and when district and school leaders use evidence to make decisions and how states can provide better resources and supports to help them use research more effectively.

RCT-Yes

Who made it? U.S. Department of Education

When is it available? Now

What is it? The highest tier of evidence under the Every Student Succeeds Act includes randomized controlled trials, or RCTs, in which researchers randomly assign participants to use an intervention. In practice, RCTs can be expensive and lengthy to perform in educational settings. This free software helps districts perform small-scale experimental and quasi-experimental studies in the regular course of implementing a program.

For example, a superintendent may pilot a new math program at five of nine elementary schools and find higher math scores for students in the participating schools at the end of the year. The software could be used to help the superintendent understand whether the new math program or something else led to the student gains.
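As a rough illustration of what such a small-scale analysis involves, here is a minimal sketch in Python. It does not show the RCT-Yes interface itself; the school scores and the comparison below are invented for the example, and the analysis uses school-level means because schools, not individual students, were assigned to the pilot.

```python
# Hypothetical sketch of analyzing a small school-level pilot.
# Five of nine elementary schools used a new math program; the question is
# whether their end-of-year scores differ from the other four schools'.
import numpy as np
from scipy import stats

# Invented end-of-year mean math scores for each school.
pilot_schools = np.array([74.2, 71.8, 76.5, 73.0, 75.1])  # used the new program
other_schools = np.array([70.4, 72.1, 69.8, 71.5])        # did not

diff = pilot_schools.mean() - other_schools.mean()
t_stat, p_value = stats.ttest_ind(pilot_schools, other_schools, equal_var=False)

print(f"Difference in school means: {diff:.1f} points (p = {p_value:.3f})")
# A small p-value suggests the gap is unlikely to be chance alone, but with
# only nine schools the comparison has little statistical power, and unless
# the pilot schools were chosen at random the gap could reflect pre-existing
# differences between schools rather than the program itself.
```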

State Guide to Evidence Use

Who made it? Florida Center for 69传媒 Research at Florida State University

When is it available? Late 2016

What is it? The guide is a self-study walk-through for states to plan their own evidence standards and school improvement strategies, applying the ESSA levels of evidence. The lab plans to devise a similar guide for districts on how to apply their state evidence levels to district school improvement decisions.

“We want to give them a structure to do their planning; we’re not telling them which strategies to pick,” said John Hughes, an associate director of the Florida Center for 69传媒 Research and the deputy director of the regional lab.

Results First Clearinghouse Database

Who made it? Pew-MacArthur Results First Initiative

When is it available? Now

What is it? This search tool aggregates results from several evidence databases, including those for child-welfare, juvenile-justice, mental-health, and social-services interventions. For administrators looking for nonacademic or community-related interventions, this tool can provide a broader array of interventions.
–S.D.S.

Source: Education Week

Those levels were developed in part from the proof required under the Obama administration’s Investing in Innovation competitive grants. It’s telling that only 43 projects met the grant’s moderate-evidence bar, and fewer than 10 have so far had the strong evidence required to win i3’s top-tier grant.

The few interventions that have established strong bases of evidence and use over time, such as Success for All and 69传媒 Recovery, which D’Agostino evaluated for the federal Investing in Innovation program, created comprehensive infrastructure to implement programs in a wide variety of schools; trained and retrained staff that turned over; and sustained ongoing improvement and evaluation of the programs.

“When I meet with other [intervention programs], they are so far from even thinking and conceptualizing that need for infrastructure, I’ve come to the conclusion a lot of them will never get there,” he said.

It can take years of effort to build a strong evidence base for a program. One of those i3 grantees, the New Teacher Center, has been conducting multiple randomized controlled trials and quasi-experimental evaluations of its mentoring model since 2004, according to Ali Picucci, the center’s vice president of impact and improvement.

While the studies have produced promising results, “we know [randomized controlled trials] occur in controlled environments and are not ideal for addressing the social complexities that we find in classrooms,” Picucci said.

“Just as one size doesn’t fit all when it comes to clothes or educational initiatives, one study doesn’t fit all district and school contexts. It’s important for all of us to remember that interventions, even those backed by high-quality evidence, are beginnings and not ends,” said Ash Vasudeva, the vice president for strategic initiatives at the Carnegie Foundation for the Advancement of Teaching. “Simply selecting an evidence-based program does not ensure that similar results can be achieved in different settings and systems.”

Local Contexts Critical

In Cleveland, district officials are trying to integrate thinking about evidence into day-to-day decision making in schools.

The district has set up a website with a summary of every program available in the district and is working to provide reviews of each program’s effectiveness on key outcomes like reading and math achievement. The intervention “report cards” include high-quality external studies when they are available, but they also include the district’s own research on an intervention’s effects in schools that used it either a lot or just a little, along with feedback from principals on how easy it was to use and how well it worked for them.
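A minimal sketch of the kind of internal comparison such a report card might summarize appears below. It is not Cleveland’s actual system; the column names, usage threshold, and growth figures are hypothetical.

```python
# Hypothetical district-style comparison: reading growth for schools that
# used a program heavily versus lightly. Data and names are illustrative.
import pandas as pd

usage = pd.DataFrame({
    "school": ["A", "B", "C", "D", "E", "F"],
    "program_minutes_per_week": [150, 20, 140, 30, 160, 10],
    "reading_growth": [6.1, 2.3, 5.4, 3.0, 5.9, 1.8],  # scale-score gain
})

# Split schools into "low use" and "high use" at a chosen threshold,
# then report average reading growth for each group.
usage["usage_group"] = pd.cut(
    usage["program_minutes_per_week"],
    bins=[0, 60, 10_000],
    labels=["low use", "high use"],
)
print(usage.groupby("usage_group", observed=True)["reading_growth"].mean())
```

A comparison like this is descriptive rather than causal, which is one reason the district pairs it with external studies and principal feedback.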

“We’re evaluating programs regardless of their external body of evidence, because context matters,” said Matthew A. Linick, executive director of research and evaluation for Cleveland public schools. “As a researcher, you know the more rigorous the study is, the less generalizable it becomes. While things that work in urban districts could be helpful, Cleveland’s urban context could be very different. Just because something works in Cincinnati doesn’t mean it will work in Cleveland.”

Both the U.S. Department of Education and national groups like the Council of Chief State School Officers and Chiefs for Change have set up support networks for state officials to work together to identify evidence on which interventions will work in different school contexts. And the Institute of Education Sciences, the department’s research arm, is working to let researchers and educators search the research base for the core elements of different programs and for an intervention’s effects on specific populations.

“One of the things we’ve heard from people is they really want to see the context” of the school where an intervention was done, said Ruth C. Neild, the IES acting director. “Almost regardless of whether impacts are different for different [student] groups, ... a lot of times people need to see the intervention was done in their particular context in order to believe it.”

A version of this article appeared in the September 28, 2016 edition of Education Week as Finding Evidence-Based Fixes for 69传媒.
