
School & District Management Opinion

We Must Raise the Bar for Evidence in Education

By Carly Robinson & Todd Rogers — October 30, 2019 5 min read

Those looking for what works in education will find no shortage of advice. Educators, hoping to improve student outcomes, eagerly embrace recommendations telling them to “Cater to each child’s learning style!” and “Give students awards for positive behaviors!” But many such intuitive, popular “best practices” may not, in fact, be what is best for students, even though their proponents stamp them “evidence based.”

Educators who prioritize evidence-based practices are fighting an uphill battle; the standards of proof for what constitutes “evidence” in schools—and education more widely—are often exceedingly low. For instance, the popular notions that we should be teaching to students’ learning styles or giving students awards for good attendance were both rooted in observational evidence. Both practices have since been debunked, yet over 75 percent of educators still endorse learning styles, and many schools say they use awards to recognize excellent student attendance.

We should not be surprised when people have incorrect notions about what research says works—education research is littered with studies on a range of practical topics that either fail to replicate previous findings or report massively inflated effects.

Educational policymakers and practitioners need to understand how study designs and research practices influence the reproducibility and credibility of a study’s findings. This is easier said than done, but there are a couple of initial indicators that suggest a research finding is “real” and worth implementing.

First, to disentangle whether a practice causes improvement or is merely associated with it, we need research methods that can reliably identify causal relationships. And the best way to determine whether a practice causes an outcome is to conduct a randomized controlled trial (or “RCT,” in which participants are randomly assigned either to be exposed to the practice under study or not).


Second, policymakers and practitioners evaluating research studies should have more confidence in studies where the same findings have been observed multiple times in different settings with large samples. Many educational practices are based on single research studies with small sample sizes. We can learn from small, one-off studies. But when it comes to adopting practices, we recommend those that have been evaluated in studies with large sample sizes and reproducible results.

Finally, we can have much more faith in a study’s findings when they are preregistered. That is, researchers publicly post exactly what their hypotheses are and exactly how they will evaluate each one before they have examined their data. This limits flexible research practices, making it less likely that researchers will find statistically significant results by chance.

Some will lament that large RCTs are too expensive, slow, or difficult to implement, and that preregistering studies is not feasible because educational research is messy and unpredictable. Yet several studies conducted in the past few years show that large-scale, reproducible, preregistered RCTs of changes in educational practice are possible and can inform work on the ground.

In our own work, we have spent the last six years studying how to reduce student absenteeism. Through two large-scale, preregistered RCTs (one with more than 28,000 K-12 students and another with almost 11,000 K-5 students), our research team found that sending parents mailings several times over the course of the school year with personalized attendance information that dynamically targets key parental misbeliefs consistently reduces chronic absenteeism by 10 percent to 15 percent. This research led to the creation of a program that partners with districts around the country to help them reduce student absenteeism by implementing this research-backed intervention.

Another practice that educators can—and, strong evidence suggests, should—take up comes from a study conducted by Peter Bergman and Eric Chan. They randomly assigned parents of more than 1,000 middle and high school students in 22 schools to receive automated, frequent information via text message about their child’s missed assignments and grades. The study, which has now been replicated multiple times, found that this strategy led to a 28 percent reduction in course failures, a 12 percent increase in specific class attendance, and increased student retention by 1.5 percentage points. Providing parents with information that helps them monitor their child’s academic progress can have meaningful impacts on student success.

Finally, a study led by David Yeager explored for whom growth mindset interventions are most effective. In a nationally representative sample of more than 12,000 students, the study found that adolescents assigned to complete a “growth mindset” intervention—which taught that intellectual abilities can be developed—earned higher GPAs (a modest but real 0.05 grade points) in core classes at the end of 9th grade. The authors preregistered their prediction that the intervention would be most effective for lower-achieving students. Consistent with this, they found larger effects among low-achieving students, whose GPAs in core classes rose by 0.10 grade points at the end of 9th grade and who were 11 percentage points less likely to earn a D or F average in one of those classes. This large-scale, preregistered study provides evidence that mindset interventions can improve outcomes for struggling students.

Each of these relatively low-cost, easy-to-implement interventions has a modest but real impact on student outcomes. Holding educational research to higher standards of evidence will very likely mean that reported effect sizes are smaller. But they will reflect reality.

Expectations about how much impact interventions tend to have need to be massively recalibrated, since more-rigorous research with larger sample sizes tends to find smaller effect sizes. If an educational intervention’s outcomes seem too good to be true, they probably are. (A recent paper by Matt Kraft of Brown University provides a helpful overview of how we might think about effect sizes in education.)

There do not appear to be single “silver bullets” that will easily equalize and accelerate educational outcomes. The reality is that educational gains will come from a combination of many well-supported, evidence-based practices and communities of caring adults helping kids.

We hope educational policymakers and practitioners will start proactively looking for practices that meet these standards of evidence and work to widely implement them. By doing so, we can move steadily toward greater educational success for all students.

A version of this article appeared in the October 30, 2019 edition of Education Week as Raising the Bar for Evidence in Education
