Opinion

Does Studying Student Data Really Raise Test Scores?

Researchers say the popular practice is largely ineffective
By Heather C. Hill | February 07, 2020 | 5 min read

Question: What activity is done by most teachers in the United States, but has almost no evidence of effectiveness in raising student test scores?

Answer: Analyzing student assessment data.

This practice arose from a simple logic: To improve student outcomes, teachers should study students' prior test performance, learn what students struggle with, and then adjust the curriculum or offer students remediation where necessary. Addressing the weaknesses revealed by the test results would, in turn, raise overall student achievement.

Yet understanding students' weaknesses is only useful if it changes practice. And, to date, evidence suggests that it does not change practice or improve student outcomes. Focusing on the problem has likely distracted us from focusing on the solution.

With the birth of large-scale state assessments and widening data availability in the 1990s, school leaders and teachers could access information on student performance that was common across schools and classrooms. Many schools also instituted standardized "interim" assessments, claiming that this periodic, low-stakes testing could help teachers identify difficult content and struggling students before the state assessment, giving both teachers and students a chance to catch up. Over time, educational testing and data companies including the Achievement Network, NWEA, and McGraw-Hill's Acuity began to sell interim assessments to schools and states, making such assessments (and their cousins, test-item banks that support formative assessment) a billion-dollar business.

Currently, many teachers report that they regularly get together to analyze student assessment results. In a 2016 survey by Harvard's Center for Education Policy Research, 94 percent of a nationally representative sample of middle school math teachers reported that they analyzed student performance on tests in the prior year, and 15 percent said they spent over 40 hours that year engaged in this activity. Case-study research suggests that in many Title I schools, this activity is a cornerstone of teachers' weekly or monthly collaborative time.

About this series

This essay is the second in a series that aims to put the pieces of research together so that education decisionmakers can evaluate which policies and practices to implement.

The conveners of this project (Susanna Loeb, the director of Brown University's Annenberg Institute for School Reform, and Harvard education professor Heather Hill) have received grant support from the Annenberg Institute for this series.

But here's the rub: Rigorous empirical research doesn't support this practice. In the past two decades, researchers have tested 10 different data-study programs in hundreds of schools for impacts on student outcomes in math, English/language arts, and sometimes science. Of 23 student outcomes examined by these studies, only three were statistically significant. Of these three, two were positive and one was negative. In the other 20 cases, analyses suggest no beneficial impacts on students. Thus, on average, the practice seems not to improve student performance.

One critical question, of course, is: Why?

Observational studies suggest that teachers do, in fact, use interim assessments to pick out content that they need to return to. For instance, in a study published in 2009, Margaret E. Goertz and colleagues at the University of Pennsylvania observed teachers planning to revisit math topics using a combination of whole-group and small-group instruction.

But Goertz and colleagues also observed that rather than dig into student misunderstandings, teachers often proposed non-mathematical reasons for students' failure, then moved on. In other words, the teachers mostly didn't seem to use student test-score data to deepen their understanding of how students learn, to think about what drives student misconceptions, or to modify instructional techniques.

My own recent experiences visiting schools imply this trend continues. Field notes from teacher data-team meetings suggest a heavy focus on "watch list" students: those predicted to barely pass or to fail the annual state reading assessment. Teachers reported on each student, celebrating learning gains or giving reasons for poor performance, such as a bad week at home, students' failure to study, or poor test-taking skills. Occasionally, other teachers chimed in with advice about how to help a student over a reading trouble spot, for instance, helping students develop reading fluency by breaking down words or sorting words by long or short vowel sounds. But this focus on instruction proved fleeting, more about suggesting short-term tasks or activities than improving instruction as a whole.

Common goals for improving reading instruction, such as how to ask more complex questions or encourage students to use more evidence in their explanations, did not surface in these meetings. Rather, teachers focused on students' progress or lack of it. That could result in extra attention for a watch-list student, to the individual student's benefit, but it was unlikely to improve instruction or boost learning for the class as a whole.

In reviewing the research on teachers analyzing student data, I came across a small number of programs that included interim assessment as one part of a larger instructional package. While I excluded these studies from the formal review I undertook for this essay, they are notable nonetheless. One, by Janke M. Faber and colleagues in the Netherlands, focused on a program that not only contained computer-based interim assessments but also provided both instructionally focused feedback to teachers and students and personalized online student assignments. Another study, led by Jonathan A. Supovitz and colleagues at the University of Pennsylvania, examined the Ongoing Assessment Project, a program that helps teachers create assessments and examine the results, then combines this practice with professional development focused on mathematics content and student thinking about that content. Both of these studies saw positive impacts, suggesting that the analysis of data can, when combined with strong supports for improved teaching, shift student outcomes. But the small number of programs that combine the study of data with wider instructional supports limits our ability to draw real conclusions.

In total, the research in this area suggests that district and school leaders should rethink their use of state and interim assessments as the focus of teacher collaboration. Administrators may still benefit from analyzing student assessment results to know where to strengthen the curriculum or to provide teacher professional learning. But the fact remains that having teachers themselves examine test-score data has yet to be proven productive, even after many trials of such programs.

For many schools, this news is disheartening. Retooling teacher collaborative time will be a major shift, and that's assuming that schools can first identify more effective ways to help teachers improve their instruction. In our next column, we'll cover possible replacement activities.

Editor's Note: This is part of a continuing series on the practical takeaways from research.
A version of this article appeared in the February 12, 2020 edition of Education Week as "Does Studying Student Data Really Raise Test Scores?"
