If you’re like me, the election, COVID spikes, and the rest mean that October’s NAEP scores haven’t exactly been top of mind. But, with the election behind us, I’m inclined to say a few words about the 2019 12th grade numbers, which showed math performance flat and reading declining noticeably since 2015.
Overall, just 37 percent of 12th graders were “proficient” in reading and 24 percent in math. And keep in mind that these aren’t COVID-impacted scores. The tests were administered in fall 2019, six months before COVID reared its ugly head. As Haley Barbour, chair of the National Assessment Governing Board, put it, “These results demonstrate that far too many of our nation’s high school seniors do not have sufficient math and reading skills for postsecondary endeavors.”
So, not great. At the same time, let’s keep in mind that NAEP results are a snapshot. They’re useful for tracking big-picture student achievement but need to be handled with care. Unfortunately, too many appear disposed to disregard such cautions, treating NAEP less as an essential reality check than as a festival of agendas and dubious narratives. That’s been especially true during this election season.
As is true with each new NAEP release, the teachers’ unions and self-described “public school advocates” have a universal excuse for any crummy results—the “disinvestment” in public education. Never mind what the actual spending figures show, or that after-inflation per-pupil spending has increased steadily over the past two decades. In their eyes, the lesson of NAEP is always more spending.
Meanwhile, there’s a segment of school choice advocates who eagerly greet any lousy new NAEP numbers by shouting, “Ah-ha, schools are failing!” For them, the lesson of NAEP is simple: more school choice. Of course, given that NAEP proficiency was purposely set at an aspirational level from its inception, aggregate proficiency results should be taken with more than a few grains of salt.
And then there are efforts to weave complex narratives to explain the results. For instance, some holdout Common Core enthusiasts have gone to great lengths to insist that we not blame the long-standing stagnation in NAEP on Common Core. Of course, this involves devising convoluted alternative explanations, such as attributing the results to the aftermath of the 2008 recession. This argument was trotted out once again this year, with adherents implying that we should trace changes in reading and math scores to decade-old economic circumstances rather than to a massive push to change reading and math instruction.
The truth is that there’s no clear or defensible way to determine what’s responsible for NAEP results, and all of the predictable spinning should be treated accordingly. It may be wishful thinking, but we’d all be better off if analysts restrained themselves from offering convenient, one-shot explanations for NAEP changes.
It’s equally important that we safeguard NAEP from its purported friends. NAEP is our one reliable tool for measuring academic progress over time. On that score, recent efforts to overhaul NAEP’s reading framework are deeply troubling. In a major push to “update” the assessment, the National Assessment Governing Board has developed a massive draft framework that aims, at enormous cost, to transform reading assessment from a straightforward snapshot of reading performance into a complicated, amorphous gauge of 21st-century “literacy” as understood by the education school set. Specifically, the framework calls for the inclusion of multimedia texts, such as video clips, alongside (or instead of) textual passages. The result would compromise our ability to know how well students can actually read, introduce an array of potential distortions, and severely reduce our ability to compare future results with past performance. As Checker Finn, the National Assessment Governing Board’s very first chair, has warned, “One of the framework developers’ key impulses is a truly worrying overreach for NAEP.”
NAEP can play a useful role in providing a respected baseline for assessing whether states or the nation are making academic progress, grounding our sense of where we are and what we’ve done. But NAEP plays that role best when we recognize its limitations.