

Policymakers Weigh Gathering More Data for NAEP

March 13, 2012

As many experts raise questions about the future of “the nation’s report card,” the governing board for the assessment program is exploring changes aimed at leveraging the achievement data to better inform education policy and practice.

The core idea, outlined in a report to the board, is to expand and make far greater use of the background information collected when the National Assessment of Educational Progress is given. In doing so, the report suggests, NAEP could identify factors that may differentiate high-performing states and urban districts from low performers.

The effort, it says, would parallel the extensive reporting of background variables in global assessment systems, such as the Program for International Student Assessment, or PISA.

The report was released just weeks after the Obama administration proposed a fiscal 2013 budget that would cut the NAEP budget by $6 million, while funding a pilot program of state participation in PISA.

“Currently, the NAEP background questions are a potentially important but largely underused national resource,” says the report by a six-member expert panel commissioned by the National Assessment Governing Board, or NAGB, which sets policy for the testing program. “These data could provide rich insights into a wide range of important issues about the nature and quality of American primary and secondary education and the context for understanding achievement and its improvement.”

In addition, the report says NAEP background questions could help track policy trends, such as implementation of the Common Core State Standards or new teacher-evaluation systems.

The report, presented this month to NAGB at a meeting in New Orleans, was apparently well-received by many board members, including the chairman, former Massachusetts Commissioner of Education David P. Driscoll. But some of the ideas are generating pushback from current and former federal officials.

“NAGB has a tool that they want to use for everything,” said Mark S. Schneider, a former commissioner of the National Center for Education Statistics, the arm of the U.S. Department of Education that administers the test. He argues that NAEP should stick to its core strengths, namely measuring student achievement and serving as a benchmark for state assessments.

“I find this just a distraction,” Mr. Schneider said of the proposed plan.

Causation vs. Correlation

Although the report emphasizes that correlations, such as between math achievement and rates of absenteeism, must not be mistaken for causation, Mr. Schneider argues that such distinctions would be lost on the public and risk damaging NAEP’s reputation.

“They will make statements that will inevitably push the boundaries, and you will end up with questionable reports, in my opinion,” said Mr. Schneider, who is now a vice president of the Washington-based American Institutes for Research.

Other concerns raised about the proposals include the cost involved, especially given the president’s proposed cut to NAEP, and what some experts describe as likely resistance, on privacy grounds, to the federal government collecting and reporting more information on students.

The new report, commissioned by NAGB, notes that complementing the NAEP tests is a “rich collection” of background questions regularly asked of students, teachers, and schools. But the collection and the public reporting of such information have been significantly scaled back over the past decade, the report says.

“NAEP should restore and improve upon its earlier practice of making much greater use of background data,” the report says, “but do so in a more sound and research-supported way.”

It offers recommendations in four areas related to the background questions: asking “important questions,” improving the accuracy of measures, strengthening sampling efficiency, and reinstituting what it calls “meaningful analysis and reporting.”

It’s the fourth area, analysis and reporting, that is proving especially controversial.

Marshall S. “Mike” Smith, a co-author of the report and a former U.S. undersecretary of education in the Clinton administration, notes that the report comes at a time when NAEP’s long-term relevance is at issue. He cites the work to develop common assessments across states in English/language arts and mathematics, as well as the growing prominence of international exams, like PISA.

“The future of NAEP is somewhat in doubt,” Mr. Smith said.

PISA’s use of extensive background questions, he said, has enabled it to have wide influence.

“They’ve built narratives around the assessments: Why are there differences among countries” in achievement, he said. “We can’t do that with NAEP. We’re not able to construct plausible scenarios or narratives about why there are different achievement levels among states. And we’ve seen that can be a powerful mechanism for motivating reform.”

Mr. Driscoll, the chairman of NAGB, said the next step is for board staff members to draft recommendations on how the proposed changes could be implemented.

“I have challenged the board to think about how NAEP and NAGB can make a difference and have an impact,” he said. “There is some very valuable information that we can lay out ... that would be instructive for all of us.”

The report makes clear that NAEP should not be used to assert causes for variation in student achievement, but that a series of “descriptive findings” could be illustrative and help “generate hypotheses” for further study. For example, it might highlight differences in access to 8th grade algebra courses or to a teacher who majored in math.

“A valid concern over causal interpretations has led to a serious and unjustified overreaction,” the report says.

But some observers see reason for concern.

“It’s a mistake to present results that are purely descriptive,” said Grover J. “Russ” Whitehurst, a senior fellow at the Brookings Institution in Washington who was the director of the federal Institute of Education Sciences under President George W. Bush. “It is misleading, and it doesn’t make any difference if you have a footnote saying these results should not be considered causally.”

Jack Buckley, the current NCES commissioner, expressed reservations about some of the suggestions, especially in the analysis and reporting of the background data.

“The panel is looking toward PISA as an exemplar,” he said. “Folks at [the Organization for Economic Cooperation and Development, which administers PISA] write these papers and get a broad audience, but it’s not always clear that the data can support the conclusions they reach about what works.”

Mr. Buckley said he understands NAGB’s desire to be “policy-relevant,” but he cautioned that “we have to carefully determine what is the best data source for measuring different things.”

Mr. Driscoll said he’s keenly aware of not going too far with how the background data are used.

“I agree ... that we have to be careful about the causal effects,” he said. “I think we’ve gone too far in one direction to de-emphasize the background questions, and the danger is to go too far in the other direction.”

A version of this article appeared in the March 14, 2012 edition of Education Week as NAEP Board Considering Gathering Additional Data
