
NAEP Board Sets Rules For Background Questions

By Debra Viadero — September 03, 2003

The board that sets policy for the National Assessment of Educational Progress has approved new guidelines intended to reduce the number of nonacademic questions that students and educators answer as part of the federally sponsored exams, and to focus those questions more sharply.

Besides testing what students know in specific subjects, such as reading and mathematics, the NAEP tests have long included some “noncognitive” or background questions on everything from the amount of time students spend on homework to the kinds of reading materials they use in class.

Members of the National Assessment Governing Board, the independent body that oversees the testing program, were concerned that the additional questions made the exams too cumbersome and yielded results that could be misinterpreted.

Grover J. “Russ” Whitehurst, the director of the Department of Education’s Institute of Education Sciences, shared some of those concerns. He opted to drop analyses of the background-question data from recent printed copies of NAEP reports. (“‘Report Card’ Lacking Usual Background Data,” July 9, 2003.)

By law, the exams are required to gather information on race, ethnicity, gender, socioeconomic status, disability, and English-language proficiency for the students participating in the assessment. Those questions will continue to be part of future exams, according to the final guidelines approved by the governing board at its Aug. 1 meeting here.

The new guidelines allow for a trimmer set of questions on socioeconomic status, however, with some appearing in every assessment and some popping up periodically or accompanying only limited samples of tests.

Likewise, the exams could also include questions on “contextual variables,” such as student mobility, school safety issues, or discipline, but only if those topics have been shown through other research to have an impact on academic achievement.

The same holds true for subject-specific background questions. Queries in that category might probe relevant course content, teacher preparation, or other factors related to student achievement.

Experts’ Advice

In all, the framework approved last month says, background questions should not take up more than 10 minutes of students’ time, 20 minutes for teachers, and 30 minutes for school administrators. To keep within those limits, the guidelines encourage the Education Department to try to get some of the same information from other sources, such as school transcripts and other federal surveys.

In addition, since 4th grade test-takers can’t be counted on to provide reliable information on their parents’ income levels, the framework encourages federal test designers to develop a broader index of proxy variables that might gauge a family’s socioeconomic status more accurately than the measures now in use.

“We think if we do a good job of reforming the NAEP, it will continue to be an important source of data for the research community,” said John H. Stevens, the executive director of the Texas Business and Education Coalition and the NAGB member who spearheaded the revision of the background questions.

While generally supportive of the board’s new direction on background questions, some researchers and national education groups expressed disappointment that the board had dropped earlier plans to set up an advisory board to help select appropriate questions.

“I don’t know where the capacity is to do the good work that the board wants to do,” Gerald E. Sroufe, the director of government relations for the Washington-based American Educational Research Association, told a board committee last month. “You need people really steeped in theory and research to suggest where the field is.”

And, while the guidelines say that analyses of the background data should be included in federal NAEP reports, they don’t require it.

When the data are presented, the framework says, the Education Department should refrain from suggesting any cause-and-effect relationships between the background factors and variations in student achievement. Because such findings are drawn from cross-sectional samples of students, the most they can do is suggest possible links for others to probe further, the guidelines note.
