
Assessment

NAEP Board Sets Rules For Background Questions

By Debra Viadero — September 03, 2003 3 min read

The board that sets policy for the National Assessment of Educational Progress has approved new guidelines aimed at reducing and better focusing the nonacademic questions that students and educators are asked to answer as part of the federally sponsored exams.

Besides testing what students know in specific subjects, such as reading and mathematics, the NAEP tests have long included some “noncognitive” or background questions on everything from the amount of time students spend on homework to the kinds of reading materials they use in class.

Members of the National Assessment Governing Board, the independent body that oversees the testing program, were concerned that the additional questions made the exams too cumbersome and yielded results that could be misinterpreted.

Grover J. “Russ” Whitehurst, the director of the Department of Education’s Institute of Education Sciences, shared some of the concerns. He opted on recent printed copies of NAEP reports to drop analyses of the data yielded by the background questions. (“‘Report Card’ Lacking Usual Background Data,” July 9, 2003.)

By law, the exams are required to gather information on race, ethnicity, gender, socioeconomic status, disability, and English-language proficiency for the students participating in the assessment. Those questions will continue to be part of future exams, according to the final guidelines approved by the governing board at its Aug. 1 meeting here.

The new guidelines allow for a trimmer set of questions on socioeconomic status, however, with some appearing in every assessment and some popping up periodically or accompanying only limited samples of tests.

Likewise, the exams could also include questions on “contextual variables,” such as student mobility, school safety issues, or discipline, but only if those topics have been shown through other research to have an impact on academic achievement.

The same holds true for subject-specific background questions. Queries in that category might probe relevant course content, teacher preparation, or other factors related to student achievement.

Experts’ Advice

In all, the framework approved last month says, background questions should not take up more than 10 minutes of students’ time, 20 minutes for teachers, and 30 minutes for school administrators. To keep within those limits, the guidelines encourage the Education Department to try to get some of the same information from other sources, such as school transcripts and other federal surveys.

In addition, since 4th grade test-takers can’t be counted on to provide reliable information on their parents’ income levels, the framework encourages federal test designers to try developing a wider index of proxy variables that might give a more accurate reading of a family’s socioeconomic status than the ones that are currently used.

“We think if we do a good job of reforming the NAEP, it will continue to be an important source of data for the research community,” said John H. Stevens, the executive director of the Texas Business and Education Coalition and the NAGB member who spearheaded the revision of the background questions.

While generally supportive of the board’s new direction on background questions, some researchers and national education groups expressed disappointment that the board had dropped earlier plans to set up an advisory board to help select appropriate questions.

“I don’t know where the capacity is to do the good work that the board wants to do,” Gerald E. Sroufe, the director of government relations for the Washington-based American Educational Research Association, told a board committee last month. “You need people really steeped in theory and research to suggest where the field is.”

And, while the guidelines say that analyses of the background data should be included in federal NAEP reports, they don’t require it.

When the data are presented, the framework says, the Education Department should refrain from suggesting any cause-and-effect relationships between the background factors and any variations in student achievement. Because the data are drawn from cross-sectional samples of students, the guidelines note, the most such findings can do is suggest possible links for others to probe further.
