Corrected: In the earlier Web version of this story, Patrick Gibbons’ name was misspelled. A quote attributed to Mark S. Schneider was also deleted.
Reading scores for the United States on an international assessment of student skills have been invalidated because of major errors in the printing of the test, in what a top federal education official called an “embarrassment” for government officials and the private contractor responsible for administering the exam.
The results of the reading section of the Program for International Student Assessment, or PISA, were ruined when printing errors in the test booklets directed students to the wrong pages for information related to specific questions.
Mark S. Schneider, the commissioner of the National Center for Education Statistics, the arm of the U.S. Department of Education that oversees U.S. participation in the exam, said today that his agency bore some responsibility for not catching the printing problem before the tests were given to students in fall 2006.
But he also said the primary duty for making sure the tests were printed correctly fell to the contractor the Education Department hired to administer the test, RTI International, of Research Triangle Park, N.C. A spokesman for the contractor said the company accepted full responsibility for the mishap.
The contractor has agreed to return $500,000 to the department, out of an original contract for its work on the PISA valued at $4 million, Mr. Schneider said in a Nov. 19 conference call with reporters.
The PISA results are scheduled for release worldwide Dec. 4.
“RTI will apologize for this. I will apologize for this mistake, but it was an error,” Mr. Schneider told reporters. The federal agency is taking steps to make sure the problem is not repeated, “but that’s after the fact,” he acknowledged, calling the mistake “an embarrassment to all” parties involved.
Printing errors were also found in the two other subjects tested on this year’s PISA—mathematics and science—but U.S. and international officials who examined those flaws found that they did not affect the overall results in those subjects, Mr. Schneider said.
Timeline of Breakdown
Each PISA includes a “major” subject, in which students are tested in more depth, and also “minor” subjects, in which fewer questions are presented. Those designations vary with each testing cycle. Reading was a minor subject on the 2006 PISA, as was math. Science was the major subject.
NCES officials provided a timeline of events that resulted in the flawed reading test.
An electronic version of the test, which NCES officials say did not contain errors, was approved by them in July 2006. One month later, the contractor sent the final version of the reading test to the printer. The test was given in the United States from September through November of last year.
A hard copy of the final testing booklet was sent to the NCES and RTI in October 2006, as the international assessment was being administered. No one at the NCES noticed any errors on the reading-test booklet then, Mr. Schneider said—the extent to which it was reviewed is unclear—though it would probably have been too late to have given a valid test at that point, he added.
In July of this year, as the test results were being analyzed, RTI officials told federal officials they suspected a problem with the reading results. Mr. Schneider said it took a colleague of his at the education statistics agency “about 10 seconds” to recognize that the printing problem would force the invalidation of the reading exam. U.S. officials notified the Organization for Economic Cooperation and Development, the Paris-based intergovernmental group that oversees PISA, of the problem, and officials of the group said the test results for reading were marred beyond repair.
RTI officials joined NCES officials in discussing the misprinted booklets in the conference call with reporters. Patrick Gibbons, a spokesman for the contractor who attended the meeting, said in a later interview that the company bore “100 percent” responsibility for the printing errors. RTI has a number of contracts with the Education Department to administer tests. The company conducts research in a variety of fields, including health and economics, for clients in the United States and abroad.
“When a client trusts us to do research for them, our responsibility is to give them a quality product,” Mr. Gibbons said. “We’re very disappointed that this unfortunate error occurred.”
RTI is a major contractor with the Department of Education. It currently has 13 active contracts with the agency, valued at $196 million, department officials said. Nine of those contracts, worth a combined $184 million, are with the Institute of Education Sciences, the central research and statistics office of the department, of which NCES is a part. The contract for the PISA test was competitively bid, Mr. Schneider said.

Andreas Schleicher, the head of the indicators and analysis division for the OECD, said the incident marked the first time that a country’s results were excluded because students were given a seriously misprinted test. In some previous cases, the response rate for students in different countries was too low for scores to be counted.
‘A Major Loss’
PISA was first given in 2000. The most serious consequence of the loss of the U.S. reading scores is that it rules out a comparison of students’ performance in that subject over time with the results from the 2000 and 2003 testing cycles.
“It’s a major loss for the study,” Mr. Schleicher said in an interview. Still, the OECD’s main concern was that flawed test scores not be reported in any final PISA results. “We have high standards for quality assurance and control,” he said.
Federal officials provided a sample booklet designed to show where the U.S. reading test unraveled. According to the example provided, students who opened the PISA test booklets were supposed to find a reading passage on the left page, and a series of questions on the right page. Instructions on the right side told them to answer questions related to the reading material on the “opposite page”—the adjoining one to the left.
But with the printing error, the list of questions appeared on the left side. When students were asked to refer to the “opposite page,” they would have been forced to flip back to a preceding page—not the one directly facing them. The skewed layout would not have made it impossible to find the right answer, but on a test where all students across nations are supposed to be held to the same standard, it could have altered the results, NCES officials said.
Other Tests Within Norm
Researchers around the world use the highly regarded PISA to compare the skills of students across nations over time. Tests are administered by the various individual countries and other jurisdictions that participate. In 2006, when the most recent tests were given, 57 nations and other jurisdictions took part.
The results of a separate, country-by-country comparison of students’ reading skills, the Progress in International Reading Literacy Study, or PIRLS, are scheduled to be released Nov. 28.
Federal officials noted that they have had problems with printed test booklets in the past, but Mr. Schneider, who has served as the NCES commissioner since 2005, said he did not think those flaws were as serious as the ones on the latest PISA.
The problems with the reading test had a predicted effect of up to 6 score points, which was beyond the accepted norm, OECD officials said.
By contrast, the predicted effect of the printing mistakes on the math and science tests was only about 1 score point, which meant those results were still valid, they said.