Texas officials are crowing over improved passing rates on state tests for the eighth consecutive year, but at least one critic of the test is asking just how much student performance has improved.
Commissioner of Education Jim Nelson last month touted the latest results, which showed 82 percent of students in grades 3-8 passed the Texas Assessment of Academic Skills in reading, writing, and mathematics—up a bit from 80 percent last year.
“We’re testing more students, and the passing rate is rising,” Mr. Nelson said. “That’s a clear sign that performance is improving in our schools.”
But students had to answer fewer questions correctly on this year’s math test to pass than in previous years. Seventh graders, for example, needed to answer 30 of 58 questions correctly to pass this year, compared with the 33 correct answers required last year. On the 3rd grade test, the difference was seven questions: 24 of 44 correct answers met the passing standard, compared with 31 last year.
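Expressed as percentages, those cutoffs are easy to check. The short calculation below works only from the figures cited above; treating last year’s cutoffs as shares of the same question totals is an assumption, since the article does not give last year’s test lengths.

```python
# Back-of-the-envelope check of the passing standards cited in the article.
# Assumption (not confirmed in the article): last year's tests had the same
# number of questions as this year's, so last year's cutoffs can be read as
# percentages of the same totals.

cutoffs = {
    "Grade 7 math": {"questions": 58, "this_year": 30, "last_year": 33},
    "Grade 3 math": {"questions": 44, "this_year": 24, "last_year": 31},
}

for test, c in cutoffs.items():
    this_pct = 100 * c["this_year"] / c["questions"]
    last_pct = 100 * c["last_year"] / c["questions"]
    print(f"{test}: {this_pct:.0f}% correct to pass this year "
          f"vs. roughly {last_pct:.0f}% last year (assumed same length)")

# Output:
# Grade 7 math: 52% correct to pass this year vs. roughly 57% last year (assumed same length)
# Grade 3 math: 55% correct to pass this year vs. roughly 70% last year (assumed same length)
```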
Small Changes Expected
The practice of “equating,” as it is called, is commonly used to account for variations in the difficulty of standardized tests that tend to occur from administration to administration.
“It is perfectly legitimate and essential to try to adjust scoring on subsequent forms of the test to account for accidental differences [in difficulty],” said Daniel M. Koretz, a senior social scientist with the Santa Monica, Calif.-based RAND Corp. Equating is “unavoidable because under high-stakes conditions, if you use the same test [from year to year], students’ scores will skyrocket.”
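Equating can take several statistical forms, and the article does not say which method Texas uses. As a purely illustrative sketch, one of the simplest textbook approaches, linear equating, places a score from a new test form onto the old form’s scale by matching means and standard deviations; the numbers below are hypothetical.

```python
def linear_equate(score_new, mean_new, sd_new, mean_old, sd_old):
    """Illustrative linear equating: express a score earned on the new test
    form on the scale of the old form by matching means and standard
    deviations. (A sketch of one common method, not the TAAS procedure.)"""
    z = (score_new - mean_new) / sd_new   # standardize on the new form
    return mean_old + z * sd_old          # re-express on the old form's scale

# Hypothetical numbers: if the new form is harder (lower average raw score),
# a raw score of 30 on it maps to a higher score on the old form's scale.
print(linear_equate(30, mean_new=34.0, sd_new=9.0, mean_old=38.0, sd_old=9.0))
# -> 34.0
```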
But Mr. Koretz and other testing experts say that equating should generally lead to only small changes in scoring.
Over the past two years, however, Texas officials have made larger adjustments to the scoring standards than equating typically produces, because the substance of the tests changed, according to Debbie Graves Ratcliffe, a spokeswoman for the Texas Education Agency. Beginning last year, the tests were aligned with the Texas Essential Knowledge and Skills, the state’s academic standards, and the subject matter on which the tests are now based is considered more rigorous.
“Students are now required to use more higher-order thinking skills on the test, and there are more multistep math problems,” Ms. Ratcliffe said. “It simply resulted in a harder test.”
Critics of the test, however, say that claims of improved student achievement may be suspect, considering the significantly lower passing scores.
“Throughout the 1990s, the passing score had been set at about 70 percent correct, and varied only slightly from that,” said Walter M. Haney, an education professor at Boston College who has studied Texas education. The passing score began to drop in the fall of 1999, he said, and by the fall of 2000 it had been lowered to about 50 percent correct on some of the tests.
Mr. Haney, one of the most outspoken critics of the reported gains that students have made on the Texas exam over the years, calls the progress “the myth of the Texas miracle.”
Scrap Trend Data?
Other observers are not so harsh in their assessment of the practice in Texas, but they question whether the state should keep trying to maintain trend data stretching back over the past decade if the test has changed so significantly.
“Equating is good for making small adjustments [in test scores], but it doesn’t work as well if you have to make really big ones,” said Robert L. Linn, a co-director of the National Center for Research on Evaluation, Standards, and Student Testing, based at the University of California, Los Angeles. “The goal in designing tests from year to year is to have items as similar as you can design them to be. At some point, you want to say this is a new and different test, and it is harder than the old test.”
The state board of education in Texas is working to do just that. The new testing program is set to be phased in beginning in 2003. Passing standards will not be equated to the old versions, Ms. Ratcliffe said.
“The new tests will be similar in that they will be aligned to the new standards and will be criterion-referenced tests,” she said. Such assessments measure student achievement against a body of knowledge. “But it will be a whole new generation of testing,” Ms. Ratcliffe said. “We are really in a transition period right now, and there have been some unusual blips because of it.”