
Test Experts Wary on 'Race to Top' Rules

By Lesli A. Maxwell | October 08, 2009

While the U.S. Department of Education finalizes its rules for doling out $4 billion to states in the Race to the Top competition, a group of prominent testing experts is cautioning federal education officials on how they propose to use assessments to measure student achievement and teacher-quality improvements under the initiative.

The Board on Testing and Assessment, a part of the National Research Council, wrote in a letter to U.S. Secretary of Education Arne Duncan that he and the department should "pursue vigorously the use of multiple indicators of what students know and can do" in the Race to the Top competition, part of the $787 billion American Recovery and Reinvestment Act approved by Congress earlier this year.

The 13-member Board on Testing and Assessment, which is part of the National Academies, said it could not meet the Education Department's Aug. 28 deadline to comment on the department's proposed regulations for Race to the Top because of the academies' requirement that any public document must be reviewed by an independent group of experts before it can be released.

Nonetheless, the board proceeded with its review of the draft rules because "it was too important an opportunity" to weigh in on the massive federal investment in public schools, said Edward H. Haertel, the chairman of the testing panel.

In an e-mail to Education Week, Education Department spokesman Justin Hamilton wrote that the department welcomes "vigorous evaluations" of the Race to the Top program. "The best impact that [the program] can have is if we create a road map for reforms into the future that have solid research behind them."

Evaluation Concerns

Race to the Top is one of two high-profile discretionary grant programs that are part of up to $100 billion in education aid under the economic-stimulus program.

Last week, the department proposed ground rules for the other program, the $650 million Investing in Innovation, or i3, fund. ("Proposal Sets Out 'i3' Rules," this issue.)

In its letter, the testing experts warned against using a single test, such as the National Assessment of Educational Progress, to measure growth in student achievement, and also suggested that the department's plans to use student growth data to evaluate teachers could be premature.

In its draft regulations, the department proposed using NAEP to "monitor" overall increases in student achievement, as well as progress in closing achievement gaps, saying that the test "provides a way to report consistently across Race to the Top grantees as well as within a state over time."

But Mr. Haertel said in an interview that while NAEP would be one good way to monitor the various strategies funded by the Race to the Top program, the national exam should not be used as a way to do objective evaluations of those same initiatives.

"Part of this is the respect that we have for the value of the NAEP as a low-stakes auditing tool," said Mr. Haertel, an education professor at Stanford University. "We don't need one more high-stakes test to drive curriculum and instruction nearly as badly as we need the long-term trend lines that we get with the NAEP."

If high-stakes decisions were to be attached to NAEP results, he warned, it could become "just another test that people would start teaching to."

In its letter to Secretary Duncan, the board also pointed out that only students in grades 4, 8, and 12 take NAEP, and do so every other year. The test, the letter says, is not aligned with any states' academic content standards or curricula, and consequently would not "fully reflect improvement taking place at the state level."

In addition, Mr. Haertel and his colleagues raised numerous concerns about the department's plans to use individual students' progress over the course of each academic year, the so-called value-added model, as a way to evaluate the effectiveness of teachers. While they expressed support for linking teachers and their students' test scores for the purposes of research, they said it would be "premature" to use the value-added approach in decisions on actions such as firing teachers or rewarding them. Too little is known about the accuracy of such methods, the board members said. The board also pointed out practical difficulties in using data to judge teachers.

The board also weighed in on the department's proposed requirement that school districts use data to improve instruction on a constant basis. It cautioned that multiple-choice assessments that can be graded rapidly are not the best tools for figuring out how to tweak what teachers do in the classroom.

Mr. Haertel said that using tests that can be graded within 72 hours "bumps up against the concern that many of us have about using assessments that really measure the full range of knowledge and skills that we want children to acquire.

"What we really need are forms of assessment that require children to construct their own answers and not just select answers from prefabricated choices," Mr. Haertel said. "But those can't be graded in a fast fashion. This is a case of where the hopes and dreams of policymakers are getting ahead of realities."


A version of this article appeared in the October 14, 2009 edition of Education Week as "Testing Experts Cautious on 'Race to Top' Rules."
