
Standards & Accountability

State Consortium Scales Back Common-Test Design

By Catherine Gewertz — July 07, 2011

A student-achievement test under consideration by nearly half the states has been redesigned to ease their concerns that it would cost too much, shape curriculum, and eat up too much instructional time.

The change was announced last week by the Partnership for Assessment of Readiness for College and Careers, or PARCC, one of two state consortia using federal Race to the Top funds to craft shared assessments. The tests are for the common standards in mathematics and English/language arts that most states have adopted.

Currently, 24 states and the District of Columbia belong to PARCC. Thirty belong to the other group, the SMARTER Balanced Assessment Consortium. More than half the states in each group have pledged to use the tests, while others—including a half-dozen that belong to both groups—are still weighing their options.

PARCC’s original proposal featured a “through-course” design, in which tests would be given after teachers completed one-quarter, one-half, three-quarters, and 90 percent of instruction. Some of those tests were to be in the form of essays and performance tasks, and others were to be quick-turnaround, computer-based exams. All four required components were to be combined into one end-of-year summative score, which states would use for accountability required by the No Child Left Behind Act.

A fifth element, a test of students’ speaking and listening skills, was to be given after three-quarters of instruction but not included in the summative score.

At a recent meeting, however, the 15 states that make up PARCC’s governing board reduced the number of components in the summative score to two in each subject—one computer-based test and one exam of essays and performance tasks—and placed them close to the end of the school year.

Additional flexibility was added to the speaking-and-listening test, so states can give it when they choose. The first two components were made optional and re-envisioned as a way for states to produce feedback for teachers to help guide instruction.

Concerns Weighed

Mitchell Chester, the chairman of PARCC’s governing board and the commissioner of elementary and secondary education in Massachusetts, said the changes came in response to feedback from states that giving five tests each year would be too costly and consume too much classroom time.

They also were prompted, he said, by concerns raised by states, school districts, and various national policy advocates that the quarterly tests would essentially dictate the content and pacing of curriculum. That worry has been sparking intense debate in policy circles. Some have argued that curriculum would be unduly influenced by the federal government because it is funding the work of the assessment consortia, which includes not only tests but a range of instructional resources. (“Common-Assessment Consortia Expand Plans,” Feb. 23, 2011.)

“We want to make sure that variations in states’ curricula are honored through this process and not dictated by the structure of the tests,” Mr. Chester said in a phone interview. “We also want to make sure there is flexibility in the growing movement toward personalization of learning in curriculum and instruction. We didn’t want to design a system that would hamstring our educators.”

The changes in test design are not final until they are approved by the U.S. Department of Education. In seeking proposals last year, the department outlined the many uses it wanted the tests to serve, including measuring student achievement and learning gains, and the effectiveness of teachers, principals, and schools. It also wanted tests to produce useful feedback for teachers to help them shape instruction.

Mr. Chester said that most states in PARCC “are committed” to using the two optional components for formative or instructional purposes. But it seemed likely that money would be a major factor in that decision.

“Cost would probably dictate some of whether we would participate in the first two options,” said Gloria Turner, the director of assessment and accountability in Alabama, which belongs to both consortia.

She said she was pleased to see that PARCC had listened to state and district concerns about the test, and had responded with a change in design. The assessment’s potential effect on the scope and sequence of curriculum, in particular, was “a main concern” in Alabama districts and in the state education department, she said.

The change in design pointed up a potential either-or choice for states, some experts said. States can address concerns about cost and excessive testing by not using the two now-optional components, but by doing so, they would forgo the instructional feedback that is one of the key improvements sought in these “next generation” testing systems.

Design Tensions

Douglas J. McRae, a retired psychometrician who is based in California and helped design that state’s assessments, welcomed the design change as an overdue separation of the test’s dual uses: as a formative tool, to gauge how instruction is going; and a summative one, to measure learning when instruction is complete.

PARCC’s first design, he said, “violated an underlying design-feature tension” by blending formative and summative functions into one test that would be used for accountability. One test can’t be used effectively for both, Mr. McRae said.

“Frankly, I think the design feature required by the [federal education department] for both assessment consortia ... was flawed by attempting to put both types of assessment under one roof,” he said in an email. “The two types of assessment are both needed but belong under separate roofs.”

Choosing to use the optional components could create problems by giving some states an edge over others in the summative score, said Tom Loveless, who follows assessment issues as a senior fellow at the Brookings Institution, a Washington think tank.

“Components one and two could end up serving as practice tests for [components] three and four and influencing test results,” he said. “It’s essentially a sneak peek, and it calls comparability into question.”

Mr. Chester said PARCC test designers are mulling whether some states could elect to use the second optional component—a performance-based test—as a third piece to be rolled into the summative score while others use only the two required components. The question, he said, is whether that could be allowed without sacrificing comparability of test results across all consortium states.

Some leaders in the assessment world said it was inevitable that PARCC would have to change its through-course design.

“Everybody predicted this from the beginning,” said one source, who asked not to be identified because of his employer’s working relationships with both assessment consortia. “It was only a matter of time until people figured out that it would create curriculum issues and would cost a lot.”

But the changes are good ones, the source said, because they offer states the chance to funnel more attention to high-quality performance tasks that deserve to shape instruction, and they enhance the chances of gaining support for the test by minimizing the “political distraction” of arguments about cost and curriculum control.

Joe Willhoft, the executive director of the SMARTER Balanced consortium, said that group has no plans right now to change its original test design, having received “favorable responses” on it from member states, technical experts, and other stakeholders.

A version of this article appeared in the July 13, 2011 edition of Education Week
