
Assessment

Questions Dog Common-Test Development

By Catherine Gewertz — July 31, 2012 9 min read

On the verge of signing a contract to help design assessments for the common standards, ACT Inc. has withdrawn from the project amid conflict-of-interest questions sparked by its own development of a similar suite of tests.

Even though it involves only a small subcontract, the move by the Iowa-based test-maker, and the questions from the state assessment consortium that propelled it, have set off ripples of reaction and reflection in the insular educational testing industry. That industry is reshaping itself in response to the unprecedented project by two big groups of states to create new tests for the common standards, using $360 million in federal Race to the Top money.

The discussions offer a glimpse into some of the thorny issues that crop up as the two gargantuan assessment projects move forward. How does each group manage intellectual-property concerns and potentially competing interests when 20-plus states and hundreds of players are involved? Even as those questions elude easy answers, the stakes are bigger than ever.

“This work has really changed the game,” said Douglas J. McRae, who spent 40 years in the testing industry before retiring, including overseeing K-12 test development at the McGraw-Hill Cos. in the 1990s. “In the past, when vendors have done [test] work for individual states, those products haven’t generally been marketable to other states. Now there is a bigger market, with much more money hanging on it.”

Success and Failure

Contractors and Subcontractors

The two prime contractors, ETS and Pearson, won contracts totaling $23 million to design the first 18,000 items in the consortium’s test bank.

SUBCONTRACTOR TEAMS

ETS:
• Measured Progress
• CTB/McGraw-Hill
• College Board
• Carnegie Mellon University
• MetaMetrics
• Clark Aldrich Designs

Pearson:
• ACT (Pearson chose ACT to review test items.)
• CAE
• Knowbility
• SRI

SOURCE: Partnership for Assessment of Readiness for College and Careers (PARCC)

The contract from which ACT withdrew earlier this month was the biggest that the Partnership for Assessment of Readiness for College and Careers, or PARCC, has awarded so far in designing tests for its 24 member states. ETS and Pearson won contracts totaling $23 million to design the first 18,000 items in the consortium’s test bank.

Those two prime contractors brought 10 subcontractors aboard to do pieces of the work; Pearson chose ACT to review test items, a subcontract valued at $113,000.

As the contracts were being finalized in late June, PARCC officials heard rumors that one or more of the vendors planned to announce a suite of computer-based formative and summative tests spanning grades 3-11 and designed to measure college and career readiness—similar to what PARCC planned.

The nine state schools chiefs on PARCC’s executive committee sent a letter July 2 to all the vendors, saying that the success of such a suite of tests “depends on or benefits from the failure” of PARCC and the other testing group, the Smarter Balanced Assessment Consortium, so it creates “an inevitable conflict between the firm’s dedication to PARCC” and to its own project.

The chiefs asked each vendor to respond by detailing any plans to make such tests, describing how the two projects could be kept sufficiently separate, and assuring them that the vendor’s “top personnel” would be devoted to the PARCC work, to avoid “undermining PARCC’s objectives.”

In a July 6 response, Jon L. Erickson, the president of ACT’s education division, said its forthcoming assessment system, announced publicly only four days earlier, was a “natural evolution” of its college and career testing that had been in the planning stages for several years. It was “not created as a result of, nor designed to directly compete against” the PARCC or SBAC systems, Mr. Erickson wrote.

The ACT system uses existing company products, such as the widely used EXPLORE, PLAN, and ACT tests, as anchors and expands into earlier grades. It will include science as well as literacy and math assessments and is expected to gauge a range of student behaviors seen as pivotal to future success, such as career goals.

Because of its long history contracting with multiple states simultaneously, Mr. Erickson’s letter said, ACT has established strict procedures that prevent any conflicts of interest and protect its clients’ intellectual property. ACT’s subcontract, Mr. Erickson pointed out, involved item review, not item development. But “given the spirit” of PARCC’s letter, he said, ACT consulted with Pearson and decided to withdraw to “avoid any perceived conflict of interest or action detrimental to PARCC or its member states.”

Pearson, in a July 4 letter, detailed the state assessment contracts that “could be considered, by some, as alternatives to, or indirectly ‘competing’ with PARCC,” such as its tests for grades 3-8 in New York and Illinois. The company also described its extensive work for other entities, including contracted and proposed work for Smarter Balanced and online test delivery, scoring, and reporting for the new ACT assessments that had prompted concerns within PARCC.

None of those projects interferes with its item-development work for PARCC, Douglas G. Kubach, Pearson’s president and chief executive officer, said in the letter, since the publishing giant is accustomed to “managing multiple large-scale assessment programs and successfully protecting our customers’ confidentiality and intellectual property.” Out of an “abundance of caution,” however, and to avoid “even the appearance of a conflict of interest,” Pearson had decided to release ACT from its role as a subcontractor, Mr. Kubach wrote.

Michael Cohen, the president of Achieve, a Washington-based group that is PARCC’s project manager, said that with the exception of ACT’s work, nothing in the vendors’ letters “rose to the level of concern that shakes our confidence” that those companies would “give us their best effort.”

He said the contracts did not include clauses barring vendors from doing competing work. And given the nature of the testing business, PARCC accepted that such companies often work simultaneously as partners with—and competitors to—one another, he said. But having one of its own vendors engage in a high-profile effort to make a similar suite of tests that “seemed to depend on the failure of the consortia” elicited concern in the consortium, he said.

Wary of Competition

ACT officials don’t see it that way.

“Certainly that’s not our message, and that’s not how we would ever talk about anything we do,” Mr. Erickson said in an interview. “This was a logical extension of ACT’s history. I would hope that people wouldn’t feel threatened but would feel that innovation and free enterprise would keep the common standards alive and active.” He noted that ACT’s system is modular, so states can choose only the pieces that suit their needs.

In its letter to all the contractors and subcontractors, PARCC’s executive committee expressed concern about “critical remarks” rumored to have been made by one or more vendors about the consortium’s work, which could promote a vendor’s own tests at the expense of PARCC’s.

Responding by letter, all the vendors denied any such comments, and Education Week was unable to confirm that any were made. But one state assessment director said he attended a meeting at which ACT presenters said their new system would cost less and be ready a year earlier than PARCC’s anticipated 2014-15 completion. Another told Education Week that several colleagues had shared similar stories.

In a measure of the sensitivity of the topic, few people in state assessment divisions or private testing companies were willing to discuss the matter unless their identities were withheld, to avoid causing rifts with vendors or customers.

“ACT is marketing its product aggressively,” said one assessment chief from a PARCC state who attended an ACT meeting about its new tests. “How would you feel about giving money to someone who could turn around and use products you pay for to take states away from the consortium?”

Maintaining a robust membership will be a key concern for the consortia after 2014-15, many assessment sources said. Both consortia currently have task forces looking into future sustainability.

Because federal funding covers only test development, the two groups of states will have to find other ways of supporting themselves to procure services like new item development, test administration, and score reporting in the future. Assuming that such a model depends heavily on states’ own contributions to those services, membership size is crucial.

“The threat is that if a system like ACT’s is attractive and peels states off the consortia, the consortia get smaller and, at some point, are no longer sustainable financially,” said one senior assessment official who closely follows the work of both consortia. “That puts a question mark over all of their long-term goals.”

Others in the assessment community said that competition, while valuable for offering states additional options, also runs the risk of eliminating a chance to compare student achievement across all states. The consortia are aiming for “cross-consortium comparability,” meaning that test results from any one state could be compared with those from any other.

“Getting comparability within a consortium is going to be hard enough, and getting it across consortia will be a bear,” said Scott Marion, the associate director of the National Center for the Improvement of Educational Assessment in Dover, N.H., which works with both consortia’s technical-advisory committees. “If states start leaving the consortia, and they’re using other tests, you’re killing any hopes of comparability.”

The timing of ACT’s new product line is particularly sensitive, some said.

“ACT is making a genius marketing pitch at exactly the right time, because it hits [schools] chiefs at a time that they’re uncomfortable,” said one assessment director from a state in the Smarter Balanced consortium. “They’re waiting to see how [the assessments] will come out, and they’re worried about the cost. And along comes this company with a known product line saying they’ll build out a system around those products that will be ready sooner and cost less.”

The newness of the consortia adds to the picture, said Pat Roschewski, who recently retired as the assessment chief in Nebraska, which does not belong to either consortium. “I have no doubt the consortia will build good products,” she said. “They have good heads working on this. But the fact is, they’re not built yet. They have yet to prove themselves. It can feel easier to just go for a known quantity.”


Kentucky is one state weighing its options. It belongs to the PARCC consortium, but Commissioner Terry Holliday said the state hasn’t decided whether it will keep its current system, which uses new tests designed by Pearson in grades 3-8 and ACT’s assessments in high school; adopt some or all of PARCC’s; or use some or all of the new system ACT presented to Mr. Holliday in June. He said he views those new tests as “kind of an extension of what we’re already doing here.”

“I have confidence in PARCC,” he said. “But it’s a legitimate question for chiefs: Can they deliver, and at a cost we can afford? And via a technology we can handle? At the end of the day, somebody has to show me they’ve got a better product at the same price or less. Otherwise, we’ll just keep what we’ve got now.”

Some noted that other testing companies are already offering assessments “aligned to the common core” and predicted more will follow with comprehensive products like those being built by the two state consortia.

“If one does it, they’ll all do it,” said Mr. Marion. “How long are the other big ones going to sit on the sidelines?”

Coverage of the implementation of the Common Core State Standards and the common assessments is supported in part by a grant from the GE Foundation.
A version of this article appeared in the August 08, 2012 edition of Education Week as Questions Dog Design of Tests
