The Department of Education is circulating the draft guidelines it hopes to use for evaluating the studies that go into its What Works Clearinghouse.
When it’s up and running next year, the clearinghouse is intended to provide an online “one-stop shop” where policymakers and educators can find scientific evidence of what really works in education.
The draft standards circulating this month are aimed at helping reviewers decide which studies should be included in the new research syntheses and how much weight to give them. The guidelines use hierarchies of questions that reviewers have to answer as they pore over each study.
At one end of that spectrum are the basic questions that policymakers want answered. At the other end are the methodological questions that interest researchers more.
“One of our biggest and most exciting challenges was developing a system that could satisfy a diverse set of users,” said Harris M. Cooper, the University of Missouri-Columbia professor of psychological sciences who drafted the standards with colleague Jeffrey Valentine, a research assistant professor at the university.
Beginning on Nov. 11, the proposed criteria were posted on the clearinghouse’s Web site at www.w-w-c.org. The department is soliciting comments on them through Dec. 3. Clearinghouse developers also planned to hold a public forum here last Friday to discuss them.
‘Gold Standard’
While it’s too soon to know what educators and other researchers think of the new standards, a few experts who have seen them said they rely too heavily on particular research methodologies, such as experiments in which children are randomly assigned to either an intervention group or a comparison group.
“My feeling is that providing experimental and quasi-experimental designs as sole-source evidence is saying 100 years of research methods developed in anthropology, sociology, history, and education have no importance in defining effectiveness,” said H. Jerome Freiberg, a professor of curriculum and instruction at the University of Houston.
But Mr. Cooper said that experimental methods had to play a central role because they are recognized as the “gold standard” for determining whether an intervention actually works.
“However, these standards also maintain a place for well-designed quasi-experiments, and they also clearly indicate that random assignment is only one part of an equation that makes a study trustworthy or not,” Mr. Cooper added.
The clearinghouse plans to issue its final standards for evaluating studies by January.