School & District Management

‘Mixed Methods’ Research Examined

By Debra Viadero — January 25, 2005

If Northwestern University researcher Greg J. Duncan and his colleagues hadn’t used more than one research method to study an anti-poverty experiment in Milwaukee, they would never have known about the gangs and the sneakers.

That’s because the study’s quantitative results couldn’t explain a puzzling disparity that turned up between the boys and the girls they studied. The boys whose families took part in the program did better in school than boys whose families didn’t. Girls fared about the same whether or not their families were enrolled in the experimental program, which provided a temporary safety net of wage supplements, insurance benefits, and child-care subsidies for working-poor families.

An explanation became clear when the field researchers who had been interviewing the families weighed in. The mothers had told them that local street gangs were trying to lure their sons by offering expensive sneakers and other items. To pre-empt those temptations, the mothers had used their additional income to buy those extras for their sons, but not for their daughters.

Mr. Duncan’s experience makes a good case for a push going on at the national level to shine a spotlight on “mixed methods” research in education—in other words, studies that blend different research strategies. Although various methodologies have always been part of researchers’ toolkits, much of the renewed attention to that strategy is a reaction to the U.S. Department of Education’s current emphasis on using randomized field trials to answer questions about what works in schools and classrooms.

Some experts say such studies, in which subjects are randomly assigned to either experimental or control groups, are the best way to answer questions about what works best. Many education researchers complain, however, that the Bush administration’s focus leaves out other valuable forms of research.

Yet while it seems commonsensical that combining different research strategies could yield more complete answers, researchers also caution that the issue is not clear-cut. With limited funding for education studies and a small supply of researchers trained to use a variety of approaches, mixed methods aren’t always practical, according to these researchers.

Plus, within the field, philosophical divisions have for years separated researchers who favor qualitative or more descriptive kinds of studies from those who specialize in the quantitative, number-crunching variety.

“To say that mixed methods are always better would be naive,” said Thomas D. Cook, a professor of sociology, psychology, education, and social policy at Northwestern University. “It implies that we haven’t learned enormously from classical, single-method studies.”

Subject of Debate

The topic of mixed-methods research was the focus of a Dec. 14 conference that was sponsored by four Washington-area national organizations that follow educational research: the National Research Council, an arm of the National Academies, a group chartered by the Congress to advise the federal government on scientific matters; the American Educational Research Association; the American Psychological Association; and the National Science Foundation, an independent federal agency. Sponsors had to draw up a waiting list for the meeting, which attracted a spillover crowd of 238 participants.

“There is unprecedented interest now in the methodological quality of studies in education,” said Martin E. Orland, the director of the National Research Council’s center for education, which hosted the event.

He said that interest has grown since the Bush administration began its push for “scientifically based research” in education following passage of the No Child Left Behind Act in 2001.

Federal definitions of such studies favor randomized experiments over other study methods.

The problem, some education researchers contend, is that while randomized studies can determine whether an intervention works, they cannot answer key questions about why it works, whether it works better where it is well implemented, or whether it produces unexpected side effects.

In comparison, mixed-methods research offers the potential for a deeper understanding of some of the education research questions that policymakers need answered, some scholars said.

“We would not have come close to understanding the experimental impact had we been forced to rely on quantitative data alone,” Mr. Duncan said of his study of the New Hope anti-poverty program. Researchers in that project surveyed 1,300 participants in the experiment, randomly assigning half to the program. They also studied a subset of 43 families in depth, interviewing them every six weeks for two years.
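
The design Mr. Duncan describes pairs a randomized experiment with a small qualitative sample nested inside it. The sketch below is purely illustrative of that kind of setup, not the New Hope study’s actual procedure: the figures (roughly 1,300 survey participants, half assigned to the program, 43 families followed in depth) come from the article, while the function name and every other detail are hypothetical.

```python
# Illustrative sketch only: random assignment of survey participants to a
# program or control group, plus a small nested subsample selected for
# repeated in-depth interviews. Not the New Hope study's actual procedure.
import random

def assign_and_subsample(participant_ids, qualitative_n=43, seed=0):
    """Randomly split participants into program and control halves,
    then draw a small nested subset for in-depth interviews."""
    rng = random.Random(seed)
    ids = list(participant_ids)
    rng.shuffle(ids)
    half = len(ids) // 2
    program_group = ids[:half]      # offered wage supplements, benefits, subsidies
    control_group = ids[half:]      # no program services
    interview_subset = rng.sample(ids, qualitative_n)  # followed every six weeks
    return program_group, control_group, interview_subset

if __name__ == "__main__":
    program, control, interviews = assign_and_subsample(range(1300))
    print(len(program), len(control), len(interviews))  # 650 650 43
```

The point of nesting the interview subset inside the randomized sample, as in the study, is that the qualitative findings can then be tied directly back to the families whose outcomes the experiment measured.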

Breaking Down Barriers

Besides producing better research, mixed methods might also help heal professional rifts between qualitatively oriented researchers and quantitative-study proponents, said Rena F. Subotnik, the director of the American Psychological Association’s center for psychology in the schools.

“We have members whom we respect who are on the extreme edges of this qualitative-quantitative debate,” she said. “This is taking a more constructive approach.”

Still, the payoff may not always be worth the expense involved in adding descriptive research to experiments, Mr. Cook of Northwestern University said. Known primarily as a quantitative expert, he collaborated with qualitative researchers to study a school improvement program in Chicago. In the end, he said, both the quantitative and qualitative researchers reached nearly identical conclusions on which schools were implementing the program well and which were not.

“We learned in our particular case that we need not have done that,” Mr. Cook said. “However, we could not have known that before we began the study.” The problem, he added, is that the study of mixed-methods practice is still too underdeveloped to guide researchers on which methods to use for what questions.

One qualitative researcher, Joseph A. Maxwell, an associate education professor at George Mason University in Fairfax, Va., pointed out that philosophical barriers between researchers can also get in the way of collaboration.

“Quantitative researchers think about the world in different ways than qualitative researchers,” he said. He worries that the statistically oriented researchers, rather than viewing their qualitative-study colleagues as equal partners, might just use the quotes and anecdotes the field researchers collect to buttress the numbers.

Training Novice Researchers

Such schisms might be overcome, researchers said, if more education schools exposed budding researchers to a variety of approaches. The American Psychological Association, the American Educational Research Association, and the federal Education Department’s Institute of Education Sciences have all begun pilot programs in the past five years that aim to help education researchers in training expand their methodological and disciplinary expertise.

“The challenge is one of having a breadth of understanding and yet to bring one’s own expertise to the table,” said Felice J. Levine, the executive director of the Washington-based AERA. “We do not have to make a jack-of-all-trades out of everyone.”

A version of this article appeared in the January 26, 2005 edition of Education Week as ‘Mixed Methods’ Research Examined
