Students learn more from certified teachers than they do from uncertified teachers, even when the uncredentialed teachers are Teach For America recruits from some of the nation's top colleges, a Stanford University research team concludes from a study of test scores in Houston.
Findings from the study, which researchers presented here April 15 during the annual conference of the American Educational Research Association, have refueled the fierce, continuing debate in research and policy circles over programs that let new teachers into the field without traditional training.
"Our study would suggest that it does matter that you actually complete some teacher-preparation program," said Linda Darling-Hammond, the study's lead author and an education professor at Stanford. "And that seems to cut across a variety of tests and a variety of fields."
The report is available from Stanford University.
Some scholars, though, viewed the findings with skepticism and suggested that they had been released prematurely.
"I guess my bottom line on it is that it looks like it has very good data," Jane Hannaway, the director of the Education Policy Center at the Urban Institute, a Washington think tank, said of the Stanford study. "It's asking important questions. But I wouldn't have a lot of confidence in these results unless more analyses were done, and they were more clearly reported."
The Stanford team arrived at its conclusion after analyzing seven years of test-score data for 4,400 4th and 5th graders in Houston, a district that leans heavily on the Peace Corps-style Teach For America program to fill teaching positions in hard-to-staff schools.
The findings come at a time when alternative routes into the profession, such as the TFA approach, are proliferating. Policymakers, in commission reports and in legislative sessions, have asked whether requiring teachers to take the usual route to certification (generally, four years in a college- or university-based teacher education program) is keeping too many otherwise qualified prospects out of the classroom.
Evaluating Effectiveness
Since its inception in 1990, the privately organized, New York City-based Teach For America program has recruited more than 12,000 young liberal-arts graduates to teach for two years in disadvantaged rural and inner-city schools. For the upcoming 2005-06 school year, according to program materials, a record 17,000 candidates have applied for teaching assignments.
The program has become a lightning rod in the national debate on teacher certification because its recruits get minimal training, usually a five-week program that includes student teaching and intensive coursework, before they step into classrooms.
Yet two studies have suggested that Teach For America recruits may be as effective as or better than other teachers in their schools and districts. In the more recent of those reports, researchers from Mathematica Policy Research Inc., of Princeton, N.J., found that elementary students taught by TFA recruits in eight cities learned more mathematics over the course of a school year than their peers whose teachers were hired through more traditional routes. In reading, the two groups of students fared about the same. ("Study Finds Benefits In Teach For America," June 16, 2004.)
That pattern also held when the researchers compared the program participants with the small groups of regularly certified teachers in their buildings.
The Mathematica study was widely praised for the rigor of its experimental design. An earlier study, published in 2001, found similar results for participants in the TFA program. Like the new Stanford study, that one was based on Houstonās experience.
But Ms. Darling-Hammond, a longtime critic of such fast-track pathways into teaching, said the regular teachers in the schools involved in both of those studies were often as likely as the Teach For America recruits to be uncredentialed and inexperienced.
"The obvious next question is whether there are differences in the effectiveness of certified and uncertified teachers, TFA and others, in supporting student achievement," she said.
Test Data Analyzed
The Stanford researchers drew on data from three types of tests that Houston elementary students took between 1995 and 2002: state-required reading and math tests; a national standardized test, the Stanford Achievement Test, 9th Edition; and the Aprenda, which Spanish-speaking students take.
For at least the first three years of the study, the results from the state testing program mirrored those of the earlier Houston study: Students of Teach For America teachers did better in math and about the same in reading as their counterparts in other classrooms across the district. The edge in math evaporated, however, in later years and on some other tests.
"It looks as though the district was recruiting more teachers with certification in those later years," Ms. Darling-Hammond said.
Compared with teachers who had achieved standard certification, the TFA teachers seemed, for the most part, to have less of an impact on improving their students' learning. That was true, the researchers said, even after they accounted for differences between the two groups of students, such as their prior achievement levels and their teachers' years on the job.
On the state reading test, for instance, researchers estimated that the learning difference they found would put the students in Teach For America classrooms about a month behind counterparts taught by certified teachers.
But in Houston, as in many other cities where the national teaching program operates, most Teach For America teachers end up earning professional certification anyway before their two-year stints are up. To account for that reality, the Stanford researchers also compared scores for students of certified Teach For America teachers with those of other certified classroom teachers.
The certified TFA teachers' students came out ahead on state tests of mathematics, but behind on the Aprenda test. On some other standardized tests, the two groups of students looked more similar, suggesting that the certified Teach For America teachers may have been doing just as well in those areas.
The Wrong Question?
However, critics said the study left out important information and analyses. They said the researchers omitted information on the number of teachers involved in some of the analyses, and analyzed other data in a way that might conflate teachers' experience with their certification status.
"An independent review would have revealed some of the flaws that, it appears, would undermine the study's conclusions," said Abigail Smith, the vice president for research and policy at Teach For America.
For her part, though, Ms. Darling-Hammond said she did conduct some of those alternative analyses. She did not discuss them in the report, she said, because the results were no different.
She said colleagues in the field also critiqued the study for her before the research meeting in Montreal. Neither the Mathematica study nor the earlier Houston study was published in a peer-reviewed journal before its release, she pointed out.
Even if the findings hold up, other scholars suggested, Ms. Darling-Hammond may be asking the wrong question.
"If I was a principal, I would be asking, 'Should I go for TFA teachers or some other teachers who may not have certification?'" said Dan D. Goldhaber, a research associate professor at the University of Washington in Seattle. "For schools hiring uncertified teachers, I would suspect they're doing so because they don't have a lot of other options."