Corrected: An earlier version of this story misstated the number of teachers in the TFA sample. There were 135.
Even as evidence builds that Teach For America produces capable secondary math teachers, the question of which parts of the program best prepare them for the classroom only deepens. That’s the implication of a rigorous federally funded study released Sept. 10 that examines the impact of TFA and a second popular program, both of which quickly train and place top college graduates and career-changers in underresourced classrooms.
Secondary math teachers who entered the profession through TFA helped their students learn more in mathematics than colleagues who entered teaching through a less-selective fast-track program or a traditional, university-based program. They also outperformed experienced teachers in that subject, a finding that contradicts critics who argue that TFA teachers’ relative inexperience harms learning.
Teachers trained through another selective alternative route, Teaching Fellows, were indistinguishable overall from teachers trained in other ways.
Researchers found few relationships between the characteristics of the TFA teachers, such as college selectivity or academic major, and their students’ academic gains.
“The bottom line is that these résumé-type characteristics don’t explain why TFA teachers are more effective,” said Hanley S. Chiang, a senior researcher at Mathematica Policy Research, which conducted the study.
The findings are notable partly for their implications for hot-button debates about how best to prepare effective teachers. TFA in particular has been a policy lightning rod, given its limited upfront training, aggressive expansion during a period of significant teacher layoffs, and fees it charges to school districts with which it contracts to place teachers.
Still, the study provides some of the most convincing evidence to date of TFA’s impact on secondary math. It also stands in stark contrast to the teacher-preparation field at large: Most, if not all, programs lack experimental evidence of their impact.
Context and Concept
Alternative-route programs, generally defined as those that allow teachers to complete much of their coursework while serving as classroom teachers, vary widely in their programming and the types of candidates they admit. A number of analyses have essentially called it a draw: such programs do neither better nor worse overall than the traditional, university-based programs that prepare the bulk of the nation’s teachers.
The study, financed by the U.S. Department of Education, was designed explicitly to examine the effects of two highly selective programs.
Both TFA and Teaching Fellows, which is run by the New York City-based TNTP, recruit candidates with strong academic backgrounds as well as certain personal characteristics, such as the ability to work through obstacles. The programs give their recruits condensed upfront training, place them in low-performing or low-income schools, and offer them intensive on-the-job coaching.
For the study, researchers selected middle and high schools with at least two sections of the same math course. Students were randomly assigned either to a classroom taught by a teacher trained through one of the selective programs or to a control classroom taught by a traditionally trained teacher or a teacher from a nonselective alternative route. At the end of the year, the students were given standardized exams, and the researchers compared their achievement growth.
Random-assignment studies eliminate sources of bias caused by differences in student or school characteristics. The data show that the two groups of students had nearly indistinguishable background characteristics and were located in similar schools. In one of the only disparities, charter schools were slightly underrepresented in the treatment schools because many did not offer more than one section of math at a time.
Overall, TFA teachers improved student achievement by 0.07 standard deviations more than other teachers, an effect that translates to about 2 1/2 months of additional learning, according to the researchers.
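As a rough illustration of how such a conversion works, and assuming a typical annual gain in secondary math of about 0.25 standard deviations over a nine-month school year (a commonly cited benchmark, not a figure reported in the story), the arithmetic would run roughly as follows:

$$ \frac{0.07\ \text{SD}}{0.25\ \text{SD per school year}} \times 9\ \text{months} \approx 2.5\ \text{months} $$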
No Harm Done?
The study is the first in nearly a decade to use an experimental method to study the TFA program. With more than 32,000 alumni, TFA has been put on the hot seat for years about whether its teachers pass the “do no harm” test in the classroom.
A 2004 randomized experiment, also by the Washington-based Mathematica, found score boosts among elementary students taught by TFA teachers, too, but that study was criticized in part because the control group of teachers had lower rates of certification, less formal education preparation, and less experience in student-teaching than a national sample.
A handful of studies using less rigorous methodologies have produced more mixed results. The report does not address English/language arts, a subject in which student test scores generally appear less responsive to nearly any classroom intervention, possibly because of variations in home factors, such as access to books.
69ý taught by instructors from the Teaching Fellows program, meanwhile, did better than those taught by teachers from nonselective alternative routes. Their results were indistinguishable from those of students taught by traditionally prepared teachers.
Teaching Fellows’ novices were slightly less effective than experienced teachers. But unlike TFA, which expects candidates to fulfill a two-year teaching commitment, that program expects its candidates to remain classroom teachers and continue to improve. And in general, the data showed that teacher experience did yield better outcomes for students.
In all, the sample of TFA teachers studied consisted of 135 math teachers in 45 schools across eight states. For the Teaching Fellows program, the study looked at 153 teachers in 44 schools across eight states.
The study does not compare TFA and Teaching Fellows teachers with each other.
Questions Remain
The study offers few clues about what caused the TFA teachers to do better. For instance, TFA’s teachers scored higher than control teachers on a math-content test linked to student achievement, but not by enough to explain the performance gains.
The findings also don’t appear to reflect “gaming” through an overemphasis on test preparation: At the high school level, the tests given to students were not required by the state and so were unfamiliar to teachers and students.
Matt Kramer, TFA’s co-CEO, attributed the gains partly to the organization’s scrupulously tailored selection metrics. He also highlighted TFA’s support network as a possible driver behind the findings.
“There is unbelievable research clarity that the length of preservice does not matter that much,” he said of the findings. “Time in the classroom, lessons learned, and mentorship do matter. We spend much, much more than districts on teacher induction.”
While those are legitimate hypotheses, the study does not directly answer such questions, said Melissa A. Clark, the Mathematica senior researcher who led the project.
For that reason, it is unclear whether districts could produce the same effects if they were to adopt TFA’s hiring and mentoring techniques.
The organization currently spends about $45,000 on training and professional development over three years for each of its teachers.