
Education Opinion

Supplemental Educational Services: Noble Ideas + Unreasonable Expectations = Disappointing Results

By Sputnik Contributor — October 19, 2011

NOTE: This is a guest post by Steven Ross, Professor in the Center for Research and Reform in Education at Johns Hopkins University.

With the reauthorization of ESEA pending and the future of Supplemental Educational Services (SES) in question, it is time to reflect on the research and implementation lessons of this program.

Making tutoring available to increase the academic performance of low-achieving and disadvantaged students is a noble idea. After all, one-on-one and small-group tutoring are supported by extensive research evidence and enjoy nearly universal appeal as a teaching strategy. However, the expectation that tutoring can be delivered efficiently and effectively when filtered through multiple layers of administrative requirements and processes is unrealistic.

First created under the last ESEA reauthorization in 2001, SES requires school districts to offer free tutoring to disadvantaged students who attend low-performing schools. While noble in intention, SES has turned out to be quite costly. To fund SES along with transportation for students who opt to transfer to better schools, districts must set aside 20 percent of their Title I allocations, a cost of roughly $800 million each year.

If SES could accomplish what was originally hoped--raising student achievement enough to move schools out of improvement status--the cost would be worth every penny. Unfortunately, results from numerous evaluation studies indicate much more modest effects, and sometimes none at all. Serving as principal investigator of more than 15 state-sponsored evaluations of SES, I came away with the impression that the vast majority of SES providers offered quality tutoring services that were helping students both academically and socio-emotionally.

But the road from the noble idea to the tutoring session is long and bumpy. Guided by lengthy federal compliance regulations, the process filters down first to the states, which are charged with approving, monitoring, and evaluating providers. Next in line are the school districts, which must fund and roll out the program locally. Parents who are low-income (and not experts in evidence-based practices) are then charged with choosing their children’s tutoring providers. The providers, in turn, must hire tutors, find tutoring space, market their services to parents, and grapple with all the federal, state, and local regulations. Ironically, those least involved with SES are the classroom teachers, who work with the children every day and know their needs best.

Those evaluations indicate that, on average, SES raises participants’ reading and math scores by only a few percentile points compared to matched control groups. For example, a student participating in SES might advance from the 25th to the 27th or 28th percentile, while a comparable non-SES student would remain at the 25th percentile. For an intervention lasting only 30 to 60 hours per student, some might view such effects as a reasonable return. Reasonable or not, small gains by the relatively small subgroup of tutored students can’t do much to remove a school from “improvement status” or spare it from following the same “intervention pathway” for another year.

Although the efficacy of the existing SES program needs to be questioned, after-school tutoring remains a viable intervention for boosting student achievement. Judging from the SES experience, rolling out tutoring in a large-scale, top-down, one-size-fits-all manner is not the most efficient way to spend limited Title I resources. A noble idea with much more reasonable expectations for success would be to give schools and districts the freedom, with appropriate vetting, to adopt the evidence-based interventions (including tutoring, improved reading and math programs, practice-based professional development, etc.) that most directly address their site-based improvement needs.

-Steven Ross

NOTE: The Center for Research and Reform in Education (CRRE) at Johns Hopkins University develops the Best Evidence Encyclopedia, which provides unbiased, practical reviews of the strength of evidence supporting a range of education programs. Robert Slavin is the director of CRRE.


The opinions expressed in Sputnik are strictly those of the author(s) and do not reflect the opinions or endorsement of Editorial Projects in Education, or any of its publications.