Some cities have made greater-than-expected academic progress over a decade, surpassing the growth of the nation's other schools as a whole, even accounting for poverty and other factors associated with lower test scores.
Notably, the gains are concentrated both in districts well known for years of improvement efforts, like Miami-Dade and Boston, and in some, such as Cleveland, Dallas, and Detroit, that have received much less national attention, according to a study released earlier this week by the Council of the Great City 69传媒.
The District of Columbia posted impressive gains, leading the CGCS to conclude that it is improving faster than any other major city school system in the country.
The study uses statistical methods to facilitate comparisons among some of the nation鈥檚 largest and most-diverse school districts. It takes into account the changing demographics of students in each city over time to rule out other factors that are often cited for academic gains or losses, such as gentrification, increasing poverty rates and homelessness, and changing proportions of children learning English.
The research is based on data from the federally administered National Assessment of Educational Progress.
Read on to learn more about the research and how it helps flesh out the picture of city school districts' performance.
We knew urban school districts were generally improving. So why is this new?
Yes, previous NAEP releases have shown large cities generally narrowing gaps between Black and white students, and between disadvantaged students and their more-advantaged peers. (These opportunity gaps are largely the product of unequal access to resources, good teaching, and other factors that are linked to student achievement.)
Here's why the new study matters: It shows more clearly how much each school system appears to improve outcomes for students despite the challenges. That's important because it means other, similarly situated school systems can visit them to pinpoint the strategies they used.
And it also provides a better measure of improvement than simply looking at changes in raw test scores, by taking into account each district's starting point and how far achievement grew. While the absolute performance of these districts is still lower than the national average, some districts are moving students along much faster and farther than others.
Historically, said Ray Hart, the executive director of the CGCS and the number-cruncher on this study, "we've given credit to schools based on the population they serve. What we haven't done is given credit to schools based on the education they give to the population they serve."
There is one caveat: The study is limited to the 2009-2019 period, with 2019 the most recent year in which NAEP was administered. The report therefore does not reflect any declines caused by the pandemic, which massively disrupted teaching and learning.
Just how does the study account for improvement?
Hart took all of the NAEP data from the last few reading and math administrations in grades 4 and 8, and used statistical methods to estimate how students in large city districts would be expected to perform, including for each demographic category: Black students, students with disabilities, English-learners, and so on.
Then he compared this to how those students actually performed. The difference between expected and actual performance gives an estimate of the impact the large city districts are having.
The study frames this as an "effect," and states it in terms of scale-score points. An effect of three, for example, means that the district on average moved students three points higher in that subject than expected, based on demographics. (There are currently 27 of these city districts that participate in NAEP, so comparisons are limited to those.)
All of this information is compared to how students in all other schools, public and private, performed.
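To make the expected-versus-actual logic concrete, here is a minimal sketch in Python of one way such an adjustment can be computed. The demographic categories, score values, and simple linear model are all illustrative assumptions, not the CGCS's actual specification; the point is only that a district's "effect" is the gap between its observed scale score and the score predicted from its demographics.

```python
import numpy as np

# Hypothetical reference data for "all other" schools: demographic shares
# (low-income, English-learners, students with disabilities) and
# NAEP-style scale scores on the 0-500 scale. Values are illustrative only.
ref_demo = np.array([
    [0.35, 0.08, 0.12],
    [0.50, 0.12, 0.13],
    [0.20, 0.04, 0.11],
    [0.65, 0.20, 0.15],
    [0.40, 0.10, 0.14],
])
ref_scores = np.array([246.0, 240.0, 252.0, 234.0, 243.0])

# Fit a simple linear model relating demographics to scores.
X_ref = np.column_stack([np.ones(len(ref_scores)), ref_demo])
coef, *_ = np.linalg.lstsq(X_ref, ref_scores, rcond=None)

# Apply that model to a hypothetical city district to get its "expected"
# score; the gap between observed and expected is its estimated effect.
city_demo = np.array([0.80, 0.15, 0.17])   # hypothetical demographic shares
city_observed = 232.0                      # hypothetical observed scale score
city_expected = float(np.concatenate([[1.0], city_demo]) @ coef)
effect = city_observed - city_expected

print(f"expected {city_expected:.1f}, observed {city_observed:.1f}, "
      f"effect {effect:+.1f} scale-score points")
```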
Take Detroit, for example. In 4th grade math, it still has a very low level of achievement overall, a 232 scale score on a 0 to 500 scale. (The average for all non-city schools in 2019 was 242 points.)
But between 2009 and 2019, Detroit's "effect" (how much more its scale score in that subject changed relative to expectations) grew by nearly nine points, much higher than the point-and-a-half increase posted by all other districts in the aggregate. It also outpaced the large city schools as a whole, which as a group weren't any more or less effective than predicted.
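Improvement over the decade is then just the change in that effect between two NAEP administrations. A trivial sketch, with made-up effect values rather than Detroit's actual figures:

```python
# Hypothetical effects (observed minus expected scale score) for one
# district in one grade/subject, in two NAEP years. Numbers are made up.
effect_2009 = -14.0   # well below demographic expectations in 2009
effect_2019 = -5.2    # still below expectations in 2019, but much closer

# Growth in effectiveness over the decade, in scale-score points.
change = effect_2019 - effect_2009
print(f"Change in district effect, 2009-2019: {change:+.1f} points")
# A gain of roughly nine points here would mirror the kind of movement
# the report describes for Detroit in 4th grade math.
```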
Which districts are standouts here?
The report lays out answers to this complicated question graphically, so do take a look at it. But here is a short list of highlights.
For fast improvement, the District of Columbia really tops the list, with statistically significant "effect" increases in all four grade and subject combinations: 4th and 8th grade math and reading.
Many critics over the years have questioned whether this improvement has something to do with the city's demographics, which became wealthier and whiter over this time period. But because the analysis takes demographics into account, it rules that out as a factor. Instead, the city's gains appear to reflect something the district is doing educationally. (It has invested significantly, and controversially, in efforts to improve teacher quality and curriculum.)
Detroit is another one to watch. It became dramatically more effective at boosting learning between 2009 and 2019, with significant impact in all grade and subject combinations except 4th grade reading. That occurred even as the city grew poorer in the wake of the Great Recession.
(In 2021, Education Week recognized Detroit's superintendent and its assistant superintendent of family and community relations for their work rebuilding ties with parents. The district is also well known for its work to improve its secondary reading and literacy curriculum.)
Of course, both of those districts had more room to improve because they started way behind many of the others.
There are also some districts that are notable for sustaining their effectiveness over time. Boston and Miami-Dade are both good examples.
Miami-Dade, well known for the unusually long, 13-years-and-counting tenure of its superintendent, not only posted scores higher than those of all other schools in all the grade and subject combinations in 2019; it also got more effective in three out of four of them.
And year after year, students in Boston performed better than they were statistically expected to in all four grade and subject combinations.
There are the "dark horses" as well: cities that don't tend to get as much attention for their work. Cleveland became more effective in boosting both 4th grade math and reading, for instance, and is improving faster in 8th grade math than the average of all other schools. Atlanta and Charlotte, N.C., both improved their effectiveness in the teaching of 8th grade math.
There's some not-great news in the findings, too.
The Philadelphia district has gradually lost ground. In 2009, it had an effect of about 5 points in 8th grade math, for example. But over time, it's moved in the opposite direction. Now, the district "effect" in that grade is actually about 9 points below where the statistical projections say it should be.
Houston has also lost ground by having less impact in two grade/subject areas and stalling in the others. (Even so, it is still outperforming expectations, according to the projections.)
Other cities, like Fresno, Calif., just seem to have a hard time getting traction. That district got less effective in 8th grade math and didn't change significantly in the other categories.
How does this data square with other NAEP trends?
One of the most persistent, worrisome trends of late has been a decline in overall scores largely due to what one federal official called a bifurcation in performance: The downturn is concentrated among average and low-performing students, even as the highest-performing students gain ground.
The CGCS data isn't broken out in terms of performance quintiles, but the organization agrees that this is a concerning trend that warrants additional research.
Do we know why some of these districts seem to improve faster than others?
Overall, the data are encouraging, especially in some of the most-troubled districts. The data generally confirm, for example, Chicago's remarkable progress over the past decade. It's a far cry from the 1980s, when then-U.S. Secretary of Education William Bennett famously deemed it the worst district in the nation.
But in trying to figure out why, we run squarely into one of the difficulties of NAEP data: It's not really set up to answer cause-and-effect questions. We may know these districts are doing something right, but we're not clear about what the "something" is.
Testing what works in school districts is hard to do in any case, because from a research perspective, you'd ideally want to compare the set of strategies you're interested in against some business-as-usual set of policies.
Still, the CGCS did some case study research to see if it could identify common patterns among the improving districts. It also visited two unnamed districts with weaker results to see if anything there stood out.
What it found can perhaps best be described as a kind of overall coherence: Districts that seemed to be improving had strong, stable leadership focused on teaching and learning. They gave their teachers extensive support on curriculum, standards, and instruction. They focused on improving their teachers' and principals' capabilities and invested in engaging parents and community members in their efforts.
But there was one other common theme: The districts tended to go all in on their instructional plans at scale, rather than piloting things here and there.