Assessment

Global Test Shows U.S. Stagnating

By Liana Loewus — December 10, 2013

The news that U.S. achievement was stagnant on a global exam as other nations plowed ahead triggered agenda-driven pronouncements from all sides last week, but some experts caution against making policy prescriptions based on 15-year-olds’ results on the assessment.

In all subjects tested—reading, mathematics, and science—more countries scored above the United States than did so in 2009 on the Program for International Student Assessment, or PISA. In the most striking example, 10 additional nations, including Germany and Poland, surpassed the U.S. average in reading compared with three years ago.

“We’re running in place as other high-performing countries start to lap us,” U.S. Secretary of Education Arne Duncan said at a daylong live-webcast event in Washington on Dec. 3. There’s “so much to learn from countries that have outperformed us.”

Mr. Duncan emphasized the need for improved early-childhood education and “elevating and strengthening the teaching profession” in the United States.

But Mark Schneider, a vice president at the American Institutes for Research, said that, too often, stakeholders use PISA “to confirm existing policy preferences.”

“People have their favorite policy prescriptions and plug PISA data into it,” said Mr. Schneider, a former commissioner of the National Center for Education Statistics. “It’s not clear to me what the logical foundation is for observing a sample of 15-year-olds and talking about preschool.”

In math, 29 nations and jurisdictions outperformed the United States by a statistically significant margin, up from 23 three years ago, the results show. The nations that eclipsed the U.S. average include not only traditional high fliers like South Korea and Singapore, but also Austria, the United Kingdom, and Vietnam.

In science, 22 education systems scored above the U.S. average, up from 18 in 2009.

Secretary Duncan and Angel Gurría, the secretary-general of the Organization for Economic Cooperation and Development, officially announced the U.S. scores at the rollout event, which was cosponsored by a host of organizations, including the College Board, the Council of Chief State School Officers, and the Alliance for Excellent Education, a Washington-based advocacy group. In an opening speech, Mr. Gurría noted that 40 of the 65 education systems participating in PISA had improved in at least one subject since 2003.

“Brazil progressed from low levels, Germany and Poland moved from adequate to good, and Shanghai and Singapore from good to great,” he said. U.S. performance, on the other hand, has been “fundamentally flat.”

Knowledge and Application

The global assessment compares reading, math, and science “literacy”—or knowledge and application of skills—among 15-year-olds internationally. For the first time, the report also includes separately reported results for public school students in three American states: Connecticut, Florida, and Massachusetts. The states each paid about $600,000 to be tested and ranked separately from the United States.

Massachusetts, long a top-performing state, made an especially strong showing on the global stage: In all subjects, it scored better than the average for the 34 industrialized nations that make up the OECD.

Mitchell D. Chester, the education commissioner for Massachusetts, said the new PISA data “helped reinforce that our students are performing among some of the better-performing nations in the world, and it also made clear to me that we shouldn’t be complacent.”

Among the participating education systems, the highest performer in all three subjects was Shanghai, though the methodology around treating the Chinese city as a stand-alone system has raised eyebrows.

Overall, as was the case three years ago, U.S. performance in reading and science was on par with the OECD average. And once again, U.S. scores were below the OECD average in math.

“It’s a policy question whether one should be OK with average,” said Jack Buckley, the commissioner of the NCES, which issued the U.S. report on PISA. “I’d be more willing to tolerate our position if I saw that we were improving.”

The United States continued to have its strongest showing in reading, though there was no measurable change from its 2009 scores. On the PISA scale of 1 to 1,000, the nation scored 498 in reading, statistically similar to the OECD average of 496 and well below Shanghai’s 570.

Massachusetts scored 527 in reading, outperforming all but three education systems. Connecticut came in just behind its neighboring state. Florida’s score was not statistically different from the U.S. average.

While Americans’ reading scores stood still, 10 education systems have surpassed the United States in the subject since 2009, including Ireland, Chinese Taipei (Taiwan), Poland, Estonia, the Netherlands, and Germany.

The 2012 reading results seem “particularly dramatic,” Mr. Buckley said, because several countries that were tied with the United States in 2009 made just enough improvement to statistically edge ahead.

In math, the United States scored 481, measurably lower than the OECD average. Poland, Vietnam, Austria, Ireland, the United Kingdom, Latvia, and Luxembourg all overtook the United States by statistically significant margins in the 2012 math standings.

TIMSS Tells Different Story

In Massachusetts, about 1 in 5 students were rated “top performers” in math, scoring at levels 5 and 6 (on a scale with six levels of performance). The same proportion scored below level 2, or the “baseline proficiency” level. By comparison, more than half of Shanghai 15-year-olds scored at the top two levels in math, and just 4 percent scored below the baseline.

“One of the things that concerns me is the gap between our top and bottom performers,” said Mr. Chester of Massachusetts. “While our aggregate results are very strong, there’s much room for improvement in bringing up our scores in the bottom.”

For the United States overall, only 9 percent of students fell into the top-performer category for math. Mr. Schneider of the AIR said this is what “disturbs” him most about the results. “We don’t have enough people in the highest level of performance.”

In science, the U.S. average was statistically similar to the OECD average and not measurably different from the 2009 results. Massachusetts and Connecticut both scored higher than the United States as a whole, while Florida scored lower.

It’s notable that the math and science results differ from those on last year’s Trends in International Mathematics and Science Study, or TIMSS, another international exam. On that measure, U.S. 4th and 8th graders performed better than the global average of participating nations in both subjects, and 4th graders showed improvement in math.

However, experts say several factors complicate comparisons between results from the two exams, including the types of skills being assessed, the nations participating, and the ages of students tested.

Some of the most-anticipated results on PISA among policymakers are those from Finland, which became a darling of the education policy world after posting strong results on that assessment in 2003. Subsequent results on TIMSS have called Finland’s reputation into question. In math, for example, the performance of Finland’s 8th graders on TIMSS was not measurably different from that of their counterparts in the United States, and trailed several U.S. states that had individually reported results.

On the 2012 PISA, Finland scored above the U.S. and OECD averages in all three subjects, but its raw scores were all down from 2009, with the biggest drop in math. Finland ranked sixth among OECD countries in math for 2012. Three years earlier, it was among the top three math performers.

In discussing the outcomes for Shanghai, the top performer on PISA, several experts offered the caveat that its results are not representative of China as a whole.

“Shanghai has an economically and culturally elite population with systems in place to make sure that students who may perform poorly are not allowed into public schools,” wrote Tom Loveless, a senior fellow at the Brookings Institution’s Brown Center on Education Policy, in a recent blog post.

“Comparing U.S. performance to that of Shanghai isn’t apples and oranges; it’s applesauce and Agent Orange,” Frederick M. Hess, the director of education policy studies at the American Enterprise Institute, wrote last week on an opinion blog published by Education Week.

Twelve provinces in China took the 2012 PISA, the OECD confirmed, but only results from Shanghai, Hong Kong, and Macao were publicly released.

Mr. Loveless was especially critical of that action, and suggested in an interview that the OECD “cut a special deal” with the Chinese government, allowing for “cherry-picked” results. In 2011, a Chinese website leaked the average PISA scores from 2009 for all 12 participating provinces. According to those results, China scored measurably above the United States in math and science, but significantly below the U.S. average in reading.

Mr. Buckley of the NCES said that juxtaposing results in Shanghai and Massachusetts—a top-performing U.S. state by most measures—is “a better comparison than Shanghai to the U.S.” In all three subjects tested, Massachusetts’ scores fell far behind those of the Chinese city.

“The Shanghai results suggest that even better things are possible for Massachusetts,” said Mr. Chester, the state’s education commissioner.

The OECD report also delves into the relationship between socioeconomic factors and student performance. In the United States, the report finds, the strength of that correlation is comparable to the average for OECD nations. In some other countries, however, including Hong Kong, Korea, Estonia, and Japan, socioeconomic status is less closely correlated with performance. At the Washington PISA event, Secretary Duncan said that “achievement gaps are painfully evident” in the U.S. results but that “our diversity fails to explain why the U.S. lags behind our peers.”

Making Causal Inferences

In a 550-page addendum report, “What Makes 69ý Successful?” released with the PISA results, the OECD provides analyses of the trends seen in PISA and guidance for policymakers. At the live event last week, Mr. Gurría encouraged the United States to “find ways to allocate the most talented teachers and school leaders to the most challenging schools and classrooms.” He also praised the Race to the Top initiative, a signature program of the Obama administration, and said that “the strict implementation of the Common Core State Standards for mathematics would undoubtedly improve PISA results.”

During an hourlong presentation the same day, Andreas Schleicher, the OECD’s deputy director for education and skills, said that the highest-performing countries “place a great value on education,” have “universal education standards,” and use “a high degree of personalization as an approach to address diversity.”

Meanwhile, Randi Weingarten, the president of the American Federation of Teachers, said in a statement last week that the PISA results provide evidence that “a decade of top-down, test-based schooling created by No Child Left Behind and Race to the Top—focused on hyper-testing students, sanctioning teachers, and closing schools—has failed to improve the quality of American public education.”

She said top-tier countries do not have “a fixation on testing like the United States does.”

However, some education experts say policymakers and the public should be wary of drawing policy conclusions based on PISA scores.

“These kinds of studies are really good at describing where we stand and maybe looking at trends,” said Mr. Buckley from the NCES. “They’re not good at all at telling us why. The study design is not one that supports causal inference.”

“There’s a tendency to go beyond the data,” said Mr. Schneider. “For me this is a serious problem.”

A version of this article appeared in the December 11, 2013 edition of Education Week as Global Test Shows U.S. Stagnating
